Feb 26 15:42:16 crc systemd[1]: Starting Kubernetes Kubelet...
Feb 26 15:42:16 crc restorecon[4595]: Relabeled /var/lib/kubelet/config.json from system_u:object_r:unlabeled_t:s0 to system_u:object_r:container_var_lib_t:s0
Feb 26 15:42:16 crc restorecon[4595]: /var/lib/kubelet/device-plugins not reset as customized by admin to system_u:object_r:container_file_t:s0
Feb 26 15:42:16 crc restorecon[4595]: /var/lib/kubelet/device-plugins/kubelet.sock not reset as customized by admin to system_u:object_r:container_file_t:s0
Feb 26 15:42:16 crc restorecon[4595]: /var/lib/kubelet/pods/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8/volumes/kubernetes.io~configmap/nginx-conf/..2025_02_23_05_40_35.4114275528/nginx.conf not reset as customized by admin to system_u:object_r:container_file_t:s0:c15,c25
Feb 26 15:42:16 crc restorecon[4595]: /var/lib/kubelet/pods/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c15,c25
Feb 26 15:42:16 crc restorecon[4595]: /var/lib/kubelet/pods/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8/containers/networking-console-plugin/22e96971 not reset as customized by admin to system_u:object_r:container_file_t:s0:c15,c25
Feb 26 15:42:16 crc restorecon[4595]: /var/lib/kubelet/pods/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8/containers/networking-console-plugin/21c98286 not reset as customized by admin to system_u:object_r:container_file_t:s0:c15,c25
Feb 26 15:42:16 crc restorecon[4595]: /var/lib/kubelet/pods/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8/containers/networking-console-plugin/0f1869e1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c15,c25
Feb 26 15:42:16 crc restorecon[4595]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c215,c682
Feb 26 15:42:16 crc restorecon[4595]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/setup/46889d52 not reset as customized by admin to system_u:object_r:container_file_t:s0:c225,c458
Feb 26 15:42:16 crc restorecon[4595]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/setup/5b6a5969 not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c963
Feb 26 15:42:16 crc restorecon[4595]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/setup/6c7921f5 not reset as customized by admin to system_u:object_r:container_file_t:s0:c215,c682
Feb 26 15:42:16 crc restorecon[4595]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/kube-rbac-proxy-crio/4804f443 not reset as customized by admin to system_u:object_r:container_file_t:s0:c225,c458
Feb 26 15:42:16 crc restorecon[4595]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/kube-rbac-proxy-crio/2a46b283 not reset as customized by admin to system_u:object_r:container_file_t:s0:c225,c458
Feb 26 15:42:16 crc restorecon[4595]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/kube-rbac-proxy-crio/a6b5573e not reset as customized by admin to system_u:object_r:container_file_t:s0:c225,c458
Feb 26 15:42:16 crc restorecon[4595]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/kube-rbac-proxy-crio/4f88ee5b not reset as customized by admin to system_u:object_r:container_file_t:s0:c225,c458
Feb 26 15:42:16 crc restorecon[4595]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/kube-rbac-proxy-crio/5a4eee4b not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c963
Feb 26 15:42:16 crc restorecon[4595]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/kube-rbac-proxy-crio/cd87c521 not reset as customized by admin to system_u:object_r:container_file_t:s0:c215,c682
Feb 26 15:42:16 crc restorecon[4595]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/volumes/kubernetes.io~configmap/service-ca-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Feb 26 15:42:16 crc restorecon[4595]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/volumes/kubernetes.io~configmap/service-ca-bundle/..2025_02_23_05_33_42.2574241751 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Feb 26 15:42:16 crc restorecon[4595]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/volumes/kubernetes.io~configmap/service-ca-bundle/..2025_02_23_05_33_42.2574241751/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Feb 26 15:42:16 crc restorecon[4595]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/volumes/kubernetes.io~configmap/service-ca-bundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Feb 26 15:42:16 crc restorecon[4595]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/volumes/kubernetes.io~configmap/service-ca-bundle/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Feb 26 15:42:16 crc restorecon[4595]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Feb 26 15:42:16 crc restorecon[4595]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/38602af4 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Feb 26 15:42:16 crc restorecon[4595]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/1483b002 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Feb 26 15:42:16 crc restorecon[4595]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/0346718b not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Feb 26 15:42:16 crc restorecon[4595]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/d3ed4ada not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Feb 26 15:42:16 crc restorecon[4595]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/3bb473a5 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Feb 26 15:42:16 crc restorecon[4595]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/8cd075a9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Feb 26 15:42:16 crc restorecon[4595]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/00ab4760 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Feb 26 15:42:16 crc restorecon[4595]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/54a21c09 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Feb 26 15:42:16 crc restorecon[4595]: /var/lib/kubelet/pods/37a5e44f-9a88-4405-be8a-b645485e7312/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c589,c726
Feb 26 15:42:16 crc restorecon[4595]: /var/lib/kubelet/pods/37a5e44f-9a88-4405-be8a-b645485e7312/containers/network-operator/70478888 not reset as customized by admin to system_u:object_r:container_file_t:s0:c176,c499
Feb 26 15:42:16 crc restorecon[4595]: /var/lib/kubelet/pods/37a5e44f-9a88-4405-be8a-b645485e7312/containers/network-operator/43802770 not reset as customized by admin to system_u:object_r:container_file_t:s0:c176,c499
Feb 26 15:42:16 crc restorecon[4595]: /var/lib/kubelet/pods/37a5e44f-9a88-4405-be8a-b645485e7312/containers/network-operator/955a0edc not reset as customized by admin to system_u:object_r:container_file_t:s0:c176,c499
Feb 26 15:42:16 crc restorecon[4595]: /var/lib/kubelet/pods/37a5e44f-9a88-4405-be8a-b645485e7312/containers/network-operator/bca2d009 not reset as customized by admin to system_u:object_r:container_file_t:s0:c140,c1009
Feb 26 15:42:16 crc restorecon[4595]: /var/lib/kubelet/pods/37a5e44f-9a88-4405-be8a-b645485e7312/containers/network-operator/b295f9bd not reset as customized by admin to system_u:object_r:container_file_t:s0:c589,c726
Feb 26 15:42:16 crc restorecon[4595]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-binary-copy not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Feb 26 15:42:16 crc restorecon[4595]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-binary-copy/..2025_02_23_05_21_22.3617465230 not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Feb 26 15:42:16 crc restorecon[4595]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-binary-copy/..2025_02_23_05_21_22.3617465230/cnibincopy.sh not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Feb 26 15:42:16 crc restorecon[4595]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-binary-copy/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Feb 26 15:42:16 crc restorecon[4595]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-binary-copy/cnibincopy.sh not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Feb 26 15:42:16 crc restorecon[4595]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-sysctl-allowlist not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Feb 26 15:42:16 crc restorecon[4595]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-sysctl-allowlist/..2025_02_23_05_21_22.2050650026 not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Feb 26 15:42:16 crc restorecon[4595]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-sysctl-allowlist/..2025_02_23_05_21_22.2050650026/allowlist.conf not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Feb 26 15:42:16 crc restorecon[4595]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-sysctl-allowlist/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Feb 26 15:42:16 crc restorecon[4595]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-sysctl-allowlist/allowlist.conf not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Feb 26 15:42:16 crc restorecon[4595]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Feb 26 15:42:16 crc restorecon[4595]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/egress-router-binary-copy/bc46ea27 not reset as customized by admin to system_u:object_r:container_file_t:s0:c203,c924
Feb 26 15:42:16 crc restorecon[4595]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/egress-router-binary-copy/5731fc1b not reset as customized by admin to system_u:object_r:container_file_t:s0:c138,c778
Feb 26 15:42:16 crc restorecon[4595]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/egress-router-binary-copy/5e1b2a3c not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Feb 26 15:42:16 crc restorecon[4595]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/cni-plugins/943f0936 not reset as customized by admin to system_u:object_r:container_file_t:s0:c203,c924
Feb 26 15:42:16 crc restorecon[4595]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/cni-plugins/3f764ee4 not reset as customized by admin to system_u:object_r:container_file_t:s0:c138,c778
Feb 26 15:42:16 crc restorecon[4595]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/cni-plugins/8695e3f9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Feb 26 15:42:16 crc restorecon[4595]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/bond-cni-plugin/aed7aa86 not reset as customized by admin to system_u:object_r:container_file_t:s0:c203,c924
Feb 26 15:42:16 crc restorecon[4595]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/bond-cni-plugin/c64d7448 not reset as customized by admin to system_u:object_r:container_file_t:s0:c138,c778
Feb 26 15:42:16 crc restorecon[4595]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/bond-cni-plugin/0ba16bd2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Feb 26 15:42:16 crc restorecon[4595]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/routeoverride-cni/207a939f not reset as customized by admin to system_u:object_r:container_file_t:s0:c203,c924
Feb 26 15:42:16 crc restorecon[4595]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/routeoverride-cni/54aa8cdb not reset as customized by admin to system_u:object_r:container_file_t:s0:c138,c778
Feb 26 15:42:16 crc restorecon[4595]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/routeoverride-cni/1f5fa595 not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Feb 26 15:42:16 crc restorecon[4595]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/whereabouts-cni-bincopy/bf9c8153 not reset as customized by admin to system_u:object_r:container_file_t:s0:c203,c924
Feb 26 15:42:16 crc restorecon[4595]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/whereabouts-cni-bincopy/47fba4ea not reset as customized by admin to system_u:object_r:container_file_t:s0:c138,c778
Feb 26 15:42:16 crc restorecon[4595]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/whereabouts-cni-bincopy/7ae55ce9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Feb 26 15:42:16 crc restorecon[4595]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/whereabouts-cni/7906a268 not reset as customized by admin to system_u:object_r:container_file_t:s0:c203,c924
Feb 26 15:42:16 crc restorecon[4595]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/whereabouts-cni/ce43fa69 not reset as customized by admin to system_u:object_r:container_file_t:s0:c138,c778
Feb 26 15:42:16 crc restorecon[4595]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/whereabouts-cni/7fc7ea3a not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Feb 26 15:42:16 crc restorecon[4595]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/kube-multus-additional-cni-plugins/d8c38b7d not reset as customized by admin to system_u:object_r:container_file_t:s0:c203,c924
Feb 26 15:42:16 crc restorecon[4595]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/kube-multus-additional-cni-plugins/9ef015fb not reset as customized by admin to system_u:object_r:container_file_t:s0:c138,c778
Feb 26 15:42:16 crc restorecon[4595]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/kube-multus-additional-cni-plugins/b9db6a41 not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Feb 26 15:42:16 crc restorecon[4595]: /var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c432,c991
Feb 26 15:42:16 crc restorecon[4595]: /var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/containers/network-metrics-daemon/b1733d79 not reset as customized by admin to system_u:object_r:container_file_t:s0:c476,c820
Feb 26 15:42:16 crc restorecon[4595]: /var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/containers/network-metrics-daemon/afccd338 not reset as customized by admin to system_u:object_r:container_file_t:s0:c272,c818
Feb 26 15:42:16 crc restorecon[4595]: /var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/containers/network-metrics-daemon/9df0a185 not reset as customized by admin to system_u:object_r:container_file_t:s0:c432,c991
Feb 26 15:42:16 crc restorecon[4595]: /var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/containers/kube-rbac-proxy/18938cf8 not reset as customized by admin to system_u:object_r:container_file_t:s0:c476,c820
Feb 26 15:42:16 crc restorecon[4595]: /var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/containers/kube-rbac-proxy/7ab4eb23 not reset as customized by admin to system_u:object_r:container_file_t:s0:c272,c818
Feb 26 15:42:16 crc restorecon[4595]: /var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/containers/kube-rbac-proxy/56930be6 not reset as customized by admin to system_u:object_r:container_file_t:s0:c432,c991
Feb 26 15:42:16 crc restorecon[4595]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/env-overrides not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Feb 26 15:42:16 crc restorecon[4595]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/env-overrides/..2025_02_23_05_21_35.630010865 not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Feb 26 15:42:16 crc restorecon[4595]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/env-overrides/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Feb 26 15:42:16 crc restorecon[4595]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/ovnkube-config not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Feb 26 15:42:16 crc restorecon[4595]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/ovnkube-config/..2025_02_23_05_21_35.1088506337 not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Feb 26 15:42:16 crc restorecon[4595]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/ovnkube-config/..2025_02_23_05_21_35.1088506337/ovnkube.conf not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Feb 26 15:42:16 crc restorecon[4595]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/ovnkube-config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Feb 26 15:42:16 crc restorecon[4595]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/ovnkube-config/ovnkube.conf not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Feb 26 15:42:16 crc restorecon[4595]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Feb 26 15:42:16 crc restorecon[4595]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/kube-rbac-proxy/0d8e3722 not reset as customized by admin to system_u:object_r:container_file_t:s0:c89,c211
Feb 26 15:42:16 crc restorecon[4595]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/kube-rbac-proxy/d22b2e76 not reset as customized by admin to system_u:object_r:container_file_t:s0:c382,c850
Feb 26 15:42:16 crc restorecon[4595]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/kube-rbac-proxy/e036759f not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Feb 26 15:42:16 crc restorecon[4595]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/ovnkube-cluster-manager/2734c483 not reset as customized by admin to system_u:object_r:container_file_t:s0:c89,c211
Feb 26 15:42:16 crc restorecon[4595]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/ovnkube-cluster-manager/57878fe7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c89,c211
Feb 26 15:42:16 crc restorecon[4595]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/ovnkube-cluster-manager/3f3c2e58 not reset as customized by admin to system_u:object_r:container_file_t:s0:c89,c211
Feb 26 15:42:16 crc restorecon[4595]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/ovnkube-cluster-manager/375bec3e not reset as customized by admin to system_u:object_r:container_file_t:s0:c382,c850
Feb 26 15:42:16 crc restorecon[4595]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/ovnkube-cluster-manager/7bc41e08 not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Feb 26 15:42:16 crc restorecon[4595]: /var/lib/kubelet/pods/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Feb 26 15:42:16 crc restorecon[4595]: /var/lib/kubelet/pods/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d/containers/download-server/48c7a72d not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Feb 26 15:42:16 crc restorecon[4595]: /var/lib/kubelet/pods/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d/containers/download-server/4b66701f not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Feb 26 15:42:16 crc restorecon[4595]: /var/lib/kubelet/pods/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d/containers/download-server/a5a1c202 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Feb 26 15:42:16 crc restorecon[4595]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/ovnkube-identity-cm not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Feb 26 15:42:16 crc restorecon[4595]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/ovnkube-identity-cm/..2025_02_23_05_21_40.3350632666 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Feb 26 15:42:16 crc restorecon[4595]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/ovnkube-identity-cm/..2025_02_23_05_21_40.3350632666/additional-cert-acceptance-cond.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Feb 26 15:42:16 crc restorecon[4595]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/ovnkube-identity-cm/..2025_02_23_05_21_40.3350632666/additional-pod-admission-cond.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Feb 26 15:42:16 crc restorecon[4595]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/ovnkube-identity-cm/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Feb 26 15:42:16 crc restorecon[4595]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/ovnkube-identity-cm/additional-cert-acceptance-cond.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Feb 26 15:42:16 crc restorecon[4595]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/ovnkube-identity-cm/additional-pod-admission-cond.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Feb 26 15:42:16 crc restorecon[4595]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/env-overrides not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Feb 26 15:42:16 crc restorecon[4595]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/env-overrides/..2025_02_23_05_21_40.1388695756 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Feb 26 15:42:16 crc restorecon[4595]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/env-overrides/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Feb 26 15:42:16 crc restorecon[4595]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Feb 26 15:42:16 crc restorecon[4595]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/webhook/26f3df5b not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Feb 26 15:42:16 crc restorecon[4595]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/webhook/6d8fb21d not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Feb 26 15:42:16 crc restorecon[4595]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/webhook/50e94777 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Feb 26 15:42:16 crc restorecon[4595]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/approver/208473b3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Feb 26 15:42:16 crc restorecon[4595]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/approver/ec9e08ba not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Feb 26 15:42:16 crc restorecon[4595]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/approver/3b787c39 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Feb 26 15:42:16 crc restorecon[4595]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/approver/208eaed5 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Feb 26 15:42:16 crc restorecon[4595]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/approver/93aa3a2b not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Feb 26 15:42:16 crc restorecon[4595]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/approver/3c697968 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Feb 26 15:42:16 crc restorecon[4595]: /var/lib/kubelet/pods/3b6479f0-333b-4a96-9adf-2099afdc2447/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21
Feb 26 15:42:16 crc restorecon[4595]: /var/lib/kubelet/pods/3b6479f0-333b-4a96-9adf-2099afdc2447/containers/network-check-target-container/ba950ec9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21
Feb 26 15:42:16 crc restorecon[4595]: /var/lib/kubelet/pods/3b6479f0-333b-4a96-9adf-2099afdc2447/containers/network-check-target-container/cb5cdb37 not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21
Feb 26 15:42:16 crc restorecon[4595]: /var/lib/kubelet/pods/3b6479f0-333b-4a96-9adf-2099afdc2447/containers/network-check-target-container/f2df9827 not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21
Feb 26 15:42:16 crc restorecon[4595]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/images not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Feb 26 15:42:16 crc restorecon[4595]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/images/..2025_02_23_05_22_30.473230615 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Feb 26 15:42:16 crc restorecon[4595]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/images/..2025_02_23_05_22_30.473230615/images.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Feb 26 15:42:16 crc restorecon[4595]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/images/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Feb 26 15:42:16 crc restorecon[4595]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/images/images.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Feb 26 15:42:16 crc restorecon[4595]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/auth-proxy-config not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Feb 26 15:42:16 crc restorecon[4595]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/auth-proxy-config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Feb 26 15:42:16 crc restorecon[4595]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/auth-proxy-config/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Feb 26 15:42:16 crc restorecon[4595]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/auth-proxy-config/..2025_02_24_06_22_02.1904938450 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Feb 26 15:42:16 crc restorecon[4595]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/auth-proxy-config/..2025_02_24_06_22_02.1904938450/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Feb 26 15:42:16 crc restorecon[4595]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Feb 26 15:42:16 crc restorecon[4595]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/containers/machine-config-operator/fedaa673 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Feb 26 15:42:16 crc restorecon[4595]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/containers/machine-config-operator/9ca2df95 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Feb 26 15:42:16 crc restorecon[4595]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/containers/machine-config-operator/b2d7460e not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Feb 26 15:42:16 crc restorecon[4595]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/containers/kube-rbac-proxy/2207853c not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Feb 26 15:42:16 crc restorecon[4595]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/containers/kube-rbac-proxy/241c1c29 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Feb 26 15:42:16 crc restorecon[4595]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/containers/kube-rbac-proxy/2d910eaf not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Feb 26 15:42:16 crc restorecon[4595]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Feb 26 15:42:16 crc restorecon[4595]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Feb 26 15:42:16 crc restorecon[4595]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-ca/..2025_02_23_05_23_49.3726007728 not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Feb 26 15:42:16 crc restorecon[4595]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-ca/..2025_02_23_05_23_49.3726007728/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Feb 26 15:42:16 crc restorecon[4595]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-ca/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Feb 26 15:42:16 crc restorecon[4595]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-service-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Feb 26 15:42:16 crc restorecon[4595]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-service-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Feb 26 15:42:16 crc restorecon[4595]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-service-ca/..2025_02_23_05_23_49.841175008 not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Feb 26 15:42:16 crc restorecon[4595]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-service-ca/..2025_02_23_05_23_49.841175008/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Feb 26 15:42:16 crc restorecon[4595]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-service-ca/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Feb 26 15:42:16 crc restorecon[4595]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Feb 26 15:42:16 crc restorecon[4595]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.843437178 not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Feb 26 15:42:16 crc restorecon[4595]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.843437178/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Feb 26 15:42:16 crc restorecon[4595]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Feb 26 15:42:16 crc restorecon[4595]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Feb 26 15:42:16 crc restorecon[4595]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Feb 26 15:42:16 crc restorecon[4595]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/containers/etcd-operator/c6c0f2e7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c263,c871
Feb 26 15:42:16 crc restorecon[4595]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/containers/etcd-operator/399edc97 not reset as customized by admin to system_u:object_r:container_file_t:s0:c263,c871
Feb 26 15:42:16 crc restorecon[4595]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/containers/etcd-operator/8049f7cc not reset as customized by admin to system_u:object_r:container_file_t:s0:c263,c871
Feb 26 15:42:16 crc restorecon[4595]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/containers/etcd-operator/0cec5484 not reset as customized by admin to system_u:object_r:container_file_t:s0:c263,c871
Feb 26 15:42:16 crc restorecon[4595]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/containers/etcd-operator/312446d0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c406,c828
Feb 26 15:42:16 crc restorecon[4595]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/containers/etcd-operator/8e56a35d not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Feb 26 15:42:16 crc restorecon[4595]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c108,c511
Feb 26 15:42:16 crc restorecon[4595]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.133159589 not reset as customized by admin to system_u:object_r:container_file_t:s0:c108,c511
Feb 26 15:42:16 crc restorecon[4595]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.133159589/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c108,c511
Feb 26 15:42:16 crc restorecon[4595]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c108,c511
Feb 26 15:42:16 crc restorecon[4595]:
/var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c108,c511 Feb 26 15:42:16 crc restorecon[4595]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c108,c511 Feb 26 15:42:16 crc restorecon[4595]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/containers/kube-controller-manager-operator/2d30ddb9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c380,c909 Feb 26 15:42:16 crc restorecon[4595]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/containers/kube-controller-manager-operator/eca8053d not reset as customized by admin to system_u:object_r:container_file_t:s0:c380,c909 Feb 26 15:42:16 crc restorecon[4595]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/containers/kube-controller-manager-operator/c3a25c9a not reset as customized by admin to system_u:object_r:container_file_t:s0:c168,c522 Feb 26 15:42:16 crc restorecon[4595]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/containers/kube-controller-manager-operator/b9609c22 not reset as customized by admin to system_u:object_r:container_file_t:s0:c108,c511 Feb 26 15:42:16 crc restorecon[4595]: /var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c968,c969 Feb 26 15:42:16 crc restorecon[4595]: /var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/containers/dns-operator/e8b0eca9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c106,c418 Feb 26 15:42:16 crc restorecon[4595]: /var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/containers/dns-operator/b36a9c3f not reset as customized by admin to system_u:object_r:container_file_t:s0:c529,c711 Feb 26 15:42:16 crc restorecon[4595]: 
/var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/containers/dns-operator/38af7b07 not reset as customized by admin to system_u:object_r:container_file_t:s0:c968,c969 Feb 26 15:42:16 crc restorecon[4595]: /var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/containers/kube-rbac-proxy/ae821620 not reset as customized by admin to system_u:object_r:container_file_t:s0:c106,c418 Feb 26 15:42:16 crc restorecon[4595]: /var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/containers/kube-rbac-proxy/baa23338 not reset as customized by admin to system_u:object_r:container_file_t:s0:c529,c711 Feb 26 15:42:16 crc restorecon[4595]: /var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/containers/kube-rbac-proxy/2c534809 not reset as customized by admin to system_u:object_r:container_file_t:s0:c968,c969 Feb 26 15:42:16 crc restorecon[4595]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c661,c999 Feb 26 15:42:16 crc restorecon[4595]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.3532625537 not reset as customized by admin to system_u:object_r:container_file_t:s0:c661,c999 Feb 26 15:42:16 crc restorecon[4595]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.3532625537/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c661,c999 Feb 26 15:42:16 crc restorecon[4595]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c661,c999 Feb 26 15:42:16 crc restorecon[4595]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c661,c999 Feb 26 15:42:16 crc restorecon[4595]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c661,c999 Feb 26 15:42:16 crc restorecon[4595]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/containers/kube-scheduler-operator-container/59b29eae not reset as customized by admin to system_u:object_r:container_file_t:s0:c338,c381 Feb 26 15:42:16 crc restorecon[4595]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/containers/kube-scheduler-operator-container/c91a8e4f not reset as customized by admin to system_u:object_r:container_file_t:s0:c338,c381 Feb 26 15:42:16 crc restorecon[4595]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/containers/kube-scheduler-operator-container/4d87494a not reset as customized by admin to system_u:object_r:container_file_t:s0:c442,c857 Feb 26 15:42:16 crc restorecon[4595]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/containers/kube-scheduler-operator-container/1e33ca63 not reset as customized by admin to system_u:object_r:container_file_t:s0:c661,c999 Feb 26 15:42:16 crc restorecon[4595]: /var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Feb 26 15:42:16 crc restorecon[4595]: /var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/containers/kube-rbac-proxy/8dea7be2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Feb 26 15:42:16 crc restorecon[4595]: /var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/containers/kube-rbac-proxy/d0b04a99 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Feb 26 15:42:16 crc restorecon[4595]: /var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/containers/kube-rbac-proxy/d84f01e7 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c12,c18 Feb 26 15:42:16 crc restorecon[4595]: /var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/containers/package-server-manager/4109059b not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Feb 26 15:42:16 crc restorecon[4595]: /var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/containers/package-server-manager/a7258a3e not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Feb 26 15:42:16 crc restorecon[4595]: /var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/containers/package-server-manager/05bdf2b6 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Feb 26 15:42:16 crc restorecon[4595]: /var/lib/kubelet/pods/6731426b-95fe-49ff-bb5f-40441049fde2/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Feb 26 15:42:16 crc restorecon[4595]: /var/lib/kubelet/pods/6731426b-95fe-49ff-bb5f-40441049fde2/containers/control-plane-machine-set-operator/f3261b51 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Feb 26 15:42:16 crc restorecon[4595]: /var/lib/kubelet/pods/6731426b-95fe-49ff-bb5f-40441049fde2/containers/control-plane-machine-set-operator/315d045e not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Feb 26 15:42:16 crc restorecon[4595]: /var/lib/kubelet/pods/6731426b-95fe-49ff-bb5f-40441049fde2/containers/control-plane-machine-set-operator/5fdcf278 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Feb 26 15:42:16 crc restorecon[4595]: /var/lib/kubelet/pods/6731426b-95fe-49ff-bb5f-40441049fde2/containers/control-plane-machine-set-operator/d053f757 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Feb 26 15:42:16 crc restorecon[4595]: /var/lib/kubelet/pods/6731426b-95fe-49ff-bb5f-40441049fde2/containers/control-plane-machine-set-operator/c2850dc7 not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Feb 26 15:42:16 crc restorecon[4595]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/volumes/kubernetes.io~configmap/marketplace-trusted-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 26 15:42:16 crc restorecon[4595]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/volumes/kubernetes.io~configmap/marketplace-trusted-ca/..2025_02_23_05_22_30.2390596521 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 26 15:42:16 crc restorecon[4595]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/volumes/kubernetes.io~configmap/marketplace-trusted-ca/..2025_02_23_05_22_30.2390596521/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 26 15:42:16 crc restorecon[4595]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/volumes/kubernetes.io~configmap/marketplace-trusted-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 26 15:42:16 crc restorecon[4595]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/volumes/kubernetes.io~configmap/marketplace-trusted-ca/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 26 15:42:16 crc restorecon[4595]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 26 15:42:16 crc restorecon[4595]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/fcfb0b2b not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 26 15:42:16 crc restorecon[4595]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/c7ac9b7d not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 26 15:42:16 crc 
restorecon[4595]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/fa0c0d52 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 26 15:42:16 crc restorecon[4595]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/c609b6ba not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 26 15:42:16 crc restorecon[4595]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/2be6c296 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 26 15:42:16 crc restorecon[4595]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/89a32653 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 26 15:42:16 crc restorecon[4595]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/4eb9afeb not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 26 15:42:16 crc restorecon[4595]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/13af6efa not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 26 15:42:16 crc restorecon[4595]: /var/lib/kubelet/pods/b6312bbd-5731-4ea0-a20f-81d5a57df44a/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Feb 26 15:42:16 crc restorecon[4595]: /var/lib/kubelet/pods/b6312bbd-5731-4ea0-a20f-81d5a57df44a/containers/olm-operator/b03f9724 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Feb 26 15:42:16 crc restorecon[4595]: /var/lib/kubelet/pods/b6312bbd-5731-4ea0-a20f-81d5a57df44a/containers/olm-operator/e3d105cc not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Feb 26 15:42:16 crc restorecon[4595]: 
/var/lib/kubelet/pods/b6312bbd-5731-4ea0-a20f-81d5a57df44a/containers/olm-operator/3aed4d83 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Feb 26 15:42:16 crc restorecon[4595]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18 Feb 26 15:42:16 crc restorecon[4595]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.1906041176 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18 Feb 26 15:42:16 crc restorecon[4595]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.1906041176/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18 Feb 26 15:42:16 crc restorecon[4595]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18 Feb 26 15:42:16 crc restorecon[4595]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18 Feb 26 15:42:16 crc restorecon[4595]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18 Feb 26 15:42:16 crc restorecon[4595]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/containers/kube-storage-version-migrator-operator/0765fa6e not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18 Feb 26 15:42:16 crc restorecon[4595]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/containers/kube-storage-version-migrator-operator/2cefc627 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c2,c18 Feb 26 15:42:16 crc restorecon[4595]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/containers/kube-storage-version-migrator-operator/3dcc6345 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18 Feb 26 15:42:16 crc restorecon[4595]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/containers/kube-storage-version-migrator-operator/365af391 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18 Feb 26 15:42:16 crc restorecon[4595]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes/kubernetes.io~empty-dir/available-featuregates not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Feb 26 15:42:16 crc restorecon[4595]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes/kubernetes.io~empty-dir/available-featuregates/featureGate-SelfManagedHA-Default.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Feb 26 15:42:16 crc restorecon[4595]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes/kubernetes.io~empty-dir/available-featuregates/featureGate-SelfManagedHA-TechPreviewNoUpgrade.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Feb 26 15:42:16 crc restorecon[4595]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes/kubernetes.io~empty-dir/available-featuregates/featureGate-SelfManagedHA-DevPreviewNoUpgrade.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Feb 26 15:42:16 crc restorecon[4595]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes/kubernetes.io~empty-dir/available-featuregates/featureGate-Hypershift-TechPreviewNoUpgrade.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Feb 26 15:42:16 crc restorecon[4595]: 
/var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes/kubernetes.io~empty-dir/available-featuregates/featureGate-Hypershift-DevPreviewNoUpgrade.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Feb 26 15:42:16 crc restorecon[4595]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes/kubernetes.io~empty-dir/available-featuregates/featureGate-Hypershift-Default.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Feb 26 15:42:16 crc restorecon[4595]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Feb 26 15:42:16 crc restorecon[4595]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-api/b1130c0f not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Feb 26 15:42:16 crc restorecon[4595]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-api/236a5913 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Feb 26 15:42:16 crc restorecon[4595]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-api/b9432e26 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Feb 26 15:42:16 crc restorecon[4595]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-config-operator/5ddb0e3f not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Feb 26 15:42:16 crc restorecon[4595]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-config-operator/986dc4fd not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Feb 26 15:42:16 crc restorecon[4595]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-config-operator/8a23ff9a not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c9,c12 Feb 26 15:42:16 crc restorecon[4595]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-config-operator/9728ae68 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Feb 26 15:42:16 crc restorecon[4595]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-config-operator/665f31d0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Feb 26 15:42:16 crc restorecon[4595]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Feb 26 15:42:16 crc restorecon[4595]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.1255385357 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Feb 26 15:42:16 crc restorecon[4595]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.1255385357/operator-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Feb 26 15:42:16 crc restorecon[4595]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Feb 26 15:42:16 crc restorecon[4595]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/config/operator-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Feb 26 15:42:16 crc restorecon[4595]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/service-ca-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Feb 26 15:42:16 crc restorecon[4595]: 
/var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/service-ca-bundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Feb 26 15:42:16 crc restorecon[4595]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/service-ca-bundle/..2025_02_23_05_23_57.573792656 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Feb 26 15:42:16 crc restorecon[4595]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/service-ca-bundle/..2025_02_23_05_23_57.573792656/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Feb 26 15:42:16 crc restorecon[4595]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/service-ca-bundle/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Feb 26 15:42:16 crc restorecon[4595]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/trusted-ca-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Feb 26 15:42:16 crc restorecon[4595]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/trusted-ca-bundle/..2025_02_23_05_22_30.3254245399 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Feb 26 15:42:16 crc restorecon[4595]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/trusted-ca-bundle/..2025_02_23_05_22_30.3254245399/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Feb 26 15:42:16 crc restorecon[4595]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/trusted-ca-bundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Feb 26 15:42:16 crc 
restorecon[4595]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/trusted-ca-bundle/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Feb 26 15:42:16 crc restorecon[4595]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Feb 26 15:42:16 crc restorecon[4595]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/containers/authentication-operator/136c9b42 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Feb 26 15:42:16 crc restorecon[4595]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/containers/authentication-operator/98a1575b not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Feb 26 15:42:16 crc restorecon[4595]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/containers/authentication-operator/cac69136 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Feb 26 15:42:16 crc restorecon[4595]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/containers/authentication-operator/5deb77a7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Feb 26 15:42:16 crc restorecon[4595]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/containers/authentication-operator/2ae53400 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Feb 26 15:42:16 crc restorecon[4595]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Feb 26 15:42:16 crc restorecon[4595]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.3608339744 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Feb 26 
15:42:16 crc restorecon[4595]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.3608339744/operator-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Feb 26 15:42:16 crc restorecon[4595]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Feb 26 15:42:16 crc restorecon[4595]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/volumes/kubernetes.io~configmap/config/operator-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Feb 26 15:42:16 crc restorecon[4595]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Feb 26 15:42:16 crc restorecon[4595]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/containers/service-ca-operator/e46f2326 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Feb 26 15:42:16 crc restorecon[4595]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/containers/service-ca-operator/dc688d3c not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Feb 26 15:42:16 crc restorecon[4595]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/containers/service-ca-operator/3497c3cd not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Feb 26 15:42:16 crc restorecon[4595]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/containers/service-ca-operator/177eb008 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Feb 26 15:42:16 crc restorecon[4595]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/volumes/kubernetes.io~configmap/config not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c2,c13 Feb 26 15:42:16 crc restorecon[4595]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.3819292994 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13 Feb 26 15:42:16 crc restorecon[4595]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.3819292994/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13 Feb 26 15:42:16 crc restorecon[4595]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13 Feb 26 15:42:16 crc restorecon[4595]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13 Feb 26 15:42:16 crc restorecon[4595]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13 Feb 26 15:42:16 crc restorecon[4595]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/containers/openshift-apiserver-operator/af5a2afa not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13 Feb 26 15:42:16 crc restorecon[4595]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/containers/openshift-apiserver-operator/d780cb1f not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13 Feb 26 15:42:16 crc restorecon[4595]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/containers/openshift-apiserver-operator/49b0f374 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13 Feb 26 15:42:16 crc restorecon[4595]: 
/var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/containers/openshift-apiserver-operator/26fbb125 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13 Feb 26 15:42:16 crc restorecon[4595]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/volumes/kubernetes.io~configmap/trusted-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Feb 26 15:42:16 crc restorecon[4595]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_23_05_22_30.3244779536 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Feb 26 15:42:16 crc restorecon[4595]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_23_05_22_30.3244779536/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Feb 26 15:42:16 crc restorecon[4595]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/volumes/kubernetes.io~configmap/trusted-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Feb 26 15:42:16 crc restorecon[4595]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/volumes/kubernetes.io~configmap/trusted-ca/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Feb 26 15:42:16 crc restorecon[4595]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Feb 26 15:42:16 crc restorecon[4595]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/cf14125a not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Feb 26 15:42:16 crc restorecon[4595]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/b7f86972 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c5,c11 Feb 26 15:42:16 crc restorecon[4595]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/e51d739c not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Feb 26 15:42:16 crc restorecon[4595]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/88ba6a69 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Feb 26 15:42:16 crc restorecon[4595]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/669a9acf not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Feb 26 15:42:16 crc restorecon[4595]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/5cd51231 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Feb 26 15:42:16 crc restorecon[4595]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/75349ec7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Feb 26 15:42:16 crc restorecon[4595]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/15c26839 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Feb 26 15:42:16 crc restorecon[4595]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/45023dcd not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Feb 26 15:42:16 crc restorecon[4595]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/2bb66a50 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Feb 26 15:42:16 crc restorecon[4595]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/kube-rbac-proxy/64d03bdd not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Feb 26 15:42:16 crc 
restorecon[4595]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/kube-rbac-proxy/ab8e7ca0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Feb 26 15:42:16 crc restorecon[4595]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/kube-rbac-proxy/bb9be25f not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Feb 26 15:42:16 crc restorecon[4595]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/volumes/kubernetes.io~configmap/trusted-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 26 15:42:16 crc restorecon[4595]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_23_05_22_30.2034221258 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 26 15:42:16 crc restorecon[4595]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_23_05_22_30.2034221258/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 26 15:42:16 crc restorecon[4595]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/volumes/kubernetes.io~configmap/trusted-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 26 15:42:16 crc restorecon[4595]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/volumes/kubernetes.io~configmap/trusted-ca/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 26 15:42:16 crc restorecon[4595]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 26 15:42:16 crc restorecon[4595]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/containers/cluster-image-registry-operator/9a0b61d3 not reset as customized by admin 
to system_u:object_r:container_file_t:s0:c10,c16 Feb 26 15:42:16 crc restorecon[4595]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/containers/cluster-image-registry-operator/d471b9d2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 26 15:42:16 crc restorecon[4595]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/containers/cluster-image-registry-operator/8cb76b8e not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 26 15:42:16 crc restorecon[4595]: /var/lib/kubelet/pods/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Feb 26 15:42:16 crc restorecon[4595]: /var/lib/kubelet/pods/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9/containers/catalog-operator/11a00840 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Feb 26 15:42:16 crc restorecon[4595]: /var/lib/kubelet/pods/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9/containers/catalog-operator/ec355a92 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Feb 26 15:42:16 crc restorecon[4595]: /var/lib/kubelet/pods/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9/containers/catalog-operator/992f735e not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Feb 26 15:42:16 crc restorecon[4595]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Feb 26 15:42:16 crc restorecon[4595]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.1782968797 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Feb 26 15:42:16 crc restorecon[4595]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.1782968797/config.yaml not 
reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Feb 26 15:42:16 crc restorecon[4595]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Feb 26 15:42:16 crc restorecon[4595]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Feb 26 15:42:16 crc restorecon[4595]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Feb 26 15:42:16 crc restorecon[4595]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/containers/openshift-controller-manager-operator/d59cdbbc not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Feb 26 15:42:16 crc restorecon[4595]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/containers/openshift-controller-manager-operator/72133ff0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Feb 26 15:42:16 crc restorecon[4595]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/containers/openshift-controller-manager-operator/c56c834c not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Feb 26 15:42:16 crc restorecon[4595]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/containers/openshift-controller-manager-operator/d13724c7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Feb 26 15:42:16 crc restorecon[4595]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/containers/openshift-controller-manager-operator/0a498258 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Feb 26 15:42:16 crc restorecon[4595]: 
/var/lib/kubelet/pods/5fe579f8-e8a6-4643-bce5-a661393c4dde/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Feb 26 15:42:16 crc restorecon[4595]: /var/lib/kubelet/pods/5fe579f8-e8a6-4643-bce5-a661393c4dde/containers/machine-config-server/fa471982 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Feb 26 15:42:16 crc restorecon[4595]: /var/lib/kubelet/pods/5fe579f8-e8a6-4643-bce5-a661393c4dde/containers/machine-config-server/fc900d92 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Feb 26 15:42:16 crc restorecon[4595]: /var/lib/kubelet/pods/5fe579f8-e8a6-4643-bce5-a661393c4dde/containers/machine-config-server/fa7d68da not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Feb 26 15:42:16 crc restorecon[4595]: /var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c22 Feb 26 15:42:16 crc restorecon[4595]: /var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/containers/migrator/4bacf9b4 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c22 Feb 26 15:42:16 crc restorecon[4595]: /var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/containers/migrator/424021b1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c22 Feb 26 15:42:16 crc restorecon[4595]: /var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/containers/migrator/fc2e31a3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c22 Feb 26 15:42:16 crc restorecon[4595]: /var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/containers/graceful-termination/f51eefac not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c22 Feb 26 15:42:16 crc restorecon[4595]: /var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/containers/graceful-termination/c8997f2f not reset as customized 
by admin to system_u:object_r:container_file_t:s0:c9,c22 Feb 26 15:42:16 crc restorecon[4595]: /var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/containers/graceful-termination/7481f599 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c22 Feb 26 15:42:16 crc restorecon[4595]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/volumes/kubernetes.io~configmap/signing-cabundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Feb 26 15:42:16 crc restorecon[4595]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/volumes/kubernetes.io~configmap/signing-cabundle/..2025_02_23_05_22_49.2255460704 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Feb 26 15:42:16 crc restorecon[4595]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/volumes/kubernetes.io~configmap/signing-cabundle/..2025_02_23_05_22_49.2255460704/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Feb 26 15:42:16 crc restorecon[4595]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/volumes/kubernetes.io~configmap/signing-cabundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Feb 26 15:42:16 crc restorecon[4595]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/volumes/kubernetes.io~configmap/signing-cabundle/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Feb 26 15:42:16 crc restorecon[4595]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Feb 26 15:42:16 crc restorecon[4595]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/containers/service-ca-controller/fdafea19 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Feb 26 15:42:16 crc restorecon[4595]: 
/var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/containers/service-ca-controller/d0e1c571 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Feb 26 15:42:16 crc restorecon[4595]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/containers/service-ca-controller/ee398915 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Feb 26 15:42:16 crc restorecon[4595]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/containers/service-ca-controller/682bb6b8 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Feb 26 15:42:16 crc restorecon[4595]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920 Feb 26 15:42:16 crc restorecon[4595]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/setup/a3e67855 not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884 Feb 26 15:42:16 crc restorecon[4595]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/setup/a989f289 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016 Feb 26 15:42:16 crc restorecon[4595]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/setup/915431bd not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920 Feb 26 15:42:16 crc restorecon[4595]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-ensure-env-vars/7796fdab not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884 Feb 26 15:42:16 crc restorecon[4595]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-ensure-env-vars/dcdb5f19 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016 Feb 26 15:42:16 crc restorecon[4595]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-ensure-env-vars/a3aaa88c not 
reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920 Feb 26 15:42:16 crc restorecon[4595]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-resources-copy/5508e3e6 not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884 Feb 26 15:42:16 crc restorecon[4595]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-resources-copy/160585de not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016 Feb 26 15:42:16 crc restorecon[4595]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-resources-copy/e99f8da3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920 Feb 26 15:42:16 crc restorecon[4595]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcdctl/8bc85570 not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884 Feb 26 15:42:16 crc restorecon[4595]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcdctl/a5861c91 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016 Feb 26 15:42:16 crc restorecon[4595]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcdctl/84db1135 not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920 Feb 26 15:42:16 crc restorecon[4595]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd/9e1a6043 not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884 Feb 26 15:42:16 crc restorecon[4595]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd/c1aba1c2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016 Feb 26 15:42:16 crc restorecon[4595]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd/d55ccd6d not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920 Feb 26 15:42:16 crc restorecon[4595]: 
/var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-metrics/971cc9f6 not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884 Feb 26 15:42:16 crc restorecon[4595]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-metrics/8f2e3dcf not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016 Feb 26 15:42:16 crc restorecon[4595]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-metrics/ceb35e9c not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920 Feb 26 15:42:16 crc restorecon[4595]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-readyz/1c192745 not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884 Feb 26 15:42:16 crc restorecon[4595]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-readyz/5209e501 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016 Feb 26 15:42:16 crc restorecon[4595]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-readyz/f83de4df not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920 Feb 26 15:42:16 crc restorecon[4595]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-rev/e7b978ac not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884 Feb 26 15:42:16 crc restorecon[4595]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-rev/c64304a1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016 Feb 26 15:42:16 crc restorecon[4595]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-rev/5384386b not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920 Feb 26 15:42:16 crc restorecon[4595]: /var/lib/kubelet/pods/efdd0498-1daa-4136-9a4a-3b948c2293fc/etc-hosts not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c268,c620 Feb 26 15:42:16 crc restorecon[4595]: /var/lib/kubelet/pods/efdd0498-1daa-4136-9a4a-3b948c2293fc/containers/multus-admission-controller/cce3e3ff not reset as customized by admin to system_u:object_r:container_file_t:s0:c435,c756 Feb 26 15:42:16 crc restorecon[4595]: /var/lib/kubelet/pods/efdd0498-1daa-4136-9a4a-3b948c2293fc/containers/multus-admission-controller/8fb75465 not reset as customized by admin to system_u:object_r:container_file_t:s0:c268,c620 Feb 26 15:42:16 crc restorecon[4595]: /var/lib/kubelet/pods/efdd0498-1daa-4136-9a4a-3b948c2293fc/containers/kube-rbac-proxy/740f573e not reset as customized by admin to system_u:object_r:container_file_t:s0:c435,c756 Feb 26 15:42:16 crc restorecon[4595]: /var/lib/kubelet/pods/efdd0498-1daa-4136-9a4a-3b948c2293fc/containers/kube-rbac-proxy/32fd1134 not reset as customized by admin to system_u:object_r:container_file_t:s0:c268,c620 Feb 26 15:42:16 crc restorecon[4595]: /var/lib/kubelet/pods/20b0d48f-5fd6-431c-a545-e3c800c7b866/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c24 Feb 26 15:42:16 crc restorecon[4595]: /var/lib/kubelet/pods/20b0d48f-5fd6-431c-a545-e3c800c7b866/containers/serve-healthcheck-canary/0a861bd3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c24 Feb 26 15:42:16 crc restorecon[4595]: /var/lib/kubelet/pods/20b0d48f-5fd6-431c-a545-e3c800c7b866/containers/serve-healthcheck-canary/80363026 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c24 Feb 26 15:42:16 crc restorecon[4595]: /var/lib/kubelet/pods/20b0d48f-5fd6-431c-a545-e3c800c7b866/containers/serve-healthcheck-canary/bfa952a8 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c24 Feb 26 15:42:16 crc restorecon[4595]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/auth-proxy-config not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c129,c158 Feb 26 15:42:16 crc restorecon[4595]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/auth-proxy-config/..2025_02_23_05_33_31.2122464563 not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Feb 26 15:42:16 crc restorecon[4595]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/auth-proxy-config/..2025_02_23_05_33_31.2122464563/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Feb 26 15:42:16 crc restorecon[4595]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/auth-proxy-config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Feb 26 15:42:16 crc restorecon[4595]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/auth-proxy-config/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Feb 26 15:42:16 crc restorecon[4595]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Feb 26 15:42:16 crc restorecon[4595]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/config/..2025_02_23_05_33_31.333075221 not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Feb 26 15:42:16 crc restorecon[4595]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Feb 26 15:42:16 crc restorecon[4595]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Feb 26 15:42:16 crc 
restorecon[4595]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/containers/kube-rbac-proxy/793bf43d not reset as customized by admin to system_u:object_r:container_file_t:s0:c381,c387 Feb 26 15:42:16 crc restorecon[4595]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/containers/kube-rbac-proxy/7db1bb6e not reset as customized by admin to system_u:object_r:container_file_t:s0:c142,c438 Feb 26 15:42:16 crc restorecon[4595]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/containers/kube-rbac-proxy/4f6a0368 not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Feb 26 15:42:16 crc restorecon[4595]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/containers/machine-approver-controller/c12c7d86 not reset as customized by admin to system_u:object_r:container_file_t:s0:c381,c387 Feb 26 15:42:16 crc restorecon[4595]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/containers/machine-approver-controller/36c4a773 not reset as customized by admin to system_u:object_r:container_file_t:s0:c142,c438 Feb 26 15:42:16 crc restorecon[4595]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/containers/machine-approver-controller/4c1e98ae not reset as customized by admin to system_u:object_r:container_file_t:s0:c142,c438 Feb 26 15:42:16 crc restorecon[4595]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/containers/machine-approver-controller/a4c8115c not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Feb 26 15:42:16 crc restorecon[4595]: /var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c97,c980 Feb 26 15:42:16 crc restorecon[4595]: /var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/containers/setup/7db1802e not reset as customized by admin to system_u:object_r:container_file_t:s0:c97,c980 Feb 26 15:42:16 crc restorecon[4595]: 
/var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/containers/kube-apiserver/a008a7ab not reset as customized by admin to system_u:object_r:container_file_t:s0:c97,c980 Feb 26 15:42:16 crc restorecon[4595]: /var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/containers/kube-apiserver-cert-syncer/2c836bac not reset as customized by admin to system_u:object_r:container_file_t:s0:c97,c980 Feb 26 15:42:16 crc restorecon[4595]: /var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/containers/kube-apiserver-cert-regeneration-controller/0ce62299 not reset as customized by admin to system_u:object_r:container_file_t:s0:c97,c980 Feb 26 15:42:16 crc restorecon[4595]: /var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/containers/kube-apiserver-insecure-readyz/945d2457 not reset as customized by admin to system_u:object_r:container_file_t:s0:c97,c980 Feb 26 15:42:16 crc restorecon[4595]: /var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/containers/kube-apiserver-check-endpoints/7d5c1dd8 not reset as customized by admin to system_u:object_r:container_file_t:s0:c97,c980 Feb 26 15:42:16 crc restorecon[4595]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/utilities not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 26 15:42:16 crc restorecon[4595]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/utilities/copy-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 26 15:42:16 crc restorecon[4595]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 26 15:42:16 crc restorecon[4595]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Feb 26 15:42:16 crc restorecon[4595]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/3scale-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 26 15:42:16 crc restorecon[4595]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/3scale-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 26 15:42:16 crc restorecon[4595]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/advanced-cluster-management not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 26 15:42:16 crc restorecon[4595]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/advanced-cluster-management/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 26 15:42:16 crc restorecon[4595]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-broker-rhel8 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 26 15:42:16 crc restorecon[4595]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-broker-rhel8/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 26 15:42:16 crc restorecon[4595]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-online not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 26 15:42:16 crc restorecon[4595]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-online/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 26 15:42:16 crc restorecon[4595]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-streams not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 26 15:42:16 crc restorecon[4595]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-streams/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 26 15:42:16 crc restorecon[4595]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-streams-console not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 26 15:42:16 crc restorecon[4595]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-streams-console/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 26 15:42:16 crc restorecon[4595]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq7-interconnect-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 26 15:42:16 crc restorecon[4595]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq7-interconnect-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 26 15:42:16 crc restorecon[4595]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ansible-automation-platform-operator not reset as customized by 
admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 26 15:42:16 crc restorecon[4595]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ansible-automation-platform-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 26 15:42:16 crc restorecon[4595]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ansible-cloud-addons-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 26 15:42:16 crc restorecon[4595]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ansible-cloud-addons-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 26 15:42:16 crc restorecon[4595]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicast-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 26 15:42:16 crc restorecon[4595]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicast-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 26 15:42:16 crc restorecon[4595]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurio-registry-3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 26 15:42:16 crc restorecon[4595]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurio-registry-3/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 26 15:42:16 crc restorecon[4595]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/authorino-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 26 15:42:16 crc restorecon[4595]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/authorino-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 26 15:42:16 crc restorecon[4595]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aws-load-balancer-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 26 15:42:16 crc restorecon[4595]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aws-load-balancer-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 26 15:42:16 crc restorecon[4595]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bamoe-businessautomation-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 26 15:42:16 crc restorecon[4595]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bamoe-businessautomation-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 26 15:42:16 crc restorecon[4595]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bamoe-kogito-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 26 15:42:16 crc restorecon[4595]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bamoe-kogito-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 26 15:42:16 crc restorecon[4595]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bpfman-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 26 15:42:16 crc restorecon[4595]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bpfman-operator/index.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 26 15:42:16 crc restorecon[4595]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/businessautomation-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 26 15:42:16 crc restorecon[4595]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/businessautomation-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 26 15:42:16 crc restorecon[4595]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cephcsi-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 26 15:42:16 crc restorecon[4595]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cephcsi-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 26 15:42:16 crc restorecon[4595]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cincinnati-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 26 15:42:16 crc restorecon[4595]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cincinnati-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 26 15:42:16 crc restorecon[4595]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-kube-descheduler-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 26 15:42:16 crc restorecon[4595]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-kube-descheduler-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 26 15:42:16 crc restorecon[4595]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-logging not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 26 15:42:16 crc restorecon[4595]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-logging/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 26 15:42:16 crc restorecon[4595]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-observability-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 26 15:42:16 crc restorecon[4595]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-observability-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 26 15:42:16 crc restorecon[4595]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/compliance-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 26 15:42:16 crc restorecon[4595]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/compliance-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 26 15:42:16 crc restorecon[4595]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/container-security-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 26 15:42:16 crc restorecon[4595]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/container-security-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 26 15:42:16 crc restorecon[4595]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/costmanagement-metrics-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 26 15:42:16 crc restorecon[4595]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/costmanagement-metrics-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 26 15:42:16 crc restorecon[4595]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cryostat-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 26 15:42:16 crc restorecon[4595]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cryostat-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 26 15:42:16 crc restorecon[4595]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datagrid not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 26 15:42:16 crc restorecon[4595]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datagrid/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 26 15:42:16 crc restorecon[4595]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/devspaces not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 26 15:42:16 crc restorecon[4595]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/devspaces/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 26 15:42:16 crc restorecon[4595]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/devworkspace-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 26 15:42:16 crc restorecon[4595]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/devworkspace-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 26 15:42:16 crc restorecon[4595]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dpu-network-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 26 15:42:16 crc restorecon[4595]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dpu-network-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 26 15:42:16 crc restorecon[4595]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eap not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 26 15:42:16 crc restorecon[4595]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eap/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 26 15:42:16 crc restorecon[4595]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/elasticsearch-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 26 15:42:16 crc restorecon[4595]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/elasticsearch-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 26 15:42:16 crc restorecon[4595]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/external-dns-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 26 15:42:16 crc restorecon[4595]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/external-dns-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 26 15:42:16 crc restorecon[4595]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fence-agents-remediation not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 26 15:42:16 crc restorecon[4595]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fence-agents-remediation/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 26 15:42:16 crc restorecon[4595]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/file-integrity-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 26 15:42:16 crc restorecon[4595]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/file-integrity-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 26 15:42:16 crc restorecon[4595]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fuse-apicurito not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 26 15:42:16 crc restorecon[4595]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fuse-apicurito/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 26 15:42:16 crc restorecon[4595]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fuse-console not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 26 15:42:16 crc restorecon[4595]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fuse-console/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 26 15:42:16 crc restorecon[4595]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fuse-online not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 26 15:42:16 crc restorecon[4595]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fuse-online/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 26 15:42:16 crc restorecon[4595]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/gatekeeper-operator-product not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 26 15:42:16 crc restorecon[4595]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/gatekeeper-operator-product/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 26 15:42:16 crc restorecon[4595]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/jaeger-product not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 26 15:42:16 crc restorecon[4595]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/jaeger-product/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 26 15:42:16 crc restorecon[4595]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/jws-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 26 15:42:16 crc restorecon[4595]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/jws-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 26 15:42:16 crc restorecon[4595]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kernel-module-management not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 26 15:42:16 crc restorecon[4595]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kernel-module-management/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 26 15:42:16 crc restorecon[4595]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kernel-module-management-hub not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 26 15:42:16 crc restorecon[4595]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kernel-module-management-hub/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 26 15:42:16 crc restorecon[4595]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kiali-ossm not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 26 15:42:16 crc restorecon[4595]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kiali-ossm/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 26 15:42:16 crc restorecon[4595]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubevirt-hyperconverged not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 26 15:42:16 crc restorecon[4595]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubevirt-hyperconverged/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 26 15:42:16 crc restorecon[4595]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/logic-operator-rhel8 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 26 15:42:16 crc restorecon[4595]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/logic-operator-rhel8/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 26 15:42:16 crc restorecon[4595]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/loki-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 26 15:42:16 crc restorecon[4595]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/loki-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 26 15:42:16 crc restorecon[4595]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lvms-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 26 15:42:16 crc restorecon[4595]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lvms-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 26 15:42:16 crc restorecon[4595]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/machine-deletion-remediation not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 26 15:42:16 crc restorecon[4595]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/machine-deletion-remediation/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 26 15:42:16 crc restorecon[4595]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mcg-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 26 15:42:16 crc restorecon[4595]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mcg-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 26 15:42:16 crc restorecon[4595]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mta-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 26 15:42:16 crc restorecon[4595]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mta-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 26 15:42:16 crc restorecon[4595]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mtc-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 26 15:42:16 crc restorecon[4595]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mtc-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 26 15:42:16 crc restorecon[4595]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mtr-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 26 15:42:16 crc restorecon[4595]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mtr-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 26 15:42:16 crc restorecon[4595]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mtv-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 26 15:42:16 crc restorecon[4595]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mtv-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 26 15:42:16 crc restorecon[4595]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multicluster-engine not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 26 15:42:16 crc restorecon[4595]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multicluster-engine/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 26 15:42:16 crc restorecon[4595]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netobserv-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 26 15:42:16 crc restorecon[4595]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netobserv-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 26 15:42:16 crc restorecon[4595]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-healthcheck-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 26 15:42:16 crc restorecon[4595]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-healthcheck-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 26 15:42:16 crc restorecon[4595]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-maintenance-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 26 15:42:16 crc restorecon[4595]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-maintenance-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 26 15:42:16 crc restorecon[4595]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-observability-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 26 15:42:16 crc restorecon[4595]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-observability-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 26 15:42:16 crc restorecon[4595]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ocs-client-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 26 15:42:16 crc restorecon[4595]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ocs-client-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 26 15:42:16 crc restorecon[4595]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ocs-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 26 15:42:16 crc restorecon[4595]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ocs-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 26 15:42:16 crc restorecon[4595]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-csi-addons-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 26 15:42:16 crc restorecon[4595]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-csi-addons-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 26 15:42:16 crc restorecon[4595]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-multicluster-orchestrator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 26 15:42:16 crc restorecon[4595]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-multicluster-orchestrator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 26 15:42:16 crc restorecon[4595]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 26 15:42:16 crc restorecon[4595]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 26 15:42:16 crc restorecon[4595]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-prometheus-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 26 15:42:16 crc restorecon[4595]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-prometheus-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 26 15:42:16 crc restorecon[4595]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odr-cluster-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 26 15:42:16 crc restorecon[4595]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odr-cluster-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 26 15:42:16 crc restorecon[4595]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odr-hub-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 26 15:42:16 crc restorecon[4595]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odr-hub-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 26 15:42:16 crc restorecon[4595]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-cert-manager-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 26 15:42:16 crc restorecon[4595]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-cert-manager-operator/bundle-v1.15.0.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 26 15:42:16 crc restorecon[4595]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-cert-manager-operator/channel.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 26 15:42:16 crc restorecon[4595]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-cert-manager-operator/package.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 26 15:42:16 crc restorecon[4595]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-custom-metrics-autoscaler-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 26 15:42:16 crc restorecon[4595]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-custom-metrics-autoscaler-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 26 15:42:16 crc restorecon[4595]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-gitops-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 26 15:42:16 crc restorecon[4595]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-gitops-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 26 15:42:16 crc restorecon[4595]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-pipelines-operator-rh not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 26 15:42:16 crc restorecon[4595]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-pipelines-operator-rh/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 26 15:42:16 crc restorecon[4595]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-secondary-scheduler-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 26 15:42:16 crc restorecon[4595]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-secondary-scheduler-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 26 15:42:16 crc restorecon[4595]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/opentelemetry-product not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 26 15:42:16 crc restorecon[4595]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/opentelemetry-product/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 26 15:42:16 crc restorecon[4595]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/quay-bridge-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 26 15:42:16 crc restorecon[4595]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/quay-bridge-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 26 15:42:16 crc restorecon[4595]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/quay-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 26 15:42:16 crc restorecon[4595]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/quay-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 26 15:42:16 crc restorecon[4595]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/recipe not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 26 15:42:16 crc restorecon[4595]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/recipe/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 26 15:42:16 crc restorecon[4595]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/red-hat-camel-k not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 26 15:42:16 crc restorecon[4595]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/red-hat-camel-k/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 26 15:42:16 crc restorecon[4595]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/red-hat-hawtio-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 26 15:42:16 crc restorecon[4595]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/red-hat-hawtio-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 26 15:42:16 crc restorecon[4595]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redhat-oadp-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 26 15:42:16 crc restorecon[4595]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redhat-oadp-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 26 15:42:16 crc restorecon[4595]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rh-service-binding-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 26 15:42:16 crc restorecon[4595]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rh-service-binding-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 26 15:42:16 crc restorecon[4595]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhacs-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 26 15:42:16 crc restorecon[4595]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhacs-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 26 15:42:16 crc restorecon[4595]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhbk-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 26 15:42:16 crc restorecon[4595]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhbk-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 26 15:42:16 crc restorecon[4595]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhdh not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 26 15:42:16 crc restorecon[4595]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhdh/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 26 15:42:16 crc restorecon[4595]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhods-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 26 15:42:16 crc restorecon[4595]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhods-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 26 15:42:16
crc restorecon[4595]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhods-prometheus-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 26 15:42:16 crc restorecon[4595]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhods-prometheus-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 26 15:42:16 crc restorecon[4595]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhpam-kogito-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 26 15:42:16 crc restorecon[4595]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhpam-kogito-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 26 15:42:16 crc restorecon[4595]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhsso-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 26 15:42:16 crc restorecon[4595]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhsso-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 26 15:42:16 crc restorecon[4595]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rook-ceph-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 26 15:42:16 crc restorecon[4595]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rook-ceph-operator/catalog.json not 
reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 26 15:42:16 crc restorecon[4595]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/run-once-duration-override-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 26 15:42:16 crc restorecon[4595]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/run-once-duration-override-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 26 15:42:16 crc restorecon[4595]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sandboxed-containers-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 26 15:42:16 crc restorecon[4595]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sandboxed-containers-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 26 15:42:16 crc restorecon[4595]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/security-profiles-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 26 15:42:16 crc restorecon[4595]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/security-profiles-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 26 15:42:16 crc restorecon[4595]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/self-node-remediation not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 26 15:42:16 crc restorecon[4595]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/self-node-remediation/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 26 15:42:16 crc restorecon[4595]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/serverless-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 26 15:42:16 crc restorecon[4595]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/serverless-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 26 15:42:16 crc restorecon[4595]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/service-registry-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 26 15:42:16 crc restorecon[4595]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/service-registry-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 26 15:42:16 crc restorecon[4595]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/servicemeshoperator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 26 15:42:16 crc restorecon[4595]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/servicemeshoperator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 26 15:42:16 crc restorecon[4595]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/servicemeshoperator3 not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 26 15:42:16 crc restorecon[4595]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/servicemeshoperator3/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 26 15:42:16 crc restorecon[4595]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/skupper-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 26 15:42:16 crc restorecon[4595]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/skupper-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 26 15:42:16 crc restorecon[4595]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/submariner not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 26 15:42:16 crc restorecon[4595]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/submariner/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 26 15:42:16 crc restorecon[4595]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tang-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 26 15:42:16 crc restorecon[4595]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tang-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 26 15:42:16 crc restorecon[4595]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tempo-product not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 26 15:42:16 crc restorecon[4595]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tempo-product/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 26 15:42:16 crc restorecon[4595]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/trustee-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 26 15:42:16 crc restorecon[4595]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/trustee-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 26 15:42:16 crc restorecon[4595]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/volsync-product not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 26 15:42:16 crc restorecon[4595]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/volsync-product/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 26 15:42:16 crc restorecon[4595]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/web-terminal not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 26 15:42:16 crc restorecon[4595]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/web-terminal/catalog.json not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Feb 26 15:42:16 crc restorecon[4595]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 26 15:42:16 crc restorecon[4595]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 26 15:42:16 crc restorecon[4595]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 26 15:42:16 crc restorecon[4595]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 26 15:42:16 crc restorecon[4595]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 26 15:42:16 crc restorecon[4595]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/db.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 26 15:42:16 crc restorecon[4595]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/index.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 26 15:42:16 crc restorecon[4595]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/main.pix not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Feb 26 15:42:16 crc restorecon[4595]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/overflow.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 26 15:42:16 crc restorecon[4595]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/digest not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 26 15:42:16 crc restorecon[4595]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 26 15:42:16 crc restorecon[4595]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/extract-utilities/bc8d0691 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 26 15:42:16 crc restorecon[4595]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/extract-utilities/6b76097a not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 26 15:42:16 crc restorecon[4595]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/extract-utilities/34d1af30 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 26 15:42:16 crc restorecon[4595]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/extract-content/312ba61c not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 26 15:42:16 crc restorecon[4595]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/extract-content/645d5dd1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 26 15:42:16 crc restorecon[4595]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/extract-content/16e825f0 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Feb 26 15:42:16 crc restorecon[4595]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/registry-server/4cf51fc9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 26 15:42:16 crc restorecon[4595]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/registry-server/2a23d348 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 26 15:42:16 crc restorecon[4595]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/registry-server/075dbd49 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 26 15:42:16 crc restorecon[4595]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Feb 26 15:42:16 crc restorecon[4595]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Feb 26 15:42:16 crc restorecon[4595]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/..2025_02_24_06_09_13.3521195566 not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Feb 26 15:42:16 crc restorecon[4595]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/..2025_02_24_06_09_13.3521195566/image-registry.openshift-image-registry.svc..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Feb 26 15:42:16 crc restorecon[4595]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/..2025_02_24_06_09_13.3521195566/image-registry.openshift-image-registry.svc.cluster.local..5000 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c842,c986 Feb 26 15:42:16 crc restorecon[4595]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/..2025_02_24_06_09_13.3521195566/default-route-openshift-image-registry.apps-crc.testing not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Feb 26 15:42:16 crc restorecon[4595]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/default-route-openshift-image-registry.apps-crc.testing not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Feb 26 15:42:16 crc restorecon[4595]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/image-registry.openshift-image-registry.svc..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Feb 26 15:42:16 crc restorecon[4595]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/image-registry.openshift-image-registry.svc.cluster.local..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Feb 26 15:42:16 crc restorecon[4595]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Feb 26 15:42:16 crc restorecon[4595]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/containers/node-ca/dd585ddd not reset as customized by admin to system_u:object_r:container_file_t:s0:c377,c642 Feb 26 15:42:16 crc restorecon[4595]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/containers/node-ca/17ebd0ab not reset as customized by admin to system_u:object_r:container_file_t:s0:c338,c343 Feb 26 15:42:16 crc restorecon[4595]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/containers/node-ca/005579f4 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c842,c986 Feb 26 15:42:16 crc restorecon[4595]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/etcd-serving-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Feb 26 15:42:16 crc restorecon[4595]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/etcd-serving-ca/..2025_02_23_05_23_11.449897510 not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Feb 26 15:42:16 crc restorecon[4595]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/etcd-serving-ca/..2025_02_23_05_23_11.449897510/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Feb 26 15:42:16 crc restorecon[4595]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/etcd-serving-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Feb 26 15:42:16 crc restorecon[4595]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/etcd-serving-ca/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Feb 26 15:42:16 crc restorecon[4595]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/trusted-ca-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Feb 26 15:42:16 crc restorecon[4595]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/trusted-ca-bundle/..2025_02_23_05_23_11.1287037894 not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Feb 26 15:42:16 crc restorecon[4595]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/trusted-ca-bundle/..data not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c764,c897 Feb 26 15:42:16 crc restorecon[4595]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/audit-policies not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Feb 26 15:42:16 crc restorecon[4595]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/audit-policies/..2025_02_23_05_23_11.1301053334 not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Feb 26 15:42:16 crc restorecon[4595]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/audit-policies/..2025_02_23_05_23_11.1301053334/policy.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Feb 26 15:42:16 crc restorecon[4595]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/audit-policies/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Feb 26 15:42:16 crc restorecon[4595]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/audit-policies/policy.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Feb 26 15:42:16 crc restorecon[4595]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Feb 26 15:42:16 crc restorecon[4595]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/containers/fix-audit-permissions/bf5f3b9c not reset as customized by admin to system_u:object_r:container_file_t:s0:c49,c263 Feb 26 15:42:16 crc restorecon[4595]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/containers/fix-audit-permissions/af276eb7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c701 Feb 26 15:42:16 crc restorecon[4595]: 
/var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/containers/fix-audit-permissions/ea28e322 not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Feb 26 15:42:16 crc restorecon[4595]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/containers/oauth-apiserver/692e6683 not reset as customized by admin to system_u:object_r:container_file_t:s0:c49,c263 Feb 26 15:42:16 crc restorecon[4595]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/containers/oauth-apiserver/871746a7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c701 Feb 26 15:42:16 crc restorecon[4595]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/containers/oauth-apiserver/4eb2e958 not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Feb 26 15:42:16 crc restorecon[4595]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/console-config not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Feb 26 15:42:16 crc restorecon[4595]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/console-config/..2025_02_24_06_09_06.2875086261 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Feb 26 15:42:16 crc restorecon[4595]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/console-config/..2025_02_24_06_09_06.2875086261/console-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Feb 26 15:42:16 crc restorecon[4595]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/console-config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Feb 26 15:42:16 crc restorecon[4595]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/console-config/console-config.yaml not 
reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Feb 26 15:42:16 crc restorecon[4595]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/trusted-ca-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Feb 26 15:42:16 crc restorecon[4595]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/trusted-ca-bundle/..2025_02_24_06_09_06.286118152 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Feb 26 15:42:16 crc restorecon[4595]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/trusted-ca-bundle/..2025_02_24_06_09_06.286118152/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Feb 26 15:42:16 crc restorecon[4595]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/trusted-ca-bundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Feb 26 15:42:16 crc restorecon[4595]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/trusted-ca-bundle/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Feb 26 15:42:16 crc restorecon[4595]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/oauth-serving-cert not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Feb 26 15:42:16 crc restorecon[4595]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/oauth-serving-cert/..2025_02_24_06_09_06.3865795478 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Feb 26 15:42:16 crc restorecon[4595]: 
/var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/oauth-serving-cert/..2025_02_24_06_09_06.3865795478/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Feb 26 15:42:16 crc restorecon[4595]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/oauth-serving-cert/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Feb 26 15:42:16 crc restorecon[4595]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/oauth-serving-cert/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Feb 26 15:42:16 crc restorecon[4595]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/service-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Feb 26 15:42:16 crc restorecon[4595]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/service-ca/..2025_02_24_06_09_06.584414814 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Feb 26 15:42:16 crc restorecon[4595]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/service-ca/..2025_02_24_06_09_06.584414814/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Feb 26 15:42:16 crc restorecon[4595]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/service-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Feb 26 15:42:16 crc restorecon[4595]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/service-ca/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Feb 26 15:42:16 crc restorecon[4595]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Feb 26 15:42:16 crc restorecon[4595]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/containers/console/ca9b62da not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Feb 26 15:42:16 crc restorecon[4595]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/containers/console/0edd6fce not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Feb 26 15:42:16 crc restorecon[4595]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22
Feb 26 15:42:16 crc restorecon[4595]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.2406383837 not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22
Feb 26 15:42:16 crc restorecon[4595]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.2406383837/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22
Feb 26 15:42:16 crc restorecon[4595]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.2406383837/openshift-controller-manager.client-ca.configmap not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22
Feb 26 15:42:16 crc restorecon[4595]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.2406383837/openshift-controller-manager.openshift-global-ca.configmap not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22
Feb 26 15:42:16 crc restorecon[4595]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.2406383837/openshift-controller-manager.serving-cert.secret not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22
Feb 26 15:42:16 crc restorecon[4595]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22
Feb 26 15:42:16 crc restorecon[4595]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22
Feb 26 15:42:16 crc restorecon[4595]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/openshift-controller-manager.client-ca.configmap not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22
Feb 26 15:42:16 crc restorecon[4595]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/openshift-controller-manager.openshift-global-ca.configmap not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22
Feb 26 15:42:16 crc restorecon[4595]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/openshift-controller-manager.serving-cert.secret not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22
Feb 26 15:42:16 crc restorecon[4595]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/client-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22
Feb 26 15:42:16 crc restorecon[4595]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/client-ca/..2025_02_24_06_20_07.1071801880 not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22
Feb 26 15:42:16 crc restorecon[4595]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/client-ca/..2025_02_24_06_20_07.1071801880/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22
Feb 26 15:42:16 crc restorecon[4595]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/client-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22
Feb 26 15:42:16 crc restorecon[4595]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/client-ca/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22
Feb 26 15:42:16 crc restorecon[4595]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/proxy-ca-bundles not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22
Feb 26 15:42:16 crc restorecon[4595]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/proxy-ca-bundles/..2025_02_24_06_20_07.2494444877 not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22
Feb 26 15:42:16 crc restorecon[4595]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/proxy-ca-bundles/..2025_02_24_06_20_07.2494444877/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22
Feb 26 15:42:16 crc restorecon[4595]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/proxy-ca-bundles/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22
Feb 26 15:42:16 crc restorecon[4595]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/proxy-ca-bundles/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22
Feb 26 15:42:16 crc restorecon[4595]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22
Feb 26 15:42:16 crc restorecon[4595]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/containers/controller-manager/89b4555f not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22
Feb 26 15:42:16 crc restorecon[4595]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/volumes/kubernetes.io~configmap/config-volume not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972
Feb 26 15:42:16 crc restorecon[4595]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/volumes/kubernetes.io~configmap/config-volume/..2025_02_23_05_23_22.4071100442 not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972
Feb 26 15:42:16 crc restorecon[4595]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/volumes/kubernetes.io~configmap/config-volume/..2025_02_23_05_23_22.4071100442/Corefile not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972
Feb 26 15:42:16 crc restorecon[4595]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/volumes/kubernetes.io~configmap/config-volume/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972
Feb 26 15:42:16 crc restorecon[4595]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/volumes/kubernetes.io~configmap/config-volume/Corefile not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972
Feb 26 15:42:16 crc restorecon[4595]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972
Feb 26 15:42:16 crc restorecon[4595]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/containers/dns/655fcd71 not reset as customized by admin to system_u:object_r:container_file_t:s0:c457,c841
Feb 26 15:42:16 crc restorecon[4595]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/containers/dns/0d43c002 not reset as customized by admin to system_u:object_r:container_file_t:s0:c55,c1022
Feb 26 15:42:16 crc restorecon[4595]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/containers/dns/e68efd17 not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972
Feb 26 15:42:16 crc restorecon[4595]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/containers/kube-rbac-proxy/9acf9b65 not reset as customized by admin to system_u:object_r:container_file_t:s0:c457,c841
Feb 26 15:42:16 crc restorecon[4595]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/containers/kube-rbac-proxy/5ae3ff11 not reset as customized by admin to system_u:object_r:container_file_t:s0:c55,c1022
Feb 26 15:42:16 crc restorecon[4595]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/containers/kube-rbac-proxy/1e59206a not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972
Feb 26 15:42:16 crc restorecon[4595]: /var/lib/kubelet/pods/44663579-783b-4372-86d6-acf235a62d72/containers/dns-node-resolver/27af16d1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c304,c1017
Feb 26 15:42:16 crc restorecon[4595]: /var/lib/kubelet/pods/44663579-783b-4372-86d6-acf235a62d72/containers/dns-node-resolver/7918e729 not reset as customized by admin to system_u:object_r:container_file_t:s0:c853,c893
Feb 26 15:42:16 crc restorecon[4595]: /var/lib/kubelet/pods/44663579-783b-4372-86d6-acf235a62d72/containers/dns-node-resolver/5d976d0e not reset as customized by admin to system_u:object_r:container_file_t:s0:c585,c981
Feb 26 15:42:16 crc restorecon[4595]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25
Feb 26 15:42:16 crc restorecon[4595]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/config/..2025_02_23_05_38_56.1112187283 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25
Feb 26 15:42:16 crc restorecon[4595]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/config/..2025_02_23_05_38_56.1112187283/controller-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25
Feb 26 15:42:16 crc restorecon[4595]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25
Feb 26 15:42:16 crc restorecon[4595]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/config/controller-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25
Feb 26 15:42:16 crc restorecon[4595]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/trusted-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25
Feb 26 15:42:16 crc restorecon[4595]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_23_05_38_56.2839772658 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25
Feb 26 15:42:16 crc restorecon[4595]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_23_05_38_56.2839772658/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25
Feb 26 15:42:16 crc restorecon[4595]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/trusted-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25
Feb 26 15:42:16 crc restorecon[4595]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/trusted-ca/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25
Feb 26 15:42:16 crc restorecon[4595]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25
Feb 26 15:42:16 crc restorecon[4595]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/containers/console-operator/d7f55cbb not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25
Feb 26 15:42:16 crc restorecon[4595]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/containers/console-operator/f0812073 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25
Feb 26 15:42:16 crc restorecon[4595]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/containers/console-operator/1a56cbeb not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25
Feb 26 15:42:16 crc restorecon[4595]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/containers/console-operator/7fdd437e not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25
Feb 26 15:42:16 crc restorecon[4595]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/containers/console-operator/cdfb5652 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25
Feb 26 15:42:16 crc restorecon[4595]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/etcd-serving-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Feb 26 15:42:16 crc restorecon[4595]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/etcd-serving-ca/..2025_02_24_06_17_29.3844392896 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Feb 26 15:42:16 crc restorecon[4595]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/etcd-serving-ca/..2025_02_24_06_17_29.3844392896/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Feb 26 15:42:16 crc restorecon[4595]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/etcd-serving-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Feb 26 15:42:16 crc restorecon[4595]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/etcd-serving-ca/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Feb 26 15:42:16 crc restorecon[4595]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Feb 26 15:42:16 crc restorecon[4595]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/config/..2025_02_24_06_17_29.848549803 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Feb 26 15:42:16 crc restorecon[4595]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/config/..2025_02_24_06_17_29.848549803/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Feb 26 15:42:16 crc restorecon[4595]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Feb 26 15:42:16 crc restorecon[4595]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Feb 26 15:42:16 crc restorecon[4595]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/audit not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Feb 26 15:42:16 crc restorecon[4595]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/audit/..2025_02_24_06_17_29.780046231 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Feb 26 15:42:16 crc restorecon[4595]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/audit/..2025_02_24_06_17_29.780046231/policy.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Feb 26 15:42:16 crc restorecon[4595]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/audit/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Feb 26 15:42:16 crc restorecon[4595]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/audit/policy.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Feb 26 15:42:16 crc restorecon[4595]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Feb 26 15:42:16 crc restorecon[4595]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/..2025_02_24_06_17_29.2926008347 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Feb 26 15:42:16 crc restorecon[4595]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/..2025_02_24_06_17_29.2926008347/image-registry.openshift-image-registry.svc..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Feb 26 15:42:16 crc restorecon[4595]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/..2025_02_24_06_17_29.2926008347/image-registry.openshift-image-registry.svc.cluster.local..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Feb 26 15:42:16 crc restorecon[4595]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/..2025_02_24_06_17_29.2926008347/default-route-openshift-image-registry.apps-crc.testing not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Feb 26 15:42:16 crc restorecon[4595]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Feb 26 15:42:16 crc restorecon[4595]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/image-registry.openshift-image-registry.svc..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Feb 26 15:42:16 crc restorecon[4595]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/image-registry.openshift-image-registry.svc.cluster.local..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Feb 26 15:42:16 crc restorecon[4595]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/default-route-openshift-image-registry.apps-crc.testing not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Feb 26 15:42:16 crc restorecon[4595]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/trusted-ca-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Feb 26 15:42:16 crc restorecon[4595]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/trusted-ca-bundle/..2025_02_24_06_17_29.2729721485 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Feb 26 15:42:16 crc restorecon[4595]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/trusted-ca-bundle/..2025_02_24_06_17_29.2729721485/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Feb 26 15:42:16 crc restorecon[4595]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/trusted-ca-bundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Feb 26 15:42:16 crc restorecon[4595]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/trusted-ca-bundle/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Feb 26 15:42:16 crc restorecon[4595]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Feb 26 15:42:16 crc restorecon[4595]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/containers/fix-audit-permissions/fb93119e not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Feb 26 15:42:16 crc restorecon[4595]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/containers/openshift-apiserver/f1e8fc0e not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Feb 26 15:42:16 crc restorecon[4595]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/containers/openshift-apiserver-check-endpoints/218511f3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Feb 26 15:42:16 crc restorecon[4595]: /var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/volumes/kubernetes.io~empty-dir/tmpfs not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18
Feb 26 15:42:16 crc restorecon[4595]: /var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/volumes/kubernetes.io~empty-dir/tmpfs/k8s-webhook-server not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18
Feb 26 15:42:16 crc restorecon[4595]: /var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/volumes/kubernetes.io~empty-dir/tmpfs/k8s-webhook-server/serving-certs not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18
Feb 26 15:42:16 crc restorecon[4595]: /var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18
Feb 26 15:42:16 crc restorecon[4595]: /var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/containers/packageserver/ca8af7b3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18
Feb 26 15:42:16 crc restorecon[4595]: /var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/containers/packageserver/72cc8a75 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18
Feb 26 15:42:16 crc restorecon[4595]: /var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/containers/packageserver/6e8a3760 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18
Feb 26 15:42:16 crc restorecon[4595]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/volumes/kubernetes.io~configmap/service-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6
Feb 26 15:42:16 crc restorecon[4595]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/volumes/kubernetes.io~configmap/service-ca/..2025_02_23_05_27_30.557428972 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6
Feb 26 15:42:16 crc restorecon[4595]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/volumes/kubernetes.io~configmap/service-ca/..2025_02_23_05_27_30.557428972/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6
Feb 26 15:42:16 crc restorecon[4595]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/volumes/kubernetes.io~configmap/service-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6
Feb 26 15:42:16 crc restorecon[4595]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/volumes/kubernetes.io~configmap/service-ca/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6
Feb 26 15:42:16 crc restorecon[4595]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6
Feb 26 15:42:16 crc restorecon[4595]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/containers/cluster-version-operator/4c3455c0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6
Feb 26 15:42:16 crc restorecon[4595]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/containers/cluster-version-operator/2278acb0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6
Feb 26 15:42:16 crc restorecon[4595]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/containers/cluster-version-operator/4b453e4f not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6
Feb 26 15:42:16 crc restorecon[4595]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/containers/cluster-version-operator/3ec09bda not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6
Feb 26 15:42:16 crc restorecon[4595]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/trusted-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 26 15:42:16 crc restorecon[4595]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_24_06_25_03.422633132 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 26 15:42:16 crc restorecon[4595]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_24_06_25_03.422633132/anchors not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 26 15:42:16 crc restorecon[4595]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_24_06_25_03.422633132/anchors/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 26 15:42:16 crc restorecon[4595]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/trusted-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 26 15:42:16 crc restorecon[4595]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/trusted-ca/anchors not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 26 15:42:16 crc restorecon[4595]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 26 15:42:16 crc restorecon[4595]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/..2025_02_24_06_25_03.3594477318 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 26 15:42:16 crc restorecon[4595]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/..2025_02_24_06_25_03.3594477318/image-registry.openshift-image-registry.svc..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 26 15:42:16 crc restorecon[4595]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/..2025_02_24_06_25_03.3594477318/image-registry.openshift-image-registry.svc.cluster.local..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 26 15:42:16 crc restorecon[4595]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/..2025_02_24_06_25_03.3594477318/default-route-openshift-image-registry.apps-crc.testing not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 26 15:42:16 crc restorecon[4595]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 26 15:42:16 crc restorecon[4595]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/image-registry.openshift-image-registry.svc..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 26 15:42:16 crc restorecon[4595]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/image-registry.openshift-image-registry.svc.cluster.local..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 26 15:42:16 crc restorecon[4595]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/default-route-openshift-image-registry.apps-crc.testing not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 26 15:42:16 crc restorecon[4595]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 26 15:42:16 crc restorecon[4595]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/edk2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 26 15:42:16 crc restorecon[4595]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/edk2/cacerts.bin not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 26 15:42:16 crc restorecon[4595]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/java not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 26 15:42:16 crc restorecon[4595]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/java/cacerts not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 26 15:42:16 crc restorecon[4595]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/openssl not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 26 15:42:16 crc restorecon[4595]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/openssl/ca-bundle.trust.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 26 15:42:16 crc restorecon[4595]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 26 15:42:16 crc restorecon[4595]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 26 15:42:16 crc restorecon[4595]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/email-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 26 15:42:16 crc restorecon[4595]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/objsign-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 26 15:42:16 crc restorecon[4595]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 26 15:42:16 crc restorecon[4595]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2ae6433e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 26 15:42:16 crc restorecon[4595]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fde84897.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 26 15:42:16 crc restorecon[4595]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/75680d2e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 26 15:42:16 crc restorecon[4595]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/openshift-service-serving-signer_1740288168.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 26 15:42:16 crc restorecon[4595]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/facfc4fa.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 26 15:42:16 crc restorecon[4595]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8f5a969c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 26 15:42:16 crc restorecon[4595]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/CFCA_EV_ROOT.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 26 15:42:16 crc restorecon[4595]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9ef4a08a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 26 15:42:16 crc restorecon[4595]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ingress-operator_1740288202.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 26 15:42:16 crc restorecon[4595]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2f332aed.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 26 15:42:16 crc restorecon[4595]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/248c8271.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 26 15:42:16 crc restorecon[4595]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8d10a21f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 26 15:42:16 crc restorecon[4595]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ACCVRAIZ1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 26 15:42:16 crc restorecon[4595]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/a94d09e5.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 26 15:42:16 crc restorecon[4595]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3c9a4d3b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 26 15:42:16 crc restorecon[4595]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/40193066.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 26 15:42:16 crc restorecon[4595]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/AC_RAIZ_FNMT-RCM.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 26 15:42:16 crc restorecon[4595]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/cd8c0d63.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 26 15:42:16 crc restorecon[4595]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b936d1c6.0 not reset as customized by
admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 26 15:42:16 crc restorecon[4595]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/CA_Disig_Root_R2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 26 15:42:16 crc restorecon[4595]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/4fd49c6c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 26 15:42:16 crc restorecon[4595]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/AC_RAIZ_FNMT-RCM_SERVIDORES_SEGUROS.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 26 15:42:16 crc restorecon[4595]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b81b93f0.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 26 15:42:16 crc restorecon[4595]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5f9a69fa.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 26 15:42:16 crc restorecon[4595]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certigna.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 26 15:42:16 crc restorecon[4595]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b30d5fda.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 26 15:42:16 crc restorecon[4595]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ANF_Secure_Server_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 26 15:42:16 crc restorecon[4595]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b433981b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 26 15:42:16 crc restorecon[4595]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/93851c9e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 26 15:42:16 crc restorecon[4595]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9282e51c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 26 15:42:16 crc restorecon[4595]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e7dd1bc4.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 26 15:42:16 crc restorecon[4595]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Actalis_Authentication_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 26 15:42:16 crc restorecon[4595]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/930ac5d2.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 26 15:42:16 crc restorecon[4595]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5f47b495.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 26 15:42:16 crc restorecon[4595]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e113c810.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 26 15:42:16 crc restorecon[4595]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5931b5bc.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 26 15:42:16 crc restorecon[4595]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/AffirmTrust_Commercial.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 26 15:42:16 crc restorecon[4595]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2b349938.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 26 15:42:16 crc restorecon[4595]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e48193cf.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 26 15:42:16 crc restorecon[4595]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/302904dd.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 26 15:42:16 crc restorecon[4595]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/a716d4ed.0 not reset as customized 
by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 26 15:42:16 crc restorecon[4595]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/AffirmTrust_Networking.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 26 15:42:16 crc restorecon[4595]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/93bc0acc.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 26 15:42:16 crc restorecon[4595]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/86212b19.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 26 15:42:16 crc restorecon[4595]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certigna_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 26 15:42:16 crc restorecon[4595]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/AffirmTrust_Premium.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 26 15:42:16 crc restorecon[4595]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b727005e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 26 15:42:16 crc restorecon[4595]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/dbc54cab.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 26 15:42:16 crc restorecon[4595]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f51bb24c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 26 15:42:16 crc restorecon[4595]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/c28a8a30.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 26 15:42:16 crc restorecon[4595]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/AffirmTrust_Premium_ECC.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 26 15:42:16 crc restorecon[4595]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9c8dfbd4.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 26 15:42:16 crc restorecon[4595]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ccc52f49.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 26 15:42:16 crc restorecon[4595]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/cb1c3204.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 26 15:42:16 crc restorecon[4595]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Amazon_Root_CA_1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 26 15:42:16 crc restorecon[4595]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ce5e74ef.0 not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 26 15:42:16 crc restorecon[4595]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fd08c599.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 26 15:42:16 crc restorecon[4595]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certum_Trusted_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 26 15:42:16 crc restorecon[4595]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Amazon_Root_CA_2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 26 15:42:16 crc restorecon[4595]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/6d41d539.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 26 15:42:16 crc restorecon[4595]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fb5fa911.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 26 15:42:16 crc restorecon[4595]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e35234b1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 26 15:42:16 crc restorecon[4595]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Amazon_Root_CA_3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 26 15:42:16 crc restorecon[4595]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8cb5ee0f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 26 15:42:16 crc restorecon[4595]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7a7c655d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 26 15:42:16 crc restorecon[4595]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f8fc53da.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 26 15:42:16 crc restorecon[4595]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Amazon_Root_CA_4.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 26 15:42:16 crc restorecon[4595]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/de6d66f3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 26 15:42:16 crc restorecon[4595]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d41b5e2a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 26 15:42:16 crc restorecon[4595]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/41a3f684.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 26 15:42:16 crc restorecon[4595]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1df5a75f.0 not reset as customized by 
admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 26 15:42:16 crc restorecon[4595]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Atos_TrustedRoot_2011.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 26 15:42:16 crc restorecon[4595]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e36a6752.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 26 15:42:16 crc restorecon[4595]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b872f2b4.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 26 15:42:16 crc restorecon[4595]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9576d26b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 26 15:42:16 crc restorecon[4595]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/228f89db.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 26 15:42:16 crc restorecon[4595]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Atos_TrustedRoot_Root_CA_ECC_TLS_2021.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 26 15:42:16 crc restorecon[4595]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fb717492.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 26 15:42:16 crc restorecon[4595]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2d21b73c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 26 15:42:16 crc restorecon[4595]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0b1b94ef.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 26 15:42:16 crc restorecon[4595]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/595e996b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 26 15:42:16 crc restorecon[4595]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Atos_TrustedRoot_Root_CA_RSA_TLS_2021.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 26 15:42:16 crc restorecon[4595]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9b46e03d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 26 15:42:16 crc restorecon[4595]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/128f4b91.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 26 15:42:16 crc restorecon[4595]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Buypass_Class_3_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 26 15:42:16 crc restorecon[4595]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/81f2d2b1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 26 15:42:16 crc restorecon[4595]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Autoridad_de_Certificacion_Firmaprofesional_CIF_A62634068.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 26 15:42:16 crc restorecon[4595]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3bde41ac.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 26 15:42:16 crc restorecon[4595]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d16a5865.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 26 15:42:16 crc restorecon[4595]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certum_EC-384_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 26 15:42:16 crc restorecon[4595]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/BJCA_Global_Root_CA1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 26 15:42:16 crc restorecon[4595]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0179095f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 26 15:42:16 crc restorecon[4595]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ffa7f1eb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 26 15:42:16 crc restorecon[4595]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9482e63a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 26 15:42:16 crc restorecon[4595]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d4dae3dd.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 26 15:42:16 crc restorecon[4595]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/BJCA_Global_Root_CA2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 26 15:42:16 crc restorecon[4595]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3e359ba6.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 26 15:42:16 crc restorecon[4595]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7e067d03.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 26 15:42:16 crc restorecon[4595]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/95aff9e3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 26 15:42:16 crc restorecon[4595]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d7746a63.0 not reset as customized 
by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 26 15:42:16 crc restorecon[4595]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Baltimore_CyberTrust_Root.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 26 15:42:16 crc restorecon[4595]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/653b494a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 26 15:42:16 crc restorecon[4595]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3ad48a91.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 26 15:42:16 crc restorecon[4595]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certum_Trusted_Network_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 26 15:42:16 crc restorecon[4595]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Buypass_Class_2_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 26 15:42:16 crc restorecon[4595]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/54657681.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 26 15:42:16 crc restorecon[4595]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/82223c44.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 26 15:42:16 crc restorecon[4595]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e8de2f56.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 26 15:42:16 crc restorecon[4595]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2d9dafe4.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 26 15:42:16 crc restorecon[4595]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d96b65e2.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 26 15:42:16 crc restorecon[4595]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ee64a828.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 26 15:42:16 crc restorecon[4595]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/COMODO_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 26 15:42:16 crc restorecon[4595]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/40547a79.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 26 15:42:16 crc restorecon[4595]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5a3f0ff8.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 26 15:42:16 crc restorecon[4595]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7a780d93.0 not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 26 15:42:16 crc restorecon[4595]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/34d996fb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 26 15:42:16 crc restorecon[4595]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/COMODO_ECC_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 26 15:42:16 crc restorecon[4595]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/eed8c118.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 26 15:42:16 crc restorecon[4595]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/89c02a45.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 26 15:42:16 crc restorecon[4595]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certainly_Root_R1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 26 15:42:16 crc restorecon[4595]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b1159c4c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 26 15:42:16 crc restorecon[4595]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/COMODO_RSA_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 26 15:42:16 crc restorecon[4595]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d6325660.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 26 15:42:16 crc restorecon[4595]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d4c339cb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 26 15:42:16 crc restorecon[4595]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8312c4c1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 26 15:42:16 crc restorecon[4595]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certainly_Root_E1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 26 15:42:16 crc restorecon[4595]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8508e720.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 26 15:42:16 crc restorecon[4595]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5fdd185d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 26 15:42:16 crc restorecon[4595]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/48bec511.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 26 15:42:16 crc restorecon[4595]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/69105f4f.0 not reset as customized by 
admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 26 15:42:16 crc restorecon[4595]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GlobalSign.1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 26 15:42:16 crc restorecon[4595]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0b9bc432.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 26 15:42:16 crc restorecon[4595]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certum_Trusted_Network_CA_2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 26 15:42:16 crc restorecon[4595]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GTS_Root_R3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 26 15:42:16 crc restorecon[4595]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/32888f65.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 26 15:42:16 crc restorecon[4595]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/CommScope_Public_Trust_ECC_Root-01.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 26 15:42:16 crc restorecon[4595]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/6b03dec0.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 26 15:42:16 crc restorecon[4595]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/219d9499.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 26 15:42:16 crc restorecon[4595]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/CommScope_Public_Trust_ECC_Root-02.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 26 15:42:16 crc restorecon[4595]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5acf816d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 26 15:42:16 crc restorecon[4595]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/cbf06781.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 26 15:42:16 crc restorecon[4595]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/CommScope_Public_Trust_RSA_Root-01.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 26 15:42:16 crc restorecon[4595]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GTS_Root_R4.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 26 15:42:16 crc restorecon[4595]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/dc99f41e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 26 15:42:16 crc restorecon[4595]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/CommScope_Public_Trust_RSA_Root-02.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 26 15:42:16 crc restorecon[4595]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GlobalSign.3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 26 15:42:16 crc restorecon[4595]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/AAA_Certificate_Services.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 26 15:42:16 crc restorecon[4595]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/985c1f52.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 26 15:42:16 crc restorecon[4595]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8794b4e3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 26 15:42:16 crc restorecon[4595]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/D-TRUST_BR_Root_CA_1_2020.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 26 15:42:16 crc restorecon[4595]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e7c037b4.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 26 15:42:16 crc restorecon[4595]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ef954a4e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 26 15:42:16 crc restorecon[4595]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/D-TRUST_EV_Root_CA_1_2020.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 26 15:42:16 crc restorecon[4595]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2add47b6.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 26 15:42:16 crc restorecon[4595]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/90c5a3c8.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 26 15:42:16 crc restorecon[4595]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/D-TRUST_Root_Class_3_CA_2_2009.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 26 15:42:16 crc restorecon[4595]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b0f3e76e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 26 15:42:16 crc restorecon[4595]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/53a1b57a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 26 15:42:16 crc restorecon[4595]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/D-TRUST_Root_Class_3_CA_2_EV_2009.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 26 15:42:16 crc restorecon[4595]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GlobalSign_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 26 15:42:16 crc restorecon[4595]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_Assured_ID_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 26 15:42:16 crc restorecon[4595]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5ad8a5d6.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 26 15:42:16 crc restorecon[4595]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/68dd7389.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 26 15:42:16 crc restorecon[4595]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_Assured_ID_Root_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 26 15:42:16 crc restorecon[4595]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9d04f354.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 26 15:42:16 crc restorecon[4595]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8d6437c3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 26 15:42:16 crc restorecon[4595]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/062cdee6.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 26 15:42:16 crc restorecon[4595]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/bd43e1dd.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 26 15:42:16 crc restorecon[4595]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_Assured_ID_Root_G3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 26 15:42:16 crc restorecon[4595]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7f3d5d1d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 26 15:42:16 crc restorecon[4595]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/c491639e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 26 15:42:16 crc restorecon[4595]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GlobalSign_Root_E46.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 26 15:42:16 crc restorecon[4595]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_Global_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 26 15:42:16 crc restorecon[4595]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3513523f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 26 15:42:16 crc restorecon[4595]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/399e7759.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 26 15:42:16 crc restorecon[4595]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/feffd413.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 26 15:42:16 crc restorecon[4595]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d18e9066.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 26 15:42:16 crc restorecon[4595]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_Global_Root_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 26 15:42:16 crc restorecon[4595]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/607986c7.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 26 15:42:16 crc restorecon[4595]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/c90bc37d.0 not 
reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 26 15:42:16 crc restorecon[4595]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1b0f7e5c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 26 15:42:16 crc restorecon[4595]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1e08bfd1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 26 15:42:16 crc restorecon[4595]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_Global_Root_G3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 26 15:42:16 crc restorecon[4595]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/dd8e9d41.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 26 15:42:16 crc restorecon[4595]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ed39abd0.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 26 15:42:16 crc restorecon[4595]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/a3418fda.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 26 15:42:16 crc restorecon[4595]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/bc3f2570.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 26 15:42:16 crc restorecon[4595]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_High_Assurance_EV_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 26 15:42:16 crc restorecon[4595]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/244b5494.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 26 15:42:16 crc restorecon[4595]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/81b9768f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 26 15:42:16 crc restorecon[4595]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GlobalSign.2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 26 15:42:16 crc restorecon[4595]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/4be590e0.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 26 15:42:16 crc restorecon[4595]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_TLS_ECC_P384_Root_G5.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 26 15:42:16 crc restorecon[4595]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9846683b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 26 15:42:16 crc restorecon[4595]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/252252d2.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 26 15:42:16 crc restorecon[4595]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1e8e7201.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 26 15:42:16 crc restorecon[4595]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ISRG_Root_X1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 26 15:42:16 crc restorecon[4595]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_TLS_RSA4096_Root_G5.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 26 15:42:16 crc restorecon[4595]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d52c538d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 26 15:42:16 crc restorecon[4595]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/c44cc0c0.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 26 15:42:16 crc restorecon[4595]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GlobalSign_Root_R46.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 26 15:42:16 crc restorecon[4595]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_Trusted_Root_G4.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 26 15:42:16 crc restorecon[4595]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/75d1b2ed.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 26 15:42:16 crc restorecon[4595]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/a2c66da8.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 26 15:42:16 crc restorecon[4595]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GTS_Root_R2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 26 15:42:16 crc restorecon[4595]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ecccd8db.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 26 15:42:16 crc restorecon[4595]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Entrust.net_Certification_Authority__2048_.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 26 15:42:16 crc restorecon[4595]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/aee5f10d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 26 15:42:16 crc restorecon[4595]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3e7271e8.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 26 15:42:16 crc restorecon[4595]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b0e59380.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 26 15:42:16 crc restorecon[4595]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/4c3982f2.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 26 15:42:16 crc restorecon[4595]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Entrust_Root_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 26 15:42:16 crc restorecon[4595]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/6b99d060.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 26 15:42:16 crc restorecon[4595]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/bf64f35b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 26 15:42:16 crc restorecon[4595]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0a775a30.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 26 15:42:16 crc restorecon[4595]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/002c0b4f.0 not reset 
as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 26 15:42:16 crc restorecon[4595]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/cc450945.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 26 15:42:16 crc restorecon[4595]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Entrust_Root_Certification_Authority_-_EC1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 26 15:42:16 crc restorecon[4595]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/106f3e4d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 26 15:42:16 crc restorecon[4595]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b3fb433b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 26 15:42:16 crc restorecon[4595]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GlobalSign.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 26 15:42:16 crc restorecon[4595]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/4042bcee.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 26 15:42:16 crc restorecon[4595]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Entrust_Root_Certification_Authority_-_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 26 15:42:16 crc 
restorecon[4595]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/02265526.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 26 15:42:16 crc restorecon[4595]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/455f1b52.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 26 15:42:16 crc restorecon[4595]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0d69c7e1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 26 15:42:16 crc restorecon[4595]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9f727ac7.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 26 15:42:16 crc restorecon[4595]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Entrust_Root_Certification_Authority_-_G4.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 26 15:42:16 crc restorecon[4595]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5e98733a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 26 15:42:16 crc restorecon[4595]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f0cd152c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 26 15:42:16 crc restorecon[4595]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/dc4d6a89.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 26 15:42:16 crc restorecon[4595]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/6187b673.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 26 15:42:16 crc restorecon[4595]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/FIRMAPROFESIONAL_CA_ROOT-A_WEB.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 26 15:42:16 crc restorecon[4595]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ba8887ce.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 26 15:42:16 crc restorecon[4595]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/068570d1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 26 15:42:16 crc restorecon[4595]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f081611a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 26 15:42:16 crc restorecon[4595]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/48a195d8.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 26 15:42:16 crc restorecon[4595]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GDCA_TrustAUTH_R5_ROOT.pem 
not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 26 15:42:16 crc restorecon[4595]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0f6fa695.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 26 15:42:16 crc restorecon[4595]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ab59055e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 26 15:42:16 crc restorecon[4595]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b92fd57f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 26 15:42:16 crc restorecon[4595]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GLOBALTRUST_2020.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 26 15:42:16 crc restorecon[4595]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fa5da96b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 26 15:42:16 crc restorecon[4595]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1ec40989.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 26 15:42:16 crc restorecon[4595]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7719f463.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 26 15:42:16 crc restorecon[4595]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GTS_Root_R1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 26 15:42:16 crc restorecon[4595]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1001acf7.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 26 15:42:16 crc restorecon[4595]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f013ecaf.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 26 15:42:16 crc restorecon[4595]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/626dceaf.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 26 15:42:16 crc restorecon[4595]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/c559d742.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 26 15:42:16 crc restorecon[4595]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1d3472b9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 26 15:42:16 crc restorecon[4595]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9479c8c3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 26 15:42:16 crc restorecon[4595]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/a81e292b.0 not reset as customized by admin 
to system_u:object_r:container_file_t:s0:c10,c16 Feb 26 15:42:16 crc restorecon[4595]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/4bfab552.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 26 15:42:16 crc restorecon[4595]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Go_Daddy_Class_2_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 26 15:42:16 crc restorecon[4595]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Sectigo_Public_Server_Authentication_Root_E46.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 26 15:42:16 crc restorecon[4595]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Go_Daddy_Root_Certificate_Authority_-_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 26 15:42:16 crc restorecon[4595]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e071171e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 26 15:42:16 crc restorecon[4595]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/57bcb2da.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 26 15:42:16 crc restorecon[4595]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/HARICA_TLS_ECC_Root_CA_2021.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 
Feb 26 15:42:16 crc restorecon[4595]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ab5346f4.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 26 15:42:16 crc restorecon[4595]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5046c355.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 26 15:42:16 crc restorecon[4595]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/HARICA_TLS_RSA_Root_CA_2021.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 26 15:42:16 crc restorecon[4595]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/865fbdf9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 26 15:42:16 crc restorecon[4595]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/da0cfd1d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 26 15:42:16 crc restorecon[4595]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/85cde254.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 26 15:42:16 crc restorecon[4595]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Hellenic_Academic_and_Research_Institutions_ECC_RootCA_2015.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 26 15:42:16 crc restorecon[4595]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/cbb3f32b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 26 15:42:16 crc restorecon[4595]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SecureSign_RootCA11.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 26 15:42:16 crc restorecon[4595]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Hellenic_Academic_and_Research_Institutions_RootCA_2015.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 26 15:42:16 crc restorecon[4595]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5860aaa6.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 26 15:42:16 crc restorecon[4595]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/31188b5e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 26 15:42:16 crc restorecon[4595]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/HiPKI_Root_CA_-_G1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 26 15:42:16 crc restorecon[4595]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/c7f1359b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 26 15:42:16 crc restorecon[4595]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5f15c80c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 26 15:42:16 crc restorecon[4595]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Hongkong_Post_Root_CA_3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 26 15:42:16 crc restorecon[4595]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/09789157.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 26 15:42:16 crc restorecon[4595]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ISRG_Root_X2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 26 15:42:16 crc restorecon[4595]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/18856ac4.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 26 15:42:16 crc restorecon[4595]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1e09d511.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 26 15:42:16 crc restorecon[4595]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/IdenTrust_Commercial_Root_CA_1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 26 15:42:16 crc restorecon[4595]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/cf701eeb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 26 15:42:16 crc restorecon[4595]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d06393bb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 26 15:42:16 crc restorecon[4595]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/IdenTrust_Public_Sector_Root_CA_1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 26 15:42:16 crc restorecon[4595]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/10531352.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 26 15:42:16 crc restorecon[4595]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Izenpe.com.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 26 15:42:16 crc restorecon[4595]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SecureTrust_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 26 15:42:16 crc restorecon[4595]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b0ed035a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 26 15:42:16 crc restorecon[4595]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Microsec_e-Szigno_Root_CA_2009.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 26 15:42:16 crc restorecon[4595]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8160b96c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 26 15:42:16 crc restorecon[4595]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e8651083.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 26 15:42:16 crc restorecon[4595]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2c63f966.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 26 15:42:16 crc restorecon[4595]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Security_Communication_RootCA2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 26 15:42:16 crc restorecon[4595]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Microsoft_ECC_Root_Certificate_Authority_2017.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 26 15:42:16 crc restorecon[4595]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8d89cda1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 26 15:42:16 crc restorecon[4595]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/01419da9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 26 15:42:16 crc restorecon[4595]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SSL.com_TLS_RSA_Root_CA_2022.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 26 15:42:16 crc restorecon[4595]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b7a5b843.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 26 15:42:16 crc restorecon[4595]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Microsoft_RSA_Root_Certificate_Authority_2017.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 26 15:42:16 crc restorecon[4595]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/bf53fb88.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 26 15:42:16 crc restorecon[4595]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9591a472.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 26 15:42:16 crc restorecon[4595]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3afde786.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 26 15:42:16 crc restorecon[4595]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SwissSign_Gold_CA_-_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 26 15:42:16 crc restorecon[4595]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/NAVER_Global_Root_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 26 15:42:16 crc restorecon[4595]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3fb36b73.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 26 15:42:16 crc restorecon[4595]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d39b0a2c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 26 15:42:16 crc restorecon[4595]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/a89d74c2.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 26 15:42:16 crc restorecon[4595]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/cd58d51e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 26 15:42:16 crc restorecon[4595]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b7db1890.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 26 15:42:16 crc restorecon[4595]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/NetLock_Arany__Class_Gold__F__tan__s__tv__ny.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 26 15:42:16 crc restorecon[4595]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/988a38cb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 26 15:42:16 crc restorecon[4595]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/60afe812.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 26 15:42:16 crc restorecon[4595]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f39fc864.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 26 15:42:16 crc restorecon[4595]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5443e9e3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 26 15:42:16 crc restorecon[4595]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/OISTE_WISeKey_Global_Root_GB_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 26 15:42:16 crc restorecon[4595]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e73d606e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 26 15:42:16 crc restorecon[4595]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/dfc0fe80.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 26 15:42:16 crc restorecon[4595]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b66938e9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 26 15:42:16 crc restorecon[4595]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1e1eab7c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 26 15:42:16 crc restorecon[4595]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/OISTE_WISeKey_Global_Root_GC_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 26 15:42:16 crc restorecon[4595]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/773e07ad.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 26 15:42:16 crc restorecon[4595]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3c899c73.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 26 15:42:16 crc restorecon[4595]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d59297b8.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 26 15:42:16 crc restorecon[4595]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ddcda989.0 not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 26 15:42:16 crc restorecon[4595]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/QuoVadis_Root_CA_1_G3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 26 15:42:16 crc restorecon[4595]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/749e9e03.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 26 15:42:16 crc restorecon[4595]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/52b525c7.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 26 15:42:16 crc restorecon[4595]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Security_Communication_RootCA3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 26 15:42:16 crc restorecon[4595]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/QuoVadis_Root_CA_2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 26 15:42:16 crc restorecon[4595]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d7e8dc79.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 26 15:42:16 crc restorecon[4595]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7a819ef2.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 26 15:42:16 crc restorecon[4595]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/08063a00.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 26 15:42:16 crc restorecon[4595]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/6b483515.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 26 15:42:16 crc restorecon[4595]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/QuoVadis_Root_CA_2_G3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 26 15:42:16 crc restorecon[4595]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/064e0aa9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 26 15:42:16 crc restorecon[4595]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1f58a078.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 26 15:42:16 crc restorecon[4595]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/6f7454b3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 26 15:42:16 crc restorecon[4595]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7fa05551.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 26 15:42:16 crc restorecon[4595]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/QuoVadis_Root_CA_3.pem not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 26 15:42:16 crc restorecon[4595]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/76faf6c0.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 26 15:42:16 crc restorecon[4595]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9339512a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 26 15:42:16 crc restorecon[4595]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f387163d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 26 15:42:16 crc restorecon[4595]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ee37c333.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 26 15:42:16 crc restorecon[4595]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/QuoVadis_Root_CA_3_G3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 26 15:42:16 crc restorecon[4595]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e18bfb83.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 26 15:42:16 crc restorecon[4595]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e442e424.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 26 15:42:16 crc restorecon[4595]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fe8a2cd8.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 26 15:42:16 crc restorecon[4595]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/23f4c490.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 26 15:42:16 crc restorecon[4595]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5cd81ad7.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 26 15:42:16 crc restorecon[4595]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SSL.com_EV_Root_Certification_Authority_ECC.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 26 15:42:16 crc restorecon[4595]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f0c70a8d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 26 15:42:16 crc restorecon[4595]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7892ad52.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 26 15:42:16 crc restorecon[4595]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SZAFIR_ROOT_CA2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 26 15:42:16 crc restorecon[4595]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/4f316efb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 26 15:42:16 crc restorecon[4595]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SSL.com_EV_Root_Certification_Authority_RSA_R2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 26 15:42:16 crc restorecon[4595]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/06dc52d5.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 26 15:42:16 crc restorecon[4595]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/583d0756.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 26 15:42:16 crc restorecon[4595]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Sectigo_Public_Server_Authentication_Root_R46.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 26 15:42:16 crc restorecon[4595]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SSL.com_Root_Certification_Authority_ECC.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 26 15:42:16 crc restorecon[4595]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0bf05006.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 26 15:42:16 crc restorecon[4595]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/88950faa.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 26 15:42:16 crc restorecon[4595]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9046744a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 26 15:42:16 crc restorecon[4595]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3c860d51.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 26 15:42:16 crc restorecon[4595]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SSL.com_Root_Certification_Authority_RSA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 26 15:42:16 crc restorecon[4595]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/6fa5da56.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 26 15:42:16 crc restorecon[4595]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/33ee480d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 26 15:42:16 crc restorecon[4595]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Secure_Global_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 26 15:42:16 crc restorecon[4595]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/63a2c897.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 26 15:42:16 crc restorecon[4595]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SSL.com_TLS_ECC_Root_CA_2022.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 26 15:42:17 crc restorecon[4595]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/bdacca6f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 26 15:42:17 crc restorecon[4595]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ff34af3f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 26 15:42:17 crc restorecon[4595]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/dbff3a01.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 26 15:42:17 crc restorecon[4595]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Security_Communication_ECC_RootCA1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 26 15:42:17 crc restorecon[4595]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/emSign_Root_CA_-_C1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 26 15:42:17 crc restorecon[4595]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Starfield_Class_2_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 26 15:42:17 crc restorecon[4595]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/406c9bb1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 26 15:42:17 crc restorecon[4595]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Starfield_Root_Certificate_Authority_-_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 26 15:42:17 crc restorecon[4595]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/emSign_ECC_Root_CA_-_C3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 26 15:42:17 crc restorecon[4595]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Starfield_Services_Root_Certificate_Authority_-_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 26 15:42:17 crc restorecon[4595]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SwissSign_Silver_CA_-_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 26 15:42:17 crc restorecon[4595]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/99e1b953.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 26 15:42:17 crc restorecon[4595]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/T-TeleSec_GlobalRoot_Class_2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 26 15:42:17 crc restorecon[4595]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/vTrus_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 26 15:42:17 crc restorecon[4595]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/T-TeleSec_GlobalRoot_Class_3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 26 15:42:17 crc restorecon[4595]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/14bc7599.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 26 15:42:17 crc restorecon[4595]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/TUBITAK_Kamu_SM_SSL_Kok_Sertifikasi_-_Surum_1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 26 15:42:17 crc restorecon[4595]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/TWCA_Global_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 26 15:42:17 crc restorecon[4595]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7a3adc42.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 26 15:42:17 crc restorecon[4595]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/TWCA_Root_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 26 15:42:17 crc restorecon[4595]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f459871d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 26 15:42:17 crc restorecon[4595]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Telekom_Security_TLS_ECC_Root_2020.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 26 15:42:17 crc restorecon[4595]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/emSign_Root_CA_-_G1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 26 15:42:17 crc restorecon[4595]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Telekom_Security_TLS_RSA_Root_2023.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 26 15:42:17 crc restorecon[4595]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/TeliaSonera_Root_CA_v1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 26 15:42:17 crc restorecon[4595]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Telia_Root_CA_v2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 26 15:42:17 crc restorecon[4595]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8f103249.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 26 15:42:17 crc restorecon[4595]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f058632f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 26 15:42:17 crc restorecon[4595]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ca-certificates.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 26 15:42:17 crc restorecon[4595]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/TrustAsia_Global_Root_CA_G3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 26 15:42:17 crc restorecon[4595]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9bf03295.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 26 15:42:17 crc restorecon[4595]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/98aaf404.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 26 15:42:17 crc restorecon[4595]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 26 15:42:17 crc restorecon[4595]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/TrustAsia_Global_Root_CA_G4.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 26 15:42:17 crc restorecon[4595]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1cef98f5.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 26 15:42:17 crc restorecon[4595]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/073bfcc5.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 26 15:42:17 crc restorecon[4595]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2923b3f9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 26 15:42:17 crc restorecon[4595]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Trustwave_Global_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 26 15:42:17 crc restorecon[4595]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f249de83.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 26 15:42:17 crc restorecon[4595]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/edcbddb5.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 26 15:42:17 crc restorecon[4595]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/emSign_ECC_Root_CA_-_G3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 26 15:42:17 crc restorecon[4595]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Trustwave_Global_ECC_P256_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 26 15:42:17 crc restorecon[4595]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9b5697b0.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 26 15:42:17 crc restorecon[4595]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1ae85e5e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 26 15:42:17 crc restorecon[4595]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b74d2bd5.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 26 15:42:17 crc restorecon[4595]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Trustwave_Global_ECC_P384_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 26 15:42:17 crc restorecon[4595]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d887a5bb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 26 15:42:17 crc restorecon[4595]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9aef356c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 26 15:42:17 crc restorecon[4595]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/TunTrust_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 26 15:42:17 crc restorecon[4595]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fd64f3fc.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 26 15:42:17 crc restorecon[4595]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e13665f9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 26 15:42:17 crc restorecon[4595]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/UCA_Extended_Validation_Root.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 26 15:42:17 crc restorecon[4595]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0f5dc4f3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 26 15:42:17 crc restorecon[4595]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/da7377f6.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 26 15:42:17 crc restorecon[4595]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/UCA_Global_G2_Root.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 26 15:42:17 crc restorecon[4595]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/c01eb047.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 26 15:42:17 crc restorecon[4595]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/304d27c3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 26 15:42:17 crc restorecon[4595]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ed858448.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 26 15:42:17 crc restorecon[4595]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/USERTrust_ECC_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 26 15:42:17 crc restorecon[4595]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f30dd6ad.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 26 15:42:17 crc restorecon[4595]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/04f60c28.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 26 15:42:17 crc restorecon[4595]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/vTrus_ECC_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 26 15:42:17 crc restorecon[4595]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/USERTrust_RSA_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 26 15:42:17 crc restorecon[4595]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fc5a8f99.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 26 15:42:17 crc restorecon[4595]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/35105088.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 26 15:42:17 crc restorecon[4595]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ee532fd5.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 26 15:42:17 crc restorecon[4595]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/XRamp_Global_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 26 15:42:17 crc restorecon[4595]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/706f604c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 26 15:42:17 crc restorecon[4595]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/76579174.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 26 15:42:17 crc restorecon[4595]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/certSIGN_ROOT_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 26 15:42:17 crc restorecon[4595]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8d86cdd1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 26 15:42:17 crc restorecon[4595]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/882de061.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 26 15:42:17 crc restorecon[4595]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/certSIGN_ROOT_CA_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 26 15:42:17 crc restorecon[4595]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5f618aec.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 26 15:42:17 crc restorecon[4595]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/a9d40e02.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 26 15:42:17 crc restorecon[4595]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e-Szigno_Root_CA_2017.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 26 15:42:17 crc restorecon[4595]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e868b802.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 26 15:42:17 crc restorecon[4595]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/83e9984f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 26 15:42:17 crc restorecon[4595]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ePKI_Root_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 26 15:42:17 crc restorecon[4595]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ca6e4ad9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 26 15:42:17 crc restorecon[4595]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9d6523ce.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 26 15:42:17 crc restorecon[4595]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/4b718d9b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 26 15:42:17 crc restorecon[4595]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/869fbf79.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 26 15:42:17 crc restorecon[4595]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 26 15:42:17 crc restorecon[4595]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/containers/registry/f8d22bdb not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 26 15:42:17 crc restorecon[4595]: /var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c17
Feb 26 15:42:17 crc restorecon[4595]: /var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/containers/cluster-samples-operator/6e8bbfac not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c17
Feb 26 15:42:17 crc restorecon[4595]: /var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/containers/cluster-samples-operator/54dd7996 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c17
Feb 26 15:42:17 crc restorecon[4595]: /var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/containers/cluster-samples-operator/a4f1bb05 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c17
Feb 26 15:42:17 crc restorecon[4595]: /var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/containers/cluster-samples-operator-watch/207129da not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c17
Feb 26 15:42:17 crc restorecon[4595]: /var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/containers/cluster-samples-operator-watch/c1df39e1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c17
Feb 26 15:42:17 crc restorecon[4595]: /var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/containers/cluster-samples-operator-watch/15b8f1cd not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c17
Feb 26 15:42:17 crc restorecon[4595]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15
Feb 26 15:42:17 crc restorecon[4595]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/config/..2025_02_23_05_27_49.3523263858 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15
Feb 26 15:42:17 crc restorecon[4595]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/config/..2025_02_23_05_27_49.3523263858/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15
Feb 26 15:42:17 crc restorecon[4595]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15
Feb 26 15:42:17 crc restorecon[4595]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/config/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15
Feb 26 15:42:17 crc restorecon[4595]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/images not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15
Feb 26 15:42:17 crc restorecon[4595]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/images/..2025_02_23_05_27_49.3256605594 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15
Feb 26 15:42:17 crc restorecon[4595]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/images/..2025_02_23_05_27_49.3256605594/images.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15
Feb 26 15:42:17 crc restorecon[4595]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/images/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15
Feb 26 15:42:17 crc restorecon[4595]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/images/images.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15
Feb 26 15:42:17 crc restorecon[4595]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15
Feb 26 15:42:17 crc restorecon[4595]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/containers/kube-rbac-proxy/77bd6913 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15
Feb 26 15:42:17 crc restorecon[4595]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/containers/kube-rbac-proxy/2382c1b1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15
Feb 26 15:42:17 crc restorecon[4595]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/containers/kube-rbac-proxy/704ce128 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15
Feb 26 15:42:17 crc restorecon[4595]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/containers/machine-api-operator/70d16fe0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15
Feb 26 15:42:17 crc restorecon[4595]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/containers/machine-api-operator/bfb95535 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15
Feb 26 15:42:17 crc restorecon[4595]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/containers/machine-api-operator/57a8e8e2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15
Feb 26 15:42:17 crc restorecon[4595]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c219,c404
Feb 26 15:42:17 crc restorecon[4595]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/volumes/kubernetes.io~configmap/config/..2025_02_23_05_27_49.3413793711 not reset as customized by admin to system_u:object_r:container_file_t:s0:c219,c404
Feb 26 15:42:17 crc restorecon[4595]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/volumes/kubernetes.io~configmap/config/..2025_02_23_05_27_49.3413793711/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c219,c404
Feb 26 15:42:17 crc restorecon[4595]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c219,c404
Feb 26 15:42:17 crc restorecon[4595]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c219,c404
Feb 26 15:42:17 crc restorecon[4595]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c219,c404
Feb 26 15:42:17 crc restorecon[4595]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/containers/kube-apiserver-operator/1b9d3e5e not reset as customized by admin to system_u:object_r:container_file_t:s0:c107,c917
Feb 26 15:42:17 crc restorecon[4595]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/containers/kube-apiserver-operator/fddb173c not reset as customized by admin to system_u:object_r:container_file_t:s0:c202,c983
Feb 26 15:42:17 crc restorecon[4595]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/containers/kube-apiserver-operator/95d3c6c4 not reset as customized by admin to system_u:object_r:container_file_t:s0:c219,c404
Feb 26 15:42:17 crc restorecon[4595]: /var/lib/kubelet/pods/9d751cbb-f2e2-430d-9754-c882a5e924a5/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21
Feb 26 15:42:17 crc restorecon[4595]: /var/lib/kubelet/pods/9d751cbb-f2e2-430d-9754-c882a5e924a5/containers/check-endpoints/bfb5fff5 not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21
Feb 26 15:42:17 crc restorecon[4595]: /var/lib/kubelet/pods/9d751cbb-f2e2-430d-9754-c882a5e924a5/containers/check-endpoints/2aef40aa not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21
Feb 26 15:42:17 crc restorecon[4595]: /var/lib/kubelet/pods/9d751cbb-f2e2-430d-9754-c882a5e924a5/containers/check-endpoints/c0391cad not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21
Feb 26 15:42:17 crc restorecon[4595]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c214,c928
Feb 26 15:42:17 crc restorecon[4595]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/kube-controller-manager/1119e69d not reset as customized by admin to system_u:object_r:container_file_t:s0:c776,c1007
Feb 26 15:42:17 crc restorecon[4595]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/kube-controller-manager/660608b4 not reset as customized by admin to system_u:object_r:container_file_t:s0:c214,c928
Feb 26 15:42:17 crc restorecon[4595]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/kube-controller-manager/8220bd53 not reset as customized by admin to system_u:object_r:container_file_t:s0:c214,c928
Feb 26 15:42:17 crc restorecon[4595]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/cluster-policy-controller/85f99d5c not reset as customized by admin to system_u:object_r:container_file_t:s0:c776,c1007
Feb 26 15:42:17 crc restorecon[4595]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/cluster-policy-controller/4b0225f6 not reset as customized by admin to system_u:object_r:container_file_t:s0:c214,c928
Feb 26 15:42:17 crc restorecon[4595]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/kube-controller-manager-cert-syncer/9c2a3394 not reset as customized by admin to system_u:object_r:container_file_t:s0:c776,c1007
Feb 26 15:42:17 crc restorecon[4595]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/kube-controller-manager-cert-syncer/e820b243 not reset as customized by admin to system_u:object_r:container_file_t:s0:c214,c928
Feb 26 15:42:17 crc restorecon[4595]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/kube-controller-manager-recovery-controller/1ca52ea0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c776,c1007
Feb 26 15:42:17 crc restorecon[4595]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/kube-controller-manager-recovery-controller/e6988e45 not reset as customized by admin to system_u:object_r:container_file_t:s0:c214,c928
Feb 26 15:42:17 crc restorecon[4595]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/volumes/kubernetes.io~configmap/mcc-auth-proxy-config not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Feb 26 15:42:17 crc restorecon[4595]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/volumes/kubernetes.io~configmap/mcc-auth-proxy-config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Feb 26 15:42:17 crc restorecon[4595]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/volumes/kubernetes.io~configmap/mcc-auth-proxy-config/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Feb 26 15:42:17 crc restorecon[4595]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/volumes/kubernetes.io~configmap/mcc-auth-proxy-config/..2025_02_24_06_09_21.2517297950 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Feb 26 15:42:17 crc restorecon[4595]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/volumes/kubernetes.io~configmap/mcc-auth-proxy-config/..2025_02_24_06_09_21.2517297950/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Feb 26 15:42:17 crc restorecon[4595]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Feb 26 15:42:17 crc restorecon[4595]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/containers/machine-config-controller/6655f00b not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Feb 26 15:42:17 crc restorecon[4595]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/containers/machine-config-controller/98bc3986 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Feb 26 15:42:17 crc restorecon[4595]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/containers/machine-config-controller/08e3458a not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Feb 26 15:42:17 crc restorecon[4595]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/containers/kube-rbac-proxy/2a191cb0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Feb 26 15:42:17 crc restorecon[4595]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/containers/kube-rbac-proxy/6c4eeefb not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Feb 26 15:42:17 crc restorecon[4595]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/containers/kube-rbac-proxy/f61a549c not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Feb 26 15:42:17 crc restorecon[4595]: /var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c318,c553
Feb 26 15:42:17 crc restorecon[4595]: /var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/containers/hostpath-provisioner/24891863 not reset as customized by admin to system_u:object_r:container_file_t:s0:c37,c572
Feb 26 15:42:17 crc restorecon[4595]: /var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/containers/hostpath-provisioner/fbdfd89c not reset as customized by admin to system_u:object_r:container_file_t:s0:c318,c553
Feb 26 15:42:17 crc restorecon[4595]: /var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/containers/liveness-probe/9b63b3bc not reset as customized by admin to system_u:object_r:container_file_t:s0:c37,c572
Feb 26 15:42:17 crc restorecon[4595]: /var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/containers/liveness-probe/8acde6d6 not reset as customized by admin to system_u:object_r:container_file_t:s0:c318,c553
Feb 26 15:42:17 crc restorecon[4595]: /var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/containers/node-driver-registrar/59ecbba3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c318,c553
Feb 26 15:42:17 crc restorecon[4595]: /var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/containers/csi-provisioner/685d4be3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c318,c553
Feb 26 15:42:17 crc restorecon[4595]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23
Feb 26 15:42:17 crc restorecon[4595]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.341639300 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23
Feb 26 15:42:17 crc restorecon[4595]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.341639300/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23
Feb 26 15:42:17 crc restorecon[4595]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.341639300/openshift-route-controller-manager.client-ca.configmap not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23
Feb 26 15:42:17 crc restorecon[4595]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.341639300/openshift-route-controller-manager.serving-cert.secret not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23
Feb 26 15:42:17 crc restorecon[4595]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23
Feb 26 15:42:17 crc restorecon[4595]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23
Feb 26 15:42:17 crc restorecon[4595]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/openshift-route-controller-manager.client-ca.configmap not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23
Feb 26 15:42:17 crc restorecon[4595]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/openshift-route-controller-manager.serving-cert.secret not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23
Feb 26 15:42:17 crc restorecon[4595]: 
/var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/client-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Feb 26 15:42:17 crc restorecon[4595]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/client-ca/..2025_02_24_06_20_07.2950937851 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Feb 26 15:42:17 crc restorecon[4595]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/client-ca/..2025_02_24_06_20_07.2950937851/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Feb 26 15:42:17 crc restorecon[4595]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/client-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Feb 26 15:42:17 crc restorecon[4595]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/client-ca/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Feb 26 15:42:17 crc restorecon[4595]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Feb 26 15:42:17 crc restorecon[4595]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/containers/route-controller-manager/feaea55e not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Feb 26 15:42:17 crc restorecon[4595]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 26 15:42:17 crc restorecon[4595]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Feb 26 15:42:17 crc restorecon[4595]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/abinitio-runtime-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 26 15:42:17 crc restorecon[4595]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/abinitio-runtime-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 26 15:42:17 crc restorecon[4595]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/accuknox-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 26 15:42:17 crc restorecon[4595]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/accuknox-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 26 15:42:17 crc restorecon[4595]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aci-containers-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 26 15:42:17 crc restorecon[4595]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aci-containers-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 26 15:42:17 crc restorecon[4595]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aikit-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 26 15:42:17 crc restorecon[4595]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aikit-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 26 15:42:17 crc restorecon[4595]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/airlock-microgateway not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 26 15:42:17 crc restorecon[4595]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/airlock-microgateway/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 26 15:42:17 crc restorecon[4595]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ako-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 26 15:42:17 crc restorecon[4595]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ako-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 26 15:42:17 crc restorecon[4595]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/alloy not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 26 15:42:17 crc restorecon[4595]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/alloy/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 26 15:42:17 crc restorecon[4595]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anchore-engine not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 
26 15:42:17 crc restorecon[4595]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anchore-engine/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 26 15:42:17 crc restorecon[4595]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzo-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 26 15:42:17 crc restorecon[4595]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzo-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 26 15:42:17 crc restorecon[4595]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzograph-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 26 15:42:17 crc restorecon[4595]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzograph-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 26 15:42:17 crc restorecon[4595]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzounstructured-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 26 15:42:17 crc restorecon[4595]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzounstructured-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 26 15:42:17 crc restorecon[4595]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/appdynamics-cloud-operator 
not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 26 15:42:17 crc restorecon[4595]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/appdynamics-cloud-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 26 15:42:17 crc restorecon[4595]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/appdynamics-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 26 15:42:17 crc restorecon[4595]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/appdynamics-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 26 15:42:17 crc restorecon[4595]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aqua-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 26 15:42:17 crc restorecon[4595]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aqua-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 26 15:42:17 crc restorecon[4595]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cass-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 26 15:42:17 crc restorecon[4595]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cass-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 26 15:42:17 crc restorecon[4595]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ccm-node-agent-dcap-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 26 15:42:17 crc restorecon[4595]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ccm-node-agent-dcap-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 26 15:42:17 crc restorecon[4595]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ccm-node-agent-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 26 15:42:17 crc restorecon[4595]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ccm-node-agent-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 26 15:42:17 crc restorecon[4595]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cfm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 26 15:42:17 crc restorecon[4595]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cfm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 26 15:42:17 crc restorecon[4595]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cilium not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 26 15:42:17 crc restorecon[4595]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cilium/catalog.json not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Feb 26 15:42:17 crc restorecon[4595]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cilium-enterprise not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 26 15:42:17 crc restorecon[4595]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cilium-enterprise/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 26 15:42:17 crc restorecon[4595]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloud-native-postgresql not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 26 15:42:17 crc restorecon[4595]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloud-native-postgresql/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 26 15:42:17 crc restorecon[4595]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudbees-ci not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 26 15:42:17 crc restorecon[4595]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudbees-ci/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 26 15:42:17 crc restorecon[4595]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudera-streams-messaging-kubernetes-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 26 15:42:17 crc restorecon[4595]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudera-streams-messaging-kubernetes-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 26 15:42:17 crc restorecon[4595]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudnative-pg not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 26 15:42:17 crc restorecon[4595]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudnative-pg/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 26 15:42:17 crc restorecon[4595]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cnfv-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 26 15:42:17 crc restorecon[4595]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cnfv-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 26 15:42:17 crc restorecon[4595]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cockroachdb-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 26 15:42:17 crc restorecon[4595]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cockroachdb-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 26 15:42:17 crc restorecon[4595]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/conjur-follower-operator not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 26 15:42:17 crc restorecon[4595]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/conjur-follower-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 26 15:42:17 crc restorecon[4595]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/coroot-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 26 15:42:17 crc restorecon[4595]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/coroot-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 26 15:42:17 crc restorecon[4595]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/crunchy-postgres-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 26 15:42:17 crc restorecon[4595]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/crunchy-postgres-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 26 15:42:17 crc restorecon[4595]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cte-k8s-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 26 15:42:17 crc restorecon[4595]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cte-k8s-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 26 15:42:17 crc restorecon[4595]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datadog-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 26 15:42:17 crc restorecon[4595]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datadog-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 26 15:42:17 crc restorecon[4595]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dell-csm-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 26 15:42:17 crc restorecon[4595]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dell-csm-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 26 15:42:17 crc restorecon[4595]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/digitalai-deploy-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 26 15:42:17 crc restorecon[4595]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/digitalai-deploy-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 26 15:42:17 crc restorecon[4595]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/digitalai-release-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 26 15:42:17 crc restorecon[4595]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/digitalai-release-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 26 15:42:17 crc restorecon[4595]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dynatrace-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 26 15:42:17 crc restorecon[4595]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dynatrace-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 26 15:42:17 crc restorecon[4595]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/edb-hcp-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 26 15:42:17 crc restorecon[4595]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/edb-hcp-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 26 15:42:17 crc restorecon[4595]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eginnovations-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 26 15:42:17 crc restorecon[4595]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eginnovations-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 26 15:42:17 crc restorecon[4595]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/elasticsearch-eck-operator-certified not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 26 15:42:17 crc restorecon[4595]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/elasticsearch-eck-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 26 15:42:17 crc restorecon[4595]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/falcon-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 26 15:42:17 crc restorecon[4595]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/falcon-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 26 15:42:17 crc restorecon[4595]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/federatorai-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 26 15:42:17 crc restorecon[4595]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/federatorai-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 26 15:42:17 crc restorecon[4595]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fujitsu-enterprise-postgres-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 26 15:42:17 crc restorecon[4595]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fujitsu-enterprise-postgres-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 26 15:42:17 crc restorecon[4595]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/function-mesh not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 26 15:42:17 crc restorecon[4595]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/function-mesh/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 26 15:42:17 crc restorecon[4595]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/harness-gitops-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 26 15:42:17 crc restorecon[4595]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/harness-gitops-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 26 15:42:17 crc restorecon[4595]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hazelcast-platform-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 26 15:42:17 crc restorecon[4595]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hazelcast-platform-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 26 15:42:17 crc restorecon[4595]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hcp-terraform-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 26 15:42:17 crc restorecon[4595]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hcp-terraform-operator/catalog.json not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 26 15:42:17 crc restorecon[4595]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hpe-ezmeral-csi-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 26 15:42:17 crc restorecon[4595]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hpe-ezmeral-csi-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 26 15:42:17 crc restorecon[4595]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-application-gateway-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 26 15:42:17 crc restorecon[4595]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-application-gateway-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 26 15:42:17 crc restorecon[4595]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-block-csi-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 26 15:42:17 crc restorecon[4595]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-block-csi-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 26 15:42:17 crc restorecon[4595]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-access-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 26 15:42:17 crc restorecon[4595]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-access-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 26 15:42:17 crc restorecon[4595]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-directory-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 26 15:42:17 crc restorecon[4595]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-directory-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 26 15:42:17 crc restorecon[4595]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 26 15:42:17 crc restorecon[4595]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 26 15:42:17 crc restorecon[4595]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infoscale-dr-manager not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 26 15:42:17 crc restorecon[4595]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infoscale-dr-manager/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 26 15:42:17 crc restorecon[4595]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infoscale-licensing-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 26 15:42:17 crc restorecon[4595]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infoscale-licensing-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 26 15:42:17 crc restorecon[4595]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infoscale-sds-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 26 15:42:17 crc restorecon[4595]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infoscale-sds-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 26 15:42:17 crc restorecon[4595]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infrastructure-asset-orchestrator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 26 15:42:17 crc restorecon[4595]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infrastructure-asset-orchestrator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 26 15:42:17 crc restorecon[4595]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/instana-agent-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 26 15:42:17 crc restorecon[4595]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/instana-agent-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 26 15:42:17 crc restorecon[4595]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/intel-device-plugins-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 26 15:42:17 crc restorecon[4595]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/intel-device-plugins-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 26 15:42:17 crc restorecon[4595]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/intel-kubernetes-power-manager not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 26 15:42:17 crc restorecon[4595]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/intel-kubernetes-power-manager/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 26 15:42:17 crc restorecon[4595]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/iomesh-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 26 15:42:17 crc restorecon[4595]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/iomesh-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 26 15:42:17 crc restorecon[4595]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 26 15:42:17 crc restorecon[4595]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 26 15:42:17 crc restorecon[4595]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx8-openshift-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 26 15:42:17 crc restorecon[4595]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx8-openshift-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 26 15:42:17 crc restorecon[4595]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx8-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 26 15:42:17 crc restorecon[4595]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx8-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 26 15:42:17 crc restorecon[4595]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k8s-triliovault not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 26 15:42:17 crc restorecon[4595]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k8s-triliovault/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 26 15:42:17 crc restorecon[4595]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-ati-updates not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 26 15:42:17 crc restorecon[4595]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-ati-updates/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 26 15:42:17 crc restorecon[4595]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-framework not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 26 15:42:17 crc restorecon[4595]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-framework/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 26 15:42:17 crc restorecon[4595]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-ingress not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 26 15:42:17 crc restorecon[4595]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-ingress/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 26 15:42:17 crc restorecon[4595]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-licensing not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 26 15:42:17 crc restorecon[4595]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-licensing/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 26 15:42:17 crc restorecon[4595]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-sso not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 26 15:42:17 crc restorecon[4595]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-sso/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 26 15:42:17 crc restorecon[4595]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-keycloak-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 26 15:42:17 crc restorecon[4595]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-keycloak-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 26 15:42:17 crc restorecon[4595]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-load-core not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 26 15:42:17 crc restorecon[4595]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-load-core/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 26 15:42:17 crc restorecon[4595]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-loadcore-agents not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 26 15:42:17 crc restorecon[4595]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-loadcore-agents/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 26 15:42:17 crc restorecon[4595]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-nats-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 26 15:42:17 crc restorecon[4595]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-nats-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 26 15:42:17 crc restorecon[4595]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-nimbusmosaic-dusim not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 26 15:42:17 crc restorecon[4595]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-nimbusmosaic-dusim/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 26 15:42:17 crc restorecon[4595]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-rest-api-browser-v1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 26 15:42:17 crc restorecon[4595]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-rest-api-browser-v1/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 26 15:42:17 crc restorecon[4595]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-appsec not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 26 15:42:17 crc restorecon[4595]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-appsec/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 26 15:42:17 crc restorecon[4595]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-core not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 26 15:42:17 crc restorecon[4595]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-core/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 26 15:42:17 crc restorecon[4595]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-db not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 26 15:42:17 crc restorecon[4595]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-db/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 26 15:42:17 crc restorecon[4595]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-diagnostics not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 26 15:42:17 crc restorecon[4595]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-diagnostics/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 26 15:42:17 crc restorecon[4595]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-logging not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 26 15:42:17 crc restorecon[4595]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-logging/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 26 15:42:17 crc restorecon[4595]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-migration not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 26 15:42:17 crc restorecon[4595]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-migration/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 26 15:42:17 crc restorecon[4595]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-msg-broker not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 26 15:42:17 crc restorecon[4595]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-msg-broker/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 26 15:42:17 crc restorecon[4595]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-notifications not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 26 15:42:17 crc restorecon[4595]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-notifications/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 26 15:42:17 crc restorecon[4595]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-stats-dashboards not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 26 15:42:17 crc restorecon[4595]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-stats-dashboards/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 26 15:42:17 crc restorecon[4595]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-storage not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 26 15:42:17 crc restorecon[4595]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-storage/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 26 15:42:17 crc restorecon[4595]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-test-core not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 26 15:42:17 crc restorecon[4595]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-test-core/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 26 15:42:17 crc restorecon[4595]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-ui not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 26 15:42:17 crc restorecon[4595]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-ui/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 26 15:42:17 crc restorecon[4595]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-websocket-service not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 26 15:42:17 crc restorecon[4595]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-websocket-service/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 26 15:42:17 crc restorecon[4595]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kong-gateway-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 26 15:42:17 crc restorecon[4595]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kong-gateway-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 26 15:42:17 crc restorecon[4595]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubearmor-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 26 15:42:17 crc restorecon[4595]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubearmor-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 26 15:42:17 crc restorecon[4595]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubecost-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 26 15:42:17 crc restorecon[4595]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubecost-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 26 15:42:17 crc restorecon[4595]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubemq-operator-marketplace not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 26 15:42:17 crc restorecon[4595]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubemq-operator-marketplace/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 26 15:42:17 crc restorecon[4595]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubeturbo-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 26 15:42:17 crc restorecon[4595]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubeturbo-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 26 15:42:17 crc restorecon[4595]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lenovo-locd-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 26 15:42:17 crc restorecon[4595]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lenovo-locd-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 26 15:42:17 crc restorecon[4595]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/marketplace-games-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 26 15:42:17 crc restorecon[4595]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/marketplace-games-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 26 15:42:17 crc restorecon[4595]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/memcached-operator-ogaye not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 26 15:42:17 crc restorecon[4595]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/memcached-operator-ogaye/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 26 15:42:17 crc restorecon[4595]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/memory-machine-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 26 15:42:17 crc restorecon[4595]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/memory-machine-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 26 15:42:17 crc restorecon[4595]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/model-builder-for-vision-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 26 15:42:17 crc restorecon[4595]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/model-builder-for-vision-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 26 15:42:17 crc restorecon[4595]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-atlas-kubernetes not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 26 15:42:17 crc restorecon[4595]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-atlas-kubernetes/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 26 15:42:17 crc restorecon[4595]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-enterprise not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 26 15:42:17 crc restorecon[4595]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-enterprise/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 26 15:42:17 crc restorecon[4595]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netapp-spark-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 26 15:42:17 crc restorecon[4595]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netapp-spark-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 26 15:42:17 crc restorecon[4595]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netscaler-adm-agent-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 26 15:42:17 crc restorecon[4595]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netscaler-adm-agent-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 26 15:42:17 crc restorecon[4595]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netscaler-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 26 15:42:17 crc restorecon[4595]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netscaler-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 26 15:42:17 crc restorecon[4595]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/neuvector-certified-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 26 15:42:17 crc restorecon[4595]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/neuvector-certified-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 26 15:42:17 crc restorecon[4595]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nexus-repository-ha-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 26 15:42:17 crc restorecon[4595]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nexus-repository-ha-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 26 15:42:17 crc restorecon[4595]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nginx-ingress-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 26 15:42:17 crc restorecon[4595]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nginx-ingress-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 26 15:42:17 crc restorecon[4595]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pcc-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 26 15:42:17 crc restorecon[4595]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pcc-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 26 15:42:17 crc restorecon[4595]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nim-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 26 15:42:17 crc restorecon[4595]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nim-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 26 15:42:17 crc restorecon[4595]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nxiq-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 26 15:42:17 crc restorecon[4595]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nxiq-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 26 15:42:17 crc restorecon[4595]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nxrm-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 26 15:42:17 crc restorecon[4595]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nxrm-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 26 15:42:17 crc restorecon[4595]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odigos-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 26 15:42:17 crc restorecon[4595]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odigos-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 26 15:42:17 crc restorecon[4595]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/open-liberty-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 26 15:42:17 crc restorecon[4595]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/open-liberty-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 26 15:42:17 crc restorecon[4595]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshiftartifactoryha-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 26 15:42:17 crc restorecon[4595]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshiftartifactoryha-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 26 15:42:17 crc restorecon[4595]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshiftxray-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 26 15:42:17 crc restorecon[4595]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshiftxray-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 26 15:42:17 crc restorecon[4595]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/operator-certification-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 26 15:42:17 crc restorecon[4595]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/operator-certification-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 26 15:42:17 crc restorecon[4595]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ovms-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 26 15:42:17 crc restorecon[4595]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ovms-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 26 15:42:17 crc restorecon[4595]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pachyderm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 26 15:42:17 crc restorecon[4595]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pachyderm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 26 15:42:17 crc restorecon[4595]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pmem-csi-operator-os not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 26 15:42:17 crc restorecon[4595]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pmem-csi-operator-os/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 26 15:42:17 crc restorecon[4595]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/portworx-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 26 15:42:17 crc restorecon[4595]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/portworx-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 26 15:42:17 crc restorecon[4595]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometurbo-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 26 15:42:17 crc restorecon[4595]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometurbo-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 26 15:42:17 crc restorecon[4595]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pubsubplus-eventbroker-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 26 15:42:17 crc restorecon[4595]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pubsubplus-eventbroker-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 26 15:42:17 crc restorecon[4595]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redis-enterprise-operator-cert not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 26 15:42:17 crc restorecon[4595]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redis-enterprise-operator-cert/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 26 15:42:17 crc restorecon[4595]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/runtime-component-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 26 15:42:17 crc restorecon[4595]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/runtime-component-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 26 15:42:17 crc restorecon[4595]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/runtime-fabric-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 26 15:42:17 crc restorecon[4595]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/runtime-fabric-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 26 15:42:17 crc restorecon[4595]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sanstoragecsi-operator-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 26 15:42:17 crc restorecon[4595]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sanstoragecsi-operator-bundle/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 26 15:42:17 crc restorecon[4595]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/silicom-sts-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 26 15:42:17 crc restorecon[4595]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/silicom-sts-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 26 15:42:17 crc restorecon[4595]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/smilecdr-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 26 15:42:17 crc restorecon[4595]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/smilecdr-operator/catalog.json not 
reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 26 15:42:17 crc restorecon[4595]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sriov-fec not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 26 15:42:17 crc restorecon[4595]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sriov-fec/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 26 15:42:17 crc restorecon[4595]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stackable-commons-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 26 15:42:17 crc restorecon[4595]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stackable-commons-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 26 15:42:17 crc restorecon[4595]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stackable-zookeeper-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 26 15:42:17 crc restorecon[4595]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stackable-zookeeper-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 26 15:42:17 crc restorecon[4595]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 26 15:42:17 crc restorecon[4595]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 26 15:42:17 crc restorecon[4595]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c-tsc-client-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 26 15:42:17 crc restorecon[4595]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c-tsc-client-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 26 15:42:17 crc restorecon[4595]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tawon-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 26 15:42:17 crc restorecon[4595]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tawon-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 26 15:42:17 crc restorecon[4595]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tigera-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 26 15:42:17 crc restorecon[4595]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tigera-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 26 15:42:17 crc restorecon[4595]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/timemachine-operator not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Feb 26 15:42:17 crc restorecon[4595]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/timemachine-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 26 15:42:17 crc restorecon[4595]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vault-secrets-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 26 15:42:17 crc restorecon[4595]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vault-secrets-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 26 15:42:17 crc restorecon[4595]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vcp-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 26 15:42:17 crc restorecon[4595]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vcp-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 26 15:42:17 crc restorecon[4595]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/webotx-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 26 15:42:17 crc restorecon[4595]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/webotx-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 26 15:42:17 crc restorecon[4595]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/xcrypt-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 26 15:42:17 crc restorecon[4595]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/xcrypt-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 26 15:42:17 crc restorecon[4595]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/zabbix-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 26 15:42:17 crc restorecon[4595]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/zabbix-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 26 15:42:17 crc restorecon[4595]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 26 15:42:17 crc restorecon[4595]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 26 15:42:17 crc restorecon[4595]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 26 15:42:17 crc restorecon[4595]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 26 15:42:17 
crc restorecon[4595]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 26 15:42:17 crc restorecon[4595]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/db.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 26 15:42:17 crc restorecon[4595]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/index.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 26 15:42:17 crc restorecon[4595]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/main.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 26 15:42:17 crc restorecon[4595]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/overflow.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 26 15:42:17 crc restorecon[4595]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/digest not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 26 15:42:17 crc restorecon[4595]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/utilities not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 26 15:42:17 crc restorecon[4595]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/utilities/copy-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 26 15:42:17 crc 
restorecon[4595]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 26 15:42:17 crc restorecon[4595]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/extract-utilities/63709497 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 26 15:42:17 crc restorecon[4595]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/extract-utilities/d966b7fd not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 26 15:42:17 crc restorecon[4595]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/extract-utilities/f5773757 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 26 15:42:17 crc restorecon[4595]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/extract-content/81c9edb9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 26 15:42:17 crc restorecon[4595]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/extract-content/57bf57ee not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 26 15:42:17 crc restorecon[4595]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/extract-content/86f5e6aa not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 26 15:42:17 crc restorecon[4595]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/registry-server/0aabe31d not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 26 15:42:17 crc restorecon[4595]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/registry-server/d2af85c2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 26 15:42:17 crc restorecon[4595]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/registry-server/09d157d9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 26 15:42:17 crc restorecon[4595]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 26 15:42:17 crc restorecon[4595]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 26 15:42:17 crc restorecon[4595]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 26 15:42:17 crc restorecon[4595]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 26 15:42:17 crc restorecon[4595]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 26 15:42:17 crc restorecon[4595]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 26 15:42:17 crc restorecon[4595]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/db.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 26 15:42:17 crc restorecon[4595]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/index.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 26 15:42:17 crc restorecon[4595]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/main.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 26 15:42:17 crc restorecon[4595]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/overflow.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 26 15:42:17 crc restorecon[4595]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/digest not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 26 15:42:17 crc restorecon[4595]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 26 15:42:17 crc restorecon[4595]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/3scale-community-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 26 15:42:17 crc restorecon[4595]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/3scale-community-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 26 15:42:17 crc restorecon[4595]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-acm-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 
Feb 26 15:42:17 crc restorecon[4595]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-acm-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 26 15:42:17 crc restorecon[4595]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-acmpca-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 26 15:42:17 crc restorecon[4595]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-acmpca-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 26 15:42:17 crc restorecon[4595]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-apigateway-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 26 15:42:17 crc restorecon[4595]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-apigateway-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 26 15:42:17 crc restorecon[4595]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-apigatewayv2-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 26 15:42:17 crc restorecon[4595]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-apigatewayv2-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 26 15:42:17 crc restorecon[4595]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-applicationautoscaling-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 26 15:42:17 crc restorecon[4595]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-applicationautoscaling-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 26 15:42:17 crc restorecon[4595]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-athena-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 26 15:42:17 crc restorecon[4595]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-athena-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 26 15:42:17 crc restorecon[4595]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudfront-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 26 15:42:17 crc restorecon[4595]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudfront-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 26 15:42:17 crc restorecon[4595]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudtrail-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 26 15:42:17 crc restorecon[4595]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudtrail-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 26 15:42:17 crc restorecon[4595]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudwatch-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 26 15:42:17 crc restorecon[4595]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudwatch-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 26 15:42:17 crc restorecon[4595]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudwatchlogs-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 26 15:42:17 crc restorecon[4595]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudwatchlogs-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 26 15:42:17 crc restorecon[4595]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-documentdb-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 26 15:42:17 crc restorecon[4595]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-documentdb-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 26 15:42:17 crc restorecon[4595]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-dynamodb-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 26 15:42:17 crc restorecon[4595]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-dynamodb-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 26 15:42:17 crc restorecon[4595]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ec2-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 26 15:42:17 crc restorecon[4595]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ec2-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 26 15:42:17 crc restorecon[4595]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ecr-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 26 15:42:17 crc restorecon[4595]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ecr-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 26 15:42:17 crc restorecon[4595]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ecs-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 26 15:42:17 crc restorecon[4595]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ecs-controller/catalog.json not reset as customized by admin 
to system_u:object_r:container_file_t:s0:c7,c13
Feb 26 15:42:17 crc restorecon[4595]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-efs-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 26 15:42:17 crc restorecon[4595]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-efs-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 26 15:42:17 crc restorecon[4595]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-eks-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 26 15:42:17 crc restorecon[4595]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-eks-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 26 15:42:17 crc restorecon[4595]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-elasticache-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 26 15:42:17 crc restorecon[4595]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-elasticache-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 26 15:42:17 crc restorecon[4595]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-elbv2-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 26 15:42:17 crc restorecon[4595]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-elbv2-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 26 15:42:17 crc restorecon[4595]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-emrcontainers-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 26 15:42:17 crc restorecon[4595]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-emrcontainers-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 26 15:42:17 crc restorecon[4595]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-eventbridge-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 26 15:42:17 crc restorecon[4595]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-eventbridge-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 26 15:42:17 crc restorecon[4595]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-iam-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 26 15:42:17 crc restorecon[4595]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-iam-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 26 15:42:17 crc restorecon[4595]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-kafka-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 26 15:42:17 crc restorecon[4595]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-kafka-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 26 15:42:17 crc restorecon[4595]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-keyspaces-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 26 15:42:17 crc restorecon[4595]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-keyspaces-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 26 15:42:17 crc restorecon[4595]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-kinesis-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 26 15:42:17 crc restorecon[4595]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-kinesis-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 26 15:42:17 crc restorecon[4595]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-kms-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 26 15:42:17 crc restorecon[4595]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-kms-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 26 15:42:17 crc restorecon[4595]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-lambda-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 26 15:42:17 crc restorecon[4595]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-lambda-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 26 15:42:17 crc restorecon[4595]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-memorydb-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 26 15:42:17 crc restorecon[4595]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-memorydb-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 26 15:42:17 crc restorecon[4595]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-mq-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 26 15:42:17 crc restorecon[4595]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-mq-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 26 15:42:17 crc restorecon[4595]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-networkfirewall-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 26 15:42:17 crc restorecon[4595]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-networkfirewall-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 26 15:42:17 crc restorecon[4595]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-opensearchservice-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 26 15:42:17 crc restorecon[4595]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-opensearchservice-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 26 15:42:17 crc restorecon[4595]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-organizations-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 26 15:42:17 crc restorecon[4595]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-organizations-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 26 15:42:17 crc restorecon[4595]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-pipes-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 26 15:42:17 crc restorecon[4595]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-pipes-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 26 15:42:17 crc restorecon[4595]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-prometheusservice-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 26 15:42:17 crc restorecon[4595]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-prometheusservice-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 26 15:42:17 crc restorecon[4595]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-rds-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 26 15:42:17 crc restorecon[4595]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-rds-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 26 15:42:17 crc restorecon[4595]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-recyclebin-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 26 15:42:17 crc restorecon[4595]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-recyclebin-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 26 15:42:17 crc restorecon[4595]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-route53-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 26 15:42:17 crc restorecon[4595]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-route53-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 26 15:42:17 crc restorecon[4595]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-route53resolver-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 26 15:42:17 crc restorecon[4595]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-route53resolver-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 26 15:42:17 crc restorecon[4595]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-s3-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 26 15:42:17 crc restorecon[4595]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-s3-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 26 15:42:17 crc restorecon[4595]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sagemaker-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 26 15:42:17 crc restorecon[4595]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sagemaker-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 26 15:42:17 crc restorecon[4595]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-secretsmanager-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 26 15:42:17 crc restorecon[4595]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-secretsmanager-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 26 15:42:17 crc restorecon[4595]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ses-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 26 15:42:17 crc restorecon[4595]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ses-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 26 15:42:17 crc restorecon[4595]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sfn-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 26 15:42:17 crc restorecon[4595]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sfn-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 26 15:42:17 crc restorecon[4595]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sns-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 26 15:42:17 crc restorecon[4595]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sns-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 26 15:42:17 crc restorecon[4595]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sqs-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 26 15:42:17 crc restorecon[4595]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sqs-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 26 15:42:17 crc restorecon[4595]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ssm-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 26 15:42:17 crc restorecon[4595]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ssm-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 26 15:42:17 crc restorecon[4595]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-wafv2-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 26 15:42:17 crc restorecon[4595]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-wafv2-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 26 15:42:17 crc restorecon[4595]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aerospike-kubernetes-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 26 15:42:17 crc restorecon[4595]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aerospike-kubernetes-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 26 15:42:17 crc restorecon[4595]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/airflow-helm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 26 15:42:17 crc restorecon[4595]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/airflow-helm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 26 15:42:17 crc restorecon[4595]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/alloydb-omni-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 26 15:42:17 crc restorecon[4595]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/alloydb-omni-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 26 15:42:17 crc restorecon[4595]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/alvearie-imaging-ingestion not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 26 15:42:17 crc restorecon[4595]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/alvearie-imaging-ingestion/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 26 15:42:17 crc restorecon[4595]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amd-gpu-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 26 15:42:17 crc restorecon[4595]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amd-gpu-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 26 15:42:17 crc restorecon[4595]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/analytics-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 26 15:42:17 crc restorecon[4595]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/analytics-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 26 15:42:17 crc restorecon[4595]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/annotationlab not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 26 15:42:17 crc restorecon[4595]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/annotationlab/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 26 15:42:17 crc restorecon[4595]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicast-community-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 26 15:42:17 crc restorecon[4595]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicast-community-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 26 15:42:17 crc restorecon[4595]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurio-api-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 26 15:42:17 crc restorecon[4595]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurio-api-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 26 15:42:17 crc restorecon[4595]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurio-registry not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 26 15:42:17 crc restorecon[4595]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurio-registry/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 26 15:42:17 crc restorecon[4595]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurito not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 26 15:42:17 crc restorecon[4595]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurito/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 26 15:42:17 crc restorecon[4595]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apimatic-kubernetes-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 26 15:42:17 crc restorecon[4595]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apimatic-kubernetes-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 26 15:42:17 crc restorecon[4595]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/application-services-metering-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 26 15:42:17 crc restorecon[4595]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/application-services-metering-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 26 15:42:17 crc restorecon[4595]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aqua not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 26 15:42:17 crc restorecon[4595]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aqua/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 26 15:42:17 crc restorecon[4595]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/argocd-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 26 15:42:17 crc restorecon[4595]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/argocd-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 26 15:42:17 crc restorecon[4595]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/assisted-service-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 26 15:42:17 crc restorecon[4595]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/assisted-service-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 26 15:42:17 crc restorecon[4595]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/authorino-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 26 15:42:17 crc restorecon[4595]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/authorino-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 26 15:42:17 crc restorecon[4595]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/automotive-infra not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 26 15:42:17 crc restorecon[4595]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/automotive-infra/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 26 15:42:17 crc restorecon[4595]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aws-efs-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 26 15:42:17 crc restorecon[4595]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aws-efs-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 26 15:42:17 crc restorecon[4595]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/awss3-operator-registry not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 26 15:42:17 crc restorecon[4595]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/awss3-operator-registry/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 26 15:42:17 crc restorecon[4595]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/azure-service-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 26 15:42:17 crc restorecon[4595]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/azure-service-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 26 15:42:17 crc restorecon[4595]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/beegfs-csi-driver-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 26 15:42:17 crc restorecon[4595]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/beegfs-csi-driver-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 26 15:42:17 crc restorecon[4595]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bpfman-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 26 15:42:17 crc restorecon[4595]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bpfman-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 26 15:42:17 crc restorecon[4595]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/camel-k not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 26 15:42:17 crc restorecon[4595]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/camel-k/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 26 15:42:17 crc restorecon[4595]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/camel-karavan-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 26 15:42:17 crc restorecon[4595]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/camel-karavan-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 26 15:42:17 crc restorecon[4595]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cass-operator-community not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 26 15:42:17 crc restorecon[4595]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cass-operator-community/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 26 15:42:17 crc restorecon[4595]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cert-manager not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 26 15:42:17 crc restorecon[4595]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cert-manager/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 26 15:42:17 crc restorecon[4595]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cert-utils-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 26 15:42:17 crc restorecon[4595]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cert-utils-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 26 15:42:17 crc restorecon[4595]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-aas-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 26 15:42:17 crc restorecon[4595]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-aas-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 26 15:42:17 crc restorecon[4595]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-impairment-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 26 15:42:17 crc restorecon[4595]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-impairment-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 26 15:42:17 crc restorecon[4595]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-manager not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 26 15:42:17 crc restorecon[4595]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-manager/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 26 15:42:17 crc restorecon[4595]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cockroachdb not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 26 15:42:17 crc restorecon[4595]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cockroachdb/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 26 15:42:17 crc restorecon[4595]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/codeflare-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 26 15:42:17 crc restorecon[4595]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/codeflare-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 26 15:42:17 crc restorecon[4595]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/community-kubevirt-hyperconverged not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 26 15:42:17 crc restorecon[4595]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/community-kubevirt-hyperconverged/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 26 15:42:17 crc restorecon[4595]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/community-trivy-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 26 15:42:17 crc restorecon[4595]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/community-trivy-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 26 15:42:17 crc restorecon[4595]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/community-windows-machine-config-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 26 15:42:17 crc restorecon[4595]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/community-windows-machine-config-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 26 15:42:17 crc restorecon[4595]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/customized-user-remediation not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 26 15:42:17 crc restorecon[4595]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/customized-user-remediation/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 26 15:42:17 crc restorecon[4595]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cxl-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 26 15:42:17 crc restorecon[4595]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cxl-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 26 15:42:17 crc restorecon[4595]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dapr-kubernetes-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 26 15:42:17 crc restorecon[4595]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dapr-kubernetes-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 26 15:42:17 crc restorecon[4595]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datadog-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 26 15:42:17 crc restorecon[4595]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datadog-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 26 15:42:17 crc restorecon[4595]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datatrucker-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 26 15:42:17 crc restorecon[4595]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datatrucker-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 26 15:42:17 crc restorecon[4595]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dbaas-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 26 15:42:17 crc restorecon[4595]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dbaas-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 26 15:42:17 crc restorecon[4595]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/debezium-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 26 15:42:17 crc restorecon[4595]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/debezium-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 26 15:42:17 crc restorecon[4595]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dell-csm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 26 15:42:17 crc restorecon[4595]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dell-csm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 26 15:42:17 crc restorecon[4595]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/deployment-validation-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 26 15:42:17 crc restorecon[4595]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/deployment-validation-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 26 15:42:17 crc restorecon[4595]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/devopsinabox not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 26 15:42:17 crc restorecon[4595]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/devopsinabox/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 26 15:42:17 crc restorecon[4595]:
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dns-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 26 15:42:17 crc restorecon[4595]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dns-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 26 15:42:17 crc restorecon[4595]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dynatrace-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 26 15:42:17 crc restorecon[4595]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dynatrace-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 26 15:42:17 crc restorecon[4595]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eclipse-amlen-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 26 15:42:17 crc restorecon[4595]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eclipse-amlen-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 26 15:42:17 crc restorecon[4595]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eclipse-che not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 26 15:42:17 crc restorecon[4595]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eclipse-che/catalog.json not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Feb 26 15:42:17 crc restorecon[4595]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ecr-secret-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 26 15:42:17 crc restorecon[4595]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ecr-secret-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 26 15:42:17 crc restorecon[4595]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/edp-keycloak-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 26 15:42:17 crc restorecon[4595]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/edp-keycloak-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 26 15:42:17 crc restorecon[4595]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eginnovations-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 26 15:42:17 crc restorecon[4595]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eginnovations-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 26 15:42:17 crc restorecon[4595]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/egressip-ipam-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 26 15:42:17 crc restorecon[4595]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/egressip-ipam-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 26 15:42:17 crc restorecon[4595]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ember-csi-community-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 26 15:42:17 crc restorecon[4595]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ember-csi-community-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 26 15:42:17 crc restorecon[4595]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/etcd not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 26 15:42:17 crc restorecon[4595]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/etcd/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 26 15:42:17 crc restorecon[4595]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eventing-kogito not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 26 15:42:17 crc restorecon[4595]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eventing-kogito/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 26 15:42:17 crc restorecon[4595]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/external-secrets-operator not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Feb 26 15:42:17 crc restorecon[4595]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/external-secrets-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 26 15:42:17 crc restorecon[4595]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/falcon-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 26 15:42:17 crc restorecon[4595]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/falcon-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 26 15:42:17 crc restorecon[4595]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fence-agents-remediation not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 26 15:42:17 crc restorecon[4595]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fence-agents-remediation/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 26 15:42:17 crc restorecon[4595]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/flink-kubernetes-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 26 15:42:17 crc restorecon[4595]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/flink-kubernetes-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 26 15:42:17 crc restorecon[4595]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/flux not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 26 15:42:17 crc restorecon[4595]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/flux/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 26 15:42:17 crc restorecon[4595]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k8gb not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 26 15:42:17 crc restorecon[4595]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k8gb/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 26 15:42:17 crc restorecon[4595]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fossul-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 26 15:42:17 crc restorecon[4595]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fossul-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 26 15:42:17 crc restorecon[4595]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/github-arc-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 26 15:42:17 crc restorecon[4595]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/github-arc-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 26 15:42:17 crc 
restorecon[4595]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/gitops-primer not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 26 15:42:17 crc restorecon[4595]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/gitops-primer/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 26 15:42:17 crc restorecon[4595]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/gitwebhook-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 26 15:42:17 crc restorecon[4595]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/gitwebhook-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 26 15:42:17 crc restorecon[4595]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/global-load-balancer-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 26 15:42:17 crc restorecon[4595]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/global-load-balancer-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 26 15:42:17 crc restorecon[4595]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/grafana-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 26 15:42:17 crc restorecon[4595]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/grafana-operator/catalog.json not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 26 15:42:17 crc restorecon[4595]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/group-sync-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 26 15:42:17 crc restorecon[4595]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/group-sync-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 26 15:42:17 crc restorecon[4595]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hawtio-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 26 15:42:17 crc restorecon[4595]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hawtio-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 26 15:42:17 crc restorecon[4595]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hazelcast-platform-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 26 15:42:17 crc restorecon[4595]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hazelcast-platform-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 26 15:42:17 crc restorecon[4595]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hedvig-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 26 15:42:17 crc restorecon[4595]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hedvig-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 26 15:42:17 crc restorecon[4595]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hive-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 26 15:42:17 crc restorecon[4595]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hive-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 26 15:42:17 crc restorecon[4595]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/horreum-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 26 15:42:17 crc restorecon[4595]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/horreum-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 26 15:42:17 crc restorecon[4595]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hyperfoil-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 26 15:42:17 crc restorecon[4595]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hyperfoil-bundle/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 26 15:42:17 crc restorecon[4595]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-block-csi-operator-community not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Feb 26 15:42:17 crc restorecon[4595]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-block-csi-operator-community/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 26 15:42:17 crc restorecon[4595]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-access-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 26 15:42:17 crc restorecon[4595]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-access-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 26 15:42:17 crc restorecon[4595]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-spectrum-scale-csi-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 26 15:42:17 crc restorecon[4595]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-spectrum-scale-csi-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 26 15:42:17 crc restorecon[4595]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibmcloud-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 26 15:42:17 crc restorecon[4595]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibmcloud-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 26 15:42:17 crc restorecon[4595]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infinispan not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 26 15:42:17 crc restorecon[4595]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infinispan/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 26 15:42:17 crc restorecon[4595]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/integrity-shield-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 26 15:42:17 crc restorecon[4595]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/integrity-shield-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 26 15:42:17 crc restorecon[4595]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ipfs-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 26 15:42:17 crc restorecon[4595]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ipfs-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 26 15:42:17 crc restorecon[4595]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/istio-workspace-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 26 15:42:17 crc restorecon[4595]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/istio-workspace-operator/catalog.json not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Feb 26 15:42:17 crc restorecon[4595]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/jaeger not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 26 15:42:17 crc restorecon[4595]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/jaeger/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 26 15:42:17 crc restorecon[4595]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kaoto-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 26 15:42:17 crc restorecon[4595]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kaoto-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 26 15:42:17 crc restorecon[4595]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keda not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 26 15:42:17 crc restorecon[4595]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keda/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 26 15:42:17 crc restorecon[4595]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keepalived-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 26 15:42:17 crc restorecon[4595]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keepalived-operator/catalog.json not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 26 15:42:17 crc restorecon[4595]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keycloak-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 26 15:42:17 crc restorecon[4595]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keycloak-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 26 15:42:17 crc restorecon[4595]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keycloak-permissions-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 26 15:42:17 crc restorecon[4595]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keycloak-permissions-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 26 15:42:17 crc restorecon[4595]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/klusterlet not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 26 15:42:17 crc restorecon[4595]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/klusterlet/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 26 15:42:17 crc restorecon[4595]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kogito-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 26 15:42:17 crc restorecon[4595]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kogito-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 26 15:42:17 crc restorecon[4595]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/koku-metrics-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 26 15:42:17 crc restorecon[4595]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/koku-metrics-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 26 15:42:17 crc restorecon[4595]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/konveyor-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 26 15:42:17 crc restorecon[4595]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/konveyor-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 26 15:42:17 crc restorecon[4595]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/korrel8r not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 26 15:42:17 crc restorecon[4595]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/korrel8r/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 26 15:42:17 crc restorecon[4595]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kuadrant-operator not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Feb 26 15:42:17 crc restorecon[4595]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kuadrant-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 26 15:42:17 crc restorecon[4595]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kube-green not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 26 15:42:17 crc restorecon[4595]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kube-green/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 26 15:42:17 crc restorecon[4595]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubecost not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 26 15:42:17 crc restorecon[4595]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubecost/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 26 15:42:17 crc restorecon[4595]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubernetes-imagepuller-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 26 15:42:17 crc restorecon[4595]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubernetes-imagepuller-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 26 15:42:17 crc restorecon[4595]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubeturbo not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 26 15:42:17 crc restorecon[4595]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubeturbo/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 26 15:42:17 crc restorecon[4595]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/l5-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 26 15:42:17 crc restorecon[4595]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/l5-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 26 15:42:17 crc restorecon[4595]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/layer7-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 26 15:42:17 crc restorecon[4595]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/layer7-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 26 15:42:17 crc restorecon[4595]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lbconfig-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 26 15:42:17 crc restorecon[4595]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lbconfig-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 
Feb 26 15:42:17 crc restorecon[4595]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lib-bucket-provisioner not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 26 15:42:17 crc restorecon[4595]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lib-bucket-provisioner/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 26 15:42:17 crc restorecon[4595]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/limitador-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 26 15:42:17 crc restorecon[4595]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/limitador-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 26 15:42:17 crc restorecon[4595]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/logging-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 26 15:42:17 crc restorecon[4595]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/logging-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 26 15:42:17 crc restorecon[4595]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/loki-helm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 26 15:42:17 crc restorecon[4595]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/loki-helm-operator/catalog.json 
not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 26 15:42:17 crc restorecon[4595]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/loki-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 26 15:42:17 crc restorecon[4595]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/loki-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 26 15:42:17 crc restorecon[4595]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/machine-deletion-remediation not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 26 15:42:17 crc restorecon[4595]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/machine-deletion-remediation/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 26 15:42:17 crc restorecon[4595]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mariadb-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 26 15:42:17 crc restorecon[4595]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mariadb-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 26 15:42:17 crc restorecon[4595]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/marin3r not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 26 15:42:17 crc restorecon[4595]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/marin3r/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 26 15:42:17 crc restorecon[4595]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mercury-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 26 15:42:17 crc restorecon[4595]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mercury-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 26 15:42:17 crc restorecon[4595]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/microcks not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 26 15:42:17 crc restorecon[4595]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/microcks/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 26 15:42:17 crc restorecon[4595]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-atlas-kubernetes not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 26 15:42:17 crc restorecon[4595]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-atlas-kubernetes/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 26 15:42:17 crc restorecon[4595]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-operator not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Feb 26 15:42:17 crc restorecon[4595]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 26 15:42:17 crc restorecon[4595]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/move2kube-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 26 15:42:17 crc restorecon[4595]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/move2kube-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 26 15:42:17 crc restorecon[4595]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multi-nic-cni-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 26 15:42:17 crc restorecon[4595]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multi-nic-cni-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 26 15:42:17 crc restorecon[4595]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multicluster-global-hub-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 26 15:42:17 crc restorecon[4595]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multicluster-global-hub-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 26 15:42:17 crc restorecon[4595]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multicluster-operators-subscription not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 26 15:42:17 crc restorecon[4595]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multicluster-operators-subscription/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 26 15:42:17 crc restorecon[4595]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/must-gather-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 26 15:42:17 crc restorecon[4595]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/must-gather-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 26 15:42:17 crc restorecon[4595]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/namespace-configuration-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 26 15:42:17 crc restorecon[4595]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/namespace-configuration-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 26 15:42:17 crc restorecon[4595]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ncn-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 26 15:42:17 crc restorecon[4595]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ncn-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 26 15:42:17 crc restorecon[4595]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ndmspc-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 26 15:42:17 crc restorecon[4595]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ndmspc-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 26 15:42:17 crc restorecon[4595]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netobserv-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 26 15:42:17 crc restorecon[4595]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netobserv-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 26 15:42:17 crc restorecon[4595]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/neuvector-community-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 26 15:42:17 crc restorecon[4595]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/neuvector-community-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 26 15:42:17 crc restorecon[4595]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nexus-operator not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Feb 26 15:42:17 crc restorecon[4595]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nexus-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 26 15:42:17 crc restorecon[4595]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nexus-operator-m88i not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 26 15:42:17 crc restorecon[4595]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nexus-operator-m88i/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 26 15:42:17 crc restorecon[4595]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nfs-provisioner-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 26 15:42:17 crc restorecon[4595]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nfs-provisioner-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 26 15:42:17 crc restorecon[4595]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nlp-server not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 26 15:42:17 crc restorecon[4595]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nlp-server/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 26 15:42:17 crc restorecon[4595]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-discovery-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 26 15:42:17 crc restorecon[4595]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-discovery-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 26 15:42:17 crc restorecon[4595]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-healthcheck-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 26 15:42:17 crc restorecon[4595]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-healthcheck-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 26 15:42:17 crc restorecon[4595]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-maintenance-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 26 15:42:17 crc restorecon[4595]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-maintenance-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 26 15:42:17 crc restorecon[4595]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nsm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 26 15:42:17 crc restorecon[4595]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nsm-operator/catalog.json not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 26 15:42:17 crc restorecon[4595]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/oadp-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 26 15:42:17 crc restorecon[4595]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/oadp-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 26 15:42:17 crc restorecon[4595]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/observability-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 26 15:42:17 crc restorecon[4595]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/observability-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 26 15:42:17 crc restorecon[4595]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/oci-ccm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 26 15:42:17 crc restorecon[4595]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/oci-ccm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 26 15:42:17 crc restorecon[4595]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ocm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 26 15:42:17 crc restorecon[4595]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ocm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 26 15:42:17 crc restorecon[4595]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odoo-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 26 15:42:17 crc restorecon[4595]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odoo-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 26 15:42:17 crc restorecon[4595]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/opendatahub-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 26 15:42:17 crc restorecon[4595]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/opendatahub-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 26 15:42:17 crc restorecon[4595]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openebs not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 26 15:42:17 crc restorecon[4595]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openebs/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 26 15:42:17 crc restorecon[4595]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-nfd-operator not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Feb 26 15:42:17 crc restorecon[4595]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-nfd-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 26 15:42:17 crc restorecon[4595]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-node-upgrade-mutex-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 26 15:42:17 crc restorecon[4595]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-node-upgrade-mutex-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 26 15:42:17 crc restorecon[4595]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-qiskit-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 26 15:42:17 crc restorecon[4595]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-qiskit-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 26 15:42:17 crc restorecon[4595]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/opentelemetry-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 26 15:42:17 crc restorecon[4595]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/opentelemetry-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 26 15:42:17 crc restorecon[4595]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/patch-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 26 15:42:17 crc restorecon[4595]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/patch-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 26 15:42:17 crc restorecon[4595]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/patterns-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 26 15:42:17 crc restorecon[4595]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/patterns-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 26 15:42:17 crc restorecon[4595]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pcc-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 26 15:42:17 crc restorecon[4595]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pcc-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 26 15:42:17 crc restorecon[4595]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pelorus-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 26 15:42:17 crc restorecon[4595]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pelorus-operator/catalog.json not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Feb 26 15:42:17 crc restorecon[4595]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/percona-xtradb-cluster-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 26 15:42:17 crc restorecon[4595]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/percona-xtradb-cluster-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 26 15:42:17 crc restorecon[4595]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/portworx-essentials not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 26 15:42:17 crc restorecon[4595]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/portworx-essentials/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 26 15:42:17 crc restorecon[4595]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/postgresql not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 26 15:42:17 crc restorecon[4595]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/postgresql/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 26 15:42:17 crc restorecon[4595]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/proactive-node-scaling-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 26 15:42:17 crc restorecon[4595]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/proactive-node-scaling-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 26 15:42:17 crc restorecon[4595]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/project-quay not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 26 15:42:17 crc restorecon[4595]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/project-quay/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 26 15:42:17 crc restorecon[4595]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometheus not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 26 15:42:17 crc restorecon[4595]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometheus/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 26 15:42:17 crc restorecon[4595]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometheus-exporter-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 26 15:42:17 crc restorecon[4595]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometheus-exporter-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 26 15:42:17 crc restorecon[4595]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometurbo not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Feb 26 15:42:17 crc restorecon[4595]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometurbo/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 26 15:42:17 crc restorecon[4595]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pubsubplus-eventbroker-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 26 15:42:17 crc restorecon[4595]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pubsubplus-eventbroker-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 26 15:42:17 crc restorecon[4595]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pulp-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 26 15:42:17 crc restorecon[4595]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pulp-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 26 15:42:17 crc restorecon[4595]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rabbitmq-cluster-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 26 15:42:17 crc restorecon[4595]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rabbitmq-cluster-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 26 15:42:17 crc restorecon[4595]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rabbitmq-messaging-topology-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 26 15:42:17 crc restorecon[4595]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rabbitmq-messaging-topology-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 26 15:42:17 crc restorecon[4595]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redis-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 26 15:42:17 crc restorecon[4595]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redis-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 26 15:42:17 crc restorecon[4595]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/reportportal-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 26 15:42:17 crc restorecon[4595]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/reportportal-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 26 15:42:17 crc restorecon[4595]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/resource-locker-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 26 15:42:17 crc restorecon[4595]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/resource-locker-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 26 15:42:17 crc restorecon[4595]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhoas-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 26 15:42:17 crc restorecon[4595]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhoas-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 26 15:42:17 crc restorecon[4595]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ripsaw not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 26 15:42:17 crc restorecon[4595]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ripsaw/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 26 15:42:17 crc restorecon[4595]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sailoperator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 26 15:42:17 crc restorecon[4595]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sailoperator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 26 15:42:17 crc restorecon[4595]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sap-commerce-operator not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Feb 26 15:42:17 crc restorecon[4595]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sap-commerce-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 26 15:42:17 crc restorecon[4595]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sap-data-intelligence-observer-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 26 15:42:17 crc restorecon[4595]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sap-data-intelligence-observer-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 26 15:42:17 crc restorecon[4595]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sap-hana-express-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 26 15:42:17 crc restorecon[4595]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sap-hana-express-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 26 15:42:17 crc restorecon[4595]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/seldon-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 26 15:42:17 crc restorecon[4595]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/seldon-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 26 15:42:17 crc restorecon[4595]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/self-node-remediation not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 26 15:42:17 crc restorecon[4595]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/self-node-remediation/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 26 15:42:17 crc restorecon[4595]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/service-binding-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 26 15:42:17 crc restorecon[4595]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/service-binding-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 26 15:42:17 crc restorecon[4595]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/shipwright-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 26 15:42:17 crc restorecon[4595]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/shipwright-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 26 15:42:17 crc restorecon[4595]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sigstore-helm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 26 15:42:17 crc restorecon[4595]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sigstore-helm-operator/catalog.json not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 26 15:42:17 crc restorecon[4595]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/silicom-sts-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 26 15:42:17 crc restorecon[4595]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/silicom-sts-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 26 15:42:17 crc restorecon[4595]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/skupper-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 26 15:42:17 crc restorecon[4595]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/skupper-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 26 15:42:17 crc restorecon[4595]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/snapscheduler not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 26 15:42:17 crc restorecon[4595]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/snapscheduler/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 26 15:42:17 crc restorecon[4595]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/snyk-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 26 15:42:17 crc restorecon[4595]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/snyk-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 26 15:42:17 crc restorecon[4595]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/socmmd not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 26 15:42:17 crc restorecon[4595]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/socmmd/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 26 15:42:17 crc restorecon[4595]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sonar-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 26 15:42:17 crc restorecon[4595]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sonar-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 26 15:42:17 crc restorecon[4595]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sosivio not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 26 15:42:17 crc restorecon[4595]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sosivio/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 26 15:42:17 crc restorecon[4595]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sonataflow-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 26 15:42:17 crc 
restorecon[4595]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sonataflow-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 26 15:42:17 crc restorecon[4595]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sosreport-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 26 15:42:17 crc restorecon[4595]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sosreport-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 26 15:42:17 crc restorecon[4595]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/spark-helm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 26 15:42:17 crc restorecon[4595]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/spark-helm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 26 15:42:17 crc restorecon[4595]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/special-resource-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 26 15:42:17 crc restorecon[4595]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/special-resource-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 26 15:42:17 crc restorecon[4595]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stolostron not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 26 15:42:17 crc restorecon[4595]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stolostron/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 26 15:42:17 crc restorecon[4595]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stolostron-engine not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 26 15:42:17 crc restorecon[4595]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stolostron-engine/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 26 15:42:17 crc restorecon[4595]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/strimzi-kafka-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 26 15:42:17 crc restorecon[4595]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/strimzi-kafka-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 26 15:42:17 crc restorecon[4595]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/syndesis not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 26 15:42:17 crc restorecon[4595]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/syndesis/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 26 15:42:17 crc restorecon[4595]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 26 15:42:17 crc restorecon[4595]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 26 15:42:17 crc restorecon[4595]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tagger not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 26 15:42:17 crc restorecon[4595]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tagger/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 26 15:42:17 crc restorecon[4595]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tempo-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 26 15:42:17 crc restorecon[4595]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tempo-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 26 15:42:17 crc restorecon[4595]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tf-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 26 15:42:17 crc restorecon[4595]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tf-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 26 15:42:17 crc 
restorecon[4595]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tidb-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 26 15:42:17 crc restorecon[4595]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tidb-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 26 15:42:17 crc restorecon[4595]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/trident-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 26 15:42:17 crc restorecon[4595]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/trident-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 26 15:42:17 crc restorecon[4595]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/trustify-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 26 15:42:17 crc restorecon[4595]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/trustify-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 26 15:42:17 crc restorecon[4595]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ucs-ci-solutions-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 26 15:42:17 crc restorecon[4595]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ucs-ci-solutions-operator/catalog.json not reset as customized by 
admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 26 15:42:17 crc restorecon[4595]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/universal-crossplane not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 26 15:42:17 crc restorecon[4595]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/universal-crossplane/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 26 15:42:17 crc restorecon[4595]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/varnish-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 26 15:42:17 crc restorecon[4595]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/varnish-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 26 15:42:17 crc restorecon[4595]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vault-config-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 26 15:42:17 crc restorecon[4595]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vault-config-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 26 15:42:17 crc restorecon[4595]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/verticadb-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 26 15:42:17 crc restorecon[4595]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/verticadb-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 26 15:42:17 crc restorecon[4595]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/volume-expander-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 26 15:42:17 crc restorecon[4595]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/volume-expander-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 26 15:42:17 crc restorecon[4595]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/wandb-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 26 15:42:17 crc restorecon[4595]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/wandb-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 26 15:42:17 crc restorecon[4595]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/windup-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 26 15:42:17 crc restorecon[4595]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/windup-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 26 15:42:17 crc restorecon[4595]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/yaks not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Feb 26 15:42:17 crc restorecon[4595]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/yaks/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 26 15:42:17 crc restorecon[4595]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/utilities not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 26 15:42:17 crc restorecon[4595]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/utilities/copy-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 26 15:42:17 crc restorecon[4595]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 26 15:42:17 crc restorecon[4595]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/extract-utilities/c0fe7256 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 26 15:42:17 crc restorecon[4595]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/extract-utilities/c30319e4 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 26 15:42:17 crc restorecon[4595]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/extract-utilities/e6b1dd45 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 26 15:42:17 crc restorecon[4595]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/extract-content/2bb643f0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 26 15:42:17 crc restorecon[4595]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/extract-content/920de426 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Feb 26 15:42:17 crc restorecon[4595]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/extract-content/70fa1e87 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 26 15:42:17 crc restorecon[4595]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/registry-server/a1c12a2f not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 26 15:42:17 crc restorecon[4595]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/registry-server/9442e6c7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 26 15:42:17 crc restorecon[4595]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/registry-server/5b45ec72 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 26 15:42:17 crc restorecon[4595]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 26 15:42:17 crc restorecon[4595]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 26 15:42:17 crc restorecon[4595]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/abot-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 26 15:42:17 crc restorecon[4595]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/abot-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 26 15:42:17 crc restorecon[4595]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aerospike-kubernetes-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 26 15:42:17 crc restorecon[4595]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aerospike-kubernetes-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 26 15:42:17 crc restorecon[4595]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aikit-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 26 15:42:17 crc restorecon[4595]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aikit-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 26 15:42:17 crc restorecon[4595]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzo-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 26 15:42:17 crc restorecon[4595]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzo-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 26 15:42:17 crc restorecon[4595]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzograph-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 26 15:42:17 crc restorecon[4595]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzograph-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 26 15:42:17 crc restorecon[4595]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzounstructured-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 26 15:42:17 crc restorecon[4595]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzounstructured-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 26 15:42:17 crc restorecon[4595]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudbees-ci-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 26 15:42:17 crc restorecon[4595]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudbees-ci-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 26 15:42:17 crc restorecon[4595]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cockroachdb-certified-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 26 15:42:17 crc restorecon[4595]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cockroachdb-certified-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 26 15:42:17 crc restorecon[4595]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/crunchy-postgres-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 26 15:42:17 crc restorecon[4595]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/crunchy-postgres-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 26 15:42:17 crc restorecon[4595]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datadog-operator-certified-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 26 15:42:17 crc restorecon[4595]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datadog-operator-certified-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 26 15:42:17 crc restorecon[4595]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dynatrace-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 26 15:42:17 crc restorecon[4595]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dynatrace-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 26 15:42:17 crc restorecon[4595]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/entando-k8s-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 26 15:42:17 crc restorecon[4595]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/entando-k8s-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 26 15:42:17 crc restorecon[4595]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/flux not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 26 15:42:17 crc restorecon[4595]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/flux/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 26 15:42:17 crc restorecon[4595]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/instana-agent-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 26 15:42:17 crc restorecon[4595]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/instana-agent-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 26 15:42:17 crc restorecon[4595]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/iomesh-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 26 15:42:17 crc restorecon[4595]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/iomesh-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 26 15:42:17 crc restorecon[4595]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx-operator-rhmp not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Feb 26 15:42:17 crc restorecon[4595]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 26 15:42:17 crc restorecon[4595]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx8-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 26 15:42:17 crc restorecon[4595]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx8-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 26 15:42:17 crc restorecon[4595]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k10-kasten-operator-paygo-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 26 15:42:17 crc restorecon[4595]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k10-kasten-operator-paygo-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 26 15:42:17 crc restorecon[4595]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k10-kasten-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 26 15:42:17 crc restorecon[4595]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k10-kasten-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 26 15:42:17 crc restorecon[4595]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k10-kasten-operator-term-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 26 15:42:17 crc restorecon[4595]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k10-kasten-operator-term-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 26 15:42:17 crc restorecon[4595]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubemq-operator-marketplace-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 26 15:42:17 crc restorecon[4595]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubemq-operator-marketplace-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 26 15:42:17 crc restorecon[4595]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubeturbo-certified-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 26 15:42:17 crc restorecon[4595]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubeturbo-certified-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 26 15:42:17 crc restorecon[4595]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/linstor-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 26 15:42:17 crc restorecon[4595]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/linstor-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 26 15:42:17 crc restorecon[4595]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/marketplace-games-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 26 15:42:17 crc restorecon[4595]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/marketplace-games-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 26 15:42:17 crc restorecon[4595]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/model-builder-for-vision-certified-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 26 15:42:17 crc restorecon[4595]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/model-builder-for-vision-certified-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 26 15:42:17 crc restorecon[4595]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/neuvector-certified-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 26 15:42:17 crc restorecon[4595]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/neuvector-certified-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 26 15:42:17 crc restorecon[4595]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ovms-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 26 15:42:17 crc restorecon[4595]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ovms-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 26 15:42:17 crc restorecon[4595]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pachyderm-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 26 15:42:17 crc restorecon[4595]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pachyderm-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 26 15:42:17 crc restorecon[4595]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redis-enterprise-operator-cert-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 26 15:42:17 crc restorecon[4595]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redis-enterprise-operator-cert-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 26 15:42:17 crc restorecon[4595]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/seldon-deploy-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 26 15:42:17 crc restorecon[4595]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/seldon-deploy-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 26 15:42:17 crc restorecon[4595]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/starburst-enterprise-helm-operator-paygo-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 26 15:42:17 crc restorecon[4595]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/starburst-enterprise-helm-operator-paygo-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 26 15:42:17 crc restorecon[4595]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/starburst-enterprise-helm-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 26 15:42:17 crc restorecon[4595]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/starburst-enterprise-helm-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 26 15:42:17 crc restorecon[4595]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c-certified-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 26 15:42:17 crc restorecon[4595]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c-certified-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 26 15:42:17 crc restorecon[4595]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/timemachine-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 26 15:42:17 crc restorecon[4595]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/timemachine-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 26 15:42:17 crc restorecon[4595]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vfunction-server-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 26 15:42:17 crc restorecon[4595]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vfunction-server-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 26 15:42:17 crc restorecon[4595]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/xcrypt-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 26 15:42:17 crc restorecon[4595]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/xcrypt-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 26 15:42:17 crc restorecon[4595]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/yugabyte-platform-operator-bundle-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 26 15:42:17 crc restorecon[4595]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/yugabyte-platform-operator-bundle-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 26 15:42:17 crc restorecon[4595]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/zabbix-operator-certified-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 26 15:42:17 crc restorecon[4595]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/zabbix-operator-certified-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 26 15:42:17 crc restorecon[4595]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 26 15:42:17 crc restorecon[4595]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 26 15:42:17 crc restorecon[4595]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 26 15:42:17 crc restorecon[4595]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 26 15:42:17 crc restorecon[4595]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg.pmt not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Feb 26 15:42:17 crc restorecon[4595]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/db.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 26 15:42:17 crc restorecon[4595]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/index.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 26 15:42:17 crc restorecon[4595]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/main.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 26 15:42:17 crc restorecon[4595]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/overflow.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 26 15:42:17 crc restorecon[4595]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/digest not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 26 15:42:17 crc restorecon[4595]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/utilities not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 26 15:42:17 crc restorecon[4595]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/utilities/copy-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 26 15:42:17 crc restorecon[4595]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 26 15:42:17 crc restorecon[4595]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/extract-utilities/3c9f3a59 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 26 15:42:17 crc restorecon[4595]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/extract-utilities/1091c11b not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 26 15:42:17 crc restorecon[4595]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/extract-utilities/9a6821c6 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 26 15:42:17 crc restorecon[4595]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/extract-content/ec0c35e2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 26 15:42:17 crc restorecon[4595]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/extract-content/517f37e7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 26 15:42:17 crc restorecon[4595]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/extract-content/6214fe78 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 26 15:42:17 crc restorecon[4595]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/registry-server/ba189c8b not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 26 15:42:17 crc restorecon[4595]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/registry-server/351e4f31 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 26 15:42:17 crc restorecon[4595]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/registry-server/c0f219ff not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 26 15:42:17 crc restorecon[4595]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/etc-hosts 
not reset as customized by admin to system_u:object_r:container_file_t:s0:c247,c522 Feb 26 15:42:17 crc restorecon[4595]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/wait-for-host-port/8069f607 not reset as customized by admin to system_u:object_r:container_file_t:s0:c378,c723 Feb 26 15:42:17 crc restorecon[4595]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/wait-for-host-port/559c3d82 not reset as customized by admin to system_u:object_r:container_file_t:s0:c133,c223 Feb 26 15:42:17 crc restorecon[4595]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/wait-for-host-port/605ad488 not reset as customized by admin to system_u:object_r:container_file_t:s0:c247,c522 Feb 26 15:42:17 crc restorecon[4595]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler/148df488 not reset as customized by admin to system_u:object_r:container_file_t:s0:c378,c723 Feb 26 15:42:17 crc restorecon[4595]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler/3bf6dcb4 not reset as customized by admin to system_u:object_r:container_file_t:s0:c133,c223 Feb 26 15:42:17 crc restorecon[4595]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler/022a2feb not reset as customized by admin to system_u:object_r:container_file_t:s0:c247,c522 Feb 26 15:42:17 crc restorecon[4595]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler-cert-syncer/938c3924 not reset as customized by admin to system_u:object_r:container_file_t:s0:c378,c723 Feb 26 15:42:17 crc restorecon[4595]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler-cert-syncer/729fe23e not reset as customized by admin to system_u:object_r:container_file_t:s0:c133,c223 Feb 26 15:42:17 crc restorecon[4595]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler-cert-syncer/1fd5cbd4 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c247,c522 Feb 26 15:42:17 crc restorecon[4595]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler-recovery-controller/a96697e1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c378,c723 Feb 26 15:42:17 crc restorecon[4595]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler-recovery-controller/e155ddca not reset as customized by admin to system_u:object_r:container_file_t:s0:c133,c223 Feb 26 15:42:17 crc restorecon[4595]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler-recovery-controller/10dd0e0f not reset as customized by admin to system_u:object_r:container_file_t:s0:c247,c522 Feb 26 15:42:17 crc restorecon[4595]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-trusted-ca-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Feb 26 15:42:17 crc restorecon[4595]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-trusted-ca-bundle/..2025_02_24_06_09_35.3018472960 not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Feb 26 15:42:17 crc restorecon[4595]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-trusted-ca-bundle/..2025_02_24_06_09_35.3018472960/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Feb 26 15:42:17 crc restorecon[4595]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-trusted-ca-bundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Feb 26 15:42:17 crc restorecon[4595]: 
/var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-trusted-ca-bundle/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Feb 26 15:42:17 crc restorecon[4595]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/audit-policies not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Feb 26 15:42:17 crc restorecon[4595]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/audit-policies/..2025_02_24_06_09_35.4262376737 not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Feb 26 15:42:17 crc restorecon[4595]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/audit-policies/..2025_02_24_06_09_35.4262376737/audit.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Feb 26 15:42:17 crc restorecon[4595]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/audit-policies/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Feb 26 15:42:17 crc restorecon[4595]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/audit-policies/audit.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Feb 26 15:42:17 crc restorecon[4595]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-cliconfig not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Feb 26 15:42:17 crc restorecon[4595]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-cliconfig/..2025_02_24_06_09_35.2630275752 not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Feb 26 15:42:17 crc 
restorecon[4595]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-cliconfig/..2025_02_24_06_09_35.2630275752/v4-0-config-system-cliconfig not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Feb 26 15:42:17 crc restorecon[4595]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-cliconfig/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Feb 26 15:42:17 crc restorecon[4595]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-cliconfig/v4-0-config-system-cliconfig not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Feb 26 15:42:17 crc restorecon[4595]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-service-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Feb 26 15:42:17 crc restorecon[4595]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-service-ca/..2025_02_24_06_09_35.2376963788 not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Feb 26 15:42:17 crc restorecon[4595]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-service-ca/..2025_02_24_06_09_35.2376963788/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Feb 26 15:42:17 crc restorecon[4595]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-service-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Feb 26 15:42:17 crc restorecon[4595]: 
/var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-service-ca/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Feb 26 15:42:17 crc restorecon[4595]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Feb 26 15:42:17 crc restorecon[4595]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/containers/oauth-openshift/6f2c8392 not reset as customized by admin to system_u:object_r:container_file_t:s0:c267,c588 Feb 26 15:42:17 crc restorecon[4595]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/containers/oauth-openshift/bd241ad9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Feb 26 15:42:17 crc restorecon[4595]: /var/lib/kubelet/plugins not reset as customized by admin to system_u:object_r:container_file_t:s0 Feb 26 15:42:17 crc restorecon[4595]: /var/lib/kubelet/plugins/csi-hostpath not reset as customized by admin to system_u:object_r:container_file_t:s0 Feb 26 15:42:17 crc restorecon[4595]: /var/lib/kubelet/plugins/csi-hostpath/csi.sock not reset as customized by admin to system_u:object_r:container_file_t:s0 Feb 26 15:42:17 crc restorecon[4595]: /var/lib/kubelet/plugins/kubernetes.io not reset as customized by admin to system_u:object_r:container_file_t:s0 Feb 26 15:42:17 crc restorecon[4595]: /var/lib/kubelet/plugins/kubernetes.io/csi not reset as customized by admin to system_u:object_r:container_file_t:s0 Feb 26 15:42:17 crc restorecon[4595]: /var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner not reset as customized by admin to system_u:object_r:container_file_t:s0 Feb 26 15:42:17 crc restorecon[4595]: /var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/1f4776af88835e41c12b831b4c9fed40233456d14189815a54dbe7f892fc1983 not reset as customized by admin to 
system_u:object_r:container_file_t:s0 Feb 26 15:42:17 crc restorecon[4595]: /var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/1f4776af88835e41c12b831b4c9fed40233456d14189815a54dbe7f892fc1983/globalmount not reset as customized by admin to system_u:object_r:container_file_t:s0 Feb 26 15:42:17 crc restorecon[4595]: /var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/1f4776af88835e41c12b831b4c9fed40233456d14189815a54dbe7f892fc1983/vol_data.json not reset as customized by admin to system_u:object_r:container_file_t:s0 Feb 26 15:42:17 crc restorecon[4595]: /var/lib/kubelet/plugins_registry not reset as customized by admin to system_u:object_r:container_file_t:s0 Feb 26 15:42:17 crc restorecon[4595]: Relabeled /var/usrlocal/bin/kubenswrapper from system_u:object_r:bin_t:s0 to system_u:object_r:kubelet_exec_t:s0 Feb 26 15:42:17 crc kubenswrapper[4907]: Flag --container-runtime-endpoint has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information. Feb 26 15:42:17 crc kubenswrapper[4907]: Flag --minimum-container-ttl-duration has been deprecated, Use --eviction-hard or --eviction-soft instead. Will be removed in a future version. Feb 26 15:42:17 crc kubenswrapper[4907]: Flag --volume-plugin-dir has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information. Feb 26 15:42:17 crc kubenswrapper[4907]: Flag --register-with-taints has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information. 
Feb 26 15:42:17 crc kubenswrapper[4907]: Flag --pod-infra-container-image has been deprecated, will be removed in a future release. Image garbage collector will get sandbox image information from CRI. Feb 26 15:42:17 crc kubenswrapper[4907]: Flag --system-reserved has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information. Feb 26 15:42:17 crc kubenswrapper[4907]: I0226 15:42:17.894124 4907 server.go:211] "--pod-infra-container-image will not be pruned by the image garbage collector in kubelet and should also be set in the remote runtime" Feb 26 15:42:17 crc kubenswrapper[4907]: W0226 15:42:17.899459 4907 feature_gate.go:330] unrecognized feature gate: VSphereDriverConfiguration Feb 26 15:42:17 crc kubenswrapper[4907]: W0226 15:42:17.899481 4907 feature_gate.go:353] Setting GA feature gate CloudDualStackNodeIPs=true. It will be removed in a future release. Feb 26 15:42:17 crc kubenswrapper[4907]: W0226 15:42:17.899489 4907 feature_gate.go:330] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController Feb 26 15:42:17 crc kubenswrapper[4907]: W0226 15:42:17.899495 4907 feature_gate.go:330] unrecognized feature gate: NetworkSegmentation Feb 26 15:42:17 crc kubenswrapper[4907]: W0226 15:42:17.899501 4907 feature_gate.go:353] Setting GA feature gate ValidatingAdmissionPolicy=true. It will be removed in a future release. 
Feb 26 15:42:17 crc kubenswrapper[4907]: W0226 15:42:17.899507 4907 feature_gate.go:330] unrecognized feature gate: MachineConfigNodes Feb 26 15:42:17 crc kubenswrapper[4907]: W0226 15:42:17.899512 4907 feature_gate.go:330] unrecognized feature gate: PinnedImages Feb 26 15:42:17 crc kubenswrapper[4907]: W0226 15:42:17.899517 4907 feature_gate.go:330] unrecognized feature gate: MultiArchInstallGCP Feb 26 15:42:17 crc kubenswrapper[4907]: W0226 15:42:17.899522 4907 feature_gate.go:330] unrecognized feature gate: GatewayAPI Feb 26 15:42:17 crc kubenswrapper[4907]: W0226 15:42:17.899527 4907 feature_gate.go:330] unrecognized feature gate: ManagedBootImages Feb 26 15:42:17 crc kubenswrapper[4907]: W0226 15:42:17.899532 4907 feature_gate.go:330] unrecognized feature gate: InsightsConfig Feb 26 15:42:17 crc kubenswrapper[4907]: W0226 15:42:17.899537 4907 feature_gate.go:330] unrecognized feature gate: RouteAdvertisements Feb 26 15:42:17 crc kubenswrapper[4907]: W0226 15:42:17.899541 4907 feature_gate.go:330] unrecognized feature gate: InsightsConfigAPI Feb 26 15:42:17 crc kubenswrapper[4907]: W0226 15:42:17.899547 4907 feature_gate.go:330] unrecognized feature gate: PersistentIPsForVirtualization Feb 26 15:42:17 crc kubenswrapper[4907]: W0226 15:42:17.899558 4907 feature_gate.go:330] unrecognized feature gate: ConsolePluginContentSecurityPolicy Feb 26 15:42:17 crc kubenswrapper[4907]: W0226 15:42:17.899564 4907 feature_gate.go:330] unrecognized feature gate: BuildCSIVolumes Feb 26 15:42:17 crc kubenswrapper[4907]: W0226 15:42:17.899568 4907 feature_gate.go:330] unrecognized feature gate: ExternalOIDC Feb 26 15:42:17 crc kubenswrapper[4907]: W0226 15:42:17.899573 4907 feature_gate.go:330] unrecognized feature gate: ManagedBootImagesAWS Feb 26 15:42:17 crc kubenswrapper[4907]: W0226 15:42:17.899578 4907 feature_gate.go:330] unrecognized feature gate: NetworkLiveMigration Feb 26 15:42:17 crc kubenswrapper[4907]: W0226 15:42:17.899582 4907 feature_gate.go:330] unrecognized 
feature gate: BootcNodeManagement Feb 26 15:42:17 crc kubenswrapper[4907]: W0226 15:42:17.899604 4907 feature_gate.go:351] Setting deprecated feature gate KMSv1=true. It will be removed in a future release. Feb 26 15:42:17 crc kubenswrapper[4907]: W0226 15:42:17.899609 4907 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstall Feb 26 15:42:17 crc kubenswrapper[4907]: W0226 15:42:17.899614 4907 feature_gate.go:330] unrecognized feature gate: ImageStreamImportMode Feb 26 15:42:17 crc kubenswrapper[4907]: W0226 15:42:17.899618 4907 feature_gate.go:330] unrecognized feature gate: VSphereMultiNetworks Feb 26 15:42:17 crc kubenswrapper[4907]: W0226 15:42:17.899623 4907 feature_gate.go:330] unrecognized feature gate: AWSClusterHostedDNS Feb 26 15:42:17 crc kubenswrapper[4907]: W0226 15:42:17.899627 4907 feature_gate.go:330] unrecognized feature gate: VSphereMultiVCenters Feb 26 15:42:17 crc kubenswrapper[4907]: W0226 15:42:17.899633 4907 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstallIBMCloud Feb 26 15:42:17 crc kubenswrapper[4907]: W0226 15:42:17.899638 4907 feature_gate.go:330] unrecognized feature gate: AdditionalRoutingCapabilities Feb 26 15:42:17 crc kubenswrapper[4907]: W0226 15:42:17.899642 4907 feature_gate.go:330] unrecognized feature gate: InsightsOnDemandDataGather Feb 26 15:42:17 crc kubenswrapper[4907]: W0226 15:42:17.899646 4907 feature_gate.go:330] unrecognized feature gate: SetEIPForNLBIngressController Feb 26 15:42:17 crc kubenswrapper[4907]: W0226 15:42:17.899650 4907 feature_gate.go:330] unrecognized feature gate: AWSEFSDriverVolumeMetrics Feb 26 15:42:17 crc kubenswrapper[4907]: W0226 15:42:17.899655 4907 feature_gate.go:330] unrecognized feature gate: GCPLabelsTags Feb 26 15:42:17 crc kubenswrapper[4907]: W0226 15:42:17.899659 4907 feature_gate.go:330] unrecognized feature gate: OpenShiftPodSecurityAdmission Feb 26 15:42:17 crc kubenswrapper[4907]: W0226 15:42:17.899663 4907 feature_gate.go:330] unrecognized feature gate: 
IngressControllerLBSubnetsAWS Feb 26 15:42:17 crc kubenswrapper[4907]: W0226 15:42:17.899668 4907 feature_gate.go:330] unrecognized feature gate: IngressControllerDynamicConfigurationManager Feb 26 15:42:17 crc kubenswrapper[4907]: W0226 15:42:17.899672 4907 feature_gate.go:330] unrecognized feature gate: AzureWorkloadIdentity Feb 26 15:42:17 crc kubenswrapper[4907]: W0226 15:42:17.899676 4907 feature_gate.go:330] unrecognized feature gate: AutomatedEtcdBackup Feb 26 15:42:17 crc kubenswrapper[4907]: W0226 15:42:17.899681 4907 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAzure Feb 26 15:42:17 crc kubenswrapper[4907]: W0226 15:42:17.899685 4907 feature_gate.go:330] unrecognized feature gate: OnClusterBuild Feb 26 15:42:17 crc kubenswrapper[4907]: W0226 15:42:17.899690 4907 feature_gate.go:330] unrecognized feature gate: SigstoreImageVerification Feb 26 15:42:17 crc kubenswrapper[4907]: W0226 15:42:17.899694 4907 feature_gate.go:330] unrecognized feature gate: MinimumKubeletVersion Feb 26 15:42:17 crc kubenswrapper[4907]: W0226 15:42:17.899699 4907 feature_gate.go:330] unrecognized feature gate: OVNObservability Feb 26 15:42:17 crc kubenswrapper[4907]: W0226 15:42:17.899705 4907 feature_gate.go:330] unrecognized feature gate: AlibabaPlatform Feb 26 15:42:17 crc kubenswrapper[4907]: W0226 15:42:17.899713 4907 feature_gate.go:353] Setting GA feature gate DisableKubeletCloudCredentialProviders=true. It will be removed in a future release. 
Feb 26 15:42:17 crc kubenswrapper[4907]: W0226 15:42:17.899718 4907 feature_gate.go:330] unrecognized feature gate: NutanixMultiSubnets Feb 26 15:42:17 crc kubenswrapper[4907]: W0226 15:42:17.899723 4907 feature_gate.go:330] unrecognized feature gate: MixedCPUsAllocation Feb 26 15:42:17 crc kubenswrapper[4907]: W0226 15:42:17.899728 4907 feature_gate.go:330] unrecognized feature gate: MetricsCollectionProfiles Feb 26 15:42:17 crc kubenswrapper[4907]: W0226 15:42:17.899732 4907 feature_gate.go:330] unrecognized feature gate: VSphereStaticIPs Feb 26 15:42:17 crc kubenswrapper[4907]: W0226 15:42:17.899737 4907 feature_gate.go:330] unrecognized feature gate: SignatureStores Feb 26 15:42:17 crc kubenswrapper[4907]: W0226 15:42:17.899742 4907 feature_gate.go:330] unrecognized feature gate: ClusterMonitoringConfig Feb 26 15:42:17 crc kubenswrapper[4907]: W0226 15:42:17.899746 4907 feature_gate.go:330] unrecognized feature gate: NodeDisruptionPolicy Feb 26 15:42:17 crc kubenswrapper[4907]: W0226 15:42:17.899751 4907 feature_gate.go:330] unrecognized feature gate: Example Feb 26 15:42:17 crc kubenswrapper[4907]: W0226 15:42:17.899755 4907 feature_gate.go:330] unrecognized feature gate: EtcdBackendQuota Feb 26 15:42:17 crc kubenswrapper[4907]: W0226 15:42:17.899760 4907 feature_gate.go:330] unrecognized feature gate: NewOLM Feb 26 15:42:17 crc kubenswrapper[4907]: W0226 15:42:17.899764 4907 feature_gate.go:330] unrecognized feature gate: PlatformOperators Feb 26 15:42:17 crc kubenswrapper[4907]: W0226 15:42:17.899768 4907 feature_gate.go:330] unrecognized feature gate: GCPClusterHostedDNS Feb 26 15:42:17 crc kubenswrapper[4907]: W0226 15:42:17.899773 4907 feature_gate.go:330] unrecognized feature gate: MachineAPIMigration Feb 26 15:42:17 crc kubenswrapper[4907]: W0226 15:42:17.899777 4907 feature_gate.go:330] unrecognized feature gate: BareMetalLoadBalancer Feb 26 15:42:17 crc kubenswrapper[4907]: W0226 15:42:17.899781 4907 feature_gate.go:330] unrecognized feature gate: 
InsightsRuntimeExtractor Feb 26 15:42:17 crc kubenswrapper[4907]: W0226 15:42:17.899785 4907 feature_gate.go:330] unrecognized feature gate: AdminNetworkPolicy Feb 26 15:42:17 crc kubenswrapper[4907]: W0226 15:42:17.899790 4907 feature_gate.go:330] unrecognized feature gate: VSphereControlPlaneMachineSet Feb 26 15:42:17 crc kubenswrapper[4907]: W0226 15:42:17.899794 4907 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAWS Feb 26 15:42:17 crc kubenswrapper[4907]: W0226 15:42:17.899798 4907 feature_gate.go:330] unrecognized feature gate: HardwareSpeed Feb 26 15:42:17 crc kubenswrapper[4907]: W0226 15:42:17.899802 4907 feature_gate.go:330] unrecognized feature gate: UpgradeStatus Feb 26 15:42:17 crc kubenswrapper[4907]: W0226 15:42:17.899807 4907 feature_gate.go:330] unrecognized feature gate: MachineAPIProviderOpenStack Feb 26 15:42:17 crc kubenswrapper[4907]: W0226 15:42:17.899811 4907 feature_gate.go:330] unrecognized feature gate: CSIDriverSharedResource Feb 26 15:42:17 crc kubenswrapper[4907]: W0226 15:42:17.899815 4907 feature_gate.go:330] unrecognized feature gate: DNSNameResolver Feb 26 15:42:17 crc kubenswrapper[4907]: W0226 15:42:17.899821 4907 feature_gate.go:330] unrecognized feature gate: NetworkDiagnosticsConfig Feb 26 15:42:17 crc kubenswrapper[4907]: W0226 15:42:17.899825 4907 feature_gate.go:330] unrecognized feature gate: PrivateHostedZoneAWS Feb 26 15:42:17 crc kubenswrapper[4907]: W0226 15:42:17.899829 4907 feature_gate.go:330] unrecognized feature gate: VolumeGroupSnapshot Feb 26 15:42:17 crc kubenswrapper[4907]: W0226 15:42:17.899834 4907 feature_gate.go:330] unrecognized feature gate: ChunkSizeMiB Feb 26 15:42:17 crc kubenswrapper[4907]: I0226 15:42:17.900518 4907 flags.go:64] FLAG: --address="0.0.0.0" Feb 26 15:42:17 crc kubenswrapper[4907]: I0226 15:42:17.900534 4907 flags.go:64] FLAG: --allowed-unsafe-sysctls="[]" Feb 26 15:42:17 crc kubenswrapper[4907]: I0226 15:42:17.900544 4907 flags.go:64] FLAG: --anonymous-auth="true" Feb 
26 15:42:17 crc kubenswrapper[4907]: I0226 15:42:17.900551 4907 flags.go:64] FLAG: --application-metrics-count-limit="100" Feb 26 15:42:17 crc kubenswrapper[4907]: I0226 15:42:17.900558 4907 flags.go:64] FLAG: --authentication-token-webhook="false" Feb 26 15:42:17 crc kubenswrapper[4907]: I0226 15:42:17.900564 4907 flags.go:64] FLAG: --authentication-token-webhook-cache-ttl="2m0s" Feb 26 15:42:17 crc kubenswrapper[4907]: I0226 15:42:17.900571 4907 flags.go:64] FLAG: --authorization-mode="AlwaysAllow" Feb 26 15:42:17 crc kubenswrapper[4907]: I0226 15:42:17.900578 4907 flags.go:64] FLAG: --authorization-webhook-cache-authorized-ttl="5m0s" Feb 26 15:42:17 crc kubenswrapper[4907]: I0226 15:42:17.900602 4907 flags.go:64] FLAG: --authorization-webhook-cache-unauthorized-ttl="30s" Feb 26 15:42:17 crc kubenswrapper[4907]: I0226 15:42:17.900609 4907 flags.go:64] FLAG: --boot-id-file="/proc/sys/kernel/random/boot_id" Feb 26 15:42:17 crc kubenswrapper[4907]: I0226 15:42:17.900615 4907 flags.go:64] FLAG: --bootstrap-kubeconfig="/etc/kubernetes/kubeconfig" Feb 26 15:42:17 crc kubenswrapper[4907]: I0226 15:42:17.900620 4907 flags.go:64] FLAG: --cert-dir="/var/lib/kubelet/pki" Feb 26 15:42:17 crc kubenswrapper[4907]: I0226 15:42:17.900625 4907 flags.go:64] FLAG: --cgroup-driver="cgroupfs" Feb 26 15:42:17 crc kubenswrapper[4907]: I0226 15:42:17.900632 4907 flags.go:64] FLAG: --cgroup-root="" Feb 26 15:42:17 crc kubenswrapper[4907]: I0226 15:42:17.900637 4907 flags.go:64] FLAG: --cgroups-per-qos="true" Feb 26 15:42:17 crc kubenswrapper[4907]: I0226 15:42:17.900642 4907 flags.go:64] FLAG: --client-ca-file="" Feb 26 15:42:17 crc kubenswrapper[4907]: I0226 15:42:17.900647 4907 flags.go:64] FLAG: --cloud-config="" Feb 26 15:42:17 crc kubenswrapper[4907]: I0226 15:42:17.900652 4907 flags.go:64] FLAG: --cloud-provider="" Feb 26 15:42:17 crc kubenswrapper[4907]: I0226 15:42:17.900657 4907 flags.go:64] FLAG: --cluster-dns="[]" Feb 26 15:42:17 crc kubenswrapper[4907]: I0226 15:42:17.901038 
4907 flags.go:64] FLAG: --cluster-domain="" Feb 26 15:42:17 crc kubenswrapper[4907]: I0226 15:42:17.901046 4907 flags.go:64] FLAG: --config="/etc/kubernetes/kubelet.conf" Feb 26 15:42:17 crc kubenswrapper[4907]: I0226 15:42:17.901052 4907 flags.go:64] FLAG: --config-dir="" Feb 26 15:42:17 crc kubenswrapper[4907]: I0226 15:42:17.901057 4907 flags.go:64] FLAG: --container-hints="/etc/cadvisor/container_hints.json" Feb 26 15:42:17 crc kubenswrapper[4907]: I0226 15:42:17.901063 4907 flags.go:64] FLAG: --container-log-max-files="5" Feb 26 15:42:17 crc kubenswrapper[4907]: I0226 15:42:17.901071 4907 flags.go:64] FLAG: --container-log-max-size="10Mi" Feb 26 15:42:17 crc kubenswrapper[4907]: I0226 15:42:17.901077 4907 flags.go:64] FLAG: --container-runtime-endpoint="/var/run/crio/crio.sock" Feb 26 15:42:17 crc kubenswrapper[4907]: I0226 15:42:17.901083 4907 flags.go:64] FLAG: --containerd="/run/containerd/containerd.sock" Feb 26 15:42:17 crc kubenswrapper[4907]: I0226 15:42:17.901088 4907 flags.go:64] FLAG: --containerd-namespace="k8s.io" Feb 26 15:42:17 crc kubenswrapper[4907]: I0226 15:42:17.901094 4907 flags.go:64] FLAG: --contention-profiling="false" Feb 26 15:42:17 crc kubenswrapper[4907]: I0226 15:42:17.901099 4907 flags.go:64] FLAG: --cpu-cfs-quota="true" Feb 26 15:42:17 crc kubenswrapper[4907]: I0226 15:42:17.901104 4907 flags.go:64] FLAG: --cpu-cfs-quota-period="100ms" Feb 26 15:42:17 crc kubenswrapper[4907]: I0226 15:42:17.901110 4907 flags.go:64] FLAG: --cpu-manager-policy="none" Feb 26 15:42:17 crc kubenswrapper[4907]: I0226 15:42:17.901116 4907 flags.go:64] FLAG: --cpu-manager-policy-options="" Feb 26 15:42:17 crc kubenswrapper[4907]: I0226 15:42:17.901123 4907 flags.go:64] FLAG: --cpu-manager-reconcile-period="10s" Feb 26 15:42:17 crc kubenswrapper[4907]: I0226 15:42:17.901128 4907 flags.go:64] FLAG: --enable-controller-attach-detach="true" Feb 26 15:42:17 crc kubenswrapper[4907]: I0226 15:42:17.901134 4907 flags.go:64] FLAG: 
--enable-debugging-handlers="true" Feb 26 15:42:17 crc kubenswrapper[4907]: I0226 15:42:17.901139 4907 flags.go:64] FLAG: --enable-load-reader="false" Feb 26 15:42:17 crc kubenswrapper[4907]: I0226 15:42:17.901145 4907 flags.go:64] FLAG: --enable-server="true" Feb 26 15:42:17 crc kubenswrapper[4907]: I0226 15:42:17.901151 4907 flags.go:64] FLAG: --enforce-node-allocatable="[pods]" Feb 26 15:42:17 crc kubenswrapper[4907]: I0226 15:42:17.901158 4907 flags.go:64] FLAG: --event-burst="100" Feb 26 15:42:17 crc kubenswrapper[4907]: I0226 15:42:17.901163 4907 flags.go:64] FLAG: --event-qps="50" Feb 26 15:42:17 crc kubenswrapper[4907]: I0226 15:42:17.901169 4907 flags.go:64] FLAG: --event-storage-age-limit="default=0" Feb 26 15:42:17 crc kubenswrapper[4907]: I0226 15:42:17.901174 4907 flags.go:64] FLAG: --event-storage-event-limit="default=0" Feb 26 15:42:17 crc kubenswrapper[4907]: I0226 15:42:17.901179 4907 flags.go:64] FLAG: --eviction-hard="" Feb 26 15:42:17 crc kubenswrapper[4907]: I0226 15:42:17.901186 4907 flags.go:64] FLAG: --eviction-max-pod-grace-period="0" Feb 26 15:42:17 crc kubenswrapper[4907]: I0226 15:42:17.901192 4907 flags.go:64] FLAG: --eviction-minimum-reclaim="" Feb 26 15:42:17 crc kubenswrapper[4907]: I0226 15:42:17.901197 4907 flags.go:64] FLAG: --eviction-pressure-transition-period="5m0s" Feb 26 15:42:17 crc kubenswrapper[4907]: I0226 15:42:17.901202 4907 flags.go:64] FLAG: --eviction-soft="" Feb 26 15:42:17 crc kubenswrapper[4907]: I0226 15:42:17.901207 4907 flags.go:64] FLAG: --eviction-soft-grace-period="" Feb 26 15:42:17 crc kubenswrapper[4907]: I0226 15:42:17.901212 4907 flags.go:64] FLAG: --exit-on-lock-contention="false" Feb 26 15:42:17 crc kubenswrapper[4907]: I0226 15:42:17.901217 4907 flags.go:64] FLAG: --experimental-allocatable-ignore-eviction="false" Feb 26 15:42:17 crc kubenswrapper[4907]: I0226 15:42:17.901222 4907 flags.go:64] FLAG: --experimental-mounter-path="" Feb 26 15:42:17 crc kubenswrapper[4907]: I0226 15:42:17.901227 4907 
flags.go:64] FLAG: --fail-cgroupv1="false" Feb 26 15:42:17 crc kubenswrapper[4907]: I0226 15:42:17.901232 4907 flags.go:64] FLAG: --fail-swap-on="true" Feb 26 15:42:17 crc kubenswrapper[4907]: I0226 15:42:17.901237 4907 flags.go:64] FLAG: --feature-gates="" Feb 26 15:42:17 crc kubenswrapper[4907]: I0226 15:42:17.901243 4907 flags.go:64] FLAG: --file-check-frequency="20s" Feb 26 15:42:17 crc kubenswrapper[4907]: I0226 15:42:17.901248 4907 flags.go:64] FLAG: --global-housekeeping-interval="1m0s" Feb 26 15:42:17 crc kubenswrapper[4907]: I0226 15:42:17.901254 4907 flags.go:64] FLAG: --hairpin-mode="promiscuous-bridge" Feb 26 15:42:17 crc kubenswrapper[4907]: I0226 15:42:17.901259 4907 flags.go:64] FLAG: --healthz-bind-address="127.0.0.1" Feb 26 15:42:17 crc kubenswrapper[4907]: I0226 15:42:17.901264 4907 flags.go:64] FLAG: --healthz-port="10248" Feb 26 15:42:17 crc kubenswrapper[4907]: I0226 15:42:17.901269 4907 flags.go:64] FLAG: --help="false" Feb 26 15:42:17 crc kubenswrapper[4907]: I0226 15:42:17.901275 4907 flags.go:64] FLAG: --hostname-override="" Feb 26 15:42:17 crc kubenswrapper[4907]: I0226 15:42:17.901280 4907 flags.go:64] FLAG: --housekeeping-interval="10s" Feb 26 15:42:17 crc kubenswrapper[4907]: I0226 15:42:17.901285 4907 flags.go:64] FLAG: --http-check-frequency="20s" Feb 26 15:42:17 crc kubenswrapper[4907]: I0226 15:42:17.901291 4907 flags.go:64] FLAG: --image-credential-provider-bin-dir="" Feb 26 15:42:17 crc kubenswrapper[4907]: I0226 15:42:17.901296 4907 flags.go:64] FLAG: --image-credential-provider-config="" Feb 26 15:42:17 crc kubenswrapper[4907]: I0226 15:42:17.901301 4907 flags.go:64] FLAG: --image-gc-high-threshold="85" Feb 26 15:42:17 crc kubenswrapper[4907]: I0226 15:42:17.901306 4907 flags.go:64] FLAG: --image-gc-low-threshold="80" Feb 26 15:42:17 crc kubenswrapper[4907]: I0226 15:42:17.901311 4907 flags.go:64] FLAG: --image-service-endpoint="" Feb 26 15:42:17 crc kubenswrapper[4907]: I0226 15:42:17.901316 4907 flags.go:64] FLAG: 
--kernel-memcg-notification="false" Feb 26 15:42:17 crc kubenswrapper[4907]: I0226 15:42:17.901321 4907 flags.go:64] FLAG: --kube-api-burst="100" Feb 26 15:42:17 crc kubenswrapper[4907]: I0226 15:42:17.901326 4907 flags.go:64] FLAG: --kube-api-content-type="application/vnd.kubernetes.protobuf" Feb 26 15:42:17 crc kubenswrapper[4907]: I0226 15:42:17.901332 4907 flags.go:64] FLAG: --kube-api-qps="50" Feb 26 15:42:17 crc kubenswrapper[4907]: I0226 15:42:17.901338 4907 flags.go:64] FLAG: --kube-reserved="" Feb 26 15:42:17 crc kubenswrapper[4907]: I0226 15:42:17.901343 4907 flags.go:64] FLAG: --kube-reserved-cgroup="" Feb 26 15:42:17 crc kubenswrapper[4907]: I0226 15:42:17.901348 4907 flags.go:64] FLAG: --kubeconfig="/var/lib/kubelet/kubeconfig" Feb 26 15:42:17 crc kubenswrapper[4907]: I0226 15:42:17.901354 4907 flags.go:64] FLAG: --kubelet-cgroups="" Feb 26 15:42:17 crc kubenswrapper[4907]: I0226 15:42:17.901359 4907 flags.go:64] FLAG: --local-storage-capacity-isolation="true" Feb 26 15:42:17 crc kubenswrapper[4907]: I0226 15:42:17.901364 4907 flags.go:64] FLAG: --lock-file="" Feb 26 15:42:17 crc kubenswrapper[4907]: I0226 15:42:17.901369 4907 flags.go:64] FLAG: --log-cadvisor-usage="false" Feb 26 15:42:17 crc kubenswrapper[4907]: I0226 15:42:17.901374 4907 flags.go:64] FLAG: --log-flush-frequency="5s" Feb 26 15:42:17 crc kubenswrapper[4907]: I0226 15:42:17.901379 4907 flags.go:64] FLAG: --log-json-info-buffer-size="0" Feb 26 15:42:17 crc kubenswrapper[4907]: I0226 15:42:17.901387 4907 flags.go:64] FLAG: --log-json-split-stream="false" Feb 26 15:42:17 crc kubenswrapper[4907]: I0226 15:42:17.901392 4907 flags.go:64] FLAG: --log-text-info-buffer-size="0" Feb 26 15:42:17 crc kubenswrapper[4907]: I0226 15:42:17.901397 4907 flags.go:64] FLAG: --log-text-split-stream="false" Feb 26 15:42:17 crc kubenswrapper[4907]: I0226 15:42:17.901402 4907 flags.go:64] FLAG: --logging-format="text" Feb 26 15:42:17 crc kubenswrapper[4907]: I0226 15:42:17.901407 4907 flags.go:64] FLAG: 
--machine-id-file="/etc/machine-id,/var/lib/dbus/machine-id" Feb 26 15:42:17 crc kubenswrapper[4907]: I0226 15:42:17.901412 4907 flags.go:64] FLAG: --make-iptables-util-chains="true" Feb 26 15:42:17 crc kubenswrapper[4907]: I0226 15:42:17.901417 4907 flags.go:64] FLAG: --manifest-url="" Feb 26 15:42:17 crc kubenswrapper[4907]: I0226 15:42:17.901422 4907 flags.go:64] FLAG: --manifest-url-header="" Feb 26 15:42:17 crc kubenswrapper[4907]: I0226 15:42:17.901429 4907 flags.go:64] FLAG: --max-housekeeping-interval="15s" Feb 26 15:42:17 crc kubenswrapper[4907]: I0226 15:42:17.901435 4907 flags.go:64] FLAG: --max-open-files="1000000" Feb 26 15:42:17 crc kubenswrapper[4907]: I0226 15:42:17.901441 4907 flags.go:64] FLAG: --max-pods="110" Feb 26 15:42:17 crc kubenswrapper[4907]: I0226 15:42:17.901446 4907 flags.go:64] FLAG: --maximum-dead-containers="-1" Feb 26 15:42:17 crc kubenswrapper[4907]: I0226 15:42:17.901452 4907 flags.go:64] FLAG: --maximum-dead-containers-per-container="1" Feb 26 15:42:17 crc kubenswrapper[4907]: I0226 15:42:17.901457 4907 flags.go:64] FLAG: --memory-manager-policy="None" Feb 26 15:42:17 crc kubenswrapper[4907]: I0226 15:42:17.901462 4907 flags.go:64] FLAG: --minimum-container-ttl-duration="6m0s" Feb 26 15:42:17 crc kubenswrapper[4907]: I0226 15:42:17.901467 4907 flags.go:64] FLAG: --minimum-image-ttl-duration="2m0s" Feb 26 15:42:17 crc kubenswrapper[4907]: I0226 15:42:17.901472 4907 flags.go:64] FLAG: --node-ip="192.168.126.11" Feb 26 15:42:17 crc kubenswrapper[4907]: I0226 15:42:17.901478 4907 flags.go:64] FLAG: --node-labels="node-role.kubernetes.io/control-plane=,node-role.kubernetes.io/master=,node.openshift.io/os_id=rhcos" Feb 26 15:42:17 crc kubenswrapper[4907]: I0226 15:42:17.901491 4907 flags.go:64] FLAG: --node-status-max-images="50" Feb 26 15:42:17 crc kubenswrapper[4907]: I0226 15:42:17.901496 4907 flags.go:64] FLAG: --node-status-update-frequency="10s" Feb 26 15:42:17 crc kubenswrapper[4907]: I0226 15:42:17.901501 4907 flags.go:64] 
FLAG: --oom-score-adj="-999" Feb 26 15:42:17 crc kubenswrapper[4907]: I0226 15:42:17.901506 4907 flags.go:64] FLAG: --pod-cidr="" Feb 26 15:42:17 crc kubenswrapper[4907]: I0226 15:42:17.901511 4907 flags.go:64] FLAG: --pod-infra-container-image="quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:33549946e22a9ffa738fd94b1345f90921bc8f92fa6137784cb33c77ad806f9d" Feb 26 15:42:17 crc kubenswrapper[4907]: I0226 15:42:17.901519 4907 flags.go:64] FLAG: --pod-manifest-path="" Feb 26 15:42:17 crc kubenswrapper[4907]: I0226 15:42:17.901524 4907 flags.go:64] FLAG: --pod-max-pids="-1" Feb 26 15:42:17 crc kubenswrapper[4907]: I0226 15:42:17.901530 4907 flags.go:64] FLAG: --pods-per-core="0" Feb 26 15:42:17 crc kubenswrapper[4907]: I0226 15:42:17.901535 4907 flags.go:64] FLAG: --port="10250" Feb 26 15:42:17 crc kubenswrapper[4907]: I0226 15:42:17.901541 4907 flags.go:64] FLAG: --protect-kernel-defaults="false" Feb 26 15:42:17 crc kubenswrapper[4907]: I0226 15:42:17.901547 4907 flags.go:64] FLAG: --provider-id="" Feb 26 15:42:17 crc kubenswrapper[4907]: I0226 15:42:17.901552 4907 flags.go:64] FLAG: --qos-reserved="" Feb 26 15:42:17 crc kubenswrapper[4907]: I0226 15:42:17.901557 4907 flags.go:64] FLAG: --read-only-port="10255" Feb 26 15:42:17 crc kubenswrapper[4907]: I0226 15:42:17.901563 4907 flags.go:64] FLAG: --register-node="true" Feb 26 15:42:17 crc kubenswrapper[4907]: I0226 15:42:17.901568 4907 flags.go:64] FLAG: --register-schedulable="true" Feb 26 15:42:17 crc kubenswrapper[4907]: I0226 15:42:17.901573 4907 flags.go:64] FLAG: --register-with-taints="node-role.kubernetes.io/master=:NoSchedule" Feb 26 15:42:17 crc kubenswrapper[4907]: I0226 15:42:17.901582 4907 flags.go:64] FLAG: --registry-burst="10" Feb 26 15:42:17 crc kubenswrapper[4907]: I0226 15:42:17.901604 4907 flags.go:64] FLAG: --registry-qps="5" Feb 26 15:42:17 crc kubenswrapper[4907]: I0226 15:42:17.901609 4907 flags.go:64] FLAG: --reserved-cpus="" Feb 26 15:42:17 crc kubenswrapper[4907]: I0226 15:42:17.901615 
4907 flags.go:64] FLAG: --reserved-memory="" Feb 26 15:42:17 crc kubenswrapper[4907]: I0226 15:42:17.901622 4907 flags.go:64] FLAG: --resolv-conf="/etc/resolv.conf" Feb 26 15:42:17 crc kubenswrapper[4907]: I0226 15:42:17.901627 4907 flags.go:64] FLAG: --root-dir="/var/lib/kubelet" Feb 26 15:42:17 crc kubenswrapper[4907]: I0226 15:42:17.901632 4907 flags.go:64] FLAG: --rotate-certificates="false" Feb 26 15:42:17 crc kubenswrapper[4907]: I0226 15:42:17.901638 4907 flags.go:64] FLAG: --rotate-server-certificates="false" Feb 26 15:42:17 crc kubenswrapper[4907]: I0226 15:42:17.901643 4907 flags.go:64] FLAG: --runonce="false" Feb 26 15:42:17 crc kubenswrapper[4907]: I0226 15:42:17.901648 4907 flags.go:64] FLAG: --runtime-cgroups="/system.slice/crio.service" Feb 26 15:42:17 crc kubenswrapper[4907]: I0226 15:42:17.901653 4907 flags.go:64] FLAG: --runtime-request-timeout="2m0s" Feb 26 15:42:17 crc kubenswrapper[4907]: I0226 15:42:17.901659 4907 flags.go:64] FLAG: --seccomp-default="false" Feb 26 15:42:17 crc kubenswrapper[4907]: I0226 15:42:17.901664 4907 flags.go:64] FLAG: --serialize-image-pulls="true" Feb 26 15:42:17 crc kubenswrapper[4907]: I0226 15:42:17.901670 4907 flags.go:64] FLAG: --storage-driver-buffer-duration="1m0s" Feb 26 15:42:17 crc kubenswrapper[4907]: I0226 15:42:17.901675 4907 flags.go:64] FLAG: --storage-driver-db="cadvisor" Feb 26 15:42:17 crc kubenswrapper[4907]: I0226 15:42:17.901681 4907 flags.go:64] FLAG: --storage-driver-host="localhost:8086" Feb 26 15:42:17 crc kubenswrapper[4907]: I0226 15:42:17.901686 4907 flags.go:64] FLAG: --storage-driver-password="root" Feb 26 15:42:17 crc kubenswrapper[4907]: I0226 15:42:17.901692 4907 flags.go:64] FLAG: --storage-driver-secure="false" Feb 26 15:42:17 crc kubenswrapper[4907]: I0226 15:42:17.901697 4907 flags.go:64] FLAG: --storage-driver-table="stats" Feb 26 15:42:17 crc kubenswrapper[4907]: I0226 15:42:17.901702 4907 flags.go:64] FLAG: --storage-driver-user="root" Feb 26 15:42:17 crc kubenswrapper[4907]: 
I0226 15:42:17.901707 4907 flags.go:64] FLAG: --streaming-connection-idle-timeout="4h0m0s" Feb 26 15:42:17 crc kubenswrapper[4907]: I0226 15:42:17.901713 4907 flags.go:64] FLAG: --sync-frequency="1m0s" Feb 26 15:42:17 crc kubenswrapper[4907]: I0226 15:42:17.901718 4907 flags.go:64] FLAG: --system-cgroups="" Feb 26 15:42:17 crc kubenswrapper[4907]: I0226 15:42:17.901723 4907 flags.go:64] FLAG: --system-reserved="cpu=200m,ephemeral-storage=350Mi,memory=350Mi" Feb 26 15:42:17 crc kubenswrapper[4907]: I0226 15:42:17.901732 4907 flags.go:64] FLAG: --system-reserved-cgroup="" Feb 26 15:42:17 crc kubenswrapper[4907]: I0226 15:42:17.901737 4907 flags.go:64] FLAG: --tls-cert-file="" Feb 26 15:42:17 crc kubenswrapper[4907]: I0226 15:42:17.901742 4907 flags.go:64] FLAG: --tls-cipher-suites="[]" Feb 26 15:42:17 crc kubenswrapper[4907]: I0226 15:42:17.901748 4907 flags.go:64] FLAG: --tls-min-version="" Feb 26 15:42:17 crc kubenswrapper[4907]: I0226 15:42:17.901753 4907 flags.go:64] FLAG: --tls-private-key-file="" Feb 26 15:42:17 crc kubenswrapper[4907]: I0226 15:42:17.901760 4907 flags.go:64] FLAG: --topology-manager-policy="none" Feb 26 15:42:17 crc kubenswrapper[4907]: I0226 15:42:17.901764 4907 flags.go:64] FLAG: --topology-manager-policy-options="" Feb 26 15:42:17 crc kubenswrapper[4907]: I0226 15:42:17.901769 4907 flags.go:64] FLAG: --topology-manager-scope="container" Feb 26 15:42:17 crc kubenswrapper[4907]: I0226 15:42:17.901774 4907 flags.go:64] FLAG: --v="2" Feb 26 15:42:17 crc kubenswrapper[4907]: I0226 15:42:17.901781 4907 flags.go:64] FLAG: --version="false" Feb 26 15:42:17 crc kubenswrapper[4907]: I0226 15:42:17.901788 4907 flags.go:64] FLAG: --vmodule="" Feb 26 15:42:17 crc kubenswrapper[4907]: I0226 15:42:17.901795 4907 flags.go:64] FLAG: --volume-plugin-dir="/etc/kubernetes/kubelet-plugins/volume/exec" Feb 26 15:42:17 crc kubenswrapper[4907]: I0226 15:42:17.901800 4907 flags.go:64] FLAG: --volume-stats-agg-period="1m0s" Feb 26 15:42:17 crc kubenswrapper[4907]: 
W0226 15:42:17.901916 4907 feature_gate.go:330] unrecognized feature gate: ConsolePluginContentSecurityPolicy Feb 26 15:42:17 crc kubenswrapper[4907]: W0226 15:42:17.901924 4907 feature_gate.go:330] unrecognized feature gate: MachineAPIMigration Feb 26 15:42:17 crc kubenswrapper[4907]: W0226 15:42:17.901929 4907 feature_gate.go:330] unrecognized feature gate: NutanixMultiSubnets Feb 26 15:42:17 crc kubenswrapper[4907]: W0226 15:42:17.901934 4907 feature_gate.go:330] unrecognized feature gate: SetEIPForNLBIngressController Feb 26 15:42:17 crc kubenswrapper[4907]: W0226 15:42:17.901938 4907 feature_gate.go:330] unrecognized feature gate: InsightsRuntimeExtractor Feb 26 15:42:17 crc kubenswrapper[4907]: W0226 15:42:17.901943 4907 feature_gate.go:330] unrecognized feature gate: VSphereDriverConfiguration Feb 26 15:42:17 crc kubenswrapper[4907]: W0226 15:42:17.901947 4907 feature_gate.go:330] unrecognized feature gate: NodeDisruptionPolicy Feb 26 15:42:17 crc kubenswrapper[4907]: W0226 15:42:17.901956 4907 feature_gate.go:330] unrecognized feature gate: BootcNodeManagement Feb 26 15:42:17 crc kubenswrapper[4907]: W0226 15:42:17.901961 4907 feature_gate.go:330] unrecognized feature gate: MachineAPIProviderOpenStack Feb 26 15:42:17 crc kubenswrapper[4907]: W0226 15:42:17.901965 4907 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstallIBMCloud Feb 26 15:42:17 crc kubenswrapper[4907]: W0226 15:42:17.901970 4907 feature_gate.go:330] unrecognized feature gate: GatewayAPI Feb 26 15:42:17 crc kubenswrapper[4907]: W0226 15:42:17.901974 4907 feature_gate.go:330] unrecognized feature gate: AWSEFSDriverVolumeMetrics Feb 26 15:42:17 crc kubenswrapper[4907]: W0226 15:42:17.901982 4907 feature_gate.go:330] unrecognized feature gate: BareMetalLoadBalancer Feb 26 15:42:17 crc kubenswrapper[4907]: W0226 15:42:17.901988 4907 feature_gate.go:353] Setting GA feature gate ValidatingAdmissionPolicy=true. It will be removed in a future release. 
Feb 26 15:42:17 crc kubenswrapper[4907]: W0226 15:42:17.901993 4907 feature_gate.go:330] unrecognized feature gate: VSphereControlPlaneMachineSet Feb 26 15:42:17 crc kubenswrapper[4907]: W0226 15:42:17.901999 4907 feature_gate.go:330] unrecognized feature gate: PersistentIPsForVirtualization Feb 26 15:42:17 crc kubenswrapper[4907]: W0226 15:42:17.902003 4907 feature_gate.go:330] unrecognized feature gate: AutomatedEtcdBackup Feb 26 15:42:17 crc kubenswrapper[4907]: W0226 15:42:17.902010 4907 feature_gate.go:330] unrecognized feature gate: VSphereMultiNetworks Feb 26 15:42:17 crc kubenswrapper[4907]: W0226 15:42:17.902014 4907 feature_gate.go:330] unrecognized feature gate: InsightsOnDemandDataGather Feb 26 15:42:17 crc kubenswrapper[4907]: W0226 15:42:17.902019 4907 feature_gate.go:330] unrecognized feature gate: MetricsCollectionProfiles Feb 26 15:42:17 crc kubenswrapper[4907]: W0226 15:42:17.902024 4907 feature_gate.go:330] unrecognized feature gate: ManagedBootImagesAWS Feb 26 15:42:17 crc kubenswrapper[4907]: W0226 15:42:17.902029 4907 feature_gate.go:330] unrecognized feature gate: ExternalOIDC Feb 26 15:42:17 crc kubenswrapper[4907]: W0226 15:42:17.902034 4907 feature_gate.go:330] unrecognized feature gate: VSphereMultiVCenters Feb 26 15:42:17 crc kubenswrapper[4907]: W0226 15:42:17.902039 4907 feature_gate.go:330] unrecognized feature gate: ManagedBootImages Feb 26 15:42:17 crc kubenswrapper[4907]: W0226 15:42:17.902043 4907 feature_gate.go:330] unrecognized feature gate: VSphereStaticIPs Feb 26 15:42:17 crc kubenswrapper[4907]: W0226 15:42:17.902048 4907 feature_gate.go:330] unrecognized feature gate: NetworkLiveMigration Feb 26 15:42:17 crc kubenswrapper[4907]: W0226 15:42:17.902052 4907 feature_gate.go:330] unrecognized feature gate: ImageStreamImportMode Feb 26 15:42:17 crc kubenswrapper[4907]: W0226 15:42:17.902058 4907 feature_gate.go:353] Setting GA feature gate DisableKubeletCloudCredentialProviders=true. It will be removed in a future release. 
Feb 26 15:42:17 crc kubenswrapper[4907]: W0226 15:42:17.902064 4907 feature_gate.go:330] unrecognized feature gate: PlatformOperators Feb 26 15:42:17 crc kubenswrapper[4907]: W0226 15:42:17.902069 4907 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAzure Feb 26 15:42:17 crc kubenswrapper[4907]: W0226 15:42:17.902074 4907 feature_gate.go:330] unrecognized feature gate: NetworkDiagnosticsConfig Feb 26 15:42:17 crc kubenswrapper[4907]: W0226 15:42:17.902078 4907 feature_gate.go:330] unrecognized feature gate: CSIDriverSharedResource Feb 26 15:42:17 crc kubenswrapper[4907]: W0226 15:42:17.902083 4907 feature_gate.go:330] unrecognized feature gate: OpenShiftPodSecurityAdmission Feb 26 15:42:17 crc kubenswrapper[4907]: W0226 15:42:17.902087 4907 feature_gate.go:330] unrecognized feature gate: IngressControllerLBSubnetsAWS Feb 26 15:42:17 crc kubenswrapper[4907]: W0226 15:42:17.902092 4907 feature_gate.go:330] unrecognized feature gate: AWSClusterHostedDNS Feb 26 15:42:17 crc kubenswrapper[4907]: W0226 15:42:17.902096 4907 feature_gate.go:330] unrecognized feature gate: IngressControllerDynamicConfigurationManager Feb 26 15:42:17 crc kubenswrapper[4907]: W0226 15:42:17.902100 4907 feature_gate.go:330] unrecognized feature gate: NewOLM Feb 26 15:42:17 crc kubenswrapper[4907]: W0226 15:42:17.902105 4907 feature_gate.go:330] unrecognized feature gate: DNSNameResolver Feb 26 15:42:17 crc kubenswrapper[4907]: W0226 15:42:17.902109 4907 feature_gate.go:330] unrecognized feature gate: BuildCSIVolumes Feb 26 15:42:17 crc kubenswrapper[4907]: W0226 15:42:17.902116 4907 feature_gate.go:330] unrecognized feature gate: MultiArchInstallGCP Feb 26 15:42:17 crc kubenswrapper[4907]: W0226 15:42:17.902120 4907 feature_gate.go:330] unrecognized feature gate: OVNObservability Feb 26 15:42:17 crc kubenswrapper[4907]: W0226 15:42:17.902124 4907 feature_gate.go:330] unrecognized feature gate: UpgradeStatus Feb 26 15:42:17 crc kubenswrapper[4907]: W0226 15:42:17.902129 4907 
feature_gate.go:330] unrecognized feature gate: GCPClusterHostedDNS Feb 26 15:42:17 crc kubenswrapper[4907]: W0226 15:42:17.902133 4907 feature_gate.go:330] unrecognized feature gate: SigstoreImageVerification Feb 26 15:42:17 crc kubenswrapper[4907]: W0226 15:42:17.902141 4907 feature_gate.go:353] Setting GA feature gate CloudDualStackNodeIPs=true. It will be removed in a future release. Feb 26 15:42:17 crc kubenswrapper[4907]: W0226 15:42:17.902147 4907 feature_gate.go:330] unrecognized feature gate: OnClusterBuild Feb 26 15:42:17 crc kubenswrapper[4907]: W0226 15:42:17.902152 4907 feature_gate.go:330] unrecognized feature gate: HardwareSpeed Feb 26 15:42:17 crc kubenswrapper[4907]: W0226 15:42:17.902157 4907 feature_gate.go:330] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController Feb 26 15:42:17 crc kubenswrapper[4907]: W0226 15:42:17.902162 4907 feature_gate.go:330] unrecognized feature gate: NetworkSegmentation Feb 26 15:42:17 crc kubenswrapper[4907]: W0226 15:42:17.902167 4907 feature_gate.go:330] unrecognized feature gate: ChunkSizeMiB Feb 26 15:42:17 crc kubenswrapper[4907]: W0226 15:42:17.902172 4907 feature_gate.go:330] unrecognized feature gate: Example Feb 26 15:42:17 crc kubenswrapper[4907]: W0226 15:42:17.902177 4907 feature_gate.go:330] unrecognized feature gate: EtcdBackendQuota Feb 26 15:42:17 crc kubenswrapper[4907]: W0226 15:42:17.902183 4907 feature_gate.go:351] Setting deprecated feature gate KMSv1=true. It will be removed in a future release. 
Feb 26 15:42:17 crc kubenswrapper[4907]: W0226 15:42:17.902188 4907 feature_gate.go:330] unrecognized feature gate: RouteAdvertisements Feb 26 15:42:17 crc kubenswrapper[4907]: W0226 15:42:17.902194 4907 feature_gate.go:330] unrecognized feature gate: InsightsConfig Feb 26 15:42:17 crc kubenswrapper[4907]: W0226 15:42:17.902198 4907 feature_gate.go:330] unrecognized feature gate: InsightsConfigAPI Feb 26 15:42:17 crc kubenswrapper[4907]: W0226 15:42:17.902203 4907 feature_gate.go:330] unrecognized feature gate: MixedCPUsAllocation Feb 26 15:42:17 crc kubenswrapper[4907]: W0226 15:42:17.902207 4907 feature_gate.go:330] unrecognized feature gate: VolumeGroupSnapshot Feb 26 15:42:17 crc kubenswrapper[4907]: W0226 15:42:17.902211 4907 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAWS Feb 26 15:42:17 crc kubenswrapper[4907]: W0226 15:42:17.902216 4907 feature_gate.go:330] unrecognized feature gate: ClusterMonitoringConfig Feb 26 15:42:17 crc kubenswrapper[4907]: W0226 15:42:17.902220 4907 feature_gate.go:330] unrecognized feature gate: PinnedImages Feb 26 15:42:17 crc kubenswrapper[4907]: W0226 15:42:17.902224 4907 feature_gate.go:330] unrecognized feature gate: SignatureStores Feb 26 15:42:17 crc kubenswrapper[4907]: W0226 15:42:17.902229 4907 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstall Feb 26 15:42:17 crc kubenswrapper[4907]: W0226 15:42:17.902233 4907 feature_gate.go:330] unrecognized feature gate: AlibabaPlatform Feb 26 15:42:17 crc kubenswrapper[4907]: W0226 15:42:17.902239 4907 feature_gate.go:330] unrecognized feature gate: MachineConfigNodes Feb 26 15:42:17 crc kubenswrapper[4907]: W0226 15:42:17.902244 4907 feature_gate.go:330] unrecognized feature gate: AdminNetworkPolicy Feb 26 15:42:17 crc kubenswrapper[4907]: W0226 15:42:17.902248 4907 feature_gate.go:330] unrecognized feature gate: AdditionalRoutingCapabilities Feb 26 15:42:17 crc kubenswrapper[4907]: W0226 15:42:17.902253 4907 feature_gate.go:330] unrecognized 
feature gate: AzureWorkloadIdentity Feb 26 15:42:17 crc kubenswrapper[4907]: W0226 15:42:17.902257 4907 feature_gate.go:330] unrecognized feature gate: MinimumKubeletVersion Feb 26 15:42:17 crc kubenswrapper[4907]: W0226 15:42:17.902261 4907 feature_gate.go:330] unrecognized feature gate: GCPLabelsTags Feb 26 15:42:17 crc kubenswrapper[4907]: W0226 15:42:17.902266 4907 feature_gate.go:330] unrecognized feature gate: PrivateHostedZoneAWS Feb 26 15:42:17 crc kubenswrapper[4907]: I0226 15:42:17.902973 4907 feature_gate.go:386] feature gates: {map[CloudDualStackNodeIPs:true DisableKubeletCloudCredentialProviders:true DynamicResourceAllocation:false EventedPLEG:false KMSv1:true MaxUnavailableStatefulSet:false NodeSwap:false ProcMountType:false RouteExternalCertificate:false ServiceAccountTokenNodeBinding:false TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:false UserNamespacesSupport:false ValidatingAdmissionPolicy:true VolumeAttributesClass:false]} Feb 26 15:42:17 crc kubenswrapper[4907]: I0226 15:42:17.916712 4907 server.go:491] "Kubelet version" kubeletVersion="v1.31.5" Feb 26 15:42:17 crc kubenswrapper[4907]: I0226 15:42:17.916773 4907 server.go:493] "Golang settings" GOGC="" GOMAXPROCS="" GOTRACEBACK="" Feb 26 15:42:17 crc kubenswrapper[4907]: W0226 15:42:17.916904 4907 feature_gate.go:330] unrecognized feature gate: GCPLabelsTags Feb 26 15:42:17 crc kubenswrapper[4907]: W0226 15:42:17.916927 4907 feature_gate.go:330] unrecognized feature gate: IngressControllerDynamicConfigurationManager Feb 26 15:42:17 crc kubenswrapper[4907]: W0226 15:42:17.916937 4907 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAzure Feb 26 15:42:17 crc kubenswrapper[4907]: W0226 15:42:17.916947 4907 feature_gate.go:330] unrecognized feature gate: NewOLM Feb 26 15:42:17 crc kubenswrapper[4907]: W0226 15:42:17.916956 4907 feature_gate.go:330] unrecognized feature gate: RouteAdvertisements Feb 26 15:42:17 crc kubenswrapper[4907]: W0226 
15:42:17.916965 4907 feature_gate.go:330] unrecognized feature gate: AWSClusterHostedDNS Feb 26 15:42:17 crc kubenswrapper[4907]: W0226 15:42:17.916973 4907 feature_gate.go:330] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController Feb 26 15:42:17 crc kubenswrapper[4907]: W0226 15:42:17.916980 4907 feature_gate.go:330] unrecognized feature gate: CSIDriverSharedResource Feb 26 15:42:17 crc kubenswrapper[4907]: W0226 15:42:17.916988 4907 feature_gate.go:330] unrecognized feature gate: GatewayAPI Feb 26 15:42:17 crc kubenswrapper[4907]: W0226 15:42:17.916996 4907 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstallIBMCloud Feb 26 15:42:17 crc kubenswrapper[4907]: W0226 15:42:17.917004 4907 feature_gate.go:330] unrecognized feature gate: EtcdBackendQuota Feb 26 15:42:17 crc kubenswrapper[4907]: W0226 15:42:17.917012 4907 feature_gate.go:330] unrecognized feature gate: ManagedBootImagesAWS Feb 26 15:42:17 crc kubenswrapper[4907]: W0226 15:42:17.917020 4907 feature_gate.go:330] unrecognized feature gate: AlibabaPlatform Feb 26 15:42:17 crc kubenswrapper[4907]: W0226 15:42:17.917028 4907 feature_gate.go:330] unrecognized feature gate: PlatformOperators Feb 26 15:42:17 crc kubenswrapper[4907]: W0226 15:42:17.917035 4907 feature_gate.go:330] unrecognized feature gate: BootcNodeManagement Feb 26 15:42:17 crc kubenswrapper[4907]: W0226 15:42:17.917043 4907 feature_gate.go:330] unrecognized feature gate: ImageStreamImportMode Feb 26 15:42:17 crc kubenswrapper[4907]: W0226 15:42:17.917052 4907 feature_gate.go:330] unrecognized feature gate: HardwareSpeed Feb 26 15:42:17 crc kubenswrapper[4907]: W0226 15:42:17.917061 4907 feature_gate.go:330] unrecognized feature gate: ChunkSizeMiB Feb 26 15:42:17 crc kubenswrapper[4907]: W0226 15:42:17.917123 4907 feature_gate.go:353] Setting GA feature gate CloudDualStackNodeIPs=true. It will be removed in a future release. 
Feb 26 15:42:17 crc kubenswrapper[4907]: W0226 15:42:17.917135 4907 feature_gate.go:330] unrecognized feature gate: SetEIPForNLBIngressController Feb 26 15:42:17 crc kubenswrapper[4907]: W0226 15:42:17.917144 4907 feature_gate.go:330] unrecognized feature gate: InsightsOnDemandDataGather Feb 26 15:42:17 crc kubenswrapper[4907]: W0226 15:42:17.917152 4907 feature_gate.go:330] unrecognized feature gate: OVNObservability Feb 26 15:42:17 crc kubenswrapper[4907]: W0226 15:42:17.917162 4907 feature_gate.go:330] unrecognized feature gate: NutanixMultiSubnets Feb 26 15:42:17 crc kubenswrapper[4907]: W0226 15:42:17.917170 4907 feature_gate.go:330] unrecognized feature gate: MachineAPIProviderOpenStack Feb 26 15:42:17 crc kubenswrapper[4907]: W0226 15:42:17.917178 4907 feature_gate.go:330] unrecognized feature gate: VSphereDriverConfiguration Feb 26 15:42:17 crc kubenswrapper[4907]: W0226 15:42:17.917187 4907 feature_gate.go:330] unrecognized feature gate: ExternalOIDC Feb 26 15:42:17 crc kubenswrapper[4907]: W0226 15:42:17.917195 4907 feature_gate.go:330] unrecognized feature gate: OnClusterBuild Feb 26 15:42:17 crc kubenswrapper[4907]: W0226 15:42:17.917203 4907 feature_gate.go:330] unrecognized feature gate: OpenShiftPodSecurityAdmission Feb 26 15:42:17 crc kubenswrapper[4907]: W0226 15:42:17.917212 4907 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAWS Feb 26 15:42:17 crc kubenswrapper[4907]: W0226 15:42:17.917220 4907 feature_gate.go:330] unrecognized feature gate: AzureWorkloadIdentity Feb 26 15:42:17 crc kubenswrapper[4907]: W0226 15:42:17.917229 4907 feature_gate.go:330] unrecognized feature gate: DNSNameResolver Feb 26 15:42:17 crc kubenswrapper[4907]: W0226 15:42:17.917237 4907 feature_gate.go:330] unrecognized feature gate: NetworkDiagnosticsConfig Feb 26 15:42:17 crc kubenswrapper[4907]: W0226 15:42:17.917244 4907 feature_gate.go:330] unrecognized feature gate: VolumeGroupSnapshot Feb 26 15:42:17 crc kubenswrapper[4907]: W0226 15:42:17.917253 
4907 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstall Feb 26 15:42:17 crc kubenswrapper[4907]: W0226 15:42:17.917262 4907 feature_gate.go:330] unrecognized feature gate: SigstoreImageVerification Feb 26 15:42:17 crc kubenswrapper[4907]: W0226 15:42:17.917270 4907 feature_gate.go:330] unrecognized feature gate: AutomatedEtcdBackup Feb 26 15:42:17 crc kubenswrapper[4907]: W0226 15:42:17.917278 4907 feature_gate.go:330] unrecognized feature gate: MetricsCollectionProfiles Feb 26 15:42:17 crc kubenswrapper[4907]: W0226 15:42:17.917286 4907 feature_gate.go:330] unrecognized feature gate: VSphereControlPlaneMachineSet Feb 26 15:42:17 crc kubenswrapper[4907]: W0226 15:42:17.917294 4907 feature_gate.go:330] unrecognized feature gate: ManagedBootImages Feb 26 15:42:17 crc kubenswrapper[4907]: W0226 15:42:17.917301 4907 feature_gate.go:330] unrecognized feature gate: BuildCSIVolumes Feb 26 15:42:17 crc kubenswrapper[4907]: W0226 15:42:17.917310 4907 feature_gate.go:330] unrecognized feature gate: GCPClusterHostedDNS Feb 26 15:42:17 crc kubenswrapper[4907]: W0226 15:42:17.917318 4907 feature_gate.go:330] unrecognized feature gate: BareMetalLoadBalancer Feb 26 15:42:17 crc kubenswrapper[4907]: W0226 15:42:17.917328 4907 feature_gate.go:353] Setting GA feature gate ValidatingAdmissionPolicy=true. It will be removed in a future release. Feb 26 15:42:17 crc kubenswrapper[4907]: W0226 15:42:17.917340 4907 feature_gate.go:330] unrecognized feature gate: MachineConfigNodes Feb 26 15:42:17 crc kubenswrapper[4907]: W0226 15:42:17.917348 4907 feature_gate.go:330] unrecognized feature gate: AdditionalRoutingCapabilities Feb 26 15:42:17 crc kubenswrapper[4907]: W0226 15:42:17.917356 4907 feature_gate.go:330] unrecognized feature gate: MixedCPUsAllocation Feb 26 15:42:17 crc kubenswrapper[4907]: W0226 15:42:17.917367 4907 feature_gate.go:353] Setting GA feature gate DisableKubeletCloudCredentialProviders=true. It will be removed in a future release. 
Feb 26 15:42:17 crc kubenswrapper[4907]: W0226 15:42:17.917376 4907 feature_gate.go:330] unrecognized feature gate: SignatureStores Feb 26 15:42:17 crc kubenswrapper[4907]: W0226 15:42:17.917385 4907 feature_gate.go:330] unrecognized feature gate: NodeDisruptionPolicy Feb 26 15:42:17 crc kubenswrapper[4907]: W0226 15:42:17.917393 4907 feature_gate.go:330] unrecognized feature gate: AdminNetworkPolicy Feb 26 15:42:17 crc kubenswrapper[4907]: W0226 15:42:17.917402 4907 feature_gate.go:330] unrecognized feature gate: PersistentIPsForVirtualization Feb 26 15:42:17 crc kubenswrapper[4907]: W0226 15:42:17.917411 4907 feature_gate.go:330] unrecognized feature gate: NetworkSegmentation Feb 26 15:42:17 crc kubenswrapper[4907]: W0226 15:42:17.917420 4907 feature_gate.go:330] unrecognized feature gate: ClusterMonitoringConfig Feb 26 15:42:17 crc kubenswrapper[4907]: W0226 15:42:17.917427 4907 feature_gate.go:330] unrecognized feature gate: Example Feb 26 15:42:17 crc kubenswrapper[4907]: W0226 15:42:17.917435 4907 feature_gate.go:330] unrecognized feature gate: MachineAPIMigration Feb 26 15:42:17 crc kubenswrapper[4907]: W0226 15:42:17.917443 4907 feature_gate.go:330] unrecognized feature gate: InsightsRuntimeExtractor Feb 26 15:42:17 crc kubenswrapper[4907]: W0226 15:42:17.917454 4907 feature_gate.go:351] Setting deprecated feature gate KMSv1=true. It will be removed in a future release. 
Feb 26 15:42:17 crc kubenswrapper[4907]: W0226 15:42:17.917463 4907 feature_gate.go:330] unrecognized feature gate: IngressControllerLBSubnetsAWS Feb 26 15:42:17 crc kubenswrapper[4907]: W0226 15:42:17.917471 4907 feature_gate.go:330] unrecognized feature gate: AWSEFSDriverVolumeMetrics Feb 26 15:42:17 crc kubenswrapper[4907]: W0226 15:42:17.917479 4907 feature_gate.go:330] unrecognized feature gate: MinimumKubeletVersion Feb 26 15:42:17 crc kubenswrapper[4907]: W0226 15:42:17.917487 4907 feature_gate.go:330] unrecognized feature gate: InsightsConfigAPI Feb 26 15:42:17 crc kubenswrapper[4907]: W0226 15:42:17.917495 4907 feature_gate.go:330] unrecognized feature gate: NetworkLiveMigration Feb 26 15:42:17 crc kubenswrapper[4907]: W0226 15:42:17.917503 4907 feature_gate.go:330] unrecognized feature gate: VSphereMultiNetworks Feb 26 15:42:17 crc kubenswrapper[4907]: W0226 15:42:17.917512 4907 feature_gate.go:330] unrecognized feature gate: UpgradeStatus Feb 26 15:42:17 crc kubenswrapper[4907]: W0226 15:42:17.917519 4907 feature_gate.go:330] unrecognized feature gate: MultiArchInstallGCP Feb 26 15:42:17 crc kubenswrapper[4907]: W0226 15:42:17.917527 4907 feature_gate.go:330] unrecognized feature gate: InsightsConfig Feb 26 15:42:17 crc kubenswrapper[4907]: W0226 15:42:17.917535 4907 feature_gate.go:330] unrecognized feature gate: ConsolePluginContentSecurityPolicy Feb 26 15:42:17 crc kubenswrapper[4907]: W0226 15:42:17.917543 4907 feature_gate.go:330] unrecognized feature gate: VSphereStaticIPs Feb 26 15:42:17 crc kubenswrapper[4907]: W0226 15:42:17.917550 4907 feature_gate.go:330] unrecognized feature gate: PinnedImages Feb 26 15:42:17 crc kubenswrapper[4907]: W0226 15:42:17.917558 4907 feature_gate.go:330] unrecognized feature gate: VSphereMultiVCenters Feb 26 15:42:17 crc kubenswrapper[4907]: W0226 15:42:17.917567 4907 feature_gate.go:330] unrecognized feature gate: PrivateHostedZoneAWS Feb 26 15:42:17 crc kubenswrapper[4907]: I0226 15:42:17.917580 4907 
feature_gate.go:386] feature gates: {map[CloudDualStackNodeIPs:true DisableKubeletCloudCredentialProviders:true DynamicResourceAllocation:false EventedPLEG:false KMSv1:true MaxUnavailableStatefulSet:false NodeSwap:false ProcMountType:false RouteExternalCertificate:false ServiceAccountTokenNodeBinding:false TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:false UserNamespacesSupport:false ValidatingAdmissionPolicy:true VolumeAttributesClass:false]} Feb 26 15:42:17 crc kubenswrapper[4907]: W0226 15:42:17.917843 4907 feature_gate.go:330] unrecognized feature gate: AzureWorkloadIdentity Feb 26 15:42:17 crc kubenswrapper[4907]: W0226 15:42:17.917859 4907 feature_gate.go:330] unrecognized feature gate: MachineConfigNodes Feb 26 15:42:17 crc kubenswrapper[4907]: W0226 15:42:17.917868 4907 feature_gate.go:330] unrecognized feature gate: OpenShiftPodSecurityAdmission Feb 26 15:42:17 crc kubenswrapper[4907]: W0226 15:42:17.917876 4907 feature_gate.go:330] unrecognized feature gate: IngressControllerDynamicConfigurationManager Feb 26 15:42:17 crc kubenswrapper[4907]: W0226 15:42:17.917885 4907 feature_gate.go:330] unrecognized feature gate: VolumeGroupSnapshot Feb 26 15:42:17 crc kubenswrapper[4907]: W0226 15:42:17.917893 4907 feature_gate.go:330] unrecognized feature gate: MultiArchInstallGCP Feb 26 15:42:17 crc kubenswrapper[4907]: W0226 15:42:17.917901 4907 feature_gate.go:330] unrecognized feature gate: InsightsConfig Feb 26 15:42:17 crc kubenswrapper[4907]: W0226 15:42:17.917911 4907 feature_gate.go:351] Setting deprecated feature gate KMSv1=true. It will be removed in a future release. 
Feb 26 15:42:17 crc kubenswrapper[4907]: W0226 15:42:17.917921 4907 feature_gate.go:330] unrecognized feature gate: PlatformOperators Feb 26 15:42:17 crc kubenswrapper[4907]: W0226 15:42:17.917930 4907 feature_gate.go:330] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController Feb 26 15:42:17 crc kubenswrapper[4907]: W0226 15:42:17.917938 4907 feature_gate.go:330] unrecognized feature gate: VSphereDriverConfiguration Feb 26 15:42:17 crc kubenswrapper[4907]: W0226 15:42:17.917947 4907 feature_gate.go:330] unrecognized feature gate: GCPClusterHostedDNS Feb 26 15:42:17 crc kubenswrapper[4907]: W0226 15:42:17.917955 4907 feature_gate.go:330] unrecognized feature gate: NewOLM Feb 26 15:42:17 crc kubenswrapper[4907]: W0226 15:42:17.917963 4907 feature_gate.go:330] unrecognized feature gate: AdminNetworkPolicy Feb 26 15:42:17 crc kubenswrapper[4907]: W0226 15:42:17.917970 4907 feature_gate.go:330] unrecognized feature gate: ConsolePluginContentSecurityPolicy Feb 26 15:42:17 crc kubenswrapper[4907]: W0226 15:42:17.917978 4907 feature_gate.go:330] unrecognized feature gate: NutanixMultiSubnets Feb 26 15:42:17 crc kubenswrapper[4907]: W0226 15:42:17.917986 4907 feature_gate.go:330] unrecognized feature gate: CSIDriverSharedResource Feb 26 15:42:17 crc kubenswrapper[4907]: W0226 15:42:17.917994 4907 feature_gate.go:330] unrecognized feature gate: ChunkSizeMiB Feb 26 15:42:17 crc kubenswrapper[4907]: W0226 15:42:17.918001 4907 feature_gate.go:330] unrecognized feature gate: UpgradeStatus Feb 26 15:42:17 crc kubenswrapper[4907]: W0226 15:42:17.918009 4907 feature_gate.go:330] unrecognized feature gate: MachineAPIMigration Feb 26 15:42:17 crc kubenswrapper[4907]: W0226 15:42:17.918017 4907 feature_gate.go:330] unrecognized feature gate: GCPLabelsTags Feb 26 15:42:17 crc kubenswrapper[4907]: W0226 15:42:17.918025 4907 feature_gate.go:330] unrecognized feature gate: PersistentIPsForVirtualization Feb 26 15:42:17 crc kubenswrapper[4907]: W0226 
15:42:17.918032 4907 feature_gate.go:330] unrecognized feature gate: SigstoreImageVerification Feb 26 15:42:17 crc kubenswrapper[4907]: W0226 15:42:17.918041 4907 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAzure Feb 26 15:42:17 crc kubenswrapper[4907]: W0226 15:42:17.918049 4907 feature_gate.go:330] unrecognized feature gate: OnClusterBuild Feb 26 15:42:17 crc kubenswrapper[4907]: W0226 15:42:17.918056 4907 feature_gate.go:330] unrecognized feature gate: DNSNameResolver Feb 26 15:42:17 crc kubenswrapper[4907]: W0226 15:42:17.918065 4907 feature_gate.go:330] unrecognized feature gate: VSphereControlPlaneMachineSet Feb 26 15:42:17 crc kubenswrapper[4907]: W0226 15:42:17.918073 4907 feature_gate.go:330] unrecognized feature gate: AutomatedEtcdBackup Feb 26 15:42:17 crc kubenswrapper[4907]: W0226 15:42:17.918081 4907 feature_gate.go:330] unrecognized feature gate: SetEIPForNLBIngressController Feb 26 15:42:17 crc kubenswrapper[4907]: W0226 15:42:17.918088 4907 feature_gate.go:330] unrecognized feature gate: AdditionalRoutingCapabilities Feb 26 15:42:17 crc kubenswrapper[4907]: W0226 15:42:17.918096 4907 feature_gate.go:330] unrecognized feature gate: PinnedImages Feb 26 15:42:17 crc kubenswrapper[4907]: W0226 15:42:17.918104 4907 feature_gate.go:330] unrecognized feature gate: ExternalOIDC Feb 26 15:42:17 crc kubenswrapper[4907]: W0226 15:42:17.918112 4907 feature_gate.go:330] unrecognized feature gate: VSphereStaticIPs Feb 26 15:42:17 crc kubenswrapper[4907]: W0226 15:42:17.918120 4907 feature_gate.go:330] unrecognized feature gate: MetricsCollectionProfiles Feb 26 15:42:17 crc kubenswrapper[4907]: W0226 15:42:17.918128 4907 feature_gate.go:330] unrecognized feature gate: AlibabaPlatform Feb 26 15:42:17 crc kubenswrapper[4907]: W0226 15:42:17.918136 4907 feature_gate.go:330] unrecognized feature gate: VSphereMultiVCenters Feb 26 15:42:17 crc kubenswrapper[4907]: W0226 15:42:17.918144 4907 feature_gate.go:330] unrecognized feature gate: 
ClusterMonitoringConfig Feb 26 15:42:17 crc kubenswrapper[4907]: W0226 15:42:17.918152 4907 feature_gate.go:330] unrecognized feature gate: AWSEFSDriverVolumeMetrics Feb 26 15:42:17 crc kubenswrapper[4907]: W0226 15:42:17.918162 4907 feature_gate.go:353] Setting GA feature gate ValidatingAdmissionPolicy=true. It will be removed in a future release. Feb 26 15:42:17 crc kubenswrapper[4907]: W0226 15:42:17.918171 4907 feature_gate.go:330] unrecognized feature gate: ManagedBootImages Feb 26 15:42:17 crc kubenswrapper[4907]: W0226 15:42:17.918179 4907 feature_gate.go:330] unrecognized feature gate: GatewayAPI Feb 26 15:42:17 crc kubenswrapper[4907]: W0226 15:42:17.918190 4907 feature_gate.go:353] Setting GA feature gate DisableKubeletCloudCredentialProviders=true. It will be removed in a future release. Feb 26 15:42:17 crc kubenswrapper[4907]: W0226 15:42:17.918200 4907 feature_gate.go:330] unrecognized feature gate: PrivateHostedZoneAWS Feb 26 15:42:17 crc kubenswrapper[4907]: W0226 15:42:17.918208 4907 feature_gate.go:330] unrecognized feature gate: HardwareSpeed Feb 26 15:42:17 crc kubenswrapper[4907]: W0226 15:42:17.918216 4907 feature_gate.go:330] unrecognized feature gate: MachineAPIProviderOpenStack Feb 26 15:42:17 crc kubenswrapper[4907]: W0226 15:42:17.918224 4907 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstallIBMCloud Feb 26 15:42:17 crc kubenswrapper[4907]: W0226 15:42:17.918231 4907 feature_gate.go:330] unrecognized feature gate: InsightsOnDemandDataGather Feb 26 15:42:17 crc kubenswrapper[4907]: W0226 15:42:17.918240 4907 feature_gate.go:330] unrecognized feature gate: EtcdBackendQuota Feb 26 15:42:17 crc kubenswrapper[4907]: W0226 15:42:17.918248 4907 feature_gate.go:330] unrecognized feature gate: MixedCPUsAllocation Feb 26 15:42:17 crc kubenswrapper[4907]: W0226 15:42:17.918255 4907 feature_gate.go:330] unrecognized feature gate: VSphereMultiNetworks Feb 26 15:42:17 crc kubenswrapper[4907]: W0226 15:42:17.918263 4907 
feature_gate.go:330] unrecognized feature gate: Example Feb 26 15:42:17 crc kubenswrapper[4907]: W0226 15:42:17.918270 4907 feature_gate.go:330] unrecognized feature gate: ImageStreamImportMode Feb 26 15:42:17 crc kubenswrapper[4907]: W0226 15:42:17.918278 4907 feature_gate.go:330] unrecognized feature gate: NodeDisruptionPolicy Feb 26 15:42:17 crc kubenswrapper[4907]: W0226 15:42:17.918286 4907 feature_gate.go:330] unrecognized feature gate: BuildCSIVolumes Feb 26 15:42:17 crc kubenswrapper[4907]: W0226 15:42:17.918294 4907 feature_gate.go:330] unrecognized feature gate: InsightsConfigAPI Feb 26 15:42:17 crc kubenswrapper[4907]: W0226 15:42:17.918302 4907 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAWS Feb 26 15:42:17 crc kubenswrapper[4907]: W0226 15:42:17.918310 4907 feature_gate.go:330] unrecognized feature gate: NetworkSegmentation Feb 26 15:42:17 crc kubenswrapper[4907]: W0226 15:42:17.918319 4907 feature_gate.go:330] unrecognized feature gate: NetworkDiagnosticsConfig Feb 26 15:42:17 crc kubenswrapper[4907]: W0226 15:42:17.918327 4907 feature_gate.go:330] unrecognized feature gate: InsightsRuntimeExtractor Feb 26 15:42:17 crc kubenswrapper[4907]: W0226 15:42:17.918335 4907 feature_gate.go:330] unrecognized feature gate: SignatureStores Feb 26 15:42:17 crc kubenswrapper[4907]: W0226 15:42:17.918342 4907 feature_gate.go:330] unrecognized feature gate: OVNObservability Feb 26 15:42:17 crc kubenswrapper[4907]: W0226 15:42:17.918350 4907 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstall Feb 26 15:42:17 crc kubenswrapper[4907]: W0226 15:42:17.918359 4907 feature_gate.go:330] unrecognized feature gate: NetworkLiveMigration Feb 26 15:42:17 crc kubenswrapper[4907]: W0226 15:42:17.918369 4907 feature_gate.go:353] Setting GA feature gate CloudDualStackNodeIPs=true. It will be removed in a future release. 
Feb 26 15:42:17 crc kubenswrapper[4907]: W0226 15:42:17.918378 4907 feature_gate.go:330] unrecognized feature gate: BootcNodeManagement Feb 26 15:42:17 crc kubenswrapper[4907]: W0226 15:42:17.918387 4907 feature_gate.go:330] unrecognized feature gate: RouteAdvertisements Feb 26 15:42:17 crc kubenswrapper[4907]: W0226 15:42:17.918395 4907 feature_gate.go:330] unrecognized feature gate: AWSClusterHostedDNS Feb 26 15:42:17 crc kubenswrapper[4907]: W0226 15:42:17.918404 4907 feature_gate.go:330] unrecognized feature gate: MinimumKubeletVersion Feb 26 15:42:17 crc kubenswrapper[4907]: W0226 15:42:17.918412 4907 feature_gate.go:330] unrecognized feature gate: IngressControllerLBSubnetsAWS Feb 26 15:42:17 crc kubenswrapper[4907]: W0226 15:42:17.918420 4907 feature_gate.go:330] unrecognized feature gate: BareMetalLoadBalancer Feb 26 15:42:17 crc kubenswrapper[4907]: W0226 15:42:17.918429 4907 feature_gate.go:330] unrecognized feature gate: ManagedBootImagesAWS Feb 26 15:42:17 crc kubenswrapper[4907]: I0226 15:42:17.918442 4907 feature_gate.go:386] feature gates: {map[CloudDualStackNodeIPs:true DisableKubeletCloudCredentialProviders:true DynamicResourceAllocation:false EventedPLEG:false KMSv1:true MaxUnavailableStatefulSet:false NodeSwap:false ProcMountType:false RouteExternalCertificate:false ServiceAccountTokenNodeBinding:false TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:false UserNamespacesSupport:false ValidatingAdmissionPolicy:true VolumeAttributesClass:false]} Feb 26 15:42:17 crc kubenswrapper[4907]: I0226 15:42:17.918748 4907 server.go:940] "Client rotation is on, will bootstrap in background" Feb 26 15:42:17 crc kubenswrapper[4907]: E0226 15:42:17.923697 4907 bootstrap.go:266] "Unhandled Error" err="part of the existing bootstrap client certificate in /var/lib/kubelet/kubeconfig is expired: 2026-02-24 05:52:08 +0000 UTC" logger="UnhandledError" Feb 26 15:42:17 crc kubenswrapper[4907]: I0226 15:42:17.927391 4907 bootstrap.go:101] 
"Use the bootstrap credentials to request a cert, and set kubeconfig to point to the certificate dir" Feb 26 15:42:17 crc kubenswrapper[4907]: I0226 15:42:17.929123 4907 certificate_store.go:130] Loading cert/key pair from "/var/lib/kubelet/pki/kubelet-client-current.pem". Feb 26 15:42:17 crc kubenswrapper[4907]: I0226 15:42:17.932321 4907 server.go:997] "Starting client certificate rotation" Feb 26 15:42:17 crc kubenswrapper[4907]: I0226 15:42:17.932414 4907 certificate_manager.go:356] kubernetes.io/kube-apiserver-client-kubelet: Certificate rotation is enabled Feb 26 15:42:17 crc kubenswrapper[4907]: I0226 15:42:17.933453 4907 certificate_manager.go:356] kubernetes.io/kube-apiserver-client-kubelet: Rotating certificates Feb 26 15:42:17 crc kubenswrapper[4907]: I0226 15:42:17.961472 4907 dynamic_cafile_content.go:123] "Loaded a new CA Bundle and Verifier" name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt" Feb 26 15:42:17 crc kubenswrapper[4907]: I0226 15:42:17.964241 4907 dynamic_cafile_content.go:161] "Starting controller" name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt" Feb 26 15:42:17 crc kubenswrapper[4907]: E0226 15:42:17.965198 4907 certificate_manager.go:562] "Unhandled Error" err="kubernetes.io/kube-apiserver-client-kubelet: Failed while requesting a signed certificate from the control plane: cannot create certificate signing request: Post \"https://api-int.crc.testing:6443/apis/certificates.k8s.io/v1/certificatesigningrequests\": dial tcp 38.102.83.210:6443: connect: connection refused" logger="UnhandledError" Feb 26 15:42:17 crc kubenswrapper[4907]: I0226 15:42:17.988327 4907 log.go:25] "Validated CRI v1 runtime API" Feb 26 15:42:18 crc kubenswrapper[4907]: I0226 15:42:18.027579 4907 log.go:25] "Validated CRI v1 image API" Feb 26 15:42:18 crc kubenswrapper[4907]: I0226 15:42:18.030522 4907 server.go:1437] "Using cgroup driver setting received from the CRI runtime" cgroupDriver="systemd" Feb 26 15:42:18 crc kubenswrapper[4907]: I0226 
15:42:18.035256 4907 fs.go:133] Filesystem UUIDs: map[0b076daa-c26a-46d2-b3a6-72a8dbc6e257:/dev/vda4 2026-02-26-15-36-30-00:/dev/sr0 7B77-95E7:/dev/vda2 de0497b0-db1b-465a-b278-03db02455c71:/dev/vda3] Feb 26 15:42:18 crc kubenswrapper[4907]: I0226 15:42:18.035289 4907 fs.go:134] Filesystem partitions: map[/dev/shm:{mountpoint:/dev/shm major:0 minor:22 fsType:tmpfs blockSize:0} /dev/vda3:{mountpoint:/boot major:252 minor:3 fsType:ext4 blockSize:0} /dev/vda4:{mountpoint:/var major:252 minor:4 fsType:xfs blockSize:0} /run:{mountpoint:/run major:0 minor:24 fsType:tmpfs blockSize:0} /run/user/1000:{mountpoint:/run/user/1000 major:0 minor:42 fsType:tmpfs blockSize:0} /tmp:{mountpoint:/tmp major:0 minor:30 fsType:tmpfs blockSize:0} /var/lib/etcd:{mountpoint:/var/lib/etcd major:0 minor:43 fsType:tmpfs blockSize:0}] Feb 26 15:42:18 crc kubenswrapper[4907]: I0226 15:42:18.049743 4907 manager.go:217] Machine: {Timestamp:2026-02-26 15:42:18.046644203 +0000 UTC m=+0.565206072 CPUVendorID:AuthenticAMD NumCores:8 NumPhysicalCores:1 NumSockets:8 CpuFrequency:2799998 MemoryCapacity:25199480832 SwapCapacity:0 MemoryByType:map[] NVMInfo:{MemoryModeCapacity:0 AppDirectModeCapacity:0 AvgPowerBudget:0} HugePages:[{PageSize:1048576 NumPages:0} {PageSize:2048 NumPages:0}] MachineID:21801e6708c44f15b81395eb736a7cec SystemUUID:7af7b453-01c3-4b8b-8c30-b1df8ce070ce BootID:16aec221-b9ec-4b79-ac12-986d05cb9b8b Filesystems:[{Device:/var/lib/etcd DeviceMajor:0 DeviceMinor:43 Capacity:1073741824 Type:vfs Inodes:3076108 HasInodes:true} {Device:/dev/shm DeviceMajor:0 DeviceMinor:22 Capacity:12599738368 Type:vfs Inodes:3076108 HasInodes:true} {Device:/run DeviceMajor:0 DeviceMinor:24 Capacity:5039898624 Type:vfs Inodes:819200 HasInodes:true} {Device:/dev/vda4 DeviceMajor:252 DeviceMinor:4 Capacity:85292941312 Type:vfs Inodes:41679680 HasInodes:true} {Device:/tmp DeviceMajor:0 DeviceMinor:30 Capacity:12599742464 Type:vfs Inodes:1048576 HasInodes:true} {Device:/dev/vda3 DeviceMajor:252 DeviceMinor:3 
Capacity:366869504 Type:vfs Inodes:98304 HasInodes:true} {Device:/run/user/1000 DeviceMajor:0 DeviceMinor:42 Capacity:2519945216 Type:vfs Inodes:615221 HasInodes:true}] DiskMap:map[252:0:{Name:vda Major:252 Minor:0 Size:429496729600 Scheduler:none}] NetworkDevices:[{Name:br-ex MacAddress:fa:16:3e:11:db:3a Speed:0 Mtu:1500} {Name:br-int MacAddress:d6:39:55:2e:22:71 Speed:0 Mtu:1400} {Name:ens3 MacAddress:fa:16:3e:11:db:3a Speed:-1 Mtu:1500} {Name:ens7 MacAddress:fa:16:3e:7f:b9:a4 Speed:-1 Mtu:1500} {Name:ens7.20 MacAddress:52:54:00:bd:7b:35 Speed:-1 Mtu:1496} {Name:ens7.21 MacAddress:52:54:00:93:b0:24 Speed:-1 Mtu:1496} {Name:ens7.22 MacAddress:52:54:00:e2:c5:e9 Speed:-1 Mtu:1496} {Name:eth10 MacAddress:56:65:8d:30:da:69 Speed:0 Mtu:1500} {Name:ovn-k8s-mp0 MacAddress:0a:58:0a:d9:00:02 Speed:0 Mtu:1400} {Name:ovs-system MacAddress:ee:a9:2a:c0:1f:10 Speed:0 Mtu:1500}] Topology:[{Id:0 Memory:25199480832 HugePages:[{PageSize:1048576 NumPages:0} {PageSize:2048 NumPages:0}] Cores:[{Id:0 Threads:[0] Caches:[{Id:0 Size:32768 Type:Data Level:1} {Id:0 Size:32768 Type:Instruction Level:1} {Id:0 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:0 Size:16777216 Type:Unified Level:3}] SocketID:0 BookID: DrawerID:} {Id:0 Threads:[1] Caches:[{Id:1 Size:32768 Type:Data Level:1} {Id:1 Size:32768 Type:Instruction Level:1} {Id:1 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:1 Size:16777216 Type:Unified Level:3}] SocketID:1 BookID: DrawerID:} {Id:0 Threads:[2] Caches:[{Id:2 Size:32768 Type:Data Level:1} {Id:2 Size:32768 Type:Instruction Level:1} {Id:2 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:2 Size:16777216 Type:Unified Level:3}] SocketID:2 BookID: DrawerID:} {Id:0 Threads:[3] Caches:[{Id:3 Size:32768 Type:Data Level:1} {Id:3 Size:32768 Type:Instruction Level:1} {Id:3 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:3 Size:16777216 Type:Unified Level:3}] SocketID:3 BookID: DrawerID:} {Id:0 Threads:[4] Caches:[{Id:4 Size:32768 Type:Data Level:1} {Id:4 Size:32768 
Type:Instruction Level:1} {Id:4 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:4 Size:16777216 Type:Unified Level:3}] SocketID:4 BookID: DrawerID:} {Id:0 Threads:[5] Caches:[{Id:5 Size:32768 Type:Data Level:1} {Id:5 Size:32768 Type:Instruction Level:1} {Id:5 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:5 Size:16777216 Type:Unified Level:3}] SocketID:5 BookID: DrawerID:} {Id:0 Threads:[6] Caches:[{Id:6 Size:32768 Type:Data Level:1} {Id:6 Size:32768 Type:Instruction Level:1} {Id:6 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:6 Size:16777216 Type:Unified Level:3}] SocketID:6 BookID: DrawerID:} {Id:0 Threads:[7] Caches:[{Id:7 Size:32768 Type:Data Level:1} {Id:7 Size:32768 Type:Instruction Level:1} {Id:7 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:7 Size:16777216 Type:Unified Level:3}] SocketID:7 BookID: DrawerID:}] Caches:[] Distances:[10]}] CloudProvider:Unknown InstanceType:Unknown InstanceID:None} Feb 26 15:42:18 crc kubenswrapper[4907]: I0226 15:42:18.049945 4907 manager_no_libpfm.go:29] cAdvisor is build without cgo and/or libpfm support. Perf event counters are not available. 
Feb 26 15:42:18 crc kubenswrapper[4907]: I0226 15:42:18.050055 4907 manager.go:233] Version: {KernelVersion:5.14.0-427.50.2.el9_4.x86_64 ContainerOsVersion:Red Hat Enterprise Linux CoreOS 418.94.202502100215-0 DockerVersion: DockerAPIVersion: CadvisorVersion: CadvisorRevision:}
Feb 26 15:42:18 crc kubenswrapper[4907]: I0226 15:42:18.051442 4907 swap_util.go:113] "Swap is on" /proc/swaps contents="Filename\t\t\t\tType\t\tSize\t\tUsed\t\tPriority"
Feb 26 15:42:18 crc kubenswrapper[4907]: I0226 15:42:18.051660 4907 container_manager_linux.go:267] "Container manager verified user specified cgroup-root exists" cgroupRoot=[]
Feb 26 15:42:18 crc kubenswrapper[4907]: I0226 15:42:18.051698 4907 container_manager_linux.go:272] "Creating Container Manager object based on Node Config" nodeConfig={"NodeName":"crc","RuntimeCgroupsName":"/system.slice/crio.service","SystemCgroupsName":"/system.slice","KubeletCgroupsName":"","KubeletOOMScoreAdj":-999,"ContainerRuntime":"","CgroupsPerQOS":true,"CgroupRoot":"/","CgroupDriver":"systemd","KubeletRootDir":"/var/lib/kubelet","ProtectKernelDefaults":true,"KubeReservedCgroupName":"","SystemReservedCgroupName":"","ReservedSystemCPUs":{},"EnforceNodeAllocatable":{"pods":{}},"KubeReserved":null,"SystemReserved":{"cpu":"200m","ephemeral-storage":"350Mi","memory":"350Mi"},"HardEvictionThresholds":[{"Signal":"memory.available","Operator":"LessThan","Value":{"Quantity":"100Mi","Percentage":0},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.1},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.15},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null}],"QOSReserved":{},"CPUManagerPolicy":"none","CPUManagerPolicyOptions":null,"TopologyManagerScope":"container","CPUManagerReconcilePeriod":10000000000,"ExperimentalMemoryManagerPolicy":"None","ExperimentalMemoryManagerReservedMemory":null,"PodPidsLimit":4096,"EnforceCPULimits":true,"CPUCFSQuotaPeriod":100000000,"TopologyManagerPolicy":"none","TopologyManagerPolicyOptions":null,"CgroupVersion":2}
Feb 26 15:42:18 crc kubenswrapper[4907]: I0226 15:42:18.051929 4907 topology_manager.go:138] "Creating topology manager with none policy"
Feb 26 15:42:18 crc kubenswrapper[4907]: I0226 15:42:18.051942 4907 container_manager_linux.go:303] "Creating device plugin manager"
Feb 26 15:42:18 crc kubenswrapper[4907]: I0226 15:42:18.052485 4907 manager.go:142] "Creating Device Plugin manager" path="/var/lib/kubelet/device-plugins/kubelet.sock"
Feb 26 15:42:18 crc kubenswrapper[4907]: I0226 15:42:18.052522 4907 server.go:66] "Creating device plugin registration server" version="v1beta1" socket="/var/lib/kubelet/device-plugins/kubelet.sock"
Feb 26 15:42:18 crc kubenswrapper[4907]: I0226 15:42:18.053261 4907 state_mem.go:36] "Initialized new in-memory state store"
Feb 26 15:42:18 crc kubenswrapper[4907]: I0226 15:42:18.053368 4907 server.go:1245] "Using root directory" path="/var/lib/kubelet"
Feb 26 15:42:18 crc kubenswrapper[4907]: I0226 15:42:18.056777 4907 kubelet.go:418] "Attempting to sync node with API server"
Feb 26 15:42:18 crc kubenswrapper[4907]: I0226 15:42:18.056868 4907 kubelet.go:313] "Adding static pod path" path="/etc/kubernetes/manifests"
Feb 26 15:42:18 crc kubenswrapper[4907]: I0226 15:42:18.056926 4907 file.go:69] "Watching path" path="/etc/kubernetes/manifests"
Feb 26 15:42:18 crc kubenswrapper[4907]: I0226 15:42:18.056952 4907 kubelet.go:324] "Adding apiserver pod source"
Feb 26 15:42:18 crc kubenswrapper[4907]: I0226 15:42:18.056972 4907 apiserver.go:42] "Waiting for node sync before watching apiserver pods"
Feb 26 15:42:18 crc kubenswrapper[4907]: I0226 15:42:18.061811 4907 kuberuntime_manager.go:262] "Container runtime initialized" containerRuntime="cri-o" version="1.31.5-4.rhaos4.18.gitdad78d5.el9" apiVersion="v1"
Feb 26 15:42:18 crc kubenswrapper[4907]: I0226 15:42:18.062934 4907 certificate_store.go:130] Loading cert/key pair from "/var/lib/kubelet/pki/kubelet-server-current.pem".
Feb 26 15:42:18 crc kubenswrapper[4907]: W0226 15:42:18.063048 4907 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Node: Get "https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0": dial tcp 38.102.83.210:6443: connect: connection refused
Feb 26 15:42:18 crc kubenswrapper[4907]: W0226 15:42:18.063149 4907 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Service: Get "https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0": dial tcp 38.102.83.210:6443: connect: connection refused
Feb 26 15:42:18 crc kubenswrapper[4907]: E0226 15:42:18.063213 4907 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Node: failed to list *v1.Node: Get \"https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0\": dial tcp 38.102.83.210:6443: connect: connection refused" logger="UnhandledError"
Feb 26 15:42:18 crc kubenswrapper[4907]: E0226 15:42:18.063229 4907 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Service: failed to list *v1.Service: Get \"https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0\": dial tcp 38.102.83.210:6443: connect: connection refused" logger="UnhandledError"
Feb 26 15:42:18 crc kubenswrapper[4907]: I0226 15:42:18.064205 4907 kubelet.go:854] "Not starting ClusterTrustBundle informer because we are in static kubelet mode"
Feb 26 15:42:18 crc kubenswrapper[4907]: I0226 15:42:18.065749 4907 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/portworx-volume"
Feb 26 15:42:18 crc kubenswrapper[4907]: I0226 15:42:18.065777 4907 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/empty-dir"
Feb 26 15:42:18 crc kubenswrapper[4907]: I0226 15:42:18.065787 4907 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/git-repo"
Feb 26 15:42:18 crc kubenswrapper[4907]: I0226 15:42:18.065797 4907 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/host-path"
Feb 26 15:42:18 crc kubenswrapper[4907]: I0226 15:42:18.065810 4907 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/nfs"
Feb 26 15:42:18 crc kubenswrapper[4907]: I0226 15:42:18.065819 4907 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/secret"
Feb 26 15:42:18 crc kubenswrapper[4907]: I0226 15:42:18.065828 4907 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/iscsi"
Feb 26 15:42:18 crc kubenswrapper[4907]: I0226 15:42:18.065843 4907 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/downward-api"
Feb 26 15:42:18 crc kubenswrapper[4907]: I0226 15:42:18.065853 4907 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/fc"
Feb 26 15:42:18 crc kubenswrapper[4907]: I0226 15:42:18.065862 4907 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/configmap"
Feb 26 15:42:18 crc kubenswrapper[4907]: I0226 15:42:18.065875 4907 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/projected"
Feb 26 15:42:18 crc kubenswrapper[4907]: I0226 15:42:18.065885 4907 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/local-volume"
Feb 26 15:42:18 crc kubenswrapper[4907]: I0226 15:42:18.067098 4907 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/csi"
Feb 26 15:42:18 crc kubenswrapper[4907]: I0226 15:42:18.067754 4907 server.go:1280] "Started kubelet"
Feb 26 15:42:18 crc kubenswrapper[4907]: I0226 15:42:18.068749 4907 server.go:163] "Starting to listen" address="0.0.0.0" port=10250
Feb 26 15:42:18 crc kubenswrapper[4907]: I0226 15:42:18.068931 4907 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": dial tcp 38.102.83.210:6443: connect: connection refused
Feb 26 15:42:18 crc kubenswrapper[4907]: I0226 15:42:18.068726 4907 ratelimit.go:55] "Setting rate limiting for endpoint" service="podresources" qps=100 burstTokens=10
Feb 26 15:42:18 crc kubenswrapper[4907]: I0226 15:42:18.069792 4907 server.go:236] "Starting to serve the podresources API" endpoint="unix:/var/lib/kubelet/pod-resources/kubelet.sock"
Feb 26 15:42:18 crc systemd[1]: Started Kubernetes Kubelet.
Feb 26 15:42:18 crc kubenswrapper[4907]: I0226 15:42:18.071311 4907 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate rotation is enabled
Feb 26 15:42:18 crc kubenswrapper[4907]: I0226 15:42:18.071450 4907 fs_resource_analyzer.go:67] "Starting FS ResourceAnalyzer"
Feb 26 15:42:18 crc kubenswrapper[4907]: I0226 15:42:18.073235 4907 volume_manager.go:287] "The desired_state_of_world populator starts"
Feb 26 15:42:18 crc kubenswrapper[4907]: I0226 15:42:18.073277 4907 volume_manager.go:289] "Starting Kubelet Volume Manager"
Feb 26 15:42:18 crc kubenswrapper[4907]: I0226 15:42:18.076839 4907 desired_state_of_world_populator.go:146] "Desired state populator starts to run"
Feb 26 15:42:18 crc kubenswrapper[4907]: E0226 15:42:18.079644 4907 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Feb 26 15:42:18 crc kubenswrapper[4907]: W0226 15:42:18.080430 4907 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.CSIDriver: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0": dial tcp 38.102.83.210:6443: connect: connection refused
Feb 26 15:42:18 crc kubenswrapper[4907]: E0226 15:42:18.080511 4907 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.CSIDriver: failed to list *v1.CSIDriver: Get \"https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0\": dial tcp 38.102.83.210:6443: connect: connection refused" logger="UnhandledError"
Feb 26 15:42:18 crc kubenswrapper[4907]: E0226 15:42:18.080491 4907 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.210:6443: connect: connection refused" interval="200ms"
Feb 26 15:42:18 crc kubenswrapper[4907]: I0226 15:42:18.080714 4907 server.go:460] "Adding debug handlers to kubelet server"
Feb 26 15:42:18 crc kubenswrapper[4907]: I0226 15:42:18.081051 4907 factory.go:55] Registering systemd factory
Feb 26 15:42:18 crc kubenswrapper[4907]: I0226 15:42:18.081080 4907 factory.go:221] Registration of the systemd container factory successfully
Feb 26 15:42:18 crc kubenswrapper[4907]: I0226 15:42:18.087730 4907 factory.go:153] Registering CRI-O factory
Feb 26 15:42:18 crc kubenswrapper[4907]: I0226 15:42:18.087784 4907 factory.go:221] Registration of the crio container factory successfully
Feb 26 15:42:18 crc kubenswrapper[4907]: I0226 15:42:18.087904 4907 factory.go:219] Registration of the containerd container factory failed: unable to create containerd client: containerd: cannot unix dial containerd api service: dial unix /run/containerd/containerd.sock: connect: no such file or directory
Feb 26 15:42:18 crc kubenswrapper[4907]: I0226 15:42:18.087942 4907 factory.go:103] Registering Raw factory
Feb 26 15:42:18 crc kubenswrapper[4907]: I0226 15:42:18.087990 4907 manager.go:1196] Started watching for new ooms in manager
Feb 26 15:42:18 crc kubenswrapper[4907]: I0226 15:42:18.089310 4907 manager.go:319] Starting recovery of all containers
Feb 26 15:42:18 crc kubenswrapper[4907]: E0226 15:42:18.084142 4907 event.go:368] "Unable to write event (may retry after sleeping)" err="Post \"https://api-int.crc.testing:6443/api/v1/namespaces/default/events\": dial tcp 38.102.83.210:6443: connect: connection refused" event="&Event{ObjectMeta:{crc.1897d63d82b3566a default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:Starting,Message:Starting kubelet.,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-02-26 15:42:18.067719786 +0000 UTC m=+0.586281645,LastTimestamp:2026-02-26 15:42:18.067719786 +0000 UTC m=+0.586281645,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}"
Feb 26 15:42:18 crc kubenswrapper[4907]: I0226 15:42:18.096229 4907 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5b88f790-22fa-440e-b583-365168c0b23d" volumeName="kubernetes.io/projected/5b88f790-22fa-440e-b583-365168c0b23d-kube-api-access-jkwtn" seLinuxMountContext=""
Feb 26 15:42:18 crc kubenswrapper[4907]: I0226 15:42:18.096330 4907 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6509e943-70c6-444c-bc41-48a544e36fbd" volumeName="kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-service-ca-bundle" seLinuxMountContext=""
Feb 26 15:42:18 crc kubenswrapper[4907]: I0226 15:42:18.096376 4907 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" volumeName="kubernetes.io/projected/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-kube-api-access-w4xd4" seLinuxMountContext=""
Feb 26 15:42:18 crc kubenswrapper[4907]: I0226 15:42:18.096391 4907 reconstruct.go:130] "Volume is marked as uncertain and added into the actual 
state" pod="" podName="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" volumeName="kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-encryption-config" seLinuxMountContext="" Feb 26 15:42:18 crc kubenswrapper[4907]: I0226 15:42:18.096408 4907 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09efc573-dbb6-4249-bd59-9b87aba8dd28" volumeName="kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-ca" seLinuxMountContext="" Feb 26 15:42:18 crc kubenswrapper[4907]: I0226 15:42:18.096420 4907 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1386a44e-36a2-460c-96d0-0359d2b6f0f5" volumeName="kubernetes.io/configmap/1386a44e-36a2-460c-96d0-0359d2b6f0f5-config" seLinuxMountContext="" Feb 26 15:42:18 crc kubenswrapper[4907]: I0226 15:42:18.096433 4907 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" volumeName="kubernetes.io/configmap/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-config" seLinuxMountContext="" Feb 26 15:42:18 crc kubenswrapper[4907]: I0226 15:42:18.096450 4907 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="496e6271-fb68-4057-954e-a0d97a4afa3f" volumeName="kubernetes.io/configmap/496e6271-fb68-4057-954e-a0d97a4afa3f-config" seLinuxMountContext="" Feb 26 15:42:18 crc kubenswrapper[4907]: I0226 15:42:18.096466 4907 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="96b93a3a-6083-4aea-8eab-fe1aa8245ad9" volumeName="kubernetes.io/secret/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-metrics-tls" seLinuxMountContext="" Feb 26 15:42:18 crc kubenswrapper[4907]: I0226 15:42:18.096477 4907 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" 
volumeName="kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-srv-cert" seLinuxMountContext="" Feb 26 15:42:18 crc kubenswrapper[4907]: I0226 15:42:18.096486 4907 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="a31745f5-9847-4afe-82a5-3161cc66ca93" volumeName="kubernetes.io/configmap/a31745f5-9847-4afe-82a5-3161cc66ca93-trusted-ca" seLinuxMountContext="" Feb 26 15:42:18 crc kubenswrapper[4907]: I0226 15:42:18.096498 4907 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="22c825df-677d-4ca6-82db-3454ed06e783" volumeName="kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-config" seLinuxMountContext="" Feb 26 15:42:18 crc kubenswrapper[4907]: I0226 15:42:18.096510 4907 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="31d8b7a1-420e-4252-a5b7-eebe8a111292" volumeName="kubernetes.io/projected/31d8b7a1-420e-4252-a5b7-eebe8a111292-kube-api-access-zgdk5" seLinuxMountContext="" Feb 26 15:42:18 crc kubenswrapper[4907]: I0226 15:42:18.096525 4907 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="4bb40260-dbaa-4fb0-84df-5e680505d512" volumeName="kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-multus-daemon-config" seLinuxMountContext="" Feb 26 15:42:18 crc kubenswrapper[4907]: I0226 15:42:18.096539 4907 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5441d097-087c-4d9a-baa8-b210afa90fc9" volumeName="kubernetes.io/projected/5441d097-087c-4d9a-baa8-b210afa90fc9-kube-api-access-2d4wz" seLinuxMountContext="" Feb 26 15:42:18 crc kubenswrapper[4907]: I0226 15:42:18.096552 4907 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6402fda4-df10-493c-b4e5-d0569419652d" 
volumeName="kubernetes.io/projected/6402fda4-df10-493c-b4e5-d0569419652d-kube-api-access-mg5zb" seLinuxMountContext="" Feb 26 15:42:18 crc kubenswrapper[4907]: I0226 15:42:18.096563 4907 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09efc573-dbb6-4249-bd59-9b87aba8dd28" volumeName="kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-config" seLinuxMountContext="" Feb 26 15:42:18 crc kubenswrapper[4907]: I0226 15:42:18.096575 4907 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" volumeName="kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-apiservice-cert" seLinuxMountContext="" Feb 26 15:42:18 crc kubenswrapper[4907]: I0226 15:42:18.096600 4907 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="37a5e44f-9a88-4405-be8a-b645485e7312" volumeName="kubernetes.io/projected/37a5e44f-9a88-4405-be8a-b645485e7312-kube-api-access-rdwmf" seLinuxMountContext="" Feb 26 15:42:18 crc kubenswrapper[4907]: I0226 15:42:18.096613 4907 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5fe579f8-e8a6-4643-bce5-a661393c4dde" volumeName="kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-node-bootstrap-token" seLinuxMountContext="" Feb 26 15:42:18 crc kubenswrapper[4907]: I0226 15:42:18.096625 4907 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="87cf06ed-a83f-41a7-828d-70653580a8cb" volumeName="kubernetes.io/secret/87cf06ed-a83f-41a7-828d-70653580a8cb-metrics-tls" seLinuxMountContext="" Feb 26 15:42:18 crc kubenswrapper[4907]: I0226 15:42:18.096634 4907 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="fda69060-fa79-4696-b1a6-7980f124bf7c" 
volumeName="kubernetes.io/projected/fda69060-fa79-4696-b1a6-7980f124bf7c-kube-api-access-xcgwh" seLinuxMountContext="" Feb 26 15:42:18 crc kubenswrapper[4907]: I0226 15:42:18.096646 4907 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="0b78653f-4ff9-4508-8672-245ed9b561e3" volumeName="kubernetes.io/projected/0b78653f-4ff9-4508-8672-245ed9b561e3-kube-api-access" seLinuxMountContext="" Feb 26 15:42:18 crc kubenswrapper[4907]: I0226 15:42:18.096677 4907 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="43509403-f426-496e-be36-56cef71462f5" volumeName="kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-oauth-serving-cert" seLinuxMountContext="" Feb 26 15:42:18 crc kubenswrapper[4907]: I0226 15:42:18.096693 4907 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" volumeName="kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-utilities" seLinuxMountContext="" Feb 26 15:42:18 crc kubenswrapper[4907]: I0226 15:42:18.096705 4907 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bf126b07-da06-4140-9a57-dfd54fc6b486" volumeName="kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-bound-sa-token" seLinuxMountContext="" Feb 26 15:42:18 crc kubenswrapper[4907]: I0226 15:42:18.096724 4907 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="efdd0498-1daa-4136-9a4a-3b948c2293fc" volumeName="kubernetes.io/projected/efdd0498-1daa-4136-9a4a-3b948c2293fc-kube-api-access-fqsjt" seLinuxMountContext="" Feb 26 15:42:18 crc kubenswrapper[4907]: I0226 15:42:18.096737 4907 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" 
volumeName="kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-audit-policies" seLinuxMountContext="" Feb 26 15:42:18 crc kubenswrapper[4907]: I0226 15:42:18.096750 4907 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="87cf06ed-a83f-41a7-828d-70653580a8cb" volumeName="kubernetes.io/configmap/87cf06ed-a83f-41a7-828d-70653580a8cb-config-volume" seLinuxMountContext="" Feb 26 15:42:18 crc kubenswrapper[4907]: I0226 15:42:18.096760 4907 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="96b93a3a-6083-4aea-8eab-fe1aa8245ad9" volumeName="kubernetes.io/projected/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-kube-api-access-nzwt7" seLinuxMountContext="" Feb 26 15:42:18 crc kubenswrapper[4907]: I0226 15:42:18.096788 4907 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b6312bbd-5731-4ea0-a20f-81d5a57df44a" volumeName="kubernetes.io/projected/b6312bbd-5731-4ea0-a20f-81d5a57df44a-kube-api-access-249nr" seLinuxMountContext="" Feb 26 15:42:18 crc kubenswrapper[4907]: I0226 15:42:18.096802 4907 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-serving-cert" seLinuxMountContext="" Feb 26 15:42:18 crc kubenswrapper[4907]: I0226 15:42:18.096815 4907 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="496e6271-fb68-4057-954e-a0d97a4afa3f" volumeName="kubernetes.io/projected/496e6271-fb68-4057-954e-a0d97a4afa3f-kube-api-access" seLinuxMountContext="" Feb 26 15:42:18 crc kubenswrapper[4907]: I0226 15:42:18.096829 4907 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5225d0e4-402f-4861-b410-819f433b1803" 
volumeName="kubernetes.io/projected/5225d0e4-402f-4861-b410-819f433b1803-kube-api-access-9xfj7" seLinuxMountContext="" Feb 26 15:42:18 crc kubenswrapper[4907]: I0226 15:42:18.096841 4907 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" volumeName="kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert" seLinuxMountContext="" Feb 26 15:42:18 crc kubenswrapper[4907]: I0226 15:42:18.096857 4907 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6731426b-95fe-49ff-bb5f-40441049fde2" volumeName="kubernetes.io/projected/6731426b-95fe-49ff-bb5f-40441049fde2-kube-api-access-x7zkh" seLinuxMountContext="" Feb 26 15:42:18 crc kubenswrapper[4907]: I0226 15:42:18.096876 4907 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6ea678ab-3438-413e-bfe3-290ae7725660" volumeName="kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-script-lib" seLinuxMountContext="" Feb 26 15:42:18 crc kubenswrapper[4907]: I0226 15:42:18.096892 4907 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="d75a4c96-2883-4a0b-bab2-0fab2b6c0b49" volumeName="kubernetes.io/projected/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-kube-api-access-rczfb" seLinuxMountContext="" Feb 26 15:42:18 crc kubenswrapper[4907]: I0226 15:42:18.096908 4907 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="ef543e1b-8068-4ea3-b32a-61027b32e95d" volumeName="kubernetes.io/projected/ef543e1b-8068-4ea3-b32a-61027b32e95d-kube-api-access-s2kz5" seLinuxMountContext="" Feb 26 15:42:18 crc kubenswrapper[4907]: I0226 15:42:18.096920 4907 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" 
volumeName="kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-image-import-ca" seLinuxMountContext="" Feb 26 15:42:18 crc kubenswrapper[4907]: I0226 15:42:18.096931 4907 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" volumeName="kubernetes.io/projected/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-kube-api-access-6ccd8" seLinuxMountContext="" Feb 26 15:42:18 crc kubenswrapper[4907]: I0226 15:42:18.096943 4907 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="31d8b7a1-420e-4252-a5b7-eebe8a111292" volumeName="kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-images" seLinuxMountContext="" Feb 26 15:42:18 crc kubenswrapper[4907]: I0226 15:42:18.096954 4907 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="31d8b7a1-420e-4252-a5b7-eebe8a111292" volumeName="kubernetes.io/secret/31d8b7a1-420e-4252-a5b7-eebe8a111292-proxy-tls" seLinuxMountContext="" Feb 26 15:42:18 crc kubenswrapper[4907]: I0226 15:42:18.096965 4907 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-idp-0-file-data" seLinuxMountContext="" Feb 26 15:42:18 crc kubenswrapper[4907]: I0226 15:42:18.096978 4907 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5441d097-087c-4d9a-baa8-b210afa90fc9" volumeName="kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-config" seLinuxMountContext="" Feb 26 15:42:18 crc kubenswrapper[4907]: I0226 15:42:18.096991 4907 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="57a731c4-ef35-47a8-b875-bfb08a7f8011" volumeName="kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-catalog-content" 
seLinuxMountContext="" Feb 26 15:42:18 crc kubenswrapper[4907]: I0226 15:42:18.097004 4907 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="57a731c4-ef35-47a8-b875-bfb08a7f8011" volumeName="kubernetes.io/projected/57a731c4-ef35-47a8-b875-bfb08a7f8011-kube-api-access-cfbct" seLinuxMountContext="" Feb 26 15:42:18 crc kubenswrapper[4907]: I0226 15:42:18.097016 4907 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09efc573-dbb6-4249-bd59-9b87aba8dd28" volumeName="kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-service-ca" seLinuxMountContext="" Feb 26 15:42:18 crc kubenswrapper[4907]: I0226 15:42:18.097028 4907 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-client" seLinuxMountContext="" Feb 26 15:42:18 crc kubenswrapper[4907]: I0226 15:42:18.097039 4907 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="37a5e44f-9a88-4405-be8a-b645485e7312" volumeName="kubernetes.io/secret/37a5e44f-9a88-4405-be8a-b645485e7312-metrics-tls" seLinuxMountContext="" Feb 26 15:42:18 crc kubenswrapper[4907]: I0226 15:42:18.097050 4907 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="43509403-f426-496e-be36-56cef71462f5" volumeName="kubernetes.io/projected/43509403-f426-496e-be36-56cef71462f5-kube-api-access-qg5z5" seLinuxMountContext="" Feb 26 15:42:18 crc kubenswrapper[4907]: I0226 15:42:18.097063 4907 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-router-certs" seLinuxMountContext="" Feb 26 15:42:18 crc kubenswrapper[4907]: I0226 
15:42:18.097079 4907 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7583ce53-e0fe-4a16-9e4d-50516596a136" volumeName="kubernetes.io/secret/7583ce53-e0fe-4a16-9e4d-50516596a136-serving-cert" seLinuxMountContext="" Feb 26 15:42:18 crc kubenswrapper[4907]: I0226 15:42:18.097090 4907 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="9d4552c7-cd75-42dd-8880-30dd377c49a4" volumeName="kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-trusted-ca" seLinuxMountContext="" Feb 26 15:42:18 crc kubenswrapper[4907]: I0226 15:42:18.097101 4907 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bf126b07-da06-4140-9a57-dfd54fc6b486" volumeName="kubernetes.io/secret/bf126b07-da06-4140-9a57-dfd54fc6b486-image-registry-operator-tls" seLinuxMountContext="" Feb 26 15:42:18 crc kubenswrapper[4907]: I0226 15:42:18.097112 4907 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="c03ee662-fb2f-4fc4-a2c1-af487c19d254" volumeName="kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-stats-auth" seLinuxMountContext="" Feb 26 15:42:18 crc kubenswrapper[4907]: I0226 15:42:18.097125 4907 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" volumeName="kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-webhook-cert" seLinuxMountContext="" Feb 26 15:42:18 crc kubenswrapper[4907]: I0226 15:42:18.097136 4907 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="3ab1a177-2de0-46d9-b765-d0d0649bb42e" volumeName="kubernetes.io/projected/3ab1a177-2de0-46d9-b765-d0d0649bb42e-kube-api-access-4d4hj" seLinuxMountContext="" Feb 26 15:42:18 crc kubenswrapper[4907]: I0226 15:42:18.097148 4907 reconstruct.go:130] "Volume is marked as uncertain and added into the 
actual state" pod="" podName="43509403-f426-496e-be36-56cef71462f5" volumeName="kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-oauth-config" seLinuxMountContext="" Feb 26 15:42:18 crc kubenswrapper[4907]: I0226 15:42:18.097158 4907 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="01ab3dd5-8196-46d0-ad33-122e2ca51def" volumeName="kubernetes.io/secret/01ab3dd5-8196-46d0-ad33-122e2ca51def-serving-cert" seLinuxMountContext="" Feb 26 15:42:18 crc kubenswrapper[4907]: I0226 15:42:18.097172 4907 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="0b78653f-4ff9-4508-8672-245ed9b561e3" volumeName="kubernetes.io/configmap/0b78653f-4ff9-4508-8672-245ed9b561e3-service-ca" seLinuxMountContext="" Feb 26 15:42:18 crc kubenswrapper[4907]: I0226 15:42:18.097185 4907 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-serving-ca" seLinuxMountContext="" Feb 26 15:42:18 crc kubenswrapper[4907]: I0226 15:42:18.097200 4907 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-config" seLinuxMountContext="" Feb 26 15:42:18 crc kubenswrapper[4907]: I0226 15:42:18.097212 4907 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="25e176fe-21b4-4974-b1ed-c8b94f112a7f" volumeName="kubernetes.io/configmap/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-cabundle" seLinuxMountContext="" Feb 26 15:42:18 crc kubenswrapper[4907]: I0226 15:42:18.097221 4907 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="ef543e1b-8068-4ea3-b32a-61027b32e95d" 
volumeName="kubernetes.io/configmap/ef543e1b-8068-4ea3-b32a-61027b32e95d-env-overrides" seLinuxMountContext="" Feb 26 15:42:18 crc kubenswrapper[4907]: I0226 15:42:18.097232 4907 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-cliconfig" seLinuxMountContext="" Feb 26 15:42:18 crc kubenswrapper[4907]: I0226 15:42:18.097243 4907 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-serving-cert" seLinuxMountContext="" Feb 26 15:42:18 crc kubenswrapper[4907]: I0226 15:42:18.097254 4907 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5fe579f8-e8a6-4643-bce5-a661393c4dde" volumeName="kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-certs" seLinuxMountContext="" Feb 26 15:42:18 crc kubenswrapper[4907]: I0226 15:42:18.097263 4907 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b6cd30de-2eeb-49a2-ab40-9167f4560ff5" volumeName="kubernetes.io/secret/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-operator-metrics" seLinuxMountContext="" Feb 26 15:42:18 crc kubenswrapper[4907]: I0226 15:42:18.097272 4907 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="e7e6199b-1264-4501-8953-767f51328d08" volumeName="kubernetes.io/projected/e7e6199b-1264-4501-8953-767f51328d08-kube-api-access" seLinuxMountContext="" Feb 26 15:42:18 crc kubenswrapper[4907]: I0226 15:42:18.097283 4907 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5fe579f8-e8a6-4643-bce5-a661393c4dde" 
volumeName="kubernetes.io/projected/5fe579f8-e8a6-4643-bce5-a661393c4dde-kube-api-access-fcqwp" seLinuxMountContext="" Feb 26 15:42:18 crc kubenswrapper[4907]: I0226 15:42:18.097295 4907 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6509e943-70c6-444c-bc41-48a544e36fbd" volumeName="kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-trusted-ca-bundle" seLinuxMountContext="" Feb 26 15:42:18 crc kubenswrapper[4907]: I0226 15:42:18.097307 4907 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" volumeName="kubernetes.io/secret/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-serving-cert" seLinuxMountContext="" Feb 26 15:42:18 crc kubenswrapper[4907]: I0226 15:42:18.097321 4907 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" volumeName="kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-serving-cert" seLinuxMountContext="" Feb 26 15:42:18 crc kubenswrapper[4907]: I0226 15:42:18.097332 4907 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1386a44e-36a2-460c-96d0-0359d2b6f0f5" volumeName="kubernetes.io/secret/1386a44e-36a2-460c-96d0-0359d2b6f0f5-serving-cert" seLinuxMountContext="" Feb 26 15:42:18 crc kubenswrapper[4907]: I0226 15:42:18.097344 4907 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="25e176fe-21b4-4974-b1ed-c8b94f112a7f" volumeName="kubernetes.io/projected/25e176fe-21b4-4974-b1ed-c8b94f112a7f-kube-api-access-d4lsv" seLinuxMountContext="" Feb 26 15:42:18 crc kubenswrapper[4907]: I0226 15:42:18.097355 4907 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" volumeName="kubernetes.io/empty-dir/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-tmpfs" 
seLinuxMountContext="" Feb 26 15:42:18 crc kubenswrapper[4907]: I0226 15:42:18.097366 4907 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" volumeName="kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf" seLinuxMountContext="" Feb 26 15:42:18 crc kubenswrapper[4907]: I0226 15:42:18.097377 4907 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="a31745f5-9847-4afe-82a5-3161cc66ca93" volumeName="kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-kube-api-access-lz9wn" seLinuxMountContext="" Feb 26 15:42:18 crc kubenswrapper[4907]: I0226 15:42:18.097387 4907 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" volumeName="kubernetes.io/projected/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-kube-api-access-mnrrd" seLinuxMountContext="" Feb 26 15:42:18 crc kubenswrapper[4907]: I0226 15:42:18.097399 4907 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="3cb93b32-e0ae-4377-b9c8-fdb9842c6d59" volumeName="kubernetes.io/projected/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-kube-api-access-wxkg8" seLinuxMountContext="" Feb 26 15:42:18 crc kubenswrapper[4907]: I0226 15:42:18.097410 4907 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7539238d-5fe0-46ed-884e-1c3b566537ec" volumeName="kubernetes.io/projected/7539238d-5fe0-46ed-884e-1c3b566537ec-kube-api-access-tk88c" seLinuxMountContext="" Feb 26 15:42:18 crc kubenswrapper[4907]: I0226 15:42:18.097420 4907 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="a0128f3a-b052-44ed-a84e-c4c8aaf17c13" volumeName="kubernetes.io/secret/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-samples-operator-tls" seLinuxMountContext="" Feb 26 15:42:18 crc kubenswrapper[4907]: 
I0226 15:42:18.097429 4907 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="fda69060-fa79-4696-b1a6-7980f124bf7c" volumeName="kubernetes.io/configmap/fda69060-fa79-4696-b1a6-7980f124bf7c-mcd-auth-proxy-config" seLinuxMountContext="" Feb 26 15:42:18 crc kubenswrapper[4907]: I0226 15:42:18.097440 4907 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-registry-tls" seLinuxMountContext="" Feb 26 15:42:18 crc kubenswrapper[4907]: I0226 15:42:18.097450 4907 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="9d4552c7-cd75-42dd-8880-30dd377c49a4" volumeName="kubernetes.io/secret/9d4552c7-cd75-42dd-8880-30dd377c49a4-serving-cert" seLinuxMountContext="" Feb 26 15:42:18 crc kubenswrapper[4907]: I0226 15:42:18.097459 4907 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="efdd0498-1daa-4136-9a4a-3b948c2293fc" volumeName="kubernetes.io/secret/efdd0498-1daa-4136-9a4a-3b948c2293fc-webhook-certs" seLinuxMountContext="" Feb 26 15:42:18 crc kubenswrapper[4907]: I0226 15:42:18.097471 4907 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="0b574797-001e-440a-8f4e-c0be86edad0f" volumeName="kubernetes.io/configmap/0b574797-001e-440a-8f4e-c0be86edad0f-mcc-auth-proxy-config" seLinuxMountContext="" Feb 26 15:42:18 crc kubenswrapper[4907]: I0226 15:42:18.097482 4907 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" volumeName="kubernetes.io/projected/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-kube-api-access-qs4fp" seLinuxMountContext="" Feb 26 15:42:18 crc kubenswrapper[4907]: I0226 15:42:18.097493 4907 reconstruct.go:130] "Volume is marked as uncertain and 
added into the actual state" pod="" podName="496e6271-fb68-4057-954e-a0d97a4afa3f" volumeName="kubernetes.io/secret/496e6271-fb68-4057-954e-a0d97a4afa3f-serving-cert" seLinuxMountContext="" Feb 26 15:42:18 crc kubenswrapper[4907]: I0226 15:42:18.097504 4907 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-ocp-branding-template" seLinuxMountContext="" Feb 26 15:42:18 crc kubenswrapper[4907]: I0226 15:42:18.097515 4907 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="57a731c4-ef35-47a8-b875-bfb08a7f8011" volumeName="kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-utilities" seLinuxMountContext="" Feb 26 15:42:18 crc kubenswrapper[4907]: I0226 15:42:18.097527 4907 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-registry-certificates" seLinuxMountContext="" Feb 26 15:42:18 crc kubenswrapper[4907]: I0226 15:42:18.097544 4907 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-kube-api-access-kfwg7" seLinuxMountContext="" Feb 26 15:42:18 crc kubenswrapper[4907]: I0226 15:42:18.097555 4907 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" volumeName="kubernetes.io/projected/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-kube-api-access-zkvpv" seLinuxMountContext="" Feb 26 15:42:18 crc kubenswrapper[4907]: I0226 15:42:18.097566 4907 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" 
podName="31d8b7a1-420e-4252-a5b7-eebe8a111292" volumeName="kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-auth-proxy-config" seLinuxMountContext="" Feb 26 15:42:18 crc kubenswrapper[4907]: I0226 15:42:18.097577 4907 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="3ab1a177-2de0-46d9-b765-d0d0649bb42e" volumeName="kubernetes.io/secret/3ab1a177-2de0-46d9-b765-d0d0649bb42e-package-server-manager-serving-cert" seLinuxMountContext="" Feb 26 15:42:18 crc kubenswrapper[4907]: I0226 15:42:18.097604 4907 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="43509403-f426-496e-be36-56cef71462f5" volumeName="kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-service-ca" seLinuxMountContext="" Feb 26 15:42:18 crc kubenswrapper[4907]: I0226 15:42:18.097620 4907 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7583ce53-e0fe-4a16-9e4d-50516596a136" volumeName="kubernetes.io/projected/7583ce53-e0fe-4a16-9e4d-50516596a136-kube-api-access-xcphl" seLinuxMountContext="" Feb 26 15:42:18 crc kubenswrapper[4907]: I0226 15:42:18.097633 4907 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="ef543e1b-8068-4ea3-b32a-61027b32e95d" volumeName="kubernetes.io/configmap/ef543e1b-8068-4ea3-b32a-61027b32e95d-ovnkube-identity-cm" seLinuxMountContext="" Feb 26 15:42:18 crc kubenswrapper[4907]: I0226 15:42:18.097649 4907 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="43509403-f426-496e-be36-56cef71462f5" volumeName="kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-console-config" seLinuxMountContext="" Feb 26 15:42:18 crc kubenswrapper[4907]: I0226 15:42:18.097662 4907 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7539238d-5fe0-46ed-884e-1c3b566537ec" 
volumeName="kubernetes.io/configmap/7539238d-5fe0-46ed-884e-1c3b566537ec-config" seLinuxMountContext="" Feb 26 15:42:18 crc kubenswrapper[4907]: I0226 15:42:18.097674 4907 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7583ce53-e0fe-4a16-9e4d-50516596a136" volumeName="kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-client-ca" seLinuxMountContext="" Feb 26 15:42:18 crc kubenswrapper[4907]: I0226 15:42:18.097686 4907 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7bb08738-c794-4ee8-9972-3a62ca171029" volumeName="kubernetes.io/projected/7bb08738-c794-4ee8-9972-3a62ca171029-kube-api-access-279lb" seLinuxMountContext="" Feb 26 15:42:18 crc kubenswrapper[4907]: I0226 15:42:18.097701 4907 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" volumeName="kubernetes.io/empty-dir/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-available-featuregates" seLinuxMountContext="" Feb 26 15:42:18 crc kubenswrapper[4907]: I0226 15:42:18.097714 4907 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b6cd30de-2eeb-49a2-ab40-9167f4560ff5" volumeName="kubernetes.io/projected/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-kube-api-access-pj782" seLinuxMountContext="" Feb 26 15:42:18 crc kubenswrapper[4907]: I0226 15:42:18.097725 4907 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bf126b07-da06-4140-9a57-dfd54fc6b486" volumeName="kubernetes.io/configmap/bf126b07-da06-4140-9a57-dfd54fc6b486-trusted-ca" seLinuxMountContext="" Feb 26 15:42:18 crc kubenswrapper[4907]: I0226 15:42:18.097735 4907 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d" 
volumeName="kubernetes.io/projected/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d-kube-api-access-x2m85" seLinuxMountContext="" Feb 26 15:42:18 crc kubenswrapper[4907]: I0226 15:42:18.097749 4907 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/projected/1bf7eb37-55a3-4c65-b768-a94c82151e69-kube-api-access-sb6h7" seLinuxMountContext="" Feb 26 15:42:18 crc kubenswrapper[4907]: I0226 15:42:18.097767 4907 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5b88f790-22fa-440e-b583-365168c0b23d" volumeName="kubernetes.io/secret/5b88f790-22fa-440e-b583-365168c0b23d-metrics-certs" seLinuxMountContext="" Feb 26 15:42:18 crc kubenswrapper[4907]: I0226 15:42:18.097779 4907 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6ea678ab-3438-413e-bfe3-290ae7725660" volumeName="kubernetes.io/projected/6ea678ab-3438-413e-bfe3-290ae7725660-kube-api-access-htfz6" seLinuxMountContext="" Feb 26 15:42:18 crc kubenswrapper[4907]: I0226 15:42:18.097792 4907 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-trusted-ca" seLinuxMountContext="" Feb 26 15:42:18 crc kubenswrapper[4907]: I0226 15:42:18.097802 4907 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="9d4552c7-cd75-42dd-8880-30dd377c49a4" volumeName="kubernetes.io/projected/9d4552c7-cd75-42dd-8880-30dd377c49a4-kube-api-access-pcxfs" seLinuxMountContext="" Feb 26 15:42:18 crc kubenswrapper[4907]: I0226 15:42:18.097840 4907 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="a31745f5-9847-4afe-82a5-3161cc66ca93" 
volumeName="kubernetes.io/secret/a31745f5-9847-4afe-82a5-3161cc66ca93-metrics-tls" seLinuxMountContext="" Feb 26 15:42:18 crc kubenswrapper[4907]: I0226 15:42:18.097853 4907 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" volumeName="kubernetes.io/secret/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-serving-cert" seLinuxMountContext="" Feb 26 15:42:18 crc kubenswrapper[4907]: I0226 15:42:18.097865 4907 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-provider-selection" seLinuxMountContext="" Feb 26 15:42:18 crc kubenswrapper[4907]: I0226 15:42:18.097877 4907 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5441d097-087c-4d9a-baa8-b210afa90fc9" volumeName="kubernetes.io/secret/5441d097-087c-4d9a-baa8-b210afa90fc9-serving-cert" seLinuxMountContext="" Feb 26 15:42:18 crc kubenswrapper[4907]: I0226 15:42:18.097887 4907 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6ea678ab-3438-413e-bfe3-290ae7725660" volumeName="kubernetes.io/secret/6ea678ab-3438-413e-bfe3-290ae7725660-ovn-node-metrics-cert" seLinuxMountContext="" Feb 26 15:42:18 crc kubenswrapper[4907]: I0226 15:42:18.097897 4907 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="9d4552c7-cd75-42dd-8880-30dd377c49a4" volumeName="kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-config" seLinuxMountContext="" Feb 26 15:42:18 crc kubenswrapper[4907]: I0226 15:42:18.097907 4907 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49ef4625-1d3a-4a9f-b595-c2433d32326d" 
volumeName="kubernetes.io/projected/49ef4625-1d3a-4a9f-b595-c2433d32326d-kube-api-access-pjr6v" seLinuxMountContext="" Feb 26 15:42:18 crc kubenswrapper[4907]: I0226 15:42:18.097918 4907 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="01ab3dd5-8196-46d0-ad33-122e2ca51def" volumeName="kubernetes.io/projected/01ab3dd5-8196-46d0-ad33-122e2ca51def-kube-api-access-w7l8j" seLinuxMountContext="" Feb 26 15:42:18 crc kubenswrapper[4907]: I0226 15:42:18.097928 4907 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" volumeName="kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-trusted-ca-bundle" seLinuxMountContext="" Feb 26 15:42:18 crc kubenswrapper[4907]: I0226 15:42:18.097939 4907 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="0b574797-001e-440a-8f4e-c0be86edad0f" volumeName="kubernetes.io/projected/0b574797-001e-440a-8f4e-c0be86edad0f-kube-api-access-lzf88" seLinuxMountContext="" Feb 26 15:42:18 crc kubenswrapper[4907]: I0226 15:42:18.097949 4907 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1d611f23-29be-4491-8495-bee1670e935f" volumeName="kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-catalog-content" seLinuxMountContext="" Feb 26 15:42:18 crc kubenswrapper[4907]: I0226 15:42:18.097963 4907 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="44663579-783b-4372-86d6-acf235a62d72" volumeName="kubernetes.io/projected/44663579-783b-4372-86d6-acf235a62d72-kube-api-access-vt5rc" seLinuxMountContext="" Feb 26 15:42:18 crc kubenswrapper[4907]: I0226 15:42:18.097976 4907 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" 
volumeName="kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-audit" seLinuxMountContext="" Feb 26 15:42:18 crc kubenswrapper[4907]: I0226 15:42:18.097990 4907 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="22c825df-677d-4ca6-82db-3454ed06e783" volumeName="kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-auth-proxy-config" seLinuxMountContext="" Feb 26 15:42:18 crc kubenswrapper[4907]: I0226 15:42:18.098002 4907 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6ea678ab-3438-413e-bfe3-290ae7725660" volumeName="kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-env-overrides" seLinuxMountContext="" Feb 26 15:42:18 crc kubenswrapper[4907]: I0226 15:42:18.098017 4907 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="925f1c65-6136-48ba-85aa-3a3b50560753" volumeName="kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-env-overrides" seLinuxMountContext="" Feb 26 15:42:18 crc kubenswrapper[4907]: I0226 15:42:18.098032 4907 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="9d751cbb-f2e2-430d-9754-c882a5e924a5" volumeName="kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl" seLinuxMountContext="" Feb 26 15:42:18 crc kubenswrapper[4907]: I0226 15:42:18.098043 4907 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="ef543e1b-8068-4ea3-b32a-61027b32e95d" volumeName="kubernetes.io/secret/ef543e1b-8068-4ea3-b32a-61027b32e95d-webhook-cert" seLinuxMountContext="" Feb 26 15:42:18 crc kubenswrapper[4907]: I0226 15:42:18.098055 4907 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" volumeName="kubernetes.io/projected/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-kube-api-access-dbsvg" 
seLinuxMountContext="" Feb 26 15:42:18 crc kubenswrapper[4907]: I0226 15:42:18.098064 4907 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="fda69060-fa79-4696-b1a6-7980f124bf7c" volumeName="kubernetes.io/secret/fda69060-fa79-4696-b1a6-7980f124bf7c-proxy-tls" seLinuxMountContext="" Feb 26 15:42:18 crc kubenswrapper[4907]: I0226 15:42:18.098076 4907 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-encryption-config" seLinuxMountContext="" Feb 26 15:42:18 crc kubenswrapper[4907]: I0226 15:42:18.098087 4907 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5225d0e4-402f-4861-b410-819f433b1803" volumeName="kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-utilities" seLinuxMountContext="" Feb 26 15:42:18 crc kubenswrapper[4907]: I0226 15:42:18.098099 4907 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6ea678ab-3438-413e-bfe3-290ae7725660" volumeName="kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-config" seLinuxMountContext="" Feb 26 15:42:18 crc kubenswrapper[4907]: I0226 15:42:18.098109 4907 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="c03ee662-fb2f-4fc4-a2c1-af487c19d254" volumeName="kubernetes.io/configmap/c03ee662-fb2f-4fc4-a2c1-af487c19d254-service-ca-bundle" seLinuxMountContext="" Feb 26 15:42:18 crc kubenswrapper[4907]: I0226 15:42:18.098119 4907 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="c03ee662-fb2f-4fc4-a2c1-af487c19d254" volumeName="kubernetes.io/projected/c03ee662-fb2f-4fc4-a2c1-af487c19d254-kube-api-access-v47cf" seLinuxMountContext="" Feb 26 15:42:18 crc kubenswrapper[4907]: I0226 15:42:18.098129 4907 
reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b6312bbd-5731-4ea0-a20f-81d5a57df44a" volumeName="kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-srv-cert" seLinuxMountContext="" Feb 26 15:42:18 crc kubenswrapper[4907]: I0226 15:42:18.098140 4907 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" volumeName="kubernetes.io/secret/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-serving-cert" seLinuxMountContext="" Feb 26 15:42:18 crc kubenswrapper[4907]: I0226 15:42:18.098151 4907 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1d611f23-29be-4491-8495-bee1670e935f" volumeName="kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-utilities" seLinuxMountContext="" Feb 26 15:42:18 crc kubenswrapper[4907]: I0226 15:42:18.098162 4907 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="43509403-f426-496e-be36-56cef71462f5" volumeName="kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-serving-cert" seLinuxMountContext="" Feb 26 15:42:18 crc kubenswrapper[4907]: I0226 15:42:18.098174 4907 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="4bb40260-dbaa-4fb0-84df-5e680505d512" volumeName="kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-cni-binary-copy" seLinuxMountContext="" Feb 26 15:42:18 crc kubenswrapper[4907]: I0226 15:42:18.098185 4907 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6731426b-95fe-49ff-bb5f-40441049fde2" volumeName="kubernetes.io/secret/6731426b-95fe-49ff-bb5f-40441049fde2-control-plane-machine-set-operator-tls" seLinuxMountContext="" Feb 26 15:42:18 crc kubenswrapper[4907]: I0226 15:42:18.098197 4907 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" 
pod="" podName="7bb08738-c794-4ee8-9972-3a62ca171029" volumeName="kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-sysctl-allowlist" seLinuxMountContext="" Feb 26 15:42:18 crc kubenswrapper[4907]: I0226 15:42:18.098210 4907 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="22c825df-677d-4ca6-82db-3454ed06e783" volumeName="kubernetes.io/projected/22c825df-677d-4ca6-82db-3454ed06e783-kube-api-access-7c4vf" seLinuxMountContext="" Feb 26 15:42:18 crc kubenswrapper[4907]: I0226 15:42:18.098221 4907 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="25e176fe-21b4-4974-b1ed-c8b94f112a7f" volumeName="kubernetes.io/secret/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-key" seLinuxMountContext="" Feb 26 15:42:18 crc kubenswrapper[4907]: I0226 15:42:18.098234 4907 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="925f1c65-6136-48ba-85aa-3a3b50560753" volumeName="kubernetes.io/secret/925f1c65-6136-48ba-85aa-3a3b50560753-ovn-control-plane-metrics-cert" seLinuxMountContext="" Feb 26 15:42:18 crc kubenswrapper[4907]: I0226 15:42:18.098242 4907 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="c03ee662-fb2f-4fc4-a2c1-af487c19d254" volumeName="kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-default-certificate" seLinuxMountContext="" Feb 26 15:42:18 crc kubenswrapper[4907]: I0226 15:42:18.098252 4907 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="22c825df-677d-4ca6-82db-3454ed06e783" volumeName="kubernetes.io/secret/22c825df-677d-4ca6-82db-3454ed06e783-machine-approver-tls" seLinuxMountContext="" Feb 26 15:42:18 crc kubenswrapper[4907]: I0226 15:42:18.098264 4907 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="4bb40260-dbaa-4fb0-84df-5e680505d512" 
volumeName="kubernetes.io/projected/4bb40260-dbaa-4fb0-84df-5e680505d512-kube-api-access-2w9zh" seLinuxMountContext="" Feb 26 15:42:18 crc kubenswrapper[4907]: I0226 15:42:18.098276 4907 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" seLinuxMountContext="" Feb 26 15:42:18 crc kubenswrapper[4907]: I0226 15:42:18.100110 4907 reconstruct.go:144] "Volume is marked device as uncertain and added into the actual state" volumeName="kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" deviceMountPath="/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/1f4776af88835e41c12b831b4c9fed40233456d14189815a54dbe7f892fc1983/globalmount" Feb 26 15:42:18 crc kubenswrapper[4907]: I0226 15:42:18.100135 4907 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" volumeName="kubernetes.io/projected/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-kube-api-access-x4zgh" seLinuxMountContext="" Feb 26 15:42:18 crc kubenswrapper[4907]: I0226 15:42:18.100148 4907 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-service-ca" seLinuxMountContext="" Feb 26 15:42:18 crc kubenswrapper[4907]: I0226 15:42:18.100166 4907 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-error" seLinuxMountContext="" Feb 26 15:42:18 crc kubenswrapper[4907]: I0226 15:42:18.100175 4907 reconstruct.go:130] "Volume is marked as uncertain 
and added into the actual state" pod="" podName="bd23aa5c-e532-4e53-bccf-e79f130c5ae8" volumeName="kubernetes.io/projected/bd23aa5c-e532-4e53-bccf-e79f130c5ae8-kube-api-access-jhbk2" seLinuxMountContext="" Feb 26 15:42:18 crc kubenswrapper[4907]: I0226 15:42:18.100190 4907 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7583ce53-e0fe-4a16-9e4d-50516596a136" volumeName="kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-config" seLinuxMountContext="" Feb 26 15:42:18 crc kubenswrapper[4907]: I0226 15:42:18.100203 4907 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="87cf06ed-a83f-41a7-828d-70653580a8cb" volumeName="kubernetes.io/projected/87cf06ed-a83f-41a7-828d-70653580a8cb-kube-api-access-d6qdx" seLinuxMountContext="" Feb 26 15:42:18 crc kubenswrapper[4907]: I0226 15:42:18.100216 4907 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-bound-sa-token" seLinuxMountContext="" Feb 26 15:42:18 crc kubenswrapper[4907]: I0226 15:42:18.100230 4907 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" volumeName="kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-serving-ca" seLinuxMountContext="" Feb 26 15:42:18 crc kubenswrapper[4907]: I0226 15:42:18.100241 4907 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09efc573-dbb6-4249-bd59-9b87aba8dd28" volumeName="kubernetes.io/projected/09efc573-dbb6-4249-bd59-9b87aba8dd28-kube-api-access-8tdtz" seLinuxMountContext="" Feb 26 15:42:18 crc kubenswrapper[4907]: I0226 15:42:18.100251 4907 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" 
podName="20b0d48f-5fd6-431c-a545-e3c800c7b866" volumeName="kubernetes.io/secret/20b0d48f-5fd6-431c-a545-e3c800c7b866-cert" seLinuxMountContext="" Feb 26 15:42:18 crc kubenswrapper[4907]: I0226 15:42:18.100262 4907 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-audit-policies" seLinuxMountContext="" Feb 26 15:42:18 crc kubenswrapper[4907]: I0226 15:42:18.100315 4907 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6402fda4-df10-493c-b4e5-d0569419652d" volumeName="kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-images" seLinuxMountContext="" Feb 26 15:42:18 crc kubenswrapper[4907]: I0226 15:42:18.100331 4907 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="a31745f5-9847-4afe-82a5-3161cc66ca93" volumeName="kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-bound-sa-token" seLinuxMountContext="" Feb 26 15:42:18 crc kubenswrapper[4907]: I0226 15:42:18.100340 4907 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b6cd30de-2eeb-49a2-ab40-9167f4560ff5" volumeName="kubernetes.io/configmap/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-trusted-ca" seLinuxMountContext="" Feb 26 15:42:18 crc kubenswrapper[4907]: I0226 15:42:18.100353 4907 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="925f1c65-6136-48ba-85aa-3a3b50560753" volumeName="kubernetes.io/projected/925f1c65-6136-48ba-85aa-3a3b50560753-kube-api-access-s4n52" seLinuxMountContext="" Feb 26 15:42:18 crc kubenswrapper[4907]: I0226 15:42:18.100362 4907 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b6312bbd-5731-4ea0-a20f-81d5a57df44a" 
volumeName="kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-profile-collector-cert" seLinuxMountContext=""
Feb 26 15:42:18 crc kubenswrapper[4907]: I0226 15:42:18.100372 4907 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="d75a4c96-2883-4a0b-bab2-0fab2b6c0b49" volumeName="kubernetes.io/configmap/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-iptables-alerter-script" seLinuxMountContext=""
Feb 26 15:42:18 crc kubenswrapper[4907]: I0226 15:42:18.100382 4907 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="0b574797-001e-440a-8f4e-c0be86edad0f" volumeName="kubernetes.io/secret/0b574797-001e-440a-8f4e-c0be86edad0f-proxy-tls" seLinuxMountContext=""
Feb 26 15:42:18 crc kubenswrapper[4907]: I0226 15:42:18.100393 4907 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="20b0d48f-5fd6-431c-a545-e3c800c7b866" volumeName="kubernetes.io/projected/20b0d48f-5fd6-431c-a545-e3c800c7b866-kube-api-access-w9rds" seLinuxMountContext=""
Feb 26 15:42:18 crc kubenswrapper[4907]: I0226 15:42:18.100403 4907 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-trusted-ca-bundle" seLinuxMountContext=""
Feb 26 15:42:18 crc kubenswrapper[4907]: I0226 15:42:18.100416 4907 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/empty-dir/8f668bae-612b-4b75-9490-919e737c6a3b-ca-trust-extracted" seLinuxMountContext=""
Feb 26 15:42:18 crc kubenswrapper[4907]: I0226 15:42:18.100428 4907 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="925f1c65-6136-48ba-85aa-3a3b50560753" volumeName="kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-ovnkube-config" seLinuxMountContext=""
Feb 26 15:42:18 crc kubenswrapper[4907]: I0226 15:42:18.100437 4907 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="c03ee662-fb2f-4fc4-a2c1-af487c19d254" volumeName="kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-metrics-certs" seLinuxMountContext=""
Feb 26 15:42:18 crc kubenswrapper[4907]: I0226 15:42:18.100448 4907 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09efc573-dbb6-4249-bd59-9b87aba8dd28" volumeName="kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-serving-cert" seLinuxMountContext=""
Feb 26 15:42:18 crc kubenswrapper[4907]: I0226 15:42:18.100458 4907 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-trusted-ca-bundle" seLinuxMountContext=""
Feb 26 15:42:18 crc kubenswrapper[4907]: I0226 15:42:18.100470 4907 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/secret/8f668bae-612b-4b75-9490-919e737c6a3b-installation-pull-secrets" seLinuxMountContext=""
Feb 26 15:42:18 crc kubenswrapper[4907]: I0226 15:42:18.100480 4907 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="a0128f3a-b052-44ed-a84e-c4c8aaf17c13" volumeName="kubernetes.io/projected/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-kube-api-access-gf66m" seLinuxMountContext=""
Feb 26 15:42:18 crc kubenswrapper[4907]: I0226 15:42:18.100490 4907 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bf126b07-da06-4140-9a57-dfd54fc6b486" volumeName="kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-kube-api-access-rnphk" seLinuxMountContext=""
Feb 26 15:42:18 crc kubenswrapper[4907]: I0226 15:42:18.100502 4907 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6509e943-70c6-444c-bc41-48a544e36fbd" volumeName="kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-config" seLinuxMountContext=""
Feb 26 15:42:18 crc kubenswrapper[4907]: I0226 15:42:18.100513 4907 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7539238d-5fe0-46ed-884e-1c3b566537ec" volumeName="kubernetes.io/secret/7539238d-5fe0-46ed-884e-1c3b566537ec-serving-cert" seLinuxMountContext=""
Feb 26 15:42:18 crc kubenswrapper[4907]: I0226 15:42:18.100524 4907 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7bb08738-c794-4ee8-9972-3a62ca171029" volumeName="kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-binary-copy" seLinuxMountContext=""
Feb 26 15:42:18 crc kubenswrapper[4907]: I0226 15:42:18.100538 4907 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" volumeName="kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-client" seLinuxMountContext=""
Feb 26 15:42:18 crc kubenswrapper[4907]: I0226 15:42:18.100553 4907 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1386a44e-36a2-460c-96d0-0359d2b6f0f5" volumeName="kubernetes.io/projected/1386a44e-36a2-460c-96d0-0359d2b6f0f5-kube-api-access" seLinuxMountContext=""
Feb 26 15:42:18 crc kubenswrapper[4907]: I0226 15:42:18.100568 4907 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="3cb93b32-e0ae-4377-b9c8-fdb9842c6d59" volumeName="kubernetes.io/configmap/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-serviceca" seLinuxMountContext=""
Feb 26 15:42:18 crc kubenswrapper[4907]: I0226 15:42:18.100583 4907 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="43509403-f426-496e-be36-56cef71462f5" volumeName="kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-trusted-ca-bundle" seLinuxMountContext=""
Feb 26 15:42:18 crc kubenswrapper[4907]: I0226 15:42:18.100610 4907 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5441d097-087c-4d9a-baa8-b210afa90fc9" volumeName="kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-client-ca" seLinuxMountContext=""
Feb 26 15:42:18 crc kubenswrapper[4907]: I0226 15:42:18.100622 4907 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1d611f23-29be-4491-8495-bee1670e935f" volumeName="kubernetes.io/projected/1d611f23-29be-4491-8495-bee1670e935f-kube-api-access-bf2bz" seLinuxMountContext=""
Feb 26 15:42:18 crc kubenswrapper[4907]: I0226 15:42:18.100636 4907 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="3b6479f0-333b-4a96-9adf-2099afdc2447" volumeName="kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr" seLinuxMountContext=""
Feb 26 15:42:18 crc kubenswrapper[4907]: I0226 15:42:18.100647 4907 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6402fda4-df10-493c-b4e5-d0569419652d" volumeName="kubernetes.io/secret/6402fda4-df10-493c-b4e5-d0569419652d-machine-api-operator-tls" seLinuxMountContext=""
Feb 26 15:42:18 crc kubenswrapper[4907]: I0226 15:42:18.100660 4907 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7583ce53-e0fe-4a16-9e4d-50516596a136" volumeName="kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-proxy-ca-bundles" seLinuxMountContext=""
Feb 26 15:42:18 crc kubenswrapper[4907]: I0226 15:42:18.100670 4907 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" volumeName="kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-profile-collector-cert" seLinuxMountContext=""
Feb 26 15:42:18 crc kubenswrapper[4907]: I0226 15:42:18.100713 4907 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="e7e6199b-1264-4501-8953-767f51328d08" volumeName="kubernetes.io/configmap/e7e6199b-1264-4501-8953-767f51328d08-config" seLinuxMountContext=""
Feb 26 15:42:18 crc kubenswrapper[4907]: I0226 15:42:18.100723 4907 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="e7e6199b-1264-4501-8953-767f51328d08" volumeName="kubernetes.io/secret/e7e6199b-1264-4501-8953-767f51328d08-serving-cert" seLinuxMountContext=""
Feb 26 15:42:18 crc kubenswrapper[4907]: I0226 15:42:18.100736 4907 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="01ab3dd5-8196-46d0-ad33-122e2ca51def" volumeName="kubernetes.io/configmap/01ab3dd5-8196-46d0-ad33-122e2ca51def-config" seLinuxMountContext=""
Feb 26 15:42:18 crc kubenswrapper[4907]: I0226 15:42:18.100751 4907 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09efc573-dbb6-4249-bd59-9b87aba8dd28" volumeName="kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-client" seLinuxMountContext=""
Feb 26 15:42:18 crc kubenswrapper[4907]: I0226 15:42:18.100765 4907 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-session" seLinuxMountContext=""
Feb 26 15:42:18 crc kubenswrapper[4907]: I0226 15:42:18.100789 4907 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6402fda4-df10-493c-b4e5-d0569419652d" volumeName="kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-config" seLinuxMountContext=""
Feb 26 15:42:18 crc kubenswrapper[4907]: I0226 15:42:18.100803 4907 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6509e943-70c6-444c-bc41-48a544e36fbd" volumeName="kubernetes.io/projected/6509e943-70c6-444c-bc41-48a544e36fbd-kube-api-access-6g6sz" seLinuxMountContext=""
Feb 26 15:42:18 crc kubenswrapper[4907]: I0226 15:42:18.100815 4907 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/projected/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-kube-api-access-ngvvp" seLinuxMountContext=""
Feb 26 15:42:18 crc kubenswrapper[4907]: I0226 15:42:18.100825 4907 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" volumeName="kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-catalog-content" seLinuxMountContext=""
Feb 26 15:42:18 crc kubenswrapper[4907]: I0226 15:42:18.100838 4907 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="0b78653f-4ff9-4508-8672-245ed9b561e3" volumeName="kubernetes.io/secret/0b78653f-4ff9-4508-8672-245ed9b561e3-serving-cert" seLinuxMountContext=""
Feb 26 15:42:18 crc kubenswrapper[4907]: I0226 15:42:18.100850 4907 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-login" seLinuxMountContext=""
Feb 26 15:42:18 crc kubenswrapper[4907]: I0226 15:42:18.100865 4907 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5225d0e4-402f-4861-b410-819f433b1803" volumeName="kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-catalog-content" seLinuxMountContext=""
Feb 26 15:42:18 crc kubenswrapper[4907]: I0226 15:42:18.100878 4907 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6509e943-70c6-444c-bc41-48a544e36fbd" volumeName="kubernetes.io/secret/6509e943-70c6-444c-bc41-48a544e36fbd-serving-cert" seLinuxMountContext=""
Feb 26 15:42:18 crc kubenswrapper[4907]: I0226 15:42:18.100889 4907 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" volumeName="kubernetes.io/configmap/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-config" seLinuxMountContext=""
Feb 26 15:42:18 crc kubenswrapper[4907]: I0226 15:42:18.100902 4907 reconstruct.go:97] "Volume reconstruction finished"
Feb 26 15:42:18 crc kubenswrapper[4907]: I0226 15:42:18.100911 4907 reconciler.go:26] "Reconciler: start to sync state"
Feb 26 15:42:18 crc kubenswrapper[4907]: I0226 15:42:18.107957 4907 manager.go:324] Recovery completed
Feb 26 15:42:18 crc kubenswrapper[4907]: I0226 15:42:18.117790 4907 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Feb 26 15:42:18 crc kubenswrapper[4907]: I0226 15:42:18.121170 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 26 15:42:18 crc kubenswrapper[4907]: I0226 15:42:18.121225 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 26 15:42:18 crc kubenswrapper[4907]: I0226 15:42:18.121236 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 26 15:42:18 crc kubenswrapper[4907]: I0226 15:42:18.122706 4907 cpu_manager.go:225] "Starting CPU manager" policy="none"
Feb 26 15:42:18 crc kubenswrapper[4907]: I0226 15:42:18.122731 4907 cpu_manager.go:226] "Reconciling" reconcilePeriod="10s"
Feb 26 15:42:18 crc kubenswrapper[4907]: I0226 15:42:18.122757 4907 state_mem.go:36] "Initialized new in-memory state store"
Feb 26 15:42:18 crc kubenswrapper[4907]: I0226 15:42:18.123436 4907 kubelet_network_linux.go:50] "Initialized iptables rules." protocol="IPv4"
Feb 26 15:42:18 crc kubenswrapper[4907]: I0226 15:42:18.125045 4907 kubelet_network_linux.go:50] "Initialized iptables rules." protocol="IPv6"
Feb 26 15:42:18 crc kubenswrapper[4907]: I0226 15:42:18.125077 4907 status_manager.go:217] "Starting to sync pod status with apiserver"
Feb 26 15:42:18 crc kubenswrapper[4907]: I0226 15:42:18.125350 4907 kubelet.go:2335] "Starting kubelet main sync loop"
Feb 26 15:42:18 crc kubenswrapper[4907]: E0226 15:42:18.126703 4907 kubelet.go:2359] "Skipping pod synchronization" err="[container runtime status check may not have completed yet, PLEG is not healthy: pleg has yet to be successful]"
Feb 26 15:42:18 crc kubenswrapper[4907]: W0226 15:42:18.127557 4907 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.RuntimeClass: Get "https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0": dial tcp 38.102.83.210:6443: connect: connection refused
Feb 26 15:42:18 crc kubenswrapper[4907]: E0226 15:42:18.130454 4907 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.RuntimeClass: failed to list *v1.RuntimeClass: Get \"https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0\": dial tcp 38.102.83.210:6443: connect: connection refused" logger="UnhandledError"
Feb 26 15:42:18 crc kubenswrapper[4907]: I0226 15:42:18.140111 4907 policy_none.go:49] "None policy: Start"
Feb 26 15:42:18 crc kubenswrapper[4907]: I0226 15:42:18.141335 4907 memory_manager.go:170] "Starting memorymanager" policy="None"
Feb 26 15:42:18 crc kubenswrapper[4907]: I0226 15:42:18.141374 4907 state_mem.go:35] "Initializing new in-memory state store"
Feb 26 15:42:18 crc kubenswrapper[4907]: E0226 15:42:18.180511 4907 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Feb 26 15:42:18 crc kubenswrapper[4907]: I0226 15:42:18.186510 4907 manager.go:334] "Starting Device Plugin manager"
Feb 26 15:42:18 crc kubenswrapper[4907]: I0226 15:42:18.186804 4907 manager.go:513] "Failed to read data from checkpoint" checkpoint="kubelet_internal_checkpoint" err="checkpoint is not found"
Feb 26 15:42:18 crc kubenswrapper[4907]: I0226 15:42:18.186822 4907 server.go:79] "Starting device plugin registration server"
Feb 26 15:42:18 crc kubenswrapper[4907]: I0226 15:42:18.187313 4907 eviction_manager.go:189] "Eviction manager: starting control loop"
Feb 26 15:42:18 crc kubenswrapper[4907]: I0226 15:42:18.187330 4907 container_log_manager.go:189] "Initializing container log rotate workers" workers=1 monitorPeriod="10s"
Feb 26 15:42:18 crc kubenswrapper[4907]: I0226 15:42:18.187881 4907 plugin_watcher.go:51] "Plugin Watcher Start" path="/var/lib/kubelet/plugins_registry"
Feb 26 15:42:18 crc kubenswrapper[4907]: I0226 15:42:18.187986 4907 plugin_manager.go:116] "The desired_state_of_world populator (plugin watcher) starts"
Feb 26 15:42:18 crc kubenswrapper[4907]: I0226 15:42:18.187993 4907 plugin_manager.go:118] "Starting Kubelet Plugin Manager"
Feb 26 15:42:18 crc kubenswrapper[4907]: E0226 15:42:18.198048 4907 eviction_manager.go:285] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"crc\" not found"
Feb 26 15:42:18 crc kubenswrapper[4907]: I0226 15:42:18.230468 4907 kubelet.go:2421] "SyncLoop ADD" source="file" pods=["openshift-kube-scheduler/openshift-kube-scheduler-crc","openshift-machine-config-operator/kube-rbac-proxy-crio-crc","openshift-etcd/etcd-crc","openshift-kube-apiserver/kube-apiserver-crc","openshift-kube-controller-manager/kube-controller-manager-crc"]
Feb 26 15:42:18 crc kubenswrapper[4907]: I0226 15:42:18.230647 4907 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Feb 26 15:42:18 crc kubenswrapper[4907]: I0226 15:42:18.232929 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 26 15:42:18 crc kubenswrapper[4907]: I0226 15:42:18.232979 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 26 15:42:18 crc kubenswrapper[4907]: I0226 15:42:18.232990 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 26 15:42:18 crc kubenswrapper[4907]: I0226 15:42:18.233142 4907 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Feb 26 15:42:18 crc kubenswrapper[4907]: I0226 15:42:18.233347 4907 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc"
Feb 26 15:42:18 crc kubenswrapper[4907]: I0226 15:42:18.233401 4907 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Feb 26 15:42:18 crc kubenswrapper[4907]: I0226 15:42:18.234242 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 26 15:42:18 crc kubenswrapper[4907]: I0226 15:42:18.234272 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 26 15:42:18 crc kubenswrapper[4907]: I0226 15:42:18.234286 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 26 15:42:18 crc kubenswrapper[4907]: I0226 15:42:18.234311 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 26 15:42:18 crc kubenswrapper[4907]: I0226 15:42:18.234359 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 26 15:42:18 crc kubenswrapper[4907]: I0226 15:42:18.234569 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 26 15:42:18 crc kubenswrapper[4907]: I0226 15:42:18.234770 4907 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc"
Feb 26 15:42:18 crc kubenswrapper[4907]: I0226 15:42:18.234789 4907 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Feb 26 15:42:18 crc kubenswrapper[4907]: I0226 15:42:18.234791 4907 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Feb 26 15:42:18 crc kubenswrapper[4907]: I0226 15:42:18.235681 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 26 15:42:18 crc kubenswrapper[4907]: I0226 15:42:18.235708 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 26 15:42:18 crc kubenswrapper[4907]: I0226 15:42:18.235739 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 26 15:42:18 crc kubenswrapper[4907]: I0226 15:42:18.235792 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 26 15:42:18 crc kubenswrapper[4907]: I0226 15:42:18.235809 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 26 15:42:18 crc kubenswrapper[4907]: I0226 15:42:18.235748 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 26 15:42:18 crc kubenswrapper[4907]: I0226 15:42:18.235993 4907 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Feb 26 15:42:18 crc kubenswrapper[4907]: I0226 15:42:18.236445 4907 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-etcd/etcd-crc"
Feb 26 15:42:18 crc kubenswrapper[4907]: I0226 15:42:18.236489 4907 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Feb 26 15:42:18 crc kubenswrapper[4907]: I0226 15:42:18.236946 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 26 15:42:18 crc kubenswrapper[4907]: I0226 15:42:18.237008 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 26 15:42:18 crc kubenswrapper[4907]: I0226 15:42:18.237022 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 26 15:42:18 crc kubenswrapper[4907]: I0226 15:42:18.237296 4907 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Feb 26 15:42:18 crc kubenswrapper[4907]: I0226 15:42:18.237508 4907 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-crc"
Feb 26 15:42:18 crc kubenswrapper[4907]: I0226 15:42:18.237557 4907 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Feb 26 15:42:18 crc kubenswrapper[4907]: I0226 15:42:18.238691 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 26 15:42:18 crc kubenswrapper[4907]: I0226 15:42:18.238718 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 26 15:42:18 crc kubenswrapper[4907]: I0226 15:42:18.238732 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 26 15:42:18 crc kubenswrapper[4907]: I0226 15:42:18.238912 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 26 15:42:18 crc kubenswrapper[4907]: I0226 15:42:18.238935 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 26 15:42:18 crc kubenswrapper[4907]: I0226 15:42:18.238947 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 26 15:42:18 crc kubenswrapper[4907]: I0226 15:42:18.239110 4907 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-controller-manager/kube-controller-manager-crc"
Feb 26 15:42:18 crc kubenswrapper[4907]: I0226 15:42:18.239141 4907 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Feb 26 15:42:18 crc kubenswrapper[4907]: I0226 15:42:18.239252 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 26 15:42:18 crc kubenswrapper[4907]: I0226 15:42:18.239280 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 26 15:42:18 crc kubenswrapper[4907]: I0226 15:42:18.239295 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 26 15:42:18 crc kubenswrapper[4907]: I0226 15:42:18.239940 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 26 15:42:18 crc kubenswrapper[4907]: I0226 15:42:18.239998 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 26 15:42:18 crc kubenswrapper[4907]: I0226 15:42:18.240013 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 26 15:42:18 crc kubenswrapper[4907]: E0226 15:42:18.281974 4907 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.210:6443: connect: connection refused" interval="400ms"
Feb 26 15:42:18 crc kubenswrapper[4907]: I0226 15:42:18.288201 4907 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Feb 26 15:42:18 crc kubenswrapper[4907]: I0226 15:42:18.289461 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 26 15:42:18 crc kubenswrapper[4907]: I0226 15:42:18.289506 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 26 15:42:18 crc kubenswrapper[4907]: I0226 15:42:18.289527 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 26 15:42:18 crc kubenswrapper[4907]: I0226 15:42:18.289570 4907 kubelet_node_status.go:76] "Attempting to register node" node="crc"
Feb 26 15:42:18 crc kubenswrapper[4907]: E0226 15:42:18.290051 4907 kubelet_node_status.go:99] "Unable to register node with API server" err="Post \"https://api-int.crc.testing:6443/api/v1/nodes\": dial tcp 38.102.83.210:6443: connect: connection refused" node="crc"
Feb 26 15:42:18 crc kubenswrapper[4907]: I0226 15:42:18.303290 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/d1b160f5dda77d281dd8e69ec8d817f9-etc-kube\") pod \"kube-rbac-proxy-crio-crc\" (UID: \"d1b160f5dda77d281dd8e69ec8d817f9\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc"
Feb 26 15:42:18 crc kubenswrapper[4907]: I0226 15:42:18.303349 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-cert-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc"
Feb 26 15:42:18 crc kubenswrapper[4907]: I0226 15:42:18.303404 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/3dcd261975c3d6b9a6ad6367fd4facd3-cert-dir\") pod \"openshift-kube-scheduler-crc\" (UID: \"3dcd261975c3d6b9a6ad6367fd4facd3\") " pod="openshift-kube-scheduler/openshift-kube-scheduler-crc"
Feb 26 15:42:18 crc kubenswrapper[4907]: I0226 15:42:18.303453 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-resource-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc"
Feb 26 15:42:18 crc kubenswrapper[4907]: I0226 15:42:18.303500 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"usr-local-bin\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-usr-local-bin\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc"
Feb 26 15:42:18 crc kubenswrapper[4907]: I0226 15:42:18.303544 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f614b9022728cf315e60c057852e563e-resource-dir\") pod \"kube-controller-manager-crc\" (UID: \"f614b9022728cf315e60c057852e563e\") " pod="openshift-kube-controller-manager/kube-controller-manager-crc"
Feb 26 15:42:18 crc kubenswrapper[4907]: I0226 15:42:18.303586 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f614b9022728cf315e60c057852e563e-cert-dir\") pod \"kube-controller-manager-crc\" (UID: \"f614b9022728cf315e60c057852e563e\") " pod="openshift-kube-controller-manager/kube-controller-manager-crc"
Feb 26 15:42:18 crc kubenswrapper[4907]: I0226 15:42:18.303673 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/3dcd261975c3d6b9a6ad6367fd4facd3-resource-dir\") pod \"openshift-kube-scheduler-crc\" (UID: \"3dcd261975c3d6b9a6ad6367fd4facd3\") " pod="openshift-kube-scheduler/openshift-kube-scheduler-crc"
Feb 26 15:42:18 crc kubenswrapper[4907]: I0226 15:42:18.303769 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/d1b160f5dda77d281dd8e69ec8d817f9-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-crc\" (UID: \"d1b160f5dda77d281dd8e69ec8d817f9\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc"
Feb 26 15:42:18 crc kubenswrapper[4907]: I0226 15:42:18.303854 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"static-pod-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-static-pod-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc"
Feb 26 15:42:18 crc kubenswrapper[4907]: I0226 15:42:18.303927 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-resource-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc"
Feb 26 15:42:18 crc kubenswrapper[4907]: I0226 15:42:18.304026 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-audit-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc"
Feb 26 15:42:18 crc kubenswrapper[4907]: I0226 15:42:18.304065 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"data-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-data-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc"
Feb 26 15:42:18 crc kubenswrapper[4907]: I0226 15:42:18.304108 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-log-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc"
Feb 26 15:42:18 crc kubenswrapper[4907]: I0226 15:42:18.304205 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-cert-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc"
Feb 26 15:42:18 crc kubenswrapper[4907]: I0226 15:42:18.405215 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f614b9022728cf315e60c057852e563e-resource-dir\") pod \"kube-controller-manager-crc\" (UID: \"f614b9022728cf315e60c057852e563e\") " pod="openshift-kube-controller-manager/kube-controller-manager-crc"
Feb 26 15:42:18 crc kubenswrapper[4907]: I0226 15:42:18.405314 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-resource-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc"
Feb 26 15:42:18 crc kubenswrapper[4907]: I0226 15:42:18.405380 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"usr-local-bin\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-usr-local-bin\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc"
Feb 26 15:42:18 crc kubenswrapper[4907]: I0226 15:42:18.405411 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-resource-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc"
Feb 26 15:42:18 crc kubenswrapper[4907]: I0226 15:42:18.405435 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f614b9022728cf315e60c057852e563e-cert-dir\") pod \"kube-controller-manager-crc\" (UID: \"f614b9022728cf315e60c057852e563e\") " pod="openshift-kube-controller-manager/kube-controller-manager-crc"
Feb 26 15:42:18 crc kubenswrapper[4907]: I0226 15:42:18.405488 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/3dcd261975c3d6b9a6ad6367fd4facd3-resource-dir\") pod \"openshift-kube-scheduler-crc\" (UID: \"3dcd261975c3d6b9a6ad6367fd4facd3\") " pod="openshift-kube-scheduler/openshift-kube-scheduler-crc"
Feb 26 15:42:18 crc kubenswrapper[4907]: I0226 15:42:18.405515 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/d1b160f5dda77d281dd8e69ec8d817f9-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-crc\" (UID: \"d1b160f5dda77d281dd8e69ec8d817f9\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc"
Feb 26 15:42:18 crc kubenswrapper[4907]: I0226 15:42:18.405561 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"static-pod-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-static-pod-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc"
Feb 26 15:42:18 crc kubenswrapper[4907]: I0226 15:42:18.405618 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-cert-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc"
Feb 26 15:42:18 crc kubenswrapper[4907]: I0226 15:42:18.405649 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-audit-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc"
Feb 26 15:42:18 crc kubenswrapper[4907]: I0226 15:42:18.405707 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"data-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-data-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc"
Feb 26 15:42:18 crc kubenswrapper[4907]: I0226 15:42:18.405743 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-log-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc"
Feb 26 15:42:18 crc kubenswrapper[4907]: I0226 15:42:18.405803 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/3dcd261975c3d6b9a6ad6367fd4facd3-cert-dir\") pod \"openshift-kube-scheduler-crc\" (UID: \"3dcd261975c3d6b9a6ad6367fd4facd3\") " pod="openshift-kube-scheduler/openshift-kube-scheduler-crc"
Feb 26 15:42:18 crc kubenswrapper[4907]: I0226 15:42:18.405836 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/d1b160f5dda77d281dd8e69ec8d817f9-etc-kube\") pod \"kube-rbac-proxy-crio-crc\" (UID: \"d1b160f5dda77d281dd8e69ec8d817f9\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc"
Feb 26 15:42:18 crc kubenswrapper[4907]: I0226 15:42:18.405907 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-cert-dir\") pod 
\"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Feb 26 15:42:18 crc kubenswrapper[4907]: I0226 15:42:18.405953 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/3dcd261975c3d6b9a6ad6367fd4facd3-resource-dir\") pod \"openshift-kube-scheduler-crc\" (UID: \"3dcd261975c3d6b9a6ad6367fd4facd3\") " pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Feb 26 15:42:18 crc kubenswrapper[4907]: I0226 15:42:18.406001 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"usr-local-bin\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-usr-local-bin\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Feb 26 15:42:18 crc kubenswrapper[4907]: I0226 15:42:18.406126 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f614b9022728cf315e60c057852e563e-resource-dir\") pod \"kube-controller-manager-crc\" (UID: \"f614b9022728cf315e60c057852e563e\") " pod="openshift-kube-controller-manager/kube-controller-manager-crc" Feb 26 15:42:18 crc kubenswrapper[4907]: I0226 15:42:18.406133 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-resource-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Feb 26 15:42:18 crc kubenswrapper[4907]: I0226 15:42:18.406160 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-resource-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Feb 26 15:42:18 crc kubenswrapper[4907]: I0226 15:42:18.406253 4907 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"data-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-data-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Feb 26 15:42:18 crc kubenswrapper[4907]: I0226 15:42:18.406263 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f614b9022728cf315e60c057852e563e-cert-dir\") pod \"kube-controller-manager-crc\" (UID: \"f614b9022728cf315e60c057852e563e\") " pod="openshift-kube-controller-manager/kube-controller-manager-crc" Feb 26 15:42:18 crc kubenswrapper[4907]: I0226 15:42:18.406295 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/d1b160f5dda77d281dd8e69ec8d817f9-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-crc\" (UID: \"d1b160f5dda77d281dd8e69ec8d817f9\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" Feb 26 15:42:18 crc kubenswrapper[4907]: I0226 15:42:18.406356 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-cert-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Feb 26 15:42:18 crc kubenswrapper[4907]: I0226 15:42:18.406367 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-audit-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Feb 26 15:42:18 crc kubenswrapper[4907]: I0226 15:42:18.406408 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/3dcd261975c3d6b9a6ad6367fd4facd3-cert-dir\") pod \"openshift-kube-scheduler-crc\" (UID: 
\"3dcd261975c3d6b9a6ad6367fd4facd3\") " pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Feb 26 15:42:18 crc kubenswrapper[4907]: I0226 15:42:18.406453 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"static-pod-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-static-pod-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Feb 26 15:42:18 crc kubenswrapper[4907]: I0226 15:42:18.406475 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-cert-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Feb 26 15:42:18 crc kubenswrapper[4907]: I0226 15:42:18.406494 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-log-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Feb 26 15:42:18 crc kubenswrapper[4907]: I0226 15:42:18.406498 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/d1b160f5dda77d281dd8e69ec8d817f9-etc-kube\") pod \"kube-rbac-proxy-crio-crc\" (UID: \"d1b160f5dda77d281dd8e69ec8d817f9\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" Feb 26 15:42:18 crc kubenswrapper[4907]: I0226 15:42:18.490478 4907 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Feb 26 15:42:18 crc kubenswrapper[4907]: I0226 15:42:18.492570 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 26 15:42:18 crc kubenswrapper[4907]: I0226 15:42:18.492623 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 26 15:42:18 crc 
kubenswrapper[4907]: I0226 15:42:18.492634 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 26 15:42:18 crc kubenswrapper[4907]: I0226 15:42:18.492656 4907 kubelet_node_status.go:76] "Attempting to register node" node="crc" Feb 26 15:42:18 crc kubenswrapper[4907]: E0226 15:42:18.493156 4907 kubelet_node_status.go:99] "Unable to register node with API server" err="Post \"https://api-int.crc.testing:6443/api/v1/nodes\": dial tcp 38.102.83.210:6443: connect: connection refused" node="crc" Feb 26 15:42:18 crc kubenswrapper[4907]: I0226 15:42:18.565675 4907 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Feb 26 15:42:18 crc kubenswrapper[4907]: I0226 15:42:18.572966 4907 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" Feb 26 15:42:18 crc kubenswrapper[4907]: I0226 15:42:18.579398 4907 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-crc" Feb 26 15:42:18 crc kubenswrapper[4907]: I0226 15:42:18.600352 4907 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-etcd/etcd-crc" Feb 26 15:42:18 crc kubenswrapper[4907]: I0226 15:42:18.604430 4907 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Feb 26 15:42:18 crc kubenswrapper[4907]: W0226 15:42:18.625860 4907 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod3dcd261975c3d6b9a6ad6367fd4facd3.slice/crio-e2c40279cadd89ca556923fd12706819883ad7db57350ce1120d6c7721320ae4 WatchSource:0}: Error finding container e2c40279cadd89ca556923fd12706819883ad7db57350ce1120d6c7721320ae4: Status 404 returned error can't find the container with id e2c40279cadd89ca556923fd12706819883ad7db57350ce1120d6c7721320ae4 Feb 26 15:42:18 crc kubenswrapper[4907]: W0226 15:42:18.628511 4907 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf4b27818a5e8e43d0dc095d08835c792.slice/crio-724799a1667f181ccbe96f19632efd69370c2266a676adcdac8bac532fb034e7 WatchSource:0}: Error finding container 724799a1667f181ccbe96f19632efd69370c2266a676adcdac8bac532fb034e7: Status 404 returned error can't find the container with id 724799a1667f181ccbe96f19632efd69370c2266a676adcdac8bac532fb034e7 Feb 26 15:42:18 crc kubenswrapper[4907]: W0226 15:42:18.635215 4907 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd1b160f5dda77d281dd8e69ec8d817f9.slice/crio-c3585fa42a17b74ddb8832017975c7c4728ac840b8741bb42b02ef8abda865c7 WatchSource:0}: Error finding container c3585fa42a17b74ddb8832017975c7c4728ac840b8741bb42b02ef8abda865c7: Status 404 returned error can't find the container with id c3585fa42a17b74ddb8832017975c7c4728ac840b8741bb42b02ef8abda865c7 Feb 26 15:42:18 crc kubenswrapper[4907]: W0226 15:42:18.635973 4907 manager.go:1169] Failed to process watch event {EventType:0 
Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf614b9022728cf315e60c057852e563e.slice/crio-954e19043d5301fef33b5626f15d90b567dcd15399cd069bc6f504ab64e78ee6 WatchSource:0}: Error finding container 954e19043d5301fef33b5626f15d90b567dcd15399cd069bc6f504ab64e78ee6: Status 404 returned error can't find the container with id 954e19043d5301fef33b5626f15d90b567dcd15399cd069bc6f504ab64e78ee6 Feb 26 15:42:18 crc kubenswrapper[4907]: E0226 15:42:18.683641 4907 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.210:6443: connect: connection refused" interval="800ms" Feb 26 15:42:18 crc kubenswrapper[4907]: I0226 15:42:18.893782 4907 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Feb 26 15:42:18 crc kubenswrapper[4907]: I0226 15:42:18.894985 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 26 15:42:18 crc kubenswrapper[4907]: I0226 15:42:18.895021 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 26 15:42:18 crc kubenswrapper[4907]: I0226 15:42:18.895031 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 26 15:42:18 crc kubenswrapper[4907]: I0226 15:42:18.895058 4907 kubelet_node_status.go:76] "Attempting to register node" node="crc" Feb 26 15:42:18 crc kubenswrapper[4907]: E0226 15:42:18.895446 4907 kubelet_node_status.go:99] "Unable to register node with API server" err="Post \"https://api-int.crc.testing:6443/api/v1/nodes\": dial tcp 38.102.83.210:6443: connect: connection refused" node="crc" Feb 26 15:42:18 crc kubenswrapper[4907]: W0226 15:42:18.926276 4907 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.CSIDriver: Get 
"https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0": dial tcp 38.102.83.210:6443: connect: connection refused Feb 26 15:42:18 crc kubenswrapper[4907]: E0226 15:42:18.926388 4907 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.CSIDriver: failed to list *v1.CSIDriver: Get \"https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0\": dial tcp 38.102.83.210:6443: connect: connection refused" logger="UnhandledError" Feb 26 15:42:18 crc kubenswrapper[4907]: W0226 15:42:18.993941 4907 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.RuntimeClass: Get "https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0": dial tcp 38.102.83.210:6443: connect: connection refused Feb 26 15:42:18 crc kubenswrapper[4907]: E0226 15:42:18.994023 4907 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.RuntimeClass: failed to list *v1.RuntimeClass: Get \"https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0\": dial tcp 38.102.83.210:6443: connect: connection refused" logger="UnhandledError" Feb 26 15:42:19 crc kubenswrapper[4907]: W0226 15:42:19.067247 4907 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Node: Get "https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0": dial tcp 38.102.83.210:6443: connect: connection refused Feb 26 15:42:19 crc kubenswrapper[4907]: E0226 15:42:19.067349 4907 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Node: failed to list *v1.Node: Get \"https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0\": dial tcp 38.102.83.210:6443: connect: connection refused" 
logger="UnhandledError" Feb 26 15:42:19 crc kubenswrapper[4907]: I0226 15:42:19.070116 4907 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": dial tcp 38.102.83.210:6443: connect: connection refused Feb 26 15:42:19 crc kubenswrapper[4907]: I0226 15:42:19.136530 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"724799a1667f181ccbe96f19632efd69370c2266a676adcdac8bac532fb034e7"} Feb 26 15:42:19 crc kubenswrapper[4907]: I0226 15:42:19.138199 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" event={"ID":"3dcd261975c3d6b9a6ad6367fd4facd3","Type":"ContainerStarted","Data":"e2c40279cadd89ca556923fd12706819883ad7db57350ce1120d6c7721320ae4"} Feb 26 15:42:19 crc kubenswrapper[4907]: I0226 15:42:19.139281 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerStarted","Data":"937393c48c13fbed01a4f59245730c3a35dbc8fe5f7b1baae9be9488c1bd93f7"} Feb 26 15:42:19 crc kubenswrapper[4907]: I0226 15:42:19.140395 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerStarted","Data":"954e19043d5301fef33b5626f15d90b567dcd15399cd069bc6f504ab64e78ee6"} Feb 26 15:42:19 crc kubenswrapper[4907]: I0226 15:42:19.141326 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" event={"ID":"d1b160f5dda77d281dd8e69ec8d817f9","Type":"ContainerStarted","Data":"c3585fa42a17b74ddb8832017975c7c4728ac840b8741bb42b02ef8abda865c7"} Feb 26 15:42:19 crc kubenswrapper[4907]: E0226 15:42:19.485005 4907 
controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.210:6443: connect: connection refused" interval="1.6s" Feb 26 15:42:19 crc kubenswrapper[4907]: W0226 15:42:19.508951 4907 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Service: Get "https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0": dial tcp 38.102.83.210:6443: connect: connection refused Feb 26 15:42:19 crc kubenswrapper[4907]: E0226 15:42:19.509090 4907 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Service: failed to list *v1.Service: Get \"https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0\": dial tcp 38.102.83.210:6443: connect: connection refused" logger="UnhandledError" Feb 26 15:42:19 crc kubenswrapper[4907]: I0226 15:42:19.695751 4907 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Feb 26 15:42:19 crc kubenswrapper[4907]: I0226 15:42:19.697167 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 26 15:42:19 crc kubenswrapper[4907]: I0226 15:42:19.697236 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 26 15:42:19 crc kubenswrapper[4907]: I0226 15:42:19.697257 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 26 15:42:19 crc kubenswrapper[4907]: I0226 15:42:19.697294 4907 kubelet_node_status.go:76] "Attempting to register node" node="crc" Feb 26 15:42:19 crc kubenswrapper[4907]: E0226 15:42:19.697894 4907 kubelet_node_status.go:99] "Unable to register node with API server" err="Post 
\"https://api-int.crc.testing:6443/api/v1/nodes\": dial tcp 38.102.83.210:6443: connect: connection refused" node="crc" Feb 26 15:42:20 crc kubenswrapper[4907]: I0226 15:42:20.070336 4907 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": dial tcp 38.102.83.210:6443: connect: connection refused Feb 26 15:42:20 crc kubenswrapper[4907]: I0226 15:42:20.148514 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerStarted","Data":"4e11dad962ef019f41cac623fb986f909a7c58377cd8d52e58ec300f7cc4cbb2"} Feb 26 15:42:20 crc kubenswrapper[4907]: I0226 15:42:20.148577 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerStarted","Data":"ac01de0d4759557a4502a3c742ecae613068311f796904e35769463f9a277620"} Feb 26 15:42:20 crc kubenswrapper[4907]: I0226 15:42:20.148628 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerStarted","Data":"62c4450c857a205706fb8639ca0bf473be68a81f8e70a989080e74e6fb9795c8"} Feb 26 15:42:20 crc kubenswrapper[4907]: I0226 15:42:20.148646 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerStarted","Data":"b4592db3d17945a9ed96383e96902333033b03f395da93754ffbca7d15b1e633"} Feb 26 15:42:20 crc kubenswrapper[4907]: I0226 15:42:20.148694 4907 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Feb 26 15:42:20 crc kubenswrapper[4907]: I0226 15:42:20.150211 4907 
kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 26 15:42:20 crc kubenswrapper[4907]: I0226 15:42:20.150249 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 26 15:42:20 crc kubenswrapper[4907]: I0226 15:42:20.150261 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 26 15:42:20 crc kubenswrapper[4907]: I0226 15:42:20.151502 4907 generic.go:334] "Generic (PLEG): container finished" podID="d1b160f5dda77d281dd8e69ec8d817f9" containerID="1f3f4eb948df3626824724fd4883ad9e04fb96bb8f74f33a8367a1d6f1dc9ae8" exitCode=0 Feb 26 15:42:20 crc kubenswrapper[4907]: I0226 15:42:20.151649 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" event={"ID":"d1b160f5dda77d281dd8e69ec8d817f9","Type":"ContainerDied","Data":"1f3f4eb948df3626824724fd4883ad9e04fb96bb8f74f33a8367a1d6f1dc9ae8"} Feb 26 15:42:20 crc kubenswrapper[4907]: I0226 15:42:20.151658 4907 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Feb 26 15:42:20 crc kubenswrapper[4907]: I0226 15:42:20.153054 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 26 15:42:20 crc kubenswrapper[4907]: I0226 15:42:20.153125 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 26 15:42:20 crc kubenswrapper[4907]: I0226 15:42:20.153141 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 26 15:42:20 crc kubenswrapper[4907]: I0226 15:42:20.154115 4907 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="7ff4ef3cac1d6f77bf9c90ee9a0f1d8fca15084e93afdb4e4e0048cbfe904f19" exitCode=0 Feb 26 15:42:20 crc 
kubenswrapper[4907]: I0226 15:42:20.154176 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerDied","Data":"7ff4ef3cac1d6f77bf9c90ee9a0f1d8fca15084e93afdb4e4e0048cbfe904f19"} Feb 26 15:42:20 crc kubenswrapper[4907]: I0226 15:42:20.154303 4907 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Feb 26 15:42:20 crc kubenswrapper[4907]: I0226 15:42:20.155752 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 26 15:42:20 crc kubenswrapper[4907]: I0226 15:42:20.155773 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 26 15:42:20 crc kubenswrapper[4907]: I0226 15:42:20.155784 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 26 15:42:20 crc kubenswrapper[4907]: I0226 15:42:20.157408 4907 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Feb 26 15:42:20 crc kubenswrapper[4907]: I0226 15:42:20.158314 4907 generic.go:334] "Generic (PLEG): container finished" podID="3dcd261975c3d6b9a6ad6367fd4facd3" containerID="81ae0a80fac56ae4b446c60d3478f3b6e4a448314ac78ad45840c7c09c232f0d" exitCode=0 Feb 26 15:42:20 crc kubenswrapper[4907]: I0226 15:42:20.158372 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" event={"ID":"3dcd261975c3d6b9a6ad6367fd4facd3","Type":"ContainerDied","Data":"81ae0a80fac56ae4b446c60d3478f3b6e4a448314ac78ad45840c7c09c232f0d"} Feb 26 15:42:20 crc kubenswrapper[4907]: I0226 15:42:20.158456 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 26 15:42:20 crc kubenswrapper[4907]: I0226 15:42:20.158498 4907 kubelet_node_status.go:724] "Recording event 
message for node" node="crc" event="NodeHasNoDiskPressure" Feb 26 15:42:20 crc kubenswrapper[4907]: I0226 15:42:20.158518 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 26 15:42:20 crc kubenswrapper[4907]: I0226 15:42:20.158469 4907 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Feb 26 15:42:20 crc kubenswrapper[4907]: I0226 15:42:20.159543 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 26 15:42:20 crc kubenswrapper[4907]: I0226 15:42:20.159569 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 26 15:42:20 crc kubenswrapper[4907]: I0226 15:42:20.159581 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 26 15:42:20 crc kubenswrapper[4907]: I0226 15:42:20.161635 4907 generic.go:334] "Generic (PLEG): container finished" podID="2139d3e2895fc6797b9c76a1b4c9886d" containerID="c3c8fe6e74e5efb27449fa26c2e705a62d8fb1b6f74e1ed787fbd7c37e711699" exitCode=0 Feb 26 15:42:20 crc kubenswrapper[4907]: I0226 15:42:20.161669 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerDied","Data":"c3c8fe6e74e5efb27449fa26c2e705a62d8fb1b6f74e1ed787fbd7c37e711699"} Feb 26 15:42:20 crc kubenswrapper[4907]: I0226 15:42:20.161798 4907 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Feb 26 15:42:20 crc kubenswrapper[4907]: I0226 15:42:20.162922 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 26 15:42:20 crc kubenswrapper[4907]: I0226 15:42:20.162965 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 26 15:42:20 crc 
kubenswrapper[4907]: I0226 15:42:20.163008 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 26 15:42:20 crc kubenswrapper[4907]: I0226 15:42:20.163261 4907 certificate_manager.go:356] kubernetes.io/kube-apiserver-client-kubelet: Rotating certificates Feb 26 15:42:20 crc kubenswrapper[4907]: E0226 15:42:20.164113 4907 certificate_manager.go:562] "Unhandled Error" err="kubernetes.io/kube-apiserver-client-kubelet: Failed while requesting a signed certificate from the control plane: cannot create certificate signing request: Post \"https://api-int.crc.testing:6443/apis/certificates.k8s.io/v1/certificatesigningrequests\": dial tcp 38.102.83.210:6443: connect: connection refused" logger="UnhandledError" Feb 26 15:42:20 crc kubenswrapper[4907]: I0226 15:42:20.347345 4907 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Feb 26 15:42:21 crc kubenswrapper[4907]: I0226 15:42:21.070362 4907 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": dial tcp 38.102.83.210:6443: connect: connection refused Feb 26 15:42:21 crc kubenswrapper[4907]: E0226 15:42:21.086282 4907 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.210:6443: connect: connection refused" interval="3.2s" Feb 26 15:42:21 crc kubenswrapper[4907]: I0226 15:42:21.172946 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"8cf7bf0e49be4282c641d1e48be50a327bb418475701bfde61f4249724709e11"} Feb 26 15:42:21 crc kubenswrapper[4907]: I0226 15:42:21.173087 4907 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"64e8ac34f3cae799ba04d2bba51c22e4d99cf03261778fe3ba7a2320e661e727"}
Feb 26 15:42:21 crc kubenswrapper[4907]: I0226 15:42:21.173102 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"42e24dea757f775f836c5c1fdb77c920db85f523bc0a35d2f2fb22e766274556"}
Feb 26 15:42:21 crc kubenswrapper[4907]: I0226 15:42:21.173111 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"bbc5e8c015ccc6b1a4740c955375e4f995f69ff1f1f698d8e2660ef451da6b8c"}
Feb 26 15:42:21 crc kubenswrapper[4907]: I0226 15:42:21.177767 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" event={"ID":"3dcd261975c3d6b9a6ad6367fd4facd3","Type":"ContainerStarted","Data":"e44c81cef61f4aecc15b45d6bbb7f3552588a1f0256042998c5a2f158c3879c8"}
Feb 26 15:42:21 crc kubenswrapper[4907]: I0226 15:42:21.177794 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" event={"ID":"3dcd261975c3d6b9a6ad6367fd4facd3","Type":"ContainerStarted","Data":"ba401e1eedaa38b967c1b76dc8ee8221684e36e0f152a24131706adc0346bb2d"}
Feb 26 15:42:21 crc kubenswrapper[4907]: I0226 15:42:21.177804 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" event={"ID":"3dcd261975c3d6b9a6ad6367fd4facd3","Type":"ContainerStarted","Data":"aa0a0c55e7d739a2c76f82d2886d67e4aa4334b873445cb317782b057f7afa65"}
Feb 26 15:42:21 crc kubenswrapper[4907]: I0226 15:42:21.177883 4907 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Feb 26 15:42:21 crc kubenswrapper[4907]: I0226 15:42:21.179562 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 26 15:42:21 crc kubenswrapper[4907]: I0226 15:42:21.179601 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 26 15:42:21 crc kubenswrapper[4907]: I0226 15:42:21.179611 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 26 15:42:21 crc kubenswrapper[4907]: I0226 15:42:21.182128 4907 generic.go:334] "Generic (PLEG): container finished" podID="2139d3e2895fc6797b9c76a1b4c9886d" containerID="a0933fb54ef30c16899ff47ed6fa9c7836452ad420e970de1c0b7408c0bb3886" exitCode=0
Feb 26 15:42:21 crc kubenswrapper[4907]: I0226 15:42:21.182169 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerDied","Data":"a0933fb54ef30c16899ff47ed6fa9c7836452ad420e970de1c0b7408c0bb3886"}
Feb 26 15:42:21 crc kubenswrapper[4907]: I0226 15:42:21.182241 4907 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Feb 26 15:42:21 crc kubenswrapper[4907]: I0226 15:42:21.182789 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 26 15:42:21 crc kubenswrapper[4907]: I0226 15:42:21.182803 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 26 15:42:21 crc kubenswrapper[4907]: I0226 15:42:21.182811 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 26 15:42:21 crc kubenswrapper[4907]: I0226 15:42:21.184658 4907 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Feb 26 15:42:21 crc kubenswrapper[4907]: I0226 15:42:21.185017 4907 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Feb 26 15:42:21 crc kubenswrapper[4907]: I0226 15:42:21.185424 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" event={"ID":"d1b160f5dda77d281dd8e69ec8d817f9","Type":"ContainerStarted","Data":"2413429d3f7edf75cdb8cd2cb7fe17b4f9c5017c7a2926764186e1d65e44228d"}
Feb 26 15:42:21 crc kubenswrapper[4907]: I0226 15:42:21.186154 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 26 15:42:21 crc kubenswrapper[4907]: I0226 15:42:21.186192 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 26 15:42:21 crc kubenswrapper[4907]: I0226 15:42:21.186203 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 26 15:42:21 crc kubenswrapper[4907]: I0226 15:42:21.187047 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 26 15:42:21 crc kubenswrapper[4907]: I0226 15:42:21.187071 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 26 15:42:21 crc kubenswrapper[4907]: I0226 15:42:21.187080 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 26 15:42:21 crc kubenswrapper[4907]: W0226 15:42:21.213132 4907 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.RuntimeClass: Get "https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0": dial tcp 38.102.83.210:6443: connect: connection refused
Feb 26 15:42:21 crc kubenswrapper[4907]: E0226 15:42:21.213206 4907 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.RuntimeClass: failed to list *v1.RuntimeClass: Get \"https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0\": dial tcp 38.102.83.210:6443: connect: connection refused" logger="UnhandledError"
Feb 26 15:42:21 crc kubenswrapper[4907]: I0226 15:42:21.298215 4907 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Feb 26 15:42:21 crc kubenswrapper[4907]: I0226 15:42:21.299152 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 26 15:42:21 crc kubenswrapper[4907]: I0226 15:42:21.299181 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 26 15:42:21 crc kubenswrapper[4907]: I0226 15:42:21.299190 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 26 15:42:21 crc kubenswrapper[4907]: I0226 15:42:21.299211 4907 kubelet_node_status.go:76] "Attempting to register node" node="crc"
Feb 26 15:42:21 crc kubenswrapper[4907]: E0226 15:42:21.299529 4907 kubelet_node_status.go:99] "Unable to register node with API server" err="Post \"https://api-int.crc.testing:6443/api/v1/nodes\": dial tcp 38.102.83.210:6443: connect: connection refused" node="crc"
Feb 26 15:42:21 crc kubenswrapper[4907]: W0226 15:42:21.339107 4907 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Service: Get "https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0": dial tcp 38.102.83.210:6443: connect: connection refused
Feb 26 15:42:21 crc kubenswrapper[4907]: E0226 15:42:21.339168 4907 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Service: failed to list *v1.Service: Get \"https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0\": dial tcp 38.102.83.210:6443: connect: connection refused" logger="UnhandledError"
Feb 26 15:42:21 crc kubenswrapper[4907]: W0226 15:42:21.407681 4907 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Node: Get "https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0": dial tcp 38.102.83.210:6443: connect: connection refused
Feb 26 15:42:21 crc kubenswrapper[4907]: E0226 15:42:21.407759 4907 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Node: failed to list *v1.Node: Get \"https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0\": dial tcp 38.102.83.210:6443: connect: connection refused" logger="UnhandledError"
Feb 26 15:42:21 crc kubenswrapper[4907]: I0226 15:42:21.795772 4907 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-controller-manager/kube-controller-manager-crc"
Feb 26 15:42:22 crc kubenswrapper[4907]: I0226 15:42:22.190032 4907 generic.go:334] "Generic (PLEG): container finished" podID="2139d3e2895fc6797b9c76a1b4c9886d" containerID="ca0bc422f98a960703843a6e090851bd3b091b08d31aeb875ef10cbb6e9c830a" exitCode=0
Feb 26 15:42:22 crc kubenswrapper[4907]: I0226 15:42:22.190130 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerDied","Data":"ca0bc422f98a960703843a6e090851bd3b091b08d31aeb875ef10cbb6e9c830a"}
Feb 26 15:42:22 crc kubenswrapper[4907]: I0226 15:42:22.190202 4907 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Feb 26 15:42:22 crc kubenswrapper[4907]: I0226 15:42:22.191198 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 26 15:42:22 crc kubenswrapper[4907]: I0226 15:42:22.191244 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 26 15:42:22 crc kubenswrapper[4907]: I0226 15:42:22.191254 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 26 15:42:22 crc kubenswrapper[4907]: I0226 15:42:22.193454 4907 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/0.log"
Feb 26 15:42:22 crc kubenswrapper[4907]: I0226 15:42:22.196727 4907 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="b9ec6de1274f730e08101d063d062410c2904829d95f506620437ccff5aa53bb" exitCode=255
Feb 26 15:42:22 crc kubenswrapper[4907]: I0226 15:42:22.196878 4907 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Feb 26 15:42:22 crc kubenswrapper[4907]: I0226 15:42:22.196906 4907 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness"
Feb 26 15:42:22 crc kubenswrapper[4907]: I0226 15:42:22.196924 4907 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Feb 26 15:42:22 crc kubenswrapper[4907]: I0226 15:42:22.196943 4907 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Feb 26 15:42:22 crc kubenswrapper[4907]: I0226 15:42:22.196921 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerDied","Data":"b9ec6de1274f730e08101d063d062410c2904829d95f506620437ccff5aa53bb"}
Feb 26 15:42:22 crc kubenswrapper[4907]: I0226 15:42:22.197369 4907 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Feb 26 15:42:22 crc kubenswrapper[4907]: I0226 15:42:22.197964 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 26 15:42:22 crc kubenswrapper[4907]: I0226 15:42:22.197993 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 26 15:42:22 crc kubenswrapper[4907]: I0226 15:42:22.198005 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 26 15:42:22 crc kubenswrapper[4907]: I0226 15:42:22.198019 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 26 15:42:22 crc kubenswrapper[4907]: I0226 15:42:22.198037 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 26 15:42:22 crc kubenswrapper[4907]: I0226 15:42:22.198047 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 26 15:42:22 crc kubenswrapper[4907]: I0226 15:42:22.198107 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 26 15:42:22 crc kubenswrapper[4907]: I0226 15:42:22.198139 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 26 15:42:22 crc kubenswrapper[4907]: I0226 15:42:22.198154 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 26 15:42:22 crc kubenswrapper[4907]: I0226 15:42:22.198233 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 26 15:42:22 crc kubenswrapper[4907]: I0226 15:42:22.198265 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 26 15:42:22 crc kubenswrapper[4907]: I0226 15:42:22.198278 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 26 15:42:22 crc kubenswrapper[4907]: I0226 15:42:22.198924 4907 scope.go:117] "RemoveContainer" containerID="b9ec6de1274f730e08101d063d062410c2904829d95f506620437ccff5aa53bb"
Feb 26 15:42:22 crc kubenswrapper[4907]: I0226 15:42:22.912269 4907 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-kube-apiserver/kube-apiserver-crc"
Feb 26 15:42:23 crc kubenswrapper[4907]: I0226 15:42:23.201689 4907 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/0.log"
Feb 26 15:42:23 crc kubenswrapper[4907]: I0226 15:42:23.203449 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"ce7948f4131a4af7900463c6c6d198695a36a2be974fcc500a01a06e1ffcba41"}
Feb 26 15:42:23 crc kubenswrapper[4907]: I0226 15:42:23.203515 4907 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Feb 26 15:42:23 crc kubenswrapper[4907]: I0226 15:42:23.204536 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 26 15:42:23 crc kubenswrapper[4907]: I0226 15:42:23.204584 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 26 15:42:23 crc kubenswrapper[4907]: I0226 15:42:23.204613 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 26 15:42:23 crc kubenswrapper[4907]: I0226 15:42:23.209758 4907 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Feb 26 15:42:23 crc kubenswrapper[4907]: I0226 15:42:23.210265 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerStarted","Data":"b642a813d8a9d885593d5dd495ed461119f14e1c1937844b64196bb55dd67e24"}
Feb 26 15:42:23 crc kubenswrapper[4907]: I0226 15:42:23.210337 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerStarted","Data":"7a65767b486307851169c93586cffb785a0977b0ca654dc7bc6fd38ce349d5f0"}
Feb 26 15:42:23 crc kubenswrapper[4907]: I0226 15:42:23.210352 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerStarted","Data":"2d6cb50daf3d05a3e4b4427361206adaeb990478e437b697db9a2716fbc0a3e0"}
Feb 26 15:42:23 crc kubenswrapper[4907]: I0226 15:42:23.210368 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerStarted","Data":"d111022be1d13de640f2fe6f3683455c1defed82f3c06fb63c8b84d2feea1182"}
Feb 26 15:42:23 crc kubenswrapper[4907]: I0226 15:42:23.210859 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 26 15:42:23 crc kubenswrapper[4907]: I0226 15:42:23.210895 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 26 15:42:23 crc kubenswrapper[4907]: I0226 15:42:23.210908 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 26 15:42:23 crc kubenswrapper[4907]: I0226 15:42:23.360687 4907 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-apiserver/kube-apiserver-crc"
Feb 26 15:42:24 crc kubenswrapper[4907]: I0226 15:42:24.217343 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerStarted","Data":"91e03a798a371431d5f0e490e8ffe260ea101ae6a41f56f9ee2d37c2ed255f1d"}
Feb 26 15:42:24 crc kubenswrapper[4907]: I0226 15:42:24.217362 4907 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness"
Feb 26 15:42:24 crc kubenswrapper[4907]: I0226 15:42:24.217452 4907 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Feb 26 15:42:24 crc kubenswrapper[4907]: I0226 15:42:24.217469 4907 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Feb 26 15:42:24 crc kubenswrapper[4907]: I0226 15:42:24.218701 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 26 15:42:24 crc kubenswrapper[4907]: I0226 15:42:24.218740 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 26 15:42:24 crc kubenswrapper[4907]: I0226 15:42:24.218758 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 26 15:42:24 crc kubenswrapper[4907]: I0226 15:42:24.219537 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 26 15:42:24 crc kubenswrapper[4907]: I0226 15:42:24.219573 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 26 15:42:24 crc kubenswrapper[4907]: I0226 15:42:24.219623 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 26 15:42:24 crc kubenswrapper[4907]: I0226 15:42:24.250181 4907 certificate_manager.go:356] kubernetes.io/kube-apiserver-client-kubelet: Rotating certificates
Feb 26 15:42:24 crc kubenswrapper[4907]: I0226 15:42:24.500520 4907 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Feb 26 15:42:24 crc kubenswrapper[4907]: I0226 15:42:24.501923 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 26 15:42:24 crc kubenswrapper[4907]: I0226 15:42:24.501983 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 26 15:42:24 crc kubenswrapper[4907]: I0226 15:42:24.502000 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 26 15:42:24 crc kubenswrapper[4907]: I0226 15:42:24.502036 4907 kubelet_node_status.go:76] "Attempting to register node" node="crc"
Feb 26 15:42:24 crc kubenswrapper[4907]: I0226 15:42:24.636767 4907 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc"
Feb 26 15:42:24 crc kubenswrapper[4907]: I0226 15:42:24.637072 4907 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Feb 26 15:42:24 crc kubenswrapper[4907]: I0226 15:42:24.638802 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 26 15:42:24 crc kubenswrapper[4907]: I0226 15:42:24.638856 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 26 15:42:24 crc kubenswrapper[4907]: I0226 15:42:24.638876 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 26 15:42:25 crc kubenswrapper[4907]: I0226 15:42:25.219797 4907 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness"
Feb 26 15:42:25 crc kubenswrapper[4907]: I0226 15:42:25.219938 4907 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Feb 26 15:42:25 crc kubenswrapper[4907]: I0226 15:42:25.219803 4907 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Feb 26 15:42:25 crc kubenswrapper[4907]: I0226 15:42:25.221385 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 26 15:42:25 crc kubenswrapper[4907]: I0226 15:42:25.221440 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 26 15:42:25 crc kubenswrapper[4907]: I0226 15:42:25.221453 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 26 15:42:25 crc kubenswrapper[4907]: I0226 15:42:25.221388 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 26 15:42:25 crc kubenswrapper[4907]: I0226 15:42:25.221499 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 26 15:42:25 crc kubenswrapper[4907]: I0226 15:42:25.221517 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 26 15:42:26 crc kubenswrapper[4907]: I0226 15:42:26.478022 4907 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-kube-controller-manager/kube-controller-manager-crc"
Feb 26 15:42:26 crc kubenswrapper[4907]: I0226 15:42:26.478240 4907 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Feb 26 15:42:26 crc kubenswrapper[4907]: I0226 15:42:26.479911 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 26 15:42:26 crc kubenswrapper[4907]: I0226 15:42:26.479963 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 26 15:42:26 crc kubenswrapper[4907]: I0226 15:42:26.479975 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 26 15:42:26 crc kubenswrapper[4907]: I0226 15:42:26.482871 4907 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-kube-controller-manager/kube-controller-manager-crc"
Feb 26 15:42:27 crc kubenswrapper[4907]: I0226 15:42:27.155187 4907 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-apiserver/kube-apiserver-crc"
Feb 26 15:42:27 crc kubenswrapper[4907]: I0226 15:42:27.155478 4907 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Feb 26 15:42:27 crc kubenswrapper[4907]: I0226 15:42:27.160525 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 26 15:42:27 crc kubenswrapper[4907]: I0226 15:42:27.160641 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 26 15:42:27 crc kubenswrapper[4907]: I0226 15:42:27.160671 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 26 15:42:27 crc kubenswrapper[4907]: I0226 15:42:27.194544 4907 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-etcd/etcd-crc"
Feb 26 15:42:27 crc kubenswrapper[4907]: I0226 15:42:27.194809 4907 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Feb 26 15:42:27 crc kubenswrapper[4907]: I0226 15:42:27.196111 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 26 15:42:27 crc kubenswrapper[4907]: I0226 15:42:27.196204 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 26 15:42:27 crc kubenswrapper[4907]: I0226 15:42:27.196229 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 26 15:42:27 crc kubenswrapper[4907]: I0226 15:42:27.225520 4907 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Feb 26 15:42:27 crc kubenswrapper[4907]: I0226 15:42:27.226976 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 26 15:42:27 crc kubenswrapper[4907]: I0226 15:42:27.227044 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 26 15:42:27 crc kubenswrapper[4907]: I0226 15:42:27.227066 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 26 15:42:27 crc kubenswrapper[4907]: I0226 15:42:27.988144 4907 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-kube-controller-manager/kube-controller-manager-crc"
Feb 26 15:42:28 crc kubenswrapper[4907]: E0226 15:42:28.199155 4907 eviction_manager.go:285] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"crc\" not found"
Feb 26 15:42:28 crc kubenswrapper[4907]: I0226 15:42:28.227115 4907 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Feb 26 15:42:28 crc kubenswrapper[4907]: I0226 15:42:28.227920 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 26 15:42:28 crc kubenswrapper[4907]: I0226 15:42:28.227956 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 26 15:42:28 crc kubenswrapper[4907]: I0226 15:42:28.227967 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 26 15:42:30 crc kubenswrapper[4907]: I0226 15:42:30.988413 4907 patch_prober.go:28] interesting pod/kube-controller-manager-crc container/cluster-policy-controller namespace/openshift-kube-controller-manager: Startup probe status=failure output="Get \"https://192.168.126.11:10357/healthz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" start-of-body=
Feb 26 15:42:30 crc kubenswrapper[4907]: I0226 15:42:30.988532 4907 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-kube-controller-manager/kube-controller-manager-crc" podUID="f614b9022728cf315e60c057852e563e" containerName="cluster-policy-controller" probeResult="failure" output="Get \"https://192.168.126.11:10357/healthz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)"
Feb 26 15:42:31 crc kubenswrapper[4907]: I0226 15:42:31.598399 4907 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-etcd/etcd-crc"
Feb 26 15:42:31 crc kubenswrapper[4907]: I0226 15:42:31.598640 4907 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Feb 26 15:42:31 crc kubenswrapper[4907]: I0226 15:42:31.599889 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 26 15:42:31 crc kubenswrapper[4907]: I0226 15:42:31.599924 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 26 15:42:31 crc kubenswrapper[4907]: I0226 15:42:31.599939 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 26 15:42:31 crc kubenswrapper[4907]: W0226 15:42:31.730610 4907 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.CSIDriver: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0": net/http: TLS handshake timeout
Feb 26 15:42:31 crc kubenswrapper[4907]: I0226 15:42:31.730716 4907 trace.go:236] Trace[1076908886]: "Reflector ListAndWatch" name:k8s.io/client-go/informers/factory.go:160 (26-Feb-2026 15:42:21.728) (total time: 10001ms):
Feb 26 15:42:31 crc kubenswrapper[4907]: Trace[1076908886]: ---"Objects listed" error:Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0": net/http: TLS handshake timeout 10001ms (15:42:31.730)
Feb 26 15:42:31 crc kubenswrapper[4907]: Trace[1076908886]: [10.001782891s] [10.001782891s] END
Feb 26 15:42:31 crc kubenswrapper[4907]: E0226 15:42:31.730743 4907 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.CSIDriver: failed to list *v1.CSIDriver: Get \"https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0\": net/http: TLS handshake timeout" logger="UnhandledError"
Feb 26 15:42:31 crc kubenswrapper[4907]: I0226 15:42:31.800324 4907 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-controller-manager/kube-controller-manager-crc"
Feb 26 15:42:31 crc kubenswrapper[4907]: I0226 15:42:31.800452 4907 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Feb 26 15:42:31 crc kubenswrapper[4907]: I0226 15:42:31.801524 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 26 15:42:31 crc kubenswrapper[4907]: I0226 15:42:31.801567 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 26 15:42:31 crc kubenswrapper[4907]: I0226 15:42:31.801580 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 26 15:42:32 crc kubenswrapper[4907]: I0226 15:42:32.071136 4907 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": net/http: TLS handshake timeout
Feb 26 15:42:32 crc kubenswrapper[4907]: E0226 15:42:32.252948 4907 certificate_manager.go:562] "Unhandled Error" err="kubernetes.io/kube-apiserver-client-kubelet: Failed while requesting a signed certificate from the control plane: cannot create certificate signing request: Post \"https://api-int.crc.testing:6443/apis/certificates.k8s.io/v1/certificatesigningrequests\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T15:42:32Z is after 2026-02-23T05:33:13Z" logger="UnhandledError"
Feb 26 15:42:32 crc kubenswrapper[4907]: E0226 15:42:32.256471 4907 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T15:42:32Z is after 2026-02-23T05:33:13Z" interval="6.4s"
Feb 26 15:42:32 crc kubenswrapper[4907]: E0226 15:42:32.259004 4907 event.go:368] "Unable to write event (may retry after sleeping)" err="Post \"https://api-int.crc.testing:6443/api/v1/namespaces/default/events\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T15:42:32Z is after 2026-02-23T05:33:13Z" event="&Event{ObjectMeta:{crc.1897d63d82b3566a default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:Starting,Message:Starting kubelet.,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-02-26 15:42:18.067719786 +0000 UTC m=+0.586281645,LastTimestamp:2026-02-26 15:42:18.067719786 +0000 UTC m=+0.586281645,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}"
Feb 26 15:42:32 crc kubenswrapper[4907]: E0226 15:42:32.260202 4907 kubelet_node_status.go:99] "Unable to register node with API server" err="Post \"https://api-int.crc.testing:6443/api/v1/nodes\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T15:42:32Z is after 2026-02-23T05:33:13Z" node="crc"
Feb 26 15:42:32 crc kubenswrapper[4907]: W0226 15:42:32.264230 4907 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.RuntimeClass: Get "https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T15:42:32Z is after 2026-02-23T05:33:13Z
Feb 26 15:42:32 crc kubenswrapper[4907]: E0226 15:42:32.264567 4907 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.RuntimeClass: failed to list *v1.RuntimeClass: Get \"https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T15:42:32Z is after 2026-02-23T05:33:13Z" logger="UnhandledError"
Feb 26 15:42:32 crc kubenswrapper[4907]: W0226 15:42:32.267191 4907 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Service: Get "https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T15:42:32Z is after 2026-02-23T05:33:13Z
Feb 26 15:42:32 crc kubenswrapper[4907]: E0226 15:42:32.267313 4907 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Service: failed to list *v1.Service: Get \"https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T15:42:32Z is after 2026-02-23T05:33:13Z" logger="UnhandledError"
Feb 26 15:42:32 crc kubenswrapper[4907]: I0226 15:42:32.268276 4907 patch_prober.go:28] interesting pod/kube-apiserver-crc container/kube-apiserver namespace/openshift-kube-apiserver: Startup probe status=failure output="HTTP probe failed with statuscode: 403" start-of-body={"kind":"Status","apiVersion":"v1","metadata":{},"status":"Failure","message":"forbidden: User \"system:anonymous\" cannot get path \"/livez\"","reason":"Forbidden","details":{},"code":403}
Feb 26 15:42:32 crc kubenswrapper[4907]: I0226 15:42:32.268342 4907 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver" probeResult="failure" output="HTTP probe failed with statuscode: 403"
Feb 26 15:42:32 crc kubenswrapper[4907]: W0226 15:42:32.273169 4907 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Node: Get "https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T15:42:32Z is after 2026-02-23T05:33:13Z
Feb 26 15:42:32 crc kubenswrapper[4907]: E0226 15:42:32.273224 4907 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Node: failed to list *v1.Node: Get \"https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T15:42:32Z is after 2026-02-23T05:33:13Z" logger="UnhandledError"
Feb 26 15:42:32 crc kubenswrapper[4907]: I0226 15:42:32.275936 4907 patch_prober.go:28] interesting pod/kube-apiserver-crc container/kube-apiserver namespace/openshift-kube-apiserver: Startup probe status=failure output="HTTP probe failed with statuscode: 403" start-of-body={"kind":"Status","apiVersion":"v1","metadata":{},"status":"Failure","message":"forbidden: User \"system:anonymous\" cannot get path \"/livez\"","reason":"Forbidden","details":{},"code":403}
Feb 26 15:42:32 crc kubenswrapper[4907]: I0226 15:42:32.275988 4907 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver" probeResult="failure" output="HTTP probe failed with statuscode: 403"
Feb 26 15:42:32 crc kubenswrapper[4907]: I0226 15:42:32.920367 4907 patch_prober.go:28] interesting pod/kube-apiserver-crc container/kube-apiserver namespace/openshift-kube-apiserver: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[+]ping ok
Feb 26 15:42:32 crc kubenswrapper[4907]: [+]log ok
Feb 26 15:42:32 crc kubenswrapper[4907]: [+]etcd ok
Feb 26 15:42:32 crc kubenswrapper[4907]: [+]poststarthook/openshift.io-openshift-apiserver-reachable ok
Feb 26 15:42:32 crc kubenswrapper[4907]: [+]poststarthook/openshift.io-oauth-apiserver-reachable ok
Feb 26 15:42:32 crc kubenswrapper[4907]: [+]poststarthook/start-apiserver-admission-initializer ok
Feb 26 15:42:32 crc kubenswrapper[4907]: [+]poststarthook/quota.openshift.io-clusterquotamapping ok
Feb 26 15:42:32 crc kubenswrapper[4907]: [+]poststarthook/openshift.io-api-request-count-filter ok
Feb 26 15:42:32 crc kubenswrapper[4907]: [+]poststarthook/openshift.io-startkubeinformers ok
Feb 26 15:42:32 crc kubenswrapper[4907]: [+]poststarthook/generic-apiserver-start-informers ok
Feb 26 15:42:32 crc kubenswrapper[4907]: [+]poststarthook/priority-and-fairness-config-consumer ok
Feb 26 15:42:32 crc kubenswrapper[4907]: [+]poststarthook/priority-and-fairness-filter ok
Feb 26 15:42:32 crc kubenswrapper[4907]: [+]poststarthook/storage-object-count-tracker-hook ok
Feb 26 15:42:32 crc kubenswrapper[4907]: [+]poststarthook/start-apiextensions-informers ok
Feb 26 15:42:32 crc kubenswrapper[4907]: [+]poststarthook/start-apiextensions-controllers ok
Feb 26 15:42:32 crc kubenswrapper[4907]: [+]poststarthook/crd-informer-synced ok
Feb 26 15:42:32 crc kubenswrapper[4907]: [+]poststarthook/start-system-namespaces-controller ok
Feb 26 15:42:32 crc kubenswrapper[4907]: [+]poststarthook/start-cluster-authentication-info-controller ok
Feb 26 15:42:32 crc kubenswrapper[4907]: [+]poststarthook/start-kube-apiserver-identity-lease-controller ok
Feb 26 15:42:32 crc kubenswrapper[4907]: [+]poststarthook/start-kube-apiserver-identity-lease-garbage-collector ok
Feb 26 15:42:32 crc kubenswrapper[4907]: [+]poststarthook/start-legacy-token-tracking-controller ok
Feb 26 15:42:32 crc kubenswrapper[4907]: [+]poststarthook/start-service-ip-repair-controllers ok
Feb 26 15:42:32 crc kubenswrapper[4907]: [-]poststarthook/rbac/bootstrap-roles failed: reason withheld
Feb 26 15:42:32 crc kubenswrapper[4907]: [-]poststarthook/scheduling/bootstrap-system-priority-classes failed: reason withheld
Feb 26 15:42:32 crc kubenswrapper[4907]: [+]poststarthook/priority-and-fairness-config-producer ok
Feb 26 15:42:32 crc kubenswrapper[4907]: [+]poststarthook/bootstrap-controller ok
Feb 26 15:42:32 crc kubenswrapper[4907]: [+]poststarthook/aggregator-reload-proxy-client-cert ok
Feb 26 15:42:32 crc kubenswrapper[4907]: [+]poststarthook/start-kube-aggregator-informers ok
Feb 26 15:42:32 crc kubenswrapper[4907]: [+]poststarthook/apiservice-status-local-available-controller ok
Feb 26 15:42:32 crc kubenswrapper[4907]: [+]poststarthook/apiservice-status-remote-available-controller ok
Feb 26 15:42:32 crc kubenswrapper[4907]: [+]poststarthook/apiservice-registration-controller ok
Feb 26 15:42:32 crc kubenswrapper[4907]: [+]poststarthook/apiservice-wait-for-first-sync ok
Feb 26 15:42:32 crc kubenswrapper[4907]: [+]poststarthook/apiservice-discovery-controller ok
Feb 26 15:42:32 crc kubenswrapper[4907]: [+]poststarthook/kube-apiserver-autoregistration ok
Feb 26 15:42:32 crc kubenswrapper[4907]: [+]autoregister-completion ok
Feb 26 15:42:32 crc kubenswrapper[4907]: [+]poststarthook/apiservice-openapi-controller ok
Feb 26 15:42:32 crc kubenswrapper[4907]: [+]poststarthook/apiservice-openapiv3-controller ok
Feb 26 15:42:32 crc kubenswrapper[4907]: livez check failed
Feb 26 15:42:32 crc kubenswrapper[4907]: I0226 15:42:32.920421 4907 prober.go:107] "Probe failed" probeType="Startup"
pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver" probeResult="failure" output="HTTP probe failed with statuscode: 500" Feb 26 15:42:33 crc kubenswrapper[4907]: I0226 15:42:33.072989 4907 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T15:42:33Z is after 2026-02-23T05:33:13Z Feb 26 15:42:33 crc kubenswrapper[4907]: I0226 15:42:33.256905 4907 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/1.log" Feb 26 15:42:33 crc kubenswrapper[4907]: I0226 15:42:33.257674 4907 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/0.log" Feb 26 15:42:33 crc kubenswrapper[4907]: I0226 15:42:33.259216 4907 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="ce7948f4131a4af7900463c6c6d198695a36a2be974fcc500a01a06e1ffcba41" exitCode=255 Feb 26 15:42:33 crc kubenswrapper[4907]: I0226 15:42:33.259255 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerDied","Data":"ce7948f4131a4af7900463c6c6d198695a36a2be974fcc500a01a06e1ffcba41"} Feb 26 15:42:33 crc kubenswrapper[4907]: I0226 15:42:33.259333 4907 scope.go:117] "RemoveContainer" containerID="b9ec6de1274f730e08101d063d062410c2904829d95f506620437ccff5aa53bb" Feb 26 15:42:33 crc kubenswrapper[4907]: I0226 15:42:33.259491 4907 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Feb 26 15:42:33 
crc kubenswrapper[4907]: I0226 15:42:33.260849 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 26 15:42:33 crc kubenswrapper[4907]: I0226 15:42:33.260915 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 26 15:42:33 crc kubenswrapper[4907]: I0226 15:42:33.260942 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 26 15:42:33 crc kubenswrapper[4907]: I0226 15:42:33.263870 4907 scope.go:117] "RemoveContainer" containerID="ce7948f4131a4af7900463c6c6d198695a36a2be974fcc500a01a06e1ffcba41" Feb 26 15:42:33 crc kubenswrapper[4907]: E0226 15:42:33.264486 4907 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-apiserver-check-endpoints\" with CrashLoopBackOff: \"back-off 10s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\"" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" Feb 26 15:42:34 crc kubenswrapper[4907]: I0226 15:42:34.075025 4907 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T15:42:34Z is after 2026-02-23T05:33:13Z Feb 26 15:42:34 crc kubenswrapper[4907]: I0226 15:42:34.264426 4907 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/1.log" Feb 26 15:42:35 crc kubenswrapper[4907]: I0226 15:42:35.073967 4907 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get 
"https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T15:42:35Z is after 2026-02-23T05:33:13Z Feb 26 15:42:36 crc kubenswrapper[4907]: I0226 15:42:36.074458 4907 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T15:42:36Z is after 2026-02-23T05:33:13Z Feb 26 15:42:37 crc kubenswrapper[4907]: I0226 15:42:37.075124 4907 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T15:42:37Z is after 2026-02-23T05:33:13Z Feb 26 15:42:37 crc kubenswrapper[4907]: W0226 15:42:37.850638 4907 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.CSIDriver: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T15:42:37Z is after 2026-02-23T05:33:13Z Feb 26 15:42:37 crc kubenswrapper[4907]: E0226 15:42:37.850717 4907 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.CSIDriver: failed to list *v1.CSIDriver: Get \"https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T15:42:37Z is after 2026-02-23T05:33:13Z" logger="UnhandledError" Feb 26 15:42:37 crc kubenswrapper[4907]: I0226 15:42:37.920689 
4907 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-kube-apiserver/kube-apiserver-crc" Feb 26 15:42:37 crc kubenswrapper[4907]: I0226 15:42:37.920886 4907 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Feb 26 15:42:37 crc kubenswrapper[4907]: I0226 15:42:37.922507 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 26 15:42:37 crc kubenswrapper[4907]: I0226 15:42:37.922611 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 26 15:42:37 crc kubenswrapper[4907]: I0226 15:42:37.922631 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 26 15:42:37 crc kubenswrapper[4907]: I0226 15:42:37.923443 4907 scope.go:117] "RemoveContainer" containerID="ce7948f4131a4af7900463c6c6d198695a36a2be974fcc500a01a06e1ffcba41" Feb 26 15:42:37 crc kubenswrapper[4907]: E0226 15:42:37.923789 4907 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-apiserver-check-endpoints\" with CrashLoopBackOff: \"back-off 10s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\"" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" Feb 26 15:42:37 crc kubenswrapper[4907]: I0226 15:42:37.928485 4907 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-apiserver/kube-apiserver-crc" Feb 26 15:42:38 crc kubenswrapper[4907]: I0226 15:42:38.075937 4907 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 
2026-02-26T15:42:38Z is after 2026-02-23T05:33:13Z Feb 26 15:42:38 crc kubenswrapper[4907]: E0226 15:42:38.199248 4907 eviction_manager.go:285] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"crc\" not found" Feb 26 15:42:38 crc kubenswrapper[4907]: I0226 15:42:38.278887 4907 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Feb 26 15:42:38 crc kubenswrapper[4907]: I0226 15:42:38.281203 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 26 15:42:38 crc kubenswrapper[4907]: I0226 15:42:38.281242 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 26 15:42:38 crc kubenswrapper[4907]: I0226 15:42:38.281254 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 26 15:42:38 crc kubenswrapper[4907]: I0226 15:42:38.281881 4907 scope.go:117] "RemoveContainer" containerID="ce7948f4131a4af7900463c6c6d198695a36a2be974fcc500a01a06e1ffcba41" Feb 26 15:42:38 crc kubenswrapper[4907]: E0226 15:42:38.282088 4907 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-apiserver-check-endpoints\" with CrashLoopBackOff: \"back-off 10s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\"" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" Feb 26 15:42:38 crc kubenswrapper[4907]: I0226 15:42:38.660877 4907 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Feb 26 15:42:38 crc kubenswrapper[4907]: E0226 15:42:38.660968 4907 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": tls: 
failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T15:42:38Z is after 2026-02-23T05:33:13Z" interval="7s" Feb 26 15:42:38 crc kubenswrapper[4907]: I0226 15:42:38.662423 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 26 15:42:38 crc kubenswrapper[4907]: I0226 15:42:38.662622 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 26 15:42:38 crc kubenswrapper[4907]: I0226 15:42:38.662768 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 26 15:42:38 crc kubenswrapper[4907]: I0226 15:42:38.662926 4907 kubelet_node_status.go:76] "Attempting to register node" node="crc" Feb 26 15:42:38 crc kubenswrapper[4907]: E0226 15:42:38.666705 4907 kubelet_node_status.go:99] "Unable to register node with API server" err="Post \"https://api-int.crc.testing:6443/api/v1/nodes\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T15:42:38Z is after 2026-02-23T05:33:13Z" node="crc" Feb 26 15:42:39 crc kubenswrapper[4907]: I0226 15:42:39.071670 4907 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T15:42:39Z is after 2026-02-23T05:33:13Z Feb 26 15:42:40 crc kubenswrapper[4907]: I0226 15:42:40.072908 4907 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T15:42:40Z is after 2026-02-23T05:33:13Z Feb 26 15:42:40 crc 
kubenswrapper[4907]: I0226 15:42:40.471540 4907 certificate_manager.go:356] kubernetes.io/kube-apiserver-client-kubelet: Rotating certificates Feb 26 15:42:40 crc kubenswrapper[4907]: E0226 15:42:40.475757 4907 certificate_manager.go:562] "Unhandled Error" err="kubernetes.io/kube-apiserver-client-kubelet: Failed while requesting a signed certificate from the control plane: cannot create certificate signing request: Post \"https://api-int.crc.testing:6443/apis/certificates.k8s.io/v1/certificatesigningrequests\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T15:42:40Z is after 2026-02-23T05:33:13Z" logger="UnhandledError" Feb 26 15:42:40 crc kubenswrapper[4907]: I0226 15:42:40.647628 4907 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-kube-apiserver/kube-apiserver-crc" Feb 26 15:42:40 crc kubenswrapper[4907]: I0226 15:42:40.647926 4907 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Feb 26 15:42:40 crc kubenswrapper[4907]: I0226 15:42:40.649399 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 26 15:42:40 crc kubenswrapper[4907]: I0226 15:42:40.649458 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 26 15:42:40 crc kubenswrapper[4907]: I0226 15:42:40.649480 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 26 15:42:40 crc kubenswrapper[4907]: I0226 15:42:40.650395 4907 scope.go:117] "RemoveContainer" containerID="ce7948f4131a4af7900463c6c6d198695a36a2be974fcc500a01a06e1ffcba41" Feb 26 15:42:40 crc kubenswrapper[4907]: E0226 15:42:40.650777 4907 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-apiserver-check-endpoints\" with CrashLoopBackOff: \"back-off 10s restarting failed 
container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\"" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" Feb 26 15:42:40 crc kubenswrapper[4907]: I0226 15:42:40.988349 4907 patch_prober.go:28] interesting pod/kube-controller-manager-crc container/cluster-policy-controller namespace/openshift-kube-controller-manager: Startup probe status=failure output="Get \"https://192.168.126.11:10357/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body= Feb 26 15:42:40 crc kubenswrapper[4907]: I0226 15:42:40.988434 4907 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-kube-controller-manager/kube-controller-manager-crc" podUID="f614b9022728cf315e60c057852e563e" containerName="cluster-policy-controller" probeResult="failure" output="Get \"https://192.168.126.11:10357/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" Feb 26 15:42:41 crc kubenswrapper[4907]: I0226 15:42:41.074283 4907 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T15:42:41Z is after 2026-02-23T05:33:13Z Feb 26 15:42:41 crc kubenswrapper[4907]: I0226 15:42:41.642114 4907 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-etcd/etcd-crc" Feb 26 15:42:41 crc kubenswrapper[4907]: I0226 15:42:41.642392 4907 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Feb 26 15:42:41 crc kubenswrapper[4907]: I0226 15:42:41.643834 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 26 
15:42:41 crc kubenswrapper[4907]: I0226 15:42:41.643885 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 26 15:42:41 crc kubenswrapper[4907]: I0226 15:42:41.643902 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 26 15:42:41 crc kubenswrapper[4907]: I0226 15:42:41.667003 4907 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-etcd/etcd-crc" Feb 26 15:42:42 crc kubenswrapper[4907]: I0226 15:42:42.074497 4907 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T15:42:42Z is after 2026-02-23T05:33:13Z Feb 26 15:42:42 crc kubenswrapper[4907]: E0226 15:42:42.264626 4907 event.go:368] "Unable to write event (may retry after sleeping)" err="Post \"https://api-int.crc.testing:6443/api/v1/namespaces/default/events\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T15:42:42Z is after 2026-02-23T05:33:13Z" event="&Event{ObjectMeta:{crc.1897d63d82b3566a default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:Starting,Message:Starting kubelet.,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-02-26 15:42:18.067719786 +0000 UTC m=+0.586281645,LastTimestamp:2026-02-26 15:42:18.067719786 +0000 UTC m=+0.586281645,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Feb 26 15:42:42 crc kubenswrapper[4907]: I0226 15:42:42.289043 4907 kubelet_node_status.go:401] "Setting node annotation to enable volume 
controller attach/detach" Feb 26 15:42:42 crc kubenswrapper[4907]: I0226 15:42:42.290169 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 26 15:42:42 crc kubenswrapper[4907]: I0226 15:42:42.290218 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 26 15:42:42 crc kubenswrapper[4907]: I0226 15:42:42.290232 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 26 15:42:43 crc kubenswrapper[4907]: I0226 15:42:43.074972 4907 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T15:42:43Z is after 2026-02-23T05:33:13Z Feb 26 15:42:43 crc kubenswrapper[4907]: W0226 15:42:43.144507 4907 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Service: Get "https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T15:42:43Z is after 2026-02-23T05:33:13Z Feb 26 15:42:43 crc kubenswrapper[4907]: E0226 15:42:43.144648 4907 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Service: failed to list *v1.Service: Get \"https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T15:42:43Z is after 2026-02-23T05:33:13Z" logger="UnhandledError" Feb 26 15:42:44 crc kubenswrapper[4907]: I0226 15:42:44.072463 4907 csi_plugin.go:884] Failed to contact API server when 
waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T15:42:44Z is after 2026-02-23T05:33:13Z Feb 26 15:42:44 crc kubenswrapper[4907]: W0226 15:42:44.147431 4907 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Node: Get "https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T15:42:44Z is after 2026-02-23T05:33:13Z Feb 26 15:42:44 crc kubenswrapper[4907]: E0226 15:42:44.147523 4907 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Node: failed to list *v1.Node: Get \"https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T15:42:44Z is after 2026-02-23T05:33:13Z" logger="UnhandledError" Feb 26 15:42:44 crc kubenswrapper[4907]: W0226 15:42:44.731573 4907 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.RuntimeClass: Get "https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T15:42:44Z is after 2026-02-23T05:33:13Z Feb 26 15:42:44 crc kubenswrapper[4907]: E0226 15:42:44.731745 4907 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.RuntimeClass: failed to list *v1.RuntimeClass: Get \"https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0\": tls: failed to verify certificate: x509: certificate has expired or is 
not yet valid: current time 2026-02-26T15:42:44Z is after 2026-02-23T05:33:13Z" logger="UnhandledError" Feb 26 15:42:45 crc kubenswrapper[4907]: I0226 15:42:45.073050 4907 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T15:42:45Z is after 2026-02-23T05:33:13Z Feb 26 15:42:45 crc kubenswrapper[4907]: E0226 15:42:45.665984 4907 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T15:42:45Z is after 2026-02-23T05:33:13Z" interval="7s" Feb 26 15:42:45 crc kubenswrapper[4907]: I0226 15:42:45.667172 4907 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Feb 26 15:42:45 crc kubenswrapper[4907]: I0226 15:42:45.668808 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 26 15:42:45 crc kubenswrapper[4907]: I0226 15:42:45.668861 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 26 15:42:45 crc kubenswrapper[4907]: I0226 15:42:45.668878 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 26 15:42:45 crc kubenswrapper[4907]: I0226 15:42:45.668913 4907 kubelet_node_status.go:76] "Attempting to register node" node="crc" Feb 26 15:42:45 crc kubenswrapper[4907]: E0226 15:42:45.677226 4907 kubelet_node_status.go:99] "Unable to register node with API server" err="Post \"https://api-int.crc.testing:6443/api/v1/nodes\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-02-26T15:42:45Z is after 2026-02-23T05:33:13Z" node="crc" Feb 26 15:42:46 crc kubenswrapper[4907]: I0226 15:42:46.074855 4907 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T15:42:46Z is after 2026-02-23T05:33:13Z Feb 26 15:42:47 crc kubenswrapper[4907]: I0226 15:42:47.075799 4907 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T15:42:47Z is after 2026-02-23T05:33:13Z Feb 26 15:42:48 crc kubenswrapper[4907]: I0226 15:42:48.073136 4907 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T15:42:48Z is after 2026-02-23T05:33:13Z Feb 26 15:42:48 crc kubenswrapper[4907]: E0226 15:42:48.199692 4907 eviction_manager.go:285] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"crc\" not found" Feb 26 15:42:49 crc kubenswrapper[4907]: I0226 15:42:49.073038 4907 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T15:42:49Z is after 2026-02-23T05:33:13Z Feb 26 15:42:50 crc kubenswrapper[4907]: W0226 15:42:50.024547 4907 reflector.go:561] 
k8s.io/client-go/informers/factory.go:160: failed to list *v1.CSIDriver: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T15:42:50Z is after 2026-02-23T05:33:13Z Feb 26 15:42:50 crc kubenswrapper[4907]: E0226 15:42:50.024744 4907 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.CSIDriver: failed to list *v1.CSIDriver: Get \"https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T15:42:50Z is after 2026-02-23T05:33:13Z" logger="UnhandledError" Feb 26 15:42:50 crc kubenswrapper[4907]: I0226 15:42:50.075023 4907 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T15:42:50Z is after 2026-02-23T05:33:13Z Feb 26 15:42:50 crc kubenswrapper[4907]: I0226 15:42:50.790086 4907 patch_prober.go:28] interesting pod/kube-controller-manager-crc container/cluster-policy-controller namespace/openshift-kube-controller-manager: Startup probe status=failure output="Get \"https://192.168.126.11:10357/healthz\": read tcp 192.168.126.11:44102->192.168.126.11:10357: read: connection reset by peer" start-of-body= Feb 26 15:42:50 crc kubenswrapper[4907]: I0226 15:42:50.791145 4907 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-kube-controller-manager/kube-controller-manager-crc" podUID="f614b9022728cf315e60c057852e563e" containerName="cluster-policy-controller" probeResult="failure" output="Get \"https://192.168.126.11:10357/healthz\": read tcp 
192.168.126.11:44102->192.168.126.11:10357: read: connection reset by peer" Feb 26 15:42:50 crc kubenswrapper[4907]: I0226 15:42:50.791242 4907 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Feb 26 15:42:50 crc kubenswrapper[4907]: I0226 15:42:50.791438 4907 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Feb 26 15:42:50 crc kubenswrapper[4907]: I0226 15:42:50.792786 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 26 15:42:50 crc kubenswrapper[4907]: I0226 15:42:50.792837 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 26 15:42:50 crc kubenswrapper[4907]: I0226 15:42:50.792853 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 26 15:42:50 crc kubenswrapper[4907]: I0226 15:42:50.793490 4907 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="cluster-policy-controller" containerStatusID={"Type":"cri-o","ID":"62c4450c857a205706fb8639ca0bf473be68a81f8e70a989080e74e6fb9795c8"} pod="openshift-kube-controller-manager/kube-controller-manager-crc" containerMessage="Container cluster-policy-controller failed startup probe, will be restarted" Feb 26 15:42:50 crc kubenswrapper[4907]: I0226 15:42:50.793667 4907 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-controller-manager/kube-controller-manager-crc" podUID="f614b9022728cf315e60c057852e563e" containerName="cluster-policy-controller" containerID="cri-o://62c4450c857a205706fb8639ca0bf473be68a81f8e70a989080e74e6fb9795c8" gracePeriod=30 Feb 26 15:42:51 crc kubenswrapper[4907]: I0226 15:42:51.072923 4907 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get 
"https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T15:42:51Z is after 2026-02-23T05:33:13Z Feb 26 15:42:51 crc kubenswrapper[4907]: I0226 15:42:51.320997 4907 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-controller-manager_kube-controller-manager-crc_f614b9022728cf315e60c057852e563e/cluster-policy-controller/0.log" Feb 26 15:42:51 crc kubenswrapper[4907]: I0226 15:42:51.321523 4907 generic.go:334] "Generic (PLEG): container finished" podID="f614b9022728cf315e60c057852e563e" containerID="62c4450c857a205706fb8639ca0bf473be68a81f8e70a989080e74e6fb9795c8" exitCode=255 Feb 26 15:42:51 crc kubenswrapper[4907]: I0226 15:42:51.321639 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerDied","Data":"62c4450c857a205706fb8639ca0bf473be68a81f8e70a989080e74e6fb9795c8"} Feb 26 15:42:52 crc kubenswrapper[4907]: I0226 15:42:52.075575 4907 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T15:42:52Z is after 2026-02-23T05:33:13Z Feb 26 15:42:52 crc kubenswrapper[4907]: E0226 15:42:52.271110 4907 event.go:368] "Unable to write event (may retry after sleeping)" err="Post \"https://api-int.crc.testing:6443/api/v1/namespaces/default/events\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T15:42:52Z is after 2026-02-23T05:33:13Z" event="&Event{ObjectMeta:{crc.1897d63d82b3566a default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] 
[]},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:Starting,Message:Starting kubelet.,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-02-26 15:42:18.067719786 +0000 UTC m=+0.586281645,LastTimestamp:2026-02-26 15:42:18.067719786 +0000 UTC m=+0.586281645,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Feb 26 15:42:52 crc kubenswrapper[4907]: I0226 15:42:52.326849 4907 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-controller-manager_kube-controller-manager-crc_f614b9022728cf315e60c057852e563e/cluster-policy-controller/0.log" Feb 26 15:42:52 crc kubenswrapper[4907]: I0226 15:42:52.327391 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerStarted","Data":"5033366771e6954e4bdd280702ad5d080a1306e8fbfa2e99a0221a3865c13ed6"} Feb 26 15:42:52 crc kubenswrapper[4907]: I0226 15:42:52.327474 4907 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Feb 26 15:42:52 crc kubenswrapper[4907]: I0226 15:42:52.329046 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 26 15:42:52 crc kubenswrapper[4907]: I0226 15:42:52.329121 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 26 15:42:52 crc kubenswrapper[4907]: I0226 15:42:52.329145 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 26 15:42:52 crc kubenswrapper[4907]: E0226 15:42:52.671403 4907 controller.go:145] "Failed to ensure lease exists, will retry" err="Get 
\"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T15:42:52Z is after 2026-02-23T05:33:13Z" interval="7s" Feb 26 15:42:52 crc kubenswrapper[4907]: I0226 15:42:52.677555 4907 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Feb 26 15:42:52 crc kubenswrapper[4907]: I0226 15:42:52.679625 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 26 15:42:52 crc kubenswrapper[4907]: I0226 15:42:52.679689 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 26 15:42:52 crc kubenswrapper[4907]: I0226 15:42:52.679705 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 26 15:42:52 crc kubenswrapper[4907]: I0226 15:42:52.679751 4907 kubelet_node_status.go:76] "Attempting to register node" node="crc" Feb 26 15:42:52 crc kubenswrapper[4907]: E0226 15:42:52.683919 4907 kubelet_node_status.go:99] "Unable to register node with API server" err="Post \"https://api-int.crc.testing:6443/api/v1/nodes\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T15:42:52Z is after 2026-02-23T05:33:13Z" node="crc" Feb 26 15:42:53 crc kubenswrapper[4907]: I0226 15:42:53.074874 4907 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T15:42:53Z is after 2026-02-23T05:33:13Z Feb 26 15:42:53 crc kubenswrapper[4907]: I0226 15:42:53.331091 4907 kubelet_node_status.go:401] "Setting node annotation to enable volume 
controller attach/detach" Feb 26 15:42:53 crc kubenswrapper[4907]: I0226 15:42:53.332582 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 26 15:42:53 crc kubenswrapper[4907]: I0226 15:42:53.332633 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 26 15:42:53 crc kubenswrapper[4907]: I0226 15:42:53.332642 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 26 15:42:54 crc kubenswrapper[4907]: I0226 15:42:54.072951 4907 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T15:42:54Z is after 2026-02-23T05:33:13Z Feb 26 15:42:54 crc kubenswrapper[4907]: I0226 15:42:54.126368 4907 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Feb 26 15:42:54 crc kubenswrapper[4907]: I0226 15:42:54.127764 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 26 15:42:54 crc kubenswrapper[4907]: I0226 15:42:54.127889 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 26 15:42:54 crc kubenswrapper[4907]: I0226 15:42:54.127919 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 26 15:42:54 crc kubenswrapper[4907]: I0226 15:42:54.128760 4907 scope.go:117] "RemoveContainer" containerID="ce7948f4131a4af7900463c6c6d198695a36a2be974fcc500a01a06e1ffcba41" Feb 26 15:42:55 crc kubenswrapper[4907]: I0226 15:42:55.075156 4907 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get 
"https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T15:42:55Z is after 2026-02-23T05:33:13Z Feb 26 15:42:55 crc kubenswrapper[4907]: I0226 15:42:55.338031 4907 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/2.log" Feb 26 15:42:55 crc kubenswrapper[4907]: I0226 15:42:55.338823 4907 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/1.log" Feb 26 15:42:55 crc kubenswrapper[4907]: I0226 15:42:55.341057 4907 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="8b89ef0ed86ca2976c210fdcdc9f594c5ba8738929e96fe41a2a26be4979caae" exitCode=255 Feb 26 15:42:55 crc kubenswrapper[4907]: I0226 15:42:55.341116 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerDied","Data":"8b89ef0ed86ca2976c210fdcdc9f594c5ba8738929e96fe41a2a26be4979caae"} Feb 26 15:42:55 crc kubenswrapper[4907]: I0226 15:42:55.341205 4907 scope.go:117] "RemoveContainer" containerID="ce7948f4131a4af7900463c6c6d198695a36a2be974fcc500a01a06e1ffcba41" Feb 26 15:42:55 crc kubenswrapper[4907]: I0226 15:42:55.341352 4907 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Feb 26 15:42:55 crc kubenswrapper[4907]: I0226 15:42:55.343193 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 26 15:42:55 crc kubenswrapper[4907]: I0226 15:42:55.343262 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 26 15:42:55 crc 
kubenswrapper[4907]: I0226 15:42:55.343285 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 26 15:42:55 crc kubenswrapper[4907]: I0226 15:42:55.344243 4907 scope.go:117] "RemoveContainer" containerID="8b89ef0ed86ca2976c210fdcdc9f594c5ba8738929e96fe41a2a26be4979caae" Feb 26 15:42:55 crc kubenswrapper[4907]: E0226 15:42:55.344550 4907 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-apiserver-check-endpoints\" with CrashLoopBackOff: \"back-off 20s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\"" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" Feb 26 15:42:56 crc kubenswrapper[4907]: I0226 15:42:56.075316 4907 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T15:42:56Z is after 2026-02-23T05:33:13Z Feb 26 15:42:56 crc kubenswrapper[4907]: I0226 15:42:56.345411 4907 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/2.log" Feb 26 15:42:56 crc kubenswrapper[4907]: I0226 15:42:56.921552 4907 certificate_manager.go:356] kubernetes.io/kube-apiserver-client-kubelet: Rotating certificates Feb 26 15:42:56 crc kubenswrapper[4907]: E0226 15:42:56.925384 4907 certificate_manager.go:562] "Unhandled Error" err="kubernetes.io/kube-apiserver-client-kubelet: Failed while requesting a signed certificate from the control plane: cannot create certificate signing request: Post \"https://api-int.crc.testing:6443/apis/certificates.k8s.io/v1/certificatesigningrequests\": tls: failed to verify 
certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T15:42:56Z is after 2026-02-23T05:33:13Z" logger="UnhandledError" Feb 26 15:42:56 crc kubenswrapper[4907]: E0226 15:42:56.926668 4907 certificate_manager.go:440] "Unhandled Error" err="kubernetes.io/kube-apiserver-client-kubelet: Reached backoff limit, still unable to rotate certs: timed out waiting for the condition" logger="UnhandledError" Feb 26 15:42:57 crc kubenswrapper[4907]: I0226 15:42:57.074986 4907 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T15:42:57Z is after 2026-02-23T05:33:13Z Feb 26 15:42:57 crc kubenswrapper[4907]: I0226 15:42:57.155630 4907 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-apiserver/kube-apiserver-crc" Feb 26 15:42:57 crc kubenswrapper[4907]: I0226 15:42:57.155889 4907 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Feb 26 15:42:57 crc kubenswrapper[4907]: I0226 15:42:57.157369 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 26 15:42:57 crc kubenswrapper[4907]: I0226 15:42:57.157441 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 26 15:42:57 crc kubenswrapper[4907]: I0226 15:42:57.157459 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 26 15:42:57 crc kubenswrapper[4907]: I0226 15:42:57.158471 4907 scope.go:117] "RemoveContainer" containerID="8b89ef0ed86ca2976c210fdcdc9f594c5ba8738929e96fe41a2a26be4979caae" Feb 26 15:42:57 crc kubenswrapper[4907]: E0226 15:42:57.158851 4907 pod_workers.go:1301] "Error syncing pod, skipping" 
err="failed to \"StartContainer\" for \"kube-apiserver-check-endpoints\" with CrashLoopBackOff: \"back-off 20s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\"" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" Feb 26 15:42:57 crc kubenswrapper[4907]: I0226 15:42:57.989087 4907 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Feb 26 15:42:57 crc kubenswrapper[4907]: I0226 15:42:57.989299 4907 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Feb 26 15:42:57 crc kubenswrapper[4907]: I0226 15:42:57.990995 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 26 15:42:57 crc kubenswrapper[4907]: I0226 15:42:57.991050 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 26 15:42:57 crc kubenswrapper[4907]: I0226 15:42:57.991075 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 26 15:42:58 crc kubenswrapper[4907]: I0226 15:42:58.078704 4907 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T15:42:58Z is after 2026-02-23T05:33:13Z Feb 26 15:42:58 crc kubenswrapper[4907]: E0226 15:42:58.199877 4907 eviction_manager.go:285] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"crc\" not found" Feb 26 15:42:59 crc kubenswrapper[4907]: I0226 15:42:59.074297 4907 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get 
"https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T15:42:59Z is after 2026-02-23T05:33:13Z Feb 26 15:42:59 crc kubenswrapper[4907]: E0226 15:42:59.677672 4907 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T15:42:59Z is after 2026-02-23T05:33:13Z" interval="7s" Feb 26 15:42:59 crc kubenswrapper[4907]: I0226 15:42:59.684965 4907 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Feb 26 15:42:59 crc kubenswrapper[4907]: I0226 15:42:59.686524 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 26 15:42:59 crc kubenswrapper[4907]: I0226 15:42:59.686610 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 26 15:42:59 crc kubenswrapper[4907]: I0226 15:42:59.686633 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 26 15:42:59 crc kubenswrapper[4907]: I0226 15:42:59.686663 4907 kubelet_node_status.go:76] "Attempting to register node" node="crc" Feb 26 15:42:59 crc kubenswrapper[4907]: E0226 15:42:59.689934 4907 kubelet_node_status.go:99] "Unable to register node with API server" err="Post \"https://api-int.crc.testing:6443/api/v1/nodes\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T15:42:59Z is after 2026-02-23T05:33:13Z" node="crc" Feb 26 15:43:00 crc kubenswrapper[4907]: I0226 15:43:00.072975 4907 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get 
"https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T15:43:00Z is after 2026-02-23T05:33:13Z Feb 26 15:43:00 crc kubenswrapper[4907]: I0226 15:43:00.347743 4907 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Feb 26 15:43:00 crc kubenswrapper[4907]: I0226 15:43:00.348022 4907 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Feb 26 15:43:00 crc kubenswrapper[4907]: I0226 15:43:00.349346 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 26 15:43:00 crc kubenswrapper[4907]: I0226 15:43:00.349391 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 26 15:43:00 crc kubenswrapper[4907]: I0226 15:43:00.349407 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 26 15:43:00 crc kubenswrapper[4907]: I0226 15:43:00.647331 4907 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-kube-apiserver/kube-apiserver-crc" Feb 26 15:43:00 crc kubenswrapper[4907]: I0226 15:43:00.647622 4907 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Feb 26 15:43:00 crc kubenswrapper[4907]: I0226 15:43:00.649305 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 26 15:43:00 crc kubenswrapper[4907]: I0226 15:43:00.649366 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 26 15:43:00 crc kubenswrapper[4907]: I0226 15:43:00.649382 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 26 
15:43:00 crc kubenswrapper[4907]: I0226 15:43:00.650242 4907 scope.go:117] "RemoveContainer" containerID="8b89ef0ed86ca2976c210fdcdc9f594c5ba8738929e96fe41a2a26be4979caae" Feb 26 15:43:00 crc kubenswrapper[4907]: E0226 15:43:00.650456 4907 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-apiserver-check-endpoints\" with CrashLoopBackOff: \"back-off 20s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\"" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" Feb 26 15:43:00 crc kubenswrapper[4907]: I0226 15:43:00.989705 4907 patch_prober.go:28] interesting pod/kube-controller-manager-crc container/cluster-policy-controller namespace/openshift-kube-controller-manager: Startup probe status=failure output="Get \"https://192.168.126.11:10357/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body= Feb 26 15:43:00 crc kubenswrapper[4907]: I0226 15:43:00.989797 4907 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-kube-controller-manager/kube-controller-manager-crc" podUID="f614b9022728cf315e60c057852e563e" containerName="cluster-policy-controller" probeResult="failure" output="Get \"https://192.168.126.11:10357/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" Feb 26 15:43:01 crc kubenswrapper[4907]: I0226 15:43:01.074482 4907 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T15:43:01Z is after 2026-02-23T05:33:13Z Feb 26 15:43:02 crc kubenswrapper[4907]: I0226 15:43:02.074822 4907 csi_plugin.go:884] Failed 
to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T15:43:02Z is after 2026-02-23T05:33:13Z Feb 26 15:43:02 crc kubenswrapper[4907]: E0226 15:43:02.278159 4907 event.go:368] "Unable to write event (may retry after sleeping)" err="Post \"https://api-int.crc.testing:6443/api/v1/namespaces/default/events\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T15:43:02Z is after 2026-02-23T05:33:13Z" event="&Event{ObjectMeta:{crc.1897d63d82b3566a default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:Starting,Message:Starting kubelet.,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-02-26 15:42:18.067719786 +0000 UTC m=+0.586281645,LastTimestamp:2026-02-26 15:42:18.067719786 +0000 UTC m=+0.586281645,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Feb 26 15:43:02 crc kubenswrapper[4907]: W0226 15:43:02.517337 4907 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Node: Get "https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T15:43:02Z is after 2026-02-23T05:33:13Z Feb 26 15:43:02 crc kubenswrapper[4907]: E0226 15:43:02.517443 4907 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Node: failed to list *v1.Node: Get \"https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0\": 
tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T15:43:02Z is after 2026-02-23T05:33:13Z" logger="UnhandledError" Feb 26 15:43:03 crc kubenswrapper[4907]: I0226 15:43:03.073049 4907 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T15:43:03Z is after 2026-02-23T05:33:13Z Feb 26 15:43:03 crc kubenswrapper[4907]: W0226 15:43:03.539073 4907 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.CSIDriver: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T15:43:03Z is after 2026-02-23T05:33:13Z Feb 26 15:43:03 crc kubenswrapper[4907]: E0226 15:43:03.539185 4907 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.CSIDriver: failed to list *v1.CSIDriver: Get \"https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T15:43:03Z is after 2026-02-23T05:33:13Z" logger="UnhandledError" Feb 26 15:43:03 crc kubenswrapper[4907]: W0226 15:43:03.569877 4907 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Service: Get "https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T15:43:03Z is after 2026-02-23T05:33:13Z Feb 26 15:43:03 crc kubenswrapper[4907]: E0226 15:43:03.569979 4907 reflector.go:158] 
"Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Service: failed to list *v1.Service: Get \"https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T15:43:03Z is after 2026-02-23T05:33:13Z" logger="UnhandledError" Feb 26 15:43:04 crc kubenswrapper[4907]: I0226 15:43:04.072845 4907 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T15:43:04Z is after 2026-02-23T05:33:13Z Feb 26 15:43:05 crc kubenswrapper[4907]: I0226 15:43:05.074957 4907 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Feb 26 15:43:06 crc kubenswrapper[4907]: I0226 15:43:06.075154 4907 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Feb 26 15:43:06 crc kubenswrapper[4907]: W0226 15:43:06.315474 4907 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.RuntimeClass: runtimeclasses.node.k8s.io is forbidden: User "system:anonymous" cannot list resource "runtimeclasses" in API group "node.k8s.io" at the cluster scope Feb 26 15:43:06 crc kubenswrapper[4907]: E0226 15:43:06.315557 4907 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.RuntimeClass: failed to list *v1.RuntimeClass: runtimeclasses.node.k8s.io is forbidden: 
User \"system:anonymous\" cannot list resource \"runtimeclasses\" in API group \"node.k8s.io\" at the cluster scope" logger="UnhandledError" Feb 26 15:43:06 crc kubenswrapper[4907]: E0226 15:43:06.683762 4907 controller.go:145] "Failed to ensure lease exists, will retry" err="leases.coordination.k8s.io \"crc\" is forbidden: User \"system:anonymous\" cannot get resource \"leases\" in API group \"coordination.k8s.io\" in the namespace \"kube-node-lease\"" interval="7s" Feb 26 15:43:06 crc kubenswrapper[4907]: I0226 15:43:06.690902 4907 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Feb 26 15:43:06 crc kubenswrapper[4907]: I0226 15:43:06.692199 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 26 15:43:06 crc kubenswrapper[4907]: I0226 15:43:06.692325 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 26 15:43:06 crc kubenswrapper[4907]: I0226 15:43:06.692409 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 26 15:43:06 crc kubenswrapper[4907]: I0226 15:43:06.692507 4907 kubelet_node_status.go:76] "Attempting to register node" node="crc" Feb 26 15:43:06 crc kubenswrapper[4907]: E0226 15:43:06.699329 4907 kubelet_node_status.go:99] "Unable to register node with API server" err="nodes is forbidden: User \"system:anonymous\" cannot create resource \"nodes\" in API group \"\" at the cluster scope" node="crc" Feb 26 15:43:07 crc kubenswrapper[4907]: I0226 15:43:07.073879 4907 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Feb 26 15:43:08 crc kubenswrapper[4907]: I0226 15:43:08.074163 4907 csi_plugin.go:884] Failed to contact API server when waiting for 
CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Feb 26 15:43:08 crc kubenswrapper[4907]: E0226 15:43:08.200038 4907 eviction_manager.go:285] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"crc\" not found" Feb 26 15:43:09 crc kubenswrapper[4907]: I0226 15:43:09.074154 4907 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Feb 26 15:43:10 crc kubenswrapper[4907]: I0226 15:43:10.077555 4907 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Feb 26 15:43:10 crc kubenswrapper[4907]: I0226 15:43:10.989960 4907 patch_prober.go:28] interesting pod/kube-controller-manager-crc container/cluster-policy-controller namespace/openshift-kube-controller-manager: Startup probe status=failure output="Get \"https://192.168.126.11:10357/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body= Feb 26 15:43:10 crc kubenswrapper[4907]: I0226 15:43:10.990118 4907 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-kube-controller-manager/kube-controller-manager-crc" podUID="f614b9022728cf315e60c057852e563e" containerName="cluster-policy-controller" probeResult="failure" output="Get \"https://192.168.126.11:10357/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" Feb 26 15:43:11 crc kubenswrapper[4907]: I0226 15:43:11.075151 4907 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: 
csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Feb 26 15:43:12 crc kubenswrapper[4907]: I0226 15:43:12.073924 4907 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Feb 26 15:43:12 crc kubenswrapper[4907]: E0226 15:43:12.283737 4907 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.1897d63d82b3566a default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:Starting,Message:Starting kubelet.,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-02-26 15:42:18.067719786 +0000 UTC m=+0.586281645,LastTimestamp:2026-02-26 15:42:18.067719786 +0000 UTC m=+0.586281645,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Feb 26 15:43:12 crc kubenswrapper[4907]: E0226 15:43:12.287381 4907 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.1897d63d85e39d11 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasSufficientMemory,Message:Node crc status is now: NodeHasSufficientMemory,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-02-26 15:42:18.121215249 +0000 UTC 
m=+0.639777098,LastTimestamp:2026-02-26 15:42:18.121215249 +0000 UTC m=+0.639777098,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Feb 26 15:43:12 crc kubenswrapper[4907]: E0226 15:43:12.291509 4907 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.1897d63d85e3e204 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasNoDiskPressure,Message:Node crc status is now: NodeHasNoDiskPressure,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-02-26 15:42:18.1212329 +0000 UTC m=+0.639794749,LastTimestamp:2026-02-26 15:42:18.1212329 +0000 UTC m=+0.639794749,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Feb 26 15:43:12 crc kubenswrapper[4907]: E0226 15:43:12.296442 4907 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.1897d63d85e401bc default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasSufficientPID,Message:Node crc status is now: NodeHasSufficientPID,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-02-26 15:42:18.12124102 +0000 UTC m=+0.639802869,LastTimestamp:2026-02-26 15:42:18.12124102 +0000 UTC m=+0.639802869,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 
UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Feb 26 15:43:12 crc kubenswrapper[4907]: E0226 15:43:12.300707 4907 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.1897d63d8ada5930 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeAllocatableEnforced,Message:Updated Node Allocatable limit across pods,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-02-26 15:42:18.204494128 +0000 UTC m=+0.723055987,LastTimestamp:2026-02-26 15:42:18.204494128 +0000 UTC m=+0.723055987,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Feb 26 15:43:12 crc kubenswrapper[4907]: E0226 15:43:12.306119 4907 event.go:359] "Server rejected event (will not retry!)" err="events \"crc.1897d63d85e39d11\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.1897d63d85e39d11 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasSufficientMemory,Message:Node crc status is now: NodeHasSufficientMemory,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-02-26 15:42:18.121215249 +0000 UTC m=+0.639777098,LastTimestamp:2026-02-26 15:42:18.232960401 +0000 UTC m=+0.751522250,Count:2,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Feb 26 15:43:12 crc kubenswrapper[4907]: E0226 15:43:12.309759 4907 event.go:359] "Server rejected 
event (will not retry!)" err="events \"crc.1897d63d85e3e204\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.1897d63d85e3e204 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasNoDiskPressure,Message:Node crc status is now: NodeHasNoDiskPressure,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-02-26 15:42:18.1212329 +0000 UTC m=+0.639794749,LastTimestamp:2026-02-26 15:42:18.232985502 +0000 UTC m=+0.751547341,Count:2,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Feb 26 15:43:12 crc kubenswrapper[4907]: E0226 15:43:12.313684 4907 event.go:359] "Server rejected event (will not retry!)" err="events \"crc.1897d63d85e401bc\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.1897d63d85e401bc default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasSufficientPID,Message:Node crc status is now: NodeHasSufficientPID,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-02-26 15:42:18.12124102 +0000 UTC m=+0.639802869,LastTimestamp:2026-02-26 15:42:18.232994662 +0000 UTC m=+0.751556511,Count:2,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Feb 26 15:43:12 crc kubenswrapper[4907]: E0226 15:43:12.318346 4907 event.go:359] "Server rejected event (will not retry!)" err="events \"crc.1897d63d85e39d11\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace 
\"default\"" event="&Event{ObjectMeta:{crc.1897d63d85e39d11 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasSufficientMemory,Message:Node crc status is now: NodeHasSufficientMemory,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-02-26 15:42:18.121215249 +0000 UTC m=+0.639777098,LastTimestamp:2026-02-26 15:42:18.234259973 +0000 UTC m=+0.752821832,Count:3,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Feb 26 15:43:12 crc kubenswrapper[4907]: E0226 15:43:12.323686 4907 event.go:359] "Server rejected event (will not retry!)" err="events \"crc.1897d63d85e3e204\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.1897d63d85e3e204 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasNoDiskPressure,Message:Node crc status is now: NodeHasNoDiskPressure,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-02-26 15:42:18.1212329 +0000 UTC m=+0.639794749,LastTimestamp:2026-02-26 15:42:18.234279993 +0000 UTC m=+0.752841852,Count:3,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Feb 26 15:43:12 crc kubenswrapper[4907]: E0226 15:43:12.327431 4907 event.go:359] "Server rejected event (will not retry!)" err="events \"crc.1897d63d85e401bc\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.1897d63d85e401bc default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] 
[]},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasSufficientPID,Message:Node crc status is now: NodeHasSufficientPID,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-02-26 15:42:18.12124102 +0000 UTC m=+0.639802869,LastTimestamp:2026-02-26 15:42:18.234292833 +0000 UTC m=+0.752854682,Count:3,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Feb 26 15:43:12 crc kubenswrapper[4907]: E0226 15:43:12.331034 4907 event.go:359] "Server rejected event (will not retry!)" err="events \"crc.1897d63d85e39d11\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.1897d63d85e39d11 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasSufficientMemory,Message:Node crc status is now: NodeHasSufficientMemory,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-02-26 15:42:18.121215249 +0000 UTC m=+0.639777098,LastTimestamp:2026-02-26 15:42:18.234350845 +0000 UTC m=+0.752912704,Count:4,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Feb 26 15:43:12 crc kubenswrapper[4907]: E0226 15:43:12.335659 4907 event.go:359] "Server rejected event (will not retry!)" err="events \"crc.1897d63d85e3e204\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.1897d63d85e3e204 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasNoDiskPressure,Message:Node crc 
status is now: NodeHasNoDiskPressure,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-02-26 15:42:18.1212329 +0000 UTC m=+0.639794749,LastTimestamp:2026-02-26 15:42:18.23456317 +0000 UTC m=+0.753125039,Count:4,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Feb 26 15:43:12 crc kubenswrapper[4907]: E0226 15:43:12.339855 4907 event.go:359] "Server rejected event (will not retry!)" err="events \"crc.1897d63d85e401bc\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.1897d63d85e401bc default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasSufficientPID,Message:Node crc status is now: NodeHasSufficientPID,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-02-26 15:42:18.12124102 +0000 UTC m=+0.639802869,LastTimestamp:2026-02-26 15:42:18.2345773 +0000 UTC m=+0.753139159,Count:4,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Feb 26 15:43:12 crc kubenswrapper[4907]: E0226 15:43:12.344144 4907 event.go:359] "Server rejected event (will not retry!)" err="events \"crc.1897d63d85e39d11\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.1897d63d85e39d11 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasSufficientMemory,Message:Node crc status is now: NodeHasSufficientMemory,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-02-26 15:42:18.121215249 +0000 UTC 
m=+0.639777098,LastTimestamp:2026-02-26 15:42:18.235697908 +0000 UTC m=+0.754259757,Count:5,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Feb 26 15:43:12 crc kubenswrapper[4907]: E0226 15:43:12.348092 4907 event.go:359] "Server rejected event (will not retry!)" err="events \"crc.1897d63d85e3e204\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.1897d63d85e3e204 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasNoDiskPressure,Message:Node crc status is now: NodeHasNoDiskPressure,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-02-26 15:42:18.1212329 +0000 UTC m=+0.639794749,LastTimestamp:2026-02-26 15:42:18.235730879 +0000 UTC m=+0.754292728,Count:5,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Feb 26 15:43:12 crc kubenswrapper[4907]: E0226 15:43:12.351990 4907 event.go:359] "Server rejected event (will not retry!)" err="events \"crc.1897d63d85e39d11\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.1897d63d85e39d11 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasSufficientMemory,Message:Node crc status is now: NodeHasSufficientMemory,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-02-26 15:42:18.121215249 +0000 UTC m=+0.639777098,LastTimestamp:2026-02-26 15:42:18.23577723 +0000 UTC m=+0.754339079,Count:6,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 
UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Feb 26 15:43:12 crc kubenswrapper[4907]: E0226 15:43:12.355853 4907 event.go:359] "Server rejected event (will not retry!)" err="events \"crc.1897d63d85e3e204\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.1897d63d85e3e204 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasNoDiskPressure,Message:Node crc status is now: NodeHasNoDiskPressure,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-02-26 15:42:18.1212329 +0000 UTC m=+0.639794749,LastTimestamp:2026-02-26 15:42:18.23580431 +0000 UTC m=+0.754366159,Count:6,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Feb 26 15:43:12 crc kubenswrapper[4907]: E0226 15:43:12.360252 4907 event.go:359] "Server rejected event (will not retry!)" err="events \"crc.1897d63d85e401bc\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.1897d63d85e401bc default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasSufficientPID,Message:Node crc status is now: NodeHasSufficientPID,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-02-26 15:42:18.12124102 +0000 UTC m=+0.639802869,LastTimestamp:2026-02-26 15:42:18.235814631 +0000 UTC m=+0.754376480,Count:5,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Feb 26 15:43:12 crc kubenswrapper[4907]: E0226 15:43:12.363663 4907 event.go:359] 
"Server rejected event (will not retry!)" err="events \"crc.1897d63d85e401bc\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.1897d63d85e401bc default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasSufficientPID,Message:Node crc status is now: NodeHasSufficientPID,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-02-26 15:42:18.12124102 +0000 UTC m=+0.639802869,LastTimestamp:2026-02-26 15:42:18.235864752 +0000 UTC m=+0.754426621,Count:6,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Feb 26 15:43:12 crc kubenswrapper[4907]: E0226 15:43:12.369231 4907 event.go:359] "Server rejected event (will not retry!)" err="events \"crc.1897d63d85e39d11\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.1897d63d85e39d11 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasSufficientMemory,Message:Node crc status is now: NodeHasSufficientMemory,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-02-26 15:42:18.121215249 +0000 UTC m=+0.639777098,LastTimestamp:2026-02-26 15:42:18.236974669 +0000 UTC m=+0.755536518,Count:7,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Feb 26 15:43:12 crc kubenswrapper[4907]: E0226 15:43:12.373525 4907 event.go:359] "Server rejected event (will not retry!)" err="events \"crc.1897d63d85e3e204\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group 
\"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.1897d63d85e3e204 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasNoDiskPressure,Message:Node crc status is now: NodeHasNoDiskPressure,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-02-26 15:42:18.1212329 +0000 UTC m=+0.639794749,LastTimestamp:2026-02-26 15:42:18.23701749 +0000 UTC m=+0.755579339,Count:7,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Feb 26 15:43:12 crc kubenswrapper[4907]: E0226 15:43:12.377064 4907 event.go:359] "Server rejected event (will not retry!)" err="events \"crc.1897d63d85e401bc\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.1897d63d85e401bc default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasSufficientPID,Message:Node crc status is now: NodeHasSufficientPID,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-02-26 15:42:18.12124102 +0000 UTC m=+0.639802869,LastTimestamp:2026-02-26 15:42:18.23702778 +0000 UTC m=+0.755589629,Count:7,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Feb 26 15:43:12 crc kubenswrapper[4907]: E0226 15:43:12.380137 4907 event.go:359] "Server rejected event (will not retry!)" err="events \"crc.1897d63d85e39d11\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.1897d63d85e39d11 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] 
[]},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasSufficientMemory,Message:Node crc status is now: NodeHasSufficientMemory,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-02-26 15:42:18.121215249 +0000 UTC m=+0.639777098,LastTimestamp:2026-02-26 15:42:18.238710481 +0000 UTC m=+0.757272340,Count:8,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Feb 26 15:43:12 crc kubenswrapper[4907]: E0226 15:43:12.383380 4907 event.go:359] "Server rejected event (will not retry!)" err="events \"crc.1897d63d85e3e204\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.1897d63d85e3e204 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasNoDiskPressure,Message:Node crc status is now: NodeHasNoDiskPressure,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-02-26 15:42:18.1212329 +0000 UTC m=+0.639794749,LastTimestamp:2026-02-26 15:42:18.238726221 +0000 UTC m=+0.757288080,Count:8,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Feb 26 15:43:12 crc kubenswrapper[4907]: E0226 15:43:12.387478 4907 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.1897d63da4c2ab5d openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] 
[]},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.initContainers{setup},},Reason:Pulled,Message:Container image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-02-26 15:42:18.639149917 +0000 UTC m=+1.157711766,LastTimestamp:2026-02-26 15:42:18.639149917 +0000 UTC m=+1.157711766,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Feb 26 15:43:12 crc kubenswrapper[4907]: E0226 15:43:12.395259 4907 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-scheduler\"" event="&Event{ObjectMeta:{openshift-kube-scheduler-crc.1897d63da4c39c12 openshift-kube-scheduler 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-scheduler,Name:openshift-kube-scheduler-crc,UID:3dcd261975c3d6b9a6ad6367fd4facd3,APIVersion:v1,ResourceVersion:,FieldPath:spec.initContainers{wait-for-host-port},},Reason:Pulled,Message:Container image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-02-26 15:42:18.639211538 +0000 UTC m=+1.157773427,LastTimestamp:2026-02-26 15:42:18.639211538 +0000 UTC m=+1.157773427,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Feb 26 15:43:12 crc kubenswrapper[4907]: E0226 15:43:12.399237 4907 event.go:359] "Server 
rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-machine-config-operator\"" event="&Event{ObjectMeta:{kube-rbac-proxy-crio-crc.1897d63da4d0a073 openshift-machine-config-operator 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-machine-config-operator,Name:kube-rbac-proxy-crio-crc,UID:d1b160f5dda77d281dd8e69ec8d817f9,APIVersion:v1,ResourceVersion:,FieldPath:spec.initContainers{setup},},Reason:Pulled,Message:Container image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-02-26 15:42:18.640064627 +0000 UTC m=+1.158626516,LastTimestamp:2026-02-26 15:42:18.640064627 +0000 UTC m=+1.158626516,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Feb 26 15:43:12 crc kubenswrapper[4907]: E0226 15:43:12.402875 4907 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event="&Event{ObjectMeta:{kube-controller-manager-crc.1897d63da4e909ce openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-controller-manager},},Reason:Pulled,Message:Container image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\" already present on 
machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-02-26 15:42:18.641664462 +0000 UTC m=+1.160226311,LastTimestamp:2026-02-26 15:42:18.641664462 +0000 UTC m=+1.160226311,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Feb 26 15:43:12 crc kubenswrapper[4907]: E0226 15:43:12.406273 4907 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.1897d63da57510a4 openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.initContainers{setup},},Reason:Pulled,Message:Container image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-02-26 15:42:18.650841252 +0000 UTC m=+1.169403111,LastTimestamp:2026-02-26 15:42:18.650841252 +0000 UTC m=+1.169403111,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Feb 26 15:43:12 crc kubenswrapper[4907]: E0226 15:43:12.410379 4907 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event="&Event{ObjectMeta:{kube-controller-manager-crc.1897d63dcaa64997 openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] 
[]},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-controller-manager},},Reason:Created,Message:Created container kube-controller-manager,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-02-26 15:42:19.274824087 +0000 UTC m=+1.793385986,LastTimestamp:2026-02-26 15:42:19.274824087 +0000 UTC m=+1.793385986,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Feb 26 15:43:12 crc kubenswrapper[4907]: E0226 15:43:12.413395 4907 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-machine-config-operator\"" event="&Event{ObjectMeta:{kube-rbac-proxy-crio-crc.1897d63dcaa9569c openshift-machine-config-operator 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-machine-config-operator,Name:kube-rbac-proxy-crio-crc,UID:d1b160f5dda77d281dd8e69ec8d817f9,APIVersion:v1,ResourceVersion:,FieldPath:spec.initContainers{setup},},Reason:Created,Message:Created container setup,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-02-26 15:42:19.275024028 +0000 UTC m=+1.793585887,LastTimestamp:2026-02-26 15:42:19.275024028 +0000 UTC m=+1.793585887,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Feb 26 15:43:12 crc kubenswrapper[4907]: E0226 15:43:12.415298 4907 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-scheduler\"" 
event="&Event{ObjectMeta:{openshift-kube-scheduler-crc.1897d63dcadb77dd openshift-kube-scheduler 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-scheduler,Name:openshift-kube-scheduler-crc,UID:3dcd261975c3d6b9a6ad6367fd4facd3,APIVersion:v1,ResourceVersion:,FieldPath:spec.initContainers{wait-for-host-port},},Reason:Created,Message:Created container wait-for-host-port,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-02-26 15:42:19.278309341 +0000 UTC m=+1.796871210,LastTimestamp:2026-02-26 15:42:19.278309341 +0000 UTC m=+1.796871210,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Feb 26 15:43:12 crc kubenswrapper[4907]: E0226 15:43:12.417324 4907 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.1897d63dcb2dcc98 openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.initContainers{setup},},Reason:Created,Message:Created container setup,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-02-26 15:42:19.283704984 +0000 UTC m=+1.802266853,LastTimestamp:2026-02-26 15:42:19.283704984 +0000 UTC m=+1.802266853,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Feb 26 15:43:12 crc kubenswrapper[4907]: E0226 15:43:12.421211 4907 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" 
event="&Event{ObjectMeta:{kube-apiserver-crc.1897d63dcb354d47 openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.initContainers{setup},},Reason:Created,Message:Created container setup,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-02-26 15:42:19.284196679 +0000 UTC m=+1.802758568,LastTimestamp:2026-02-26 15:42:19.284196679 +0000 UTC m=+1.802758568,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Feb 26 15:43:12 crc kubenswrapper[4907]: E0226 15:43:12.423496 4907 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event="&Event{ObjectMeta:{kube-controller-manager-crc.1897d63dcb93a5c8 openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-controller-manager},},Reason:Started,Message:Started container kube-controller-manager,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-02-26 15:42:19.29037972 +0000 UTC m=+1.808941579,LastTimestamp:2026-02-26 15:42:19.29037972 +0000 UTC m=+1.808941579,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Feb 26 15:43:12 crc kubenswrapper[4907]: E0226 15:43:12.425553 4907 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource 
\"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event="&Event{ObjectMeta:{kube-controller-manager-crc.1897d63dcbc03b90 openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{cluster-policy-controller},},Reason:Pulled,Message:Container image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-02-26 15:42:19.293301648 +0000 UTC m=+1.811863517,LastTimestamp:2026-02-26 15:42:19.293301648 +0000 UTC m=+1.811863517,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Feb 26 15:43:12 crc kubenswrapper[4907]: E0226 15:43:12.427110 4907 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-scheduler\"" event="&Event{ObjectMeta:{openshift-kube-scheduler-crc.1897d63dcbef31fd openshift-kube-scheduler 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-scheduler,Name:openshift-kube-scheduler-crc,UID:3dcd261975c3d6b9a6ad6367fd4facd3,APIVersion:v1,ResourceVersion:,FieldPath:spec.initContainers{wait-for-host-port},},Reason:Started,Message:Started container wait-for-host-port,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-02-26 15:42:19.296379389 +0000 UTC m=+1.814941248,LastTimestamp:2026-02-26 15:42:19.296379389 +0000 UTC m=+1.814941248,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 
UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Feb 26 15:43:12 crc kubenswrapper[4907]: E0226 15:43:12.430215 4907 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-machine-config-operator\"" event="&Event{ObjectMeta:{kube-rbac-proxy-crio-crc.1897d63dcc1b35fe openshift-machine-config-operator 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-machine-config-operator,Name:kube-rbac-proxy-crio-crc,UID:d1b160f5dda77d281dd8e69ec8d817f9,APIVersion:v1,ResourceVersion:,FieldPath:spec.initContainers{setup},},Reason:Started,Message:Started container setup,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-02-26 15:42:19.299263998 +0000 UTC m=+1.817825887,LastTimestamp:2026-02-26 15:42:19.299263998 +0000 UTC m=+1.817825887,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Feb 26 15:43:12 crc kubenswrapper[4907]: E0226 15:43:12.434237 4907 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.1897d63dcc3af61e openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.initContainers{setup},},Reason:Started,Message:Started container setup,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-02-26 15:42:19.301344798 +0000 UTC m=+1.819906657,LastTimestamp:2026-02-26 15:42:19.301344798 +0000 UTC 
m=+1.819906657,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Feb 26 15:43:12 crc kubenswrapper[4907]: E0226 15:43:12.437699 4907 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.1897d63dcc5733e0 openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.initContainers{setup},},Reason:Started,Message:Started container setup,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-02-26 15:42:19.303195616 +0000 UTC m=+1.821757485,LastTimestamp:2026-02-26 15:42:19.303195616 +0000 UTC m=+1.821757485,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Feb 26 15:43:12 crc kubenswrapper[4907]: E0226 15:43:12.441151 4907 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event="&Event{ObjectMeta:{kube-controller-manager-crc.1897d63ddf823c23 openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{cluster-policy-controller},},Reason:Created,Message:Created container cluster-policy-controller,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-02-26 15:42:19.624782883 +0000 UTC 
m=+2.143344742,LastTimestamp:2026-02-26 15:42:19.624782883 +0000 UTC m=+2.143344742,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Feb 26 15:43:12 crc kubenswrapper[4907]: E0226 15:43:12.445145 4907 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event="&Event{ObjectMeta:{kube-controller-manager-crc.1897d63de015c9b6 openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{cluster-policy-controller},},Reason:Started,Message:Started container cluster-policy-controller,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-02-26 15:42:19.634452918 +0000 UTC m=+2.153014757,LastTimestamp:2026-02-26 15:42:19.634452918 +0000 UTC m=+2.153014757,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Feb 26 15:43:12 crc kubenswrapper[4907]: E0226 15:43:12.449377 4907 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event="&Event{ObjectMeta:{kube-controller-manager-crc.1897d63de026b62d openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] 
[]},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-controller-manager-cert-syncer},},Reason:Pulled,Message:Container image \"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-02-26 15:42:19.635562029 +0000 UTC m=+2.154123878,LastTimestamp:2026-02-26 15:42:19.635562029 +0000 UTC m=+2.154123878,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Feb 26 15:43:12 crc kubenswrapper[4907]: E0226 15:43:12.454884 4907 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event="&Event{ObjectMeta:{kube-controller-manager-crc.1897d63deabaef84 openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-controller-manager-cert-syncer},},Reason:Created,Message:Created container kube-controller-manager-cert-syncer,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-02-26 15:42:19.813048196 +0000 UTC m=+2.331610045,LastTimestamp:2026-02-26 15:42:19.813048196 +0000 UTC m=+2.331610045,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Feb 26 15:43:12 crc kubenswrapper[4907]: E0226 15:43:12.460262 4907 event.go:359] "Server rejected 
event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event="&Event{ObjectMeta:{kube-controller-manager-crc.1897d63debcda572 openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-controller-manager-cert-syncer},},Reason:Started,Message:Started container kube-controller-manager-cert-syncer,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-02-26 15:42:19.831051634 +0000 UTC m=+2.349613483,LastTimestamp:2026-02-26 15:42:19.831051634 +0000 UTC m=+2.349613483,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Feb 26 15:43:12 crc kubenswrapper[4907]: E0226 15:43:12.464023 4907 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event="&Event{ObjectMeta:{kube-controller-manager-crc.1897d63debdb3f79 openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-controller-manager-recovery-controller},},Reason:Pulled,Message:Container image \"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-02-26 
15:42:19.831943033 +0000 UTC m=+2.350504882,LastTimestamp:2026-02-26 15:42:19.831943033 +0000 UTC m=+2.350504882,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Feb 26 15:43:12 crc kubenswrapper[4907]: E0226 15:43:12.468564 4907 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event="&Event{ObjectMeta:{kube-controller-manager-crc.1897d63df8376f83 openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-controller-manager-recovery-controller},},Reason:Created,Message:Created container kube-controller-manager-recovery-controller,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-02-26 15:42:20.039311235 +0000 UTC m=+2.557873114,LastTimestamp:2026-02-26 15:42:20.039311235 +0000 UTC m=+2.557873114,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Feb 26 15:43:12 crc kubenswrapper[4907]: E0226 15:43:12.472610 4907 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event="&Event{ObjectMeta:{kube-controller-manager-crc.1897d63df8d699e4 openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] 
[]},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-controller-manager-recovery-controller},},Reason:Started,Message:Started container kube-controller-manager-recovery-controller,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-02-26 15:42:20.049742308 +0000 UTC m=+2.568304147,LastTimestamp:2026-02-26 15:42:20.049742308 +0000 UTC m=+2.568304147,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Feb 26 15:43:12 crc kubenswrapper[4907]: E0226 15:43:12.476981 4907 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-machine-config-operator\"" event="&Event{ObjectMeta:{kube-rbac-proxy-crio-crc.1897d63dff33ccbd openshift-machine-config-operator 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-machine-config-operator,Name:kube-rbac-proxy-crio-crc,UID:d1b160f5dda77d281dd8e69ec8d817f9,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-rbac-proxy-crio},},Reason:Pulled,Message:Container image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-02-26 15:42:20.156513469 +0000 UTC m=+2.675075328,LastTimestamp:2026-02-26 15:42:20.156513469 +0000 UTC m=+2.675075328,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Feb 26 15:43:12 crc kubenswrapper[4907]: E0226 15:43:12.480567 4907 event.go:359] "Server rejected event (will not retry!)" 
err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.1897d63dff3ef054 openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver},},Reason:Pulled,Message:Container image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-02-26 15:42:20.157243476 +0000 UTC m=+2.675805325,LastTimestamp:2026-02-26 15:42:20.157243476 +0000 UTC m=+2.675805325,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Feb 26 15:43:12 crc kubenswrapper[4907]: E0226 15:43:12.484333 4907 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-scheduler\"" event="&Event{ObjectMeta:{openshift-kube-scheduler-crc.1897d63dff7790b9 openshift-kube-scheduler 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-scheduler,Name:openshift-kube-scheduler-crc,UID:3dcd261975c3d6b9a6ad6367fd4facd3,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-scheduler},},Reason:Pulled,Message:Container image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-02-26 15:42:20.160954553 +0000 UTC m=+2.679516442,LastTimestamp:2026-02-26 
15:42:20.160954553 +0000 UTC m=+2.679516442,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Feb 26 15:43:12 crc kubenswrapper[4907]: E0226 15:43:12.488517 4907 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.1897d63dffd23b07 openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.initContainers{etcd-ensure-env-vars},},Reason:Pulled,Message:Container image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-02-26 15:42:20.166896391 +0000 UTC m=+2.685458240,LastTimestamp:2026-02-26 15:42:20.166896391 +0000 UTC m=+2.685458240,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Feb 26 15:43:12 crc kubenswrapper[4907]: E0226 15:43:12.491929 4907 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.1897d63e0d77731c openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.initContainers{etcd-ensure-env-vars},},Reason:Created,Message:Created container 
etcd-ensure-env-vars,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-02-26 15:42:20.395827996 +0000 UTC m=+2.914389845,LastTimestamp:2026-02-26 15:42:20.395827996 +0000 UTC m=+2.914389845,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Feb 26 15:43:12 crc kubenswrapper[4907]: E0226 15:43:12.495300 4907 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-machine-config-operator\"" event="&Event{ObjectMeta:{kube-rbac-proxy-crio-crc.1897d63e0d8275b0 openshift-machine-config-operator 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-machine-config-operator,Name:kube-rbac-proxy-crio-crc,UID:d1b160f5dda77d281dd8e69ec8d817f9,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-rbac-proxy-crio},},Reason:Created,Message:Created container kube-rbac-proxy-crio,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-02-26 15:42:20.396549552 +0000 UTC m=+2.915111401,LastTimestamp:2026-02-26 15:42:20.396549552 +0000 UTC m=+2.915111401,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Feb 26 15:43:12 crc kubenswrapper[4907]: E0226 15:43:12.499870 4907 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-scheduler\"" event="&Event{ObjectMeta:{openshift-kube-scheduler-crc.1897d63e0deb9778 openshift-kube-scheduler 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] 
[]},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-scheduler,Name:openshift-kube-scheduler-crc,UID:3dcd261975c3d6b9a6ad6367fd4facd3,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-scheduler},},Reason:Created,Message:Created container kube-scheduler,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-02-26 15:42:20.40343948 +0000 UTC m=+2.922001329,LastTimestamp:2026-02-26 15:42:20.40343948 +0000 UTC m=+2.922001329,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Feb 26 15:43:12 crc kubenswrapper[4907]: E0226 15:43:12.503778 4907 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.1897d63e0e4aa31e openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.initContainers{etcd-ensure-env-vars},},Reason:Started,Message:Started container etcd-ensure-env-vars,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-02-26 15:42:20.409668382 +0000 UTC m=+2.928230231,LastTimestamp:2026-02-26 15:42:20.409668382 +0000 UTC m=+2.928230231,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Feb 26 15:43:12 crc kubenswrapper[4907]: E0226 15:43:12.507657 4907 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-machine-config-operator\"" event="&Event{ObjectMeta:{kube-rbac-proxy-crio-crc.1897d63e0e665652 openshift-machine-config-operator 0 0001-01-01 00:00:00 
+0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-machine-config-operator,Name:kube-rbac-proxy-crio-crc,UID:d1b160f5dda77d281dd8e69ec8d817f9,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-rbac-proxy-crio},},Reason:Started,Message:Started container kube-rbac-proxy-crio,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-02-26 15:42:20.41148373 +0000 UTC m=+2.930045579,LastTimestamp:2026-02-26 15:42:20.41148373 +0000 UTC m=+2.930045579,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Feb 26 15:43:12 crc kubenswrapper[4907]: E0226 15:43:12.511240 4907 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-scheduler\"" event="&Event{ObjectMeta:{openshift-kube-scheduler-crc.1897d63e0f570007 openshift-kube-scheduler 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-scheduler,Name:openshift-kube-scheduler-crc,UID:3dcd261975c3d6b9a6ad6367fd4facd3,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-scheduler},},Reason:Started,Message:Started container kube-scheduler,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-02-26 15:42:20.427255815 +0000 UTC m=+2.945817664,LastTimestamp:2026-02-26 15:42:20.427255815 +0000 UTC m=+2.945817664,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Feb 26 15:43:12 crc kubenswrapper[4907]: E0226 15:43:12.515632 4907 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-scheduler\"" 
event="&Event{ObjectMeta:{openshift-kube-scheduler-crc.1897d63e0f661d0d openshift-kube-scheduler 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-scheduler,Name:openshift-kube-scheduler-crc,UID:3dcd261975c3d6b9a6ad6367fd4facd3,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-scheduler-cert-syncer},},Reason:Pulled,Message:Container image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-02-26 15:42:20.428246285 +0000 UTC m=+2.946808134,LastTimestamp:2026-02-26 15:42:20.428246285 +0000 UTC m=+2.946808134,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Feb 26 15:43:12 crc kubenswrapper[4907]: E0226 15:43:12.517966 4907 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.1897d63e0fade91d openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver},},Reason:Created,Message:Created container kube-apiserver,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-02-26 15:42:20.432951581 +0000 UTC m=+2.951513420,LastTimestamp:2026-02-26 15:42:20.432951581 +0000 UTC m=+2.951513420,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Feb 26 15:43:12 crc kubenswrapper[4907]: E0226 15:43:12.519403 4907 event.go:359] 
"Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.1897d63e10de6ab1 openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver},},Reason:Started,Message:Started container kube-apiserver,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-02-26 15:42:20.452907697 +0000 UTC m=+2.971469546,LastTimestamp:2026-02-26 15:42:20.452907697 +0000 UTC m=+2.971469546,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Feb 26 15:43:12 crc kubenswrapper[4907]: E0226 15:43:12.521303 4907 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.1897d63e10ed5729 openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver-cert-syncer},},Reason:Pulled,Message:Container image \"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-02-26 15:42:20.453885737 +0000 UTC m=+2.972447586,LastTimestamp:2026-02-26 15:42:20.453885737 +0000 UTC 
m=+2.972447586,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Feb 26 15:43:12 crc kubenswrapper[4907]: E0226 15:43:12.523377 4907 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-scheduler\"" event="&Event{ObjectMeta:{openshift-kube-scheduler-crc.1897d63e19c8839e openshift-kube-scheduler 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-scheduler,Name:openshift-kube-scheduler-crc,UID:3dcd261975c3d6b9a6ad6367fd4facd3,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-scheduler-cert-syncer},},Reason:Created,Message:Created container kube-scheduler-cert-syncer,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-02-26 15:42:20.60246723 +0000 UTC m=+3.121029089,LastTimestamp:2026-02-26 15:42:20.60246723 +0000 UTC m=+3.121029089,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Feb 26 15:43:12 crc kubenswrapper[4907]: E0226 15:43:12.524563 4907 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-scheduler\"" event="&Event{ObjectMeta:{openshift-kube-scheduler-crc.1897d63e1b1ce9ef openshift-kube-scheduler 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-scheduler,Name:openshift-kube-scheduler-crc,UID:3dcd261975c3d6b9a6ad6367fd4facd3,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-scheduler-cert-syncer},},Reason:Started,Message:Started container 
kube-scheduler-cert-syncer,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-02-26 15:42:20.624775663 +0000 UTC m=+3.143337512,LastTimestamp:2026-02-26 15:42:20.624775663 +0000 UTC m=+3.143337512,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Feb 26 15:43:12 crc kubenswrapper[4907]: E0226 15:43:12.527308 4907 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-scheduler\"" event="&Event{ObjectMeta:{openshift-kube-scheduler-crc.1897d63e1b342157 openshift-kube-scheduler 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-scheduler,Name:openshift-kube-scheduler-crc,UID:3dcd261975c3d6b9a6ad6367fd4facd3,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-scheduler-recovery-controller},},Reason:Pulled,Message:Container image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-02-26 15:42:20.626297175 +0000 UTC m=+3.144859034,LastTimestamp:2026-02-26 15:42:20.626297175 +0000 UTC m=+3.144859034,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Feb 26 15:43:12 crc kubenswrapper[4907]: E0226 15:43:12.530545 4907 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.1897d63e1b4b3e24 openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] 
[]},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver-cert-syncer},},Reason:Created,Message:Created container kube-apiserver-cert-syncer,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-02-26 15:42:20.627811876 +0000 UTC m=+3.146373725,LastTimestamp:2026-02-26 15:42:20.627811876 +0000 UTC m=+3.146373725,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Feb 26 15:43:12 crc kubenswrapper[4907]: E0226 15:43:12.533487 4907 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.1897d63e1cde2a69 openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver-cert-syncer},},Reason:Started,Message:Started container kube-apiserver-cert-syncer,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-02-26 15:42:20.654217833 +0000 UTC m=+3.172779692,LastTimestamp:2026-02-26 15:42:20.654217833 +0000 UTC m=+3.172779692,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Feb 26 15:43:12 crc kubenswrapper[4907]: E0226 15:43:12.536356 4907 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" 
event="&Event{ObjectMeta:{kube-apiserver-crc.1897d63e1cef9b57 openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver-cert-regeneration-controller},},Reason:Pulled,Message:Container image \"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-02-26 15:42:20.655360855 +0000 UTC m=+3.173922724,LastTimestamp:2026-02-26 15:42:20.655360855 +0000 UTC m=+3.173922724,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Feb 26 15:43:12 crc kubenswrapper[4907]: E0226 15:43:12.539355 4907 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-scheduler\"" event="&Event{ObjectMeta:{openshift-kube-scheduler-crc.1897d63e2774ad52 openshift-kube-scheduler 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-scheduler,Name:openshift-kube-scheduler-crc,UID:3dcd261975c3d6b9a6ad6367fd4facd3,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-scheduler-recovery-controller},},Reason:Created,Message:Created container kube-scheduler-recovery-controller,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-02-26 15:42:20.831853906 +0000 UTC m=+3.350415745,LastTimestamp:2026-02-26 15:42:20.831853906 +0000 UTC m=+3.350415745,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Feb 26 
15:43:12 crc kubenswrapper[4907]: E0226 15:43:12.542367 4907 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.1897d63e278a9cc1 openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver-cert-regeneration-controller},},Reason:Created,Message:Created container kube-apiserver-cert-regeneration-controller,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-02-26 15:42:20.833291457 +0000 UTC m=+3.351853306,LastTimestamp:2026-02-26 15:42:20.833291457 +0000 UTC m=+3.351853306,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Feb 26 15:43:12 crc kubenswrapper[4907]: E0226 15:43:12.545491 4907 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-scheduler\"" event="&Event{ObjectMeta:{openshift-kube-scheduler-crc.1897d63e28dc0374 openshift-kube-scheduler 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-scheduler,Name:openshift-kube-scheduler-crc,UID:3dcd261975c3d6b9a6ad6367fd4facd3,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-scheduler-recovery-controller},},Reason:Started,Message:Started container kube-scheduler-recovery-controller,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-02-26 15:42:20.85540338 +0000 UTC m=+3.373965229,LastTimestamp:2026-02-26 15:42:20.85540338 +0000 UTC 
m=+3.373965229,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Feb 26 15:43:12 crc kubenswrapper[4907]: E0226 15:43:12.548450 4907 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.1897d63e2917b563 openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver-cert-regeneration-controller},},Reason:Started,Message:Started container kube-apiserver-cert-regeneration-controller,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-02-26 15:42:20.859315555 +0000 UTC m=+3.377877414,LastTimestamp:2026-02-26 15:42:20.859315555 +0000 UTC m=+3.377877414,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Feb 26 15:43:12 crc kubenswrapper[4907]: E0226 15:43:12.552504 4907 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.1897d63e29246925 openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver-insecure-readyz},},Reason:Pulled,Message:Container image 
\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-02-26 15:42:20.860148005 +0000 UTC m=+3.378709854,LastTimestamp:2026-02-26 15:42:20.860148005 +0000 UTC m=+3.378709854,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Feb 26 15:43:12 crc kubenswrapper[4907]: E0226 15:43:12.555937 4907 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.1897d63e338f0fcd openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver-insecure-readyz},},Reason:Created,Message:Created container kube-apiserver-insecure-readyz,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-02-26 15:42:21.034909645 +0000 UTC m=+3.553471494,LastTimestamp:2026-02-26 15:42:21.034909645 +0000 UTC m=+3.553471494,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Feb 26 15:43:12 crc kubenswrapper[4907]: E0226 15:43:12.560269 4907 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.1897d63e3450e6fa openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] 
[]},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver-insecure-readyz},},Reason:Started,Message:Started container kube-apiserver-insecure-readyz,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-02-26 15:42:21.047613178 +0000 UTC m=+3.566175037,LastTimestamp:2026-02-26 15:42:21.047613178 +0000 UTC m=+3.566175037,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Feb 26 15:43:12 crc kubenswrapper[4907]: E0226 15:43:12.563687 4907 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.1897d63e345ed897 openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver-check-endpoints},},Reason:Pulled,Message:Container image \"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-02-26 15:42:21.048526999 +0000 UTC m=+3.567088848,LastTimestamp:2026-02-26 15:42:21.048526999 +0000 UTC m=+3.567088848,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Feb 26 15:43:12 crc kubenswrapper[4907]: E0226 15:43:12.567192 4907 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create 
resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.1897d63e3c771156 openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.initContainers{etcd-resources-copy},},Reason:Pulled,Message:Container image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-02-26 15:42:21.184332118 +0000 UTC m=+3.702893967,LastTimestamp:2026-02-26 15:42:21.184332118 +0000 UTC m=+3.702893967,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Feb 26 15:43:12 crc kubenswrapper[4907]: E0226 15:43:12.570621 4907 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.1897d63e3e6bcabe openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver-check-endpoints},},Reason:Created,Message:Created container kube-apiserver-check-endpoints,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-02-26 15:42:21.217147582 +0000 UTC m=+3.735709431,LastTimestamp:2026-02-26 15:42:21.217147582 +0000 UTC m=+3.735709431,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Feb 26 15:43:12 crc 
kubenswrapper[4907]: E0226 15:43:12.574007 4907 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.1897d63e3f592580 openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver-check-endpoints},},Reason:Started,Message:Started container kube-apiserver-check-endpoints,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-02-26 15:42:21.232702848 +0000 UTC m=+3.751264727,LastTimestamp:2026-02-26 15:42:21.232702848 +0000 UTC m=+3.751264727,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Feb 26 15:43:12 crc kubenswrapper[4907]: E0226 15:43:12.577331 4907 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.1897d63e473fff72 openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.initContainers{etcd-resources-copy},},Reason:Created,Message:Created container etcd-resources-copy,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-02-26 15:42:21.365272434 +0000 UTC m=+3.883834283,LastTimestamp:2026-02-26 15:42:21.365272434 +0000 UTC m=+3.883834283,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 
UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Feb 26 15:43:12 crc kubenswrapper[4907]: E0226 15:43:12.581169 4907 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.1897d63e47ec5e40 openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.initContainers{etcd-resources-copy},},Reason:Started,Message:Started container etcd-resources-copy,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-02-26 15:42:21.376568896 +0000 UTC m=+3.895130765,LastTimestamp:2026-02-26 15:42:21.376568896 +0000 UTC m=+3.895130765,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Feb 26 15:43:12 crc kubenswrapper[4907]: E0226 15:43:12.584619 4907 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.1897d63e78933256 openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{etcdctl},},Reason:Pulled,Message:Container image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-02-26 15:42:22.192808534 +0000 UTC m=+4.711370383,LastTimestamp:2026-02-26 15:42:22.192808534 +0000 UTC 
m=+4.711370383,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Feb 26 15:43:12 crc kubenswrapper[4907]: E0226 15:43:12.589575 4907 event.go:359] "Server rejected event (will not retry!)" err="events \"kube-apiserver-crc.1897d63e345ed897\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.1897d63e345ed897 openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver-check-endpoints},},Reason:Pulled,Message:Container image \"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-02-26 15:42:21.048526999 +0000 UTC m=+3.567088848,LastTimestamp:2026-02-26 15:42:22.200739343 +0000 UTC m=+4.719301192,Count:2,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Feb 26 15:43:12 crc kubenswrapper[4907]: E0226 15:43:12.594079 4907 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.1897d63e84150559 openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{etcdctl},},Reason:Created,Message:Created container 
etcdctl,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-02-26 15:42:22.385866073 +0000 UTC m=+4.904427922,LastTimestamp:2026-02-26 15:42:22.385866073 +0000 UTC m=+4.904427922,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Feb 26 15:43:12 crc kubenswrapper[4907]: E0226 15:43:12.598666 4907 event.go:359] "Server rejected event (will not retry!)" err="events \"kube-apiserver-crc.1897d63e3e6bcabe\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.1897d63e3e6bcabe openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver-check-endpoints},},Reason:Created,Message:Created container kube-apiserver-check-endpoints,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-02-26 15:42:21.217147582 +0000 UTC m=+3.735709431,LastTimestamp:2026-02-26 15:42:22.386676144 +0000 UTC m=+4.905238033,Count:2,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Feb 26 15:43:12 crc kubenswrapper[4907]: E0226 15:43:12.604139 4907 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.1897d63e84a08074 openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] 
[]},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{etcdctl},},Reason:Started,Message:Started container etcdctl,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-02-26 15:42:22.395007092 +0000 UTC m=+4.913568941,LastTimestamp:2026-02-26 15:42:22.395007092 +0000 UTC m=+4.913568941,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Feb 26 15:43:12 crc kubenswrapper[4907]: E0226 15:43:12.609251 4907 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.1897d63e84b5d89a openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{etcd},},Reason:Pulled,Message:Container image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-02-26 15:42:22.396405914 +0000 UTC m=+4.914967783,LastTimestamp:2026-02-26 15:42:22.396405914 +0000 UTC m=+4.914967783,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Feb 26 15:43:12 crc kubenswrapper[4907]: E0226 15:43:12.614549 4907 event.go:359] "Server rejected event (will not retry!)" err="events \"kube-apiserver-crc.1897d63e3f592580\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" 
event="&Event{ObjectMeta:{kube-apiserver-crc.1897d63e3f592580 openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver-check-endpoints},},Reason:Started,Message:Started container kube-apiserver-check-endpoints,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-02-26 15:42:21.232702848 +0000 UTC m=+3.751264727,LastTimestamp:2026-02-26 15:42:22.398736756 +0000 UTC m=+4.917298635,Count:2,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Feb 26 15:43:12 crc kubenswrapper[4907]: E0226 15:43:12.617803 4907 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.1897d63e8fe528b9 openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{etcd},},Reason:Created,Message:Created container etcd,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-02-26 15:42:22.584055993 +0000 UTC m=+5.102617842,LastTimestamp:2026-02-26 15:42:22.584055993 +0000 UTC m=+5.102617842,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Feb 26 15:43:12 crc kubenswrapper[4907]: E0226 15:43:12.621835 4907 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" 
event="&Event{ObjectMeta:{etcd-crc.1897d63e90effc9b openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{etcd},},Reason:Started,Message:Started container etcd,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-02-26 15:42:22.601542811 +0000 UTC m=+5.120104660,LastTimestamp:2026-02-26 15:42:22.601542811 +0000 UTC m=+5.120104660,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Feb 26 15:43:12 crc kubenswrapper[4907]: E0226 15:43:12.624909 4907 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.1897d63e91048ceb openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{etcd-metrics},},Reason:Pulled,Message:Container image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-02-26 15:42:22.602890475 +0000 UTC m=+5.121452324,LastTimestamp:2026-02-26 15:42:22.602890475 +0000 UTC m=+5.121452324,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Feb 26 15:43:12 crc kubenswrapper[4907]: E0226 15:43:12.628864 4907 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group 
\"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.1897d63e9d8bd538 openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{etcd-metrics},},Reason:Created,Message:Created container etcd-metrics,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-02-26 15:42:22.813082936 +0000 UTC m=+5.331644785,LastTimestamp:2026-02-26 15:42:22.813082936 +0000 UTC m=+5.331644785,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Feb 26 15:43:12 crc kubenswrapper[4907]: E0226 15:43:12.632358 4907 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.1897d63e9e4bd999 openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{etcd-metrics},},Reason:Started,Message:Started container etcd-metrics,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-02-26 15:42:22.825666969 +0000 UTC m=+5.344228818,LastTimestamp:2026-02-26 15:42:22.825666969 +0000 UTC m=+5.344228818,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Feb 26 15:43:12 crc kubenswrapper[4907]: E0226 15:43:12.636204 4907 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" 
event="&Event{ObjectMeta:{etcd-crc.1897d63e9e5fcbaa openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{etcd-readyz},},Reason:Pulled,Message:Container image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-02-26 15:42:22.826974122 +0000 UTC m=+5.345535971,LastTimestamp:2026-02-26 15:42:22.826974122 +0000 UTC m=+5.345535971,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Feb 26 15:43:12 crc kubenswrapper[4907]: E0226 15:43:12.641065 4907 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.1897d63eaae8d852 openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{etcd-readyz},},Reason:Created,Message:Created container etcd-readyz,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-02-26 15:42:23.037282386 +0000 UTC m=+5.555844255,LastTimestamp:2026-02-26 15:42:23.037282386 +0000 UTC m=+5.555844255,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Feb 26 15:43:12 crc kubenswrapper[4907]: E0226 15:43:12.644072 4907 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in 
API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.1897d63eaba1cefd openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{etcd-readyz},},Reason:Started,Message:Started container etcd-readyz,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-02-26 15:42:23.049404157 +0000 UTC m=+5.567966026,LastTimestamp:2026-02-26 15:42:23.049404157 +0000 UTC m=+5.567966026,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Feb 26 15:43:12 crc kubenswrapper[4907]: E0226 15:43:12.647560 4907 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.1897d63eabb74f31 openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{etcd-rev},},Reason:Pulled,Message:Container image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-02-26 15:42:23.050813233 +0000 UTC m=+5.569375092,LastTimestamp:2026-02-26 15:42:23.050813233 +0000 UTC m=+5.569375092,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Feb 26 15:43:12 crc kubenswrapper[4907]: E0226 15:43:12.651090 4907 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User 
\"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.1897d63eb7985d82 openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{etcd-rev},},Reason:Created,Message:Created container etcd-rev,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-02-26 15:42:23.250111874 +0000 UTC m=+5.768673773,LastTimestamp:2026-02-26 15:42:23.250111874 +0000 UTC m=+5.768673773,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Feb 26 15:43:12 crc kubenswrapper[4907]: E0226 15:43:12.656103 4907 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.1897d63eb84089d3 openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{etcd-rev},},Reason:Started,Message:Started container etcd-rev,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-02-26 15:42:23.261133267 +0000 UTC m=+5.779695116,LastTimestamp:2026-02-26 15:42:23.261133267 +0000 UTC m=+5.779695116,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Feb 26 15:43:12 crc kubenswrapper[4907]: E0226 15:43:12.660483 4907 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace 
\"openshift-kube-controller-manager\"" event=< Feb 26 15:43:12 crc kubenswrapper[4907]: &Event{ObjectMeta:{kube-controller-manager-crc.1897d64084d6cd95 openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{cluster-policy-controller},},Reason:ProbeError,Message:Startup probe error: Get "https://192.168.126.11:10357/healthz": context deadline exceeded (Client.Timeout exceeded while awaiting headers) Feb 26 15:43:12 crc kubenswrapper[4907]: body: Feb 26 15:43:12 crc kubenswrapper[4907]: ,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-02-26 15:42:30.988500373 +0000 UTC m=+13.507062252,LastTimestamp:2026-02-26 15:42:30.988500373 +0000 UTC m=+13.507062252,Count:1,Type:Warning,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,} Feb 26 15:43:12 crc kubenswrapper[4907]: > Feb 26 15:43:12 crc kubenswrapper[4907]: E0226 15:43:12.663753 4907 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event="&Event{ObjectMeta:{kube-controller-manager-crc.1897d64084d7fa41 openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{cluster-policy-controller},},Reason:Unhealthy,Message:Startup probe failed: Get \"https://192.168.126.11:10357/healthz\": context deadline exceeded (Client.Timeout exceeded while awaiting 
headers),Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-02-26 15:42:30.988577345 +0000 UTC m=+13.507139234,LastTimestamp:2026-02-26 15:42:30.988577345 +0000 UTC m=+13.507139234,Count:1,Type:Warning,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Feb 26 15:43:12 crc kubenswrapper[4907]: E0226 15:43:12.668115 4907 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event=< Feb 26 15:43:12 crc kubenswrapper[4907]: &Event{ObjectMeta:{kube-apiserver-crc.1897d640d11f605e openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver},},Reason:ProbeError,Message:Startup probe error: HTTP probe failed with statuscode: 403 Feb 26 15:43:12 crc kubenswrapper[4907]: body: {"kind":"Status","apiVersion":"v1","metadata":{},"status":"Failure","message":"forbidden: User \"system:anonymous\" cannot get path \"/livez\"","reason":"Forbidden","details":{},"code":403} Feb 26 15:43:12 crc kubenswrapper[4907]: Feb 26 15:43:12 crc kubenswrapper[4907]: ,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-02-26 15:42:32.268324958 +0000 UTC m=+14.786886817,LastTimestamp:2026-02-26 15:42:32.268324958 +0000 UTC m=+14.786886817,Count:1,Type:Warning,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,} Feb 26 15:43:12 crc kubenswrapper[4907]: > Feb 26 15:43:12 crc kubenswrapper[4907]: E0226 15:43:12.671868 4907 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource 
\"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.1897d640d11ffc77 openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver},},Reason:Unhealthy,Message:Startup probe failed: HTTP probe failed with statuscode: 403,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-02-26 15:42:32.268364919 +0000 UTC m=+14.786926778,LastTimestamp:2026-02-26 15:42:32.268364919 +0000 UTC m=+14.786926778,Count:1,Type:Warning,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Feb 26 15:43:12 crc kubenswrapper[4907]: E0226 15:43:12.674924 4907 event.go:359] "Server rejected event (will not retry!)" err="events \"kube-apiserver-crc.1897d640d11f605e\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event=< Feb 26 15:43:12 crc kubenswrapper[4907]: &Event{ObjectMeta:{kube-apiserver-crc.1897d640d11f605e openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver},},Reason:ProbeError,Message:Startup probe error: HTTP probe failed with statuscode: 403 Feb 26 15:43:12 crc kubenswrapper[4907]: body: {"kind":"Status","apiVersion":"v1","metadata":{},"status":"Failure","message":"forbidden: User \"system:anonymous\" cannot get path \"/livez\"","reason":"Forbidden","details":{},"code":403} Feb 26 15:43:12 crc kubenswrapper[4907]: Feb 26 15:43:12 crc kubenswrapper[4907]: 
,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-02-26 15:42:32.268324958 +0000 UTC m=+14.786886817,LastTimestamp:2026-02-26 15:42:32.275968604 +0000 UTC m=+14.794530463,Count:2,Type:Warning,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,} Feb 26 15:43:12 crc kubenswrapper[4907]: > Feb 26 15:43:12 crc kubenswrapper[4907]: E0226 15:43:12.680371 4907 event.go:359] "Server rejected event (will not retry!)" err="events \"kube-apiserver-crc.1897d640d11ffc77\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.1897d640d11ffc77 openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver},},Reason:Unhealthy,Message:Startup probe failed: HTTP probe failed with statuscode: 403,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-02-26 15:42:32.268364919 +0000 UTC m=+14.786926778,LastTimestamp:2026-02-26 15:42:32.276009125 +0000 UTC m=+14.794570984,Count:2,Type:Warning,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Feb 26 15:43:12 crc kubenswrapper[4907]: E0226 15:43:12.684663 4907 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event=< Feb 26 15:43:12 crc kubenswrapper[4907]: &Event{ObjectMeta:{kube-controller-manager-crc.1897d642d8e13dd7 openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] 
[]},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{cluster-policy-controller},},Reason:ProbeError,Message:Startup probe error: Get "https://192.168.126.11:10357/healthz": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers) Feb 26 15:43:12 crc kubenswrapper[4907]: body: Feb 26 15:43:12 crc kubenswrapper[4907]: ,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-02-26 15:42:40.988405207 +0000 UTC m=+23.506967086,LastTimestamp:2026-02-26 15:42:40.988405207 +0000 UTC m=+23.506967086,Count:1,Type:Warning,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,} Feb 26 15:43:12 crc kubenswrapper[4907]: > Feb 26 15:43:12 crc kubenswrapper[4907]: E0226 15:43:12.688740 4907 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event="&Event{ObjectMeta:{kube-controller-manager-crc.1897d642d8e2315c openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{cluster-policy-controller},},Reason:Unhealthy,Message:Startup probe failed: Get \"https://192.168.126.11:10357/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers),Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-02-26 15:42:40.988467548 +0000 UTC m=+23.507029427,LastTimestamp:2026-02-26 15:42:40.988467548 +0000 UTC 
m=+23.507029427,Count:1,Type:Warning,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Feb 26 15:43:12 crc kubenswrapper[4907]: E0226 15:43:12.696038 4907 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event=< Feb 26 15:43:12 crc kubenswrapper[4907]: &Event{ObjectMeta:{kube-controller-manager-crc.1897d645212a8645 openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{cluster-policy-controller},},Reason:ProbeError,Message:Startup probe error: Get "https://192.168.126.11:10357/healthz": read tcp 192.168.126.11:44102->192.168.126.11:10357: read: connection reset by peer Feb 26 15:43:12 crc kubenswrapper[4907]: body: Feb 26 15:43:12 crc kubenswrapper[4907]: ,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-02-26 15:42:50.791102021 +0000 UTC m=+33.309663910,LastTimestamp:2026-02-26 15:42:50.791102021 +0000 UTC m=+33.309663910,Count:1,Type:Warning,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,} Feb 26 15:43:12 crc kubenswrapper[4907]: > Feb 26 15:43:12 crc kubenswrapper[4907]: E0226 15:43:12.699714 4907 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event="&Event{ObjectMeta:{kube-controller-manager-crc.1897d645212bed6d openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] 
[]},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{cluster-policy-controller},},Reason:Unhealthy,Message:Startup probe failed: Get \"https://192.168.126.11:10357/healthz\": read tcp 192.168.126.11:44102->192.168.126.11:10357: read: connection reset by peer,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-02-26 15:42:50.791193965 +0000 UTC m=+33.309755844,LastTimestamp:2026-02-26 15:42:50.791193965 +0000 UTC m=+33.309755844,Count:1,Type:Warning,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Feb 26 15:43:12 crc kubenswrapper[4907]: E0226 15:43:12.706232 4907 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event="&Event{ObjectMeta:{kube-controller-manager-crc.1897d645215169e1 openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{cluster-policy-controller},},Reason:Killing,Message:Container cluster-policy-controller failed startup probe, will be restarted,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-02-26 15:42:50.793650657 +0000 UTC m=+33.312212506,LastTimestamp:2026-02-26 15:42:50.793650657 +0000 UTC m=+33.312212506,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Feb 26 15:43:12 crc kubenswrapper[4907]: E0226 15:43:12.709689 4907 event.go:359] "Server rejected event (will not retry!)" 
err="events \"kube-controller-manager-crc.1897d63dcbc03b90\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event="&Event{ObjectMeta:{kube-controller-manager-crc.1897d63dcbc03b90 openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{cluster-policy-controller},},Reason:Pulled,Message:Container image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-02-26 15:42:19.293301648 +0000 UTC m=+1.811863517,LastTimestamp:2026-02-26 15:42:51.313975352 +0000 UTC m=+33.832537201,Count:2,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Feb 26 15:43:12 crc kubenswrapper[4907]: E0226 15:43:12.715716 4907 event.go:359] "Server rejected event (will not retry!)" err="events \"kube-controller-manager-crc.1897d63ddf823c23\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event="&Event{ObjectMeta:{kube-controller-manager-crc.1897d63ddf823c23 openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{cluster-policy-controller},},Reason:Created,Message:Created container cluster-policy-controller,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-02-26 
15:42:19.624782883 +0000 UTC m=+2.143344742,LastTimestamp:2026-02-26 15:42:51.512284128 +0000 UTC m=+34.030845977,Count:2,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Feb 26 15:43:12 crc kubenswrapper[4907]: E0226 15:43:12.719705 4907 event.go:359] "Server rejected event (will not retry!)" err="events \"kube-controller-manager-crc.1897d63de015c9b6\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event="&Event{ObjectMeta:{kube-controller-manager-crc.1897d63de015c9b6 openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{cluster-policy-controller},},Reason:Started,Message:Started container cluster-policy-controller,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-02-26 15:42:19.634452918 +0000 UTC m=+2.153014757,LastTimestamp:2026-02-26 15:42:51.523416773 +0000 UTC m=+34.041978622,Count:2,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Feb 26 15:43:12 crc kubenswrapper[4907]: E0226 15:43:12.727245 4907 event.go:359] "Server rejected event (will not retry!)" err="events \"kube-controller-manager-crc.1897d642d8e13dd7\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event=< Feb 26 15:43:12 crc kubenswrapper[4907]: &Event{ObjectMeta:{kube-controller-manager-crc.1897d642d8e13dd7 openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] 
[]},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{cluster-policy-controller},},Reason:ProbeError,Message:Startup probe error: Get "https://192.168.126.11:10357/healthz": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers) Feb 26 15:43:12 crc kubenswrapper[4907]: body: Feb 26 15:43:12 crc kubenswrapper[4907]: ,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-02-26 15:42:40.988405207 +0000 UTC m=+23.506967086,LastTimestamp:2026-02-26 15:43:00.989771889 +0000 UTC m=+43.508333778,Count:2,Type:Warning,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,} Feb 26 15:43:12 crc kubenswrapper[4907]: > Feb 26 15:43:12 crc kubenswrapper[4907]: E0226 15:43:12.734413 4907 event.go:359] "Server rejected event (will not retry!)" err="events \"kube-controller-manager-crc.1897d642d8e2315c\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event="&Event{ObjectMeta:{kube-controller-manager-crc.1897d642d8e2315c openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{cluster-policy-controller},},Reason:Unhealthy,Message:Startup probe failed: Get \"https://192.168.126.11:10357/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers),Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-02-26 15:42:40.988467548 +0000 UTC m=+23.507029427,LastTimestamp:2026-02-26 15:43:00.989833452 
+0000 UTC m=+43.508395341,Count:2,Type:Warning,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Feb 26 15:43:12 crc kubenswrapper[4907]: E0226 15:43:12.739422 4907 event.go:359] "Server rejected event (will not retry!)" err="events \"kube-controller-manager-crc.1897d642d8e13dd7\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event=< Feb 26 15:43:12 crc kubenswrapper[4907]: &Event{ObjectMeta:{kube-controller-manager-crc.1897d642d8e13dd7 openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{cluster-policy-controller},},Reason:ProbeError,Message:Startup probe error: Get "https://192.168.126.11:10357/healthz": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers) Feb 26 15:43:12 crc kubenswrapper[4907]: body: Feb 26 15:43:12 crc kubenswrapper[4907]: ,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-02-26 15:42:40.988405207 +0000 UTC m=+23.506967086,LastTimestamp:2026-02-26 15:43:10.990089822 +0000 UTC m=+53.508651682,Count:3,Type:Warning,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,} Feb 26 15:43:12 crc kubenswrapper[4907]: > Feb 26 15:43:13 crc kubenswrapper[4907]: I0226 15:43:13.074814 4907 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Feb 26 15:43:13 crc kubenswrapper[4907]: I0226 15:43:13.126081 4907 
kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Feb 26 15:43:13 crc kubenswrapper[4907]: I0226 15:43:13.127384 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 26 15:43:13 crc kubenswrapper[4907]: I0226 15:43:13.127479 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 26 15:43:13 crc kubenswrapper[4907]: I0226 15:43:13.127506 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 26 15:43:13 crc kubenswrapper[4907]: I0226 15:43:13.128680 4907 scope.go:117] "RemoveContainer" containerID="8b89ef0ed86ca2976c210fdcdc9f594c5ba8738929e96fe41a2a26be4979caae" Feb 26 15:43:13 crc kubenswrapper[4907]: E0226 15:43:13.129091 4907 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-apiserver-check-endpoints\" with CrashLoopBackOff: \"back-off 20s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\"" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" Feb 26 15:43:13 crc kubenswrapper[4907]: E0226 15:43:13.689141 4907 controller.go:145] "Failed to ensure lease exists, will retry" err="leases.coordination.k8s.io \"crc\" is forbidden: User \"system:anonymous\" cannot get resource \"leases\" in API group \"coordination.k8s.io\" in the namespace \"kube-node-lease\"" interval="7s" Feb 26 15:43:13 crc kubenswrapper[4907]: I0226 15:43:13.700193 4907 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Feb 26 15:43:13 crc kubenswrapper[4907]: I0226 15:43:13.702013 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 26 15:43:13 crc kubenswrapper[4907]: I0226 15:43:13.702073 4907 
kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 26 15:43:13 crc kubenswrapper[4907]: I0226 15:43:13.702085 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 26 15:43:13 crc kubenswrapper[4907]: I0226 15:43:13.702120 4907 kubelet_node_status.go:76] "Attempting to register node" node="crc" Feb 26 15:43:13 crc kubenswrapper[4907]: E0226 15:43:13.709450 4907 kubelet_node_status.go:99] "Unable to register node with API server" err="nodes is forbidden: User \"system:anonymous\" cannot create resource \"nodes\" in API group \"\" at the cluster scope" node="crc" Feb 26 15:43:14 crc kubenswrapper[4907]: I0226 15:43:14.076997 4907 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Feb 26 15:43:14 crc kubenswrapper[4907]: I0226 15:43:14.644076 4907 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Feb 26 15:43:14 crc kubenswrapper[4907]: I0226 15:43:14.644514 4907 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Feb 26 15:43:14 crc kubenswrapper[4907]: I0226 15:43:14.645489 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 26 15:43:14 crc kubenswrapper[4907]: I0226 15:43:14.645510 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 26 15:43:14 crc kubenswrapper[4907]: I0226 15:43:14.645518 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 26 15:43:15 crc kubenswrapper[4907]: I0226 15:43:15.072983 4907 csi_plugin.go:884] Failed to contact API server when waiting for 
CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Feb 26 15:43:16 crc kubenswrapper[4907]: I0226 15:43:16.073227 4907 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Feb 26 15:43:17 crc kubenswrapper[4907]: I0226 15:43:17.074695 4907 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Feb 26 15:43:17 crc kubenswrapper[4907]: I0226 15:43:17.996212 4907 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Feb 26 15:43:17 crc kubenswrapper[4907]: I0226 15:43:17.996480 4907 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Feb 26 15:43:17 crc kubenswrapper[4907]: I0226 15:43:17.998162 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 26 15:43:17 crc kubenswrapper[4907]: I0226 15:43:17.998213 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 26 15:43:17 crc kubenswrapper[4907]: I0226 15:43:17.998229 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 26 15:43:18 crc kubenswrapper[4907]: I0226 15:43:18.004107 4907 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Feb 26 15:43:18 crc kubenswrapper[4907]: I0226 15:43:18.075504 4907 csi_plugin.go:884] Failed to contact API server when waiting for 
CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Feb 26 15:43:18 crc kubenswrapper[4907]: E0226 15:43:18.200296 4907 eviction_manager.go:285] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"crc\" not found" Feb 26 15:43:18 crc kubenswrapper[4907]: I0226 15:43:18.405501 4907 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Feb 26 15:43:18 crc kubenswrapper[4907]: I0226 15:43:18.406308 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 26 15:43:18 crc kubenswrapper[4907]: I0226 15:43:18.406428 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 26 15:43:18 crc kubenswrapper[4907]: I0226 15:43:18.406519 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 26 15:43:19 crc kubenswrapper[4907]: I0226 15:43:19.074261 4907 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Feb 26 15:43:20 crc kubenswrapper[4907]: I0226 15:43:20.075843 4907 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Feb 26 15:43:20 crc kubenswrapper[4907]: E0226 15:43:20.691264 4907 controller.go:145] "Failed to ensure lease exists, will retry" err="leases.coordination.k8s.io \"crc\" is forbidden: User \"system:anonymous\" cannot get resource \"leases\" in API group \"coordination.k8s.io\" in the namespace \"kube-node-lease\"" interval="7s" Feb 26 15:43:20 
crc kubenswrapper[4907]: I0226 15:43:20.710296 4907 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Feb 26 15:43:20 crc kubenswrapper[4907]: I0226 15:43:20.711952 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 26 15:43:20 crc kubenswrapper[4907]: I0226 15:43:20.711987 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 26 15:43:20 crc kubenswrapper[4907]: I0226 15:43:20.712000 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 26 15:43:20 crc kubenswrapper[4907]: I0226 15:43:20.712046 4907 kubelet_node_status.go:76] "Attempting to register node" node="crc" Feb 26 15:43:20 crc kubenswrapper[4907]: E0226 15:43:20.716507 4907 kubelet_node_status.go:99] "Unable to register node with API server" err="nodes is forbidden: User \"system:anonymous\" cannot create resource \"nodes\" in API group \"\" at the cluster scope" node="crc" Feb 26 15:43:21 crc kubenswrapper[4907]: I0226 15:43:21.076668 4907 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Feb 26 15:43:22 crc kubenswrapper[4907]: I0226 15:43:22.074368 4907 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Feb 26 15:43:23 crc kubenswrapper[4907]: I0226 15:43:23.073944 4907 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Feb 26 15:43:24 
crc kubenswrapper[4907]: I0226 15:43:24.075930 4907 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Feb 26 15:43:25 crc kubenswrapper[4907]: I0226 15:43:25.074555 4907 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Feb 26 15:43:26 crc kubenswrapper[4907]: I0226 15:43:26.075791 4907 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Feb 26 15:43:26 crc kubenswrapper[4907]: I0226 15:43:26.126151 4907 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Feb 26 15:43:26 crc kubenswrapper[4907]: I0226 15:43:26.127640 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 26 15:43:26 crc kubenswrapper[4907]: I0226 15:43:26.127825 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 26 15:43:26 crc kubenswrapper[4907]: I0226 15:43:26.127896 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 26 15:43:26 crc kubenswrapper[4907]: I0226 15:43:26.128751 4907 scope.go:117] "RemoveContainer" containerID="8b89ef0ed86ca2976c210fdcdc9f594c5ba8738929e96fe41a2a26be4979caae" Feb 26 15:43:26 crc kubenswrapper[4907]: I0226 15:43:26.425988 4907 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/2.log" Feb 
26 15:43:26 crc kubenswrapper[4907]: I0226 15:43:26.429191 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"bede3ef68147485c0c4f3e38fc427d20f05c7814899efa423206af42b4509bc3"} Feb 26 15:43:26 crc kubenswrapper[4907]: I0226 15:43:26.429397 4907 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Feb 26 15:43:26 crc kubenswrapper[4907]: I0226 15:43:26.430758 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 26 15:43:26 crc kubenswrapper[4907]: I0226 15:43:26.430799 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 26 15:43:26 crc kubenswrapper[4907]: I0226 15:43:26.430816 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 26 15:43:27 crc kubenswrapper[4907]: I0226 15:43:27.073453 4907 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Feb 26 15:43:27 crc kubenswrapper[4907]: I0226 15:43:27.155232 4907 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-apiserver/kube-apiserver-crc" Feb 26 15:43:27 crc kubenswrapper[4907]: I0226 15:43:27.432400 4907 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/3.log" Feb 26 15:43:27 crc kubenswrapper[4907]: I0226 15:43:27.432947 4907 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/2.log" Feb 26 15:43:27 crc 
kubenswrapper[4907]: I0226 15:43:27.434540 4907 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="bede3ef68147485c0c4f3e38fc427d20f05c7814899efa423206af42b4509bc3" exitCode=255 Feb 26 15:43:27 crc kubenswrapper[4907]: I0226 15:43:27.434580 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerDied","Data":"bede3ef68147485c0c4f3e38fc427d20f05c7814899efa423206af42b4509bc3"} Feb 26 15:43:27 crc kubenswrapper[4907]: I0226 15:43:27.434645 4907 scope.go:117] "RemoveContainer" containerID="8b89ef0ed86ca2976c210fdcdc9f594c5ba8738929e96fe41a2a26be4979caae" Feb 26 15:43:27 crc kubenswrapper[4907]: I0226 15:43:27.434674 4907 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Feb 26 15:43:27 crc kubenswrapper[4907]: I0226 15:43:27.435500 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 26 15:43:27 crc kubenswrapper[4907]: I0226 15:43:27.435534 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 26 15:43:27 crc kubenswrapper[4907]: I0226 15:43:27.435545 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 26 15:43:27 crc kubenswrapper[4907]: I0226 15:43:27.436122 4907 scope.go:117] "RemoveContainer" containerID="bede3ef68147485c0c4f3e38fc427d20f05c7814899efa423206af42b4509bc3" Feb 26 15:43:27 crc kubenswrapper[4907]: E0226 15:43:27.436326 4907 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-apiserver-check-endpoints\" with CrashLoopBackOff: \"back-off 40s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\"" 
pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" Feb 26 15:43:27 crc kubenswrapper[4907]: E0226 15:43:27.698036 4907 controller.go:145] "Failed to ensure lease exists, will retry" err="leases.coordination.k8s.io \"crc\" is forbidden: User \"system:anonymous\" cannot get resource \"leases\" in API group \"coordination.k8s.io\" in the namespace \"kube-node-lease\"" interval="7s" Feb 26 15:43:27 crc kubenswrapper[4907]: I0226 15:43:27.717151 4907 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Feb 26 15:43:27 crc kubenswrapper[4907]: I0226 15:43:27.718446 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 26 15:43:27 crc kubenswrapper[4907]: I0226 15:43:27.718471 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 26 15:43:27 crc kubenswrapper[4907]: I0226 15:43:27.718480 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 26 15:43:27 crc kubenswrapper[4907]: I0226 15:43:27.718499 4907 kubelet_node_status.go:76] "Attempting to register node" node="crc" Feb 26 15:43:27 crc kubenswrapper[4907]: E0226 15:43:27.723139 4907 kubelet_node_status.go:99] "Unable to register node with API server" err="nodes is forbidden: User \"system:anonymous\" cannot create resource \"nodes\" in API group \"\" at the cluster scope" node="crc" Feb 26 15:43:28 crc kubenswrapper[4907]: I0226 15:43:28.073038 4907 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Feb 26 15:43:28 crc kubenswrapper[4907]: E0226 15:43:28.200442 4907 eviction_manager.go:285] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"crc\" 
not found" Feb 26 15:43:28 crc kubenswrapper[4907]: I0226 15:43:28.438240 4907 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/3.log" Feb 26 15:43:28 crc kubenswrapper[4907]: I0226 15:43:28.441450 4907 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Feb 26 15:43:28 crc kubenswrapper[4907]: I0226 15:43:28.442372 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 26 15:43:28 crc kubenswrapper[4907]: I0226 15:43:28.442431 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 26 15:43:28 crc kubenswrapper[4907]: I0226 15:43:28.442449 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 26 15:43:28 crc kubenswrapper[4907]: I0226 15:43:28.443280 4907 scope.go:117] "RemoveContainer" containerID="bede3ef68147485c0c4f3e38fc427d20f05c7814899efa423206af42b4509bc3" Feb 26 15:43:28 crc kubenswrapper[4907]: E0226 15:43:28.443546 4907 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-apiserver-check-endpoints\" with CrashLoopBackOff: \"back-off 40s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\"" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" Feb 26 15:43:28 crc kubenswrapper[4907]: I0226 15:43:28.928508 4907 certificate_manager.go:356] kubernetes.io/kube-apiserver-client-kubelet: Rotating certificates Feb 26 15:43:28 crc kubenswrapper[4907]: I0226 15:43:28.947109 4907 reflector.go:368] Caches populated for *v1.CertificateSigningRequest from k8s.io/client-go/tools/watch/informerwatcher.go:146 Feb 26 15:43:29 crc kubenswrapper[4907]: I0226 15:43:29.076657 
4907 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Feb 26 15:43:30 crc kubenswrapper[4907]: I0226 15:43:30.077417 4907 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Feb 26 15:43:30 crc kubenswrapper[4907]: I0226 15:43:30.647384 4907 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-kube-apiserver/kube-apiserver-crc" Feb 26 15:43:30 crc kubenswrapper[4907]: I0226 15:43:30.647531 4907 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Feb 26 15:43:30 crc kubenswrapper[4907]: I0226 15:43:30.648667 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 26 15:43:30 crc kubenswrapper[4907]: I0226 15:43:30.648714 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 26 15:43:30 crc kubenswrapper[4907]: I0226 15:43:30.648728 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 26 15:43:30 crc kubenswrapper[4907]: I0226 15:43:30.649292 4907 scope.go:117] "RemoveContainer" containerID="bede3ef68147485c0c4f3e38fc427d20f05c7814899efa423206af42b4509bc3" Feb 26 15:43:30 crc kubenswrapper[4907]: E0226 15:43:30.649481 4907 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-apiserver-check-endpoints\" with CrashLoopBackOff: \"back-off 40s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\"" 
pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" Feb 26 15:43:31 crc kubenswrapper[4907]: I0226 15:43:31.074683 4907 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Feb 26 15:43:32 crc kubenswrapper[4907]: I0226 15:43:32.076462 4907 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Feb 26 15:43:32 crc kubenswrapper[4907]: W0226 15:43:32.247363 4907 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Node: nodes "crc" is forbidden: User "system:anonymous" cannot list resource "nodes" in API group "" at the cluster scope Feb 26 15:43:32 crc kubenswrapper[4907]: E0226 15:43:32.247417 4907 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Node: failed to list *v1.Node: nodes \"crc\" is forbidden: User \"system:anonymous\" cannot list resource \"nodes\" in API group \"\" at the cluster scope" logger="UnhandledError" Feb 26 15:43:33 crc kubenswrapper[4907]: I0226 15:43:33.033458 4907 csr.go:261] certificate signing request csr-kvssp is approved, waiting to be issued Feb 26 15:43:33 crc kubenswrapper[4907]: I0226 15:43:33.048019 4907 csr.go:257] certificate signing request csr-kvssp is issued Feb 26 15:43:33 crc kubenswrapper[4907]: I0226 15:43:33.149929 4907 reconstruct.go:205] "DevicePaths of reconstructed volumes updated" Feb 26 15:43:33 crc kubenswrapper[4907]: I0226 15:43:33.929934 4907 transport.go:147] "Certificate rotation detected, shutting down client connections to start using new credentials" Feb 26 15:43:34 crc kubenswrapper[4907]: I0226 15:43:34.050268 4907 
certificate_manager.go:356] kubernetes.io/kube-apiserver-client-kubelet: Certificate expiration is 2027-02-24 05:54:36 +0000 UTC, rotation deadline is 2026-12-31 00:38:40.319983184 +0000 UTC Feb 26 15:43:34 crc kubenswrapper[4907]: I0226 15:43:34.050318 4907 certificate_manager.go:356] kubernetes.io/kube-apiserver-client-kubelet: Waiting 7376h55m6.269669864s for next certificate rotation Feb 26 15:43:34 crc kubenswrapper[4907]: I0226 15:43:34.723836 4907 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Feb 26 15:43:34 crc kubenswrapper[4907]: I0226 15:43:34.725252 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 26 15:43:34 crc kubenswrapper[4907]: I0226 15:43:34.725297 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 26 15:43:34 crc kubenswrapper[4907]: I0226 15:43:34.725307 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 26 15:43:34 crc kubenswrapper[4907]: I0226 15:43:34.725466 4907 kubelet_node_status.go:76] "Attempting to register node" node="crc" Feb 26 15:43:34 crc kubenswrapper[4907]: I0226 15:43:34.735094 4907 kubelet_node_status.go:115] "Node was previously registered" node="crc" Feb 26 15:43:34 crc kubenswrapper[4907]: I0226 15:43:34.735413 4907 kubelet_node_status.go:79] "Successfully registered node" node="crc" Feb 26 15:43:34 crc kubenswrapper[4907]: E0226 15:43:34.735438 4907 kubelet_node_status.go:585] "Error updating node status, will retry" err="error getting node \"crc\": node \"crc\" not found" Feb 26 15:43:34 crc kubenswrapper[4907]: I0226 15:43:34.739523 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 26 15:43:34 crc kubenswrapper[4907]: I0226 15:43:34.739559 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasNoDiskPressure" Feb 26 15:43:34 crc kubenswrapper[4907]: I0226 15:43:34.739571 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 26 15:43:34 crc kubenswrapper[4907]: I0226 15:43:34.739609 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 26 15:43:34 crc kubenswrapper[4907]: I0226 15:43:34.739624 4907 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-26T15:43:34Z","lastTransitionTime":"2026-02-26T15:43:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 26 15:43:34 crc kubenswrapper[4907]: E0226 15:43:34.755431 4907 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"7800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"24148068Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"8\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"24608868Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-02-26T15:43:34Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-26T15:43:34Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-26T15:43:34Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-26T15:43:34Z\\\",\\\"message\\\":\\\"kubelet has no disk 
pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-26T15:43:34Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-26T15:43:34Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-26T15:43:34Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-26T15:43:34Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2
ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9810067
4616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.
io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a07
2c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa73
83b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"16aec221-b9ec-4b79-ac12-986d05cb9b8b\\\",\\\"systemUUID\\\":\\\"7af7b453-01c3-4b8b-8c30-b1df8ce070ce\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 26 15:43:34 crc kubenswrapper[4907]: I0226 15:43:34.765869 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 26 15:43:34 crc kubenswrapper[4907]: I0226 15:43:34.765917 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 26 15:43:34 crc kubenswrapper[4907]: I0226 15:43:34.765932 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 26 15:43:34 crc kubenswrapper[4907]: I0226 15:43:34.765950 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 26 15:43:34 crc kubenswrapper[4907]: I0226 15:43:34.765967 4907 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-26T15:43:34Z","lastTransitionTime":"2026-02-26T15:43:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false 
reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 26 15:43:34 crc kubenswrapper[4907]: E0226 15:43:34.782340 4907 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"7800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"24148068Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"8\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"24608868Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-02-26T15:43:34Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-26T15:43:34Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-26T15:43:34Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-26T15:43:34Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-26T15:43:34Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-26T15:43:34Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-26T15:43:34Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-26T15:43:34Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"16aec221-b9ec-4b79-ac12-986d05cb9b8b\\\",\\\"systemUUID\\\":\\\"7af7b453-01c3-4b8b-8c30-b1df8ce070ce\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 26 15:43:34 crc kubenswrapper[4907]: I0226 15:43:34.792654 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 26 15:43:34 crc kubenswrapper[4907]: I0226 15:43:34.792815 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 26 15:43:34 crc kubenswrapper[4907]: I0226 15:43:34.792904 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 26 15:43:34 crc kubenswrapper[4907]: I0226 15:43:34.792986 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 26 15:43:34 crc kubenswrapper[4907]: I0226 15:43:34.793146 4907 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-26T15:43:34Z","lastTransitionTime":"2026-02-26T15:43:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 26 15:43:34 crc kubenswrapper[4907]: I0226 15:43:34.817270 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 26 15:43:34 crc kubenswrapper[4907]: I0226 15:43:34.817325 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 26 15:43:34 crc kubenswrapper[4907]: I0226 15:43:34.817343 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 26 15:43:34 crc kubenswrapper[4907]: I0226 15:43:34.817365 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 26 15:43:34 crc kubenswrapper[4907]: I0226 15:43:34.817383 4907 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-26T15:43:34Z","lastTransitionTime":"2026-02-26T15:43:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 26 15:43:34 crc kubenswrapper[4907]: E0226 15:43:34.830240 4907 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"7800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"24148068Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"8\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"24608868Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-02-26T15:43:34Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-26T15:43:34Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-26T15:43:34Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-26T15:43:34Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-26T15:43:34Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-26T15:43:34Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-26T15:43:34Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-26T15:43:34Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"16aec221-b9ec-4b79-ac12-986d05cb9b8b\\\",\\\"systemUUID\\\":\\\"7af7b453-01c3-4b8b-8c30-b1df8ce070ce\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 26 15:43:34 crc kubenswrapper[4907]: E0226 15:43:34.830754 4907 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Feb 26 15:43:34 crc kubenswrapper[4907]: E0226 15:43:34.830912 4907 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 26 15:43:34 crc kubenswrapper[4907]: E0226 15:43:34.931914 4907 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 26 15:43:35 crc kubenswrapper[4907]: E0226 15:43:35.033027 4907 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 26 15:43:35 crc kubenswrapper[4907]: E0226 15:43:35.133170 4907 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 26 15:43:35 crc kubenswrapper[4907]: E0226 15:43:35.234319 4907 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 26 15:43:35 crc kubenswrapper[4907]: E0226 15:43:35.334849 4907 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 26 15:43:35 crc kubenswrapper[4907]: E0226 15:43:35.435784 4907 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 26 15:43:35 crc kubenswrapper[4907]: E0226 15:43:35.536546 4907 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 26 15:43:35 crc kubenswrapper[4907]: E0226 15:43:35.637705 4907 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 26 15:43:35 crc kubenswrapper[4907]: E0226 15:43:35.738072 4907 
kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 26 15:43:35 crc kubenswrapper[4907]: E0226 15:43:35.839247 4907 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 26 15:43:35 crc kubenswrapper[4907]: E0226 15:43:35.940314 4907 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 26 15:43:36 crc kubenswrapper[4907]: E0226 15:43:36.041088 4907 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 26 15:43:36 crc kubenswrapper[4907]: E0226 15:43:36.141619 4907 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 26 15:43:36 crc kubenswrapper[4907]: E0226 15:43:36.242030 4907 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 26 15:43:36 crc kubenswrapper[4907]: E0226 15:43:36.342323 4907 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 26 15:43:36 crc kubenswrapper[4907]: E0226 15:43:36.443342 4907 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 26 15:43:36 crc kubenswrapper[4907]: E0226 15:43:36.544404 4907 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 26 15:43:36 crc kubenswrapper[4907]: E0226 15:43:36.644791 4907 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 26 15:43:36 crc kubenswrapper[4907]: E0226 15:43:36.745684 4907 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 26 15:43:36 crc kubenswrapper[4907]: I0226 15:43:36.782055 4907 reflector.go:368] Caches populated for *v1.CSIDriver from k8s.io/client-go/informers/factory.go:160 Feb 26 15:43:36 crc 
kubenswrapper[4907]: E0226 15:43:36.846171 4907 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 26 15:43:36 crc kubenswrapper[4907]: E0226 15:43:36.947253 4907 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 26 15:43:37 crc kubenswrapper[4907]: E0226 15:43:37.048561 4907 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 26 15:43:37 crc kubenswrapper[4907]: E0226 15:43:37.150284 4907 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 26 15:43:37 crc kubenswrapper[4907]: E0226 15:43:37.251221 4907 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 26 15:43:37 crc kubenswrapper[4907]: E0226 15:43:37.352362 4907 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 26 15:43:37 crc kubenswrapper[4907]: E0226 15:43:37.452786 4907 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 26 15:43:37 crc kubenswrapper[4907]: E0226 15:43:37.553326 4907 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 26 15:43:37 crc kubenswrapper[4907]: E0226 15:43:37.653696 4907 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 26 15:43:37 crc kubenswrapper[4907]: E0226 15:43:37.754739 4907 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 26 15:43:37 crc kubenswrapper[4907]: E0226 15:43:37.855044 4907 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 26 15:43:37 crc kubenswrapper[4907]: E0226 15:43:37.956269 4907 kubelet_node_status.go:503] "Error getting the current node from lister" 
err="node \"crc\" not found" Feb 26 15:43:38 crc kubenswrapper[4907]: E0226 15:43:38.056705 4907 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 26 15:43:38 crc kubenswrapper[4907]: E0226 15:43:38.157623 4907 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 26 15:43:38 crc kubenswrapper[4907]: E0226 15:43:38.201789 4907 eviction_manager.go:285] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"crc\" not found" Feb 26 15:43:38 crc kubenswrapper[4907]: E0226 15:43:38.258624 4907 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 26 15:43:38 crc kubenswrapper[4907]: E0226 15:43:38.359682 4907 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 26 15:43:38 crc kubenswrapper[4907]: E0226 15:43:38.459849 4907 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 26 15:43:38 crc kubenswrapper[4907]: E0226 15:43:38.560582 4907 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 26 15:43:38 crc kubenswrapper[4907]: E0226 15:43:38.661686 4907 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 26 15:43:38 crc kubenswrapper[4907]: E0226 15:43:38.762288 4907 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 26 15:43:38 crc kubenswrapper[4907]: E0226 15:43:38.862818 4907 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 26 15:43:38 crc kubenswrapper[4907]: E0226 15:43:38.963126 4907 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 26 15:43:39 crc kubenswrapper[4907]: E0226 15:43:39.064155 4907 
kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 26 15:43:39 crc kubenswrapper[4907]: E0226 15:43:39.164964 4907 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 26 15:43:39 crc kubenswrapper[4907]: E0226 15:43:39.265982 4907 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 26 15:43:39 crc kubenswrapper[4907]: E0226 15:43:39.367036 4907 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 26 15:43:39 crc kubenswrapper[4907]: E0226 15:43:39.467521 4907 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 26 15:43:39 crc kubenswrapper[4907]: E0226 15:43:39.568307 4907 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 26 15:43:39 crc kubenswrapper[4907]: E0226 15:43:39.669304 4907 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 26 15:43:39 crc kubenswrapper[4907]: E0226 15:43:39.770140 4907 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 26 15:43:39 crc kubenswrapper[4907]: E0226 15:43:39.870956 4907 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 26 15:43:39 crc kubenswrapper[4907]: E0226 15:43:39.971765 4907 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 26 15:43:40 crc kubenswrapper[4907]: E0226 15:43:40.072525 4907 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 26 15:43:40 crc kubenswrapper[4907]: E0226 15:43:40.172849 4907 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 26 15:43:40 crc 
kubenswrapper[4907]: E0226 15:43:40.273736 4907 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 26 15:43:40 crc kubenswrapper[4907]: E0226 15:43:40.374354 4907 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 26 15:43:40 crc kubenswrapper[4907]: E0226 15:43:40.474735 4907 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 26 15:43:40 crc kubenswrapper[4907]: E0226 15:43:40.575610 4907 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 26 15:43:40 crc kubenswrapper[4907]: E0226 15:43:40.676690 4907 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 26 15:43:40 crc kubenswrapper[4907]: E0226 15:43:40.777277 4907 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 26 15:43:40 crc kubenswrapper[4907]: E0226 15:43:40.878195 4907 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 26 15:43:40 crc kubenswrapper[4907]: E0226 15:43:40.978565 4907 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 26 15:43:41 crc kubenswrapper[4907]: E0226 15:43:41.078775 4907 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 26 15:43:41 crc kubenswrapper[4907]: E0226 15:43:41.179448 4907 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 26 15:43:41 crc kubenswrapper[4907]: E0226 15:43:41.280222 4907 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 26 15:43:41 crc kubenswrapper[4907]: E0226 15:43:41.381183 4907 kubelet_node_status.go:503] "Error getting the current node from lister" 
err="node \"crc\" not found" Feb 26 15:43:41 crc kubenswrapper[4907]: E0226 15:43:41.481889 4907 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 26 15:43:41 crc kubenswrapper[4907]: E0226 15:43:41.583205 4907 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 26 15:43:41 crc kubenswrapper[4907]: E0226 15:43:41.684302 4907 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 26 15:43:41 crc kubenswrapper[4907]: E0226 15:43:41.785021 4907 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 26 15:43:41 crc kubenswrapper[4907]: E0226 15:43:41.886026 4907 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 26 15:43:41 crc kubenswrapper[4907]: E0226 15:43:41.986402 4907 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 26 15:43:42 crc kubenswrapper[4907]: E0226 15:43:42.087074 4907 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 26 15:43:42 crc kubenswrapper[4907]: I0226 15:43:42.126051 4907 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Feb 26 15:43:42 crc kubenswrapper[4907]: I0226 15:43:42.127411 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 26 15:43:42 crc kubenswrapper[4907]: I0226 15:43:42.127440 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 26 15:43:42 crc kubenswrapper[4907]: I0226 15:43:42.127450 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 26 15:43:42 crc kubenswrapper[4907]: E0226 15:43:42.187429 4907 kubelet_node_status.go:503] 
"Error getting the current node from lister" err="node \"crc\" not found" Feb 26 15:43:42 crc kubenswrapper[4907]: I0226 15:43:42.230538 4907 reflector.go:368] Caches populated for *v1.Service from k8s.io/client-go/informers/factory.go:160 Feb 26 15:43:42 crc kubenswrapper[4907]: E0226 15:43:42.287847 4907 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 26 15:43:42 crc kubenswrapper[4907]: E0226 15:43:42.388483 4907 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 26 15:43:42 crc kubenswrapper[4907]: E0226 15:43:42.489575 4907 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 26 15:43:42 crc kubenswrapper[4907]: E0226 15:43:42.590011 4907 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 26 15:43:42 crc kubenswrapper[4907]: E0226 15:43:42.690967 4907 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 26 15:43:42 crc kubenswrapper[4907]: E0226 15:43:42.792362 4907 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 26 15:43:42 crc kubenswrapper[4907]: E0226 15:43:42.892947 4907 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 26 15:43:42 crc kubenswrapper[4907]: E0226 15:43:42.993209 4907 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 26 15:43:43 crc kubenswrapper[4907]: E0226 15:43:43.093690 4907 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 26 15:43:43 crc kubenswrapper[4907]: I0226 15:43:43.126616 4907 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Feb 26 15:43:43 crc kubenswrapper[4907]: I0226 15:43:43.127623 
4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 26 15:43:43 crc kubenswrapper[4907]: I0226 15:43:43.127647 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 26 15:43:43 crc kubenswrapper[4907]: I0226 15:43:43.127659 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 26 15:43:43 crc kubenswrapper[4907]: I0226 15:43:43.128215 4907 scope.go:117] "RemoveContainer" containerID="bede3ef68147485c0c4f3e38fc427d20f05c7814899efa423206af42b4509bc3" Feb 26 15:43:43 crc kubenswrapper[4907]: E0226 15:43:43.128397 4907 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-apiserver-check-endpoints\" with CrashLoopBackOff: \"back-off 40s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\"" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" Feb 26 15:43:43 crc kubenswrapper[4907]: E0226 15:43:43.193844 4907 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 26 15:43:43 crc kubenswrapper[4907]: E0226 15:43:43.294624 4907 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 26 15:43:43 crc kubenswrapper[4907]: E0226 15:43:43.395311 4907 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 26 15:43:43 crc kubenswrapper[4907]: E0226 15:43:43.495864 4907 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 26 15:43:43 crc kubenswrapper[4907]: E0226 15:43:43.597040 4907 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 26 15:43:43 crc kubenswrapper[4907]: E0226 
15:43:43.698134 4907 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 26 15:43:43 crc kubenswrapper[4907]: E0226 15:43:43.798516 4907 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 26 15:43:43 crc kubenswrapper[4907]: E0226 15:43:43.899543 4907 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 26 15:43:43 crc kubenswrapper[4907]: E0226 15:43:43.999772 4907 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 26 15:43:44 crc kubenswrapper[4907]: E0226 15:43:44.100842 4907 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 26 15:43:44 crc kubenswrapper[4907]: E0226 15:43:44.201689 4907 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 26 15:43:44 crc kubenswrapper[4907]: E0226 15:43:44.301889 4907 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 26 15:43:44 crc kubenswrapper[4907]: E0226 15:43:44.402407 4907 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 26 15:43:44 crc kubenswrapper[4907]: E0226 15:43:44.503186 4907 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 26 15:43:44 crc kubenswrapper[4907]: E0226 15:43:44.603903 4907 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 26 15:43:44 crc kubenswrapper[4907]: E0226 15:43:44.704285 4907 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 26 15:43:44 crc kubenswrapper[4907]: E0226 15:43:44.804848 4907 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 26 
15:43:44 crc kubenswrapper[4907]: E0226 15:43:44.906309 4907 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 26 15:43:44 crc kubenswrapper[4907]: E0226 15:43:44.970533 4907 kubelet_node_status.go:585] "Error updating node status, will retry" err="error getting node \"crc\": node \"crc\" not found" Feb 26 15:43:44 crc kubenswrapper[4907]: I0226 15:43:44.974865 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 26 15:43:44 crc kubenswrapper[4907]: I0226 15:43:44.975047 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 26 15:43:44 crc kubenswrapper[4907]: I0226 15:43:44.975176 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 26 15:43:44 crc kubenswrapper[4907]: I0226 15:43:44.975295 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 26 15:43:44 crc kubenswrapper[4907]: I0226 15:43:44.975418 4907 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-26T15:43:44Z","lastTransitionTime":"2026-02-26T15:43:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 26 15:43:44 crc kubenswrapper[4907]: E0226 15:43:44.987962 4907 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"7800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"24148068Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"8\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"24608868Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-02-26T15:43:44Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-26T15:43:44Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-26T15:43:44Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-26T15:43:44Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-26T15:43:44Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-26T15:43:44Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-26T15:43:44Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-26T15:43:44Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"16aec221-b9ec-4b79-ac12-986d05cb9b8b\\\",\\\"systemUUID\\\":\\\"7af7b453-01c3-4b8b-8c30-b1df8ce070ce\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 26 15:43:44 crc kubenswrapper[4907]: I0226 15:43:44.993169 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 26 15:43:44 crc kubenswrapper[4907]: I0226 15:43:44.993198 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 26 15:43:44 crc kubenswrapper[4907]: I0226 15:43:44.993209 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 26 15:43:44 crc kubenswrapper[4907]: I0226 15:43:44.993224 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 26 15:43:44 crc kubenswrapper[4907]: I0226 15:43:44.993235 4907 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-26T15:43:44Z","lastTransitionTime":"2026-02-26T15:43:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 26 15:43:45 crc kubenswrapper[4907]: E0226 15:43:45.003471 4907 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"7800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"24148068Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"8\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"24608868Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-02-26T15:43:44Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-26T15:43:44Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-26T15:43:44Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-26T15:43:44Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-26T15:43:44Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-26T15:43:44Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-26T15:43:44Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-26T15:43:44Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"16aec221-b9ec-4b79-ac12-986d05cb9b8b\\\",\\\"systemUUID\\\":\\\"7af7b453-01c3-4b8b-8c30-b1df8ce070ce\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 26 15:43:45 crc kubenswrapper[4907]: I0226 15:43:45.007174 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 26 15:43:45 crc kubenswrapper[4907]: I0226 15:43:45.007200 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 26 15:43:45 crc kubenswrapper[4907]: I0226 15:43:45.007210 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 26 15:43:45 crc kubenswrapper[4907]: I0226 15:43:45.007220 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 26 15:43:45 crc kubenswrapper[4907]: I0226 15:43:45.007229 4907 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-26T15:43:45Z","lastTransitionTime":"2026-02-26T15:43:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 26 15:43:45 crc kubenswrapper[4907]: E0226 15:43:45.017041 4907 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"7800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"24148068Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"8\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"24608868Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-02-26T15:43:45Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-26T15:43:45Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-26T15:43:45Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-26T15:43:45Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-26T15:43:45Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-26T15:43:45Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-26T15:43:45Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-26T15:43:45Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"16aec221-b9ec-4b79-ac12-986d05cb9b8b\\\",\\\"systemUUID\\\":\\\"7af7b453-01c3-4b8b-8c30-b1df8ce070ce\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 26 15:43:45 crc kubenswrapper[4907]: I0226 15:43:45.020893 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 26 15:43:45 crc kubenswrapper[4907]: I0226 15:43:45.020919 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 26 15:43:45 crc kubenswrapper[4907]: I0226 15:43:45.020927 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 26 15:43:45 crc kubenswrapper[4907]: I0226 15:43:45.020940 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 26 15:43:45 crc kubenswrapper[4907]: I0226 15:43:45.020949 4907 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-26T15:43:45Z","lastTransitionTime":"2026-02-26T15:43:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 26 15:43:45 crc kubenswrapper[4907]: E0226 15:43:45.032463 4907 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"7800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"24148068Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"8\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"24608868Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-02-26T15:43:45Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-26T15:43:45Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-26T15:43:45Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-26T15:43:45Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-26T15:43:45Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-26T15:43:45Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-26T15:43:45Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-26T15:43:45Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"16aec221-b9ec-4b79-ac12-986d05cb9b8b\\\",\\\"systemUUID\\\":\\\"7af7b453-01c3-4b8b-8c30-b1df8ce070ce\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 26 15:43:45 crc kubenswrapper[4907]: E0226 15:43:45.032849 4907 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Feb 26 15:43:45 crc kubenswrapper[4907]: E0226 15:43:45.032900 4907 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 26 15:43:45 crc kubenswrapper[4907]: E0226 15:43:45.133201 4907 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 26 15:43:45 crc kubenswrapper[4907]: E0226 15:43:45.233890 4907 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 26 15:43:45 crc kubenswrapper[4907]: I0226 15:43:45.246888 4907 reflector.go:368] Caches populated for *v1.RuntimeClass from k8s.io/client-go/informers/factory.go:160 Feb 26 15:43:45 crc kubenswrapper[4907]: E0226 15:43:45.334651 4907 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 26 15:43:45 crc kubenswrapper[4907]: E0226 15:43:45.435091 4907 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 26 15:43:45 crc kubenswrapper[4907]: E0226 15:43:45.535966 4907 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 26 15:43:45 crc kubenswrapper[4907]: E0226 15:43:45.636988 4907 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 26 15:43:45 crc kubenswrapper[4907]: E0226 15:43:45.737945 4907 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 26 15:43:45 crc kubenswrapper[4907]: E0226 15:43:45.838450 
4907 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 26 15:43:45 crc kubenswrapper[4907]: E0226 15:43:45.939203 4907 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 26 15:43:46 crc kubenswrapper[4907]: E0226 15:43:46.040298 4907 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 26 15:43:46 crc kubenswrapper[4907]: E0226 15:43:46.140814 4907 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 26 15:43:46 crc kubenswrapper[4907]: E0226 15:43:46.241539 4907 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 26 15:43:46 crc kubenswrapper[4907]: E0226 15:43:46.342133 4907 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 26 15:43:46 crc kubenswrapper[4907]: E0226 15:43:46.442637 4907 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 26 15:43:46 crc kubenswrapper[4907]: E0226 15:43:46.543177 4907 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 26 15:43:46 crc kubenswrapper[4907]: E0226 15:43:46.644121 4907 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 26 15:43:46 crc kubenswrapper[4907]: E0226 15:43:46.745248 4907 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 26 15:43:46 crc kubenswrapper[4907]: E0226 15:43:46.846266 4907 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 26 15:43:46 crc kubenswrapper[4907]: E0226 15:43:46.947238 4907 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 26 15:43:47 crc 
kubenswrapper[4907]: E0226 15:43:47.048235 4907 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 26 15:43:47 crc kubenswrapper[4907]: E0226 15:43:47.148377 4907 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 26 15:43:47 crc kubenswrapper[4907]: E0226 15:43:47.248633 4907 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 26 15:43:47 crc kubenswrapper[4907]: E0226 15:43:47.349136 4907 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 26 15:43:47 crc kubenswrapper[4907]: E0226 15:43:47.449768 4907 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 26 15:43:47 crc kubenswrapper[4907]: E0226 15:43:47.550224 4907 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 26 15:43:47 crc kubenswrapper[4907]: E0226 15:43:47.651253 4907 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 26 15:43:47 crc kubenswrapper[4907]: E0226 15:43:47.752205 4907 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 26 15:43:47 crc kubenswrapper[4907]: E0226 15:43:47.853064 4907 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 26 15:43:47 crc kubenswrapper[4907]: E0226 15:43:47.953287 4907 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 26 15:43:48 crc kubenswrapper[4907]: E0226 15:43:48.053944 4907 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 26 15:43:48 crc kubenswrapper[4907]: E0226 15:43:48.154955 4907 kubelet_node_status.go:503] "Error getting the current node from lister" 
err="node \"crc\" not found" Feb 26 15:43:48 crc kubenswrapper[4907]: E0226 15:43:48.202534 4907 eviction_manager.go:285] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"crc\" not found" Feb 26 15:43:48 crc kubenswrapper[4907]: E0226 15:43:48.255575 4907 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 26 15:43:48 crc kubenswrapper[4907]: E0226 15:43:48.356066 4907 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 26 15:43:48 crc kubenswrapper[4907]: E0226 15:43:48.456522 4907 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 26 15:43:48 crc kubenswrapper[4907]: E0226 15:43:48.557143 4907 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 26 15:43:48 crc kubenswrapper[4907]: E0226 15:43:48.657268 4907 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 26 15:43:48 crc kubenswrapper[4907]: E0226 15:43:48.758270 4907 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 26 15:43:48 crc kubenswrapper[4907]: E0226 15:43:48.859189 4907 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 26 15:43:48 crc kubenswrapper[4907]: E0226 15:43:48.960023 4907 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 26 15:43:49 crc kubenswrapper[4907]: E0226 15:43:49.060969 4907 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 26 15:43:49 crc kubenswrapper[4907]: E0226 15:43:49.162044 4907 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 26 15:43:49 crc kubenswrapper[4907]: E0226 15:43:49.262829 4907 
kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 26 15:43:49 crc kubenswrapper[4907]: E0226 15:43:49.363559 4907 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 26 15:43:49 crc kubenswrapper[4907]: E0226 15:43:49.464446 4907 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 26 15:43:49 crc kubenswrapper[4907]: E0226 15:43:49.565353 4907 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 26 15:43:49 crc kubenswrapper[4907]: E0226 15:43:49.665907 4907 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 26 15:43:49 crc kubenswrapper[4907]: E0226 15:43:49.766688 4907 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 26 15:43:49 crc kubenswrapper[4907]: E0226 15:43:49.867517 4907 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 26 15:43:49 crc kubenswrapper[4907]: E0226 15:43:49.967759 4907 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 26 15:43:50 crc kubenswrapper[4907]: E0226 15:43:50.068734 4907 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 26 15:43:50 crc kubenswrapper[4907]: E0226 15:43:50.169459 4907 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 26 15:43:50 crc kubenswrapper[4907]: E0226 15:43:50.270259 4907 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 26 15:43:50 crc kubenswrapper[4907]: E0226 15:43:50.370476 4907 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 26 15:43:50 crc 
kubenswrapper[4907]: E0226 15:43:50.471301 4907 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 26 15:43:50 crc kubenswrapper[4907]: E0226 15:43:50.572197 4907 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 26 15:43:50 crc kubenswrapper[4907]: E0226 15:43:50.673390 4907 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 26 15:43:50 crc kubenswrapper[4907]: E0226 15:43:50.774408 4907 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 26 15:43:50 crc kubenswrapper[4907]: E0226 15:43:50.875223 4907 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 26 15:43:50 crc kubenswrapper[4907]: E0226 15:43:50.975290 4907 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 26 15:43:51 crc kubenswrapper[4907]: E0226 15:43:51.076195 4907 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 26 15:43:51 crc kubenswrapper[4907]: E0226 15:43:51.176994 4907 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 26 15:43:51 crc kubenswrapper[4907]: E0226 15:43:51.277929 4907 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 26 15:43:51 crc kubenswrapper[4907]: E0226 15:43:51.378543 4907 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 26 15:43:51 crc kubenswrapper[4907]: E0226 15:43:51.479402 4907 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 26 15:43:51 crc kubenswrapper[4907]: E0226 15:43:51.580493 4907 kubelet_node_status.go:503] "Error getting the current node from lister" 
err="node \"crc\" not found" Feb 26 15:43:51 crc kubenswrapper[4907]: E0226 15:43:51.680680 4907 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 26 15:43:51 crc kubenswrapper[4907]: E0226 15:43:51.780787 4907 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 26 15:43:51 crc kubenswrapper[4907]: E0226 15:43:51.881402 4907 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 26 15:43:51 crc kubenswrapper[4907]: E0226 15:43:51.982349 4907 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 26 15:43:52 crc kubenswrapper[4907]: E0226 15:43:52.083702 4907 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 26 15:43:52 crc kubenswrapper[4907]: E0226 15:43:52.184089 4907 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 26 15:43:52 crc kubenswrapper[4907]: E0226 15:43:52.284378 4907 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 26 15:43:52 crc kubenswrapper[4907]: E0226 15:43:52.385503 4907 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 26 15:43:52 crc kubenswrapper[4907]: E0226 15:43:52.486625 4907 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 26 15:43:52 crc kubenswrapper[4907]: E0226 15:43:52.587749 4907 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 26 15:43:52 crc kubenswrapper[4907]: E0226 15:43:52.688882 4907 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 26 15:43:52 crc kubenswrapper[4907]: E0226 15:43:52.789873 4907 kubelet_node_status.go:503] 
"Error getting the current node from lister" err="node \"crc\" not found" Feb 26 15:43:52 crc kubenswrapper[4907]: E0226 15:43:52.890852 4907 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 26 15:43:52 crc kubenswrapper[4907]: E0226 15:43:52.991395 4907 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 26 15:43:53 crc kubenswrapper[4907]: E0226 15:43:53.091994 4907 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 26 15:43:53 crc kubenswrapper[4907]: E0226 15:43:53.193001 4907 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 26 15:43:53 crc kubenswrapper[4907]: E0226 15:43:53.294133 4907 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 26 15:43:53 crc kubenswrapper[4907]: E0226 15:43:53.394498 4907 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 26 15:43:53 crc kubenswrapper[4907]: E0226 15:43:53.495898 4907 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 26 15:43:53 crc kubenswrapper[4907]: E0226 15:43:53.596702 4907 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 26 15:43:53 crc kubenswrapper[4907]: E0226 15:43:53.697666 4907 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 26 15:43:53 crc kubenswrapper[4907]: E0226 15:43:53.798826 4907 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 26 15:43:53 crc kubenswrapper[4907]: E0226 15:43:53.899962 4907 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 26 15:43:54 crc kubenswrapper[4907]: E0226 
15:43:54.001005 4907 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 26 15:43:54 crc kubenswrapper[4907]: E0226 15:43:54.101387 4907 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 26 15:43:54 crc kubenswrapper[4907]: E0226 15:43:54.201730 4907 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 26 15:43:54 crc kubenswrapper[4907]: E0226 15:43:54.302665 4907 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 26 15:43:54 crc kubenswrapper[4907]: E0226 15:43:54.403144 4907 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 26 15:43:54 crc kubenswrapper[4907]: E0226 15:43:54.503257 4907 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 26 15:43:54 crc kubenswrapper[4907]: E0226 15:43:54.603911 4907 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 26 15:43:54 crc kubenswrapper[4907]: E0226 15:43:54.704765 4907 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 26 15:43:54 crc kubenswrapper[4907]: E0226 15:43:54.805364 4907 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 26 15:43:54 crc kubenswrapper[4907]: E0226 15:43:54.906374 4907 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 26 15:43:55 crc kubenswrapper[4907]: E0226 15:43:55.007050 4907 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 26 15:43:55 crc kubenswrapper[4907]: E0226 15:43:55.107823 4907 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 26 
15:43:55 crc kubenswrapper[4907]: E0226 15:43:55.208311 4907 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 26 15:43:55 crc kubenswrapper[4907]: E0226 15:43:55.308448 4907 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 26 15:43:55 crc kubenswrapper[4907]: E0226 15:43:55.326696 4907 kubelet_node_status.go:585] "Error updating node status, will retry" err="error getting node \"crc\": node \"crc\" not found" Feb 26 15:43:55 crc kubenswrapper[4907]: I0226 15:43:55.334753 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 26 15:43:55 crc kubenswrapper[4907]: I0226 15:43:55.334814 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 26 15:43:55 crc kubenswrapper[4907]: I0226 15:43:55.334849 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 26 15:43:55 crc kubenswrapper[4907]: I0226 15:43:55.334865 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 26 15:43:55 crc kubenswrapper[4907]: I0226 15:43:55.334876 4907 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-26T15:43:55Z","lastTransitionTime":"2026-02-26T15:43:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"}
Feb 26 15:43:55 crc kubenswrapper[4907]: E0226 15:43:55.351415 4907 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"7800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"24148068Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"8\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"24608868Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-02-26T15:43:55Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-26T15:43:55Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-26T15:43:55Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-26T15:43:55Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-26T15:43:55Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-26T15:43:55Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-26T15:43:55Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-26T15:43:55Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"16aec221-b9ec-4b79-ac12-986d05cb9b8b\\\",\\\"systemUUID\\\":\\\"7af7b453-01c3-4b8b-8c30-b1df8ce070ce\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 26 15:43:55 crc kubenswrapper[4907]: I0226 15:43:55.354708 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 26 15:43:55 crc kubenswrapper[4907]: I0226 15:43:55.354746 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 26 15:43:55 crc kubenswrapper[4907]: I0226 15:43:55.354758 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 26 15:43:55 crc kubenswrapper[4907]: I0226 15:43:55.354775 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 26 15:43:55 crc kubenswrapper[4907]: I0226 15:43:55.354788 4907 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-26T15:43:55Z","lastTransitionTime":"2026-02-26T15:43:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 26 15:43:55 crc kubenswrapper[4907]: E0226 15:43:55.367903 4907 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"7800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"24148068Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"8\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"24608868Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-02-26T15:43:55Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-26T15:43:55Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-26T15:43:55Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-26T15:43:55Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-26T15:43:55Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-26T15:43:55Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-26T15:43:55Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-26T15:43:55Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"16aec221-b9ec-4b79-ac12-986d05cb9b8b\\\",\\\"systemUUID\\\":\\\"7af7b453-01c3-4b8b-8c30-b1df8ce070ce\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 26 15:43:55 crc kubenswrapper[4907]: I0226 15:43:55.370969 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 26 15:43:55 crc kubenswrapper[4907]: I0226 15:43:55.371029 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 26 15:43:55 crc kubenswrapper[4907]: I0226 15:43:55.371049 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 26 15:43:55 crc kubenswrapper[4907]: I0226 15:43:55.371076 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 26 15:43:55 crc kubenswrapper[4907]: I0226 15:43:55.371096 4907 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-26T15:43:55Z","lastTransitionTime":"2026-02-26T15:43:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 26 15:43:55 crc kubenswrapper[4907]: E0226 15:43:55.385657 4907 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"7800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"24148068Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"8\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"24608868Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-02-26T15:43:55Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-26T15:43:55Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-26T15:43:55Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-26T15:43:55Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-26T15:43:55Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-26T15:43:55Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-26T15:43:55Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-26T15:43:55Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"16aec221-b9ec-4b79-ac12-986d05cb9b8b\\\",\\\"systemUUID\\\":\\\"7af7b453-01c3-4b8b-8c30-b1df8ce070ce\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 26 15:43:55 crc kubenswrapper[4907]: I0226 15:43:55.389667 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 26 15:43:55 crc kubenswrapper[4907]: I0226 15:43:55.389701 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 26 15:43:55 crc kubenswrapper[4907]: I0226 15:43:55.389709 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 26 15:43:55 crc kubenswrapper[4907]: I0226 15:43:55.389723 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 26 15:43:55 crc kubenswrapper[4907]: I0226 15:43:55.389731 4907 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-26T15:43:55Z","lastTransitionTime":"2026-02-26T15:43:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 26 15:43:55 crc kubenswrapper[4907]: E0226 15:43:55.402398 4907 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"7800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"24148068Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"8\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"24608868Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-02-26T15:43:55Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-26T15:43:55Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-26T15:43:55Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-26T15:43:55Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-26T15:43:55Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-26T15:43:55Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-26T15:43:55Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-26T15:43:55Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"16aec221-b9ec-4b79-ac12-986d05cb9b8b\\\",\\\"systemUUID\\\":\\\"7af7b453-01c3-4b8b-8c30-b1df8ce070ce\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 26 15:43:55 crc kubenswrapper[4907]: E0226 15:43:55.402542 4907 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Feb 26 15:43:55 crc kubenswrapper[4907]: E0226 15:43:55.409512 4907 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 26 15:43:55 crc kubenswrapper[4907]: E0226 15:43:55.510499 4907 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 26 15:43:55 crc kubenswrapper[4907]: E0226 15:43:55.610676 4907 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 26 15:43:55 crc kubenswrapper[4907]: E0226 15:43:55.711044 4907 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 26 15:43:55 crc kubenswrapper[4907]: E0226 15:43:55.811883 4907 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 26 15:43:55 crc kubenswrapper[4907]: E0226 15:43:55.912953 4907 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 26 15:43:56 crc kubenswrapper[4907]: E0226 15:43:56.013488 4907 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 26 15:43:56 crc kubenswrapper[4907]: E0226 15:43:56.114170 4907 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 26 15:43:56 crc kubenswrapper[4907]: I0226 15:43:56.126134 4907 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Feb 26 15:43:56 crc kubenswrapper[4907]: I0226 15:43:56.127635 4907 
kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 26 15:43:56 crc kubenswrapper[4907]: I0226 15:43:56.127705 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 26 15:43:56 crc kubenswrapper[4907]: I0226 15:43:56.127732 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 26 15:43:56 crc kubenswrapper[4907]: I0226 15:43:56.128808 4907 scope.go:117] "RemoveContainer" containerID="bede3ef68147485c0c4f3e38fc427d20f05c7814899efa423206af42b4509bc3" Feb 26 15:43:56 crc kubenswrapper[4907]: E0226 15:43:56.129189 4907 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-apiserver-check-endpoints\" with CrashLoopBackOff: \"back-off 40s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\"" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" Feb 26 15:43:56 crc kubenswrapper[4907]: E0226 15:43:56.215328 4907 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 26 15:43:56 crc kubenswrapper[4907]: E0226 15:43:56.315568 4907 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 26 15:43:56 crc kubenswrapper[4907]: E0226 15:43:56.416245 4907 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 26 15:43:56 crc kubenswrapper[4907]: E0226 15:43:56.517229 4907 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 26 15:43:56 crc kubenswrapper[4907]: E0226 15:43:56.618338 4907 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 26 15:43:56 crc kubenswrapper[4907]: E0226 
15:43:56.719412 4907 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 26 15:43:56 crc kubenswrapper[4907]: E0226 15:43:56.820289 4907 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 26 15:43:56 crc kubenswrapper[4907]: E0226 15:43:56.921478 4907 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 26 15:43:57 crc kubenswrapper[4907]: E0226 15:43:57.021971 4907 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 26 15:43:57 crc kubenswrapper[4907]: E0226 15:43:57.122913 4907 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 26 15:43:57 crc kubenswrapper[4907]: E0226 15:43:57.223011 4907 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 26 15:43:57 crc kubenswrapper[4907]: E0226 15:43:57.323122 4907 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 26 15:43:57 crc kubenswrapper[4907]: E0226 15:43:57.423819 4907 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 26 15:43:57 crc kubenswrapper[4907]: E0226 15:43:57.523934 4907 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 26 15:43:57 crc kubenswrapper[4907]: E0226 15:43:57.624530 4907 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 26 15:43:57 crc kubenswrapper[4907]: E0226 15:43:57.724970 4907 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 26 15:43:57 crc kubenswrapper[4907]: E0226 15:43:57.825343 4907 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 26 
15:43:57 crc kubenswrapper[4907]: E0226 15:43:57.925680 4907 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 26 15:43:58 crc kubenswrapper[4907]: E0226 15:43:58.026449 4907 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 26 15:43:58 crc kubenswrapper[4907]: E0226 15:43:58.126836 4907 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 26 15:43:58 crc kubenswrapper[4907]: E0226 15:43:58.202699 4907 eviction_manager.go:285] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"crc\" not found" Feb 26 15:43:58 crc kubenswrapper[4907]: E0226 15:43:58.227096 4907 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 26 15:43:58 crc kubenswrapper[4907]: E0226 15:43:58.327911 4907 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 26 15:43:58 crc kubenswrapper[4907]: E0226 15:43:58.429076 4907 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 26 15:43:58 crc kubenswrapper[4907]: E0226 15:43:58.529989 4907 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 26 15:43:58 crc kubenswrapper[4907]: E0226 15:43:58.630456 4907 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 26 15:43:58 crc kubenswrapper[4907]: E0226 15:43:58.731698 4907 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 26 15:43:58 crc kubenswrapper[4907]: E0226 15:43:58.832692 4907 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 26 15:43:58 crc kubenswrapper[4907]: E0226 15:43:58.933782 4907 kubelet_node_status.go:503] "Error getting 
the current node from lister" err="node \"crc\" not found" Feb 26 15:43:59 crc kubenswrapper[4907]: E0226 15:43:59.034159 4907 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 26 15:43:59 crc kubenswrapper[4907]: E0226 15:43:59.134829 4907 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 26 15:43:59 crc kubenswrapper[4907]: E0226 15:43:59.235898 4907 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 26 15:43:59 crc kubenswrapper[4907]: E0226 15:43:59.336752 4907 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 26 15:43:59 crc kubenswrapper[4907]: E0226 15:43:59.437845 4907 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 26 15:43:59 crc kubenswrapper[4907]: E0226 15:43:59.538893 4907 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 26 15:43:59 crc kubenswrapper[4907]: E0226 15:43:59.639908 4907 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 26 15:43:59 crc kubenswrapper[4907]: E0226 15:43:59.740955 4907 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 26 15:43:59 crc kubenswrapper[4907]: E0226 15:43:59.841858 4907 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 26 15:43:59 crc kubenswrapper[4907]: E0226 15:43:59.942996 4907 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 26 15:44:00 crc kubenswrapper[4907]: E0226 15:44:00.043671 4907 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 26 15:44:00 crc kubenswrapper[4907]: E0226 15:44:00.144687 4907 
kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 26 15:44:00 crc kubenswrapper[4907]: E0226 15:44:00.245873 4907 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 26 15:44:00 crc kubenswrapper[4907]: E0226 15:44:00.346643 4907 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 26 15:44:00 crc kubenswrapper[4907]: E0226 15:44:00.447423 4907 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 26 15:44:00 crc kubenswrapper[4907]: E0226 15:44:00.548420 4907 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 26 15:44:00 crc kubenswrapper[4907]: E0226 15:44:00.649832 4907 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 26 15:44:00 crc kubenswrapper[4907]: E0226 15:44:00.750686 4907 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 26 15:44:00 crc kubenswrapper[4907]: E0226 15:44:00.851051 4907 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 26 15:44:00 crc kubenswrapper[4907]: E0226 15:44:00.951923 4907 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 26 15:44:01 crc kubenswrapper[4907]: E0226 15:44:01.052227 4907 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 26 15:44:01 crc kubenswrapper[4907]: I0226 15:44:01.126707 4907 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Feb 26 15:44:01 crc kubenswrapper[4907]: I0226 15:44:01.128041 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 26 15:44:01 crc 
kubenswrapper[4907]: I0226 15:44:01.128140 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 26 15:44:01 crc kubenswrapper[4907]: I0226 15:44:01.128160 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 26 15:44:01 crc kubenswrapper[4907]: E0226 15:44:01.153037 4907 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 26 15:44:01 crc kubenswrapper[4907]: E0226 15:44:01.253448 4907 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 26 15:44:01 crc kubenswrapper[4907]: E0226 15:44:01.354499 4907 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 26 15:44:01 crc kubenswrapper[4907]: E0226 15:44:01.455419 4907 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 26 15:44:01 crc kubenswrapper[4907]: E0226 15:44:01.556299 4907 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 26 15:44:01 crc kubenswrapper[4907]: E0226 15:44:01.657704 4907 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 26 15:44:01 crc kubenswrapper[4907]: E0226 15:44:01.758445 4907 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 26 15:44:01 crc kubenswrapper[4907]: E0226 15:44:01.858952 4907 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 26 15:44:01 crc kubenswrapper[4907]: E0226 15:44:01.959748 4907 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 26 15:44:02 crc kubenswrapper[4907]: E0226 15:44:02.060897 4907 kubelet_node_status.go:503] "Error getting the current node from lister" 
err="node \"crc\" not found" Feb 26 15:44:02 crc kubenswrapper[4907]: E0226 15:44:02.161679 4907 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 26 15:44:02 crc kubenswrapper[4907]: E0226 15:44:02.262448 4907 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 26 15:44:02 crc kubenswrapper[4907]: E0226 15:44:02.362695 4907 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 26 15:44:02 crc kubenswrapper[4907]: E0226 15:44:02.463631 4907 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 26 15:44:02 crc kubenswrapper[4907]: E0226 15:44:02.564094 4907 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 26 15:44:02 crc kubenswrapper[4907]: E0226 15:44:02.665568 4907 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 26 15:44:02 crc kubenswrapper[4907]: E0226 15:44:02.766494 4907 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 26 15:44:02 crc kubenswrapper[4907]: E0226 15:44:02.867374 4907 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 26 15:44:02 crc kubenswrapper[4907]: E0226 15:44:02.968184 4907 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 26 15:44:03 crc kubenswrapper[4907]: E0226 15:44:03.069176 4907 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 26 15:44:03 crc kubenswrapper[4907]: E0226 15:44:03.169654 4907 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 26 15:44:03 crc kubenswrapper[4907]: E0226 15:44:03.270452 4907 kubelet_node_status.go:503] 
"Error getting the current node from lister" err="node \"crc\" not found"
Feb 26 15:44:03 crc kubenswrapper[4907]: E0226 15:44:03.371009 4907 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Feb 26 15:44:03 crc kubenswrapper[4907]: E0226 15:44:03.472009 4907 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Feb 26 15:44:03 crc kubenswrapper[4907]: E0226 15:44:03.572665 4907 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Feb 26 15:44:03 crc kubenswrapper[4907]: E0226 15:44:03.673422 4907 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Feb 26 15:44:03 crc kubenswrapper[4907]: E0226 15:44:03.774161 4907 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Feb 26 15:44:03 crc kubenswrapper[4907]: E0226 15:44:03.874704 4907 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Feb 26 15:44:03 crc kubenswrapper[4907]: E0226 15:44:03.975861 4907 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Feb 26 15:44:04 crc kubenswrapper[4907]: E0226 15:44:04.076622 4907 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Feb 26 15:44:04 crc kubenswrapper[4907]: E0226 15:44:04.177786 4907 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Feb 26 15:44:04 crc kubenswrapper[4907]: E0226 15:44:04.277974 4907 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Feb 26 15:44:04 crc kubenswrapper[4907]: E0226 15:44:04.378788 4907 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Feb 26 15:44:04 crc kubenswrapper[4907]: E0226 15:44:04.479395 4907 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Feb 26 15:44:04 crc kubenswrapper[4907]: E0226 15:44:04.579508 4907 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Feb 26 15:44:04 crc kubenswrapper[4907]: E0226 15:44:04.680169 4907 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Feb 26 15:44:04 crc kubenswrapper[4907]: E0226 15:44:04.781544 4907 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Feb 26 15:44:04 crc kubenswrapper[4907]: E0226 15:44:04.882122 4907 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Feb 26 15:44:04 crc kubenswrapper[4907]: E0226 15:44:04.983219 4907 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Feb 26 15:44:05 crc kubenswrapper[4907]: E0226 15:44:05.083944 4907 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Feb 26 15:44:05 crc kubenswrapper[4907]: E0226 15:44:05.185644 4907 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Feb 26 15:44:05 crc kubenswrapper[4907]: E0226 15:44:05.286795 4907 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Feb 26 15:44:05 crc kubenswrapper[4907]: E0226 15:44:05.387237 4907 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Feb 26 15:44:05 crc kubenswrapper[4907]: E0226 15:44:05.488133 4907 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Feb 26 15:44:05 crc kubenswrapper[4907]: E0226 15:44:05.534943 4907 kubelet_node_status.go:585] "Error updating node status, will retry" err="error getting node \"crc\": node \"crc\" not found"
Feb 26 15:44:05 crc kubenswrapper[4907]: I0226 15:44:05.541296 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 26 15:44:05 crc kubenswrapper[4907]: I0226 15:44:05.541360 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 26 15:44:05 crc kubenswrapper[4907]: I0226 15:44:05.541382 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 26 15:44:05 crc kubenswrapper[4907]: I0226 15:44:05.541410 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Feb 26 15:44:05 crc kubenswrapper[4907]: I0226 15:44:05.541432 4907 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-26T15:44:05Z","lastTransitionTime":"2026-02-26T15:44:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 26 15:44:05 crc kubenswrapper[4907]: E0226 15:44:05.561380 4907 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"7800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"24148068Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"8\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"24608868Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-02-26T15:44:05Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-26T15:44:05Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-26T15:44:05Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-26T15:44:05Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-26T15:44:05Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-26T15:44:05Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-26T15:44:05Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-26T15:44:05Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"16aec221-b9ec-4b79-ac12-986d05cb9b8b\\\",\\\"systemUUID\\\":\\\"7af7b453-01c3-4b8b-8c30-b1df8ce070ce\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused"
Feb 26 15:44:05 crc kubenswrapper[4907]: I0226 15:44:05.565517 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 26 15:44:05 crc kubenswrapper[4907]: I0226 15:44:05.565576 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 26 15:44:05 crc kubenswrapper[4907]: I0226 15:44:05.565635 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 26 15:44:05 crc kubenswrapper[4907]: I0226 15:44:05.565672 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Feb 26 15:44:05 crc kubenswrapper[4907]: I0226 15:44:05.565695 4907 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-26T15:44:05Z","lastTransitionTime":"2026-02-26T15:44:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 26 15:44:05 crc kubenswrapper[4907]: E0226 15:44:05.577489 4907 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"7800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"24148068Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"8\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"24608868Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-02-26T15:44:05Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-26T15:44:05Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-26T15:44:05Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-26T15:44:05Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-26T15:44:05Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-26T15:44:05Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-26T15:44:05Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-26T15:44:05Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"16aec221-b9ec-4b79-ac12-986d05cb9b8b\\\",\\\"systemUUID\\\":\\\"7af7b453-01c3-4b8b-8c30-b1df8ce070ce\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused"
Feb 26 15:44:05 crc kubenswrapper[4907]: I0226 15:44:05.582294 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 26 15:44:05 crc kubenswrapper[4907]: I0226 15:44:05.582345 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 26 15:44:05 crc kubenswrapper[4907]: I0226 15:44:05.582363 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 26 15:44:05 crc kubenswrapper[4907]: I0226 15:44:05.582420 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Feb 26 15:44:05 crc kubenswrapper[4907]: I0226 15:44:05.582443 4907 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-26T15:44:05Z","lastTransitionTime":"2026-02-26T15:44:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 26 15:44:05 crc kubenswrapper[4907]: E0226 15:44:05.598251 4907 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"7800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"24148068Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"8\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"24608868Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-02-26T15:44:05Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-26T15:44:05Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-26T15:44:05Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-26T15:44:05Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-26T15:44:05Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-26T15:44:05Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-26T15:44:05Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-26T15:44:05Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"16aec221-b9ec-4b79-ac12-986d05cb9b8b\\\",\\\"systemUUID\\\":\\\"7af7b453-01c3-4b8b-8c30-b1df8ce070ce\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 26 15:44:05 crc kubenswrapper[4907]: I0226 15:44:05.602616 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 26 15:44:05 crc kubenswrapper[4907]: I0226 15:44:05.602649 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 26 15:44:05 crc kubenswrapper[4907]: I0226 15:44:05.602660 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 26 15:44:05 crc kubenswrapper[4907]: I0226 15:44:05.602678 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 26 15:44:05 crc kubenswrapper[4907]: I0226 15:44:05.602703 4907 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-26T15:44:05Z","lastTransitionTime":"2026-02-26T15:44:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 26 15:44:05 crc kubenswrapper[4907]: E0226 15:44:05.617155 4907 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"7800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"24148068Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"8\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"24608868Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-02-26T15:44:05Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-26T15:44:05Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-26T15:44:05Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-26T15:44:05Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-26T15:44:05Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-26T15:44:05Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-26T15:44:05Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-26T15:44:05Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"16aec221-b9ec-4b79-ac12-986d05cb9b8b\\\",\\\"systemUUID\\\":\\\"7af7b453-01c3-4b8b-8c30-b1df8ce070ce\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 26 15:44:05 crc kubenswrapper[4907]: E0226 15:44:05.617345 4907 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Feb 26 15:44:05 crc kubenswrapper[4907]: E0226 15:44:05.617374 4907 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 26 15:44:05 crc kubenswrapper[4907]: E0226 15:44:05.718422 4907 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 26 15:44:05 crc kubenswrapper[4907]: E0226 15:44:05.819065 4907 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 26 15:44:05 crc kubenswrapper[4907]: E0226 15:44:05.919859 4907 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 26 15:44:06 crc kubenswrapper[4907]: E0226 15:44:06.020128 4907 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 26 15:44:06 crc kubenswrapper[4907]: E0226 15:44:06.120223 4907 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 26 15:44:06 crc kubenswrapper[4907]: E0226 15:44:06.220668 4907 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 26 15:44:06 crc kubenswrapper[4907]: E0226 15:44:06.321602 4907 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 26 15:44:06 crc kubenswrapper[4907]: E0226 15:44:06.422476 4907 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 26 15:44:06 crc kubenswrapper[4907]: E0226 15:44:06.522982 4907 
kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 26 15:44:06 crc kubenswrapper[4907]: E0226 15:44:06.623400 4907 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 26 15:44:06 crc kubenswrapper[4907]: E0226 15:44:06.724462 4907 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 26 15:44:06 crc kubenswrapper[4907]: E0226 15:44:06.825576 4907 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 26 15:44:06 crc kubenswrapper[4907]: E0226 15:44:06.926566 4907 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 26 15:44:07 crc kubenswrapper[4907]: E0226 15:44:07.027361 4907 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 26 15:44:07 crc kubenswrapper[4907]: E0226 15:44:07.127662 4907 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 26 15:44:07 crc kubenswrapper[4907]: E0226 15:44:07.227792 4907 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 26 15:44:07 crc kubenswrapper[4907]: E0226 15:44:07.328182 4907 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 26 15:44:07 crc kubenswrapper[4907]: E0226 15:44:07.429005 4907 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 26 15:44:07 crc kubenswrapper[4907]: E0226 15:44:07.529102 4907 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 26 15:44:07 crc kubenswrapper[4907]: E0226 15:44:07.630137 4907 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 26 15:44:07 crc 
kubenswrapper[4907]: E0226 15:44:07.730977 4907 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 26 15:44:07 crc kubenswrapper[4907]: E0226 15:44:07.831689 4907 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 26 15:44:07 crc kubenswrapper[4907]: E0226 15:44:07.932193 4907 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 26 15:44:08 crc kubenswrapper[4907]: E0226 15:44:08.032666 4907 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 26 15:44:08 crc kubenswrapper[4907]: E0226 15:44:08.133319 4907 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 26 15:44:08 crc kubenswrapper[4907]: E0226 15:44:08.203516 4907 eviction_manager.go:285] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"crc\" not found" Feb 26 15:44:08 crc kubenswrapper[4907]: E0226 15:44:08.234475 4907 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 26 15:44:08 crc kubenswrapper[4907]: E0226 15:44:08.335685 4907 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 26 15:44:08 crc kubenswrapper[4907]: E0226 15:44:08.436657 4907 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 26 15:44:08 crc kubenswrapper[4907]: E0226 15:44:08.537827 4907 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 26 15:44:08 crc kubenswrapper[4907]: E0226 15:44:08.639005 4907 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 26 15:44:08 crc kubenswrapper[4907]: E0226 15:44:08.739895 4907 kubelet_node_status.go:503] "Error getting the current 
node from lister" err="node \"crc\" not found" Feb 26 15:44:08 crc kubenswrapper[4907]: E0226 15:44:08.841095 4907 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 26 15:44:08 crc kubenswrapper[4907]: E0226 15:44:08.941791 4907 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 26 15:44:09 crc kubenswrapper[4907]: E0226 15:44:09.042720 4907 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 26 15:44:09 crc kubenswrapper[4907]: E0226 15:44:09.142864 4907 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 26 15:44:09 crc kubenswrapper[4907]: E0226 15:44:09.243224 4907 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 26 15:44:09 crc kubenswrapper[4907]: E0226 15:44:09.344323 4907 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 26 15:44:09 crc kubenswrapper[4907]: E0226 15:44:09.445468 4907 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 26 15:44:09 crc kubenswrapper[4907]: E0226 15:44:09.545569 4907 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 26 15:44:09 crc kubenswrapper[4907]: E0226 15:44:09.646798 4907 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 26 15:44:09 crc kubenswrapper[4907]: E0226 15:44:09.747733 4907 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 26 15:44:09 crc kubenswrapper[4907]: E0226 15:44:09.848708 4907 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 26 15:44:09 crc kubenswrapper[4907]: E0226 15:44:09.949500 4907 
kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 26 15:44:10 crc kubenswrapper[4907]: E0226 15:44:10.049888 4907 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 26 15:44:10 crc kubenswrapper[4907]: E0226 15:44:10.150563 4907 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 26 15:44:10 crc kubenswrapper[4907]: E0226 15:44:10.251788 4907 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 26 15:44:10 crc kubenswrapper[4907]: E0226 15:44:10.351960 4907 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 26 15:44:10 crc kubenswrapper[4907]: E0226 15:44:10.452384 4907 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 26 15:44:10 crc kubenswrapper[4907]: E0226 15:44:10.552740 4907 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 26 15:44:10 crc kubenswrapper[4907]: E0226 15:44:10.652847 4907 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 26 15:44:10 crc kubenswrapper[4907]: E0226 15:44:10.753476 4907 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 26 15:44:10 crc kubenswrapper[4907]: E0226 15:44:10.854317 4907 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 26 15:44:10 crc kubenswrapper[4907]: E0226 15:44:10.955346 4907 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 26 15:44:11 crc kubenswrapper[4907]: E0226 15:44:11.056501 4907 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 26 15:44:11 crc 
kubenswrapper[4907]: I0226 15:44:11.125950 4907 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Feb 26 15:44:11 crc kubenswrapper[4907]: I0226 15:44:11.127456 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 26 15:44:11 crc kubenswrapper[4907]: I0226 15:44:11.127509 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 26 15:44:11 crc kubenswrapper[4907]: I0226 15:44:11.127526 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 26 15:44:11 crc kubenswrapper[4907]: I0226 15:44:11.128556 4907 scope.go:117] "RemoveContainer" containerID="bede3ef68147485c0c4f3e38fc427d20f05c7814899efa423206af42b4509bc3" Feb 26 15:44:11 crc kubenswrapper[4907]: E0226 15:44:11.157211 4907 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 26 15:44:11 crc kubenswrapper[4907]: E0226 15:44:11.257382 4907 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 26 15:44:11 crc kubenswrapper[4907]: E0226 15:44:11.357627 4907 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 26 15:44:11 crc kubenswrapper[4907]: E0226 15:44:11.458282 4907 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 26 15:44:11 crc kubenswrapper[4907]: I0226 15:44:11.552968 4907 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/3.log" Feb 26 15:44:11 crc kubenswrapper[4907]: I0226 15:44:11.554677 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" 
event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"a3c61b08bda7c918a3fa7b01e6f80515ee05a5746e189e829d2872c181b80c85"}
Feb 26 15:44:11 crc kubenswrapper[4907]: I0226 15:44:11.554814 4907 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Feb 26 15:44:11 crc kubenswrapper[4907]: I0226 15:44:11.555674 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 26 15:44:11 crc kubenswrapper[4907]: I0226 15:44:11.555777 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 26 15:44:11 crc kubenswrapper[4907]: I0226 15:44:11.555842 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 26 15:44:11 crc kubenswrapper[4907]: E0226 15:44:11.558861 4907 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Feb 26 15:44:11 crc kubenswrapper[4907]: E0226 15:44:11.659203 4907 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Feb 26 15:44:11 crc kubenswrapper[4907]: E0226 15:44:11.760128 4907 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Feb 26 15:44:11 crc kubenswrapper[4907]: E0226 15:44:11.861238 4907 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Feb 26 15:44:11 crc kubenswrapper[4907]: E0226 15:44:11.962819 4907 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Feb 26 15:44:12 crc kubenswrapper[4907]: E0226 15:44:12.063908 4907 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Feb 26 15:44:12 crc kubenswrapper[4907]: E0226 15:44:12.165459 4907 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Feb 26 15:44:12 crc kubenswrapper[4907]: E0226 15:44:12.266023 4907 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Feb 26 15:44:12 crc kubenswrapper[4907]: E0226 15:44:12.366529 4907 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Feb 26 15:44:12 crc kubenswrapper[4907]: E0226 15:44:12.467722 4907 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Feb 26 15:44:12 crc kubenswrapper[4907]: I0226 15:44:12.558705 4907 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/4.log"
Feb 26 15:44:12 crc kubenswrapper[4907]: I0226 15:44:12.559417 4907 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/3.log"
Feb 26 15:44:12 crc kubenswrapper[4907]: I0226 15:44:12.562198 4907 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="a3c61b08bda7c918a3fa7b01e6f80515ee05a5746e189e829d2872c181b80c85" exitCode=255
Feb 26 15:44:12 crc kubenswrapper[4907]: I0226 15:44:12.562252 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerDied","Data":"a3c61b08bda7c918a3fa7b01e6f80515ee05a5746e189e829d2872c181b80c85"}
Feb 26 15:44:12 crc kubenswrapper[4907]: I0226 15:44:12.562300 4907 scope.go:117] "RemoveContainer" containerID="bede3ef68147485c0c4f3e38fc427d20f05c7814899efa423206af42b4509bc3"
Feb 26 15:44:12 crc kubenswrapper[4907]: I0226 15:44:12.562509 4907 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Feb 26 15:44:12 crc kubenswrapper[4907]: I0226 15:44:12.563706 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 26 15:44:12 crc kubenswrapper[4907]: I0226 15:44:12.563774 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 26 15:44:12 crc kubenswrapper[4907]: I0226 15:44:12.563787 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 26 15:44:12 crc kubenswrapper[4907]: I0226 15:44:12.564679 4907 scope.go:117] "RemoveContainer" containerID="a3c61b08bda7c918a3fa7b01e6f80515ee05a5746e189e829d2872c181b80c85"
Feb 26 15:44:12 crc kubenswrapper[4907]: E0226 15:44:12.564912 4907 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-apiserver-check-endpoints\" with CrashLoopBackOff: \"back-off 1m20s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\"" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792"
Feb 26 15:44:12 crc kubenswrapper[4907]: E0226 15:44:12.568488 4907 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Feb 26 15:44:12 crc kubenswrapper[4907]: E0226 15:44:12.668659 4907 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Feb 26 15:44:12 crc kubenswrapper[4907]: E0226 15:44:12.769475 4907 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Feb 26 15:44:12 crc kubenswrapper[4907]: E0226 15:44:12.870496 4907 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Feb 26 15:44:12 crc kubenswrapper[4907]: E0226 15:44:12.970977 4907 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Feb 26 15:44:13 crc kubenswrapper[4907]: E0226 15:44:13.072092 4907 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Feb 26 15:44:13 crc kubenswrapper[4907]: E0226 15:44:13.172860 4907 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Feb 26 15:44:13 crc kubenswrapper[4907]: E0226 15:44:13.273149 4907 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Feb 26 15:44:13 crc kubenswrapper[4907]: E0226 15:44:13.373457 4907 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Feb 26 15:44:13 crc kubenswrapper[4907]: E0226 15:44:13.474456 4907 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Feb 26 15:44:13 crc kubenswrapper[4907]: I0226 15:44:13.565898 4907 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/4.log"
Feb 26 15:44:13 crc kubenswrapper[4907]: E0226 15:44:13.575133 4907 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Feb 26 15:44:13 crc kubenswrapper[4907]: E0226 15:44:13.676015 4907 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Feb 26 15:44:13 crc kubenswrapper[4907]: E0226 15:44:13.776647 4907 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Feb 26 15:44:13 crc kubenswrapper[4907]: E0226 15:44:13.877013 4907 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Feb 26 15:44:13 crc kubenswrapper[4907]: E0226 15:44:13.977767 4907 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Feb 26 15:44:14 crc kubenswrapper[4907]: E0226 15:44:14.078802 4907 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Feb 26 15:44:14 crc kubenswrapper[4907]: E0226 15:44:14.179629 4907 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Feb 26 15:44:14 crc kubenswrapper[4907]: E0226 15:44:14.280619 4907 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Feb 26 15:44:14 crc kubenswrapper[4907]: E0226 15:44:14.381435 4907 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Feb 26 15:44:14 crc kubenswrapper[4907]: E0226 15:44:14.481558 4907 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Feb 26 15:44:14 crc kubenswrapper[4907]: E0226 15:44:14.582008 4907 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Feb 26 15:44:14 crc kubenswrapper[4907]: E0226 15:44:14.682630 4907 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Feb 26 15:44:14 crc kubenswrapper[4907]: E0226 15:44:14.783686 4907 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Feb 26 15:44:14 crc kubenswrapper[4907]: E0226 15:44:14.884500 4907 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Feb 26 15:44:14 crc kubenswrapper[4907]: E0226 15:44:14.985542 4907 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Feb 26 15:44:15 crc kubenswrapper[4907]: E0226 15:44:15.086101 4907 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Feb 26 15:44:15 crc kubenswrapper[4907]: E0226 15:44:15.186407 4907 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Feb 26 15:44:15 crc kubenswrapper[4907]: E0226 15:44:15.286824 4907 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Feb 26 15:44:15 crc kubenswrapper[4907]: E0226 15:44:15.387402 4907 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Feb 26 15:44:15 crc kubenswrapper[4907]: E0226 15:44:15.488216 4907 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Feb 26 15:44:15 crc kubenswrapper[4907]: E0226 15:44:15.589264 4907 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Feb 26 15:44:15 crc kubenswrapper[4907]: E0226 15:44:15.689530 4907 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Feb 26 15:44:15 crc kubenswrapper[4907]: E0226 15:44:15.790447 4907 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Feb 26 15:44:15 crc kubenswrapper[4907]: E0226 15:44:15.867874 4907 kubelet_node_status.go:585] "Error updating node status, will retry" err="error getting node \"crc\": node \"crc\" not found"
Feb 26 15:44:15 crc kubenswrapper[4907]: I0226 15:44:15.873686 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 26 15:44:15 crc kubenswrapper[4907]: I0226 15:44:15.873743 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 26 15:44:15 crc kubenswrapper[4907]: I0226 15:44:15.873762 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 26 15:44:15 crc kubenswrapper[4907]: I0226 15:44:15.873786 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Feb 26 15:44:15 crc kubenswrapper[4907]: I0226 15:44:15.873804 4907 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-26T15:44:15Z","lastTransitionTime":"2026-02-26T15:44:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 26 15:44:15 crc kubenswrapper[4907]: E0226 15:44:15.891556 4907 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"7800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"24148068Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"8\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"24608868Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-02-26T15:44:15Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-26T15:44:15Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-26T15:44:15Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-26T15:44:15Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-26T15:44:15Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-26T15:44:15Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID 
available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-26T15:44:15Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-26T15:44:15Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"
registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\
"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb617
3ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"reg
istry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@s
ha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"16aec221-b9ec-4b79-ac12-986d05cb9b8b\\\",\\\"systemUUID\\\":\\\"7af7b453-01c3-4b8b-8c30-b1df8ce070ce\\\"},\\\"runtimeHandlers\\\":[{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":false},\\\"name\\\":\\\"runc\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"crun\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"\\\"}]}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused"
Feb 26 15:44:15 crc kubenswrapper[4907]: I0226 15:44:15.898266 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 26 15:44:15 crc kubenswrapper[4907]: I0226 15:44:15.898307 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 26 15:44:15 crc kubenswrapper[4907]: I0226 15:44:15.898320 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 26 15:44:15 crc kubenswrapper[4907]: I0226 15:44:15.898336 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Feb 26 15:44:15 crc kubenswrapper[4907]: I0226 15:44:15.898347 4907 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-26T15:44:15Z","lastTransitionTime":"2026-02-26T15:44:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 26 15:44:15 crc kubenswrapper[4907]: E0226 15:44:15.909949 4907 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"7800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"24148068Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"8\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"24608868Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-02-26T15:44:15Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-26T15:44:15Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-26T15:44:15Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-26T15:44:15Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-26T15:44:15Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-26T15:44:15Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID 
available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-26T15:44:15Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-26T15:44:15Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"
registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\
"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb617
3ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"reg
istry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@s
ha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"16aec221-b9ec-4b79-ac12-986d05cb9b8b\\\",\\\"systemUUID\\\":\\\"7af7b453-01c3-4b8b-8c30-b1df8ce070ce\\\"},\\\"runtimeHandlers\\\":[{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":false},\\\"name\\\":\\\"runc\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"crun\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"\\\"}]}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 26 15:44:15 crc kubenswrapper[4907]: I0226 15:44:15.914493 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 26 15:44:15 crc kubenswrapper[4907]: I0226 15:44:15.914549 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 26 15:44:15 crc kubenswrapper[4907]: I0226 15:44:15.914562 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 26 15:44:15 crc kubenswrapper[4907]: I0226 15:44:15.914581 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 26 15:44:15 crc kubenswrapper[4907]: I0226 15:44:15.914617 4907 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-26T15:44:15Z","lastTransitionTime":"2026-02-26T15:44:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 26 15:44:15 crc kubenswrapper[4907]: E0226 15:44:15.930211 4907 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"7800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"24148068Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"8\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"24608868Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-02-26T15:44:15Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-26T15:44:15Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-26T15:44:15Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-26T15:44:15Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-26T15:44:15Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-26T15:44:15Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID 
available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-26T15:44:15Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-26T15:44:15Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"
registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\
"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb617
3ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"reg
istry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@s
ha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"16aec221-b9ec-4b79-ac12-986d05cb9b8b\\\",\\\"systemUUID\\\":\\\"7af7b453-01c3-4b8b-8c30-b1df8ce070ce\\\"},\\\"runtimeHandlers\\\":[{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":false},\\\"name\\\":\\\"runc\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"crun\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"\\\"}]}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 26 15:44:15 crc kubenswrapper[4907]: I0226 15:44:15.934994 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 26 15:44:15 crc kubenswrapper[4907]: I0226 15:44:15.935054 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 26 15:44:15 crc kubenswrapper[4907]: I0226 15:44:15.935075 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 26 15:44:15 crc kubenswrapper[4907]: I0226 15:44:15.935102 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 26 15:44:15 crc kubenswrapper[4907]: I0226 15:44:15.935124 4907 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-26T15:44:15Z","lastTransitionTime":"2026-02-26T15:44:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 26 15:44:15 crc kubenswrapper[4907]: E0226 15:44:15.948579 4907 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"7800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"24148068Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"8\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"24608868Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-02-26T15:44:15Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-26T15:44:15Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-26T15:44:15Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-26T15:44:15Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-26T15:44:15Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-26T15:44:15Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID 
available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-26T15:44:15Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-26T15:44:15Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"
registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\
"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb617
3ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"reg
istry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@s
ha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"16aec221-b9ec-4b79-ac12-986d05cb9b8b\\\",\\\"systemUUID\\\":\\\"7af7b453-01c3-4b8b-8c30-b1df8ce070ce\\\"},\\\"runtimeHandlers\\\":[{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":false},\\\"name\\\":\\\"runc\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"crun\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"\\\"}]}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 26 15:44:15 crc kubenswrapper[4907]: E0226 15:44:15.948918 4907 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Feb 26 15:44:15 crc kubenswrapper[4907]: E0226 15:44:15.948960 4907 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 26 15:44:16 crc kubenswrapper[4907]: E0226 15:44:16.049343 4907 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 26 15:44:16 crc kubenswrapper[4907]: E0226 15:44:16.149506 4907 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 26 15:44:16 crc kubenswrapper[4907]: E0226 15:44:16.250580 4907 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 26 15:44:16 crc kubenswrapper[4907]: E0226 15:44:16.351134 4907 kubelet_node_status.go:503] "Error getting the current 
node from lister" err="node \"crc\" not found" Feb 26 15:44:16 crc kubenswrapper[4907]: E0226 15:44:16.451962 4907 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 26 15:44:16 crc kubenswrapper[4907]: E0226 15:44:16.552318 4907 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 26 15:44:16 crc kubenswrapper[4907]: E0226 15:44:16.653170 4907 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 26 15:44:16 crc kubenswrapper[4907]: E0226 15:44:16.753967 4907 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 26 15:44:16 crc kubenswrapper[4907]: E0226 15:44:16.854837 4907 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 26 15:44:16 crc kubenswrapper[4907]: E0226 15:44:16.955904 4907 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 26 15:44:17 crc kubenswrapper[4907]: E0226 15:44:17.056880 4907 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 26 15:44:17 crc kubenswrapper[4907]: I0226 15:44:17.135881 4907 reflector.go:368] Caches populated for *v1.Node from k8s.io/client-go/informers/factory.go:160 Feb 26 15:44:17 crc kubenswrapper[4907]: I0226 15:44:17.155175 4907 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-apiserver/kube-apiserver-crc" Feb 26 15:44:17 crc kubenswrapper[4907]: I0226 15:44:17.163380 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 26 15:44:17 crc kubenswrapper[4907]: I0226 15:44:17.163420 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 26 15:44:17 crc kubenswrapper[4907]: I0226 15:44:17.163436 4907 
kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 26 15:44:17 crc kubenswrapper[4907]: I0226 15:44:17.163456 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 26 15:44:17 crc kubenswrapper[4907]: I0226 15:44:17.163471 4907 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-26T15:44:17Z","lastTransitionTime":"2026-02-26T15:44:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 26 15:44:17 crc kubenswrapper[4907]: I0226 15:44:17.175700 4907 scope.go:117] "RemoveContainer" containerID="a3c61b08bda7c918a3fa7b01e6f80515ee05a5746e189e829d2872c181b80c85" Feb 26 15:44:17 crc kubenswrapper[4907]: E0226 15:44:17.176190 4907 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-apiserver-check-endpoints\" with CrashLoopBackOff: \"back-off 1m20s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\"" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" Feb 26 15:44:17 crc kubenswrapper[4907]: I0226 15:44:17.266114 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 26 15:44:17 crc kubenswrapper[4907]: I0226 15:44:17.266158 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 26 15:44:17 crc kubenswrapper[4907]: I0226 15:44:17.266168 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 26 15:44:17 crc kubenswrapper[4907]: I0226 15:44:17.266185 
4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 26 15:44:17 crc kubenswrapper[4907]: I0226 15:44:17.266198 4907 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-26T15:44:17Z","lastTransitionTime":"2026-02-26T15:44:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 26 15:44:17 crc kubenswrapper[4907]: I0226 15:44:17.368899 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 26 15:44:17 crc kubenswrapper[4907]: I0226 15:44:17.368959 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 26 15:44:17 crc kubenswrapper[4907]: I0226 15:44:17.368970 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 26 15:44:17 crc kubenswrapper[4907]: I0226 15:44:17.368988 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 26 15:44:17 crc kubenswrapper[4907]: I0226 15:44:17.369001 4907 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-26T15:44:17Z","lastTransitionTime":"2026-02-26T15:44:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 26 15:44:17 crc kubenswrapper[4907]: I0226 15:44:17.471720 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 26 15:44:17 crc kubenswrapper[4907]: I0226 15:44:17.471779 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 26 15:44:17 crc kubenswrapper[4907]: I0226 15:44:17.471797 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 26 15:44:17 crc kubenswrapper[4907]: I0226 15:44:17.471820 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 26 15:44:17 crc kubenswrapper[4907]: I0226 15:44:17.471837 4907 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-26T15:44:17Z","lastTransitionTime":"2026-02-26T15:44:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 26 15:44:17 crc kubenswrapper[4907]: I0226 15:44:17.574607 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 26 15:44:17 crc kubenswrapper[4907]: I0226 15:44:17.574648 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 26 15:44:17 crc kubenswrapper[4907]: I0226 15:44:17.574660 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 26 15:44:17 crc kubenswrapper[4907]: I0226 15:44:17.574676 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 26 15:44:17 crc kubenswrapper[4907]: I0226 15:44:17.574688 4907 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-26T15:44:17Z","lastTransitionTime":"2026-02-26T15:44:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 26 15:44:17 crc kubenswrapper[4907]: I0226 15:44:17.677975 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 26 15:44:17 crc kubenswrapper[4907]: I0226 15:44:17.678038 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 26 15:44:17 crc kubenswrapper[4907]: I0226 15:44:17.678055 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 26 15:44:17 crc kubenswrapper[4907]: I0226 15:44:17.678077 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 26 15:44:17 crc kubenswrapper[4907]: I0226 15:44:17.678096 4907 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-26T15:44:17Z","lastTransitionTime":"2026-02-26T15:44:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 26 15:44:17 crc kubenswrapper[4907]: I0226 15:44:17.780427 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 26 15:44:17 crc kubenswrapper[4907]: I0226 15:44:17.780459 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 26 15:44:17 crc kubenswrapper[4907]: I0226 15:44:17.780468 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 26 15:44:17 crc kubenswrapper[4907]: I0226 15:44:17.780480 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 26 15:44:17 crc kubenswrapper[4907]: I0226 15:44:17.780488 4907 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-26T15:44:17Z","lastTransitionTime":"2026-02-26T15:44:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 26 15:44:17 crc kubenswrapper[4907]: I0226 15:44:17.883392 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 26 15:44:17 crc kubenswrapper[4907]: I0226 15:44:17.883495 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 26 15:44:17 crc kubenswrapper[4907]: I0226 15:44:17.883517 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 26 15:44:17 crc kubenswrapper[4907]: I0226 15:44:17.883577 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 26 15:44:17 crc kubenswrapper[4907]: I0226 15:44:17.883646 4907 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-26T15:44:17Z","lastTransitionTime":"2026-02-26T15:44:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 26 15:44:17 crc kubenswrapper[4907]: I0226 15:44:17.986887 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 26 15:44:17 crc kubenswrapper[4907]: I0226 15:44:17.986929 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 26 15:44:17 crc kubenswrapper[4907]: I0226 15:44:17.986940 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 26 15:44:17 crc kubenswrapper[4907]: I0226 15:44:17.986956 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 26 15:44:17 crc kubenswrapper[4907]: I0226 15:44:17.986965 4907 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-26T15:44:17Z","lastTransitionTime":"2026-02-26T15:44:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 26 15:44:18 crc kubenswrapper[4907]: E0226 15:44:18.087892 4907 kubelet_node_status.go:497] "Node not becoming ready in time after startup" Feb 26 15:44:18 crc kubenswrapper[4907]: I0226 15:44:18.117081 4907 apiserver.go:52] "Watching apiserver" Feb 26 15:44:18 crc kubenswrapper[4907]: I0226 15:44:18.124164 4907 reflector.go:368] Caches populated for *v1.Pod from pkg/kubelet/config/apiserver.go:66 Feb 26 15:44:18 crc kubenswrapper[4907]: I0226 15:44:18.124795 4907 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-network-operator/iptables-alerter-4ln5h","openshift-kube-apiserver/kube-apiserver-crc","openshift-multus/multus-2gl5t","openshift-network-console/networking-console-plugin-85b44fc459-gdk6g","openshift-network-diagnostics/network-check-source-55646444c4-trplf","openshift-network-diagnostics/network-check-target-xd92c","openshift-network-node-identity/network-node-identity-vrzqb","openshift-network-operator/network-operator-58b4c7f79c-55gtf","openshift-ovn-kubernetes/ovnkube-node-vsvsw","openshift-dns/node-resolver-958vt","openshift-image-registry/node-ca-9gtgp","openshift-machine-config-operator/machine-config-daemon-v5ng6","openshift-multus/multus-additional-cni-plugins-b2qgz"] Feb 26 15:44:18 crc kubenswrapper[4907]: I0226 15:44:18.125199 4907 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Feb 26 15:44:18 crc kubenswrapper[4907]: I0226 15:44:18.125867 4907 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 26 15:44:18 crc kubenswrapper[4907]: I0226 15:44:18.125894 4907 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/node-resolver-958vt" Feb 26 15:44:18 crc kubenswrapper[4907]: I0226 15:44:18.125955 4907 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-node-identity/network-node-identity-vrzqb" Feb 26 15:44:18 crc kubenswrapper[4907]: I0226 15:44:18.126053 4907 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/node-ca-9gtgp" Feb 26 15:44:18 crc kubenswrapper[4907]: I0226 15:44:18.126649 4907 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 26 15:44:18 crc kubenswrapper[4907]: I0226 15:44:18.126790 4907 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 26 15:44:18 crc kubenswrapper[4907]: I0226 15:44:18.126907 4907 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-2gl5t" Feb 26 15:44:18 crc kubenswrapper[4907]: I0226 15:44:18.127036 4907 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-vsvsw" Feb 26 15:44:18 crc kubenswrapper[4907]: I0226 15:44:18.127142 4907 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-operator/iptables-alerter-4ln5h" Feb 26 15:44:18 crc kubenswrapper[4907]: E0226 15:44:18.128804 4907 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 26 15:44:18 crc kubenswrapper[4907]: E0226 15:44:18.129667 4907 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 26 15:44:18 crc kubenswrapper[4907]: E0226 15:44:18.130119 4907 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 26 15:44:18 crc kubenswrapper[4907]: I0226 15:44:18.130430 4907 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-additional-cni-plugins-b2qgz" Feb 26 15:44:18 crc kubenswrapper[4907]: I0226 15:44:18.131033 4907 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-operator"/"kube-root-ca.crt" Feb 26 15:44:18 crc kubenswrapper[4907]: I0226 15:44:18.131062 4907 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-machine-config-operator/machine-config-daemon-v5ng6" Feb 26 15:44:18 crc kubenswrapper[4907]: I0226 15:44:18.131132 4907 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"openshift-service-ca.crt" Feb 26 15:44:18 crc kubenswrapper[4907]: I0226 15:44:18.131279 4907 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-operator"/"openshift-service-ca.crt" Feb 26 15:44:18 crc kubenswrapper[4907]: I0226 15:44:18.131033 4907 scope.go:117] "RemoveContainer" containerID="a3c61b08bda7c918a3fa7b01e6f80515ee05a5746e189e829d2872c181b80c85" Feb 26 15:44:18 crc kubenswrapper[4907]: I0226 15:44:18.131279 4907 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns"/"openshift-service-ca.crt" Feb 26 15:44:18 crc kubenswrapper[4907]: I0226 15:44:18.131667 4907 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns"/"node-resolver-dockercfg-kz9s7" Feb 26 15:44:18 crc kubenswrapper[4907]: I0226 15:44:18.131768 4907 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"kube-root-ca.crt" Feb 26 15:44:18 crc kubenswrapper[4907]: I0226 15:44:18.131839 4907 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-network-node-identity"/"network-node-identity-cert" Feb 26 15:44:18 crc kubenswrapper[4907]: I0226 15:44:18.132015 4907 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-network-operator"/"metrics-tls" Feb 26 15:44:18 crc kubenswrapper[4907]: E0226 15:44:18.131654 4907 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-apiserver-check-endpoints\" with CrashLoopBackOff: \"back-off 1m20s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\"" pod="openshift-kube-apiserver/kube-apiserver-crc" 
podUID="f4b27818a5e8e43d0dc095d08835c792" Feb 26 15:44:18 crc kubenswrapper[4907]: I0226 15:44:18.132780 4907 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"openshift-service-ca.crt" Feb 26 15:44:18 crc kubenswrapper[4907]: I0226 15:44:18.132991 4907 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"openshift-service-ca.crt" Feb 26 15:44:18 crc kubenswrapper[4907]: I0226 15:44:18.132997 4907 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns"/"kube-root-ca.crt" Feb 26 15:44:18 crc kubenswrapper[4907]: I0226 15:44:18.133089 4907 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"env-overrides" Feb 26 15:44:18 crc kubenswrapper[4907]: I0226 15:44:18.133170 4907 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"cni-copy-resources" Feb 26 15:44:18 crc kubenswrapper[4907]: I0226 15:44:18.133280 4907 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"ovnkube-identity-cm" Feb 26 15:44:18 crc kubenswrapper[4907]: I0226 15:44:18.133357 4907 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"ovnkube-config" Feb 26 15:44:18 crc kubenswrapper[4907]: I0226 15:44:18.133372 4907 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"openshift-service-ca.crt" Feb 26 15:44:18 crc kubenswrapper[4907]: I0226 15:44:18.133364 4907 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"env-overrides" Feb 26 15:44:18 crc kubenswrapper[4907]: I0226 15:44:18.133454 4907 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-kubernetes-node-dockercfg-pwtwl" Feb 26 15:44:18 crc kubenswrapper[4907]: I0226 15:44:18.133735 4907 reflector.go:368] Caches populated for *v1.Secret from 
object-"openshift-ovn-kubernetes"/"ovn-node-metrics-cert" Feb 26 15:44:18 crc kubenswrapper[4907]: I0226 15:44:18.135123 4907 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"kube-root-ca.crt" Feb 26 15:44:18 crc kubenswrapper[4907]: I0226 15:44:18.135251 4907 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"multus-daemon-config" Feb 26 15:44:18 crc kubenswrapper[4907]: I0226 15:44:18.135343 4907 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"kube-root-ca.crt" Feb 26 15:44:18 crc kubenswrapper[4907]: I0226 15:44:18.135446 4907 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"ovnkube-script-lib" Feb 26 15:44:18 crc kubenswrapper[4907]: I0226 15:44:18.135531 4907 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"kube-root-ca.crt" Feb 26 15:44:18 crc kubenswrapper[4907]: I0226 15:44:18.137699 4907 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-operator"/"iptables-alerter-script" Feb 26 15:44:18 crc kubenswrapper[4907]: I0226 15:44:18.139116 4907 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"default-cni-sysctl-allowlist" Feb 26 15:44:18 crc kubenswrapper[4907]: I0226 15:44:18.139183 4907 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"image-registry-certificates" Feb 26 15:44:18 crc kubenswrapper[4907]: I0226 15:44:18.140711 4907 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"kube-rbac-proxy" Feb 26 15:44:18 crc kubenswrapper[4907]: I0226 15:44:18.140975 4907 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"proxy-tls" Feb 26 15:44:18 crc kubenswrapper[4907]: I0226 15:44:18.141172 4907 reflector.go:368] Caches populated for *v1.Secret from 
object-"openshift-image-registry"/"node-ca-dockercfg-4777p" Feb 26 15:44:18 crc kubenswrapper[4907]: I0226 15:44:18.141380 4907 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-daemon-dockercfg-r5tcq" Feb 26 15:44:18 crc kubenswrapper[4907]: I0226 15:44:18.141619 4907 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"openshift-service-ca.crt" Feb 26 15:44:18 crc kubenswrapper[4907]: I0226 15:44:18.141798 4907 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"kube-root-ca.crt" Feb 26 15:44:18 crc kubenswrapper[4907]: I0226 15:44:18.141950 4907 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"multus-ancillary-tools-dockercfg-vnmsz" Feb 26 15:44:18 crc kubenswrapper[4907]: I0226 15:44:18.142689 4907 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"default-dockercfg-2q5b6" Feb 26 15:44:18 crc kubenswrapper[4907]: I0226 15:44:18.153561 4907 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-26T15:44:18Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-26T15:44:18Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 26 15:44:18 crc kubenswrapper[4907]: I0226 15:44:18.165752 4907 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-9gtgp" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ae882fbf-ac76-4363-a10c-60eaf80ee7c7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T15:44:18Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T15:44:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T15:44:18Z\\\",\\\"message\\\":\\\"containers with unready status: [node-ca]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T15:44:18Z\\\",\\\"message\\\":\\\"containers with unready status: 
[node-ca]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xl77m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-26T15:44:18Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-9gtgp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 26 15:44:18 crc kubenswrapper[4907]: I0226 15:44:18.181332 4907 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-2gl5t" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"51024bd5-00ff-4e2f-927c-8c989b59d7be\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T15:44:18Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T15:44:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T15:44:18Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T15:44:18Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/
kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2fx5n\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-26T15:44:18Z\\\"}}\" for pod \"openshift-multus\"/\"multus-2gl5t\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 26 15:44:18 crc kubenswrapper[4907]: I0226 15:44:18.182498 4907 desired_state_of_world_populator.go:154] "Finished populating initial desired state of world" Feb 26 15:44:18 crc kubenswrapper[4907]: I0226 15:44:18.197288 4907 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-vsvsw" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"49ee65e1-8667-4ad7-a403-c899f0cc6a70\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T15:44:18Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T15:44:18Z\\\",\\\"message\\\":\\\"containers with incomplete status: [kubecfg-setup]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T15:44:18Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics 
northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T15:44:18Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7hmb7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mou
ntPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7hmb7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7hmb7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7hmb7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\
"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7hmb7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7hmb7\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitc
h\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7hmb7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7hmb7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceacc
ount\\\",\\\"name\\\":\\\"kube-api-access-7hmb7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-26T15:44:18Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-vsvsw\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 26 15:44:18 crc kubenswrapper[4907]: I0226 15:44:18.206338 4907 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-26T15:44:18Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-26T15:44:18Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was 
deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 26 15:44:18 crc kubenswrapper[4907]: I0226 15:44:18.217098 4907 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-26T15:44:18Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-26T15:44:18Z\\\",\\\"message\\\":\\\"containers with unready status: 
[iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 26 15:44:18 crc kubenswrapper[4907]: E0226 15:44:18.219139 4907 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
Feb 26 15:44:18 crc kubenswrapper[4907]: I0226 15:44:18.232777 4907 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"27c9ab80-fcc8-4c5a-9d89-c0504e0e6396\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T15:42:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T15:42:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T15:42:18Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T15:42:18Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T15:42:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bbc5e8c015ccc6b1a4740c955375e4f995f69ff1f1f698d8e2660ef451da6b8c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T15:42:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://64e8ac34f3cae799ba04d2bba51c22e4d99cf03261778fe3ba7a2320e661e727\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T15:42:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://42e24dea757f775f836c5c1fdb77c920db85f523bc0a35d2f2fb22e766274556\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T15:42:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a3c61b08bda7c918a3fa7b01e6f80515ee05a5746e189e829d2872c181b80c85\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a3c61b08bda7c918a3fa7b01e6f80515ee05a5746e189e829d2872c181b80c85\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-02-26T15:44:12Z\\\",\\\"message\\\":\\\"le observer\\\\nW0226 15:44:11.651017 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0226 15:44:11.651151 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0226 15:44:11.653054 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-1720683088/tls.crt::/tmp/serving-cert-1720683088/tls.key\\\\\\\"\\\\nI0226 15:44:12.242500 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0226 15:44:12.245173 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0226 15:44:12.245192 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0226 15:44:12.245214 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0226 15:44:12.245219 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0226 15:44:12.248257 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0226 15:44:12.248276 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0226 15:44:12.248281 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0226 15:44:12.248286 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0226 15:44:12.248289 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0226 15:44:12.248292 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0226 15:44:12.248295 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0226 15:44:12.248403 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF0226 15:44:12.250972 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-26T15:44:11Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":4,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 1m20s restarting failed 
container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8cf7bf0e49be4282c641d1e48be50a327bb418475701bfde61f4249724709e11\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T15:42:21Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7ff4ef3cac1d6f77bf9c90ee9a0f1d8fca15084e93afdb4e4e0048cbfe904f19\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7ff4ef3cac1d6f77bf9c90ee9a0f1d8fca15084e93afdb4e4e0048cbfe904f19\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-26T15:42:19Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-26T15:42:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"n
ame\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-26T15:42:18Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 26 15:44:18 crc kubenswrapper[4907]: I0226 15:44:18.242903 4907 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-26T15:44:18Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-26T15:44:18Z\\\",\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 26 15:44:18 crc kubenswrapper[4907]: I0226 15:44:18.251426 4907 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-958vt" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"4569fec7-a859-4a9e-b9d9-34ccc7c6be9c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T15:44:18Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T15:44:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T15:44:18Z\\\",\\\"message\\\":\\\"containers with unready 
status: [dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T15:44:18Z\\\",\\\"message\\\":\\\"containers with unready status: [dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8nhj9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-26T15:44:18Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-958vt\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 26 15:44:18 crc kubenswrapper[4907]: I0226 15:44:18.260722 4907 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-26T15:44:18Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-26T15:44:18Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 26 15:44:18 crc kubenswrapper[4907]: I0226 15:44:18.269443 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wxkg8\" (UniqueName: \"kubernetes.io/projected/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-kube-api-access-wxkg8\") pod \"3cb93b32-e0ae-4377-b9c8-fdb9842c6d59\" (UID: \"3cb93b32-e0ae-4377-b9c8-fdb9842c6d59\") " Feb 26 15:44:18 crc kubenswrapper[4907]: I0226 15:44:18.269497 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zkvpv\" (UniqueName: \"kubernetes.io/projected/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-kube-api-access-zkvpv\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") " Feb 26 15:44:18 crc kubenswrapper[4907]: I0226 15:44:18.269554 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pj782\" (UniqueName: \"kubernetes.io/projected/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-kube-api-access-pj782\") pod 
\"b6cd30de-2eeb-49a2-ab40-9167f4560ff5\" (UID: \"b6cd30de-2eeb-49a2-ab40-9167f4560ff5\") " Feb 26 15:44:18 crc kubenswrapper[4907]: I0226 15:44:18.269850 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-kube-api-access-wxkg8" (OuterVolumeSpecName: "kube-api-access-wxkg8") pod "3cb93b32-e0ae-4377-b9c8-fdb9842c6d59" (UID: "3cb93b32-e0ae-4377-b9c8-fdb9842c6d59"). InnerVolumeSpecName "kube-api-access-wxkg8". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 26 15:44:18 crc kubenswrapper[4907]: I0226 15:44:18.269898 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-kube-api-access-zkvpv" (OuterVolumeSpecName: "kube-api-access-zkvpv") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "kube-api-access-zkvpv". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 26 15:44:18 crc kubenswrapper[4907]: I0226 15:44:18.269985 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-kube-api-access-pj782" (OuterVolumeSpecName: "kube-api-access-pj782") pod "b6cd30de-2eeb-49a2-ab40-9167f4560ff5" (UID: "b6cd30de-2eeb-49a2-ab40-9167f4560ff5"). InnerVolumeSpecName "kube-api-access-pj782". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 26 15:44:18 crc kubenswrapper[4907]: I0226 15:44:18.269582 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-utilities\") pod \"b11524ee-3fca-4b1b-9cdf-6da289fdbc7d\" (UID: \"b11524ee-3fca-4b1b-9cdf-6da289fdbc7d\") " Feb 26 15:44:18 crc kubenswrapper[4907]: I0226 15:44:18.270072 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-client\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Feb 26 15:44:18 crc kubenswrapper[4907]: I0226 15:44:18.270113 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"image-registry-operator-tls\" (UniqueName: \"kubernetes.io/secret/bf126b07-da06-4140-9a57-dfd54fc6b486-image-registry-operator-tls\") pod \"bf126b07-da06-4140-9a57-dfd54fc6b486\" (UID: \"bf126b07-da06-4140-9a57-dfd54fc6b486\") " Feb 26 15:44:18 crc kubenswrapper[4907]: I0226 15:44:18.270201 4907 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-v5ng6" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"917eebf3-db36-47b8-af0a-b80d042fddab\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T15:44:18Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T15:44:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T15:44:18Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T15:44:18Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9lmpq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9lmpq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-26T15:44:18Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-v5ng6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post 
\"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 26 15:44:18 crc kubenswrapper[4907]: I0226 15:44:18.270380 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-client" (OuterVolumeSpecName: "etcd-client") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "etcd-client". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 26 15:44:18 crc kubenswrapper[4907]: I0226 15:44:18.270444 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-utilities" (OuterVolumeSpecName: "utilities") pod "b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" (UID: "b11524ee-3fca-4b1b-9cdf-6da289fdbc7d"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 26 15:44:18 crc kubenswrapper[4907]: I0226 15:44:18.270451 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-console-config\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") " Feb 26 15:44:18 crc kubenswrapper[4907]: I0226 15:44:18.270513 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-bound-sa-token\") pod \"a31745f5-9847-4afe-82a5-3161cc66ca93\" (UID: \"a31745f5-9847-4afe-82a5-3161cc66ca93\") " Feb 26 15:44:18 crc kubenswrapper[4907]: I0226 15:44:18.270540 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fcqwp\" (UniqueName: \"kubernetes.io/projected/5fe579f8-e8a6-4643-bce5-a661393c4dde-kube-api-access-fcqwp\") pod \"5fe579f8-e8a6-4643-bce5-a661393c4dde\" (UID: 
\"5fe579f8-e8a6-4643-bce5-a661393c4dde\") " Feb 26 15:44:18 crc kubenswrapper[4907]: I0226 15:44:18.270567 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"package-server-manager-serving-cert\" (UniqueName: \"kubernetes.io/secret/3ab1a177-2de0-46d9-b765-d0d0649bb42e-package-server-manager-serving-cert\") pod \"3ab1a177-2de0-46d9-b765-d0d0649bb42e\" (UID: \"3ab1a177-2de0-46d9-b765-d0d0649bb42e\") " Feb 26 15:44:18 crc kubenswrapper[4907]: I0226 15:44:18.270613 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7539238d-5fe0-46ed-884e-1c3b566537ec-config\") pod \"7539238d-5fe0-46ed-884e-1c3b566537ec\" (UID: \"7539238d-5fe0-46ed-884e-1c3b566537ec\") " Feb 26 15:44:18 crc kubenswrapper[4907]: I0226 15:44:18.270720 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bf126b07-da06-4140-9a57-dfd54fc6b486-image-registry-operator-tls" (OuterVolumeSpecName: "image-registry-operator-tls") pod "bf126b07-da06-4140-9a57-dfd54fc6b486" (UID: "bf126b07-da06-4140-9a57-dfd54fc6b486"). InnerVolumeSpecName "image-registry-operator-tls". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 26 15:44:18 crc kubenswrapper[4907]: I0226 15:44:18.270737 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-serving-cert\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") " Feb 26 15:44:18 crc kubenswrapper[4907]: I0226 15:44:18.270763 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-w4xd4\" (UniqueName: \"kubernetes.io/projected/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-kube-api-access-w4xd4\") pod \"8cea82b4-6893-4ddc-af9f-1bb5ae425c5b\" (UID: \"8cea82b4-6893-4ddc-af9f-1bb5ae425c5b\") " Feb 26 15:44:18 crc kubenswrapper[4907]: I0226 15:44:18.270789 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-multus-daemon-config\") pod \"4bb40260-dbaa-4fb0-84df-5e680505d512\" (UID: \"4bb40260-dbaa-4fb0-84df-5e680505d512\") " Feb 26 15:44:18 crc kubenswrapper[4907]: I0226 15:44:18.270812 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/31d8b7a1-420e-4252-a5b7-eebe8a111292-proxy-tls\") pod \"31d8b7a1-420e-4252-a5b7-eebe8a111292\" (UID: \"31d8b7a1-420e-4252-a5b7-eebe8a111292\") " Feb 26 15:44:18 crc kubenswrapper[4907]: I0226 15:44:18.270836 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/e7e6199b-1264-4501-8953-767f51328d08-kube-api-access\") pod \"e7e6199b-1264-4501-8953-767f51328d08\" (UID: \"e7e6199b-1264-4501-8953-767f51328d08\") " Feb 26 15:44:18 crc kubenswrapper[4907]: I0226 15:44:18.270859 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume 
\"srv-cert\" (UniqueName: \"kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-srv-cert\") pod \"b6312bbd-5731-4ea0-a20f-81d5a57df44a\" (UID: \"b6312bbd-5731-4ea0-a20f-81d5a57df44a\") " Feb 26 15:44:18 crc kubenswrapper[4907]: I0226 15:44:18.270883 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-apiservice-cert\") pod \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\" (UID: \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\") " Feb 26 15:44:18 crc kubenswrapper[4907]: I0226 15:44:18.270905 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/496e6271-fb68-4057-954e-a0d97a4afa3f-kube-api-access\") pod \"496e6271-fb68-4057-954e-a0d97a4afa3f\" (UID: \"496e6271-fb68-4057-954e-a0d97a4afa3f\") " Feb 26 15:44:18 crc kubenswrapper[4907]: I0226 15:44:18.270928 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-serving-cert\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Feb 26 15:44:18 crc kubenswrapper[4907]: I0226 15:44:18.270955 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/9d4552c7-cd75-42dd-8880-30dd377c49a4-serving-cert\") pod \"9d4552c7-cd75-42dd-8880-30dd377c49a4\" (UID: \"9d4552c7-cd75-42dd-8880-30dd377c49a4\") " Feb 26 15:44:18 crc kubenswrapper[4907]: I0226 15:44:18.270980 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-serving-cert\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Feb 26 15:44:18 
crc kubenswrapper[4907]: I0226 15:44:18.271002 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-serving-ca\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") " Feb 26 15:44:18 crc kubenswrapper[4907]: I0226 15:44:18.271028 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-trusted-ca-bundle\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Feb 26 15:44:18 crc kubenswrapper[4907]: I0226 15:44:18.271050 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/01ab3dd5-8196-46d0-ad33-122e2ca51def-config\") pod \"01ab3dd5-8196-46d0-ad33-122e2ca51def\" (UID: \"01ab3dd5-8196-46d0-ad33-122e2ca51def\") " Feb 26 15:44:18 crc kubenswrapper[4907]: I0226 15:44:18.271073 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-d4lsv\" (UniqueName: \"kubernetes.io/projected/25e176fe-21b4-4974-b1ed-c8b94f112a7f-kube-api-access-d4lsv\") pod \"25e176fe-21b4-4974-b1ed-c8b94f112a7f\" (UID: \"25e176fe-21b4-4974-b1ed-c8b94f112a7f\") " Feb 26 15:44:18 crc kubenswrapper[4907]: I0226 15:44:18.271100 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8tdtz\" (UniqueName: \"kubernetes.io/projected/09efc573-dbb6-4249-bd59-9b87aba8dd28-kube-api-access-8tdtz\") pod \"09efc573-dbb6-4249-bd59-9b87aba8dd28\" (UID: \"09efc573-dbb6-4249-bd59-9b87aba8dd28\") " Feb 26 15:44:18 crc kubenswrapper[4907]: I0226 15:44:18.271127 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ca-trust-extracted\" (UniqueName: 
\"kubernetes.io/empty-dir/8f668bae-612b-4b75-9490-919e737c6a3b-ca-trust-extracted\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 26 15:44:18 crc kubenswrapper[4907]: I0226 15:44:18.271149 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-service-ca\" (UniqueName: \"kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-service-ca\") pod \"09efc573-dbb6-4249-bd59-9b87aba8dd28\" (UID: \"09efc573-dbb6-4249-bd59-9b87aba8dd28\") " Feb 26 15:44:18 crc kubenswrapper[4907]: I0226 15:44:18.271173 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/fda69060-fa79-4696-b1a6-7980f124bf7c-proxy-tls\") pod \"fda69060-fa79-4696-b1a6-7980f124bf7c\" (UID: \"fda69060-fa79-4696-b1a6-7980f124bf7c\") " Feb 26 15:44:18 crc kubenswrapper[4907]: I0226 15:44:18.271195 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jhbk2\" (UniqueName: \"kubernetes.io/projected/bd23aa5c-e532-4e53-bccf-e79f130c5ae8-kube-api-access-jhbk2\") pod \"bd23aa5c-e532-4e53-bccf-e79f130c5ae8\" (UID: \"bd23aa5c-e532-4e53-bccf-e79f130c5ae8\") " Feb 26 15:44:18 crc kubenswrapper[4907]: I0226 15:44:18.271217 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qg5z5\" (UniqueName: \"kubernetes.io/projected/43509403-f426-496e-be36-56cef71462f5-kube-api-access-qg5z5\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") " Feb 26 15:44:18 crc kubenswrapper[4907]: I0226 15:44:18.271240 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qs4fp\" (UniqueName: \"kubernetes.io/projected/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-kube-api-access-qs4fp\") pod \"210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c\" (UID: \"210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c\") " Feb 26 15:44:18 
crc kubenswrapper[4907]: I0226 15:44:18.271342 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 26 15:44:18 crc kubenswrapper[4907]: I0226 15:44:18.271370 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-bound-sa-token\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 26 15:44:18 crc kubenswrapper[4907]: I0226 15:44:18.271395 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-bound-sa-token\") pod \"bf126b07-da06-4140-9a57-dfd54fc6b486\" (UID: \"bf126b07-da06-4140-9a57-dfd54fc6b486\") " Feb 26 15:44:18 crc kubenswrapper[4907]: I0226 15:44:18.271418 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6g6sz\" (UniqueName: \"kubernetes.io/projected/6509e943-70c6-444c-bc41-48a544e36fbd-kube-api-access-6g6sz\") pod \"6509e943-70c6-444c-bc41-48a544e36fbd\" (UID: \"6509e943-70c6-444c-bc41-48a544e36fbd\") " Feb 26 15:44:18 crc kubenswrapper[4907]: I0226 15:44:18.271447 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-encryption-config\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") " Feb 26 15:44:18 crc kubenswrapper[4907]: I0226 15:44:18.271696 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serviceca\" (UniqueName: 
\"kubernetes.io/configmap/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-serviceca\") pod \"3cb93b32-e0ae-4377-b9c8-fdb9842c6d59\" (UID: \"3cb93b32-e0ae-4377-b9c8-fdb9842c6d59\") " Feb 26 15:44:18 crc kubenswrapper[4907]: I0226 15:44:18.271730 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-metrics-certs\") pod \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\" (UID: \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\") " Feb 26 15:44:18 crc kubenswrapper[4907]: I0226 15:44:18.271760 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/0b574797-001e-440a-8f4e-c0be86edad0f-proxy-tls\") pod \"0b574797-001e-440a-8f4e-c0be86edad0f\" (UID: \"0b574797-001e-440a-8f4e-c0be86edad0f\") " Feb 26 15:44:18 crc kubenswrapper[4907]: I0226 15:44:18.271783 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/8f668bae-612b-4b75-9490-919e737c6a3b-installation-pull-secrets\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 26 15:44:18 crc kubenswrapper[4907]: I0226 15:44:18.271809 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2d4wz\" (UniqueName: \"kubernetes.io/projected/5441d097-087c-4d9a-baa8-b210afa90fc9-kube-api-access-2d4wz\") pod \"5441d097-087c-4d9a-baa8-b210afa90fc9\" (UID: \"5441d097-087c-4d9a-baa8-b210afa90fc9\") " Feb 26 15:44:18 crc kubenswrapper[4907]: I0226 15:44:18.271833 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"tmpfs\" (UniqueName: \"kubernetes.io/empty-dir/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-tmpfs\") pod \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\" (UID: \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\") " Feb 26 15:44:18 crc kubenswrapper[4907]: I0226 
15:44:18.271855 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/496e6271-fb68-4057-954e-a0d97a4afa3f-serving-cert\") pod \"496e6271-fb68-4057-954e-a0d97a4afa3f\" (UID: \"496e6271-fb68-4057-954e-a0d97a4afa3f\") " Feb 26 15:44:18 crc kubenswrapper[4907]: I0226 15:44:18.271929 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-auth-proxy-config\") pod \"31d8b7a1-420e-4252-a5b7-eebe8a111292\" (UID: \"31d8b7a1-420e-4252-a5b7-eebe8a111292\") " Feb 26 15:44:18 crc kubenswrapper[4907]: I0226 15:44:18.271954 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xcphl\" (UniqueName: \"kubernetes.io/projected/7583ce53-e0fe-4a16-9e4d-50516596a136-kube-api-access-xcphl\") pod \"7583ce53-e0fe-4a16-9e4d-50516596a136\" (UID: \"7583ce53-e0fe-4a16-9e4d-50516596a136\") " Feb 26 15:44:18 crc kubenswrapper[4907]: I0226 15:44:18.271979 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-binary-copy\") pod \"7bb08738-c794-4ee8-9972-3a62ca171029\" (UID: \"7bb08738-c794-4ee8-9972-3a62ca171029\") " Feb 26 15:44:18 crc kubenswrapper[4907]: I0226 15:44:18.272003 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-config\") pod \"210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c\" (UID: \"210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c\") " Feb 26 15:44:18 crc kubenswrapper[4907]: I0226 15:44:18.272022 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-config\") pod 
\"22c825df-677d-4ca6-82db-3454ed06e783\" (UID: \"22c825df-677d-4ca6-82db-3454ed06e783\") " Feb 26 15:44:18 crc kubenswrapper[4907]: I0226 15:44:18.272042 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-oauth-serving-cert\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") " Feb 26 15:44:18 crc kubenswrapper[4907]: I0226 15:44:18.272066 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-x7zkh\" (UniqueName: \"kubernetes.io/projected/6731426b-95fe-49ff-bb5f-40441049fde2-kube-api-access-x7zkh\") pod \"6731426b-95fe-49ff-bb5f-40441049fde2\" (UID: \"6731426b-95fe-49ff-bb5f-40441049fde2\") " Feb 26 15:44:18 crc kubenswrapper[4907]: I0226 15:44:18.272089 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-x2m85\" (UniqueName: \"kubernetes.io/projected/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d-kube-api-access-x2m85\") pod \"cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d\" (UID: \"cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d\") " Feb 26 15:44:18 crc kubenswrapper[4907]: I0226 15:44:18.272112 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xcgwh\" (UniqueName: \"kubernetes.io/projected/fda69060-fa79-4696-b1a6-7980f124bf7c-kube-api-access-xcgwh\") pod \"fda69060-fa79-4696-b1a6-7980f124bf7c\" (UID: \"fda69060-fa79-4696-b1a6-7980f124bf7c\") " Feb 26 15:44:18 crc kubenswrapper[4907]: I0226 15:44:18.272132 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-stats-auth\") pod \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\" (UID: \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\") " Feb 26 15:44:18 crc kubenswrapper[4907]: I0226 15:44:18.272153 4907 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-trusted-ca\") pod \"b6cd30de-2eeb-49a2-ab40-9167f4560ff5\" (UID: \"b6cd30de-2eeb-49a2-ab40-9167f4560ff5\") " Feb 26 15:44:18 crc kubenswrapper[4907]: I0226 15:44:18.272204 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/e7e6199b-1264-4501-8953-767f51328d08-serving-cert\") pod \"e7e6199b-1264-4501-8953-767f51328d08\" (UID: \"e7e6199b-1264-4501-8953-767f51328d08\") " Feb 26 15:44:18 crc kubenswrapper[4907]: I0226 15:44:18.272284 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-router-certs\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Feb 26 15:44:18 crc kubenswrapper[4907]: I0226 15:44:18.272311 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/c03ee662-fb2f-4fc4-a2c1-af487c19d254-service-ca-bundle\") pod \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\" (UID: \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\") " Feb 26 15:44:18 crc kubenswrapper[4907]: I0226 15:44:18.272336 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rnphk\" (UniqueName: \"kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-kube-api-access-rnphk\") pod \"bf126b07-da06-4140-9a57-dfd54fc6b486\" (UID: \"bf126b07-da06-4140-9a57-dfd54fc6b486\") " Feb 26 15:44:18 crc kubenswrapper[4907]: I0226 15:44:18.272360 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/0b78653f-4ff9-4508-8672-245ed9b561e3-serving-cert\") pod 
\"0b78653f-4ff9-4508-8672-245ed9b561e3\" (UID: \"0b78653f-4ff9-4508-8672-245ed9b561e3\") " Feb 26 15:44:18 crc kubenswrapper[4907]: I0226 15:44:18.272385 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"image-import-ca\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-image-import-ca\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Feb 26 15:44:18 crc kubenswrapper[4907]: I0226 15:44:18.272406 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-trusted-ca-bundle\") pod \"6509e943-70c6-444c-bc41-48a544e36fbd\" (UID: \"6509e943-70c6-444c-bc41-48a544e36fbd\") " Feb 26 15:44:18 crc kubenswrapper[4907]: I0226 15:44:18.272432 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-cni-binary-copy\") pod \"4bb40260-dbaa-4fb0-84df-5e680505d512\" (UID: \"4bb40260-dbaa-4fb0-84df-5e680505d512\") " Feb 26 15:44:18 crc kubenswrapper[4907]: I0226 15:44:18.272457 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/1386a44e-36a2-460c-96d0-0359d2b6f0f5-kube-api-access\") pod \"1386a44e-36a2-460c-96d0-0359d2b6f0f5\" (UID: \"1386a44e-36a2-460c-96d0-0359d2b6f0f5\") " Feb 26 15:44:18 crc kubenswrapper[4907]: I0226 15:44:18.272482 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-session\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Feb 26 15:44:18 crc kubenswrapper[4907]: I0226 15:44:18.272507 4907 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1386a44e-36a2-460c-96d0-0359d2b6f0f5-config\") pod \"1386a44e-36a2-460c-96d0-0359d2b6f0f5\" (UID: \"1386a44e-36a2-460c-96d0-0359d2b6f0f5\") " Feb 26 15:44:18 crc kubenswrapper[4907]: I0226 15:44:18.272529 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-config\") pod \"09efc573-dbb6-4249-bd59-9b87aba8dd28\" (UID: \"09efc573-dbb6-4249-bd59-9b87aba8dd28\") " Feb 26 15:44:18 crc kubenswrapper[4907]: I0226 15:44:18.272552 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-operator-metrics\") pod \"b6cd30de-2eeb-49a2-ab40-9167f4560ff5\" (UID: \"b6cd30de-2eeb-49a2-ab40-9167f4560ff5\") " Feb 26 15:44:18 crc kubenswrapper[4907]: I0226 15:44:18.272575 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7539238d-5fe0-46ed-884e-1c3b566537ec-serving-cert\") pod \"7539238d-5fe0-46ed-884e-1c3b566537ec\" (UID: \"7539238d-5fe0-46ed-884e-1c3b566537ec\") " Feb 26 15:44:18 crc kubenswrapper[4907]: I0226 15:44:18.272620 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"machine-api-operator-tls\" (UniqueName: \"kubernetes.io/secret/6402fda4-df10-493c-b4e5-d0569419652d-machine-api-operator-tls\") pod \"6402fda4-df10-493c-b4e5-d0569419652d\" (UID: \"6402fda4-df10-493c-b4e5-d0569419652d\") " Feb 26 15:44:18 crc kubenswrapper[4907]: I0226 15:44:18.272646 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-error\") pod 
\"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Feb 26 15:44:18 crc kubenswrapper[4907]: I0226 15:44:18.272673 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-images\") pod \"6402fda4-df10-493c-b4e5-d0569419652d\" (UID: \"6402fda4-df10-493c-b4e5-d0569419652d\") " Feb 26 15:44:18 crc kubenswrapper[4907]: I0226 15:44:18.272696 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-config\") pod \"6402fda4-df10-493c-b4e5-d0569419652d\" (UID: \"6402fda4-df10-493c-b4e5-d0569419652d\") " Feb 26 15:44:18 crc kubenswrapper[4907]: I0226 15:44:18.272721 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-metrics-tls\") pod \"96b93a3a-6083-4aea-8eab-fe1aa8245ad9\" (UID: \"96b93a3a-6083-4aea-8eab-fe1aa8245ad9\") " Feb 26 15:44:18 crc kubenswrapper[4907]: I0226 15:44:18.272745 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-script-lib\") pod \"6ea678ab-3438-413e-bfe3-290ae7725660\" (UID: \"6ea678ab-3438-413e-bfe3-290ae7725660\") " Feb 26 15:44:18 crc kubenswrapper[4907]: I0226 15:44:18.272768 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/bf126b07-da06-4140-9a57-dfd54fc6b486-trusted-ca\") pod \"bf126b07-da06-4140-9a57-dfd54fc6b486\" (UID: \"bf126b07-da06-4140-9a57-dfd54fc6b486\") " Feb 26 15:44:18 crc kubenswrapper[4907]: I0226 15:44:18.272793 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: 
\"kubernetes.io/secret/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-serving-cert\") pod \"210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c\" (UID: \"210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c\") " Feb 26 15:44:18 crc kubenswrapper[4907]: I0226 15:44:18.271098 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-serving-cert" (OuterVolumeSpecName: "console-serving-cert") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "console-serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 26 15:44:18 crc kubenswrapper[4907]: I0226 15:44:18.272816 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-trusted-ca-bundle\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") " Feb 26 15:44:18 crc kubenswrapper[4907]: I0226 15:44:18.271342 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 26 15:44:18 crc kubenswrapper[4907]: I0226 15:44:18.271456 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-serving-cert" (OuterVolumeSpecName: "v4-0-config-system-serving-cert") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-serving-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 26 15:44:18 crc kubenswrapper[4907]: I0226 15:44:18.272842 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-auth-proxy-config\") pod \"22c825df-677d-4ca6-82db-3454ed06e783\" (UID: \"22c825df-677d-4ca6-82db-3454ed06e783\") " Feb 26 15:44:18 crc kubenswrapper[4907]: I0226 15:44:18.272868 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1386a44e-36a2-460c-96d0-0359d2b6f0f5-serving-cert\") pod \"1386a44e-36a2-460c-96d0-0359d2b6f0f5\" (UID: \"1386a44e-36a2-460c-96d0-0359d2b6f0f5\") " Feb 26 15:44:18 crc kubenswrapper[4907]: I0226 15:44:18.272894 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"control-plane-machine-set-operator-tls\" (UniqueName: \"kubernetes.io/secret/6731426b-95fe-49ff-bb5f-40441049fde2-control-plane-machine-set-operator-tls\") pod \"6731426b-95fe-49ff-bb5f-40441049fde2\" (UID: \"6731426b-95fe-49ff-bb5f-40441049fde2\") " Feb 26 15:44:18 crc kubenswrapper[4907]: I0226 15:44:18.272918 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-audit-policies\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") " Feb 26 15:44:18 crc kubenswrapper[4907]: I0226 15:44:18.273012 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"mcc-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/0b574797-001e-440a-8f4e-c0be86edad0f-mcc-auth-proxy-config\") pod \"0b574797-001e-440a-8f4e-c0be86edad0f\" (UID: \"0b574797-001e-440a-8f4e-c0be86edad0f\") " Feb 26 15:44:18 crc kubenswrapper[4907]: I0226 15:44:18.273044 4907 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-service-ca\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") " Feb 26 15:44:18 crc kubenswrapper[4907]: I0226 15:44:18.273065 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-profile-collector-cert\") pod \"f88749ec-7931-4ee7-b3fc-1ec5e11f92e9\" (UID: \"f88749ec-7931-4ee7-b3fc-1ec5e11f92e9\") " Feb 26 15:44:18 crc kubenswrapper[4907]: I0226 15:44:18.273122 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/a31745f5-9847-4afe-82a5-3161cc66ca93-trusted-ca\") pod \"a31745f5-9847-4afe-82a5-3161cc66ca93\" (UID: \"a31745f5-9847-4afe-82a5-3161cc66ca93\") " Feb 26 15:44:18 crc kubenswrapper[4907]: I0226 15:44:18.273151 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mnrrd\" (UniqueName: \"kubernetes.io/projected/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-kube-api-access-mnrrd\") pod \"bc5039c0-ea34-426b-a2b7-fbbc87b49a6d\" (UID: \"bc5039c0-ea34-426b-a2b7-fbbc87b49a6d\") " Feb 26 15:44:18 crc kubenswrapper[4907]: I0226 15:44:18.273232 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-w7l8j\" (UniqueName: \"kubernetes.io/projected/01ab3dd5-8196-46d0-ad33-122e2ca51def-kube-api-access-w7l8j\") pod \"01ab3dd5-8196-46d0-ad33-122e2ca51def\" (UID: \"01ab3dd5-8196-46d0-ad33-122e2ca51def\") " Feb 26 15:44:18 crc kubenswrapper[4907]: I0226 15:44:18.273258 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-env-overrides\") pod 
\"925f1c65-6136-48ba-85aa-3a3b50560753\" (UID: \"925f1c65-6136-48ba-85aa-3a3b50560753\") " Feb 26 15:44:18 crc kubenswrapper[4907]: I0226 15:44:18.273309 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-trusted-ca-bundle\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Feb 26 15:44:18 crc kubenswrapper[4907]: I0226 15:44:18.273361 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9xfj7\" (UniqueName: \"kubernetes.io/projected/5225d0e4-402f-4861-b410-819f433b1803-kube-api-access-9xfj7\") pod \"5225d0e4-402f-4861-b410-819f433b1803\" (UID: \"5225d0e4-402f-4861-b410-819f433b1803\") " Feb 26 15:44:18 crc kubenswrapper[4907]: I0226 15:44:18.273389 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-serving-cert\") pod \"bc5039c0-ea34-426b-a2b7-fbbc87b49a6d\" (UID: \"bc5039c0-ea34-426b-a2b7-fbbc87b49a6d\") " Feb 26 15:44:18 crc kubenswrapper[4907]: I0226 15:44:18.273412 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7c4vf\" (UniqueName: \"kubernetes.io/projected/22c825df-677d-4ca6-82db-3454ed06e783-kube-api-access-7c4vf\") pod \"22c825df-677d-4ca6-82db-3454ed06e783\" (UID: \"22c825df-677d-4ca6-82db-3454ed06e783\") " Feb 26 15:44:18 crc kubenswrapper[4907]: I0226 15:44:18.273469 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-tk88c\" (UniqueName: \"kubernetes.io/projected/7539238d-5fe0-46ed-884e-1c3b566537ec-kube-api-access-tk88c\") pod \"7539238d-5fe0-46ed-884e-1c3b566537ec\" (UID: \"7539238d-5fe0-46ed-884e-1c3b566537ec\") " Feb 26 15:44:18 crc kubenswrapper[4907]: I0226 15:44:18.273497 4907 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/a31745f5-9847-4afe-82a5-3161cc66ca93-metrics-tls\") pod \"a31745f5-9847-4afe-82a5-3161cc66ca93\" (UID: \"a31745f5-9847-4afe-82a5-3161cc66ca93\") " Feb 26 15:44:18 crc kubenswrapper[4907]: I0226 15:44:18.273525 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-catalog-content\") pod \"b11524ee-3fca-4b1b-9cdf-6da289fdbc7d\" (UID: \"b11524ee-3fca-4b1b-9cdf-6da289fdbc7d\") " Feb 26 15:44:18 crc kubenswrapper[4907]: I0226 15:44:18.273552 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-config\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Feb 26 15:44:18 crc kubenswrapper[4907]: I0226 15:44:18.273578 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-htfz6\" (UniqueName: \"kubernetes.io/projected/6ea678ab-3438-413e-bfe3-290ae7725660-kube-api-access-htfz6\") pod \"6ea678ab-3438-413e-bfe3-290ae7725660\" (UID: \"6ea678ab-3438-413e-bfe3-290ae7725660\") " Feb 26 15:44:18 crc kubenswrapper[4907]: I0226 15:44:18.273633 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-provider-selection\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Feb 26 15:44:18 crc kubenswrapper[4907]: I0226 15:44:18.273659 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-encryption-config\") pod 
\"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Feb 26 15:44:18 crc kubenswrapper[4907]: I0226 15:44:18.273688 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lz9wn\" (UniqueName: \"kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-kube-api-access-lz9wn\") pod \"a31745f5-9847-4afe-82a5-3161cc66ca93\" (UID: \"a31745f5-9847-4afe-82a5-3161cc66ca93\") " Feb 26 15:44:18 crc kubenswrapper[4907]: I0226 15:44:18.273714 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-profile-collector-cert\") pod \"b6312bbd-5731-4ea0-a20f-81d5a57df44a\" (UID: \"b6312bbd-5731-4ea0-a20f-81d5a57df44a\") " Feb 26 15:44:18 crc kubenswrapper[4907]: I0226 15:44:18.273738 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-catalog-content\") pod \"57a731c4-ef35-47a8-b875-bfb08a7f8011\" (UID: \"57a731c4-ef35-47a8-b875-bfb08a7f8011\") " Feb 26 15:44:18 crc kubenswrapper[4907]: I0226 15:44:18.273761 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-serving-cert\") pod \"8cea82b4-6893-4ddc-af9f-1bb5ae425c5b\" (UID: \"8cea82b4-6893-4ddc-af9f-1bb5ae425c5b\") " Feb 26 15:44:18 crc kubenswrapper[4907]: I0226 15:44:18.273788 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-client-ca\") pod \"5441d097-087c-4d9a-baa8-b210afa90fc9\" (UID: \"5441d097-087c-4d9a-baa8-b210afa90fc9\") " Feb 26 15:44:18 crc kubenswrapper[4907]: I0226 15:44:18.273810 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume 
started for volume \"node-bootstrap-token\" (UniqueName: \"kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-node-bootstrap-token\") pod \"5fe579f8-e8a6-4643-bce5-a661393c4dde\" (UID: \"5fe579f8-e8a6-4643-bce5-a661393c4dde\") " Feb 26 15:44:18 crc kubenswrapper[4907]: I0226 15:44:18.273838 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-utilities\") pod \"57a731c4-ef35-47a8-b875-bfb08a7f8011\" (UID: \"57a731c4-ef35-47a8-b875-bfb08a7f8011\") " Feb 26 15:44:18 crc kubenswrapper[4907]: I0226 15:44:18.273861 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-client\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") " Feb 26 15:44:18 crc kubenswrapper[4907]: I0226 15:44:18.273889 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mg5zb\" (UniqueName: \"kubernetes.io/projected/6402fda4-df10-493c-b4e5-d0569419652d-kube-api-access-mg5zb\") pod \"6402fda4-df10-493c-b4e5-d0569419652d\" (UID: \"6402fda4-df10-493c-b4e5-d0569419652d\") " Feb 26 15:44:18 crc kubenswrapper[4907]: I0226 15:44:18.273915 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-config\") pod \"5441d097-087c-4d9a-baa8-b210afa90fc9\" (UID: \"5441d097-087c-4d9a-baa8-b210afa90fc9\") " Feb 26 15:44:18 crc kubenswrapper[4907]: I0226 15:44:18.273937 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-trusted-ca\") pod \"9d4552c7-cd75-42dd-8880-30dd377c49a4\" (UID: \"9d4552c7-cd75-42dd-8880-30dd377c49a4\") " Feb 26 15:44:18 crc 
kubenswrapper[4907]: I0226 15:44:18.273969 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-registry-certificates\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 26 15:44:18 crc kubenswrapper[4907]: I0226 15:44:18.273993 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovn-control-plane-metrics-cert\" (UniqueName: \"kubernetes.io/secret/925f1c65-6136-48ba-85aa-3a3b50560753-ovn-control-plane-metrics-cert\") pod \"925f1c65-6136-48ba-85aa-3a3b50560753\" (UID: \"925f1c65-6136-48ba-85aa-3a3b50560753\") " Feb 26 15:44:18 crc kubenswrapper[4907]: I0226 15:44:18.274016 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-nzwt7\" (UniqueName: \"kubernetes.io/projected/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-kube-api-access-nzwt7\") pod \"96b93a3a-6083-4aea-8eab-fe1aa8245ad9\" (UID: \"96b93a3a-6083-4aea-8eab-fe1aa8245ad9\") " Feb 26 15:44:18 crc kubenswrapper[4907]: I0226 15:44:18.274044 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6ccd8\" (UniqueName: \"kubernetes.io/projected/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-kube-api-access-6ccd8\") pod \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\" (UID: \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\") " Feb 26 15:44:18 crc kubenswrapper[4907]: I0226 15:44:18.274080 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"available-featuregates\" (UniqueName: \"kubernetes.io/empty-dir/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-available-featuregates\") pod \"bc5039c0-ea34-426b-a2b7-fbbc87b49a6d\" (UID: \"bc5039c0-ea34-426b-a2b7-fbbc87b49a6d\") " Feb 26 15:44:18 crc kubenswrapper[4907]: I0226 15:44:18.274105 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume 
\"kube-api-access-sb6h7\" (UniqueName: \"kubernetes.io/projected/1bf7eb37-55a3-4c65-b768-a94c82151e69-kube-api-access-sb6h7\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Feb 26 15:44:18 crc kubenswrapper[4907]: I0226 15:44:18.271531 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3ab1a177-2de0-46d9-b765-d0d0649bb42e-package-server-manager-serving-cert" (OuterVolumeSpecName: "package-server-manager-serving-cert") pod "3ab1a177-2de0-46d9-b765-d0d0649bb42e" (UID: "3ab1a177-2de0-46d9-b765-d0d0649bb42e"). InnerVolumeSpecName "package-server-manager-serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 26 15:44:18 crc kubenswrapper[4907]: I0226 15:44:18.271837 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-srv-cert" (OuterVolumeSpecName: "srv-cert") pod "b6312bbd-5731-4ea0-a20f-81d5a57df44a" (UID: "b6312bbd-5731-4ea0-a20f-81d5a57df44a"). InnerVolumeSpecName "srv-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 26 15:44:18 crc kubenswrapper[4907]: I0226 15:44:18.272113 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-bound-sa-token" (OuterVolumeSpecName: "bound-sa-token") pod "a31745f5-9847-4afe-82a5-3161cc66ca93" (UID: "a31745f5-9847-4afe-82a5-3161cc66ca93"). InnerVolumeSpecName "bound-sa-token". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 26 15:44:18 crc kubenswrapper[4907]: I0226 15:44:18.272416 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6509e943-70c6-444c-bc41-48a544e36fbd-kube-api-access-6g6sz" (OuterVolumeSpecName: "kube-api-access-6g6sz") pod "6509e943-70c6-444c-bc41-48a544e36fbd" (UID: "6509e943-70c6-444c-bc41-48a544e36fbd"). InnerVolumeSpecName "kube-api-access-6g6sz". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 26 15:44:18 crc kubenswrapper[4907]: I0226 15:44:18.272440 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/25e176fe-21b4-4974-b1ed-c8b94f112a7f-kube-api-access-d4lsv" (OuterVolumeSpecName: "kube-api-access-d4lsv") pod "25e176fe-21b4-4974-b1ed-c8b94f112a7f" (UID: "25e176fe-21b4-4974-b1ed-c8b94f112a7f"). InnerVolumeSpecName "kube-api-access-d4lsv". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 26 15:44:18 crc kubenswrapper[4907]: I0226 15:44:18.272474 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-service-ca" (OuterVolumeSpecName: "etcd-service-ca") pod "09efc573-dbb6-4249-bd59-9b87aba8dd28" (UID: "09efc573-dbb6-4249-bd59-9b87aba8dd28"). InnerVolumeSpecName "etcd-service-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 26 15:44:18 crc kubenswrapper[4907]: I0226 15:44:18.272743 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/fda69060-fa79-4696-b1a6-7980f124bf7c-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "fda69060-fa79-4696-b1a6-7980f124bf7c" (UID: "fda69060-fa79-4696-b1a6-7980f124bf7c"). InnerVolumeSpecName "proxy-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 26 15:44:18 crc kubenswrapper[4907]: I0226 15:44:18.272803 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/09efc573-dbb6-4249-bd59-9b87aba8dd28-kube-api-access-8tdtz" (OuterVolumeSpecName: "kube-api-access-8tdtz") pod "09efc573-dbb6-4249-bd59-9b87aba8dd28" (UID: "09efc573-dbb6-4249-bd59-9b87aba8dd28"). InnerVolumeSpecName "kube-api-access-8tdtz". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 26 15:44:18 crc kubenswrapper[4907]: I0226 15:44:18.272945 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7539238d-5fe0-46ed-884e-1c3b566537ec-config" (OuterVolumeSpecName: "config") pod "7539238d-5fe0-46ed-884e-1c3b566537ec" (UID: "7539238d-5fe0-46ed-884e-1c3b566537ec"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 26 15:44:18 crc kubenswrapper[4907]: I0226 15:44:18.273092 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bd23aa5c-e532-4e53-bccf-e79f130c5ae8-kube-api-access-jhbk2" (OuterVolumeSpecName: "kube-api-access-jhbk2") pod "bd23aa5c-e532-4e53-bccf-e79f130c5ae8" (UID: "bd23aa5c-e532-4e53-bccf-e79f130c5ae8"). InnerVolumeSpecName "kube-api-access-jhbk2". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 26 15:44:18 crc kubenswrapper[4907]: I0226 15:44:18.279515 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/fda69060-fa79-4696-b1a6-7980f124bf7c-kube-api-access-xcgwh" (OuterVolumeSpecName: "kube-api-access-xcgwh") pod "fda69060-fa79-4696-b1a6-7980f124bf7c" (UID: "fda69060-fa79-4696-b1a6-7980f124bf7c"). InnerVolumeSpecName "kube-api-access-xcgwh". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 26 15:44:18 crc kubenswrapper[4907]: I0226 15:44:18.273255 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-serving-ca" (OuterVolumeSpecName: "etcd-serving-ca") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "etcd-serving-ca". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 26 15:44:18 crc kubenswrapper[4907]: I0226 15:44:18.273683 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/496e6271-fb68-4057-954e-a0d97a4afa3f-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "496e6271-fb68-4057-954e-a0d97a4afa3f" (UID: "496e6271-fb68-4057-954e-a0d97a4afa3f"). InnerVolumeSpecName "kube-api-access". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 26 15:44:18 crc kubenswrapper[4907]: I0226 15:44:18.273700 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-apiservice-cert" (OuterVolumeSpecName: "apiservice-cert") pod "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" (UID: "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b"). InnerVolumeSpecName "apiservice-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 26 15:44:18 crc kubenswrapper[4907]: I0226 15:44:18.273859 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/43509403-f426-496e-be36-56cef71462f5-kube-api-access-qg5z5" (OuterVolumeSpecName: "kube-api-access-qg5z5") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "kube-api-access-qg5z5". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 26 15:44:18 crc kubenswrapper[4907]: I0226 15:44:18.273922 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-trusted-ca-bundle" (OuterVolumeSpecName: "v4-0-config-system-trusted-ca-bundle") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-trusted-ca-bundle". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 26 15:44:18 crc kubenswrapper[4907]: I0226 15:44:18.274225 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-kube-api-access-qs4fp" (OuterVolumeSpecName: "kube-api-access-qs4fp") pod "210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" (UID: "210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c"). InnerVolumeSpecName "kube-api-access-qs4fp". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 26 15:44:18 crc kubenswrapper[4907]: E0226 15:44:18.274309 4907 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-26 15:44:18.774290468 +0000 UTC m=+121.292852377 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 26 15:44:18 crc kubenswrapper[4907]: I0226 15:44:18.274441 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5fe579f8-e8a6-4643-bce5-a661393c4dde-kube-api-access-fcqwp" (OuterVolumeSpecName: "kube-api-access-fcqwp") pod "5fe579f8-e8a6-4643-bce5-a661393c4dde" (UID: "5fe579f8-e8a6-4643-bce5-a661393c4dde"). InnerVolumeSpecName "kube-api-access-fcqwp". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 26 15:44:18 crc kubenswrapper[4907]: I0226 15:44:18.274494 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9d4552c7-cd75-42dd-8880-30dd377c49a4-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "9d4552c7-cd75-42dd-8880-30dd377c49a4" (UID: "9d4552c7-cd75-42dd-8880-30dd377c49a4"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 26 15:44:18 crc kubenswrapper[4907]: I0226 15:44:18.274506 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-bound-sa-token" (OuterVolumeSpecName: "bound-sa-token") pod "bf126b07-da06-4140-9a57-dfd54fc6b486" (UID: "bf126b07-da06-4140-9a57-dfd54fc6b486"). InnerVolumeSpecName "bound-sa-token". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 26 15:44:18 crc kubenswrapper[4907]: I0226 15:44:18.274789 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-kube-api-access-w4xd4" (OuterVolumeSpecName: "kube-api-access-w4xd4") pod "8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" (UID: "8cea82b4-6893-4ddc-af9f-1bb5ae425c5b"). InnerVolumeSpecName "kube-api-access-w4xd4". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 26 15:44:18 crc kubenswrapper[4907]: I0226 15:44:18.274996 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-multus-daemon-config" (OuterVolumeSpecName: "multus-daemon-config") pod "4bb40260-dbaa-4fb0-84df-5e680505d512" (UID: "4bb40260-dbaa-4fb0-84df-5e680505d512"). InnerVolumeSpecName "multus-daemon-config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 26 15:44:18 crc kubenswrapper[4907]: I0226 15:44:18.275142 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-bound-sa-token" (OuterVolumeSpecName: "bound-sa-token") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "bound-sa-token". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 26 15:44:18 crc kubenswrapper[4907]: I0226 15:44:18.275497 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/01ab3dd5-8196-46d0-ad33-122e2ca51def-config" (OuterVolumeSpecName: "config") pod "01ab3dd5-8196-46d0-ad33-122e2ca51def" (UID: "01ab3dd5-8196-46d0-ad33-122e2ca51def"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 26 15:44:18 crc kubenswrapper[4907]: I0226 15:44:18.275638 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-config" (OuterVolumeSpecName: "config") pod "09efc573-dbb6-4249-bd59-9b87aba8dd28" (UID: "09efc573-dbb6-4249-bd59-9b87aba8dd28"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 26 15:44:18 crc kubenswrapper[4907]: I0226 15:44:18.275659 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1386a44e-36a2-460c-96d0-0359d2b6f0f5-config" (OuterVolumeSpecName: "config") pod "1386a44e-36a2-460c-96d0-0359d2b6f0f5" (UID: "1386a44e-36a2-460c-96d0-0359d2b6f0f5"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 26 15:44:18 crc kubenswrapper[4907]: I0226 15:44:18.275784 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/31d8b7a1-420e-4252-a5b7-eebe8a111292-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "31d8b7a1-420e-4252-a5b7-eebe8a111292" (UID: "31d8b7a1-420e-4252-a5b7-eebe8a111292"). InnerVolumeSpecName "proxy-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 26 15:44:18 crc kubenswrapper[4907]: I0226 15:44:18.276137 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-images" (OuterVolumeSpecName: "images") pod "6402fda4-df10-493c-b4e5-d0569419652d" (UID: "6402fda4-df10-493c-b4e5-d0569419652d"). InnerVolumeSpecName "images". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 26 15:44:18 crc kubenswrapper[4907]: I0226 15:44:18.276249 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-metrics-tls" (OuterVolumeSpecName: "metrics-tls") pod "96b93a3a-6083-4aea-8eab-fe1aa8245ad9" (UID: "96b93a3a-6083-4aea-8eab-fe1aa8245ad9"). InnerVolumeSpecName "metrics-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 26 15:44:18 crc kubenswrapper[4907]: I0226 15:44:18.276465 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-encryption-config" (OuterVolumeSpecName: "encryption-config") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "encryption-config". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 26 15:44:18 crc kubenswrapper[4907]: I0226 15:44:18.276497 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7539238d-5fe0-46ed-884e-1c3b566537ec-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "7539238d-5fe0-46ed-884e-1c3b566537ec" (UID: "7539238d-5fe0-46ed-884e-1c3b566537ec"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 26 15:44:18 crc kubenswrapper[4907]: I0226 15:44:18.276559 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-metrics-certs" (OuterVolumeSpecName: "metrics-certs") pod "c03ee662-fb2f-4fc4-a2c1-af487c19d254" (UID: "c03ee662-fb2f-4fc4-a2c1-af487c19d254"). InnerVolumeSpecName "metrics-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 26 15:44:18 crc kubenswrapper[4907]: I0226 15:44:18.276782 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6402fda4-df10-493c-b4e5-d0569419652d-machine-api-operator-tls" (OuterVolumeSpecName: "machine-api-operator-tls") pod "6402fda4-df10-493c-b4e5-d0569419652d" (UID: "6402fda4-df10-493c-b4e5-d0569419652d"). InnerVolumeSpecName "machine-api-operator-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 26 15:44:18 crc kubenswrapper[4907]: I0226 15:44:18.276706 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "6509e943-70c6-444c-bc41-48a544e36fbd" (UID: "6509e943-70c6-444c-bc41-48a544e36fbd"). InnerVolumeSpecName "trusted-ca-bundle". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 26 15:44:18 crc kubenswrapper[4907]: I0226 15:44:18.276795 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e7e6199b-1264-4501-8953-767f51328d08-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "e7e6199b-1264-4501-8953-767f51328d08" (UID: "e7e6199b-1264-4501-8953-767f51328d08"). InnerVolumeSpecName "kube-api-access". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 26 15:44:18 crc kubenswrapper[4907]: I0226 15:44:18.276927 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" (UID: "210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 26 15:44:18 crc kubenswrapper[4907]: I0226 15:44:18.276919 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e7e6199b-1264-4501-8953-767f51328d08-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "e7e6199b-1264-4501-8953-767f51328d08" (UID: "e7e6199b-1264-4501-8953-767f51328d08"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 26 15:44:18 crc kubenswrapper[4907]: I0226 15:44:18.276986 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0b574797-001e-440a-8f4e-c0be86edad0f-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "0b574797-001e-440a-8f4e-c0be86edad0f" (UID: "0b574797-001e-440a-8f4e-c0be86edad0f"). InnerVolumeSpecName "proxy-tls". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 26 15:44:18 crc kubenswrapper[4907]: I0226 15:44:18.277030 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-error" (OuterVolumeSpecName: "v4-0-config-user-template-error") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-user-template-error". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 26 15:44:18 crc kubenswrapper[4907]: I0226 15:44:18.277014 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-script-lib" (OuterVolumeSpecName: "ovnkube-script-lib") pod "6ea678ab-3438-413e-bfe3-290ae7725660" (UID: "6ea678ab-3438-413e-bfe3-290ae7725660"). InnerVolumeSpecName "ovnkube-script-lib". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 26 15:44:18 crc kubenswrapper[4907]: I0226 15:44:18.277403 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8f668bae-612b-4b75-9490-919e737c6a3b-installation-pull-secrets" (OuterVolumeSpecName: "installation-pull-secrets") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "installation-pull-secrets". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 26 15:44:18 crc kubenswrapper[4907]: I0226 15:44:18.279889 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1386a44e-36a2-460c-96d0-0359d2b6f0f5-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "1386a44e-36a2-460c-96d0-0359d2b6f0f5" (UID: "1386a44e-36a2-460c-96d0-0359d2b6f0f5"). InnerVolumeSpecName "serving-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 26 15:44:18 crc kubenswrapper[4907]: I0226 15:44:18.277397 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-console-config" (OuterVolumeSpecName: "console-config") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "console-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 26 15:44:18 crc kubenswrapper[4907]: I0226 15:44:18.277513 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-cni-binary-copy" (OuterVolumeSpecName: "cni-binary-copy") pod "4bb40260-dbaa-4fb0-84df-5e680505d512" (UID: "4bb40260-dbaa-4fb0-84df-5e680505d512"). InnerVolumeSpecName "cni-binary-copy". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 26 15:44:18 crc kubenswrapper[4907]: I0226 15:44:18.277555 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/bf126b07-da06-4140-9a57-dfd54fc6b486-trusted-ca" (OuterVolumeSpecName: "trusted-ca") pod "bf126b07-da06-4140-9a57-dfd54fc6b486" (UID: "bf126b07-da06-4140-9a57-dfd54fc6b486"). InnerVolumeSpecName "trusted-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 26 15:44:18 crc kubenswrapper[4907]: I0226 15:44:18.277766 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-router-certs" (OuterVolumeSpecName: "v4-0-config-system-router-certs") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-router-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 26 15:44:18 crc kubenswrapper[4907]: I0226 15:44:18.277852 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5441d097-087c-4d9a-baa8-b210afa90fc9-kube-api-access-2d4wz" (OuterVolumeSpecName: "kube-api-access-2d4wz") pod "5441d097-087c-4d9a-baa8-b210afa90fc9" (UID: "5441d097-087c-4d9a-baa8-b210afa90fc9"). InnerVolumeSpecName "kube-api-access-2d4wz". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 26 15:44:18 crc kubenswrapper[4907]: I0226 15:44:18.277897 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0b78653f-4ff9-4508-8672-245ed9b561e3-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "0b78653f-4ff9-4508-8672-245ed9b561e3" (UID: "0b78653f-4ff9-4508-8672-245ed9b561e3"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 26 15:44:18 crc kubenswrapper[4907]: I0226 15:44:18.278046 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1386a44e-36a2-460c-96d0-0359d2b6f0f5-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "1386a44e-36a2-460c-96d0-0359d2b6f0f5" (UID: "1386a44e-36a2-460c-96d0-0359d2b6f0f5"). InnerVolumeSpecName "kube-api-access". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 26 15:44:18 crc kubenswrapper[4907]: I0226 15:44:18.278110 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-kube-api-access-rnphk" (OuterVolumeSpecName: "kube-api-access-rnphk") pod "bf126b07-da06-4140-9a57-dfd54fc6b486" (UID: "bf126b07-da06-4140-9a57-dfd54fc6b486"). InnerVolumeSpecName "kube-api-access-rnphk". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 26 15:44:18 crc kubenswrapper[4907]: I0226 15:44:18.278199 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-session" (OuterVolumeSpecName: "v4-0-config-system-session") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-session". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 26 15:44:18 crc kubenswrapper[4907]: I0226 15:44:18.280003 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-binary-copy" (OuterVolumeSpecName: "cni-binary-copy") pod "7bb08738-c794-4ee8-9972-3a62ca171029" (UID: "7bb08738-c794-4ee8-9972-3a62ca171029"). InnerVolumeSpecName "cni-binary-copy". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 26 15:44:18 crc kubenswrapper[4907]: I0226 15:44:18.278292 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-config" (OuterVolumeSpecName: "config") pod "22c825df-677d-4ca6-82db-3454ed06e783" (UID: "22c825df-677d-4ca6-82db-3454ed06e783"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 26 15:44:18 crc kubenswrapper[4907]: I0226 15:44:18.278672 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-operator-metrics" (OuterVolumeSpecName: "marketplace-operator-metrics") pod "b6cd30de-2eeb-49a2-ab40-9167f4560ff5" (UID: "b6cd30de-2eeb-49a2-ab40-9167f4560ff5"). InnerVolumeSpecName "marketplace-operator-metrics". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 26 15:44:18 crc kubenswrapper[4907]: I0226 15:44:18.278682 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-image-import-ca" (OuterVolumeSpecName: "image-import-ca") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "image-import-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 26 15:44:18 crc kubenswrapper[4907]: I0226 15:44:18.278700 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c03ee662-fb2f-4fc4-a2c1-af487c19d254-service-ca-bundle" (OuterVolumeSpecName: "service-ca-bundle") pod "c03ee662-fb2f-4fc4-a2c1-af487c19d254" (UID: "c03ee662-fb2f-4fc4-a2c1-af487c19d254"). InnerVolumeSpecName "service-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 26 15:44:18 crc kubenswrapper[4907]: I0226 15:44:18.278828 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-stats-auth" (OuterVolumeSpecName: "stats-auth") pod "c03ee662-fb2f-4fc4-a2c1-af487c19d254" (UID: "c03ee662-fb2f-4fc4-a2c1-af487c19d254"). InnerVolumeSpecName "stats-auth". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 26 15:44:18 crc kubenswrapper[4907]: I0226 15:44:18.278996 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-node-bootstrap-token" (OuterVolumeSpecName: "node-bootstrap-token") pod "5fe579f8-e8a6-4643-bce5-a661393c4dde" (UID: "5fe579f8-e8a6-4643-bce5-a661393c4dde"). InnerVolumeSpecName "node-bootstrap-token". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 26 15:44:18 crc kubenswrapper[4907]: I0226 15:44:18.279233 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/496e6271-fb68-4057-954e-a0d97a4afa3f-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "496e6271-fb68-4057-954e-a0d97a4afa3f" (UID: "496e6271-fb68-4057-954e-a0d97a4afa3f"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 26 15:44:18 crc kubenswrapper[4907]: I0226 15:44:18.280239 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d-kube-api-access-x2m85" (OuterVolumeSpecName: "kube-api-access-x2m85") pod "cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d" (UID: "cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d"). InnerVolumeSpecName "kube-api-access-x2m85". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 26 15:44:18 crc kubenswrapper[4907]: I0226 15:44:18.280247 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-provider-selection" (OuterVolumeSpecName: "v4-0-config-user-template-provider-selection") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-user-template-provider-selection". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 26 15:44:18 crc kubenswrapper[4907]: I0226 15:44:18.280414 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "trusted-ca-bundle". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 26 15:44:18 crc kubenswrapper[4907]: I0226 15:44:18.280466 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-trusted-ca" (OuterVolumeSpecName: "marketplace-trusted-ca") pod "b6cd30de-2eeb-49a2-ab40-9167f4560ff5" (UID: "b6cd30de-2eeb-49a2-ab40-9167f4560ff5"). InnerVolumeSpecName "marketplace-trusted-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 26 15:44:18 crc kubenswrapper[4907]: I0226 15:44:18.279239 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-serviceca" (OuterVolumeSpecName: "serviceca") pod "3cb93b32-e0ae-4377-b9c8-fdb9842c6d59" (UID: "3cb93b32-e0ae-4377-b9c8-fdb9842c6d59"). InnerVolumeSpecName "serviceca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 26 15:44:18 crc kubenswrapper[4907]: I0226 15:44:18.279392 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-tmpfs" (OuterVolumeSpecName: "tmpfs") pod "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" (UID: "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b"). InnerVolumeSpecName "tmpfs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 26 15:44:18 crc kubenswrapper[4907]: I0226 15:44:18.279505 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-config" (OuterVolumeSpecName: "config") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 26 15:44:18 crc kubenswrapper[4907]: I0226 15:44:18.277228 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-config" (OuterVolumeSpecName: "config") pod "6402fda4-df10-493c-b4e5-d0569419652d" (UID: "6402fda4-df10-493c-b4e5-d0569419652d"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 26 15:44:18 crc kubenswrapper[4907]: I0226 15:44:18.279922 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6ea678ab-3438-413e-bfe3-290ae7725660-kube-api-access-htfz6" (OuterVolumeSpecName: "kube-api-access-htfz6") pod "6ea678ab-3438-413e-bfe3-290ae7725660" (UID: "6ea678ab-3438-413e-bfe3-290ae7725660"). InnerVolumeSpecName "kube-api-access-htfz6". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 26 15:44:18 crc kubenswrapper[4907]: I0226 15:44:18.280549 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6731426b-95fe-49ff-bb5f-40441049fde2-control-plane-machine-set-operator-tls" (OuterVolumeSpecName: "control-plane-machine-set-operator-tls") pod "6731426b-95fe-49ff-bb5f-40441049fde2" (UID: "6731426b-95fe-49ff-bb5f-40441049fde2"). InnerVolumeSpecName "control-plane-machine-set-operator-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 26 15:44:18 crc kubenswrapper[4907]: I0226 15:44:18.280575 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-encryption-config" (OuterVolumeSpecName: "encryption-config") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "encryption-config". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 26 15:44:18 crc kubenswrapper[4907]: I0226 15:44:18.281087 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-kube-api-access-lz9wn" (OuterVolumeSpecName: "kube-api-access-lz9wn") pod "a31745f5-9847-4afe-82a5-3161cc66ca93" (UID: "a31745f5-9847-4afe-82a5-3161cc66ca93"). InnerVolumeSpecName "kube-api-access-lz9wn". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 26 15:44:18 crc kubenswrapper[4907]: I0226 15:44:18.281322 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/01ab3dd5-8196-46d0-ad33-122e2ca51def-kube-api-access-w7l8j" (OuterVolumeSpecName: "kube-api-access-w7l8j") pod "01ab3dd5-8196-46d0-ad33-122e2ca51def" (UID: "01ab3dd5-8196-46d0-ad33-122e2ca51def"). InnerVolumeSpecName "kube-api-access-w7l8j". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 26 15:44:18 crc kubenswrapper[4907]: I0226 15:44:18.281492 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-profile-collector-cert" (OuterVolumeSpecName: "profile-collector-cert") pod "b6312bbd-5731-4ea0-a20f-81d5a57df44a" (UID: "b6312bbd-5731-4ea0-a20f-81d5a57df44a"). InnerVolumeSpecName "profile-collector-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 26 15:44:18 crc kubenswrapper[4907]: I0226 15:44:18.281655 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-env-overrides" (OuterVolumeSpecName: "env-overrides") pod "925f1c65-6136-48ba-85aa-3a3b50560753" (UID: "925f1c65-6136-48ba-85aa-3a3b50560753"). InnerVolumeSpecName "env-overrides". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 26 15:44:18 crc kubenswrapper[4907]: I0226 15:44:18.281801 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1bf7eb37-55a3-4c65-b768-a94c82151e69-kube-api-access-sb6h7" (OuterVolumeSpecName: "kube-api-access-sb6h7") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "kube-api-access-sb6h7". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 26 15:44:18 crc kubenswrapper[4907]: I0226 15:44:18.282222 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-oauth-serving-cert" (OuterVolumeSpecName: "oauth-serving-cert") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "oauth-serving-cert". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 26 15:44:18 crc kubenswrapper[4907]: I0226 15:44:18.282381 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-cfbct\" (UniqueName: \"kubernetes.io/projected/57a731c4-ef35-47a8-b875-bfb08a7f8011-kube-api-access-cfbct\") pod \"57a731c4-ef35-47a8-b875-bfb08a7f8011\" (UID: \"57a731c4-ef35-47a8-b875-bfb08a7f8011\") " Feb 26 15:44:18 crc kubenswrapper[4907]: I0226 15:44:18.282417 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-cliconfig\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Feb 26 15:44:18 crc kubenswrapper[4907]: I0226 15:44:18.282440 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-trusted-ca\") pod 
\"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 26 15:44:18 crc kubenswrapper[4907]: I0226 15:44:18.282486 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-gf66m\" (UniqueName: \"kubernetes.io/projected/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-kube-api-access-gf66m\") pod \"a0128f3a-b052-44ed-a84e-c4c8aaf17c13\" (UID: \"a0128f3a-b052-44ed-a84e-c4c8aaf17c13\") " Feb 26 15:44:18 crc kubenswrapper[4907]: I0226 15:44:18.282511 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jkwtn\" (UniqueName: \"kubernetes.io/projected/5b88f790-22fa-440e-b583-365168c0b23d-kube-api-access-jkwtn\") pod \"5b88f790-22fa-440e-b583-365168c0b23d\" (UID: \"5b88f790-22fa-440e-b583-365168c0b23d\") " Feb 26 15:44:18 crc kubenswrapper[4907]: I0226 15:44:18.282544 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pjr6v\" (UniqueName: \"kubernetes.io/projected/49ef4625-1d3a-4a9f-b595-c2433d32326d-kube-api-access-pjr6v\") pod \"49ef4625-1d3a-4a9f-b595-c2433d32326d\" (UID: \"49ef4625-1d3a-4a9f-b595-c2433d32326d\") " Feb 26 15:44:18 crc kubenswrapper[4907]: I0226 15:44:18.282567 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"audit\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-audit\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Feb 26 15:44:18 crc kubenswrapper[4907]: I0226 15:44:18.282568 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6402fda4-df10-493c-b4e5-d0569419652d-kube-api-access-mg5zb" (OuterVolumeSpecName: "kube-api-access-mg5zb") pod "6402fda4-df10-493c-b4e5-d0569419652d" (UID: "6402fda4-df10-493c-b4e5-d0569419652d"). InnerVolumeSpecName "kube-api-access-mg5zb". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 26 15:44:18 crc kubenswrapper[4907]: I0226 15:44:18.282608 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-service-ca\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Feb 26 15:44:18 crc kubenswrapper[4907]: I0226 15:44:18.282684 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/01ab3dd5-8196-46d0-ad33-122e2ca51def-serving-cert\") pod \"01ab3dd5-8196-46d0-ad33-122e2ca51def\" (UID: \"01ab3dd5-8196-46d0-ad33-122e2ca51def\") " Feb 26 15:44:18 crc kubenswrapper[4907]: I0226 15:44:18.282709 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-bf2bz\" (UniqueName: \"kubernetes.io/projected/1d611f23-29be-4491-8495-bee1670e935f-kube-api-access-bf2bz\") pod \"1d611f23-29be-4491-8495-bee1670e935f\" (UID: \"1d611f23-29be-4491-8495-bee1670e935f\") " Feb 26 15:44:18 crc kubenswrapper[4907]: I0226 15:44:18.282729 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/87cf06ed-a83f-41a7-828d-70653580a8cb-config-volume\") pod \"87cf06ed-a83f-41a7-828d-70653580a8cb\" (UID: \"87cf06ed-a83f-41a7-828d-70653580a8cb\") " Feb 26 15:44:18 crc kubenswrapper[4907]: I0226 15:44:18.282755 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-utilities\") pod \"1d611f23-29be-4491-8495-bee1670e935f\" (UID: \"1d611f23-29be-4491-8495-bee1670e935f\") " Feb 26 15:44:18 crc kubenswrapper[4907]: I0226 15:44:18.282778 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume 
\"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-trusted-ca-bundle\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") " Feb 26 15:44:18 crc kubenswrapper[4907]: I0226 15:44:18.282782 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 26 15:44:18 crc kubenswrapper[4907]: I0226 15:44:18.282799 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-idp-0-file-data\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Feb 26 15:44:18 crc kubenswrapper[4907]: I0226 15:44:18.282780 4907 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-26T15:44:18Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-26T15:44:18Z\\\",\\\"message\\\":\\\"containers with unready 
status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":false,\\\"restartCount\\\":6,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 26 15:44:18 crc kubenswrapper[4907]: I0226 15:44:18.282825 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/6ea678ab-3438-413e-bfe3-290ae7725660-ovn-node-metrics-cert\") pod \"6ea678ab-3438-413e-bfe3-290ae7725660\" (UID: \"6ea678ab-3438-413e-bfe3-290ae7725660\") " Feb 26 15:44:18 crc kubenswrapper[4907]: I0226 15:44:18.282848 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-config\") pod \"7583ce53-e0fe-4a16-9e4d-50516596a136\" (UID: \"7583ce53-e0fe-4a16-9e4d-50516596a136\") " Feb 26 15:44:18 crc kubenswrapper[4907]: I0226 15:44:18.282870 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-audit-policies\") 
pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Feb 26 15:44:18 crc kubenswrapper[4907]: I0226 15:44:18.282891 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-login\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Feb 26 15:44:18 crc kubenswrapper[4907]: I0226 15:44:18.282914 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-249nr\" (UniqueName: \"kubernetes.io/projected/b6312bbd-5731-4ea0-a20f-81d5a57df44a-kube-api-access-249nr\") pod \"b6312bbd-5731-4ea0-a20f-81d5a57df44a\" (UID: \"b6312bbd-5731-4ea0-a20f-81d5a57df44a\") " Feb 26 15:44:18 crc kubenswrapper[4907]: I0226 15:44:18.282937 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-catalog-content\") pod \"5225d0e4-402f-4861-b410-819f433b1803\" (UID: \"5225d0e4-402f-4861-b410-819f433b1803\") " Feb 26 15:44:18 crc kubenswrapper[4907]: I0226 15:44:18.282939 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-auth-proxy-config" (OuterVolumeSpecName: "auth-proxy-config") pod "22c825df-677d-4ca6-82db-3454ed06e783" (UID: "22c825df-677d-4ca6-82db-3454ed06e783"). InnerVolumeSpecName "auth-proxy-config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 26 15:44:18 crc kubenswrapper[4907]: I0226 15:44:18.282997 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/efdd0498-1daa-4136-9a4a-3b948c2293fc-webhook-certs\") pod \"efdd0498-1daa-4136-9a4a-3b948c2293fc\" (UID: \"efdd0498-1daa-4136-9a4a-3b948c2293fc\") " Feb 26 15:44:18 crc kubenswrapper[4907]: I0226 15:44:18.283019 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-catalog-content\") pod \"1d611f23-29be-4491-8495-bee1670e935f\" (UID: \"1d611f23-29be-4491-8495-bee1670e935f\") " Feb 26 15:44:18 crc kubenswrapper[4907]: I0226 15:44:18.283043 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-v47cf\" (UniqueName: \"kubernetes.io/projected/c03ee662-fb2f-4fc4-a2c1-af487c19d254-kube-api-access-v47cf\") pod \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\" (UID: \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\") " Feb 26 15:44:18 crc kubenswrapper[4907]: I0226 15:44:18.283066 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-s4n52\" (UniqueName: \"kubernetes.io/projected/925f1c65-6136-48ba-85aa-3a3b50560753-kube-api-access-s4n52\") pod \"925f1c65-6136-48ba-85aa-3a3b50560753\" (UID: \"925f1c65-6136-48ba-85aa-3a3b50560753\") " Feb 26 15:44:18 crc kubenswrapper[4907]: I0226 15:44:18.283089 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-client\") pod \"09efc573-dbb6-4249-bd59-9b87aba8dd28\" (UID: \"09efc573-dbb6-4249-bd59-9b87aba8dd28\") " Feb 26 15:44:18 crc kubenswrapper[4907]: I0226 15:44:18.283110 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume 
\"srv-cert\" (UniqueName: \"kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-srv-cert\") pod \"f88749ec-7931-4ee7-b3fc-1ec5e11f92e9\" (UID: \"f88749ec-7931-4ee7-b3fc-1ec5e11f92e9\") " Feb 26 15:44:18 crc kubenswrapper[4907]: I0226 15:44:18.283131 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-279lb\" (UniqueName: \"kubernetes.io/projected/7bb08738-c794-4ee8-9972-3a62ca171029-kube-api-access-279lb\") pod \"7bb08738-c794-4ee8-9972-3a62ca171029\" (UID: \"7bb08738-c794-4ee8-9972-3a62ca171029\") " Feb 26 15:44:18 crc kubenswrapper[4907]: I0226 15:44:18.283152 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-config\") pod \"6509e943-70c6-444c-bc41-48a544e36fbd\" (UID: \"6509e943-70c6-444c-bc41-48a544e36fbd\") " Feb 26 15:44:18 crc kubenswrapper[4907]: I0226 15:44:18.283173 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-ca\" (UniqueName: \"kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-ca\") pod \"09efc573-dbb6-4249-bd59-9b87aba8dd28\" (UID: \"09efc573-dbb6-4249-bd59-9b87aba8dd28\") " Feb 26 15:44:18 crc kubenswrapper[4907]: I0226 15:44:18.283194 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-webhook-cert\") pod \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\" (UID: \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\") " Feb 26 15:44:18 crc kubenswrapper[4907]: I0226 15:44:18.283216 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-samples-operator-tls\") pod \"a0128f3a-b052-44ed-a84e-c4c8aaf17c13\" (UID: \"a0128f3a-b052-44ed-a84e-c4c8aaf17c13\") " Feb 26 15:44:18 crc kubenswrapper[4907]: I0226 
15:44:18.283240 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vt5rc\" (UniqueName: \"kubernetes.io/projected/44663579-783b-4372-86d6-acf235a62d72-kube-api-access-vt5rc\") pod \"44663579-783b-4372-86d6-acf235a62d72\" (UID: \"44663579-783b-4372-86d6-acf235a62d72\") " Feb 26 15:44:18 crc kubenswrapper[4907]: I0226 15:44:18.283261 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-key\") pod \"25e176fe-21b4-4974-b1ed-c8b94f112a7f\" (UID: \"25e176fe-21b4-4974-b1ed-c8b94f112a7f\") " Feb 26 15:44:18 crc kubenswrapper[4907]: I0226 15:44:18.283283 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-serving-cert\") pod \"09efc573-dbb6-4249-bd59-9b87aba8dd28\" (UID: \"09efc573-dbb6-4249-bd59-9b87aba8dd28\") " Feb 26 15:44:18 crc kubenswrapper[4907]: I0226 15:44:18.283309 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ngvvp\" (UniqueName: \"kubernetes.io/projected/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-kube-api-access-ngvvp\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Feb 26 15:44:18 crc kubenswrapper[4907]: I0226 15:44:18.283333 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/5441d097-087c-4d9a-baa8-b210afa90fc9-serving-cert\") pod \"5441d097-087c-4d9a-baa8-b210afa90fc9\" (UID: \"5441d097-087c-4d9a-baa8-b210afa90fc9\") " Feb 26 15:44:18 crc kubenswrapper[4907]: I0226 15:44:18.283379 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/20b0d48f-5fd6-431c-a545-e3c800c7b866-cert\") pod 
\"20b0d48f-5fd6-431c-a545-e3c800c7b866\" (UID: \"20b0d48f-5fd6-431c-a545-e3c800c7b866\") " Feb 26 15:44:18 crc kubenswrapper[4907]: I0226 15:44:18.284513 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/6509e943-70c6-444c-bc41-48a544e36fbd-serving-cert\") pod \"6509e943-70c6-444c-bc41-48a544e36fbd\" (UID: \"6509e943-70c6-444c-bc41-48a544e36fbd\") " Feb 26 15:44:18 crc kubenswrapper[4907]: I0226 15:44:18.290102 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"mcd-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/fda69060-fa79-4696-b1a6-7980f124bf7c-mcd-auth-proxy-config\") pod \"fda69060-fa79-4696-b1a6-7980f124bf7c\" (UID: \"fda69060-fa79-4696-b1a6-7980f124bf7c\") " Feb 26 15:44:18 crc kubenswrapper[4907]: I0226 15:44:18.290513 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-kfwg7\" (UniqueName: \"kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-kube-api-access-kfwg7\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 26 15:44:18 crc kubenswrapper[4907]: I0226 15:44:18.290964 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/0b78653f-4ff9-4508-8672-245ed9b561e3-kube-api-access\") pod \"0b78653f-4ff9-4508-8672-245ed9b561e3\" (UID: \"0b78653f-4ff9-4508-8672-245ed9b561e3\") " Feb 26 15:44:18 crc kubenswrapper[4907]: I0226 15:44:18.291198 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-images\") pod \"31d8b7a1-420e-4252-a5b7-eebe8a111292\" (UID: \"31d8b7a1-420e-4252-a5b7-eebe8a111292\") " Feb 26 15:44:18 crc kubenswrapper[4907]: I0226 15:44:18.291295 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume 
started for volume \"kube-api-access-fqsjt\" (UniqueName: \"kubernetes.io/projected/efdd0498-1daa-4136-9a4a-3b948c2293fc-kube-api-access-fqsjt\") pod \"efdd0498-1daa-4136-9a4a-3b948c2293fc\" (UID: \"efdd0498-1daa-4136-9a4a-3b948c2293fc\") " Feb 26 15:44:18 crc kubenswrapper[4907]: I0226 15:44:18.291647 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7583ce53-e0fe-4a16-9e4d-50516596a136-serving-cert\") pod \"7583ce53-e0fe-4a16-9e4d-50516596a136\" (UID: \"7583ce53-e0fe-4a16-9e4d-50516596a136\") " Feb 26 15:44:18 crc kubenswrapper[4907]: I0226 15:44:18.283257 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" (UID: "bc5039c0-ea34-426b-a2b7-fbbc87b49a6d"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 26 15:44:18 crc kubenswrapper[4907]: I0226 15:44:18.292396 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1d611f23-29be-4491-8495-bee1670e935f-kube-api-access-bf2bz" (OuterVolumeSpecName: "kube-api-access-bf2bz") pod "1d611f23-29be-4491-8495-bee1670e935f" (UID: "1d611f23-29be-4491-8495-bee1670e935f"). InnerVolumeSpecName "kube-api-access-bf2bz". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 26 15:44:18 crc kubenswrapper[4907]: I0226 15:44:18.283519 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/22c825df-677d-4ca6-82db-3454ed06e783-kube-api-access-7c4vf" (OuterVolumeSpecName: "kube-api-access-7c4vf") pod "22c825df-677d-4ca6-82db-3454ed06e783" (UID: "22c825df-677d-4ca6-82db-3454ed06e783"). InnerVolumeSpecName "kube-api-access-7c4vf". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 26 15:44:18 crc kubenswrapper[4907]: I0226 15:44:18.283718 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/20b0d48f-5fd6-431c-a545-e3c800c7b866-cert" (OuterVolumeSpecName: "cert") pod "20b0d48f-5fd6-431c-a545-e3c800c7b866" (UID: "20b0d48f-5fd6-431c-a545-e3c800c7b866"). InnerVolumeSpecName "cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 26 15:44:18 crc kubenswrapper[4907]: I0226 15:44:18.283739 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6731426b-95fe-49ff-bb5f-40441049fde2-kube-api-access-x7zkh" (OuterVolumeSpecName: "kube-api-access-x7zkh") pod "6731426b-95fe-49ff-bb5f-40441049fde2" (UID: "6731426b-95fe-49ff-bb5f-40441049fde2"). InnerVolumeSpecName "kube-api-access-x7zkh". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 26 15:44:18 crc kubenswrapper[4907]: I0226 15:44:18.292524 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/44663579-783b-4372-86d6-acf235a62d72-kube-api-access-vt5rc" (OuterVolumeSpecName: "kube-api-access-vt5rc") pod "44663579-783b-4372-86d6-acf235a62d72" (UID: "44663579-783b-4372-86d6-acf235a62d72"). InnerVolumeSpecName "kube-api-access-vt5rc". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 26 15:44:18 crc kubenswrapper[4907]: I0226 15:44:18.284088 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/57a731c4-ef35-47a8-b875-bfb08a7f8011-kube-api-access-cfbct" (OuterVolumeSpecName: "kube-api-access-cfbct") pod "57a731c4-ef35-47a8-b875-bfb08a7f8011" (UID: "57a731c4-ef35-47a8-b875-bfb08a7f8011"). InnerVolumeSpecName "kube-api-access-cfbct". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 26 15:44:18 crc kubenswrapper[4907]: I0226 15:44:18.284267 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b6312bbd-5731-4ea0-a20f-81d5a57df44a-kube-api-access-249nr" (OuterVolumeSpecName: "kube-api-access-249nr") pod "b6312bbd-5731-4ea0-a20f-81d5a57df44a" (UID: "b6312bbd-5731-4ea0-a20f-81d5a57df44a"). InnerVolumeSpecName "kube-api-access-249nr". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 26 15:44:18 crc kubenswrapper[4907]: I0226 15:44:18.284619 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7583ce53-e0fe-4a16-9e4d-50516596a136-kube-api-access-xcphl" (OuterVolumeSpecName: "kube-api-access-xcphl") pod "7583ce53-e0fe-4a16-9e4d-50516596a136" (UID: "7583ce53-e0fe-4a16-9e4d-50516596a136"). InnerVolumeSpecName "kube-api-access-xcphl". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 26 15:44:18 crc kubenswrapper[4907]: I0226 15:44:18.285163 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-config" (OuterVolumeSpecName: "config") pod "210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" (UID: "210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 26 15:44:18 crc kubenswrapper[4907]: I0226 15:44:18.285573 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7539238d-5fe0-46ed-884e-1c3b566537ec-kube-api-access-tk88c" (OuterVolumeSpecName: "kube-api-access-tk88c") pod "7539238d-5fe0-46ed-884e-1c3b566537ec" (UID: "7539238d-5fe0-46ed-884e-1c3b566537ec"). InnerVolumeSpecName "kube-api-access-tk88c". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 26 15:44:18 crc kubenswrapper[4907]: I0226 15:44:18.285627 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-client" (OuterVolumeSpecName: "etcd-client") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "etcd-client". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 26 15:44:18 crc kubenswrapper[4907]: I0226 15:44:18.286339 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a31745f5-9847-4afe-82a5-3161cc66ca93-metrics-tls" (OuterVolumeSpecName: "metrics-tls") pod "a31745f5-9847-4afe-82a5-3161cc66ca93" (UID: "a31745f5-9847-4afe-82a5-3161cc66ca93"). InnerVolumeSpecName "metrics-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 26 15:44:18 crc kubenswrapper[4907]: I0226 15:44:18.286461 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-kube-api-access-mnrrd" (OuterVolumeSpecName: "kube-api-access-mnrrd") pod "bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" (UID: "bc5039c0-ea34-426b-a2b7-fbbc87b49a6d"). InnerVolumeSpecName "kube-api-access-mnrrd". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 26 15:44:18 crc kubenswrapper[4907]: I0226 15:44:18.286534 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5441d097-087c-4d9a-baa8-b210afa90fc9-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "5441d097-087c-4d9a-baa8-b210afa90fc9" (UID: "5441d097-087c-4d9a-baa8-b210afa90fc9"). InnerVolumeSpecName "serving-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 26 15:44:18 crc kubenswrapper[4907]: I0226 15:44:18.286872 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-auth-proxy-config" (OuterVolumeSpecName: "auth-proxy-config") pod "31d8b7a1-420e-4252-a5b7-eebe8a111292" (UID: "31d8b7a1-420e-4252-a5b7-eebe8a111292"). InnerVolumeSpecName "auth-proxy-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 26 15:44:18 crc kubenswrapper[4907]: I0226 15:44:18.286888 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-profile-collector-cert" (OuterVolumeSpecName: "profile-collector-cert") pod "f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" (UID: "f88749ec-7931-4ee7-b3fc-1ec5e11f92e9"). InnerVolumeSpecName "profile-collector-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 26 15:44:18 crc kubenswrapper[4907]: I0226 15:44:18.286929 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0b574797-001e-440a-8f4e-c0be86edad0f-mcc-auth-proxy-config" (OuterVolumeSpecName: "mcc-auth-proxy-config") pod "0b574797-001e-440a-8f4e-c0be86edad0f" (UID: "0b574797-001e-440a-8f4e-c0be86edad0f"). InnerVolumeSpecName "mcc-auth-proxy-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 26 15:44:18 crc kubenswrapper[4907]: I0226 15:44:18.287201 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-client-ca" (OuterVolumeSpecName: "client-ca") pod "5441d097-087c-4d9a-baa8-b210afa90fc9" (UID: "5441d097-087c-4d9a-baa8-b210afa90fc9"). InnerVolumeSpecName "client-ca". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 26 15:44:18 crc kubenswrapper[4907]: I0226 15:44:18.287846 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-audit-policies" (OuterVolumeSpecName: "audit-policies") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "audit-policies". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 26 15:44:18 crc kubenswrapper[4907]: I0226 15:44:18.292732 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-key" (OuterVolumeSpecName: "signing-key") pod "25e176fe-21b4-4974-b1ed-c8b94f112a7f" (UID: "25e176fe-21b4-4974-b1ed-c8b94f112a7f"). InnerVolumeSpecName "signing-key". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 26 15:44:18 crc kubenswrapper[4907]: I0226 15:44:18.288400 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" (UID: "8cea82b4-6893-4ddc-af9f-1bb5ae425c5b"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 26 15:44:18 crc kubenswrapper[4907]: I0226 15:44:18.288431 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c03ee662-fb2f-4fc4-a2c1-af487c19d254-kube-api-access-v47cf" (OuterVolumeSpecName: "kube-api-access-v47cf") pod "c03ee662-fb2f-4fc4-a2c1-af487c19d254" (UID: "c03ee662-fb2f-4fc4-a2c1-af487c19d254"). InnerVolumeSpecName "kube-api-access-v47cf". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 26 15:44:18 crc kubenswrapper[4907]: I0226 15:44:18.288510 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 26 15:44:18 crc kubenswrapper[4907]: I0226 15:44:18.288937 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/efdd0498-1daa-4136-9a4a-3b948c2293fc-webhook-certs" (OuterVolumeSpecName: "webhook-certs") pod "efdd0498-1daa-4136-9a4a-3b948c2293fc" (UID: "efdd0498-1daa-4136-9a4a-3b948c2293fc"). InnerVolumeSpecName "webhook-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 26 15:44:18 crc kubenswrapper[4907]: I0226 15:44:18.292784 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "09efc573-dbb6-4249-bd59-9b87aba8dd28" (UID: "09efc573-dbb6-4249-bd59-9b87aba8dd28"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 26 15:44:18 crc kubenswrapper[4907]: I0226 15:44:18.289155 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5225d0e4-402f-4861-b410-819f433b1803-kube-api-access-9xfj7" (OuterVolumeSpecName: "kube-api-access-9xfj7") pod "5225d0e4-402f-4861-b410-819f433b1803" (UID: "5225d0e4-402f-4861-b410-819f433b1803"). InnerVolumeSpecName "kube-api-access-9xfj7". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 26 15:44:18 crc kubenswrapper[4907]: I0226 15:44:18.289482 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-available-featuregates" (OuterVolumeSpecName: "available-featuregates") pod "bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" (UID: "bc5039c0-ea34-426b-a2b7-fbbc87b49a6d"). InnerVolumeSpecName "available-featuregates". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 26 15:44:18 crc kubenswrapper[4907]: I0226 15:44:18.290465 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/fda69060-fa79-4696-b1a6-7980f124bf7c-mcd-auth-proxy-config" (OuterVolumeSpecName: "mcd-auth-proxy-config") pod "fda69060-fa79-4696-b1a6-7980f124bf7c" (UID: "fda69060-fa79-4696-b1a6-7980f124bf7c"). InnerVolumeSpecName "mcd-auth-proxy-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 26 15:44:18 crc kubenswrapper[4907]: I0226 15:44:18.290583 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7bb08738-c794-4ee8-9972-3a62ca171029-kube-api-access-279lb" (OuterVolumeSpecName: "kube-api-access-279lb") pod "7bb08738-c794-4ee8-9972-3a62ca171029" (UID: "7bb08738-c794-4ee8-9972-3a62ca171029"). InnerVolumeSpecName "kube-api-access-279lb". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 26 15:44:18 crc kubenswrapper[4907]: I0226 15:44:18.290979 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-config" (OuterVolumeSpecName: "config") pod "6509e943-70c6-444c-bc41-48a544e36fbd" (UID: "6509e943-70c6-444c-bc41-48a544e36fbd"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 26 15:44:18 crc kubenswrapper[4907]: I0226 15:44:18.291241 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0b78653f-4ff9-4508-8672-245ed9b561e3-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "0b78653f-4ff9-4508-8672-245ed9b561e3" (UID: "0b78653f-4ff9-4508-8672-245ed9b561e3"). InnerVolumeSpecName "kube-api-access". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 26 15:44:18 crc kubenswrapper[4907]: I0226 15:44:18.292879 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-config\") pod \"9d4552c7-cd75-42dd-8880-30dd377c49a4\" (UID: \"9d4552c7-cd75-42dd-8880-30dd377c49a4\") " Feb 26 15:44:18 crc kubenswrapper[4907]: I0226 15:44:18.291335 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5b88f790-22fa-440e-b583-365168c0b23d-kube-api-access-jkwtn" (OuterVolumeSpecName: "kube-api-access-jkwtn") pod "5b88f790-22fa-440e-b583-365168c0b23d" (UID: "5b88f790-22fa-440e-b583-365168c0b23d"). InnerVolumeSpecName "kube-api-access-jkwtn". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 26 15:44:18 crc kubenswrapper[4907]: I0226 15:44:18.291422 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/01ab3dd5-8196-46d0-ad33-122e2ca51def-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "01ab3dd5-8196-46d0-ad33-122e2ca51def" (UID: "01ab3dd5-8196-46d0-ad33-122e2ca51def"). InnerVolumeSpecName "serving-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 26 15:44:18 crc kubenswrapper[4907]: I0226 15:44:18.291582 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/efdd0498-1daa-4136-9a4a-3b948c2293fc-kube-api-access-fqsjt" (OuterVolumeSpecName: "kube-api-access-fqsjt") pod "efdd0498-1daa-4136-9a4a-3b948c2293fc" (UID: "efdd0498-1daa-4136-9a4a-3b948c2293fc"). InnerVolumeSpecName "kube-api-access-fqsjt". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 26 15:44:18 crc kubenswrapper[4907]: I0226 15:44:18.291619 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-images" (OuterVolumeSpecName: "images") pod "31d8b7a1-420e-4252-a5b7-eebe8a111292" (UID: "31d8b7a1-420e-4252-a5b7-eebe8a111292"). InnerVolumeSpecName "images". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 26 15:44:18 crc kubenswrapper[4907]: I0226 15:44:18.291861 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-ca" (OuterVolumeSpecName: "etcd-ca") pod "09efc573-dbb6-4249-bd59-9b87aba8dd28" (UID: "09efc573-dbb6-4249-bd59-9b87aba8dd28"). InnerVolumeSpecName "etcd-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 26 15:44:18 crc kubenswrapper[4907]: I0226 15:44:18.291977 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-kube-api-access-nzwt7" (OuterVolumeSpecName: "kube-api-access-nzwt7") pod "96b93a3a-6083-4aea-8eab-fe1aa8245ad9" (UID: "96b93a3a-6083-4aea-8eab-fe1aa8245ad9"). InnerVolumeSpecName "kube-api-access-nzwt7". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 26 15:44:18 crc kubenswrapper[4907]: I0226 15:44:18.292098 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-webhook-cert" (OuterVolumeSpecName: "webhook-cert") pod "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" (UID: "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b"). InnerVolumeSpecName "webhook-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 26 15:44:18 crc kubenswrapper[4907]: I0226 15:44:18.291943 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-client" (OuterVolumeSpecName: "etcd-client") pod "09efc573-dbb6-4249-bd59-9b87aba8dd28" (UID: "09efc573-dbb6-4249-bd59-9b87aba8dd28"). InnerVolumeSpecName "etcd-client". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 26 15:44:18 crc kubenswrapper[4907]: I0226 15:44:18.292370 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-utilities" (OuterVolumeSpecName: "utilities") pod "1d611f23-29be-4491-8495-bee1670e935f" (UID: "1d611f23-29be-4491-8495-bee1670e935f"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 26 15:44:18 crc kubenswrapper[4907]: I0226 15:44:18.283781 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-service-ca" (OuterVolumeSpecName: "service-ca") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "service-ca". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 26 15:44:18 crc kubenswrapper[4907]: I0226 15:44:18.293047 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-config\") pod \"6ea678ab-3438-413e-bfe3-290ae7725660\" (UID: \"6ea678ab-3438-413e-bfe3-290ae7725660\") " Feb 26 15:44:18 crc kubenswrapper[4907]: I0226 15:44:18.293208 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a31745f5-9847-4afe-82a5-3161cc66ca93-trusted-ca" (OuterVolumeSpecName: "trusted-ca") pod "a31745f5-9847-4afe-82a5-3161cc66ca93" (UID: "a31745f5-9847-4afe-82a5-3161cc66ca93"). InnerVolumeSpecName "trusted-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 26 15:44:18 crc kubenswrapper[4907]: I0226 15:44:18.293295 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e7e6199b-1264-4501-8953-767f51328d08-config\") pod \"e7e6199b-1264-4501-8953-767f51328d08\" (UID: \"e7e6199b-1264-4501-8953-767f51328d08\") " Feb 26 15:44:18 crc kubenswrapper[4907]: I0226 15:44:18.293660 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-kube-api-access-6ccd8" (OuterVolumeSpecName: "kube-api-access-6ccd8") pod "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" (UID: "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b"). InnerVolumeSpecName "kube-api-access-6ccd8". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 26 15:44:18 crc kubenswrapper[4907]: I0226 15:44:18.293507 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-cliconfig" (OuterVolumeSpecName: "v4-0-config-system-cliconfig") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-cliconfig". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 26 15:44:18 crc kubenswrapper[4907]: I0226 15:44:18.293802 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-client-ca\") pod \"7583ce53-e0fe-4a16-9e4d-50516596a136\" (UID: \"7583ce53-e0fe-4a16-9e4d-50516596a136\") " Feb 26 15:44:18 crc kubenswrapper[4907]: I0226 15:44:18.294095 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-trusted-ca" (OuterVolumeSpecName: "trusted-ca") pod "9d4552c7-cd75-42dd-8880-30dd377c49a4" (UID: "9d4552c7-cd75-42dd-8880-30dd377c49a4"). InnerVolumeSpecName "trusted-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 26 15:44:18 crc kubenswrapper[4907]: I0226 15:44:18.294122 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6509e943-70c6-444c-bc41-48a544e36fbd-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "6509e943-70c6-444c-bc41-48a544e36fbd" (UID: "6509e943-70c6-444c-bc41-48a544e36fbd"). InnerVolumeSpecName "serving-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 26 15:44:18 crc kubenswrapper[4907]: I0226 15:44:18.294209 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-env-overrides\") pod \"6ea678ab-3438-413e-bfe3-290ae7725660\" (UID: \"6ea678ab-3438-413e-bfe3-290ae7725660\") " Feb 26 15:44:18 crc kubenswrapper[4907]: I0226 15:44:18.294259 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-client-ca" (OuterVolumeSpecName: "client-ca") pod "7583ce53-e0fe-4a16-9e4d-50516596a136" (UID: "7583ce53-e0fe-4a16-9e4d-50516596a136"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 26 15:44:18 crc kubenswrapper[4907]: I0226 15:44:18.294320 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"machine-approver-tls\" (UniqueName: \"kubernetes.io/secret/22c825df-677d-4ca6-82db-3454ed06e783-machine-approver-tls\") pod \"22c825df-677d-4ca6-82db-3454ed06e783\" (UID: \"22c825df-677d-4ca6-82db-3454ed06e783\") " Feb 26 15:44:18 crc kubenswrapper[4907]: I0226 15:44:18.294833 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-config" (OuterVolumeSpecName: "config") pod "7583ce53-e0fe-4a16-9e4d-50516596a136" (UID: "7583ce53-e0fe-4a16-9e4d-50516596a136"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 26 15:44:18 crc kubenswrapper[4907]: I0226 15:44:18.294874 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/22c825df-677d-4ca6-82db-3454ed06e783-machine-approver-tls" (OuterVolumeSpecName: "machine-approver-tls") pod "22c825df-677d-4ca6-82db-3454ed06e783" (UID: "22c825df-677d-4ca6-82db-3454ed06e783"). 
InnerVolumeSpecName "machine-approver-tls". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 26 15:44:18 crc kubenswrapper[4907]: I0226 15:44:18.295111 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/20b0d48f-5fd6-431c-a545-e3c800c7b866-kube-api-access-w9rds" (OuterVolumeSpecName: "kube-api-access-w9rds") pod "20b0d48f-5fd6-431c-a545-e3c800c7b866" (UID: "20b0d48f-5fd6-431c-a545-e3c800c7b866"). InnerVolumeSpecName "kube-api-access-w9rds". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 26 15:44:18 crc kubenswrapper[4907]: I0226 15:44:18.295358 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-samples-operator-tls" (OuterVolumeSpecName: "samples-operator-tls") pod "a0128f3a-b052-44ed-a84e-c4c8aaf17c13" (UID: "a0128f3a-b052-44ed-a84e-c4c8aaf17c13"). InnerVolumeSpecName "samples-operator-tls". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 26 15:44:18 crc kubenswrapper[4907]: I0226 15:44:18.295541 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-kube-api-access-gf66m" (OuterVolumeSpecName: "kube-api-access-gf66m") pod "a0128f3a-b052-44ed-a84e-c4c8aaf17c13" (UID: "a0128f3a-b052-44ed-a84e-c4c8aaf17c13"). InnerVolumeSpecName "kube-api-access-gf66m". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 26 15:44:18 crc kubenswrapper[4907]: I0226 15:44:18.294889 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-w9rds\" (UniqueName: \"kubernetes.io/projected/20b0d48f-5fd6-431c-a545-e3c800c7b866-kube-api-access-w9rds\") pod \"20b0d48f-5fd6-431c-a545-e3c800c7b866\" (UID: \"20b0d48f-5fd6-431c-a545-e3c800c7b866\") "
Feb 26 15:44:18 crc kubenswrapper[4907]: I0226 15:44:18.295858 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-service-ca-bundle\") pod \"6509e943-70c6-444c-bc41-48a544e36fbd\" (UID: \"6509e943-70c6-444c-bc41-48a544e36fbd\") "
Feb 26 15:44:18 crc kubenswrapper[4907]: I0226 15:44:18.295985 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-serving-ca\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") "
Feb 26 15:44:18 crc kubenswrapper[4907]: I0226 15:44:18.296320 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2w9zh\" (UniqueName: \"kubernetes.io/projected/4bb40260-dbaa-4fb0-84df-5e680505d512-kube-api-access-2w9zh\") pod \"4bb40260-dbaa-4fb0-84df-5e680505d512\" (UID: \"4bb40260-dbaa-4fb0-84df-5e680505d512\") "
Feb 26 15:44:18 crc kubenswrapper[4907]: I0226 15:44:18.296660 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-x4zgh\" (UniqueName: \"kubernetes.io/projected/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-kube-api-access-x4zgh\") pod \"b11524ee-3fca-4b1b-9cdf-6da289fdbc7d\" (UID: \"b11524ee-3fca-4b1b-9cdf-6da289fdbc7d\") "
Feb 26 15:44:18 crc kubenswrapper[4907]: I0226 15:44:18.297040 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-d6qdx\" (UniqueName: \"kubernetes.io/projected/87cf06ed-a83f-41a7-828d-70653580a8cb-kube-api-access-d6qdx\") pod \"87cf06ed-a83f-41a7-828d-70653580a8cb\" (UID: \"87cf06ed-a83f-41a7-828d-70653580a8cb\") "
Feb 26 15:44:18 crc kubenswrapper[4907]: I0226 15:44:18.297144 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-config\") pod \"8cea82b4-6893-4ddc-af9f-1bb5ae425c5b\" (UID: \"8cea82b4-6893-4ddc-af9f-1bb5ae425c5b\") "
Feb 26 15:44:18 crc kubenswrapper[4907]: I0226 15:44:18.297225 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-ovnkube-config\") pod \"925f1c65-6136-48ba-85aa-3a3b50560753\" (UID: \"925f1c65-6136-48ba-85aa-3a3b50560753\") "
Feb 26 15:44:18 crc kubenswrapper[4907]: I0226 15:44:18.297473 4907 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-26T15:44:18Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-26T15:44:18Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused"
Feb 26 15:44:18 crc kubenswrapper[4907]: I0226 15:44:18.297599 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-oauth-config\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") "
Feb 26 15:44:18 crc kubenswrapper[4907]: I0226 15:44:18.297640 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pcxfs\" (UniqueName: \"kubernetes.io/projected/9d4552c7-cd75-42dd-8880-30dd377c49a4-kube-api-access-pcxfs\") pod \"9d4552c7-cd75-42dd-8880-30dd377c49a4\" (UID: \"9d4552c7-cd75-42dd-8880-30dd377c49a4\") "
Feb 26 15:44:18 crc kubenswrapper[4907]: I0226 15:44:18.297665 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lzf88\" (UniqueName: \"kubernetes.io/projected/0b574797-001e-440a-8f4e-c0be86edad0f-kube-api-access-lzf88\") pod \"0b574797-001e-440a-8f4e-c0be86edad0f\" (UID: \"0b574797-001e-440a-8f4e-c0be86edad0f\") "
Feb 26 15:44:18 crc kubenswrapper[4907]: I0226 15:44:18.297691 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-utilities\") pod \"5225d0e4-402f-4861-b410-819f433b1803\" (UID: \"5225d0e4-402f-4861-b410-819f433b1803\") "
Feb 26 15:44:18 crc kubenswrapper[4907]: I0226 15:44:18.297714 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zgdk5\" (UniqueName: \"kubernetes.io/projected/31d8b7a1-420e-4252-a5b7-eebe8a111292-kube-api-access-zgdk5\") pod \"31d8b7a1-420e-4252-a5b7-eebe8a111292\" (UID: \"31d8b7a1-420e-4252-a5b7-eebe8a111292\") "
Feb 26 15:44:18 crc kubenswrapper[4907]: I0226 15:44:18.297734 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4d4hj\" (UniqueName: \"kubernetes.io/projected/3ab1a177-2de0-46d9-b765-d0d0649bb42e-kube-api-access-4d4hj\") pod \"3ab1a177-2de0-46d9-b765-d0d0649bb42e\" (UID: \"3ab1a177-2de0-46d9-b765-d0d0649bb42e\") "
Feb 26 15:44:18 crc kubenswrapper[4907]: I0226 15:44:18.297758 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/5b88f790-22fa-440e-b583-365168c0b23d-metrics-certs\") pod \"5b88f790-22fa-440e-b583-365168c0b23d\" (UID: \"5b88f790-22fa-440e-b583-365168c0b23d\") "
Feb 26 15:44:18 crc kubenswrapper[4907]: I0226 15:44:18.297779 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-default-certificate\") pod \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\" (UID: \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\") "
Feb 26 15:44:18 crc kubenswrapper[4907]: I0226 15:44:18.297801 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/496e6271-fb68-4057-954e-a0d97a4afa3f-config\") pod \"496e6271-fb68-4057-954e-a0d97a4afa3f\" (UID: \"496e6271-fb68-4057-954e-a0d97a4afa3f\") "
Feb 26 15:44:18 crc kubenswrapper[4907]: I0226 15:44:18.297823 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-cabundle\") pod \"25e176fe-21b4-4974-b1ed-c8b94f112a7f\" (UID: \"25e176fe-21b4-4974-b1ed-c8b94f112a7f\") "
Feb 26 15:44:18 crc kubenswrapper[4907]: I0226 15:44:18.297846 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-dbsvg\" (UniqueName: \"kubernetes.io/projected/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-kube-api-access-dbsvg\") pod \"f88749ec-7931-4ee7-b3fc-1ec5e11f92e9\" (UID: \"f88749ec-7931-4ee7-b3fc-1ec5e11f92e9\") "
Feb 26 15:44:18 crc kubenswrapper[4907]: I0226 15:44:18.297871 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/87cf06ed-a83f-41a7-828d-70653580a8cb-metrics-tls\") pod \"87cf06ed-a83f-41a7-828d-70653580a8cb\" (UID: \"87cf06ed-a83f-41a7-828d-70653580a8cb\") "
Feb 26 15:44:18 crc kubenswrapper[4907]: I0226 15:44:18.297896 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/0b78653f-4ff9-4508-8672-245ed9b561e3-service-ca\") pod \"0b78653f-4ff9-4508-8672-245ed9b561e3\" (UID: \"0b78653f-4ff9-4508-8672-245ed9b561e3\") "
Feb 26 15:44:18 crc kubenswrapper[4907]: I0226 15:44:18.297918 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-ocp-branding-template\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") "
Feb 26 15:44:18 crc kubenswrapper[4907]: I0226 15:44:18.297942 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-serving-cert\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") "
Feb 26 15:44:18 crc kubenswrapper[4907]: I0226 15:44:18.297964 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-sysctl-allowlist\") pod \"7bb08738-c794-4ee8-9972-3a62ca171029\" (UID: \"7bb08738-c794-4ee8-9972-3a62ca171029\") "
Feb 26 15:44:18 crc kubenswrapper[4907]: I0226 15:44:18.297983 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"certs\" (UniqueName: \"kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-certs\") pod \"5fe579f8-e8a6-4643-bce5-a661393c4dde\" (UID: \"5fe579f8-e8a6-4643-bce5-a661393c4dde\") "
Feb 26 15:44:18 crc kubenswrapper[4907]: I0226 15:44:18.298002 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-registry-tls\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Feb 26 15:44:18 crc kubenswrapper[4907]: I0226 15:44:18.298021 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-proxy-ca-bundles\") pod \"7583ce53-e0fe-4a16-9e4d-50516596a136\" (UID: \"7583ce53-e0fe-4a16-9e4d-50516596a136\") "
Feb 26 15:44:18 crc kubenswrapper[4907]: I0226 15:44:18.296253 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-utilities" (OuterVolumeSpecName: "utilities") pod "57a731c4-ef35-47a8-b875-bfb08a7f8011" (UID: "57a731c4-ef35-47a8-b875-bfb08a7f8011"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Feb 26 15:44:18 crc kubenswrapper[4907]: I0226 15:44:18.296318 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-config" (OuterVolumeSpecName: "config") pod "5441d097-087c-4d9a-baa8-b210afa90fc9" (UID: "5441d097-087c-4d9a-baa8-b210afa90fc9"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Feb 26 15:44:18 crc kubenswrapper[4907]: I0226 15:44:18.296975 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/925f1c65-6136-48ba-85aa-3a3b50560753-kube-api-access-s4n52" (OuterVolumeSpecName: "kube-api-access-s4n52") pod "925f1c65-6136-48ba-85aa-3a3b50560753" (UID: "925f1c65-6136-48ba-85aa-3a3b50560753"). InnerVolumeSpecName "kube-api-access-s4n52". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 26 15:44:18 crc kubenswrapper[4907]: I0226 15:44:18.297772 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/925f1c65-6136-48ba-85aa-3a3b50560753-ovn-control-plane-metrics-cert" (OuterVolumeSpecName: "ovn-control-plane-metrics-cert") pod "925f1c65-6136-48ba-85aa-3a3b50560753" (UID: "925f1c65-6136-48ba-85aa-3a3b50560753"). InnerVolumeSpecName "ovn-control-plane-metrics-cert". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 26 15:44:18 crc kubenswrapper[4907]: I0226 15:44:18.297928 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-service-ca-bundle" (OuterVolumeSpecName: "service-ca-bundle") pod "6509e943-70c6-444c-bc41-48a544e36fbd" (UID: "6509e943-70c6-444c-bc41-48a544e36fbd"). InnerVolumeSpecName "service-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Feb 26 15:44:18 crc kubenswrapper[4907]: I0226 15:44:18.298089 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-ovnkube-config" (OuterVolumeSpecName: "ovnkube-config") pod "925f1c65-6136-48ba-85aa-3a3b50560753" (UID: "925f1c65-6136-48ba-85aa-3a3b50560753"). InnerVolumeSpecName "ovnkube-config". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Feb 26 15:44:18 crc kubenswrapper[4907]: I0226 15:44:18.298388 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-oauth-config" (OuterVolumeSpecName: "console-oauth-config") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "console-oauth-config". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 26 15:44:18 crc kubenswrapper[4907]: I0226 15:44:18.298402 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-trusted-ca" (OuterVolumeSpecName: "trusted-ca") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "trusted-ca". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Feb 26 15:44:18 crc kubenswrapper[4907]: I0226 15:44:18.298749 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-kube-api-access-kfwg7" (OuterVolumeSpecName: "kube-api-access-kfwg7") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "kube-api-access-kfwg7". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 26 15:44:18 crc kubenswrapper[4907]: I0226 15:44:18.298804 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-config" (OuterVolumeSpecName: "config") pod "8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" (UID: "8cea82b4-6893-4ddc-af9f-1bb5ae425c5b"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Feb 26 15:44:18 crc kubenswrapper[4907]: I0226 15:44:18.298810 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-kube-api-access-dbsvg" (OuterVolumeSpecName: "kube-api-access-dbsvg") pod "f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" (UID: "f88749ec-7931-4ee7-b3fc-1ec5e11f92e9"). InnerVolumeSpecName "kube-api-access-dbsvg". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 26 15:44:18 crc kubenswrapper[4907]: I0226 15:44:18.298902 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 26 15:44:18 crc kubenswrapper[4907]: I0226 15:44:18.299061 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-srv-cert" (OuterVolumeSpecName: "srv-cert") pod "f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" (UID: "f88749ec-7931-4ee7-b3fc-1ec5e11f92e9"). InnerVolumeSpecName "srv-cert". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 26 15:44:18 crc kubenswrapper[4907]: I0226 15:44:18.299108 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3ab1a177-2de0-46d9-b765-d0d0649bb42e-kube-api-access-4d4hj" (OuterVolumeSpecName: "kube-api-access-4d4hj") pod "3ab1a177-2de0-46d9-b765-d0d0649bb42e" (UID: "3ab1a177-2de0-46d9-b765-d0d0649bb42e"). InnerVolumeSpecName "kube-api-access-4d4hj". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 26 15:44:18 crc kubenswrapper[4907]: I0226 15:44:18.299095 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5b88f790-22fa-440e-b583-365168c0b23d-metrics-certs" (OuterVolumeSpecName: "metrics-certs") pod "5b88f790-22fa-440e-b583-365168c0b23d" (UID: "5b88f790-22fa-440e-b583-365168c0b23d"). InnerVolumeSpecName "metrics-certs". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 26 15:44:18 crc kubenswrapper[4907]: I0226 15:44:18.299211 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-sysctl-allowlist" (OuterVolumeSpecName: "cni-sysctl-allowlist") pod "7bb08738-c794-4ee8-9972-3a62ca171029" (UID: "7bb08738-c794-4ee8-9972-3a62ca171029"). InnerVolumeSpecName "cni-sysctl-allowlist". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Feb 26 15:44:18 crc kubenswrapper[4907]: I0226 15:44:18.298091 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/3ab23cfe-46ea-420e-ba6c-38ac0d2804b0-cni-binary-copy\") pod \"multus-additional-cni-plugins-b2qgz\" (UID: \"3ab23cfe-46ea-420e-ba6c-38ac0d2804b0\") " pod="openshift-multus/multus-additional-cni-plugins-b2qgz"
Feb 26 15:44:18 crc kubenswrapper[4907]: I0226 15:44:18.299363 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-default-certificate" (OuterVolumeSpecName: "default-certificate") pod "c03ee662-fb2f-4fc4-a2c1-af487c19d254" (UID: "c03ee662-fb2f-4fc4-a2c1-af487c19d254"). InnerVolumeSpecName "default-certificate". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 26 15:44:18 crc kubenswrapper[4907]: I0226 15:44:18.299386 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0b574797-001e-440a-8f4e-c0be86edad0f-kube-api-access-lzf88" (OuterVolumeSpecName: "kube-api-access-lzf88") pod "0b574797-001e-440a-8f4e-c0be86edad0f" (UID: "0b574797-001e-440a-8f4e-c0be86edad0f"). InnerVolumeSpecName "kube-api-access-lzf88". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 26 15:44:18 crc kubenswrapper[4907]: I0226 15:44:18.299431 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-certs" (OuterVolumeSpecName: "certs") pod "5fe579f8-e8a6-4643-bce5-a661393c4dde" (UID: "5fe579f8-e8a6-4643-bce5-a661393c4dde"). InnerVolumeSpecName "certs". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 26 15:44:18 crc kubenswrapper[4907]: I0226 15:44:18.299886 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9d4552c7-cd75-42dd-8880-30dd377c49a4-kube-api-access-pcxfs" (OuterVolumeSpecName: "kube-api-access-pcxfs") pod "9d4552c7-cd75-42dd-8880-30dd377c49a4" (UID: "9d4552c7-cd75-42dd-8880-30dd377c49a4"). InnerVolumeSpecName "kube-api-access-pcxfs". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 26 15:44:18 crc kubenswrapper[4907]: I0226 15:44:18.299447 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/ae882fbf-ac76-4363-a10c-60eaf80ee7c7-host\") pod \"node-ca-9gtgp\" (UID: \"ae882fbf-ac76-4363-a10c-60eaf80ee7c7\") " pod="openshift-image-registry/node-ca-9gtgp"
Feb 26 15:44:18 crc kubenswrapper[4907]: I0226 15:44:18.300221 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-proxy-ca-bundles" (OuterVolumeSpecName: "proxy-ca-bundles") pod "7583ce53-e0fe-4a16-9e4d-50516596a136" (UID: "7583ce53-e0fe-4a16-9e4d-50516596a136"). InnerVolumeSpecName "proxy-ca-bundles". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Feb 26 15:44:18 crc kubenswrapper[4907]: I0226 15:44:18.300250 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-utilities" (OuterVolumeSpecName: "utilities") pod "5225d0e4-402f-4861-b410-819f433b1803" (UID: "5225d0e4-402f-4861-b410-819f433b1803"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Feb 26 15:44:18 crc kubenswrapper[4907]: I0226 15:44:18.300456 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/31d8b7a1-420e-4252-a5b7-eebe8a111292-kube-api-access-zgdk5" (OuterVolumeSpecName: "kube-api-access-zgdk5") pod "31d8b7a1-420e-4252-a5b7-eebe8a111292" (UID: "31d8b7a1-420e-4252-a5b7-eebe8a111292"). InnerVolumeSpecName "kube-api-access-zgdk5". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 26 15:44:18 crc kubenswrapper[4907]: I0226 15:44:18.300710 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/496e6271-fb68-4057-954e-a0d97a4afa3f-config" (OuterVolumeSpecName: "config") pod "496e6271-fb68-4057-954e-a0d97a4afa3f" (UID: "496e6271-fb68-4057-954e-a0d97a4afa3f"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Feb 26 15:44:18 crc kubenswrapper[4907]: I0226 15:44:18.300728 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-cabundle" (OuterVolumeSpecName: "signing-cabundle") pod "25e176fe-21b4-4974-b1ed-c8b94f112a7f" (UID: "25e176fe-21b4-4974-b1ed-c8b94f112a7f"). InnerVolumeSpecName "signing-cabundle". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Feb 26 15:44:18 crc kubenswrapper[4907]: I0226 15:44:18.300768 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-registry-tls" (OuterVolumeSpecName: "registry-tls") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "registry-tls". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 26 15:44:18 crc kubenswrapper[4907]: I0226 15:44:18.300822 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xl77m\" (UniqueName: \"kubernetes.io/projected/ae882fbf-ac76-4363-a10c-60eaf80ee7c7-kube-api-access-xl77m\") pod \"node-ca-9gtgp\" (UID: \"ae882fbf-ac76-4363-a10c-60eaf80ee7c7\") " pod="openshift-image-registry/node-ca-9gtgp"
Feb 26 15:44:18 crc kubenswrapper[4907]: I0226 15:44:18.300859 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/49ee65e1-8667-4ad7-a403-c899f0cc6a70-host-kubelet\") pod \"ovnkube-node-vsvsw\" (UID: \"49ee65e1-8667-4ad7-a403-c899f0cc6a70\") " pod="openshift-ovn-kubernetes/ovnkube-node-vsvsw"
Feb 26 15:44:18 crc kubenswrapper[4907]: I0226 15:44:18.300888 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/49ee65e1-8667-4ad7-a403-c899f0cc6a70-run-systemd\") pod \"ovnkube-node-vsvsw\" (UID: \"49ee65e1-8667-4ad7-a403-c899f0cc6a70\") " pod="openshift-ovn-kubernetes/ovnkube-node-vsvsw"
Feb 26 15:44:18 crc kubenswrapper[4907]: I0226 15:44:18.300910 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/49ee65e1-8667-4ad7-a403-c899f0cc6a70-etc-openvswitch\") pod \"ovnkube-node-vsvsw\" (UID: \"49ee65e1-8667-4ad7-a403-c899f0cc6a70\") " pod="openshift-ovn-kubernetes/ovnkube-node-vsvsw"
Feb 26 15:44:18 crc kubenswrapper[4907]: I0226 15:44:18.300935 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/51024bd5-00ff-4e2f-927c-8c989b59d7be-hostroot\") pod \"multus-2gl5t\" (UID: \"51024bd5-00ff-4e2f-927c-8c989b59d7be\") " pod="openshift-multus/multus-2gl5t"
Feb 26 15:44:18 crc kubenswrapper[4907]: I0226 15:44:18.300984 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/ef543e1b-8068-4ea3-b32a-61027b32e95d-webhook-cert\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb"
Feb 26 15:44:18 crc kubenswrapper[4907]: I0226 15:44:18.301138 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf"
Feb 26 15:44:18 crc kubenswrapper[4907]: I0226 15:44:18.301195 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/49ee65e1-8667-4ad7-a403-c899f0cc6a70-ovnkube-config\") pod \"ovnkube-node-vsvsw\" (UID: \"49ee65e1-8667-4ad7-a403-c899f0cc6a70\") " pod="openshift-ovn-kubernetes/ovnkube-node-vsvsw"
Feb 26 15:44:18 crc kubenswrapper[4907]: I0226 15:44:18.301242 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/49ee65e1-8667-4ad7-a403-c899f0cc6a70-env-overrides\") pod \"ovnkube-node-vsvsw\" (UID: \"49ee65e1-8667-4ad7-a403-c899f0cc6a70\") " pod="openshift-ovn-kubernetes/ovnkube-node-vsvsw"
Feb 26 15:44:18 crc kubenswrapper[4907]: I0226 15:44:18.301295 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c"
Feb 26 15:44:18 crc kubenswrapper[4907]: I0226 15:44:18.301346 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rczfb\" (UniqueName: \"kubernetes.io/projected/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-kube-api-access-rczfb\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h"
Feb 26 15:44:18 crc kubenswrapper[4907]: I0226 15:44:18.301411 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/3ab23cfe-46ea-420e-ba6c-38ac0d2804b0-tuning-conf-dir\") pod \"multus-additional-cni-plugins-b2qgz\" (UID: \"3ab23cfe-46ea-420e-ba6c-38ac0d2804b0\") " pod="openshift-multus/multus-additional-cni-plugins-b2qgz"
Feb 26 15:44:18 crc kubenswrapper[4907]: I0226 15:44:18.301457 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/49ee65e1-8667-4ad7-a403-c899f0cc6a70-log-socket\") pod \"ovnkube-node-vsvsw\" (UID: \"49ee65e1-8667-4ad7-a403-c899f0cc6a70\") " pod="openshift-ovn-kubernetes/ovnkube-node-vsvsw"
Feb 26 15:44:18 crc kubenswrapper[4907]: I0226 15:44:18.301480 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/49ee65e1-8667-4ad7-a403-c899f0cc6a70-host-run-ovn-kubernetes\") pod \"ovnkube-node-vsvsw\" (UID: \"49ee65e1-8667-4ad7-a403-c899f0cc6a70\") " pod="openshift-ovn-kubernetes/ovnkube-node-vsvsw"
Feb 26 15:44:18 crc kubenswrapper[4907]: I0226 15:44:18.301502 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0b78653f-4ff9-4508-8672-245ed9b561e3-service-ca" (OuterVolumeSpecName: "service-ca") pod "0b78653f-4ff9-4508-8672-245ed9b561e3" (UID: "0b78653f-4ff9-4508-8672-245ed9b561e3"). InnerVolumeSpecName "service-ca". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Feb 26 15:44:18 crc kubenswrapper[4907]: I0226 15:44:18.301505 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/49ee65e1-8667-4ad7-a403-c899f0cc6a70-ovnkube-script-lib\") pod \"ovnkube-node-vsvsw\" (UID: \"49ee65e1-8667-4ad7-a403-c899f0cc6a70\") " pod="openshift-ovn-kubernetes/ovnkube-node-vsvsw"
Feb 26 15:44:18 crc kubenswrapper[4907]: I0226 15:44:18.301542 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/51024bd5-00ff-4e2f-927c-8c989b59d7be-cnibin\") pod \"multus-2gl5t\" (UID: \"51024bd5-00ff-4e2f-927c-8c989b59d7be\") " pod="openshift-multus/multus-2gl5t"
Feb 26 15:44:18 crc kubenswrapper[4907]: I0226 15:44:18.301565 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/3ab23cfe-46ea-420e-ba6c-38ac0d2804b0-os-release\") pod \"multus-additional-cni-plugins-b2qgz\" (UID: \"3ab23cfe-46ea-420e-ba6c-38ac0d2804b0\") " pod="openshift-multus/multus-additional-cni-plugins-b2qgz"
Feb 26 15:44:18 crc kubenswrapper[4907]: I0226 15:44:18.301611 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/49ee65e1-8667-4ad7-a403-c899f0cc6a70-run-ovn\") pod \"ovnkube-node-vsvsw\" (UID: \"49ee65e1-8667-4ad7-a403-c899f0cc6a70\") " pod="openshift-ovn-kubernetes/ovnkube-node-vsvsw"
Feb 26 15:44:18 crc kubenswrapper[4907]: I0226 15:44:18.301633 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/49ee65e1-8667-4ad7-a403-c899f0cc6a70-host-cni-bin\") pod \"ovnkube-node-vsvsw\" (UID: \"49ee65e1-8667-4ad7-a403-c899f0cc6a70\") " pod="openshift-ovn-kubernetes/ovnkube-node-vsvsw"
Feb 26 15:44:18 crc kubenswrapper[4907]: I0226 15:44:18.301656 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/51024bd5-00ff-4e2f-927c-8c989b59d7be-multus-daemon-config\") pod \"multus-2gl5t\" (UID: \"51024bd5-00ff-4e2f-927c-8c989b59d7be\") " pod="openshift-multus/multus-2gl5t"
Feb 26 15:44:18 crc kubenswrapper[4907]: I0226 15:44:18.301676 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/51024bd5-00ff-4e2f-927c-8c989b59d7be-multus-cni-dir\") pod \"multus-2gl5t\" (UID: \"51024bd5-00ff-4e2f-927c-8c989b59d7be\") " pod="openshift-multus/multus-2gl5t"
Feb 26 15:44:18 crc kubenswrapper[4907]: I0226 15:44:18.301696 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-host-slash\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h"
Feb 26 15:44:18 crc kubenswrapper[4907]: I0226 15:44:18.301755 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/4569fec7-a859-4a9e-b9d9-34ccc7c6be9c-hosts-file\") pod \"node-resolver-958vt\" (UID: \"4569fec7-a859-4a9e-b9d9-34ccc7c6be9c\") " pod="openshift-dns/node-resolver-958vt"
Feb 26 15:44:18 crc kubenswrapper[4907]: I0226 15:44:18.301780 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8nhj9\" (UniqueName: \"kubernetes.io/projected/4569fec7-a859-4a9e-b9d9-34ccc7c6be9c-kube-api-access-8nhj9\") pod \"node-resolver-958vt\" (UID: \"4569fec7-a859-4a9e-b9d9-34ccc7c6be9c\") " pod="openshift-dns/node-resolver-958vt"
Feb 26 15:44:18 crc kubenswrapper[4907]: I0226 15:44:18.301801 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/51024bd5-00ff-4e2f-927c-8c989b59d7be-system-cni-dir\") pod \"multus-2gl5t\" (UID: \"51024bd5-00ff-4e2f-927c-8c989b59d7be\") " pod="openshift-multus/multus-2gl5t"
Feb 26 15:44:18 crc kubenswrapper[4907]: I0226 15:44:18.301821 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/51024bd5-00ff-4e2f-927c-8c989b59d7be-host-var-lib-cni-multus\") pod \"multus-2gl5t\" (UID: \"51024bd5-00ff-4e2f-927c-8c989b59d7be\") " pod="openshift-multus/multus-2gl5t"
Feb 26 15:44:18 crc kubenswrapper[4907]: I0226 15:44:18.301852 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/51024bd5-00ff-4e2f-927c-8c989b59d7be-multus-conf-dir\") pod \"multus-2gl5t\" (UID: \"51024bd5-00ff-4e2f-927c-8c989b59d7be\") " pod="openshift-multus/multus-2gl5t"
Feb 26 15:44:18 crc kubenswrapper[4907]: I0226 15:44:18.301873 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/ae882fbf-ac76-4363-a10c-60eaf80ee7c7-serviceca\") pod \"node-ca-9gtgp\" (UID: \"ae882fbf-ac76-4363-a10c-60eaf80ee7c7\") " pod="openshift-image-registry/node-ca-9gtgp"
Feb 26 15:44:18 crc kubenswrapper[4907]: I0226 15:44:18.301895 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2fx5n\" (UniqueName: \"kubernetes.io/projected/51024bd5-00ff-4e2f-927c-8c989b59d7be-kube-api-access-2fx5n\") pod \"multus-2gl5t\" (UID: \"51024bd5-00ff-4e2f-927c-8c989b59d7be\") " pod="openshift-multus/multus-2gl5t"
Feb 26 15:44:18 crc kubenswrapper[4907]: I0226 15:44:18.301920 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2kz5\" (UniqueName: \"kubernetes.io/projected/ef543e1b-8068-4ea3-b32a-61027b32e95d-kube-api-access-s2kz5\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb"
Feb 26 15:44:18 crc kubenswrapper[4907]: I0226 15:44:18.301941 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-etc-kube\" (UniqueName: \"kubernetes.io/host-path/37a5e44f-9a88-4405-be8a-b645485e7312-host-etc-kube\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf"
Feb 26 15:44:18 crc kubenswrapper[4907]: I0226 15:44:18.301962 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/ef543e1b-8068-4ea3-b32a-61027b32e95d-env-overrides\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb"
Feb 26 15:44:18 crc kubenswrapper[4907]: I0226 15:44:18.301983 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/51024bd5-00ff-4e2f-927c-8c989b59d7be-os-release\") pod \"multus-2gl5t\" (UID: \"51024bd5-00ff-4e2f-927c-8c989b59d7be\") " pod="openshift-multus/multus-2gl5t"
Feb 26 15:44:18 crc kubenswrapper[4907]: I0226 15:44:18.302003 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume
\"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/49ee65e1-8667-4ad7-a403-c899f0cc6a70-ovn-node-metrics-cert\") pod \"ovnkube-node-vsvsw\" (UID: \"49ee65e1-8667-4ad7-a403-c899f0cc6a70\") " pod="openshift-ovn-kubernetes/ovnkube-node-vsvsw" Feb 26 15:44:18 crc kubenswrapper[4907]: I0226 15:44:18.302925 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/917eebf3-db36-47b8-af0a-b80d042fddab-proxy-tls\") pod \"machine-config-daemon-v5ng6\" (UID: \"917eebf3-db36-47b8-af0a-b80d042fddab\") " pod="openshift-machine-config-operator/machine-config-daemon-v5ng6" Feb 26 15:44:18 crc kubenswrapper[4907]: I0226 15:44:18.302956 4907 swap_util.go:74] "error creating dir to test if tmpfs noswap is enabled. Assuming not supported" mount path="" error="stat /var/lib/kubelet/plugins/kubernetes.io/empty-dir: no such file or directory" Feb 26 15:44:18 crc kubenswrapper[4907]: I0226 15:44:18.303183 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/49ee65e1-8667-4ad7-a403-c899f0cc6a70-systemd-units\") pod \"ovnkube-node-vsvsw\" (UID: \"49ee65e1-8667-4ad7-a403-c899f0cc6a70\") " pod="openshift-ovn-kubernetes/ovnkube-node-vsvsw" Feb 26 15:44:18 crc kubenswrapper[4907]: I0226 15:44:18.303260 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/49ee65e1-8667-4ad7-a403-c899f0cc6a70-var-lib-openvswitch\") pod \"ovnkube-node-vsvsw\" (UID: \"49ee65e1-8667-4ad7-a403-c899f0cc6a70\") " pod="openshift-ovn-kubernetes/ovnkube-node-vsvsw" Feb 26 15:44:18 crc kubenswrapper[4907]: I0226 15:44:18.303323 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: 
\"kubernetes.io/host-path/51024bd5-00ff-4e2f-927c-8c989b59d7be-host-run-netns\") pod \"multus-2gl5t\" (UID: \"51024bd5-00ff-4e2f-927c-8c989b59d7be\") " pod="openshift-multus/multus-2gl5t" Feb 26 15:44:18 crc kubenswrapper[4907]: I0226 15:44:18.303345 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/51024bd5-00ff-4e2f-927c-8c989b59d7be-host-var-lib-cni-bin\") pod \"multus-2gl5t\" (UID: \"51024bd5-00ff-4e2f-927c-8c989b59d7be\") " pod="openshift-multus/multus-2gl5t" Feb 26 15:44:18 crc kubenswrapper[4907]: I0226 15:44:18.303401 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/51024bd5-00ff-4e2f-927c-8c989b59d7be-host-run-multus-certs\") pod \"multus-2gl5t\" (UID: \"51024bd5-00ff-4e2f-927c-8c989b59d7be\") " pod="openshift-multus/multus-2gl5t" Feb 26 15:44:18 crc kubenswrapper[4907]: I0226 15:44:18.303431 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rdwmf\" (UniqueName: \"kubernetes.io/projected/37a5e44f-9a88-4405-be8a-b645485e7312-kube-api-access-rdwmf\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Feb 26 15:44:18 crc kubenswrapper[4907]: I0226 15:44:18.303567 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"mcd-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/917eebf3-db36-47b8-af0a-b80d042fddab-mcd-auth-proxy-config\") pod \"machine-config-daemon-v5ng6\" (UID: \"917eebf3-db36-47b8-af0a-b80d042fddab\") " pod="openshift-machine-config-operator/machine-config-daemon-v5ng6" Feb 26 15:44:18 crc kubenswrapper[4907]: I0226 15:44:18.303634 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume 
started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/3ab23cfe-46ea-420e-ba6c-38ac0d2804b0-cnibin\") pod \"multus-additional-cni-plugins-b2qgz\" (UID: \"3ab23cfe-46ea-420e-ba6c-38ac0d2804b0\") " pod="openshift-multus/multus-additional-cni-plugins-b2qgz" Feb 26 15:44:18 crc kubenswrapper[4907]: I0226 15:44:18.303666 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovnkube-identity-cm\" (UniqueName: \"kubernetes.io/configmap/ef543e1b-8068-4ea3-b32a-61027b32e95d-ovnkube-identity-cm\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Feb 26 15:44:18 crc kubenswrapper[4907]: I0226 15:44:18.303721 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/49ee65e1-8667-4ad7-a403-c899f0cc6a70-host-cni-netd\") pod \"ovnkube-node-vsvsw\" (UID: \"49ee65e1-8667-4ad7-a403-c899f0cc6a70\") " pod="openshift-ovn-kubernetes/ovnkube-node-vsvsw" Feb 26 15:44:18 crc kubenswrapper[4907]: I0226 15:44:18.303833 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/ef543e1b-8068-4ea3-b32a-61027b32e95d-env-overrides\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Feb 26 15:44:18 crc kubenswrapper[4907]: I0226 15:44:18.303914 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/49ee65e1-8667-4ad7-a403-c899f0cc6a70-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-vsvsw\" (UID: \"49ee65e1-8667-4ad7-a403-c899f0cc6a70\") " pod="openshift-ovn-kubernetes/ovnkube-node-vsvsw" Feb 26 15:44:18 crc kubenswrapper[4907]: I0226 
15:44:18.303942 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/51024bd5-00ff-4e2f-927c-8c989b59d7be-multus-socket-dir-parent\") pod \"multus-2gl5t\" (UID: \"51024bd5-00ff-4e2f-927c-8c989b59d7be\") " pod="openshift-multus/multus-2gl5t" Feb 26 15:44:18 crc kubenswrapper[4907]: I0226 15:44:18.303964 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/51024bd5-00ff-4e2f-927c-8c989b59d7be-host-var-lib-kubelet\") pod \"multus-2gl5t\" (UID: \"51024bd5-00ff-4e2f-927c-8c989b59d7be\") " pod="openshift-multus/multus-2gl5t" Feb 26 15:44:18 crc kubenswrapper[4907]: I0226 15:44:18.303989 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9lmpq\" (UniqueName: \"kubernetes.io/projected/917eebf3-db36-47b8-af0a-b80d042fddab-kube-api-access-9lmpq\") pod \"machine-config-daemon-v5ng6\" (UID: \"917eebf3-db36-47b8-af0a-b80d042fddab\") " pod="openshift-machine-config-operator/machine-config-daemon-v5ng6" Feb 26 15:44:18 crc kubenswrapper[4907]: I0226 15:44:18.304010 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7hmb7\" (UniqueName: \"kubernetes.io/projected/49ee65e1-8667-4ad7-a403-c899f0cc6a70-kube-api-access-7hmb7\") pod \"ovnkube-node-vsvsw\" (UID: \"49ee65e1-8667-4ad7-a403-c899f0cc6a70\") " pod="openshift-ovn-kubernetes/ovnkube-node-vsvsw" Feb 26 15:44:18 crc kubenswrapper[4907]: I0226 15:44:18.304055 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/51024bd5-00ff-4e2f-927c-8c989b59d7be-cni-binary-copy\") pod \"multus-2gl5t\" (UID: \"51024bd5-00ff-4e2f-927c-8c989b59d7be\") " pod="openshift-multus/multus-2gl5t" 
Feb 26 15:44:18 crc kubenswrapper[4907]: I0226 15:44:18.304077 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rootfs\" (UniqueName: \"kubernetes.io/host-path/917eebf3-db36-47b8-af0a-b80d042fddab-rootfs\") pod \"machine-config-daemon-v5ng6\" (UID: \"917eebf3-db36-47b8-af0a-b80d042fddab\") " pod="openshift-machine-config-operator/machine-config-daemon-v5ng6" Feb 26 15:44:18 crc kubenswrapper[4907]: I0226 15:44:18.304096 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/49ee65e1-8667-4ad7-a403-c899f0cc6a70-host-slash\") pod \"ovnkube-node-vsvsw\" (UID: \"49ee65e1-8667-4ad7-a403-c899f0cc6a70\") " pod="openshift-ovn-kubernetes/ovnkube-node-vsvsw" Feb 26 15:44:18 crc kubenswrapper[4907]: I0226 15:44:18.304247 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/51024bd5-00ff-4e2f-927c-8c989b59d7be-host-run-k8s-cni-cncf-io\") pod \"multus-2gl5t\" (UID: \"51024bd5-00ff-4e2f-927c-8c989b59d7be\") " pod="openshift-multus/multus-2gl5t" Feb 26 15:44:18 crc kubenswrapper[4907]: I0226 15:44:18.304342 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 26 15:44:18 crc kubenswrapper[4907]: I0226 15:44:18.304408 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/49ee65e1-8667-4ad7-a403-c899f0cc6a70-run-openvswitch\") pod \"ovnkube-node-vsvsw\" (UID: 
\"49ee65e1-8667-4ad7-a403-c899f0cc6a70\") " pod="openshift-ovn-kubernetes/ovnkube-node-vsvsw" Feb 26 15:44:18 crc kubenswrapper[4907]: I0226 15:44:18.304441 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 26 15:44:18 crc kubenswrapper[4907]: I0226 15:44:18.304687 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7583ce53-e0fe-4a16-9e4d-50516596a136-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "7583ce53-e0fe-4a16-9e4d-50516596a136" (UID: "7583ce53-e0fe-4a16-9e4d-50516596a136"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 26 15:44:18 crc kubenswrapper[4907]: I0226 15:44:18.305249 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e7e6199b-1264-4501-8953-767f51328d08-config" (OuterVolumeSpecName: "config") pod "e7e6199b-1264-4501-8953-767f51328d08" (UID: "e7e6199b-1264-4501-8953-767f51328d08"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 26 15:44:18 crc kubenswrapper[4907]: I0226 15:44:18.307700 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/49ee65e1-8667-4ad7-a403-c899f0cc6a70-host-run-netns\") pod \"ovnkube-node-vsvsw\" (UID: \"49ee65e1-8667-4ad7-a403-c899f0cc6a70\") " pod="openshift-ovn-kubernetes/ovnkube-node-vsvsw" Feb 26 15:44:18 crc kubenswrapper[4907]: I0226 15:44:18.307746 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2vfj6\" (UniqueName: \"kubernetes.io/projected/3ab23cfe-46ea-420e-ba6c-38ac0d2804b0-kube-api-access-2vfj6\") pod \"multus-additional-cni-plugins-b2qgz\" (UID: \"3ab23cfe-46ea-420e-ba6c-38ac0d2804b0\") " pod="openshift-multus/multus-additional-cni-plugins-b2qgz" Feb 26 15:44:18 crc kubenswrapper[4907]: I0226 15:44:18.307769 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/49ee65e1-8667-4ad7-a403-c899f0cc6a70-node-log\") pod \"ovnkube-node-vsvsw\" (UID: \"49ee65e1-8667-4ad7-a403-c899f0cc6a70\") " pod="openshift-ovn-kubernetes/ovnkube-node-vsvsw" Feb 26 15:44:18 crc kubenswrapper[4907]: I0226 15:44:18.307800 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-iptables-alerter-script\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h" Feb 26 15:44:18 crc kubenswrapper[4907]: I0226 15:44:18.307822 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/51024bd5-00ff-4e2f-927c-8c989b59d7be-etc-kubernetes\") pod 
\"multus-2gl5t\" (UID: \"51024bd5-00ff-4e2f-927c-8c989b59d7be\") " pod="openshift-multus/multus-2gl5t" Feb 26 15:44:18 crc kubenswrapper[4907]: I0226 15:44:18.307846 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/37a5e44f-9a88-4405-be8a-b645485e7312-metrics-tls\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Feb 26 15:44:18 crc kubenswrapper[4907]: I0226 15:44:18.307873 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/3ab23cfe-46ea-420e-ba6c-38ac0d2804b0-system-cni-dir\") pod \"multus-additional-cni-plugins-b2qgz\" (UID: \"3ab23cfe-46ea-420e-ba6c-38ac0d2804b0\") " pod="openshift-multus/multus-additional-cni-plugins-b2qgz" Feb 26 15:44:18 crc kubenswrapper[4907]: I0226 15:44:18.307896 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/3ab23cfe-46ea-420e-ba6c-38ac0d2804b0-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-b2qgz\" (UID: \"3ab23cfe-46ea-420e-ba6c-38ac0d2804b0\") " pod="openshift-multus/multus-additional-cni-plugins-b2qgz" Feb 26 15:44:18 crc kubenswrapper[4907]: E0226 15:44:18.308397 4907 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Feb 26 15:44:18 crc kubenswrapper[4907]: E0226 15:44:18.308473 4907 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-02-26 15:44:18.808453608 +0000 UTC m=+121.327015467 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Feb 26 15:44:18 crc kubenswrapper[4907]: E0226 15:44:18.308555 4907 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Feb 26 15:44:18 crc kubenswrapper[4907]: E0226 15:44:18.308820 4907 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-02-26 15:44:18.808807888 +0000 UTC m=+121.327369757 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Feb 26 15:44:18 crc kubenswrapper[4907]: I0226 15:44:18.308985 4907 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-serving-cert\") on node \"crc\" DevicePath \"\"" Feb 26 15:44:18 crc kubenswrapper[4907]: I0226 15:44:18.309017 4907 reconciler_common.go:293] "Volume detached for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-serving-ca\") on node \"crc\" DevicePath \"\"" Feb 26 15:44:18 crc kubenswrapper[4907]: I0226 15:44:18.309039 4907 reconciler_common.go:293] "Volume detached for volume 
\"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-trusted-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 26 15:44:18 crc kubenswrapper[4907]: I0226 15:44:18.309060 4907 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/01ab3dd5-8196-46d0-ad33-122e2ca51def-config\") on node \"crc\" DevicePath \"\"" Feb 26 15:44:18 crc kubenswrapper[4907]: I0226 15:44:18.309079 4907 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-d4lsv\" (UniqueName: \"kubernetes.io/projected/25e176fe-21b4-4974-b1ed-c8b94f112a7f-kube-api-access-d4lsv\") on node \"crc\" DevicePath \"\"" Feb 26 15:44:18 crc kubenswrapper[4907]: I0226 15:44:18.309094 4907 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8tdtz\" (UniqueName: \"kubernetes.io/projected/09efc573-dbb6-4249-bd59-9b87aba8dd28-kube-api-access-8tdtz\") on node \"crc\" DevicePath \"\"" Feb 26 15:44:18 crc kubenswrapper[4907]: I0226 15:44:18.309107 4907 reconciler_common.go:293] "Volume detached for volume \"etcd-service-ca\" (UniqueName: \"kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-service-ca\") on node \"crc\" DevicePath \"\"" Feb 26 15:44:18 crc kubenswrapper[4907]: I0226 15:44:18.309120 4907 reconciler_common.go:293] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/fda69060-fa79-4696-b1a6-7980f124bf7c-proxy-tls\") on node \"crc\" DevicePath \"\"" Feb 26 15:44:18 crc kubenswrapper[4907]: I0226 15:44:18.309132 4907 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jhbk2\" (UniqueName: \"kubernetes.io/projected/bd23aa5c-e532-4e53-bccf-e79f130c5ae8-kube-api-access-jhbk2\") on node \"crc\" DevicePath \"\"" Feb 26 15:44:18 crc kubenswrapper[4907]: I0226 15:44:18.309146 4907 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qg5z5\" (UniqueName: 
\"kubernetes.io/projected/43509403-f426-496e-be36-56cef71462f5-kube-api-access-qg5z5\") on node \"crc\" DevicePath \"\"" Feb 26 15:44:18 crc kubenswrapper[4907]: I0226 15:44:18.309159 4907 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qs4fp\" (UniqueName: \"kubernetes.io/projected/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-kube-api-access-qs4fp\") on node \"crc\" DevicePath \"\"" Feb 26 15:44:18 crc kubenswrapper[4907]: I0226 15:44:18.309172 4907 reconciler_common.go:293] "Volume detached for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-bound-sa-token\") on node \"crc\" DevicePath \"\"" Feb 26 15:44:18 crc kubenswrapper[4907]: I0226 15:44:18.309184 4907 reconciler_common.go:293] "Volume detached for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-bound-sa-token\") on node \"crc\" DevicePath \"\"" Feb 26 15:44:18 crc kubenswrapper[4907]: I0226 15:44:18.309200 4907 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6g6sz\" (UniqueName: \"kubernetes.io/projected/6509e943-70c6-444c-bc41-48a544e36fbd-kube-api-access-6g6sz\") on node \"crc\" DevicePath \"\"" Feb 26 15:44:18 crc kubenswrapper[4907]: I0226 15:44:18.309217 4907 reconciler_common.go:293] "Volume detached for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-encryption-config\") on node \"crc\" DevicePath \"\"" Feb 26 15:44:18 crc kubenswrapper[4907]: I0226 15:44:18.309304 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-identity-cm\" (UniqueName: \"kubernetes.io/configmap/ef543e1b-8068-4ea3-b32a-61027b32e95d-ovnkube-identity-cm\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Feb 26 15:44:18 crc kubenswrapper[4907]: I0226 15:44:18.309516 4907 reconciler_common.go:293] 
"Volume detached for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-serviceca\") on node \"crc\" DevicePath \"\"" Feb 26 15:44:18 crc kubenswrapper[4907]: I0226 15:44:18.309532 4907 reconciler_common.go:293] "Volume detached for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-metrics-certs\") on node \"crc\" DevicePath \"\"" Feb 26 15:44:18 crc kubenswrapper[4907]: I0226 15:44:18.309545 4907 reconciler_common.go:293] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/0b574797-001e-440a-8f4e-c0be86edad0f-proxy-tls\") on node \"crc\" DevicePath \"\"" Feb 26 15:44:18 crc kubenswrapper[4907]: I0226 15:44:18.309558 4907 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2d4wz\" (UniqueName: \"kubernetes.io/projected/5441d097-087c-4d9a-baa8-b210afa90fc9-kube-api-access-2d4wz\") on node \"crc\" DevicePath \"\"" Feb 26 15:44:18 crc kubenswrapper[4907]: I0226 15:44:18.309571 4907 reconciler_common.go:293] "Volume detached for volume \"tmpfs\" (UniqueName: \"kubernetes.io/empty-dir/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-tmpfs\") on node \"crc\" DevicePath \"\"" Feb 26 15:44:18 crc kubenswrapper[4907]: I0226 15:44:18.309583 4907 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/496e6271-fb68-4057-954e-a0d97a4afa3f-serving-cert\") on node \"crc\" DevicePath \"\"" Feb 26 15:44:18 crc kubenswrapper[4907]: I0226 15:44:18.309621 4907 reconciler_common.go:293] "Volume detached for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-auth-proxy-config\") on node \"crc\" DevicePath \"\"" Feb 26 15:44:18 crc kubenswrapper[4907]: I0226 15:44:18.309633 4907 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xcphl\" (UniqueName: \"kubernetes.io/projected/7583ce53-e0fe-4a16-9e4d-50516596a136-kube-api-access-xcphl\") 
on node \"crc\" DevicePath \"\"" Feb 26 15:44:18 crc kubenswrapper[4907]: I0226 15:44:18.309646 4907 reconciler_common.go:293] "Volume detached for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/8f668bae-612b-4b75-9490-919e737c6a3b-installation-pull-secrets\") on node \"crc\" DevicePath \"\"" Feb 26 15:44:18 crc kubenswrapper[4907]: I0226 15:44:18.309658 4907 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-config\") on node \"crc\" DevicePath \"\"" Feb 26 15:44:18 crc kubenswrapper[4907]: I0226 15:44:18.309670 4907 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-config\") on node \"crc\" DevicePath \"\"" Feb 26 15:44:18 crc kubenswrapper[4907]: I0226 15:44:18.309682 4907 reconciler_common.go:293] "Volume detached for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-oauth-serving-cert\") on node \"crc\" DevicePath \"\"" Feb 26 15:44:18 crc kubenswrapper[4907]: I0226 15:44:18.309696 4907 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-x7zkh\" (UniqueName: \"kubernetes.io/projected/6731426b-95fe-49ff-bb5f-40441049fde2-kube-api-access-x7zkh\") on node \"crc\" DevicePath \"\"" Feb 26 15:44:18 crc kubenswrapper[4907]: I0226 15:44:18.309709 4907 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-x2m85\" (UniqueName: \"kubernetes.io/projected/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d-kube-api-access-x2m85\") on node \"crc\" DevicePath \"\"" Feb 26 15:44:18 crc kubenswrapper[4907]: I0226 15:44:18.309721 4907 reconciler_common.go:293] "Volume detached for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-binary-copy\") on node \"crc\" DevicePath \"\"" Feb 26 15:44:18 crc kubenswrapper[4907]: I0226 15:44:18.309734 
4907 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xcgwh\" (UniqueName: \"kubernetes.io/projected/fda69060-fa79-4696-b1a6-7980f124bf7c-kube-api-access-xcgwh\") on node \"crc\" DevicePath \"\"" Feb 26 15:44:18 crc kubenswrapper[4907]: I0226 15:44:18.309747 4907 reconciler_common.go:293] "Volume detached for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-stats-auth\") on node \"crc\" DevicePath \"\"" Feb 26 15:44:18 crc kubenswrapper[4907]: I0226 15:44:18.309759 4907 reconciler_common.go:293] "Volume detached for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-trusted-ca\") on node \"crc\" DevicePath \"\"" Feb 26 15:44:18 crc kubenswrapper[4907]: I0226 15:44:18.311140 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/49ef4625-1d3a-4a9f-b595-c2433d32326d-kube-api-access-pjr6v" (OuterVolumeSpecName: "kube-api-access-pjr6v") pod "49ef4625-1d3a-4a9f-b595-c2433d32326d" (UID: "49ef4625-1d3a-4a9f-b595-c2433d32326d"). InnerVolumeSpecName "kube-api-access-pjr6v". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 26 15:44:18 crc kubenswrapper[4907]: I0226 15:44:18.311384 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-iptables-alerter-script\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h" Feb 26 15:44:18 crc kubenswrapper[4907]: I0226 15:44:18.309772 4907 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/e7e6199b-1264-4501-8953-767f51328d08-serving-cert\") on node \"crc\" DevicePath \"\"" Feb 26 15:44:18 crc kubenswrapper[4907]: I0226 15:44:18.311868 4907 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-router-certs\") on node \"crc\" DevicePath \"\"" Feb 26 15:44:18 crc kubenswrapper[4907]: I0226 15:44:18.311882 4907 reconciler_common.go:293] "Volume detached for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/c03ee662-fb2f-4fc4-a2c1-af487c19d254-service-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 26 15:44:18 crc kubenswrapper[4907]: I0226 15:44:18.311897 4907 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rnphk\" (UniqueName: \"kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-kube-api-access-rnphk\") on node \"crc\" DevicePath \"\"" Feb 26 15:44:18 crc kubenswrapper[4907]: I0226 15:44:18.311912 4907 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/0b78653f-4ff9-4508-8672-245ed9b561e3-serving-cert\") on node \"crc\" DevicePath \"\"" Feb 26 15:44:18 crc kubenswrapper[4907]: I0226 15:44:18.311925 4907 reconciler_common.go:293] "Volume detached for volume \"image-import-ca\" (UniqueName: 
\"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-image-import-ca\") on node \"crc\" DevicePath \"\"" Feb 26 15:44:18 crc kubenswrapper[4907]: I0226 15:44:18.311940 4907 reconciler_common.go:293] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-trusted-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 26 15:44:18 crc kubenswrapper[4907]: I0226 15:44:18.311954 4907 reconciler_common.go:293] "Volume detached for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-cni-binary-copy\") on node \"crc\" DevicePath \"\"" Feb 26 15:44:18 crc kubenswrapper[4907]: I0226 15:44:18.311967 4907 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/1386a44e-36a2-460c-96d0-0359d2b6f0f5-kube-api-access\") on node \"crc\" DevicePath \"\"" Feb 26 15:44:18 crc kubenswrapper[4907]: I0226 15:44:18.311979 4907 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-session\") on node \"crc\" DevicePath \"\"" Feb 26 15:44:18 crc kubenswrapper[4907]: I0226 15:44:18.311992 4907 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1386a44e-36a2-460c-96d0-0359d2b6f0f5-config\") on node \"crc\" DevicePath \"\"" Feb 26 15:44:18 crc kubenswrapper[4907]: I0226 15:44:18.312004 4907 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-config\") on node \"crc\" DevicePath \"\"" Feb 26 15:44:18 crc kubenswrapper[4907]: I0226 15:44:18.312016 4907 reconciler_common.go:293] "Volume detached for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-operator-metrics\") on node \"crc\" 
DevicePath \"\"" Feb 26 15:44:18 crc kubenswrapper[4907]: I0226 15:44:18.312029 4907 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7539238d-5fe0-46ed-884e-1c3b566537ec-serving-cert\") on node \"crc\" DevicePath \"\"" Feb 26 15:44:18 crc kubenswrapper[4907]: I0226 15:44:18.312041 4907 reconciler_common.go:293] "Volume detached for volume \"machine-api-operator-tls\" (UniqueName: \"kubernetes.io/secret/6402fda4-df10-493c-b4e5-d0569419652d-machine-api-operator-tls\") on node \"crc\" DevicePath \"\"" Feb 26 15:44:18 crc kubenswrapper[4907]: I0226 15:44:18.312055 4907 reconciler_common.go:293] "Volume detached for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-metrics-tls\") on node \"crc\" DevicePath \"\"" Feb 26 15:44:18 crc kubenswrapper[4907]: I0226 15:44:18.312067 4907 reconciler_common.go:293] "Volume detached for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-script-lib\") on node \"crc\" DevicePath \"\"" Feb 26 15:44:18 crc kubenswrapper[4907]: I0226 15:44:18.312080 4907 reconciler_common.go:293] "Volume detached for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/bf126b07-da06-4140-9a57-dfd54fc6b486-trusted-ca\") on node \"crc\" DevicePath \"\"" Feb 26 15:44:18 crc kubenswrapper[4907]: I0226 15:44:18.312093 4907 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-serving-cert\") on node \"crc\" DevicePath \"\"" Feb 26 15:44:18 crc kubenswrapper[4907]: I0226 15:44:18.312105 4907 reconciler_common.go:293] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-trusted-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 26 15:44:18 crc kubenswrapper[4907]: I0226 15:44:18.312119 4907 reconciler_common.go:293] "Volume 
detached for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-error\") on node \"crc\" DevicePath \"\"" Feb 26 15:44:18 crc kubenswrapper[4907]: I0226 15:44:18.312132 4907 reconciler_common.go:293] "Volume detached for volume \"images\" (UniqueName: \"kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-images\") on node \"crc\" DevicePath \"\"" Feb 26 15:44:18 crc kubenswrapper[4907]: I0226 15:44:18.312167 4907 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-config\") on node \"crc\" DevicePath \"\"" Feb 26 15:44:18 crc kubenswrapper[4907]: I0226 15:44:18.312180 4907 reconciler_common.go:293] "Volume detached for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-auth-proxy-config\") on node \"crc\" DevicePath \"\"" Feb 26 15:44:18 crc kubenswrapper[4907]: I0226 15:44:18.312192 4907 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1386a44e-36a2-460c-96d0-0359d2b6f0f5-serving-cert\") on node \"crc\" DevicePath \"\"" Feb 26 15:44:18 crc kubenswrapper[4907]: I0226 15:44:18.312205 4907 reconciler_common.go:293] "Volume detached for volume \"control-plane-machine-set-operator-tls\" (UniqueName: \"kubernetes.io/secret/6731426b-95fe-49ff-bb5f-40441049fde2-control-plane-machine-set-operator-tls\") on node \"crc\" DevicePath \"\"" Feb 26 15:44:18 crc kubenswrapper[4907]: I0226 15:44:18.312218 4907 reconciler_common.go:293] "Volume detached for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-audit-policies\") on node \"crc\" DevicePath \"\"" Feb 26 15:44:18 crc kubenswrapper[4907]: I0226 15:44:18.312231 4907 reconciler_common.go:293] "Volume detached for volume \"mcc-auth-proxy-config\" (UniqueName: 
\"kubernetes.io/configmap/0b574797-001e-440a-8f4e-c0be86edad0f-mcc-auth-proxy-config\") on node \"crc\" DevicePath \"\"" Feb 26 15:44:18 crc kubenswrapper[4907]: I0226 15:44:18.312243 4907 reconciler_common.go:293] "Volume detached for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-service-ca\") on node \"crc\" DevicePath \"\"" Feb 26 15:44:18 crc kubenswrapper[4907]: I0226 15:44:18.312255 4907 reconciler_common.go:293] "Volume detached for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/a31745f5-9847-4afe-82a5-3161cc66ca93-trusted-ca\") on node \"crc\" DevicePath \"\"" Feb 26 15:44:18 crc kubenswrapper[4907]: I0226 15:44:18.312269 4907 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mnrrd\" (UniqueName: \"kubernetes.io/projected/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-kube-api-access-mnrrd\") on node \"crc\" DevicePath \"\"" Feb 26 15:44:18 crc kubenswrapper[4907]: I0226 15:44:18.312282 4907 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-w7l8j\" (UniqueName: \"kubernetes.io/projected/01ab3dd5-8196-46d0-ad33-122e2ca51def-kube-api-access-w7l8j\") on node \"crc\" DevicePath \"\"" Feb 26 15:44:18 crc kubenswrapper[4907]: I0226 15:44:18.312294 4907 reconciler_common.go:293] "Volume detached for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-env-overrides\") on node \"crc\" DevicePath \"\"" Feb 26 15:44:18 crc kubenswrapper[4907]: I0226 15:44:18.312306 4907 reconciler_common.go:293] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-trusted-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 26 15:44:18 crc kubenswrapper[4907]: I0226 15:44:18.312319 4907 reconciler_common.go:293] "Volume detached for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-profile-collector-cert\") on 
node \"crc\" DevicePath \"\"" Feb 26 15:44:18 crc kubenswrapper[4907]: I0226 15:44:18.312333 4907 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9xfj7\" (UniqueName: \"kubernetes.io/projected/5225d0e4-402f-4861-b410-819f433b1803-kube-api-access-9xfj7\") on node \"crc\" DevicePath \"\"" Feb 26 15:44:18 crc kubenswrapper[4907]: I0226 15:44:18.312345 4907 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-serving-cert\") on node \"crc\" DevicePath \"\"" Feb 26 15:44:18 crc kubenswrapper[4907]: I0226 15:44:18.312358 4907 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7c4vf\" (UniqueName: \"kubernetes.io/projected/22c825df-677d-4ca6-82db-3454ed06e783-kube-api-access-7c4vf\") on node \"crc\" DevicePath \"\"" Feb 26 15:44:18 crc kubenswrapper[4907]: I0226 15:44:18.312371 4907 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-tk88c\" (UniqueName: \"kubernetes.io/projected/7539238d-5fe0-46ed-884e-1c3b566537ec-kube-api-access-tk88c\") on node \"crc\" DevicePath \"\"" Feb 26 15:44:18 crc kubenswrapper[4907]: I0226 15:44:18.312384 4907 reconciler_common.go:293] "Volume detached for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/a31745f5-9847-4afe-82a5-3161cc66ca93-metrics-tls\") on node \"crc\" DevicePath \"\"" Feb 26 15:44:18 crc kubenswrapper[4907]: I0226 15:44:18.312396 4907 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-config\") on node \"crc\" DevicePath \"\"" Feb 26 15:44:18 crc kubenswrapper[4907]: I0226 15:44:18.312408 4907 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-htfz6\" (UniqueName: \"kubernetes.io/projected/6ea678ab-3438-413e-bfe3-290ae7725660-kube-api-access-htfz6\") on node \"crc\" DevicePath \"\"" Feb 26 15:44:18 crc kubenswrapper[4907]: I0226 15:44:18.312422 4907 
reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-provider-selection\") on node \"crc\" DevicePath \"\"" Feb 26 15:44:18 crc kubenswrapper[4907]: I0226 15:44:18.312436 4907 reconciler_common.go:293] "Volume detached for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-encryption-config\") on node \"crc\" DevicePath \"\"" Feb 26 15:44:18 crc kubenswrapper[4907]: I0226 15:44:18.312449 4907 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-lz9wn\" (UniqueName: \"kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-kube-api-access-lz9wn\") on node \"crc\" DevicePath \"\"" Feb 26 15:44:18 crc kubenswrapper[4907]: I0226 15:44:18.312462 4907 reconciler_common.go:293] "Volume detached for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-profile-collector-cert\") on node \"crc\" DevicePath \"\"" Feb 26 15:44:18 crc kubenswrapper[4907]: I0226 15:44:18.312474 4907 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-serving-cert\") on node \"crc\" DevicePath \"\"" Feb 26 15:44:18 crc kubenswrapper[4907]: I0226 15:44:18.312487 4907 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-client-ca\") on node \"crc\" DevicePath \"\"" Feb 26 15:44:18 crc kubenswrapper[4907]: I0226 15:44:18.312500 4907 reconciler_common.go:293] "Volume detached for volume \"node-bootstrap-token\" (UniqueName: \"kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-node-bootstrap-token\") on node \"crc\" DevicePath \"\"" Feb 26 15:44:18 crc kubenswrapper[4907]: I0226 15:44:18.312513 4907 reconciler_common.go:293] "Volume detached 
for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-utilities\") on node \"crc\" DevicePath \"\"" Feb 26 15:44:18 crc kubenswrapper[4907]: I0226 15:44:18.312525 4907 reconciler_common.go:293] "Volume detached for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-client\") on node \"crc\" DevicePath \"\"" Feb 26 15:44:18 crc kubenswrapper[4907]: I0226 15:44:18.312537 4907 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mg5zb\" (UniqueName: \"kubernetes.io/projected/6402fda4-df10-493c-b4e5-d0569419652d-kube-api-access-mg5zb\") on node \"crc\" DevicePath \"\"" Feb 26 15:44:18 crc kubenswrapper[4907]: I0226 15:44:18.312549 4907 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-config\") on node \"crc\" DevicePath \"\"" Feb 26 15:44:18 crc kubenswrapper[4907]: I0226 15:44:18.312561 4907 reconciler_common.go:293] "Volume detached for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-trusted-ca\") on node \"crc\" DevicePath \"\"" Feb 26 15:44:18 crc kubenswrapper[4907]: I0226 15:44:18.312573 4907 reconciler_common.go:293] "Volume detached for volume \"ovn-control-plane-metrics-cert\" (UniqueName: \"kubernetes.io/secret/925f1c65-6136-48ba-85aa-3a3b50560753-ovn-control-plane-metrics-cert\") on node \"crc\" DevicePath \"\"" Feb 26 15:44:18 crc kubenswrapper[4907]: I0226 15:44:18.312586 4907 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-nzwt7\" (UniqueName: \"kubernetes.io/projected/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-kube-api-access-nzwt7\") on node \"crc\" DevicePath \"\"" Feb 26 15:44:18 crc kubenswrapper[4907]: I0226 15:44:18.312630 4907 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6ccd8\" (UniqueName: 
\"kubernetes.io/projected/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-kube-api-access-6ccd8\") on node \"crc\" DevicePath \"\"" Feb 26 15:44:18 crc kubenswrapper[4907]: I0226 15:44:18.312642 4907 reconciler_common.go:293] "Volume detached for volume \"available-featuregates\" (UniqueName: \"kubernetes.io/empty-dir/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-available-featuregates\") on node \"crc\" DevicePath \"\"" Feb 26 15:44:18 crc kubenswrapper[4907]: I0226 15:44:18.312655 4907 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-sb6h7\" (UniqueName: \"kubernetes.io/projected/1bf7eb37-55a3-4c65-b768-a94c82151e69-kube-api-access-sb6h7\") on node \"crc\" DevicePath \"\"" Feb 26 15:44:18 crc kubenswrapper[4907]: I0226 15:44:18.312667 4907 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-cfbct\" (UniqueName: \"kubernetes.io/projected/57a731c4-ef35-47a8-b875-bfb08a7f8011-kube-api-access-cfbct\") on node \"crc\" DevicePath \"\"" Feb 26 15:44:18 crc kubenswrapper[4907]: I0226 15:44:18.312681 4907 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-cliconfig\") on node \"crc\" DevicePath \"\"" Feb 26 15:44:18 crc kubenswrapper[4907]: I0226 15:44:18.312693 4907 reconciler_common.go:293] "Volume detached for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-trusted-ca\") on node \"crc\" DevicePath \"\"" Feb 26 15:44:18 crc kubenswrapper[4907]: I0226 15:44:18.312706 4907 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-gf66m\" (UniqueName: \"kubernetes.io/projected/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-kube-api-access-gf66m\") on node \"crc\" DevicePath \"\"" Feb 26 15:44:18 crc kubenswrapper[4907]: I0226 15:44:18.312719 4907 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jkwtn\" (UniqueName: 
\"kubernetes.io/projected/5b88f790-22fa-440e-b583-365168c0b23d-kube-api-access-jkwtn\") on node \"crc\" DevicePath \"\"" Feb 26 15:44:18 crc kubenswrapper[4907]: I0226 15:44:18.312732 4907 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/01ab3dd5-8196-46d0-ad33-122e2ca51def-serving-cert\") on node \"crc\" DevicePath \"\"" Feb 26 15:44:18 crc kubenswrapper[4907]: I0226 15:44:18.312745 4907 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-bf2bz\" (UniqueName: \"kubernetes.io/projected/1d611f23-29be-4491-8495-bee1670e935f-kube-api-access-bf2bz\") on node \"crc\" DevicePath \"\"" Feb 26 15:44:18 crc kubenswrapper[4907]: I0226 15:44:18.312757 4907 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-utilities\") on node \"crc\" DevicePath \"\"" Feb 26 15:44:18 crc kubenswrapper[4907]: I0226 15:44:18.312769 4907 reconciler_common.go:293] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-trusted-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 26 15:44:18 crc kubenswrapper[4907]: I0226 15:44:18.312782 4907 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-config\") on node \"crc\" DevicePath \"\"" Feb 26 15:44:18 crc kubenswrapper[4907]: I0226 15:44:18.312795 4907 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-249nr\" (UniqueName: \"kubernetes.io/projected/b6312bbd-5731-4ea0-a20f-81d5a57df44a-kube-api-access-249nr\") on node \"crc\" DevicePath \"\"" Feb 26 15:44:18 crc kubenswrapper[4907]: I0226 15:44:18.312809 4907 reconciler_common.go:293] "Volume detached for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/efdd0498-1daa-4136-9a4a-3b948c2293fc-webhook-certs\") on node \"crc\" DevicePath \"\"" Feb 26 
15:44:18 crc kubenswrapper[4907]: I0226 15:44:18.312821 4907 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-v47cf\" (UniqueName: \"kubernetes.io/projected/c03ee662-fb2f-4fc4-a2c1-af487c19d254-kube-api-access-v47cf\") on node \"crc\" DevicePath \"\"" Feb 26 15:44:18 crc kubenswrapper[4907]: I0226 15:44:18.312833 4907 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-s4n52\" (UniqueName: \"kubernetes.io/projected/925f1c65-6136-48ba-85aa-3a3b50560753-kube-api-access-s4n52\") on node \"crc\" DevicePath \"\"" Feb 26 15:44:18 crc kubenswrapper[4907]: I0226 15:44:18.312845 4907 reconciler_common.go:293] "Volume detached for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-client\") on node \"crc\" DevicePath \"\"" Feb 26 15:44:18 crc kubenswrapper[4907]: I0226 15:44:18.312857 4907 reconciler_common.go:293] "Volume detached for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-srv-cert\") on node \"crc\" DevicePath \"\"" Feb 26 15:44:18 crc kubenswrapper[4907]: I0226 15:44:18.312869 4907 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-279lb\" (UniqueName: \"kubernetes.io/projected/7bb08738-c794-4ee8-9972-3a62ca171029-kube-api-access-279lb\") on node \"crc\" DevicePath \"\"" Feb 26 15:44:18 crc kubenswrapper[4907]: I0226 15:44:18.312881 4907 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-config\") on node \"crc\" DevicePath \"\"" Feb 26 15:44:18 crc kubenswrapper[4907]: I0226 15:44:18.312892 4907 reconciler_common.go:293] "Volume detached for volume \"etcd-ca\" (UniqueName: \"kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-ca\") on node \"crc\" DevicePath \"\"" Feb 26 15:44:18 crc kubenswrapper[4907]: I0226 15:44:18.312905 4907 reconciler_common.go:293] "Volume detached for volume \"webhook-cert\" 
(UniqueName: \"kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-webhook-cert\") on node \"crc\" DevicePath \"\"" Feb 26 15:44:18 crc kubenswrapper[4907]: I0226 15:44:18.312917 4907 reconciler_common.go:293] "Volume detached for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-samples-operator-tls\") on node \"crc\" DevicePath \"\"" Feb 26 15:44:18 crc kubenswrapper[4907]: I0226 15:44:18.312930 4907 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vt5rc\" (UniqueName: \"kubernetes.io/projected/44663579-783b-4372-86d6-acf235a62d72-kube-api-access-vt5rc\") on node \"crc\" DevicePath \"\"" Feb 26 15:44:18 crc kubenswrapper[4907]: I0226 15:44:18.312942 4907 reconciler_common.go:293] "Volume detached for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-key\") on node \"crc\" DevicePath \"\"" Feb 26 15:44:18 crc kubenswrapper[4907]: I0226 15:44:18.312954 4907 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-serving-cert\") on node \"crc\" DevicePath \"\"" Feb 26 15:44:18 crc kubenswrapper[4907]: I0226 15:44:18.312967 4907 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/5441d097-087c-4d9a-baa8-b210afa90fc9-serving-cert\") on node \"crc\" DevicePath \"\"" Feb 26 15:44:18 crc kubenswrapper[4907]: I0226 15:44:18.312979 4907 reconciler_common.go:293] "Volume detached for volume \"cert\" (UniqueName: \"kubernetes.io/secret/20b0d48f-5fd6-431c-a545-e3c800c7b866-cert\") on node \"crc\" DevicePath \"\"" Feb 26 15:44:18 crc kubenswrapper[4907]: I0226 15:44:18.312991 4907 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/6509e943-70c6-444c-bc41-48a544e36fbd-serving-cert\") on node \"crc\" DevicePath \"\"" Feb 26 15:44:18 crc 
kubenswrapper[4907]: I0226 15:44:18.313003 4907 reconciler_common.go:293] "Volume detached for volume \"mcd-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/fda69060-fa79-4696-b1a6-7980f124bf7c-mcd-auth-proxy-config\") on node \"crc\" DevicePath \"\"" Feb 26 15:44:18 crc kubenswrapper[4907]: I0226 15:44:18.313015 4907 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-kfwg7\" (UniqueName: \"kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-kube-api-access-kfwg7\") on node \"crc\" DevicePath \"\"" Feb 26 15:44:18 crc kubenswrapper[4907]: I0226 15:44:18.313029 4907 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/0b78653f-4ff9-4508-8672-245ed9b561e3-kube-api-access\") on node \"crc\" DevicePath \"\"" Feb 26 15:44:18 crc kubenswrapper[4907]: I0226 15:44:18.313045 4907 reconciler_common.go:293] "Volume detached for volume \"images\" (UniqueName: \"kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-images\") on node \"crc\" DevicePath \"\"" Feb 26 15:44:18 crc kubenswrapper[4907]: I0226 15:44:18.313057 4907 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fqsjt\" (UniqueName: \"kubernetes.io/projected/efdd0498-1daa-4136-9a4a-3b948c2293fc-kube-api-access-fqsjt\") on node \"crc\" DevicePath \"\"" Feb 26 15:44:18 crc kubenswrapper[4907]: I0226 15:44:18.313070 4907 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-client-ca\") on node \"crc\" DevicePath \"\"" Feb 26 15:44:18 crc kubenswrapper[4907]: I0226 15:44:18.313083 4907 reconciler_common.go:293] "Volume detached for volume \"machine-approver-tls\" (UniqueName: \"kubernetes.io/secret/22c825df-677d-4ca6-82db-3454ed06e783-machine-approver-tls\") on node \"crc\" DevicePath \"\"" Feb 26 15:44:18 crc kubenswrapper[4907]: I0226 15:44:18.313096 4907 reconciler_common.go:293] "Volume detached for 
volume \"kube-api-access-w9rds\" (UniqueName: \"kubernetes.io/projected/20b0d48f-5fd6-431c-a545-e3c800c7b866-kube-api-access-w9rds\") on node \"crc\" DevicePath \"\"" Feb 26 15:44:18 crc kubenswrapper[4907]: I0226 15:44:18.313107 4907 reconciler_common.go:293] "Volume detached for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-service-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 26 15:44:18 crc kubenswrapper[4907]: I0226 15:44:18.313121 4907 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-config\") on node \"crc\" DevicePath \"\"" Feb 26 15:44:18 crc kubenswrapper[4907]: I0226 15:44:18.313133 4907 reconciler_common.go:293] "Volume detached for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-oauth-config\") on node \"crc\" DevicePath \"\"" Feb 26 15:44:18 crc kubenswrapper[4907]: I0226 15:44:18.313145 4907 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pcxfs\" (UniqueName: \"kubernetes.io/projected/9d4552c7-cd75-42dd-8880-30dd377c49a4-kube-api-access-pcxfs\") on node \"crc\" DevicePath \"\"" Feb 26 15:44:18 crc kubenswrapper[4907]: I0226 15:44:18.313158 4907 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-lzf88\" (UniqueName: \"kubernetes.io/projected/0b574797-001e-440a-8f4e-c0be86edad0f-kube-api-access-lzf88\") on node \"crc\" DevicePath \"\"" Feb 26 15:44:18 crc kubenswrapper[4907]: I0226 15:44:18.313170 4907 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-utilities\") on node \"crc\" DevicePath \"\"" Feb 26 15:44:18 crc kubenswrapper[4907]: I0226 15:44:18.313182 4907 reconciler_common.go:293] "Volume detached for volume \"ovnkube-config\" (UniqueName: 
\"kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-ovnkube-config\") on node \"crc\" DevicePath \"\"" Feb 26 15:44:18 crc kubenswrapper[4907]: I0226 15:44:18.313195 4907 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zgdk5\" (UniqueName: \"kubernetes.io/projected/31d8b7a1-420e-4252-a5b7-eebe8a111292-kube-api-access-zgdk5\") on node \"crc\" DevicePath \"\"" Feb 26 15:44:18 crc kubenswrapper[4907]: I0226 15:44:18.313207 4907 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4d4hj\" (UniqueName: \"kubernetes.io/projected/3ab1a177-2de0-46d9-b765-d0d0649bb42e-kube-api-access-4d4hj\") on node \"crc\" DevicePath \"\"" Feb 26 15:44:18 crc kubenswrapper[4907]: I0226 15:44:18.313219 4907 reconciler_common.go:293] "Volume detached for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/5b88f790-22fa-440e-b583-365168c0b23d-metrics-certs\") on node \"crc\" DevicePath \"\"" Feb 26 15:44:18 crc kubenswrapper[4907]: I0226 15:44:18.313232 4907 reconciler_common.go:293] "Volume detached for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-default-certificate\") on node \"crc\" DevicePath \"\"" Feb 26 15:44:18 crc kubenswrapper[4907]: I0226 15:44:18.313243 4907 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/496e6271-fb68-4057-954e-a0d97a4afa3f-config\") on node \"crc\" DevicePath \"\"" Feb 26 15:44:18 crc kubenswrapper[4907]: I0226 15:44:18.313255 4907 reconciler_common.go:293] "Volume detached for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-cabundle\") on node \"crc\" DevicePath \"\"" Feb 26 15:44:18 crc kubenswrapper[4907]: I0226 15:44:18.313268 4907 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-dbsvg\" (UniqueName: \"kubernetes.io/projected/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-kube-api-access-dbsvg\") on node 
\"crc\" DevicePath \"\"" Feb 26 15:44:18 crc kubenswrapper[4907]: I0226 15:44:18.313280 4907 reconciler_common.go:293] "Volume detached for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/0b78653f-4ff9-4508-8672-245ed9b561e3-service-ca\") on node \"crc\" DevicePath \"\"" Feb 26 15:44:18 crc kubenswrapper[4907]: I0226 15:44:18.313292 4907 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-serving-cert\") on node \"crc\" DevicePath \"\"" Feb 26 15:44:18 crc kubenswrapper[4907]: I0226 15:44:18.313304 4907 reconciler_common.go:293] "Volume detached for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-sysctl-allowlist\") on node \"crc\" DevicePath \"\"" Feb 26 15:44:18 crc kubenswrapper[4907]: I0226 15:44:18.313316 4907 reconciler_common.go:293] "Volume detached for volume \"certs\" (UniqueName: \"kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-certs\") on node \"crc\" DevicePath \"\"" Feb 26 15:44:18 crc kubenswrapper[4907]: I0226 15:44:18.313330 4907 reconciler_common.go:293] "Volume detached for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-registry-tls\") on node \"crc\" DevicePath \"\"" Feb 26 15:44:18 crc kubenswrapper[4907]: I0226 15:44:18.313342 4907 reconciler_common.go:293] "Volume detached for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-proxy-ca-bundles\") on node \"crc\" DevicePath \"\"" Feb 26 15:44:18 crc kubenswrapper[4907]: I0226 15:44:18.313356 4907 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-wxkg8\" (UniqueName: \"kubernetes.io/projected/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-kube-api-access-wxkg8\") on node \"crc\" DevicePath \"\"" Feb 26 15:44:18 crc kubenswrapper[4907]: I0226 15:44:18.313368 4907 reconciler_common.go:293] "Volume 
detached for volume \"kube-api-access-zkvpv\" (UniqueName: \"kubernetes.io/projected/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-kube-api-access-zkvpv\") on node \"crc\" DevicePath \"\"" Feb 26 15:44:18 crc kubenswrapper[4907]: I0226 15:44:18.313380 4907 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pj782\" (UniqueName: \"kubernetes.io/projected/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-kube-api-access-pj782\") on node \"crc\" DevicePath \"\"" Feb 26 15:44:18 crc kubenswrapper[4907]: I0226 15:44:18.313392 4907 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-utilities\") on node \"crc\" DevicePath \"\"" Feb 26 15:44:18 crc kubenswrapper[4907]: I0226 15:44:18.313404 4907 reconciler_common.go:293] "Volume detached for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-client\") on node \"crc\" DevicePath \"\"" Feb 26 15:44:18 crc kubenswrapper[4907]: I0226 15:44:18.313416 4907 reconciler_common.go:293] "Volume detached for volume \"image-registry-operator-tls\" (UniqueName: \"kubernetes.io/secret/bf126b07-da06-4140-9a57-dfd54fc6b486-image-registry-operator-tls\") on node \"crc\" DevicePath \"\"" Feb 26 15:44:18 crc kubenswrapper[4907]: I0226 15:44:18.313428 4907 reconciler_common.go:293] "Volume detached for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-console-config\") on node \"crc\" DevicePath \"\"" Feb 26 15:44:18 crc kubenswrapper[4907]: I0226 15:44:18.313440 4907 reconciler_common.go:293] "Volume detached for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-bound-sa-token\") on node \"crc\" DevicePath \"\"" Feb 26 15:44:18 crc kubenswrapper[4907]: I0226 15:44:18.313453 4907 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fcqwp\" (UniqueName: 
\"kubernetes.io/projected/5fe579f8-e8a6-4643-bce5-a661393c4dde-kube-api-access-fcqwp\") on node \"crc\" DevicePath \"\"" Feb 26 15:44:18 crc kubenswrapper[4907]: I0226 15:44:18.313465 4907 reconciler_common.go:293] "Volume detached for volume \"package-server-manager-serving-cert\" (UniqueName: \"kubernetes.io/secret/3ab1a177-2de0-46d9-b765-d0d0649bb42e-package-server-manager-serving-cert\") on node \"crc\" DevicePath \"\"" Feb 26 15:44:18 crc kubenswrapper[4907]: I0226 15:44:18.313477 4907 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7539238d-5fe0-46ed-884e-1c3b566537ec-config\") on node \"crc\" DevicePath \"\"" Feb 26 15:44:18 crc kubenswrapper[4907]: I0226 15:44:18.313490 4907 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-w4xd4\" (UniqueName: \"kubernetes.io/projected/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-kube-api-access-w4xd4\") on node \"crc\" DevicePath \"\"" Feb 26 15:44:18 crc kubenswrapper[4907]: I0226 15:44:18.313503 4907 reconciler_common.go:293] "Volume detached for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-multus-daemon-config\") on node \"crc\" DevicePath \"\"" Feb 26 15:44:18 crc kubenswrapper[4907]: I0226 15:44:18.313515 4907 reconciler_common.go:293] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/31d8b7a1-420e-4252-a5b7-eebe8a111292-proxy-tls\") on node \"crc\" DevicePath \"\"" Feb 26 15:44:18 crc kubenswrapper[4907]: I0226 15:44:18.313527 4907 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/e7e6199b-1264-4501-8953-767f51328d08-kube-api-access\") on node \"crc\" DevicePath \"\"" Feb 26 15:44:18 crc kubenswrapper[4907]: I0226 15:44:18.313539 4907 reconciler_common.go:293] "Volume detached for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-srv-cert\") on node 
\"crc\" DevicePath \"\"" Feb 26 15:44:18 crc kubenswrapper[4907]: I0226 15:44:18.313552 4907 reconciler_common.go:293] "Volume detached for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-serving-cert\") on node \"crc\" DevicePath \"\"" Feb 26 15:44:18 crc kubenswrapper[4907]: I0226 15:44:18.313564 4907 reconciler_common.go:293] "Volume detached for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-apiservice-cert\") on node \"crc\" DevicePath \"\"" Feb 26 15:44:18 crc kubenswrapper[4907]: I0226 15:44:18.313575 4907 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/496e6271-fb68-4057-954e-a0d97a4afa3f-kube-api-access\") on node \"crc\" DevicePath \"\"" Feb 26 15:44:18 crc kubenswrapper[4907]: I0226 15:44:18.313627 4907 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-serving-cert\") on node \"crc\" DevicePath \"\"" Feb 26 15:44:18 crc kubenswrapper[4907]: I0226 15:44:18.313640 4907 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/9d4552c7-cd75-42dd-8880-30dd377c49a4-serving-cert\") on node \"crc\" DevicePath \"\"" Feb 26 15:44:18 crc kubenswrapper[4907]: I0226 15:44:18.315851 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4bb40260-dbaa-4fb0-84df-5e680505d512-kube-api-access-2w9zh" (OuterVolumeSpecName: "kube-api-access-2w9zh") pod "4bb40260-dbaa-4fb0-84df-5e680505d512" (UID: "4bb40260-dbaa-4fb0-84df-5e680505d512"). InnerVolumeSpecName "kube-api-access-2w9zh". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 26 15:44:18 crc kubenswrapper[4907]: I0226 15:44:18.316397 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/87cf06ed-a83f-41a7-828d-70653580a8cb-config-volume" (OuterVolumeSpecName: "config-volume") pod "87cf06ed-a83f-41a7-828d-70653580a8cb" (UID: "87cf06ed-a83f-41a7-828d-70653580a8cb"). InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 26 15:44:18 crc kubenswrapper[4907]: I0226 15:44:18.316760 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-config" (OuterVolumeSpecName: "config") pod "9d4552c7-cd75-42dd-8880-30dd377c49a4" (UID: "9d4552c7-cd75-42dd-8880-30dd377c49a4"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 26 15:44:18 crc kubenswrapper[4907]: E0226 15:44:18.317664 4907 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Feb 26 15:44:18 crc kubenswrapper[4907]: E0226 15:44:18.317691 4907 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Feb 26 15:44:18 crc kubenswrapper[4907]: E0226 15:44:18.317704 4907 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Feb 26 15:44:18 crc kubenswrapper[4907]: E0226 15:44:18.317766 4907 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl 
podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2026-02-26 15:44:18.817748044 +0000 UTC m=+121.336309963 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Feb 26 15:44:18 crc kubenswrapper[4907]: I0226 15:44:18.317841 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-audit" (OuterVolumeSpecName: "audit") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "audit". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 26 15:44:18 crc kubenswrapper[4907]: I0226 15:44:18.318282 4907 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-b2qgz" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3ab23cfe-46ea-420e-ba6c-38ac0d2804b0\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T15:44:18Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T15:44:18Z\\\",\\\"message\\\":\\\"containers with incomplete status: [egress-router-binary-copy cni-plugins bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy 
whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T15:44:18Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T15:44:18Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2vfj6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\
\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2vfj6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2vfj6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\
":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2vfj6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2vfj6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2vfj6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v
4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2vfj6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-26T15:44:18Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-b2qgz\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 26 15:44:18 crc kubenswrapper[4907]: I0226 15:44:18.319355 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-kube-api-access-x4zgh" (OuterVolumeSpecName: "kube-api-access-x4zgh") pod "b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" (UID: "b11524ee-3fca-4b1b-9cdf-6da289fdbc7d"). InnerVolumeSpecName "kube-api-access-x4zgh". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 26 15:44:18 crc kubenswrapper[4907]: I0226 15:44:18.319940 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/87cf06ed-a83f-41a7-828d-70653580a8cb-kube-api-access-d6qdx" (OuterVolumeSpecName: "kube-api-access-d6qdx") pod "87cf06ed-a83f-41a7-828d-70653580a8cb" (UID: "87cf06ed-a83f-41a7-828d-70653580a8cb"). InnerVolumeSpecName "kube-api-access-d6qdx". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 26 15:44:18 crc kubenswrapper[4907]: I0226 15:44:18.320228 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-idp-0-file-data" (OuterVolumeSpecName: "v4-0-config-user-idp-0-file-data") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-user-idp-0-file-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 26 15:44:18 crc kubenswrapper[4907]: E0226 15:44:18.320791 4907 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Feb 26 15:44:18 crc kubenswrapper[4907]: E0226 15:44:18.320864 4907 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Feb 26 15:44:18 crc kubenswrapper[4907]: E0226 15:44:18.320887 4907 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Feb 26 15:44:18 crc kubenswrapper[4907]: E0226 15:44:18.321125 4907 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2026-02-26 15:44:18.821107966 +0000 UTC m=+121.339669825 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Feb 26 15:44:18 crc kubenswrapper[4907]: I0226 15:44:18.321416 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-serving-ca" (OuterVolumeSpecName: "etcd-serving-ca") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "etcd-serving-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 26 15:44:18 crc kubenswrapper[4907]: I0226 15:44:18.321480 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-config" (OuterVolumeSpecName: "ovnkube-config") pod "6ea678ab-3438-413e-bfe3-290ae7725660" (UID: "6ea678ab-3438-413e-bfe3-290ae7725660"). InnerVolumeSpecName "ovnkube-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 26 15:44:18 crc kubenswrapper[4907]: I0226 15:44:18.321976 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/87cf06ed-a83f-41a7-828d-70653580a8cb-metrics-tls" (OuterVolumeSpecName: "metrics-tls") pod "87cf06ed-a83f-41a7-828d-70653580a8cb" (UID: "87cf06ed-a83f-41a7-828d-70653580a8cb"). InnerVolumeSpecName "metrics-tls". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 26 15:44:18 crc kubenswrapper[4907]: I0226 15:44:18.322042 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-registry-certificates" (OuterVolumeSpecName: "registry-certificates") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "registry-certificates". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 26 15:44:18 crc kubenswrapper[4907]: I0226 15:44:18.328206 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-login" (OuterVolumeSpecName: "v4-0-config-user-template-login") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-user-template-login". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 26 15:44:18 crc kubenswrapper[4907]: I0226 15:44:18.328540 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-kube-api-access-ngvvp" (OuterVolumeSpecName: "kube-api-access-ngvvp") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "kube-api-access-ngvvp". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 26 15:44:18 crc kubenswrapper[4907]: I0226 15:44:18.328637 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-env-overrides" (OuterVolumeSpecName: "env-overrides") pod "6ea678ab-3438-413e-bfe3-290ae7725660" (UID: "6ea678ab-3438-413e-bfe3-290ae7725660"). InnerVolumeSpecName "env-overrides". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 26 15:44:18 crc kubenswrapper[4907]: I0226 15:44:18.328924 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-service-ca" (OuterVolumeSpecName: "v4-0-config-system-service-ca") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-service-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 26 15:44:18 crc kubenswrapper[4907]: I0226 15:44:18.329047 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-ocp-branding-template" (OuterVolumeSpecName: "v4-0-config-system-ocp-branding-template") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-ocp-branding-template". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 26 15:44:18 crc kubenswrapper[4907]: I0226 15:44:18.329635 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "1d611f23-29be-4491-8495-bee1670e935f" (UID: "1d611f23-29be-4491-8495-bee1670e935f"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 26 15:44:18 crc kubenswrapper[4907]: I0226 15:44:18.329864 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rczfb\" (UniqueName: \"kubernetes.io/projected/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-kube-api-access-rczfb\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h" Feb 26 15:44:18 crc kubenswrapper[4907]: I0226 15:44:18.333065 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rdwmf\" (UniqueName: \"kubernetes.io/projected/37a5e44f-9a88-4405-be8a-b645485e7312-kube-api-access-rdwmf\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Feb 26 15:44:18 crc kubenswrapper[4907]: I0226 15:44:18.334066 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/ef543e1b-8068-4ea3-b32a-61027b32e95d-webhook-cert\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Feb 26 15:44:18 crc kubenswrapper[4907]: I0226 15:44:18.336106 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-s2kz5\" (UniqueName: \"kubernetes.io/projected/ef543e1b-8068-4ea3-b32a-61027b32e95d-kube-api-access-s2kz5\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Feb 26 15:44:18 crc kubenswrapper[4907]: I0226 15:44:18.336444 4907 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-26T15:44:18Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-26T15:44:18Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 26 15:44:18 crc kubenswrapper[4907]: I0226 15:44:18.336783 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/37a5e44f-9a88-4405-be8a-b645485e7312-metrics-tls\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Feb 26 15:44:18 crc kubenswrapper[4907]: I0226 15:44:18.342625 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-audit-policies" (OuterVolumeSpecName: "audit-policies") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "audit-policies". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 26 15:44:18 crc kubenswrapper[4907]: I0226 15:44:18.345813 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6ea678ab-3438-413e-bfe3-290ae7725660-ovn-node-metrics-cert" (OuterVolumeSpecName: "ovn-node-metrics-cert") pod "6ea678ab-3438-413e-bfe3-290ae7725660" (UID: "6ea678ab-3438-413e-bfe3-290ae7725660"). InnerVolumeSpecName "ovn-node-metrics-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 26 15:44:18 crc kubenswrapper[4907]: I0226 15:44:18.346770 4907 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-b2qgz" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3ab23cfe-46ea-420e-ba6c-38ac0d2804b0\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T15:44:18Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T15:44:18Z\\\",\\\"message\\\":\\\"containers with incomplete status: [egress-router-binary-copy cni-plugins bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T15:44:18Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T15:44:18Z\\\",\\\"message\\\":\\\"containers with unready 
status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2vfj6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2vfj6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"r
eady\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2vfj6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2vfj6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reas
on\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2vfj6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2vfj6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kuberne
tes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2vfj6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-26T15:44:18Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-b2qgz\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 26 15:44:18 crc kubenswrapper[4907]: I0226 15:44:18.348005 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "57a731c4-ef35-47a8-b875-bfb08a7f8011" (UID: "57a731c4-ef35-47a8-b875-bfb08a7f8011"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 26 15:44:18 crc kubenswrapper[4907]: I0226 15:44:18.353942 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" (UID: "b11524ee-3fca-4b1b-9cdf-6da289fdbc7d"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 26 15:44:18 crc kubenswrapper[4907]: I0226 15:44:18.354453 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/8f668bae-612b-4b75-9490-919e737c6a3b-ca-trust-extracted" (OuterVolumeSpecName: "ca-trust-extracted") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "ca-trust-extracted". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 26 15:44:18 crc kubenswrapper[4907]: I0226 15:44:18.354794 4907 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-v5ng6" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"917eebf3-db36-47b8-af0a-b80d042fddab\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T15:44:18Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T15:44:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T15:44:18Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T15:44:18Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9lmpq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9lmpq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-26T15:44:18Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-v5ng6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post 
\"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 26 15:44:18 crc kubenswrapper[4907]: I0226 15:44:18.361874 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "5225d0e4-402f-4861-b410-819f433b1803" (UID: "5225d0e4-402f-4861-b410-819f433b1803"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 26 15:44:18 crc kubenswrapper[4907]: I0226 15:44:18.364511 4907 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-26T15:44:18Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-26T15:44:18Z\\\",\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":false,\\\"restartCount\\\":6,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: 
connect: connection refused" Feb 26 15:44:18 crc kubenswrapper[4907]: I0226 15:44:18.375230 4907 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"27c9ab80-fcc8-4c5a-9d89-c0504e0e6396\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T15:42:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T15:42:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T15:42:18Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T15:42:18Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T15:42:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bbc5e8c015ccc6b1a4740c955375e4f995f69ff1f1f698d8e2660ef451da6b8c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T15:42:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://64e8ac34f3cae799ba04d2bba51c22e4d99cf03261778fe3ba7a2320e661e727\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T15:42:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://42e24dea757f775f836c5c1fdb77c920db85f523bc0a35d2f2fb22e766274556\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T15:42:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a3c61b08bda7c918a3fa7b01e6f80515ee05a5746e189e829d2872c181b80c85\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a3c61b08bda7c918a3fa7b01e6f80515ee05a5746e189e829d2872c181b80c85\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-02-26T15:44:12Z\\\",\\\"message\\\":\\\"le observer\\\\nW0226 15:44:11.651017 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0226 15:44:11.651151 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0226 15:44:11.653054 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-1720683088/tls.crt::/tmp/serving-cert-1720683088/tls.key\\\\\\\"\\\\nI0226 15:44:12.242500 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0226 15:44:12.245173 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0226 15:44:12.245192 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0226 15:44:12.245214 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0226 15:44:12.245219 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0226 15:44:12.248257 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0226 15:44:12.248276 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0226 15:44:12.248281 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0226 15:44:12.248286 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0226 15:44:12.248289 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0226 15:44:12.248292 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0226 15:44:12.248295 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0226 15:44:12.248403 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF0226 15:44:12.250972 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-26T15:44:11Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":4,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 1m20s restarting failed 
container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8cf7bf0e49be4282c641d1e48be50a327bb418475701bfde61f4249724709e11\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T15:42:21Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7ff4ef3cac1d6f77bf9c90ee9a0f1d8fca15084e93afdb4e4e0048cbfe904f19\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7ff4ef3cac1d6f77bf9c90ee9a0f1d8fca15084e93afdb4e4e0048cbfe904f19\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-26T15:42:19Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-26T15:42:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"n
ame\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-26T15:42:18Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 26 15:44:18 crc kubenswrapper[4907]: I0226 15:44:18.384399 4907 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-26T15:44:18Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-26T15:44:18Z\\\",\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 26 15:44:18 crc kubenswrapper[4907]: I0226 15:44:18.391400 4907 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-9gtgp" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ae882fbf-ac76-4363-a10c-60eaf80ee7c7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T15:44:18Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T15:44:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T15:44:18Z\\\",\\\"message\\\":\\\"containers with 
unready status: [node-ca]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T15:44:18Z\\\",\\\"message\\\":\\\"containers with unready status: [node-ca]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xl77m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-26T15:44:18Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-9gtgp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 26 15:44:18 crc kubenswrapper[4907]: I0226 15:44:18.401334 4907 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-2gl5t" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"51024bd5-00ff-4e2f-927c-8c989b59d7be\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T15:44:18Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T15:44:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T15:44:18Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T15:44:18Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/
kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2fx5n\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-26T15:44:18Z\\\"}}\" for pod \"openshift-multus\"/\"multus-2gl5t\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 26 15:44:18 crc kubenswrapper[4907]: I0226 15:44:18.414015 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/51024bd5-00ff-4e2f-927c-8c989b59d7be-multus-cni-dir\") pod \"multus-2gl5t\" (UID: \"51024bd5-00ff-4e2f-927c-8c989b59d7be\") " pod="openshift-multus/multus-2gl5t" Feb 26 15:44:18 crc kubenswrapper[4907]: I0226 15:44:18.414079 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/51024bd5-00ff-4e2f-927c-8c989b59d7be-multus-daemon-config\") pod \"multus-2gl5t\" (UID: \"51024bd5-00ff-4e2f-927c-8c989b59d7be\") " pod="openshift-multus/multus-2gl5t" Feb 26 15:44:18 crc kubenswrapper[4907]: I0226 15:44:18.414112 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/51024bd5-00ff-4e2f-927c-8c989b59d7be-system-cni-dir\") pod \"multus-2gl5t\" (UID: \"51024bd5-00ff-4e2f-927c-8c989b59d7be\") " pod="openshift-multus/multus-2gl5t" Feb 26 15:44:18 crc kubenswrapper[4907]: I0226 15:44:18.414143 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-multus\" (UniqueName: 
\"kubernetes.io/host-path/51024bd5-00ff-4e2f-927c-8c989b59d7be-host-var-lib-cni-multus\") pod \"multus-2gl5t\" (UID: \"51024bd5-00ff-4e2f-927c-8c989b59d7be\") " pod="openshift-multus/multus-2gl5t" Feb 26 15:44:18 crc kubenswrapper[4907]: I0226 15:44:18.414173 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/51024bd5-00ff-4e2f-927c-8c989b59d7be-multus-conf-dir\") pod \"multus-2gl5t\" (UID: \"51024bd5-00ff-4e2f-927c-8c989b59d7be\") " pod="openshift-multus/multus-2gl5t" Feb 26 15:44:18 crc kubenswrapper[4907]: I0226 15:44:18.414203 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/ae882fbf-ac76-4363-a10c-60eaf80ee7c7-serviceca\") pod \"node-ca-9gtgp\" (UID: \"ae882fbf-ac76-4363-a10c-60eaf80ee7c7\") " pod="openshift-image-registry/node-ca-9gtgp" Feb 26 15:44:18 crc kubenswrapper[4907]: I0226 15:44:18.414234 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-host-slash\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h" Feb 26 15:44:18 crc kubenswrapper[4907]: I0226 15:44:18.414262 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/4569fec7-a859-4a9e-b9d9-34ccc7c6be9c-hosts-file\") pod \"node-resolver-958vt\" (UID: \"4569fec7-a859-4a9e-b9d9-34ccc7c6be9c\") " pod="openshift-dns/node-resolver-958vt" Feb 26 15:44:18 crc kubenswrapper[4907]: I0226 15:44:18.414292 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8nhj9\" (UniqueName: \"kubernetes.io/projected/4569fec7-a859-4a9e-b9d9-34ccc7c6be9c-kube-api-access-8nhj9\") pod \"node-resolver-958vt\" (UID: 
\"4569fec7-a859-4a9e-b9d9-34ccc7c6be9c\") " pod="openshift-dns/node-resolver-958vt" Feb 26 15:44:18 crc kubenswrapper[4907]: I0226 15:44:18.414320 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2fx5n\" (UniqueName: \"kubernetes.io/projected/51024bd5-00ff-4e2f-927c-8c989b59d7be-kube-api-access-2fx5n\") pod \"multus-2gl5t\" (UID: \"51024bd5-00ff-4e2f-927c-8c989b59d7be\") " pod="openshift-multus/multus-2gl5t" Feb 26 15:44:18 crc kubenswrapper[4907]: I0226 15:44:18.414350 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/51024bd5-00ff-4e2f-927c-8c989b59d7be-os-release\") pod \"multus-2gl5t\" (UID: \"51024bd5-00ff-4e2f-927c-8c989b59d7be\") " pod="openshift-multus/multus-2gl5t" Feb 26 15:44:18 crc kubenswrapper[4907]: I0226 15:44:18.414379 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-etc-kube\" (UniqueName: \"kubernetes.io/host-path/37a5e44f-9a88-4405-be8a-b645485e7312-host-etc-kube\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Feb 26 15:44:18 crc kubenswrapper[4907]: I0226 15:44:18.414411 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/49ee65e1-8667-4ad7-a403-c899f0cc6a70-ovn-node-metrics-cert\") pod \"ovnkube-node-vsvsw\" (UID: \"49ee65e1-8667-4ad7-a403-c899f0cc6a70\") " pod="openshift-ovn-kubernetes/ovnkube-node-vsvsw" Feb 26 15:44:18 crc kubenswrapper[4907]: I0226 15:44:18.414425 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/51024bd5-00ff-4e2f-927c-8c989b59d7be-multus-cni-dir\") pod \"multus-2gl5t\" (UID: \"51024bd5-00ff-4e2f-927c-8c989b59d7be\") " pod="openshift-multus/multus-2gl5t" Feb 26 15:44:18 
crc kubenswrapper[4907]: I0226 15:44:18.414445 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/917eebf3-db36-47b8-af0a-b80d042fddab-proxy-tls\") pod \"machine-config-daemon-v5ng6\" (UID: \"917eebf3-db36-47b8-af0a-b80d042fddab\") " pod="openshift-machine-config-operator/machine-config-daemon-v5ng6" Feb 26 15:44:18 crc kubenswrapper[4907]: I0226 15:44:18.414512 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/51024bd5-00ff-4e2f-927c-8c989b59d7be-host-var-lib-cni-bin\") pod \"multus-2gl5t\" (UID: \"51024bd5-00ff-4e2f-927c-8c989b59d7be\") " pod="openshift-multus/multus-2gl5t" Feb 26 15:44:18 crc kubenswrapper[4907]: I0226 15:44:18.414537 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/51024bd5-00ff-4e2f-927c-8c989b59d7be-host-run-multus-certs\") pod \"multus-2gl5t\" (UID: \"51024bd5-00ff-4e2f-927c-8c989b59d7be\") " pod="openshift-multus/multus-2gl5t" Feb 26 15:44:18 crc kubenswrapper[4907]: I0226 15:44:18.414563 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"mcd-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/917eebf3-db36-47b8-af0a-b80d042fddab-mcd-auth-proxy-config\") pod \"machine-config-daemon-v5ng6\" (UID: \"917eebf3-db36-47b8-af0a-b80d042fddab\") " pod="openshift-machine-config-operator/machine-config-daemon-v5ng6" Feb 26 15:44:18 crc kubenswrapper[4907]: I0226 15:44:18.414609 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/3ab23cfe-46ea-420e-ba6c-38ac0d2804b0-cnibin\") pod \"multus-additional-cni-plugins-b2qgz\" (UID: \"3ab23cfe-46ea-420e-ba6c-38ac0d2804b0\") " pod="openshift-multus/multus-additional-cni-plugins-b2qgz" Feb 26 15:44:18 crc kubenswrapper[4907]: 
I0226 15:44:18.414640 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/49ee65e1-8667-4ad7-a403-c899f0cc6a70-systemd-units\") pod \"ovnkube-node-vsvsw\" (UID: \"49ee65e1-8667-4ad7-a403-c899f0cc6a70\") " pod="openshift-ovn-kubernetes/ovnkube-node-vsvsw" Feb 26 15:44:18 crc kubenswrapper[4907]: I0226 15:44:18.414663 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/49ee65e1-8667-4ad7-a403-c899f0cc6a70-var-lib-openvswitch\") pod \"ovnkube-node-vsvsw\" (UID: \"49ee65e1-8667-4ad7-a403-c899f0cc6a70\") " pod="openshift-ovn-kubernetes/ovnkube-node-vsvsw" Feb 26 15:44:18 crc kubenswrapper[4907]: I0226 15:44:18.414684 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/51024bd5-00ff-4e2f-927c-8c989b59d7be-host-run-netns\") pod \"multus-2gl5t\" (UID: \"51024bd5-00ff-4e2f-927c-8c989b59d7be\") " pod="openshift-multus/multus-2gl5t" Feb 26 15:44:18 crc kubenswrapper[4907]: I0226 15:44:18.414707 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/51024bd5-00ff-4e2f-927c-8c989b59d7be-host-var-lib-kubelet\") pod \"multus-2gl5t\" (UID: \"51024bd5-00ff-4e2f-927c-8c989b59d7be\") " pod="openshift-multus/multus-2gl5t" Feb 26 15:44:18 crc kubenswrapper[4907]: I0226 15:44:18.414711 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/51024bd5-00ff-4e2f-927c-8c989b59d7be-host-run-multus-certs\") pod \"multus-2gl5t\" (UID: \"51024bd5-00ff-4e2f-927c-8c989b59d7be\") " pod="openshift-multus/multus-2gl5t" Feb 26 15:44:18 crc kubenswrapper[4907]: I0226 15:44:18.414731 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"kube-api-access-9lmpq\" (UniqueName: \"kubernetes.io/projected/917eebf3-db36-47b8-af0a-b80d042fddab-kube-api-access-9lmpq\") pod \"machine-config-daemon-v5ng6\" (UID: \"917eebf3-db36-47b8-af0a-b80d042fddab\") " pod="openshift-machine-config-operator/machine-config-daemon-v5ng6" Feb 26 15:44:18 crc kubenswrapper[4907]: I0226 15:44:18.414805 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/49ee65e1-8667-4ad7-a403-c899f0cc6a70-host-cni-netd\") pod \"ovnkube-node-vsvsw\" (UID: \"49ee65e1-8667-4ad7-a403-c899f0cc6a70\") " pod="openshift-ovn-kubernetes/ovnkube-node-vsvsw" Feb 26 15:44:18 crc kubenswrapper[4907]: I0226 15:44:18.414831 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/49ee65e1-8667-4ad7-a403-c899f0cc6a70-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-vsvsw\" (UID: \"49ee65e1-8667-4ad7-a403-c899f0cc6a70\") " pod="openshift-ovn-kubernetes/ovnkube-node-vsvsw" Feb 26 15:44:18 crc kubenswrapper[4907]: I0226 15:44:18.414855 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/51024bd5-00ff-4e2f-927c-8c989b59d7be-multus-socket-dir-parent\") pod \"multus-2gl5t\" (UID: \"51024bd5-00ff-4e2f-927c-8c989b59d7be\") " pod="openshift-multus/multus-2gl5t" Feb 26 15:44:18 crc kubenswrapper[4907]: I0226 15:44:18.414878 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rootfs\" (UniqueName: \"kubernetes.io/host-path/917eebf3-db36-47b8-af0a-b80d042fddab-rootfs\") pod \"machine-config-daemon-v5ng6\" (UID: \"917eebf3-db36-47b8-af0a-b80d042fddab\") " pod="openshift-machine-config-operator/machine-config-daemon-v5ng6" Feb 26 15:44:18 crc kubenswrapper[4907]: I0226 15:44:18.414901 4907 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/49ee65e1-8667-4ad7-a403-c899f0cc6a70-host-slash\") pod \"ovnkube-node-vsvsw\" (UID: \"49ee65e1-8667-4ad7-a403-c899f0cc6a70\") " pod="openshift-ovn-kubernetes/ovnkube-node-vsvsw" Feb 26 15:44:18 crc kubenswrapper[4907]: I0226 15:44:18.414921 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7hmb7\" (UniqueName: \"kubernetes.io/projected/49ee65e1-8667-4ad7-a403-c899f0cc6a70-kube-api-access-7hmb7\") pod \"ovnkube-node-vsvsw\" (UID: \"49ee65e1-8667-4ad7-a403-c899f0cc6a70\") " pod="openshift-ovn-kubernetes/ovnkube-node-vsvsw" Feb 26 15:44:18 crc kubenswrapper[4907]: I0226 15:44:18.414943 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/51024bd5-00ff-4e2f-927c-8c989b59d7be-cni-binary-copy\") pod \"multus-2gl5t\" (UID: \"51024bd5-00ff-4e2f-927c-8c989b59d7be\") " pod="openshift-multus/multus-2gl5t" Feb 26 15:44:18 crc kubenswrapper[4907]: I0226 15:44:18.414979 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/51024bd5-00ff-4e2f-927c-8c989b59d7be-host-run-k8s-cni-cncf-io\") pod \"multus-2gl5t\" (UID: \"51024bd5-00ff-4e2f-927c-8c989b59d7be\") " pod="openshift-multus/multus-2gl5t" Feb 26 15:44:18 crc kubenswrapper[4907]: I0226 15:44:18.415000 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/49ee65e1-8667-4ad7-a403-c899f0cc6a70-host-run-netns\") pod \"ovnkube-node-vsvsw\" (UID: \"49ee65e1-8667-4ad7-a403-c899f0cc6a70\") " pod="openshift-ovn-kubernetes/ovnkube-node-vsvsw" Feb 26 15:44:18 crc kubenswrapper[4907]: I0226 15:44:18.415024 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-openvswitch\" (UniqueName: 
\"kubernetes.io/host-path/49ee65e1-8667-4ad7-a403-c899f0cc6a70-run-openvswitch\") pod \"ovnkube-node-vsvsw\" (UID: \"49ee65e1-8667-4ad7-a403-c899f0cc6a70\") " pod="openshift-ovn-kubernetes/ovnkube-node-vsvsw" Feb 26 15:44:18 crc kubenswrapper[4907]: I0226 15:44:18.415053 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/51024bd5-00ff-4e2f-927c-8c989b59d7be-etc-kubernetes\") pod \"multus-2gl5t\" (UID: \"51024bd5-00ff-4e2f-927c-8c989b59d7be\") " pod="openshift-multus/multus-2gl5t" Feb 26 15:44:18 crc kubenswrapper[4907]: I0226 15:44:18.415060 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/51024bd5-00ff-4e2f-927c-8c989b59d7be-multus-daemon-config\") pod \"multus-2gl5t\" (UID: \"51024bd5-00ff-4e2f-927c-8c989b59d7be\") " pod="openshift-multus/multus-2gl5t" Feb 26 15:44:18 crc kubenswrapper[4907]: I0226 15:44:18.415104 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/3ab23cfe-46ea-420e-ba6c-38ac0d2804b0-system-cni-dir\") pod \"multus-additional-cni-plugins-b2qgz\" (UID: \"3ab23cfe-46ea-420e-ba6c-38ac0d2804b0\") " pod="openshift-multus/multus-additional-cni-plugins-b2qgz" Feb 26 15:44:18 crc kubenswrapper[4907]: I0226 15:44:18.415074 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/3ab23cfe-46ea-420e-ba6c-38ac0d2804b0-system-cni-dir\") pod \"multus-additional-cni-plugins-b2qgz\" (UID: \"3ab23cfe-46ea-420e-ba6c-38ac0d2804b0\") " pod="openshift-multus/multus-additional-cni-plugins-b2qgz" Feb 26 15:44:18 crc kubenswrapper[4907]: I0226 15:44:18.415137 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-bin\" (UniqueName: 
\"kubernetes.io/host-path/51024bd5-00ff-4e2f-927c-8c989b59d7be-host-var-lib-cni-bin\") pod \"multus-2gl5t\" (UID: \"51024bd5-00ff-4e2f-927c-8c989b59d7be\") " pod="openshift-multus/multus-2gl5t" Feb 26 15:44:18 crc kubenswrapper[4907]: I0226 15:44:18.415168 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/49ee65e1-8667-4ad7-a403-c899f0cc6a70-host-cni-netd\") pod \"ovnkube-node-vsvsw\" (UID: \"49ee65e1-8667-4ad7-a403-c899f0cc6a70\") " pod="openshift-ovn-kubernetes/ovnkube-node-vsvsw" Feb 26 15:44:18 crc kubenswrapper[4907]: I0226 15:44:18.415166 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/3ab23cfe-46ea-420e-ba6c-38ac0d2804b0-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-b2qgz\" (UID: \"3ab23cfe-46ea-420e-ba6c-38ac0d2804b0\") " pod="openshift-multus/multus-additional-cni-plugins-b2qgz" Feb 26 15:44:18 crc kubenswrapper[4907]: I0226 15:44:18.415220 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2vfj6\" (UniqueName: \"kubernetes.io/projected/3ab23cfe-46ea-420e-ba6c-38ac0d2804b0-kube-api-access-2vfj6\") pod \"multus-additional-cni-plugins-b2qgz\" (UID: \"3ab23cfe-46ea-420e-ba6c-38ac0d2804b0\") " pod="openshift-multus/multus-additional-cni-plugins-b2qgz" Feb 26 15:44:18 crc kubenswrapper[4907]: I0226 15:44:18.415247 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/49ee65e1-8667-4ad7-a403-c899f0cc6a70-node-log\") pod \"ovnkube-node-vsvsw\" (UID: \"49ee65e1-8667-4ad7-a403-c899f0cc6a70\") " pod="openshift-ovn-kubernetes/ovnkube-node-vsvsw" Feb 26 15:44:18 crc kubenswrapper[4907]: I0226 15:44:18.415269 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-kubelet\" (UniqueName: 
\"kubernetes.io/host-path/49ee65e1-8667-4ad7-a403-c899f0cc6a70-host-kubelet\") pod \"ovnkube-node-vsvsw\" (UID: \"49ee65e1-8667-4ad7-a403-c899f0cc6a70\") " pod="openshift-ovn-kubernetes/ovnkube-node-vsvsw" Feb 26 15:44:18 crc kubenswrapper[4907]: I0226 15:44:18.415294 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/49ee65e1-8667-4ad7-a403-c899f0cc6a70-run-systemd\") pod \"ovnkube-node-vsvsw\" (UID: \"49ee65e1-8667-4ad7-a403-c899f0cc6a70\") " pod="openshift-ovn-kubernetes/ovnkube-node-vsvsw" Feb 26 15:44:18 crc kubenswrapper[4907]: I0226 15:44:18.415315 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/49ee65e1-8667-4ad7-a403-c899f0cc6a70-etc-openvswitch\") pod \"ovnkube-node-vsvsw\" (UID: \"49ee65e1-8667-4ad7-a403-c899f0cc6a70\") " pod="openshift-ovn-kubernetes/ovnkube-node-vsvsw" Feb 26 15:44:18 crc kubenswrapper[4907]: I0226 15:44:18.415339 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/3ab23cfe-46ea-420e-ba6c-38ac0d2804b0-cni-binary-copy\") pod \"multus-additional-cni-plugins-b2qgz\" (UID: \"3ab23cfe-46ea-420e-ba6c-38ac0d2804b0\") " pod="openshift-multus/multus-additional-cni-plugins-b2qgz" Feb 26 15:44:18 crc kubenswrapper[4907]: I0226 15:44:18.415358 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/ae882fbf-ac76-4363-a10c-60eaf80ee7c7-host\") pod \"node-ca-9gtgp\" (UID: \"ae882fbf-ac76-4363-a10c-60eaf80ee7c7\") " pod="openshift-image-registry/node-ca-9gtgp" Feb 26 15:44:18 crc kubenswrapper[4907]: I0226 15:44:18.415380 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xl77m\" (UniqueName: 
\"kubernetes.io/projected/ae882fbf-ac76-4363-a10c-60eaf80ee7c7-kube-api-access-xl77m\") pod \"node-ca-9gtgp\" (UID: \"ae882fbf-ac76-4363-a10c-60eaf80ee7c7\") " pod="openshift-image-registry/node-ca-9gtgp" Feb 26 15:44:18 crc kubenswrapper[4907]: I0226 15:44:18.415400 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/51024bd5-00ff-4e2f-927c-8c989b59d7be-hostroot\") pod \"multus-2gl5t\" (UID: \"51024bd5-00ff-4e2f-927c-8c989b59d7be\") " pod="openshift-multus/multus-2gl5t" Feb 26 15:44:18 crc kubenswrapper[4907]: I0226 15:44:18.415434 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/3ab23cfe-46ea-420e-ba6c-38ac0d2804b0-tuning-conf-dir\") pod \"multus-additional-cni-plugins-b2qgz\" (UID: \"3ab23cfe-46ea-420e-ba6c-38ac0d2804b0\") " pod="openshift-multus/multus-additional-cni-plugins-b2qgz" Feb 26 15:44:18 crc kubenswrapper[4907]: I0226 15:44:18.415456 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/49ee65e1-8667-4ad7-a403-c899f0cc6a70-ovnkube-config\") pod \"ovnkube-node-vsvsw\" (UID: \"49ee65e1-8667-4ad7-a403-c899f0cc6a70\") " pod="openshift-ovn-kubernetes/ovnkube-node-vsvsw" Feb 26 15:44:18 crc kubenswrapper[4907]: I0226 15:44:18.415480 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/49ee65e1-8667-4ad7-a403-c899f0cc6a70-env-overrides\") pod \"ovnkube-node-vsvsw\" (UID: \"49ee65e1-8667-4ad7-a403-c899f0cc6a70\") " pod="openshift-ovn-kubernetes/ovnkube-node-vsvsw" Feb 26 15:44:18 crc kubenswrapper[4907]: I0226 15:44:18.415510 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/51024bd5-00ff-4e2f-927c-8c989b59d7be-cnibin\") pod \"multus-2gl5t\" 
(UID: \"51024bd5-00ff-4e2f-927c-8c989b59d7be\") " pod="openshift-multus/multus-2gl5t" Feb 26 15:44:18 crc kubenswrapper[4907]: I0226 15:44:18.415531 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/3ab23cfe-46ea-420e-ba6c-38ac0d2804b0-os-release\") pod \"multus-additional-cni-plugins-b2qgz\" (UID: \"3ab23cfe-46ea-420e-ba6c-38ac0d2804b0\") " pod="openshift-multus/multus-additional-cni-plugins-b2qgz" Feb 26 15:44:18 crc kubenswrapper[4907]: I0226 15:44:18.415551 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/49ee65e1-8667-4ad7-a403-c899f0cc6a70-log-socket\") pod \"ovnkube-node-vsvsw\" (UID: \"49ee65e1-8667-4ad7-a403-c899f0cc6a70\") " pod="openshift-ovn-kubernetes/ovnkube-node-vsvsw" Feb 26 15:44:18 crc kubenswrapper[4907]: I0226 15:44:18.415571 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/49ee65e1-8667-4ad7-a403-c899f0cc6a70-host-run-ovn-kubernetes\") pod \"ovnkube-node-vsvsw\" (UID: \"49ee65e1-8667-4ad7-a403-c899f0cc6a70\") " pod="openshift-ovn-kubernetes/ovnkube-node-vsvsw" Feb 26 15:44:18 crc kubenswrapper[4907]: I0226 15:44:18.415612 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/49ee65e1-8667-4ad7-a403-c899f0cc6a70-ovnkube-script-lib\") pod \"ovnkube-node-vsvsw\" (UID: \"49ee65e1-8667-4ad7-a403-c899f0cc6a70\") " pod="openshift-ovn-kubernetes/ovnkube-node-vsvsw" Feb 26 15:44:18 crc kubenswrapper[4907]: I0226 15:44:18.415634 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/49ee65e1-8667-4ad7-a403-c899f0cc6a70-run-ovn\") pod \"ovnkube-node-vsvsw\" (UID: \"49ee65e1-8667-4ad7-a403-c899f0cc6a70\") " 
pod="openshift-ovn-kubernetes/ovnkube-node-vsvsw" Feb 26 15:44:18 crc kubenswrapper[4907]: I0226 15:44:18.415655 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/49ee65e1-8667-4ad7-a403-c899f0cc6a70-host-cni-bin\") pod \"ovnkube-node-vsvsw\" (UID: \"49ee65e1-8667-4ad7-a403-c899f0cc6a70\") " pod="openshift-ovn-kubernetes/ovnkube-node-vsvsw" Feb 26 15:44:18 crc kubenswrapper[4907]: I0226 15:44:18.415712 4907 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-idp-0-file-data\") on node \"crc\" DevicePath \"\"" Feb 26 15:44:18 crc kubenswrapper[4907]: I0226 15:44:18.415727 4907 reconciler_common.go:293] "Volume detached for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/6ea678ab-3438-413e-bfe3-290ae7725660-ovn-node-metrics-cert\") on node \"crc\" DevicePath \"\"" Feb 26 15:44:18 crc kubenswrapper[4907]: I0226 15:44:18.415742 4907 reconciler_common.go:293] "Volume detached for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-audit-policies\") on node \"crc\" DevicePath \"\"" Feb 26 15:44:18 crc kubenswrapper[4907]: I0226 15:44:18.415755 4907 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-login\") on node \"crc\" DevicePath \"\"" Feb 26 15:44:18 crc kubenswrapper[4907]: I0226 15:44:18.415768 4907 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-catalog-content\") on node \"crc\" DevicePath \"\"" Feb 26 15:44:18 crc kubenswrapper[4907]: I0226 15:44:18.415782 4907 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" 
(UniqueName: \"kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-catalog-content\") on node \"crc\" DevicePath \"\"" Feb 26 15:44:18 crc kubenswrapper[4907]: I0226 15:44:18.415795 4907 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ngvvp\" (UniqueName: \"kubernetes.io/projected/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-kube-api-access-ngvvp\") on node \"crc\" DevicePath \"\"" Feb 26 15:44:18 crc kubenswrapper[4907]: I0226 15:44:18.415808 4907 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7583ce53-e0fe-4a16-9e4d-50516596a136-serving-cert\") on node \"crc\" DevicePath \"\"" Feb 26 15:44:18 crc kubenswrapper[4907]: I0226 15:44:18.415822 4907 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-config\") on node \"crc\" DevicePath \"\"" Feb 26 15:44:18 crc kubenswrapper[4907]: I0226 15:44:18.415834 4907 reconciler_common.go:293] "Volume detached for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-config\") on node \"crc\" DevicePath \"\"" Feb 26 15:44:18 crc kubenswrapper[4907]: I0226 15:44:18.415837 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/3ab23cfe-46ea-420e-ba6c-38ac0d2804b0-cnibin\") pod \"multus-additional-cni-plugins-b2qgz\" (UID: \"3ab23cfe-46ea-420e-ba6c-38ac0d2804b0\") " pod="openshift-multus/multus-additional-cni-plugins-b2qgz" Feb 26 15:44:18 crc kubenswrapper[4907]: I0226 15:44:18.415847 4907 reconciler_common.go:293] "Volume detached for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-env-overrides\") on node \"crc\" DevicePath \"\"" Feb 26 15:44:18 crc kubenswrapper[4907]: I0226 15:44:18.415806 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"mcd-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/917eebf3-db36-47b8-af0a-b80d042fddab-mcd-auth-proxy-config\") pod \"machine-config-daemon-v5ng6\" (UID: \"917eebf3-db36-47b8-af0a-b80d042fddab\") " pod="openshift-machine-config-operator/machine-config-daemon-v5ng6" Feb 26 15:44:18 crc kubenswrapper[4907]: I0226 15:44:18.415838 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/3ab23cfe-46ea-420e-ba6c-38ac0d2804b0-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-b2qgz\" (UID: \"3ab23cfe-46ea-420e-ba6c-38ac0d2804b0\") " pod="openshift-multus/multus-additional-cni-plugins-b2qgz" Feb 26 15:44:18 crc kubenswrapper[4907]: I0226 15:44:18.415982 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/4569fec7-a859-4a9e-b9d9-34ccc7c6be9c-hosts-file\") pod \"node-resolver-958vt\" (UID: \"4569fec7-a859-4a9e-b9d9-34ccc7c6be9c\") " pod="openshift-dns/node-resolver-958vt" Feb 26 15:44:18 crc kubenswrapper[4907]: I0226 15:44:18.416024 4907 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e7e6199b-1264-4501-8953-767f51328d08-config\") on node \"crc\" DevicePath \"\"" Feb 26 15:44:18 crc kubenswrapper[4907]: I0226 15:44:18.416024 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/49ee65e1-8667-4ad7-a403-c899f0cc6a70-host-kubelet\") pod \"ovnkube-node-vsvsw\" (UID: \"49ee65e1-8667-4ad7-a403-c899f0cc6a70\") " pod="openshift-ovn-kubernetes/ovnkube-node-vsvsw" Feb 26 15:44:18 crc kubenswrapper[4907]: I0226 15:44:18.416051 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/49ee65e1-8667-4ad7-a403-c899f0cc6a70-run-systemd\") pod \"ovnkube-node-vsvsw\" (UID: \"49ee65e1-8667-4ad7-a403-c899f0cc6a70\") " 
pod="openshift-ovn-kubernetes/ovnkube-node-vsvsw" Feb 26 15:44:18 crc kubenswrapper[4907]: I0226 15:44:18.416075 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-host-slash\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h" Feb 26 15:44:18 crc kubenswrapper[4907]: I0226 15:44:18.416139 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/49ee65e1-8667-4ad7-a403-c899f0cc6a70-systemd-units\") pod \"ovnkube-node-vsvsw\" (UID: \"49ee65e1-8667-4ad7-a403-c899f0cc6a70\") " pod="openshift-ovn-kubernetes/ovnkube-node-vsvsw" Feb 26 15:44:18 crc kubenswrapper[4907]: I0226 15:44:18.416164 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/49ee65e1-8667-4ad7-a403-c899f0cc6a70-var-lib-openvswitch\") pod \"ovnkube-node-vsvsw\" (UID: \"49ee65e1-8667-4ad7-a403-c899f0cc6a70\") " pod="openshift-ovn-kubernetes/ovnkube-node-vsvsw" Feb 26 15:44:18 crc kubenswrapper[4907]: I0226 15:44:18.416089 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/49ee65e1-8667-4ad7-a403-c899f0cc6a70-etc-openvswitch\") pod \"ovnkube-node-vsvsw\" (UID: \"49ee65e1-8667-4ad7-a403-c899f0cc6a70\") " pod="openshift-ovn-kubernetes/ovnkube-node-vsvsw" Feb 26 15:44:18 crc kubenswrapper[4907]: I0226 15:44:18.416197 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/49ee65e1-8667-4ad7-a403-c899f0cc6a70-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-vsvsw\" (UID: \"49ee65e1-8667-4ad7-a403-c899f0cc6a70\") " pod="openshift-ovn-kubernetes/ovnkube-node-vsvsw" Feb 
26 15:44:18 crc kubenswrapper[4907]: I0226 15:44:18.416192 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-etc-kube\" (UniqueName: \"kubernetes.io/host-path/37a5e44f-9a88-4405-be8a-b645485e7312-host-etc-kube\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Feb 26 15:44:18 crc kubenswrapper[4907]: I0226 15:44:18.416221 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/51024bd5-00ff-4e2f-927c-8c989b59d7be-host-run-netns\") pod \"multus-2gl5t\" (UID: \"51024bd5-00ff-4e2f-927c-8c989b59d7be\") " pod="openshift-multus/multus-2gl5t" Feb 26 15:44:18 crc kubenswrapper[4907]: I0226 15:44:18.416258 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/51024bd5-00ff-4e2f-927c-8c989b59d7be-multus-socket-dir-parent\") pod \"multus-2gl5t\" (UID: \"51024bd5-00ff-4e2f-927c-8c989b59d7be\") " pod="openshift-multus/multus-2gl5t" Feb 26 15:44:18 crc kubenswrapper[4907]: I0226 15:44:18.416286 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/51024bd5-00ff-4e2f-927c-8c989b59d7be-host-var-lib-kubelet\") pod \"multus-2gl5t\" (UID: \"51024bd5-00ff-4e2f-927c-8c989b59d7be\") " pod="openshift-multus/multus-2gl5t" Feb 26 15:44:18 crc kubenswrapper[4907]: I0226 15:44:18.416312 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rootfs\" (UniqueName: \"kubernetes.io/host-path/917eebf3-db36-47b8-af0a-b80d042fddab-rootfs\") pod \"machine-config-daemon-v5ng6\" (UID: \"917eebf3-db36-47b8-af0a-b80d042fddab\") " pod="openshift-machine-config-operator/machine-config-daemon-v5ng6" Feb 26 15:44:18 crc kubenswrapper[4907]: I0226 15:44:18.416334 4907 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/49ee65e1-8667-4ad7-a403-c899f0cc6a70-host-slash\") pod \"ovnkube-node-vsvsw\" (UID: \"49ee65e1-8667-4ad7-a403-c899f0cc6a70\") " pod="openshift-ovn-kubernetes/ovnkube-node-vsvsw" Feb 26 15:44:18 crc kubenswrapper[4907]: I0226 15:44:18.416687 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/51024bd5-00ff-4e2f-927c-8c989b59d7be-os-release\") pod \"multus-2gl5t\" (UID: \"51024bd5-00ff-4e2f-927c-8c989b59d7be\") " pod="openshift-multus/multus-2gl5t" Feb 26 15:44:18 crc kubenswrapper[4907]: I0226 15:44:18.417251 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/49ee65e1-8667-4ad7-a403-c899f0cc6a70-env-overrides\") pod \"ovnkube-node-vsvsw\" (UID: \"49ee65e1-8667-4ad7-a403-c899f0cc6a70\") " pod="openshift-ovn-kubernetes/ovnkube-node-vsvsw" Feb 26 15:44:18 crc kubenswrapper[4907]: I0226 15:44:18.417310 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/51024bd5-00ff-4e2f-927c-8c989b59d7be-cnibin\") pod \"multus-2gl5t\" (UID: \"51024bd5-00ff-4e2f-927c-8c989b59d7be\") " pod="openshift-multus/multus-2gl5t" Feb 26 15:44:18 crc kubenswrapper[4907]: I0226 15:44:18.417401 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/3ab23cfe-46ea-420e-ba6c-38ac0d2804b0-os-release\") pod \"multus-additional-cni-plugins-b2qgz\" (UID: \"3ab23cfe-46ea-420e-ba6c-38ac0d2804b0\") " pod="openshift-multus/multus-additional-cni-plugins-b2qgz" Feb 26 15:44:18 crc kubenswrapper[4907]: I0226 15:44:18.417438 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/49ee65e1-8667-4ad7-a403-c899f0cc6a70-log-socket\") pod \"ovnkube-node-vsvsw\" 
(UID: \"49ee65e1-8667-4ad7-a403-c899f0cc6a70\") " pod="openshift-ovn-kubernetes/ovnkube-node-vsvsw" Feb 26 15:44:18 crc kubenswrapper[4907]: I0226 15:44:18.417467 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/49ee65e1-8667-4ad7-a403-c899f0cc6a70-host-run-ovn-kubernetes\") pod \"ovnkube-node-vsvsw\" (UID: \"49ee65e1-8667-4ad7-a403-c899f0cc6a70\") " pod="openshift-ovn-kubernetes/ovnkube-node-vsvsw" Feb 26 15:44:18 crc kubenswrapper[4907]: I0226 15:44:18.417660 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/3ab23cfe-46ea-420e-ba6c-38ac0d2804b0-cni-binary-copy\") pod \"multus-additional-cni-plugins-b2qgz\" (UID: \"3ab23cfe-46ea-420e-ba6c-38ac0d2804b0\") " pod="openshift-multus/multus-additional-cni-plugins-b2qgz" Feb 26 15:44:18 crc kubenswrapper[4907]: I0226 15:44:18.417714 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/ae882fbf-ac76-4363-a10c-60eaf80ee7c7-host\") pod \"node-ca-9gtgp\" (UID: \"ae882fbf-ac76-4363-a10c-60eaf80ee7c7\") " pod="openshift-image-registry/node-ca-9gtgp" Feb 26 15:44:18 crc kubenswrapper[4907]: I0226 15:44:18.417940 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/51024bd5-00ff-4e2f-927c-8c989b59d7be-cni-binary-copy\") pod \"multus-2gl5t\" (UID: \"51024bd5-00ff-4e2f-927c-8c989b59d7be\") " pod="openshift-multus/multus-2gl5t" Feb 26 15:44:18 crc kubenswrapper[4907]: I0226 15:44:18.418117 4907 reconciler_common.go:293] "Volume detached for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-serving-ca\") on node \"crc\" DevicePath \"\"" Feb 26 15:44:18 crc kubenswrapper[4907]: I0226 15:44:18.418147 4907 operation_generator.go:637] "MountVolume.SetUp succeeded 
for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/51024bd5-00ff-4e2f-927c-8c989b59d7be-host-var-lib-cni-multus\") pod \"multus-2gl5t\" (UID: \"51024bd5-00ff-4e2f-927c-8c989b59d7be\") " pod="openshift-multus/multus-2gl5t" Feb 26 15:44:18 crc kubenswrapper[4907]: I0226 15:44:18.418187 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/51024bd5-00ff-4e2f-927c-8c989b59d7be-multus-conf-dir\") pod \"multus-2gl5t\" (UID: \"51024bd5-00ff-4e2f-927c-8c989b59d7be\") " pod="openshift-multus/multus-2gl5t" Feb 26 15:44:18 crc kubenswrapper[4907]: I0226 15:44:18.418183 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/49ee65e1-8667-4ad7-a403-c899f0cc6a70-host-run-netns\") pod \"ovnkube-node-vsvsw\" (UID: \"49ee65e1-8667-4ad7-a403-c899f0cc6a70\") " pod="openshift-ovn-kubernetes/ovnkube-node-vsvsw" Feb 26 15:44:18 crc kubenswrapper[4907]: I0226 15:44:18.418217 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/49ee65e1-8667-4ad7-a403-c899f0cc6a70-run-openvswitch\") pod \"ovnkube-node-vsvsw\" (UID: \"49ee65e1-8667-4ad7-a403-c899f0cc6a70\") " pod="openshift-ovn-kubernetes/ovnkube-node-vsvsw" Feb 26 15:44:18 crc kubenswrapper[4907]: I0226 15:44:18.418237 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/51024bd5-00ff-4e2f-927c-8c989b59d7be-etc-kubernetes\") pod \"multus-2gl5t\" (UID: \"51024bd5-00ff-4e2f-927c-8c989b59d7be\") " pod="openshift-multus/multus-2gl5t" Feb 26 15:44:18 crc kubenswrapper[4907]: I0226 15:44:18.418272 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/51024bd5-00ff-4e2f-927c-8c989b59d7be-system-cni-dir\") pod \"multus-2gl5t\" (UID: 
\"51024bd5-00ff-4e2f-927c-8c989b59d7be\") " pod="openshift-multus/multus-2gl5t" Feb 26 15:44:18 crc kubenswrapper[4907]: I0226 15:44:18.418292 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/51024bd5-00ff-4e2f-927c-8c989b59d7be-host-run-k8s-cni-cncf-io\") pod \"multus-2gl5t\" (UID: \"51024bd5-00ff-4e2f-927c-8c989b59d7be\") " pod="openshift-multus/multus-2gl5t" Feb 26 15:44:18 crc kubenswrapper[4907]: I0226 15:44:18.418314 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/49ee65e1-8667-4ad7-a403-c899f0cc6a70-run-ovn\") pod \"ovnkube-node-vsvsw\" (UID: \"49ee65e1-8667-4ad7-a403-c899f0cc6a70\") " pod="openshift-ovn-kubernetes/ovnkube-node-vsvsw" Feb 26 15:44:18 crc kubenswrapper[4907]: I0226 15:44:18.418326 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/49ee65e1-8667-4ad7-a403-c899f0cc6a70-host-cni-bin\") pod \"ovnkube-node-vsvsw\" (UID: \"49ee65e1-8667-4ad7-a403-c899f0cc6a70\") " pod="openshift-ovn-kubernetes/ovnkube-node-vsvsw" Feb 26 15:44:18 crc kubenswrapper[4907]: I0226 15:44:18.418322 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/917eebf3-db36-47b8-af0a-b80d042fddab-proxy-tls\") pod \"machine-config-daemon-v5ng6\" (UID: \"917eebf3-db36-47b8-af0a-b80d042fddab\") " pod="openshift-machine-config-operator/machine-config-daemon-v5ng6" Feb 26 15:44:18 crc kubenswrapper[4907]: I0226 15:44:18.418368 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/51024bd5-00ff-4e2f-927c-8c989b59d7be-hostroot\") pod \"multus-2gl5t\" (UID: \"51024bd5-00ff-4e2f-927c-8c989b59d7be\") " pod="openshift-multus/multus-2gl5t" Feb 26 15:44:18 crc kubenswrapper[4907]: I0226 15:44:18.418387 4907 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/49ee65e1-8667-4ad7-a403-c899f0cc6a70-node-log\") pod \"ovnkube-node-vsvsw\" (UID: \"49ee65e1-8667-4ad7-a403-c899f0cc6a70\") " pod="openshift-ovn-kubernetes/ovnkube-node-vsvsw" Feb 26 15:44:18 crc kubenswrapper[4907]: I0226 15:44:18.418829 4907 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2w9zh\" (UniqueName: \"kubernetes.io/projected/4bb40260-dbaa-4fb0-84df-5e680505d512-kube-api-access-2w9zh\") on node \"crc\" DevicePath \"\"" Feb 26 15:44:18 crc kubenswrapper[4907]: I0226 15:44:18.418861 4907 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-x4zgh\" (UniqueName: \"kubernetes.io/projected/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-kube-api-access-x4zgh\") on node \"crc\" DevicePath \"\"" Feb 26 15:44:18 crc kubenswrapper[4907]: I0226 15:44:18.418882 4907 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-d6qdx\" (UniqueName: \"kubernetes.io/projected/87cf06ed-a83f-41a7-828d-70653580a8cb-kube-api-access-d6qdx\") on node \"crc\" DevicePath \"\"" Feb 26 15:44:18 crc kubenswrapper[4907]: I0226 15:44:18.418901 4907 reconciler_common.go:293] "Volume detached for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/87cf06ed-a83f-41a7-828d-70653580a8cb-metrics-tls\") on node \"crc\" DevicePath \"\"" Feb 26 15:44:18 crc kubenswrapper[4907]: I0226 15:44:18.418921 4907 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-ocp-branding-template\") on node \"crc\" DevicePath \"\"" Feb 26 15:44:18 crc kubenswrapper[4907]: I0226 15:44:18.418942 4907 reconciler_common.go:293] "Volume detached for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/8f668bae-612b-4b75-9490-919e737c6a3b-ca-trust-extracted\") on node \"crc\" DevicePath \"\"" Feb 
26 15:44:18 crc kubenswrapper[4907]: I0226 15:44:18.418965 4907 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-catalog-content\") on node \"crc\" DevicePath \"\"" Feb 26 15:44:18 crc kubenswrapper[4907]: I0226 15:44:18.418983 4907 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-catalog-content\") on node \"crc\" DevicePath \"\"" Feb 26 15:44:18 crc kubenswrapper[4907]: I0226 15:44:18.418994 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/49ee65e1-8667-4ad7-a403-c899f0cc6a70-ovnkube-config\") pod \"ovnkube-node-vsvsw\" (UID: \"49ee65e1-8667-4ad7-a403-c899f0cc6a70\") " pod="openshift-ovn-kubernetes/ovnkube-node-vsvsw" Feb 26 15:44:18 crc kubenswrapper[4907]: I0226 15:44:18.419002 4907 reconciler_common.go:293] "Volume detached for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-registry-certificates\") on node \"crc\" DevicePath \"\"" Feb 26 15:44:18 crc kubenswrapper[4907]: I0226 15:44:18.419037 4907 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pjr6v\" (UniqueName: \"kubernetes.io/projected/49ef4625-1d3a-4a9f-b595-c2433d32326d-kube-api-access-pjr6v\") on node \"crc\" DevicePath \"\"" Feb 26 15:44:18 crc kubenswrapper[4907]: I0226 15:44:18.419055 4907 reconciler_common.go:293] "Volume detached for volume \"audit\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-audit\") on node \"crc\" DevicePath \"\"" Feb 26 15:44:18 crc kubenswrapper[4907]: I0226 15:44:18.419070 4907 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-service-ca\") on node 
\"crc\" DevicePath \"\"" Feb 26 15:44:18 crc kubenswrapper[4907]: I0226 15:44:18.419089 4907 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/87cf06ed-a83f-41a7-828d-70653580a8cb-config-volume\") on node \"crc\" DevicePath \"\"" Feb 26 15:44:18 crc kubenswrapper[4907]: I0226 15:44:18.419271 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/ae882fbf-ac76-4363-a10c-60eaf80ee7c7-serviceca\") pod \"node-ca-9gtgp\" (UID: \"ae882fbf-ac76-4363-a10c-60eaf80ee7c7\") " pod="openshift-image-registry/node-ca-9gtgp" Feb 26 15:44:18 crc kubenswrapper[4907]: I0226 15:44:18.419877 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/49ee65e1-8667-4ad7-a403-c899f0cc6a70-ovnkube-script-lib\") pod \"ovnkube-node-vsvsw\" (UID: \"49ee65e1-8667-4ad7-a403-c899f0cc6a70\") " pod="openshift-ovn-kubernetes/ovnkube-node-vsvsw" Feb 26 15:44:18 crc kubenswrapper[4907]: I0226 15:44:18.420677 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/3ab23cfe-46ea-420e-ba6c-38ac0d2804b0-tuning-conf-dir\") pod \"multus-additional-cni-plugins-b2qgz\" (UID: \"3ab23cfe-46ea-420e-ba6c-38ac0d2804b0\") " pod="openshift-multus/multus-additional-cni-plugins-b2qgz" Feb 26 15:44:18 crc kubenswrapper[4907]: I0226 15:44:18.422971 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/49ee65e1-8667-4ad7-a403-c899f0cc6a70-ovn-node-metrics-cert\") pod \"ovnkube-node-vsvsw\" (UID: \"49ee65e1-8667-4ad7-a403-c899f0cc6a70\") " pod="openshift-ovn-kubernetes/ovnkube-node-vsvsw" Feb 26 15:44:18 crc kubenswrapper[4907]: I0226 15:44:18.433888 4907 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-vsvsw" 
err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"49ee65e1-8667-4ad7-a403-c899f0cc6a70\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T15:44:18Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T15:44:18Z\\\",\\\"message\\\":\\\"containers with incomplete status: [kubecfg-setup]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T15:44:18Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T15:44:18Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7hmb7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7hmb7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7hmb7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7hmb7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7hmb7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7hmb7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"wa
iting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7hmb7\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7hmb7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7hmb7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-26T15:44:18Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-vsvsw\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 26 15:44:18 crc kubenswrapper[4907]: I0226 15:44:18.436489 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2fx5n\" (UniqueName: \"kubernetes.io/projected/51024bd5-00ff-4e2f-927c-8c989b59d7be-kube-api-access-2fx5n\") pod \"multus-2gl5t\" (UID: \"51024bd5-00ff-4e2f-927c-8c989b59d7be\") " pod="openshift-multus/multus-2gl5t" Feb 26 15:44:18 crc kubenswrapper[4907]: I0226 15:44:18.438818 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7hmb7\" (UniqueName: \"kubernetes.io/projected/49ee65e1-8667-4ad7-a403-c899f0cc6a70-kube-api-access-7hmb7\") pod \"ovnkube-node-vsvsw\" (UID: \"49ee65e1-8667-4ad7-a403-c899f0cc6a70\") " pod="openshift-ovn-kubernetes/ovnkube-node-vsvsw" Feb 26 15:44:18 crc kubenswrapper[4907]: I0226 15:44:18.440003 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8nhj9\" (UniqueName: \"kubernetes.io/projected/4569fec7-a859-4a9e-b9d9-34ccc7c6be9c-kube-api-access-8nhj9\") pod \"node-resolver-958vt\" (UID: \"4569fec7-a859-4a9e-b9d9-34ccc7c6be9c\") " pod="openshift-dns/node-resolver-958vt" Feb 26 15:44:18 crc kubenswrapper[4907]: I0226 15:44:18.440163 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2vfj6\" (UniqueName: \"kubernetes.io/projected/3ab23cfe-46ea-420e-ba6c-38ac0d2804b0-kube-api-access-2vfj6\") pod \"multus-additional-cni-plugins-b2qgz\" (UID: \"3ab23cfe-46ea-420e-ba6c-38ac0d2804b0\") " pod="openshift-multus/multus-additional-cni-plugins-b2qgz" Feb 26 15:44:18 crc kubenswrapper[4907]: I0226 15:44:18.441897 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9lmpq\" (UniqueName: \"kubernetes.io/projected/917eebf3-db36-47b8-af0a-b80d042fddab-kube-api-access-9lmpq\") pod 
\"machine-config-daemon-v5ng6\" (UID: \"917eebf3-db36-47b8-af0a-b80d042fddab\") " pod="openshift-machine-config-operator/machine-config-daemon-v5ng6" Feb 26 15:44:18 crc kubenswrapper[4907]: I0226 15:44:18.443679 4907 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-26T15:44:18Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-26T15:44:18Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 26 15:44:18 crc kubenswrapper[4907]: I0226 15:44:18.443766 4907 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Feb 26 15:44:18 crc kubenswrapper[4907]: I0226 15:44:18.447569 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xl77m\" (UniqueName: \"kubernetes.io/projected/ae882fbf-ac76-4363-a10c-60eaf80ee7c7-kube-api-access-xl77m\") pod \"node-ca-9gtgp\" (UID: \"ae882fbf-ac76-4363-a10c-60eaf80ee7c7\") " pod="openshift-image-registry/node-ca-9gtgp" Feb 26 15:44:18 crc kubenswrapper[4907]: I0226 15:44:18.452925 4907 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-dns/node-resolver-958vt" Feb 26 15:44:18 crc kubenswrapper[4907]: I0226 15:44:18.452916 4907 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-26T15:44:18Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-26T15:44:18Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 26 15:44:18 crc kubenswrapper[4907]: I0226 15:44:18.459488 4907 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/node-ca-9gtgp" Feb 26 15:44:18 crc kubenswrapper[4907]: I0226 15:44:18.462956 4907 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-958vt" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"4569fec7-a859-4a9e-b9d9-34ccc7c6be9c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T15:44:18Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T15:44:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T15:44:18Z\\\",\\\"message\\\":\\\"containers with unready status: [dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T15:44:18Z\\\",\\\"message\\\":\\\"containers with unready status: 
[dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8nhj9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-26T15:44:18Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-958vt\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 26 15:44:18 crc kubenswrapper[4907]: W0226 15:44:18.463772 4907 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod4569fec7_a859_4a9e_b9d9_34ccc7c6be9c.slice/crio-b125cce1e28ab518097194361cb537e6a6a643adf007890e636ac83a65e2f251 WatchSource:0}: Error finding container b125cce1e28ab518097194361cb537e6a6a643adf007890e636ac83a65e2f251: Status 404 returned error can't find the container with id b125cce1e28ab518097194361cb537e6a6a643adf007890e636ac83a65e2f251 Feb 26 15:44:18 crc kubenswrapper[4907]: I0226 15:44:18.468646 4907 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/multus-2gl5t" Feb 26 15:44:18 crc kubenswrapper[4907]: I0226 15:44:18.473095 4907 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-26T15:44:18Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-26T15:44:18Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 26 15:44:18 crc kubenswrapper[4907]: I0226 15:44:18.477368 4907 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-node-identity/network-node-identity-vrzqb" Feb 26 15:44:18 crc kubenswrapper[4907]: W0226 15:44:18.493523 4907 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podef543e1b_8068_4ea3_b32a_61027b32e95d.slice/crio-6ebf3ca9c3e318c1ed9dfa50eeec6d0bf2cb8ccaa33810544a059a66f56da1e3 WatchSource:0}: Error finding container 6ebf3ca9c3e318c1ed9dfa50eeec6d0bf2cb8ccaa33810544a059a66f56da1e3: Status 404 returned error can't find the container with id 6ebf3ca9c3e318c1ed9dfa50eeec6d0bf2cb8ccaa33810544a059a66f56da1e3 Feb 26 15:44:18 crc kubenswrapper[4907]: I0226 15:44:18.494583 4907 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-vsvsw" Feb 26 15:44:18 crc kubenswrapper[4907]: I0226 15:44:18.504890 4907 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-operator/iptables-alerter-4ln5h" Feb 26 15:44:18 crc kubenswrapper[4907]: I0226 15:44:18.524465 4907 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-additional-cni-plugins-b2qgz" Feb 26 15:44:18 crc kubenswrapper[4907]: I0226 15:44:18.529321 4907 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/machine-config-daemon-v5ng6" Feb 26 15:44:18 crc kubenswrapper[4907]: W0226 15:44:18.529943 4907 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod49ee65e1_8667_4ad7_a403_c899f0cc6a70.slice/crio-aaeb89d604bd4111c33ef90df0a3c2bc5e324f383e85eb567b6b409b8ed966d8 WatchSource:0}: Error finding container aaeb89d604bd4111c33ef90df0a3c2bc5e324f383e85eb567b6b409b8ed966d8: Status 404 returned error can't find the container with id aaeb89d604bd4111c33ef90df0a3c2bc5e324f383e85eb567b6b409b8ed966d8 Feb 26 15:44:18 crc kubenswrapper[4907]: I0226 15:44:18.557510 4907 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-s9f9w"] Feb 26 15:44:18 crc kubenswrapper[4907]: I0226 15:44:18.558027 4907 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-s9f9w" Feb 26 15:44:18 crc kubenswrapper[4907]: I0226 15:44:18.561692 4907 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-kubernetes-control-plane-dockercfg-gs7dd" Feb 26 15:44:18 crc kubenswrapper[4907]: I0226 15:44:18.561939 4907 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-control-plane-metrics-cert" Feb 26 15:44:18 crc kubenswrapper[4907]: W0226 15:44:18.567836 4907 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd75a4c96_2883_4a0b_bab2_0fab2b6c0b49.slice/crio-5e7c07c254eff1a083d47a9d89f21f3f4fb28e797e1c9d32727b49809abde168 WatchSource:0}: Error finding container 5e7c07c254eff1a083d47a9d89f21f3f4fb28e797e1c9d32727b49809abde168: Status 404 returned error can't find the container with id 5e7c07c254eff1a083d47a9d89f21f3f4fb28e797e1c9d32727b49809abde168 Feb 26 15:44:18 crc kubenswrapper[4907]: I0226 15:44:18.570011 4907 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-958vt" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"4569fec7-a859-4a9e-b9d9-34ccc7c6be9c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T15:44:18Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T15:44:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T15:44:18Z\\\",\\\"message\\\":\\\"containers with unready 
status: [dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T15:44:18Z\\\",\\\"message\\\":\\\"containers with unready status: [dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8nhj9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-26T15:44:18Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-958vt\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 26 15:44:18 crc kubenswrapper[4907]: I0226 15:44:18.580521 4907 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-26T15:44:18Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-26T15:44:18Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 26 15:44:18 crc kubenswrapper[4907]: W0226 15:44:18.581408 4907 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod917eebf3_db36_47b8_af0a_b80d042fddab.slice/crio-d18247e73b25be73284d037912eb351a9062270aee198a2f7e9a11d2729b6a95 WatchSource:0}: Error finding container d18247e73b25be73284d037912eb351a9062270aee198a2f7e9a11d2729b6a95: Status 404 returned error can't find the container with id d18247e73b25be73284d037912eb351a9062270aee198a2f7e9a11d2729b6a95 Feb 26 15:44:18 crc kubenswrapper[4907]: I0226 15:44:18.591734 4907 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-s9f9w" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"432281c6-dcf8-4471-9801-9194000a9abd\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T15:44:18Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T15:44:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T15:44:18Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-rbac-proxy ovnkube-cluster-manager]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T15:44:18Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-rbac-proxy 
ovnkube-cluster-manager]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wrq6z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wrq6z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-26T15:44:18Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-s9f9w\": Internal error occurred: failed calling 
webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 26 15:44:18 crc kubenswrapper[4907]: I0226 15:44:18.600441 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-2gl5t" event={"ID":"51024bd5-00ff-4e2f-927c-8c989b59d7be","Type":"ContainerStarted","Data":"3de81ba81bff53d090e16ee58c919d8e52451f282075eb5ae5476f5e19e0ceba"} Feb 26 15:44:18 crc kubenswrapper[4907]: I0226 15:44:18.604815 4907 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-26T15:44:18Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-26T15:44:18Z\\\",\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":false,\\\"restartCount\\\":6,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: 
connect: connection refused" Feb 26 15:44:18 crc kubenswrapper[4907]: I0226 15:44:18.605105 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-b2qgz" event={"ID":"3ab23cfe-46ea-420e-ba6c-38ac0d2804b0","Type":"ContainerStarted","Data":"359b429174f46e87343101c4e4bee63d5175511f60d650887fa454c55de8ab5d"} Feb 26 15:44:18 crc kubenswrapper[4907]: I0226 15:44:18.607623 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" event={"ID":"37a5e44f-9a88-4405-be8a-b645485e7312","Type":"ContainerStarted","Data":"a8762947e420577891037f341d01f40f03487ff019f3a431c3b3cd45ec1f24ea"} Feb 26 15:44:18 crc kubenswrapper[4907]: I0226 15:44:18.608993 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/node-ca-9gtgp" event={"ID":"ae882fbf-ac76-4363-a10c-60eaf80ee7c7","Type":"ContainerStarted","Data":"d1139890f3227c8a25f71a4f88e979cce7f8268d56f706ff0f35a8a141efd069"} Feb 26 15:44:18 crc kubenswrapper[4907]: I0226 15:44:18.610284 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns/node-resolver-958vt" event={"ID":"4569fec7-a859-4a9e-b9d9-34ccc7c6be9c","Type":"ContainerStarted","Data":"b125cce1e28ab518097194361cb537e6a6a643adf007890e636ac83a65e2f251"} Feb 26 15:44:18 crc kubenswrapper[4907]: I0226 15:44:18.611897 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" event={"ID":"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49","Type":"ContainerStarted","Data":"5e7c07c254eff1a083d47a9d89f21f3f4fb28e797e1c9d32727b49809abde168"} Feb 26 15:44:18 crc kubenswrapper[4907]: I0226 15:44:18.617444 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-vsvsw" event={"ID":"49ee65e1-8667-4ad7-a403-c899f0cc6a70","Type":"ContainerStarted","Data":"aaeb89d604bd4111c33ef90df0a3c2bc5e324f383e85eb567b6b409b8ed966d8"} Feb 26 15:44:18 crc 
kubenswrapper[4907]: I0226 15:44:18.617777 4907 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-26T15:44:18Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-26T15:44:18Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 26 15:44:18 crc kubenswrapper[4907]: I0226 15:44:18.620363 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wrq6z\" (UniqueName: \"kubernetes.io/projected/432281c6-dcf8-4471-9801-9194000a9abd-kube-api-access-wrq6z\") pod \"ovnkube-control-plane-749d76644c-s9f9w\" (UID: \"432281c6-dcf8-4471-9801-9194000a9abd\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-s9f9w" Feb 26 15:44:18 crc kubenswrapper[4907]: I0226 15:44:18.620401 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/432281c6-dcf8-4471-9801-9194000a9abd-ovnkube-config\") pod \"ovnkube-control-plane-749d76644c-s9f9w\" (UID: \"432281c6-dcf8-4471-9801-9194000a9abd\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-s9f9w" Feb 26 15:44:18 crc kubenswrapper[4907]: I0226 15:44:18.620427 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-control-plane-metrics-cert\" (UniqueName: 
\"kubernetes.io/secret/432281c6-dcf8-4471-9801-9194000a9abd-ovn-control-plane-metrics-cert\") pod \"ovnkube-control-plane-749d76644c-s9f9w\" (UID: \"432281c6-dcf8-4471-9801-9194000a9abd\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-s9f9w" Feb 26 15:44:18 crc kubenswrapper[4907]: I0226 15:44:18.620452 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/432281c6-dcf8-4471-9801-9194000a9abd-env-overrides\") pod \"ovnkube-control-plane-749d76644c-s9f9w\" (UID: \"432281c6-dcf8-4471-9801-9194000a9abd\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-s9f9w" Feb 26 15:44:18 crc kubenswrapper[4907]: I0226 15:44:18.623937 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" event={"ID":"ef543e1b-8068-4ea3-b32a-61027b32e95d","Type":"ContainerStarted","Data":"6ebf3ca9c3e318c1ed9dfa50eeec6d0bf2cb8ccaa33810544a059a66f56da1e3"} Feb 26 15:44:18 crc kubenswrapper[4907]: I0226 15:44:18.627865 4907 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-b2qgz" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3ab23cfe-46ea-420e-ba6c-38ac0d2804b0\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T15:44:18Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T15:44:18Z\\\",\\\"message\\\":\\\"containers with incomplete status: [egress-router-binary-copy cni-plugins bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy 
whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T15:44:18Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T15:44:18Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2vfj6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\
\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2vfj6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2vfj6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\
":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2vfj6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2vfj6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2vfj6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v
4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2vfj6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-26T15:44:18Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-b2qgz\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 26 15:44:18 crc kubenswrapper[4907]: I0226 15:44:18.635713 4907 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-v5ng6" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"917eebf3-db36-47b8-af0a-b80d042fddab\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T15:44:18Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T15:44:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T15:44:18Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T15:44:18Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9lmpq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9lmpq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-26T15:44:18Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-v5ng6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post 
\"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 26 15:44:18 crc kubenswrapper[4907]: I0226 15:44:18.645714 4907 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"27c9ab80-fcc8-4c5a-9d89-c0504e0e6396\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T15:42:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T15:42:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T15:42:18Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T15:42:18Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T15:42:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bbc5e8c015ccc6b1a4740c955375e4f995f69ff1f1f698d8e2660ef451da6b8c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T15:42:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://64e8ac34f3cae799ba04d2bba51c22e4d99cf03261778fe3ba7a2320e661e727\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T15:42:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://42e24dea757f775f836c5c1fdb77c920db85f523bc0a35d2f2fb22e766274556\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T15:42:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a3c61b08bda7c918a3fa7b01e6f80515ee05a5746e189e829d2872c181b80c85\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a3c61b08bda7c918a3fa7b01e6f80515ee05a5746e189e829d2872c181b80c85\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-02-26T15:44:12Z\\\",\\\"message\\\":\\\"le observer\\\\nW0226 15:44:11.651017 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0226 15:44:11.651151 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0226 15:44:11.653054 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-1720683088/tls.crt::/tmp/serving-cert-1720683088/tls.key\\\\\\\"\\\\nI0226 15:44:12.242500 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0226 15:44:12.245173 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0226 15:44:12.245192 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0226 15:44:12.245214 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0226 15:44:12.245219 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0226 15:44:12.248257 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0226 15:44:12.248276 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0226 15:44:12.248281 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0226 15:44:12.248286 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0226 15:44:12.248289 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0226 15:44:12.248292 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0226 15:44:12.248295 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0226 15:44:12.248403 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF0226 15:44:12.250972 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-26T15:44:11Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":4,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 1m20s restarting failed 
container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8cf7bf0e49be4282c641d1e48be50a327bb418475701bfde61f4249724709e11\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T15:42:21Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7ff4ef3cac1d6f77bf9c90ee9a0f1d8fca15084e93afdb4e4e0048cbfe904f19\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7ff4ef3cac1d6f77bf9c90ee9a0f1d8fca15084e93afdb4e4e0048cbfe904f19\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-26T15:42:19Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-26T15:42:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"n
ame\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-26T15:42:18Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 26 15:44:18 crc kubenswrapper[4907]: I0226 15:44:18.654399 4907 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-26T15:44:18Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-26T15:44:18Z\\\",\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 26 15:44:18 crc kubenswrapper[4907]: I0226 15:44:18.661417 4907 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-9gtgp" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ae882fbf-ac76-4363-a10c-60eaf80ee7c7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T15:44:18Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T15:44:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T15:44:18Z\\\",\\\"message\\\":\\\"containers with 
unready status: [node-ca]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T15:44:18Z\\\",\\\"message\\\":\\\"containers with unready status: [node-ca]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xl77m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-26T15:44:18Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-9gtgp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 26 15:44:18 crc kubenswrapper[4907]: I0226 15:44:18.670348 4907 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-2gl5t" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"51024bd5-00ff-4e2f-927c-8c989b59d7be\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T15:44:18Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T15:44:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T15:44:18Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T15:44:18Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/
kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2fx5n\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-26T15:44:18Z\\\"}}\" for pod \"openshift-multus\"/\"multus-2gl5t\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 26 15:44:18 crc kubenswrapper[4907]: I0226 15:44:18.684126 4907 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-vsvsw" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"49ee65e1-8667-4ad7-a403-c899f0cc6a70\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T15:44:18Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T15:44:18Z\\\",\\\"message\\\":\\\"containers with incomplete status: [kubecfg-setup]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T15:44:18Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T15:44:18Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7hmb7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/v
ar/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7hmb7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7hmb7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7hmb7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},
{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7hmb7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7hmb7\\\",\\\"readOnly\\\":true,\\\
"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mount
Path\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7hmb7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7hmb7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name
\\\":\\\"kube-api-access-7hmb7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-26T15:44:18Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-vsvsw\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 26 15:44:18 crc kubenswrapper[4907]: I0226 15:44:18.705776 4907 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-26T15:44:18Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-26T15:44:18Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 26 15:44:18 crc kubenswrapper[4907]: I0226 15:44:18.721130 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wrq6z\" (UniqueName: \"kubernetes.io/projected/432281c6-dcf8-4471-9801-9194000a9abd-kube-api-access-wrq6z\") pod \"ovnkube-control-plane-749d76644c-s9f9w\" (UID: \"432281c6-dcf8-4471-9801-9194000a9abd\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-s9f9w" Feb 26 15:44:18 crc kubenswrapper[4907]: I0226 15:44:18.721187 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/432281c6-dcf8-4471-9801-9194000a9abd-ovnkube-config\") pod \"ovnkube-control-plane-749d76644c-s9f9w\" (UID: \"432281c6-dcf8-4471-9801-9194000a9abd\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-s9f9w" Feb 26 15:44:18 crc kubenswrapper[4907]: I0226 15:44:18.721221 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-control-plane-metrics-cert\" (UniqueName: \"kubernetes.io/secret/432281c6-dcf8-4471-9801-9194000a9abd-ovn-control-plane-metrics-cert\") pod 
\"ovnkube-control-plane-749d76644c-s9f9w\" (UID: \"432281c6-dcf8-4471-9801-9194000a9abd\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-s9f9w" Feb 26 15:44:18 crc kubenswrapper[4907]: I0226 15:44:18.721250 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/432281c6-dcf8-4471-9801-9194000a9abd-env-overrides\") pod \"ovnkube-control-plane-749d76644c-s9f9w\" (UID: \"432281c6-dcf8-4471-9801-9194000a9abd\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-s9f9w" Feb 26 15:44:18 crc kubenswrapper[4907]: I0226 15:44:18.721713 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/432281c6-dcf8-4471-9801-9194000a9abd-env-overrides\") pod \"ovnkube-control-plane-749d76644c-s9f9w\" (UID: \"432281c6-dcf8-4471-9801-9194000a9abd\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-s9f9w" Feb 26 15:44:18 crc kubenswrapper[4907]: I0226 15:44:18.722176 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/432281c6-dcf8-4471-9801-9194000a9abd-ovnkube-config\") pod \"ovnkube-control-plane-749d76644c-s9f9w\" (UID: \"432281c6-dcf8-4471-9801-9194000a9abd\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-s9f9w" Feb 26 15:44:18 crc kubenswrapper[4907]: I0226 15:44:18.725873 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-control-plane-metrics-cert\" (UniqueName: \"kubernetes.io/secret/432281c6-dcf8-4471-9801-9194000a9abd-ovn-control-plane-metrics-cert\") pod \"ovnkube-control-plane-749d76644c-s9f9w\" (UID: \"432281c6-dcf8-4471-9801-9194000a9abd\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-s9f9w" Feb 26 15:44:18 crc kubenswrapper[4907]: I0226 15:44:18.748470 4907 status_manager.go:875] "Failed to update status for pod" 
pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-26T15:44:18Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-26T15:44:18Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 26 15:44:18 crc kubenswrapper[4907]: I0226 15:44:18.773030 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wrq6z\" (UniqueName: \"kubernetes.io/projected/432281c6-dcf8-4471-9801-9194000a9abd-kube-api-access-wrq6z\") pod \"ovnkube-control-plane-749d76644c-s9f9w\" (UID: \"432281c6-dcf8-4471-9801-9194000a9abd\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-s9f9w" Feb 26 15:44:18 crc kubenswrapper[4907]: I0226 15:44:18.821608 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 26 15:44:18 crc kubenswrapper[4907]: E0226 15:44:18.821744 4907 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-26 15:44:19.821724048 +0000 UTC m=+122.340285897 (durationBeforeRetry 1s). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 26 15:44:18 crc kubenswrapper[4907]: I0226 15:44:18.822061 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 26 15:44:18 crc kubenswrapper[4907]: I0226 15:44:18.822096 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 26 15:44:18 crc kubenswrapper[4907]: I0226 15:44:18.822144 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 26 15:44:18 crc kubenswrapper[4907]: I0226 15:44:18.822171 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 26 15:44:18 crc kubenswrapper[4907]: E0226 15:44:18.822249 4907 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Feb 26 15:44:18 crc kubenswrapper[4907]: E0226 15:44:18.822307 4907 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-02-26 15:44:19.822289143 +0000 UTC m=+122.340850992 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Feb 26 15:44:18 crc kubenswrapper[4907]: E0226 15:44:18.822383 4907 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Feb 26 15:44:18 crc kubenswrapper[4907]: E0226 15:44:18.822388 4907 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Feb 26 15:44:18 crc kubenswrapper[4907]: E0226 15:44:18.822409 4907 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Feb 26 15:44:18 crc kubenswrapper[4907]: E0226 15:44:18.822416 4907 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-02-26 15:44:19.822406196 +0000 UTC m=+122.340968115 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Feb 26 15:44:18 crc kubenswrapper[4907]: E0226 15:44:18.822418 4907 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Feb 26 15:44:18 crc kubenswrapper[4907]: E0226 15:44:18.822436 4907 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Feb 26 15:44:18 crc kubenswrapper[4907]: E0226 15:44:18.822457 4907 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2026-02-26 15:44:19.822449328 +0000 UTC m=+122.341011187 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Feb 26 15:44:18 crc kubenswrapper[4907]: E0226 15:44:18.822461 4907 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Feb 26 15:44:18 crc kubenswrapper[4907]: E0226 15:44:18.822473 4907 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Feb 26 15:44:18 crc kubenswrapper[4907]: E0226 15:44:18.822523 4907 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2026-02-26 15:44:19.8225075 +0000 UTC m=+122.341069349 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Feb 26 15:44:18 crc kubenswrapper[4907]: I0226 15:44:18.907039 4907 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-s9f9w" Feb 26 15:44:19 crc kubenswrapper[4907]: I0226 15:44:19.278209 4907 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-multus/network-metrics-daemon-zsb5l"] Feb 26 15:44:19 crc kubenswrapper[4907]: I0226 15:44:19.278907 4907 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-zsb5l" Feb 26 15:44:19 crc kubenswrapper[4907]: E0226 15:44:19.279011 4907 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-zsb5l" podUID="fd06f422-2c09-4da9-843c-75525df52517" Feb 26 15:44:19 crc kubenswrapper[4907]: I0226 15:44:19.287127 4907 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-958vt" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"4569fec7-a859-4a9e-b9d9-34ccc7c6be9c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T15:44:18Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T15:44:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T15:44:18Z\\\",\\\"message\\\":\\\"containers with unready status: 
[dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T15:44:18Z\\\",\\\"message\\\":\\\"containers with unready status: [dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8nhj9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-26T15:44:18Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-958vt\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 26 15:44:19 crc kubenswrapper[4907]: I0226 15:44:19.296100 4907 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-26T15:44:18Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-26T15:44:18Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 26 15:44:19 crc kubenswrapper[4907]: I0226 15:44:19.305402 4907 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-s9f9w" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"432281c6-dcf8-4471-9801-9194000a9abd\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T15:44:18Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T15:44:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T15:44:18Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-rbac-proxy 
ovnkube-cluster-manager]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T15:44:18Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-rbac-proxy ovnkube-cluster-manager]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wrq6z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wrq6z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168
.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-26T15:44:18Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-s9f9w\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 26 15:44:19 crc kubenswrapper[4907]: I0226 15:44:19.318004 4907 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-26T15:44:18Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-26T15:44:18Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T15:44:19Z is after 2025-08-24T17:21:41Z" Feb 26 15:44:19 crc kubenswrapper[4907]: I0226 15:44:19.326113 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/fd06f422-2c09-4da9-843c-75525df52517-metrics-certs\") pod \"network-metrics-daemon-zsb5l\" (UID: \"fd06f422-2c09-4da9-843c-75525df52517\") " pod="openshift-multus/network-metrics-daemon-zsb5l" Feb 26 15:44:19 crc kubenswrapper[4907]: I0226 15:44:19.326186 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dbhj9\" (UniqueName: \"kubernetes.io/projected/fd06f422-2c09-4da9-843c-75525df52517-kube-api-access-dbhj9\") pod \"network-metrics-daemon-zsb5l\" (UID: \"fd06f422-2c09-4da9-843c-75525df52517\") " pod="openshift-multus/network-metrics-daemon-zsb5l" Feb 26 15:44:19 crc kubenswrapper[4907]: I0226 15:44:19.335423 4907 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-b2qgz" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3ab23cfe-46ea-420e-ba6c-38ac0d2804b0\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T15:44:18Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T15:44:18Z\\\",\\\"message\\\":\\\"containers with incomplete status: [egress-router-binary-copy cni-plugins bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T15:44:18Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T15:44:18Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2vfj6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2vfj6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\"
:false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2vfj6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2vfj6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\
\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2vfj6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2vfj6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/s
erviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2vfj6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-26T15:44:18Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-b2qgz\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T15:44:19Z is after 2025-08-24T17:21:41Z" Feb 26 15:44:19 crc kubenswrapper[4907]: I0226 15:44:19.349780 4907 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-v5ng6" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"917eebf3-db36-47b8-af0a-b80d042fddab\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T15:44:18Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T15:44:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T15:44:18Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T15:44:18Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9lmpq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9lmpq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-26T15:44:18Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-v5ng6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post 
\"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T15:44:19Z is after 2025-08-24T17:21:41Z" Feb 26 15:44:19 crc kubenswrapper[4907]: I0226 15:44:19.365024 4907 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-26T15:44:18Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-26T15:44:18Z\\\",\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":false,\\\"restartCount\\\":6,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify 
certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T15:44:19Z is after 2025-08-24T17:21:41Z" Feb 26 15:44:19 crc kubenswrapper[4907]: I0226 15:44:19.378247 4907 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"27c9ab80-fcc8-4c5a-9d89-c0504e0e6396\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T15:42:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T15:42:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T15:42:18Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T15:42:18Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T15:42:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bbc5e8c015ccc6b1a4740c955375e4f995f69ff1f1f698d8e2660ef451da6b8c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T15:42:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://64e8ac34f3cae799ba04d2bba51c22e4d99cf03261778fe3ba7a2320e661e727\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T15:42:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://42e24dea757f775f836c5c1fdb77c920db85f523bc0a35d2f2fb22e766274556\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T15:42:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a3c61b08bda7c918a3fa7b01e6f80515ee05a5746e189e829d2872c181b80c85\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a3c61b08bda7c918a3fa7b01e6f80515ee05a5746e189e829d2872c181b80c85\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-02-26T15:44:12Z\\\",\\\"message\\\":\\\"le observer\\\\nW0226 15:44:11.651017 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0226 15:44:11.651151 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0226 15:44:11.653054 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-1720683088/tls.crt::/tmp/serving-cert-1720683088/tls.key\\\\\\\"\\\\nI0226 15:44:12.242500 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0226 15:44:12.245173 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0226 15:44:12.245192 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0226 15:44:12.245214 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0226 15:44:12.245219 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0226 15:44:12.248257 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0226 15:44:12.248276 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0226 15:44:12.248281 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0226 15:44:12.248286 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0226 15:44:12.248289 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0226 15:44:12.248292 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0226 15:44:12.248295 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0226 15:44:12.248403 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF0226 15:44:12.250972 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-26T15:44:11Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":4,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 1m20s restarting failed 
container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8cf7bf0e49be4282c641d1e48be50a327bb418475701bfde61f4249724709e11\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T15:42:21Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7ff4ef3cac1d6f77bf9c90ee9a0f1d8fca15084e93afdb4e4e0048cbfe904f19\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7ff4ef3cac1d6f77bf9c90ee9a0f1d8fca15084e93afdb4e4e0048cbfe904f19\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-26T15:42:19Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-26T15:42:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"n
ame\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-26T15:42:18Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T15:44:19Z is after 2025-08-24T17:21:41Z" Feb 26 15:44:19 crc kubenswrapper[4907]: I0226 15:44:19.391410 4907 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-26T15:44:18Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-26T15:44:18Z\\\",\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when 
the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T15:44:19Z is after 2025-08-24T17:21:41Z" Feb 26 15:44:19 crc kubenswrapper[4907]: I0226 15:44:19.403872 4907 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-9gtgp" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ae882fbf-ac76-4363-a10c-60eaf80ee7c7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T15:44:18Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T15:44:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T15:44:18Z\\\",\\\"message\\\":\\\"containers with unready status: [node-ca]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T15:44:18Z\\\",\\\"message\\\":\\\"containers with unready status: 
[node-ca]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xl77m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-26T15:44:18Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-9gtgp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T15:44:19Z is after 2025-08-24T17:21:41Z" Feb 26 15:44:19 crc kubenswrapper[4907]: I0226 15:44:19.415533 4907 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-2gl5t" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"51024bd5-00ff-4e2f-927c-8c989b59d7be\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T15:44:18Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T15:44:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T15:44:18Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T15:44:18Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/
kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2fx5n\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-26T15:44:18Z\\\"}}\" for pod \"openshift-multus\"/\"multus-2gl5t\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T15:44:19Z is after 2025-08-24T17:21:41Z" Feb 26 15:44:19 crc kubenswrapper[4907]: I0226 15:44:19.426983 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dbhj9\" (UniqueName: \"kubernetes.io/projected/fd06f422-2c09-4da9-843c-75525df52517-kube-api-access-dbhj9\") pod \"network-metrics-daemon-zsb5l\" (UID: \"fd06f422-2c09-4da9-843c-75525df52517\") " pod="openshift-multus/network-metrics-daemon-zsb5l" Feb 26 15:44:19 crc kubenswrapper[4907]: I0226 15:44:19.427025 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/fd06f422-2c09-4da9-843c-75525df52517-metrics-certs\") pod \"network-metrics-daemon-zsb5l\" (UID: \"fd06f422-2c09-4da9-843c-75525df52517\") " pod="openshift-multus/network-metrics-daemon-zsb5l" Feb 26 15:44:19 crc kubenswrapper[4907]: E0226 15:44:19.427140 4907 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Feb 26 15:44:19 crc kubenswrapper[4907]: E0226 15:44:19.427182 4907 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/fd06f422-2c09-4da9-843c-75525df52517-metrics-certs podName:fd06f422-2c09-4da9-843c-75525df52517 
nodeName:}" failed. No retries permitted until 2026-02-26 15:44:19.927171022 +0000 UTC m=+122.445732871 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/fd06f422-2c09-4da9-843c-75525df52517-metrics-certs") pod "network-metrics-daemon-zsb5l" (UID: "fd06f422-2c09-4da9-843c-75525df52517") : object "openshift-multus"/"metrics-daemon-secret" not registered Feb 26 15:44:19 crc kubenswrapper[4907]: I0226 15:44:19.435481 4907 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-vsvsw" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"49ee65e1-8667-4ad7-a403-c899f0cc6a70\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T15:44:18Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T15:44:18Z\\\",\\\"message\\\":\\\"containers with incomplete status: [kubecfg-setup]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T15:44:18Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T15:44:18Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7hmb7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7hmb7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7hmb7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7hmb7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7hmb7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7hmb7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"wa
iting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7hmb7\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7hmb7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7hmb7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-26T15:44:18Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-vsvsw\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T15:44:19Z is after 2025-08-24T17:21:41Z" Feb 26 15:44:19 crc kubenswrapper[4907]: I0226 15:44:19.443292 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dbhj9\" (UniqueName: \"kubernetes.io/projected/fd06f422-2c09-4da9-843c-75525df52517-kube-api-access-dbhj9\") pod \"network-metrics-daemon-zsb5l\" (UID: \"fd06f422-2c09-4da9-843c-75525df52517\") " pod="openshift-multus/network-metrics-daemon-zsb5l" Feb 26 15:44:19 crc kubenswrapper[4907]: I0226 15:44:19.454134 4907 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-26T15:44:18Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-26T15:44:18Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T15:44:19Z is after 2025-08-24T17:21:41Z" Feb 26 15:44:19 crc kubenswrapper[4907]: I0226 15:44:19.468651 4907 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-26T15:44:18Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-26T15:44:18Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T15:44:19Z is after 2025-08-24T17:21:41Z" Feb 26 15:44:19 crc kubenswrapper[4907]: I0226 15:44:19.476742 4907 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-zsb5l" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"fd06f422-2c09-4da9-843c-75525df52517\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T15:44:19Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T15:44:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T15:44:19Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T15:44:19Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dbhj9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dbhj9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-26T15:44:19Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-zsb5l\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T15:44:19Z is after 2025-08-24T17:21:41Z" Feb 26 15:44:19 crc 
kubenswrapper[4907]: I0226 15:44:19.631654 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/node-ca-9gtgp" event={"ID":"ae882fbf-ac76-4363-a10c-60eaf80ee7c7","Type":"ContainerStarted","Data":"78c4268a57d845c79f2bf6b5e3742785efea137f2b0b3c37cb1b6fc54274e30f"} Feb 26 15:44:19 crc kubenswrapper[4907]: I0226 15:44:19.634372 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-s9f9w" event={"ID":"432281c6-dcf8-4471-9801-9194000a9abd","Type":"ContainerStarted","Data":"00c5078cb42e7e369ed71d8867be75c4f1bf473eae40d151eacbeda76980196c"} Feb 26 15:44:19 crc kubenswrapper[4907]: I0226 15:44:19.634424 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-s9f9w" event={"ID":"432281c6-dcf8-4471-9801-9194000a9abd","Type":"ContainerStarted","Data":"a751c325fc4b5b8668afd084530efeddd36543db3710b4d5ab525dc8e572bb1e"} Feb 26 15:44:19 crc kubenswrapper[4907]: I0226 15:44:19.634438 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-s9f9w" event={"ID":"432281c6-dcf8-4471-9801-9194000a9abd","Type":"ContainerStarted","Data":"0a6c91bc5081a0ee7c0c8e0d396431c179501252972794b02e38786898a50650"} Feb 26 15:44:19 crc kubenswrapper[4907]: I0226 15:44:19.638116 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-2gl5t" event={"ID":"51024bd5-00ff-4e2f-927c-8c989b59d7be","Type":"ContainerStarted","Data":"9a3cdc02208e8eab1e0c3c3f08a0759873ebfd63c98e64af187800d59a5b44da"} Feb 26 15:44:19 crc kubenswrapper[4907]: I0226 15:44:19.640701 4907 generic.go:334] "Generic (PLEG): container finished" podID="49ee65e1-8667-4ad7-a403-c899f0cc6a70" containerID="b7621667d7c9c119893fe930093d4e1d2256a13aadc196023df28d1a78aef68c" exitCode=0 Feb 26 15:44:19 crc kubenswrapper[4907]: I0226 15:44:19.640804 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-ovn-kubernetes/ovnkube-node-vsvsw" event={"ID":"49ee65e1-8667-4ad7-a403-c899f0cc6a70","Type":"ContainerDied","Data":"b7621667d7c9c119893fe930093d4e1d2256a13aadc196023df28d1a78aef68c"} Feb 26 15:44:19 crc kubenswrapper[4907]: I0226 15:44:19.643415 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" event={"ID":"ef543e1b-8068-4ea3-b32a-61027b32e95d","Type":"ContainerStarted","Data":"b385be8ca84800beda307aea098ce9f4e640cd4b6c7bd2856c75b1a4193cb655"} Feb 26 15:44:19 crc kubenswrapper[4907]: I0226 15:44:19.643444 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" event={"ID":"ef543e1b-8068-4ea3-b32a-61027b32e95d","Type":"ContainerStarted","Data":"acf341c3480df31c1b94ef2f3feb5a3e7eef3fa85ef3292ad0e5ef70a4575cb3"} Feb 26 15:44:19 crc kubenswrapper[4907]: I0226 15:44:19.649014 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-v5ng6" event={"ID":"917eebf3-db36-47b8-af0a-b80d042fddab","Type":"ContainerStarted","Data":"7f195a8a6d014276c4202f3995d294fe5026b640273192a6f463642b79d4ddda"} Feb 26 15:44:19 crc kubenswrapper[4907]: I0226 15:44:19.649047 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-v5ng6" event={"ID":"917eebf3-db36-47b8-af0a-b80d042fddab","Type":"ContainerStarted","Data":"178aa71969c1efffd1f234213afe3cf84ffc1f8300112efb368309603695c3ee"} Feb 26 15:44:19 crc kubenswrapper[4907]: I0226 15:44:19.649060 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-v5ng6" event={"ID":"917eebf3-db36-47b8-af0a-b80d042fddab","Type":"ContainerStarted","Data":"d18247e73b25be73284d037912eb351a9062270aee198a2f7e9a11d2729b6a95"} Feb 26 15:44:19 crc kubenswrapper[4907]: I0226 15:44:19.651254 4907 generic.go:334] "Generic (PLEG): container 
finished" podID="3ab23cfe-46ea-420e-ba6c-38ac0d2804b0" containerID="608b79bf33a420a12900e4bce6e593b17cfa7c3e9ebbcc9378833dce3a84e31d" exitCode=0 Feb 26 15:44:19 crc kubenswrapper[4907]: I0226 15:44:19.651295 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-b2qgz" event={"ID":"3ab23cfe-46ea-420e-ba6c-38ac0d2804b0","Type":"ContainerDied","Data":"608b79bf33a420a12900e4bce6e593b17cfa7c3e9ebbcc9378833dce3a84e31d"} Feb 26 15:44:19 crc kubenswrapper[4907]: I0226 15:44:19.654843 4907 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-958vt" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"4569fec7-a859-4a9e-b9d9-34ccc7c6be9c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T15:44:18Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T15:44:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T15:44:18Z\\\",\\\"message\\\":\\\"containers with unready status: [dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T15:44:18Z\\\",\\\"message\\\":\\\"containers with unready status: 
[dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8nhj9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-26T15:44:18Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-958vt\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T15:44:19Z is after 2025-08-24T17:21:41Z" Feb 26 15:44:19 crc kubenswrapper[4907]: I0226 15:44:19.655189 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns/node-resolver-958vt" event={"ID":"4569fec7-a859-4a9e-b9d9-34ccc7c6be9c","Type":"ContainerStarted","Data":"1c9c60e926f3c2412b5a8698e82e161e6e34373a3e6b471698cb521b9e494871"} Feb 26 15:44:19 crc kubenswrapper[4907]: I0226 15:44:19.656973 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" 
event={"ID":"37a5e44f-9a88-4405-be8a-b645485e7312","Type":"ContainerStarted","Data":"1e574efe4067ea713788905c2bd40d7ff4ed75353c577df5ee8ca730d5037434"} Feb 26 15:44:19 crc kubenswrapper[4907]: I0226 15:44:19.673425 4907 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-26T15:44:18Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-26T15:44:18Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T15:44:19Z is after 2025-08-24T17:21:41Z" Feb 26 15:44:19 crc kubenswrapper[4907]: I0226 15:44:19.688054 4907 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-s9f9w" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"432281c6-dcf8-4471-9801-9194000a9abd\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T15:44:18Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T15:44:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T15:44:18Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-rbac-proxy ovnkube-cluster-manager]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T15:44:18Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-rbac-proxy 
ovnkube-cluster-manager]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wrq6z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wrq6z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-26T15:44:18Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-s9f9w\": Internal error occurred: failed calling 
webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T15:44:19Z is after 2025-08-24T17:21:41Z" Feb 26 15:44:19 crc kubenswrapper[4907]: I0226 15:44:19.709098 4907 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-26T15:44:18Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-26T15:44:18Z\\\",\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":false,\\\"restartCount\\\":6,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify 
certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T15:44:19Z is after 2025-08-24T17:21:41Z" Feb 26 15:44:19 crc kubenswrapper[4907]: I0226 15:44:19.725624 4907 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-26T15:44:18Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-26T15:44:18Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T15:44:19Z is after 2025-08-24T17:21:41Z" Feb 26 15:44:19 crc kubenswrapper[4907]: I0226 15:44:19.746577 4907 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-b2qgz" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3ab23cfe-46ea-420e-ba6c-38ac0d2804b0\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T15:44:18Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T15:44:18Z\\\",\\\"message\\\":\\\"containers with incomplete status: [egress-router-binary-copy cni-plugins bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy 
whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T15:44:18Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T15:44:18Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2vfj6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\
\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2vfj6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2vfj6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\
":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2vfj6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2vfj6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2vfj6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v
4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2vfj6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-26T15:44:18Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-b2qgz\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T15:44:19Z is after 2025-08-24T17:21:41Z" Feb 26 15:44:19 crc kubenswrapper[4907]: I0226 15:44:19.767063 4907 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-v5ng6" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"917eebf3-db36-47b8-af0a-b80d042fddab\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T15:44:18Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T15:44:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T15:44:18Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T15:44:18Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9lmpq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9lmpq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-26T15:44:18Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-v5ng6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post 
\"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T15:44:19Z is after 2025-08-24T17:21:41Z" Feb 26 15:44:19 crc kubenswrapper[4907]: I0226 15:44:19.785357 4907 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"27c9ab80-fcc8-4c5a-9d89-c0504e0e6396\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T15:42:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T15:42:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T15:42:18Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T15:42:18Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T15:42:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bbc5e8c015ccc6b1a4740c955375e4f995f69ff1f1f698d8e2660ef451da6b8c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T15:42:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://64e8ac34f3cae799ba04d2bba51c22e4d99cf03261778fe3ba7a2320e661e727\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T15:42:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://42e24dea757f775f836c5c1fdb77c920db85f523bc0a35d2f2fb22e766274556\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T15:42:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a3c61b08bda7c918a3fa7b01e6f80515ee05a5746e189e829d2872c181b80c85\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a3c61b08bda7c918a3fa7b01e6f80515ee05a5746e189e829d2872c181b80c85\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-02-26T15:44:12Z\\\",\\\"message\\\":\\\"le observer\\\\nW0226 15:44:11.651017 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0226 15:44:11.651151 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0226 15:44:11.653054 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-1720683088/tls.crt::/tmp/serving-cert-1720683088/tls.key\\\\\\\"\\\\nI0226 15:44:12.242500 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0226 15:44:12.245173 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0226 15:44:12.245192 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0226 15:44:12.245214 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0226 15:44:12.245219 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0226 15:44:12.248257 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0226 15:44:12.248276 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0226 15:44:12.248281 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0226 15:44:12.248286 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0226 15:44:12.248289 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0226 15:44:12.248292 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0226 15:44:12.248295 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0226 15:44:12.248403 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF0226 15:44:12.250972 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-26T15:44:11Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":4,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 1m20s restarting failed 
container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8cf7bf0e49be4282c641d1e48be50a327bb418475701bfde61f4249724709e11\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T15:42:21Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7ff4ef3cac1d6f77bf9c90ee9a0f1d8fca15084e93afdb4e4e0048cbfe904f19\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7ff4ef3cac1d6f77bf9c90ee9a0f1d8fca15084e93afdb4e4e0048cbfe904f19\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-26T15:42:19Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-26T15:42:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"n
ame\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-26T15:42:18Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T15:44:19Z is after 2025-08-24T17:21:41Z" Feb 26 15:44:19 crc kubenswrapper[4907]: I0226 15:44:19.800808 4907 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-26T15:44:18Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-26T15:44:18Z\\\",\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when 
the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T15:44:19Z is after 2025-08-24T17:21:41Z" Feb 26 15:44:19 crc kubenswrapper[4907]: I0226 15:44:19.810905 4907 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-9gtgp" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ae882fbf-ac76-4363-a10c-60eaf80ee7c7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T15:44:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T15:44:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T15:44:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T15:44:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://78c4268a57d845c79f2bf6b5e3742785efea137f2b0b3c37cb1b6fc54274e30f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T15:44:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xl77m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-26T15:44:18Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-9gtgp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T15:44:19Z is after 2025-08-24T17:21:41Z" Feb 26 15:44:19 crc kubenswrapper[4907]: I0226 15:44:19.828448 4907 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-2gl5t" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"51024bd5-00ff-4e2f-927c-8c989b59d7be\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T15:44:18Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T15:44:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T15:44:18Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T15:44:18Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/
kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2fx5n\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-26T15:44:18Z\\\"}}\" for pod \"openshift-multus\"/\"multus-2gl5t\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T15:44:19Z is after 2025-08-24T17:21:41Z" Feb 26 15:44:19 crc kubenswrapper[4907]: I0226 15:44:19.833732 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 26 15:44:19 crc kubenswrapper[4907]: E0226 15:44:19.833819 4907 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-26 15:44:21.833801669 +0000 UTC m=+124.352363528 (durationBeforeRetry 2s). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 26 15:44:19 crc kubenswrapper[4907]: I0226 15:44:19.833859 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 26 15:44:19 crc kubenswrapper[4907]: I0226 15:44:19.833888 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 26 15:44:19 crc kubenswrapper[4907]: I0226 15:44:19.833921 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 26 15:44:19 crc kubenswrapper[4907]: I0226 15:44:19.833947 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: 
\"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 26 15:44:19 crc kubenswrapper[4907]: E0226 15:44:19.833998 4907 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Feb 26 15:44:19 crc kubenswrapper[4907]: E0226 15:44:19.834007 4907 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Feb 26 15:44:19 crc kubenswrapper[4907]: E0226 15:44:19.834041 4907 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-02-26 15:44:21.834032275 +0000 UTC m=+124.352594124 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Feb 26 15:44:19 crc kubenswrapper[4907]: E0226 15:44:19.834040 4907 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Feb 26 15:44:19 crc kubenswrapper[4907]: E0226 15:44:19.834053 4907 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. 
No retries permitted until 2026-02-26 15:44:21.834047835 +0000 UTC m=+124.352609684 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Feb 26 15:44:19 crc kubenswrapper[4907]: E0226 15:44:19.834059 4907 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Feb 26 15:44:19 crc kubenswrapper[4907]: E0226 15:44:19.834070 4907 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Feb 26 15:44:19 crc kubenswrapper[4907]: E0226 15:44:19.834100 4907 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2026-02-26 15:44:21.834089946 +0000 UTC m=+124.352651795 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Feb 26 15:44:19 crc kubenswrapper[4907]: E0226 15:44:19.834176 4907 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Feb 26 15:44:19 crc kubenswrapper[4907]: E0226 15:44:19.834216 4907 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Feb 26 15:44:19 crc kubenswrapper[4907]: E0226 15:44:19.834234 4907 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Feb 26 15:44:19 crc kubenswrapper[4907]: E0226 15:44:19.834310 4907 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2026-02-26 15:44:21.834288232 +0000 UTC m=+124.352850091 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Feb 26 15:44:19 crc kubenswrapper[4907]: I0226 15:44:19.873778 4907 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-vsvsw" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"49ee65e1-8667-4ad7-a403-c899f0cc6a70\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T15:44:18Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T15:44:18Z\\\",\\\"message\\\":\\\"containers with incomplete status: [kubecfg-setup]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T15:44:18Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T15:44:18Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7hmb7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7hmb7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7hmb7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7hmb7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7hmb7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7hmb7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"wa
iting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7hmb7\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7hmb7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7hmb7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-26T15:44:18Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-vsvsw\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T15:44:19Z is after 2025-08-24T17:21:41Z" Feb 26 15:44:19 crc kubenswrapper[4907]: I0226 15:44:19.906690 4907 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-26T15:44:18Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-26T15:44:18Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T15:44:19Z is after 2025-08-24T17:21:41Z" Feb 26 15:44:19 crc kubenswrapper[4907]: I0226 15:44:19.935081 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/fd06f422-2c09-4da9-843c-75525df52517-metrics-certs\") pod \"network-metrics-daemon-zsb5l\" (UID: \"fd06f422-2c09-4da9-843c-75525df52517\") " pod="openshift-multus/network-metrics-daemon-zsb5l" Feb 26 15:44:19 crc kubenswrapper[4907]: E0226 15:44:19.935199 4907 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Feb 26 15:44:19 crc kubenswrapper[4907]: E0226 15:44:19.935244 4907 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/fd06f422-2c09-4da9-843c-75525df52517-metrics-certs podName:fd06f422-2c09-4da9-843c-75525df52517 nodeName:}" failed. No retries permitted until 2026-02-26 15:44:20.935229418 +0000 UTC m=+123.453791257 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/fd06f422-2c09-4da9-843c-75525df52517-metrics-certs") pod "network-metrics-daemon-zsb5l" (UID: "fd06f422-2c09-4da9-843c-75525df52517") : object "openshift-multus"/"metrics-daemon-secret" not registered Feb 26 15:44:19 crc kubenswrapper[4907]: I0226 15:44:19.949280 4907 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-26T15:44:18Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-26T15:44:18Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T15:44:19Z is after 2025-08-24T17:21:41Z" Feb 26 15:44:19 crc kubenswrapper[4907]: I0226 15:44:19.985664 4907 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-zsb5l" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"fd06f422-2c09-4da9-843c-75525df52517\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T15:44:19Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T15:44:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T15:44:19Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T15:44:19Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dbhj9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dbhj9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-26T15:44:19Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-zsb5l\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T15:44:19Z is after 2025-08-24T17:21:41Z" Feb 26 15:44:20 crc 
kubenswrapper[4907]: I0226 15:44:20.031536 4907 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-26T15:44:18Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-26T15:44:18Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T15:44:20Z is after 2025-08-24T17:21:41Z" Feb 26 15:44:20 crc kubenswrapper[4907]: I0226 15:44:20.065412 4907 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-s9f9w" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"432281c6-dcf8-4471-9801-9194000a9abd\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T15:44:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T15:44:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T15:44:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T15:44:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a751c325fc4b5b8668afd084530efeddd36543db3710b4d5ab525dc8e572bb1e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T15:44:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wrq6z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://00c5078cb42e7e369ed71d8867be75c4f1bf4
73eae40d151eacbeda76980196c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T15:44:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wrq6z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-26T15:44:18Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-s9f9w\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T15:44:20Z is after 2025-08-24T17:21:41Z" Feb 26 15:44:20 crc kubenswrapper[4907]: I0226 15:44:20.108820 4907 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-26T15:44:19Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-26T15:44:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-26T15:44:19Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b385be8ca84800beda307aea098ce9f4e640cd4b6c7bd2856c75b1a4193cb655\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T15:44:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://acf341c3480df31c1b94ef2f3feb5a3e7eef3fa85ef3292ad0e5ef70a4575cb3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T15:44:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T15:44:20Z is after 2025-08-24T17:21:41Z" Feb 26 15:44:20 crc kubenswrapper[4907]: I0226 15:44:20.125946 4907 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 26 15:44:20 crc kubenswrapper[4907]: E0226 15:44:20.126062 4907 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 26 15:44:20 crc kubenswrapper[4907]: I0226 15:44:20.126383 4907 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 26 15:44:20 crc kubenswrapper[4907]: E0226 15:44:20.126441 4907 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 26 15:44:20 crc kubenswrapper[4907]: I0226 15:44:20.126484 4907 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 26 15:44:20 crc kubenswrapper[4907]: E0226 15:44:20.126520 4907 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 26 15:44:20 crc kubenswrapper[4907]: I0226 15:44:20.130222 4907 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="01ab3dd5-8196-46d0-ad33-122e2ca51def" path="/var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/volumes" Feb 26 15:44:20 crc kubenswrapper[4907]: I0226 15:44:20.130968 4907 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" path="/var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes" Feb 26 15:44:20 crc kubenswrapper[4907]: I0226 15:44:20.132124 4907 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="09efc573-dbb6-4249-bd59-9b87aba8dd28" path="/var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes" Feb 26 15:44:20 crc kubenswrapper[4907]: I0226 15:44:20.132751 4907 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0b574797-001e-440a-8f4e-c0be86edad0f" path="/var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/volumes" Feb 26 15:44:20 crc kubenswrapper[4907]: I0226 15:44:20.133708 4907 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0b78653f-4ff9-4508-8672-245ed9b561e3" path="/var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/volumes" Feb 26 15:44:20 crc kubenswrapper[4907]: I0226 15:44:20.134234 4907 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1386a44e-36a2-460c-96d0-0359d2b6f0f5" path="/var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/volumes" Feb 26 15:44:20 crc kubenswrapper[4907]: I0226 15:44:20.134823 4907 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1bf7eb37-55a3-4c65-b768-a94c82151e69" path="/var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes" Feb 26 15:44:20 crc kubenswrapper[4907]: I0226 15:44:20.135887 4907 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" 
podUID="1d611f23-29be-4491-8495-bee1670e935f" path="/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes" Feb 26 15:44:20 crc kubenswrapper[4907]: I0226 15:44:20.136514 4907 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="20b0d48f-5fd6-431c-a545-e3c800c7b866" path="/var/lib/kubelet/pods/20b0d48f-5fd6-431c-a545-e3c800c7b866/volumes" Feb 26 15:44:20 crc kubenswrapper[4907]: I0226 15:44:20.137445 4907 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" path="/var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/volumes" Feb 26 15:44:20 crc kubenswrapper[4907]: I0226 15:44:20.138178 4907 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="22c825df-677d-4ca6-82db-3454ed06e783" path="/var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes" Feb 26 15:44:20 crc kubenswrapper[4907]: I0226 15:44:20.139370 4907 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="25e176fe-21b4-4974-b1ed-c8b94f112a7f" path="/var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/volumes" Feb 26 15:44:20 crc kubenswrapper[4907]: I0226 15:44:20.139956 4907 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" path="/var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/volumes" Feb 26 15:44:20 crc kubenswrapper[4907]: I0226 15:44:20.140466 4907 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="31d8b7a1-420e-4252-a5b7-eebe8a111292" path="/var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes" Feb 26 15:44:20 crc kubenswrapper[4907]: I0226 15:44:20.141364 4907 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3ab1a177-2de0-46d9-b765-d0d0649bb42e" path="/var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/volumes" Feb 26 15:44:20 crc kubenswrapper[4907]: I0226 15:44:20.141874 4907 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" 
podUID="3cb93b32-e0ae-4377-b9c8-fdb9842c6d59" path="/var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes" Feb 26 15:44:20 crc kubenswrapper[4907]: I0226 15:44:20.142872 4907 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="43509403-f426-496e-be36-56cef71462f5" path="/var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes" Feb 26 15:44:20 crc kubenswrapper[4907]: I0226 15:44:20.143321 4907 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="44663579-783b-4372-86d6-acf235a62d72" path="/var/lib/kubelet/pods/44663579-783b-4372-86d6-acf235a62d72/volumes" Feb 26 15:44:20 crc kubenswrapper[4907]: I0226 15:44:20.143864 4907 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="496e6271-fb68-4057-954e-a0d97a4afa3f" path="/var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/volumes" Feb 26 15:44:20 crc kubenswrapper[4907]: I0226 15:44:20.144823 4907 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" path="/var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes" Feb 26 15:44:20 crc kubenswrapper[4907]: I0226 15:44:20.145501 4907 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="49ef4625-1d3a-4a9f-b595-c2433d32326d" path="/var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/volumes" Feb 26 15:44:20 crc kubenswrapper[4907]: I0226 15:44:20.146431 4907 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4bb40260-dbaa-4fb0-84df-5e680505d512" path="/var/lib/kubelet/pods/4bb40260-dbaa-4fb0-84df-5e680505d512/volumes" Feb 26 15:44:20 crc kubenswrapper[4907]: I0226 15:44:20.146880 4907 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5225d0e4-402f-4861-b410-819f433b1803" path="/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes" Feb 26 15:44:20 crc kubenswrapper[4907]: I0226 15:44:20.147960 4907 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" 
podUID="5441d097-087c-4d9a-baa8-b210afa90fc9" path="/var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes" Feb 26 15:44:20 crc kubenswrapper[4907]: I0226 15:44:20.148382 4907 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="57a731c4-ef35-47a8-b875-bfb08a7f8011" path="/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes" Feb 26 15:44:20 crc kubenswrapper[4907]: I0226 15:44:20.148982 4907 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5b88f790-22fa-440e-b583-365168c0b23d" path="/var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/volumes" Feb 26 15:44:20 crc kubenswrapper[4907]: I0226 15:44:20.150118 4907 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5fe579f8-e8a6-4643-bce5-a661393c4dde" path="/var/lib/kubelet/pods/5fe579f8-e8a6-4643-bce5-a661393c4dde/volumes" Feb 26 15:44:20 crc kubenswrapper[4907]: I0226 15:44:20.150558 4907 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6402fda4-df10-493c-b4e5-d0569419652d" path="/var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes" Feb 26 15:44:20 crc kubenswrapper[4907]: I0226 15:44:20.151220 4907 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-26T15:44:18Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-26T15:44:18Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T15:44:20Z is after 2025-08-24T17:21:41Z" Feb 26 15:44:20 crc kubenswrapper[4907]: I0226 15:44:20.151523 4907 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6509e943-70c6-444c-bc41-48a544e36fbd" path="/var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes" Feb 26 15:44:20 crc kubenswrapper[4907]: I0226 15:44:20.152011 4907 kubelet_volumes.go:163] "Cleaned up orphaned pod 
volumes dir" podUID="6731426b-95fe-49ff-bb5f-40441049fde2" path="/var/lib/kubelet/pods/6731426b-95fe-49ff-bb5f-40441049fde2/volumes" Feb 26 15:44:20 crc kubenswrapper[4907]: I0226 15:44:20.152838 4907 kubelet_volumes.go:152] "Cleaned up orphaned volume subpath from pod" podUID="6ea678ab-3438-413e-bfe3-290ae7725660" path="/var/lib/kubelet/pods/6ea678ab-3438-413e-bfe3-290ae7725660/volume-subpaths/run-systemd/ovnkube-controller/6" Feb 26 15:44:20 crc kubenswrapper[4907]: I0226 15:44:20.152951 4907 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6ea678ab-3438-413e-bfe3-290ae7725660" path="/var/lib/kubelet/pods/6ea678ab-3438-413e-bfe3-290ae7725660/volumes" Feb 26 15:44:20 crc kubenswrapper[4907]: I0226 15:44:20.154481 4907 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7539238d-5fe0-46ed-884e-1c3b566537ec" path="/var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/volumes" Feb 26 15:44:20 crc kubenswrapper[4907]: I0226 15:44:20.155661 4907 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7583ce53-e0fe-4a16-9e4d-50516596a136" path="/var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes" Feb 26 15:44:20 crc kubenswrapper[4907]: I0226 15:44:20.156350 4907 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7bb08738-c794-4ee8-9972-3a62ca171029" path="/var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes" Feb 26 15:44:20 crc kubenswrapper[4907]: I0226 15:44:20.158058 4907 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="87cf06ed-a83f-41a7-828d-70653580a8cb" path="/var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/volumes" Feb 26 15:44:20 crc kubenswrapper[4907]: I0226 15:44:20.159690 4907 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" path="/var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/volumes" Feb 26 15:44:20 crc kubenswrapper[4907]: I0226 15:44:20.160393 4907 
kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="925f1c65-6136-48ba-85aa-3a3b50560753" path="/var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes" Feb 26 15:44:20 crc kubenswrapper[4907]: I0226 15:44:20.161569 4907 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="96b93a3a-6083-4aea-8eab-fe1aa8245ad9" path="/var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/volumes" Feb 26 15:44:20 crc kubenswrapper[4907]: I0226 15:44:20.162916 4907 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9d4552c7-cd75-42dd-8880-30dd377c49a4" path="/var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes" Feb 26 15:44:20 crc kubenswrapper[4907]: I0226 15:44:20.164316 4907 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a0128f3a-b052-44ed-a84e-c4c8aaf17c13" path="/var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/volumes" Feb 26 15:44:20 crc kubenswrapper[4907]: I0226 15:44:20.165059 4907 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a31745f5-9847-4afe-82a5-3161cc66ca93" path="/var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/volumes" Feb 26 15:44:20 crc kubenswrapper[4907]: I0226 15:44:20.166258 4907 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" path="/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes" Feb 26 15:44:20 crc kubenswrapper[4907]: I0226 15:44:20.167020 4907 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b6312bbd-5731-4ea0-a20f-81d5a57df44a" path="/var/lib/kubelet/pods/b6312bbd-5731-4ea0-a20f-81d5a57df44a/volumes" Feb 26 15:44:20 crc kubenswrapper[4907]: I0226 15:44:20.168780 4907 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b6cd30de-2eeb-49a2-ab40-9167f4560ff5" path="/var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/volumes" Feb 26 15:44:20 crc kubenswrapper[4907]: I0226 15:44:20.169455 4907 
kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" path="/var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes" Feb 26 15:44:20 crc kubenswrapper[4907]: I0226 15:44:20.170712 4907 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="bd23aa5c-e532-4e53-bccf-e79f130c5ae8" path="/var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/volumes" Feb 26 15:44:20 crc kubenswrapper[4907]: I0226 15:44:20.171724 4907 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="bf126b07-da06-4140-9a57-dfd54fc6b486" path="/var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/volumes" Feb 26 15:44:20 crc kubenswrapper[4907]: I0226 15:44:20.172935 4907 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c03ee662-fb2f-4fc4-a2c1-af487c19d254" path="/var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/volumes" Feb 26 15:44:20 crc kubenswrapper[4907]: I0226 15:44:20.173558 4907 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d" path="/var/lib/kubelet/pods/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d/volumes" Feb 26 15:44:20 crc kubenswrapper[4907]: I0226 15:44:20.174728 4907 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e7e6199b-1264-4501-8953-767f51328d08" path="/var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/volumes" Feb 26 15:44:20 crc kubenswrapper[4907]: I0226 15:44:20.175787 4907 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="efdd0498-1daa-4136-9a4a-3b948c2293fc" path="/var/lib/kubelet/pods/efdd0498-1daa-4136-9a4a-3b948c2293fc/volumes" Feb 26 15:44:20 crc kubenswrapper[4907]: I0226 15:44:20.176509 4907 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" path="/var/lib/kubelet/pods/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9/volumes" Feb 26 15:44:20 crc kubenswrapper[4907]: I0226 15:44:20.177676 4907 
kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="fda69060-fa79-4696-b1a6-7980f124bf7c" path="/var/lib/kubelet/pods/fda69060-fa79-4696-b1a6-7980f124bf7c/volumes" Feb 26 15:44:20 crc kubenswrapper[4907]: I0226 15:44:20.189036 4907 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-b2qgz" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3ab23cfe-46ea-420e-ba6c-38ac0d2804b0\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T15:44:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T15:44:18Z\\\",\\\"message\\\":\\\"containers with incomplete status: [cni-plugins bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T15:44:18Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T15:44:18Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2vfj6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://608b79bf33a420a12900e4bce6e593b17cfa7c3e9ebbcc9378833dce3a84e31d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://608b79bf33a420a12900e4bce6e593b17cfa7c3e9ebbcc9378833dce3a84e31d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-26T15:44:19Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-26T15:44:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Di
sabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2vfj6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2vfj6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube
-api-access-2vfj6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2vfj6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2vfj6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\
\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2vfj6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-26T15:44:18Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-b2qgz\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T15:44:20Z is after 2025-08-24T17:21:41Z" Feb 26 15:44:20 crc kubenswrapper[4907]: I0226 15:44:20.228181 4907 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-v5ng6" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"917eebf3-db36-47b8-af0a-b80d042fddab\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T15:44:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T15:44:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T15:44:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T15:44:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7f195a8a6d014276c4202f3995d294fe5026b640273192a6f463642b79d4ddda\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T15:44:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9lmpq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://178aa71969c1efffd1f234213afe3cf84ffc1f83
00112efb368309603695c3ee\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T15:44:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9lmpq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-26T15:44:18Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-v5ng6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T15:44:20Z is after 2025-08-24T17:21:41Z" Feb 26 15:44:20 crc kubenswrapper[4907]: I0226 15:44:20.267485 4907 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-26T15:44:18Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-26T15:44:18Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T15:44:20Z is after 2025-08-24T17:21:41Z" Feb 26 15:44:20 crc kubenswrapper[4907]: I0226 15:44:20.309204 4907 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-26T15:44:18Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-26T15:44:18Z\\\",\\\"message\\\":\\\"containers with unready status: 
[iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T15:44:20Z is after 2025-08-24T17:21:41Z" Feb 26 15:44:20 crc kubenswrapper[4907]: I0226 15:44:20.347228 4907 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"27c9ab80-fcc8-4c5a-9d89-c0504e0e6396\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T15:42:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T15:42:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T15:42:18Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T15:42:18Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T15:42:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bbc5e8c015ccc6b1a4740c955375e4f995f69ff1f1f698d8e2660ef451da6b8c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T15:42:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",
\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://64e8ac34f3cae799ba04d2bba51c22e4d99cf03261778fe3ba7a2320e661e727\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T15:42:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://42e24dea757f775f836c5c1fdb77c920db85f523bc0a35d2f2fb22e766274556\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T15:42:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a3c61b08bda7c918a3fa7b01e6f80515ee05a5746e189e829d2872c181b80c85\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e277
53fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a3c61b08bda7c918a3fa7b01e6f80515ee05a5746e189e829d2872c181b80c85\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-02-26T15:44:12Z\\\",\\\"message\\\":\\\"le observer\\\\nW0226 15:44:11.651017 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0226 15:44:11.651151 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0226 15:44:11.653054 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1720683088/tls.crt::/tmp/serving-cert-1720683088/tls.key\\\\\\\"\\\\nI0226 15:44:12.242500 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0226 15:44:12.245173 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0226 15:44:12.245192 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0226 15:44:12.245214 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0226 15:44:12.245219 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0226 15:44:12.248257 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0226 15:44:12.248276 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0226 15:44:12.248281 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0226 15:44:12.248286 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0226 15:44:12.248289 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0226 15:44:12.248292 1 secure_serving.go:69] 
Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0226 15:44:12.248295 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0226 15:44:12.248403 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF0226 15:44:12.250972 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-26T15:44:11Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":4,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 1m20s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8cf7bf0e49be4282c641d1e48be50a327bb418475701bfde61f4249724709e11\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T15:42:21Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7ff4ef3cac1d6f77bf9c90ee9a0f1d8fca15084e93afdb4e4e0048cbfe904f19\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7ff4ef3cac1d6f77bf9c90ee9a0f1d8fca15084e93afdb4e4e0048cbfe904f19\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-26T15:42:19Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-26T15:42:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-26T15:42:18Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T15:44:20Z is after 2025-08-24T17:21:41Z" Feb 26 15:44:20 crc kubenswrapper[4907]: I0226 15:44:20.389513 4907 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-26T15:44:19Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-26T15:44:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-26T15:44:19Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1e574efe4067ea713788905c2bd40d7ff4ed75353c577df5ee8ca730d5037434\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T15:44:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-02-26T15:44:20Z is after 2025-08-24T17:21:41Z" Feb 26 15:44:20 crc kubenswrapper[4907]: I0226 15:44:20.427959 4907 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-9gtgp" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ae882fbf-ac76-4363-a10c-60eaf80ee7c7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T15:44:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T15:44:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T15:44:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T15:44:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://78c4268a57d845c79f2bf6b5e3742785efea137f2b0b3c37cb1b6fc54274e30f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T15:44:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\
",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xl77m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-26T15:44:18Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-9gtgp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T15:44:20Z is after 2025-08-24T17:21:41Z" Feb 26 15:44:20 crc kubenswrapper[4907]: I0226 15:44:20.469857 4907 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-2gl5t" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"51024bd5-00ff-4e2f-927c-8c989b59d7be\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T15:44:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T15:44:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T15:44:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T15:44:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9a3cdc02208e8eab1e0c3c3f08a0759873ebfd63c98e64af187800d59a5b44da\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T15:44:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2fx5n\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-26T15:44:18Z\\\"}}\" for pod \"openshift-multus\"/\"multus-2gl5t\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T15:44:20Z is after 2025-08-24T17:21:41Z" Feb 26 15:44:20 crc kubenswrapper[4907]: I0226 15:44:20.523017 4907 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-vsvsw" err="failed to patch 
status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"49ee65e1-8667-4ad7-a403-c899f0cc6a70\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T15:44:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T15:44:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T15:44:18Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T15:44:18Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7hmb7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7hmb7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7hmb7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7hmb7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7hmb7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7hmb7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"wa
iting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7hmb7\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7hmb7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b7621667d7c9c119893fe930093d4e1d2256a13aadc196023df28d1a78aef68c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b7621667d7c9c119893fe930093d4e1d2256a13aadc196023df28d1a78aef68c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-26T15:44:18Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-26T15:44:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mou
ntPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7hmb7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-26T15:44:18Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-vsvsw\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T15:44:20Z is after 2025-08-24T17:21:41Z" Feb 26 15:44:20 crc kubenswrapper[4907]: I0226 15:44:20.550865 4907 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-zsb5l" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"fd06f422-2c09-4da9-843c-75525df52517\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T15:44:19Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T15:44:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T15:44:19Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T15:44:19Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dbhj9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dbhj9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-26T15:44:19Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-zsb5l\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T15:44:20Z is after 2025-08-24T17:21:41Z" Feb 26 15:44:20 crc 
kubenswrapper[4907]: I0226 15:44:20.588970 4907 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-958vt" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"4569fec7-a859-4a9e-b9d9-34ccc7c6be9c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T15:44:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T15:44:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T15:44:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T15:44:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1c9c60e926f3c2412b5a8698e82e161e6e34373a3e6b471698cb521b9e494871\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T15:44:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8nhj9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\
"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-26T15:44:18Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-958vt\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T15:44:20Z is after 2025-08-24T17:21:41Z" Feb 26 15:44:20 crc kubenswrapper[4907]: I0226 15:44:20.647645 4907 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-kube-apiserver/kube-apiserver-crc" Feb 26 15:44:20 crc kubenswrapper[4907]: I0226 15:44:20.648216 4907 scope.go:117] "RemoveContainer" containerID="a3c61b08bda7c918a3fa7b01e6f80515ee05a5746e189e829d2872c181b80c85" Feb 26 15:44:20 crc kubenswrapper[4907]: E0226 15:44:20.648432 4907 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-apiserver-check-endpoints\" with CrashLoopBackOff: \"back-off 1m20s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\"" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" Feb 26 15:44:20 crc kubenswrapper[4907]: I0226 15:44:20.662350 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-b2qgz" event={"ID":"3ab23cfe-46ea-420e-ba6c-38ac0d2804b0","Type":"ContainerStarted","Data":"e89433e3d1fc270f03f4dba736b947b987980198cfe9e4f66865ab6222ce82f4"} Feb 26 15:44:20 crc kubenswrapper[4907]: I0226 15:44:20.665125 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-vsvsw" 
event={"ID":"49ee65e1-8667-4ad7-a403-c899f0cc6a70","Type":"ContainerStarted","Data":"c70ed6854442dfb329171dc5c454c036c020cb91e1f6595eb3fbe2d95704d52d"} Feb 26 15:44:20 crc kubenswrapper[4907]: I0226 15:44:20.665164 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-vsvsw" event={"ID":"49ee65e1-8667-4ad7-a403-c899f0cc6a70","Type":"ContainerStarted","Data":"17760db3d112b908ad1389e3c28c244e756ef06ec2b4f170e4f52e17f9a75a89"} Feb 26 15:44:20 crc kubenswrapper[4907]: I0226 15:44:20.665175 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-vsvsw" event={"ID":"49ee65e1-8667-4ad7-a403-c899f0cc6a70","Type":"ContainerStarted","Data":"eca4b7a72754f7457c608969c5319a498c526ab128b28400d2aed5d0413ff487"} Feb 26 15:44:20 crc kubenswrapper[4907]: I0226 15:44:20.691870 4907 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-s9f9w" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"432281c6-dcf8-4471-9801-9194000a9abd\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T15:44:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T15:44:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T15:44:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T15:44:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a751c325fc4b5b8668afd084530efeddd36543db3710b4d5ab525dc8e572bb1e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T15:44:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wrq6z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://00c5078cb42e7e369ed71d8867be75c4f1bf4
73eae40d151eacbeda76980196c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T15:44:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wrq6z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-26T15:44:18Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-s9f9w\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T15:44:20Z is after 2025-08-24T17:21:41Z" Feb 26 15:44:20 crc kubenswrapper[4907]: I0226 15:44:20.716237 4907 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-26T15:44:18Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-26T15:44:18Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T15:44:20Z is after 2025-08-24T17:21:41Z" Feb 26 15:44:20 crc kubenswrapper[4907]: I0226 15:44:20.736128 4907 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-26T15:44:19Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-26T15:44:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-26T15:44:19Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b385be8ca84800beda307aea098ce9f4e640cd4b6c7bd2856c75b1a4193cb655\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T15:44:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://acf341c3480df31c1b94ef2f3feb5a3e7eef3fa85ef3292ad0e5ef70a4575cb3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T15:44:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T15:44:20Z is after 2025-08-24T17:21:41Z" Feb 26 15:44:20 crc kubenswrapper[4907]: I0226 15:44:20.761049 4907 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-26T15:44:18Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-26T15:44:18Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T15:44:20Z is after 2025-08-24T17:21:41Z" Feb 26 15:44:20 crc kubenswrapper[4907]: I0226 15:44:20.798958 4907 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-b2qgz" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3ab23cfe-46ea-420e-ba6c-38ac0d2804b0\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T15:44:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T15:44:18Z\\\",\\\"message\\\":\\\"containers with incomplete status: [cni-plugins bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T15:44:18Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T15:44:18Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2vfj6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://608b79bf33a420a12900e4bce6e593b17cfa7c3e9ebbcc9378833dce3a84e31d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://608b79bf33a420a12900e4bce6e593b17cfa7c3e9ebbcc9378833dce3a84e31d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-26T15:44:19Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-26T15:44:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Di
sabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2vfj6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e89433e3d1fc270f03f4dba736b947b987980198cfe9e4f66865ab6222ce82f4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T15:44:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2vfj6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath
\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2vfj6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2vfj6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2vfj6\\\",\\\"readOn
ly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2vfj6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-26T15:44:18Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-b2qgz\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T15:44:20Z is after 2025-08-24T17:21:41Z" Feb 26 15:44:20 crc kubenswrapper[4907]: I0226 15:44:20.824791 4907 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-v5ng6" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"917eebf3-db36-47b8-af0a-b80d042fddab\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T15:44:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T15:44:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T15:44:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T15:44:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7f195a8a6d014276c4202f3995d294fe5026b640273192a6f463642b79d4ddda\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T15:44:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9lmpq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://178aa71969c1efffd1f234213afe3cf84ffc1f83
00112efb368309603695c3ee\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T15:44:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9lmpq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-26T15:44:18Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-v5ng6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T15:44:20Z is after 2025-08-24T17:21:41Z" Feb 26 15:44:20 crc kubenswrapper[4907]: I0226 15:44:20.872237 4907 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-2gl5t" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"51024bd5-00ff-4e2f-927c-8c989b59d7be\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T15:44:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T15:44:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T15:44:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T15:44:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9a3cdc02208e8eab1e0c3c3f08a0759873ebfd63c98e64af187800d59a5b44da\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T15:44:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2fx5n\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-26T15:44:18Z\\\"}}\" for pod \"openshift-multus\"/\"multus-2gl5t\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T15:44:20Z is after 2025-08-24T17:21:41Z" Feb 26 15:44:20 crc kubenswrapper[4907]: I0226 15:44:20.919061 4907 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-vsvsw" err="failed to patch 
status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"49ee65e1-8667-4ad7-a403-c899f0cc6a70\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T15:44:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T15:44:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T15:44:18Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T15:44:18Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7hmb7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7hmb7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7hmb7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7hmb7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7hmb7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7hmb7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"wa
iting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7hmb7\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7hmb7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b7621667d7c9c119893fe930093d4e1d2256a13aadc196023df28d1a78aef68c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b7621667d7c9c119893fe930093d4e1d2256a13aadc196023df28d1a78aef68c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-26T15:44:18Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-26T15:44:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mou
ntPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7hmb7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-26T15:44:18Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-vsvsw\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T15:44:20Z is after 2025-08-24T17:21:41Z" Feb 26 15:44:20 crc kubenswrapper[4907]: I0226 15:44:20.943969 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/fd06f422-2c09-4da9-843c-75525df52517-metrics-certs\") pod \"network-metrics-daemon-zsb5l\" (UID: \"fd06f422-2c09-4da9-843c-75525df52517\") " pod="openshift-multus/network-metrics-daemon-zsb5l" Feb 26 15:44:20 crc kubenswrapper[4907]: E0226 15:44:20.944229 4907 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Feb 26 15:44:20 crc kubenswrapper[4907]: E0226 15:44:20.944343 4907 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/fd06f422-2c09-4da9-843c-75525df52517-metrics-certs podName:fd06f422-2c09-4da9-843c-75525df52517 nodeName:}" failed. No retries permitted until 2026-02-26 15:44:22.944315837 +0000 UTC m=+125.462877736 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/fd06f422-2c09-4da9-843c-75525df52517-metrics-certs") pod "network-metrics-daemon-zsb5l" (UID: "fd06f422-2c09-4da9-843c-75525df52517") : object "openshift-multus"/"metrics-daemon-secret" not registered Feb 26 15:44:20 crc kubenswrapper[4907]: I0226 15:44:20.961702 4907 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-26T15:44:18Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-26T15:44:18Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T15:44:20Z is after 2025-08-24T17:21:41Z" Feb 26 15:44:20 crc kubenswrapper[4907]: I0226 15:44:20.989641 4907 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-26T15:44:18Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-26T15:44:18Z\\\",\\\"message\\\":\\\"containers with unready status: 
[iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T15:44:20Z is after 2025-08-24T17:21:41Z" Feb 26 15:44:21 crc kubenswrapper[4907]: I0226 15:44:21.031104 4907 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"27c9ab80-fcc8-4c5a-9d89-c0504e0e6396\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T15:42:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T15:42:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T15:42:18Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T15:42:18Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T15:42:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bbc5e8c015ccc6b1a4740c955375e4f995f69ff1f1f698d8e2660ef451da6b8c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T15:42:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",
\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://64e8ac34f3cae799ba04d2bba51c22e4d99cf03261778fe3ba7a2320e661e727\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T15:42:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://42e24dea757f775f836c5c1fdb77c920db85f523bc0a35d2f2fb22e766274556\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T15:42:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a3c61b08bda7c918a3fa7b01e6f80515ee05a5746e189e829d2872c181b80c85\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e277
53fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a3c61b08bda7c918a3fa7b01e6f80515ee05a5746e189e829d2872c181b80c85\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-02-26T15:44:12Z\\\",\\\"message\\\":\\\"le observer\\\\nW0226 15:44:11.651017 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0226 15:44:11.651151 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0226 15:44:11.653054 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1720683088/tls.crt::/tmp/serving-cert-1720683088/tls.key\\\\\\\"\\\\nI0226 15:44:12.242500 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0226 15:44:12.245173 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0226 15:44:12.245192 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0226 15:44:12.245214 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0226 15:44:12.245219 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0226 15:44:12.248257 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0226 15:44:12.248276 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0226 15:44:12.248281 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0226 15:44:12.248286 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0226 15:44:12.248289 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0226 15:44:12.248292 1 secure_serving.go:69] 
Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0226 15:44:12.248295 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0226 15:44:12.248403 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF0226 15:44:12.250972 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-26T15:44:11Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":4,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 1m20s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8cf7bf0e49be4282c641d1e48be50a327bb418475701bfde61f4249724709e11\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T15:42:21Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7ff4ef3cac1d6f77bf9c90ee9a0f1d8fca15084e93afdb4e4e0048cbfe904f19\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7ff4ef3cac1d6f77bf9c90ee9a0f1d8fca15084e93afdb4e4e0048cbfe904f19\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-26T15:42:19Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-26T15:42:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-26T15:42:18Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T15:44:21Z is after 2025-08-24T17:21:41Z" Feb 26 15:44:21 crc kubenswrapper[4907]: I0226 15:44:21.070868 4907 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-26T15:44:19Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-26T15:44:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-26T15:44:19Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1e574efe4067ea713788905c2bd40d7ff4ed75353c577df5ee8ca730d5037434\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T15:44:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-02-26T15:44:21Z is after 2025-08-24T17:21:41Z" Feb 26 15:44:21 crc kubenswrapper[4907]: I0226 15:44:21.106532 4907 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-9gtgp" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ae882fbf-ac76-4363-a10c-60eaf80ee7c7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T15:44:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T15:44:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T15:44:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T15:44:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://78c4268a57d845c79f2bf6b5e3742785efea137f2b0b3c37cb1b6fc54274e30f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T15:44:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\
",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xl77m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-26T15:44:18Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-9gtgp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T15:44:21Z is after 2025-08-24T17:21:41Z" Feb 26 15:44:21 crc kubenswrapper[4907]: I0226 15:44:21.125908 4907 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-zsb5l" Feb 26 15:44:21 crc kubenswrapper[4907]: E0226 15:44:21.126054 4907 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-zsb5l" podUID="fd06f422-2c09-4da9-843c-75525df52517" Feb 26 15:44:21 crc kubenswrapper[4907]: I0226 15:44:21.148225 4907 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-zsb5l" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"fd06f422-2c09-4da9-843c-75525df52517\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T15:44:19Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T15:44:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T15:44:19Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T15:44:19Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dbhj9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dbhj9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-26T15:44:19Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-zsb5l\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T15:44:21Z is after 2025-08-24T17:21:41Z" Feb 26 15:44:21 crc 
kubenswrapper[4907]: I0226 15:44:21.190849 4907 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-958vt" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"4569fec7-a859-4a9e-b9d9-34ccc7c6be9c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T15:44:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T15:44:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T15:44:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T15:44:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1c9c60e926f3c2412b5a8698e82e161e6e34373a3e6b471698cb521b9e494871\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T15:44:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8nhj9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\
"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-26T15:44:18Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-958vt\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T15:44:21Z is after 2025-08-24T17:21:41Z" Feb 26 15:44:21 crc kubenswrapper[4907]: I0226 15:44:21.672358 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" event={"ID":"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49","Type":"ContainerStarted","Data":"e9637349a18a137859d53c939993c64cd1275117aeab8d855be9498820d9ec46"} Feb 26 15:44:21 crc kubenswrapper[4907]: I0226 15:44:21.675136 4907 generic.go:334] "Generic (PLEG): container finished" podID="3ab23cfe-46ea-420e-ba6c-38ac0d2804b0" containerID="e89433e3d1fc270f03f4dba736b947b987980198cfe9e4f66865ab6222ce82f4" exitCode=0 Feb 26 15:44:21 crc kubenswrapper[4907]: I0226 15:44:21.675286 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-b2qgz" event={"ID":"3ab23cfe-46ea-420e-ba6c-38ac0d2804b0","Type":"ContainerDied","Data":"e89433e3d1fc270f03f4dba736b947b987980198cfe9e4f66865ab6222ce82f4"} Feb 26 15:44:21 crc kubenswrapper[4907]: I0226 15:44:21.683824 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-vsvsw" event={"ID":"49ee65e1-8667-4ad7-a403-c899f0cc6a70","Type":"ContainerStarted","Data":"800657f54374550b21f96594e9c9ce4e7dff28c5c09061192a95bb8a668ebbea"} Feb 26 15:44:21 crc kubenswrapper[4907]: I0226 15:44:21.683885 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-ovn-kubernetes/ovnkube-node-vsvsw" event={"ID":"49ee65e1-8667-4ad7-a403-c899f0cc6a70","Type":"ContainerStarted","Data":"9e7470d80d872846d4d91e9070becfa3496dca8af1b315e637c34edce0dcd57b"} Feb 26 15:44:21 crc kubenswrapper[4907]: I0226 15:44:21.683904 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-vsvsw" event={"ID":"49ee65e1-8667-4ad7-a403-c899f0cc6a70","Type":"ContainerStarted","Data":"67439cebe8e10e13db8af6bc74e152eb562382fb3b2f026ba3cbfe42e3b4c921"} Feb 26 15:44:21 crc kubenswrapper[4907]: I0226 15:44:21.696025 4907 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-958vt" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"4569fec7-a859-4a9e-b9d9-34ccc7c6be9c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T15:44:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T15:44:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T15:44:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T15:44:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1c9c60e926f3c2412b5a8698e82e161e6e34373a3e6b471698cb521b9e494871\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:355
12335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T15:44:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8nhj9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-26T15:44:18Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-958vt\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T15:44:21Z is after 2025-08-24T17:21:41Z" Feb 26 15:44:21 crc kubenswrapper[4907]: I0226 15:44:21.727293 4907 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-26T15:44:18Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-26T15:44:18Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T15:44:21Z is after 2025-08-24T17:21:41Z" Feb 26 15:44:21 crc kubenswrapper[4907]: I0226 15:44:21.744445 4907 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-s9f9w" err="failed to patch 
status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"432281c6-dcf8-4471-9801-9194000a9abd\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T15:44:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T15:44:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T15:44:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T15:44:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a751c325fc4b5b8668afd084530efeddd36543db3710b4d5ab525dc8e572bb1e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T15:44:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wrq6z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://00c5078cb42e7e369ed71d8867be75
c4f1bf473eae40d151eacbeda76980196c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T15:44:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wrq6z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-26T15:44:18Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-s9f9w\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T15:44:21Z is after 2025-08-24T17:21:41Z" Feb 26 15:44:21 crc kubenswrapper[4907]: I0226 15:44:21.764338 4907 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-26T15:44:19Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-26T15:44:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-26T15:44:19Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b385be8ca84800beda307aea098ce9f4e640cd4b6c7bd2856c75b1a4193cb655\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T15:44:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://acf341c3480df31c1b94ef2f3feb5a3e7eef3fa85ef3292ad0e5ef70a4575cb3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T15:44:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T15:44:21Z is after 2025-08-24T17:21:41Z" Feb 26 15:44:21 crc kubenswrapper[4907]: I0226 15:44:21.785148 4907 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-26T15:44:18Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-26T15:44:18Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T15:44:21Z is after 2025-08-24T17:21:41Z" Feb 26 15:44:21 crc kubenswrapper[4907]: I0226 15:44:21.807902 4907 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-b2qgz" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3ab23cfe-46ea-420e-ba6c-38ac0d2804b0\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T15:44:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T15:44:18Z\\\",\\\"message\\\":\\\"containers with incomplete status: [cni-plugins bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T15:44:18Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T15:44:18Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2vfj6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://608b79bf33a420a12900e4bce6e593b17cfa7c3e9ebbcc9378833dce3a84e31d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://608b79bf33a420a12900e4bce6e593b17cfa7c3e9ebbcc9378833dce3a84e31d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-26T15:44:19Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-26T15:44:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Di
sabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2vfj6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e89433e3d1fc270f03f4dba736b947b987980198cfe9e4f66865ab6222ce82f4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T15:44:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2vfj6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath
\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2vfj6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2vfj6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2vfj6\\\",\\\"readOn
ly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2vfj6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-26T15:44:18Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-b2qgz\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T15:44:21Z is after 2025-08-24T17:21:41Z" Feb 26 15:44:21 crc kubenswrapper[4907]: I0226 15:44:21.826848 4907 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-v5ng6" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"917eebf3-db36-47b8-af0a-b80d042fddab\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T15:44:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T15:44:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T15:44:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T15:44:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7f195a8a6d014276c4202f3995d294fe5026b640273192a6f463642b79d4ddda\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T15:44:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9lmpq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://178aa71969c1efffd1f234213afe3cf84ffc1f83
00112efb368309603695c3ee\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T15:44:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9lmpq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-26T15:44:18Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-v5ng6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T15:44:21Z is after 2025-08-24T17:21:41Z" Feb 26 15:44:21 crc kubenswrapper[4907]: I0226 15:44:21.840252 4907 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-26T15:44:18Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-26T15:44:18Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T15:44:21Z is after 2025-08-24T17:21:41Z" Feb 26 15:44:21 crc kubenswrapper[4907]: I0226 15:44:21.853002 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 26 15:44:21 crc kubenswrapper[4907]: E0226 15:44:21.853096 4907 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-26 15:44:25.853074056 +0000 UTC m=+128.371635915 (durationBeforeRetry 4s). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 26 15:44:21 crc kubenswrapper[4907]: I0226 15:44:21.853189 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 26 15:44:21 crc kubenswrapper[4907]: I0226 15:44:21.853237 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 26 15:44:21 crc kubenswrapper[4907]: I0226 15:44:21.853303 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 26 15:44:21 crc kubenswrapper[4907]: I0226 15:44:21.853341 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: 
\"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 26 15:44:21 crc kubenswrapper[4907]: E0226 15:44:21.853370 4907 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Feb 26 15:44:21 crc kubenswrapper[4907]: E0226 15:44:21.853391 4907 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Feb 26 15:44:21 crc kubenswrapper[4907]: E0226 15:44:21.853428 4907 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Feb 26 15:44:21 crc kubenswrapper[4907]: E0226 15:44:21.853449 4907 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Feb 26 15:44:21 crc kubenswrapper[4907]: E0226 15:44:21.853470 4907 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2026-02-26 15:44:25.853459806 +0000 UTC m=+128.372021665 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Feb 26 15:44:21 crc kubenswrapper[4907]: E0226 15:44:21.853504 4907 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-02-26 15:44:25.853487007 +0000 UTC m=+128.372048896 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Feb 26 15:44:21 crc kubenswrapper[4907]: E0226 15:44:21.853643 4907 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Feb 26 15:44:21 crc kubenswrapper[4907]: E0226 15:44:21.853665 4907 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Feb 26 15:44:21 crc kubenswrapper[4907]: E0226 15:44:21.853682 4907 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Feb 26 15:44:21 crc kubenswrapper[4907]: 
E0226 15:44:21.853735 4907 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2026-02-26 15:44:25.853715253 +0000 UTC m=+128.372277152 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Feb 26 15:44:21 crc kubenswrapper[4907]: E0226 15:44:21.853838 4907 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Feb 26 15:44:21 crc kubenswrapper[4907]: E0226 15:44:21.853899 4907 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-02-26 15:44:25.853884788 +0000 UTC m=+128.372446677 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Feb 26 15:44:21 crc kubenswrapper[4907]: I0226 15:44:21.860444 4907 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-26T15:44:21Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-26T15:44:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-26T15:44:21Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e9637349a18a137859d53c939993c64cd1275117aeab8d855be9498820d9ec46\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T15:44:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":
\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T15:44:21Z is after 2025-08-24T17:21:41Z" Feb 26 15:44:21 crc kubenswrapper[4907]: I0226 15:44:21.876838 4907 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"27c9ab80-fcc8-4c5a-9d89-c0504e0e6396\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T15:42:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T15:42:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T15:42:18Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T15:42:18Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T15:42:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bbc5e8c015ccc6b1a4740c955375e4f995f69ff1f1f698d8e2660ef451da6b8c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T15:42:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://64e8ac34f3cae799ba04d2bba51c22e4d99cf03261778fe3ba7a2320e661e727\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T15:42:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://42e24dea757f775f836c5c1fdb77c920db85f523bc0a35d2f2fb22e766274556\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T15:42:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a3c61b08bda7c918a3fa7b01e6f80515ee05a5746e189e829d2872c181b80c85\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a3c61b08bda7c918a3fa7b01e6f80515ee05a5746e189e829d2872c181b80c85\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-02-26T15:44:12Z\\\",\\\"message\\\":\\\"le observer\\\\nW0226 15:44:11.651017 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0226 15:44:11.651151 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0226 15:44:11.653054 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-1720683088/tls.crt::/tmp/serving-cert-1720683088/tls.key\\\\\\\"\\\\nI0226 15:44:12.242500 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0226 15:44:12.245173 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0226 15:44:12.245192 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0226 15:44:12.245214 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0226 15:44:12.245219 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0226 15:44:12.248257 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0226 15:44:12.248276 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0226 15:44:12.248281 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0226 15:44:12.248286 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0226 15:44:12.248289 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0226 15:44:12.248292 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0226 15:44:12.248295 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0226 15:44:12.248403 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF0226 15:44:12.250972 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-26T15:44:11Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":4,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 1m20s restarting failed 
container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8cf7bf0e49be4282c641d1e48be50a327bb418475701bfde61f4249724709e11\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T15:42:21Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7ff4ef3cac1d6f77bf9c90ee9a0f1d8fca15084e93afdb4e4e0048cbfe904f19\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7ff4ef3cac1d6f77bf9c90ee9a0f1d8fca15084e93afdb4e4e0048cbfe904f19\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-26T15:42:19Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-26T15:42:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"n
ame\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-26T15:42:18Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T15:44:21Z is after 2025-08-24T17:21:41Z" Feb 26 15:44:21 crc kubenswrapper[4907]: I0226 15:44:21.894923 4907 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-26T15:44:19Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-26T15:44:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-26T15:44:19Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1e574efe4067ea713788905c2bd40d7ff4ed75353c577df5ee8ca730d5037434\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\
":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T15:44:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T15:44:21Z is after 2025-08-24T17:21:41Z" Feb 26 15:44:21 crc kubenswrapper[4907]: I0226 15:44:21.909270 4907 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-9gtgp" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ae882fbf-ac76-4363-a10c-60eaf80ee7c7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T15:44:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T15:44:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T15:44:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T15:44:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://78c4268a57d845c79f2bf6b5e3742785efea137f2b0b3c37cb1b6fc54274e30f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T15:44:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xl77m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-26T15:44:18Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-9gtgp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T15:44:21Z is after 2025-08-24T17:21:41Z" Feb 26 15:44:21 crc kubenswrapper[4907]: I0226 15:44:21.926078 4907 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-2gl5t" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"51024bd5-00ff-4e2f-927c-8c989b59d7be\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T15:44:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T15:44:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T15:44:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T15:44:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9a3cdc02208e8eab1e0c3c3f08a0759873ebfd63c98e64af187800d59a5b44da\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@s
ha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T15:44:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2fx5n\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"ph
ase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-26T15:44:18Z\\\"}}\" for pod \"openshift-multus\"/\"multus-2gl5t\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T15:44:21Z is after 2025-08-24T17:21:41Z" Feb 26 15:44:21 crc kubenswrapper[4907]: I0226 15:44:21.954790 4907 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-vsvsw" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"49ee65e1-8667-4ad7-a403-c899f0cc6a70\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T15:44:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T15:44:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T15:44:18Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T15:44:18Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7hmb7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7hmb7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7hmb7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7hmb7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7hmb7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7hmb7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"wa
iting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7hmb7\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7hmb7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b7621667d7c9c119893fe930093d4e1d2256a13aadc196023df28d1a78aef68c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b7621667d7c9c119893fe930093d4e1d2256a13aadc196023df28d1a78aef68c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-26T15:44:18Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-26T15:44:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mou
ntPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7hmb7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-26T15:44:18Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-vsvsw\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T15:44:21Z is after 2025-08-24T17:21:41Z" Feb 26 15:44:21 crc kubenswrapper[4907]: I0226 15:44:21.964484 4907 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-zsb5l" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"fd06f422-2c09-4da9-843c-75525df52517\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T15:44:19Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T15:44:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T15:44:19Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T15:44:19Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dbhj9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dbhj9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-26T15:44:19Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-zsb5l\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T15:44:21Z is after 2025-08-24T17:21:41Z" Feb 26 15:44:21 crc 
kubenswrapper[4907]: I0226 15:44:21.977120 4907 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-2gl5t" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"51024bd5-00ff-4e2f-927c-8c989b59d7be\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T15:44:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T15:44:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T15:44:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T15:44:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9a3cdc02208e8eab1e0c3c3f08a0759873ebfd63c98e64af187800d59a5b44da\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T15:44:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\"
:\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2fx5n\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-26T15:44:18Z\\\"}}\" for pod \"openshift-multus\"/\"multus-2gl5t\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T15:44:21Z is after 2025-08-24T17:21:41Z" Feb 26 15:44:21 crc 
kubenswrapper[4907]: I0226 15:44:21.994073 4907 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-vsvsw" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"49ee65e1-8667-4ad7-a403-c899f0cc6a70\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T15:44:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T15:44:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T15:44:18Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T15:44:18Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7hmb7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7hmb7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7hmb7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7hmb7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7hmb7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7hmb7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"wa
iting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7hmb7\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7hmb7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b7621667d7c9c119893fe930093d4e1d2256a13aadc196023df28d1a78aef68c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b7621667d7c9c119893fe930093d4e1d2256a13aadc196023df28d1a78aef68c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-26T15:44:18Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-26T15:44:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mou
ntPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7hmb7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-26T15:44:18Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-vsvsw\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T15:44:21Z is after 2025-08-24T17:21:41Z" Feb 26 15:44:22 crc kubenswrapper[4907]: I0226 15:44:22.008823 4907 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-26T15:44:18Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-26T15:44:18Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T15:44:22Z is after 2025-08-24T17:21:41Z" Feb 26 15:44:22 crc kubenswrapper[4907]: I0226 15:44:22.021360 4907 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-26T15:44:21Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-26T15:44:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-26T15:44:21Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e9637349a18a137859d53c939993c64cd1275117aeab8d855be9498820d9ec46\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T15:44:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-02-26T15:44:22Z is after 2025-08-24T17:21:41Z" Feb 26 15:44:22 crc kubenswrapper[4907]: I0226 15:44:22.035123 4907 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"27c9ab80-fcc8-4c5a-9d89-c0504e0e6396\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T15:42:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T15:42:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T15:42:18Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T15:42:18Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T15:42:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bbc5e8c015ccc6b1a4740c955375e4f995f69ff1f1f698d8e2660ef451da6b8c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T15:42:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://64e8ac34f3cae799ba04d2bba51c22e4d99cf03261778fe3ba7a2320e661e727\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T15:42:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://42e24dea757f775f836c5c1fdb77c920db85f523bc0a35d2f2fb22e766274556\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T15:42:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a3c61b08bda7c918a3fa7b01e6f80515ee05a5746e189e829d2872c181b80c85\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a3c61b08bda7c918a3fa7b01e6f80515ee05a5746e189e829d2872c181b80c85\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-02-26T15:44:12Z\\\",\\\"message\\\":\\\"le observer\\\\nW0226 15:44:11.651017 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0226 15:44:11.651151 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0226 15:44:11.653054 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-1720683088/tls.crt::/tmp/serving-cert-1720683088/tls.key\\\\\\\"\\\\nI0226 15:44:12.242500 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0226 15:44:12.245173 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0226 15:44:12.245192 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0226 15:44:12.245214 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0226 15:44:12.245219 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0226 15:44:12.248257 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0226 15:44:12.248276 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0226 15:44:12.248281 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0226 15:44:12.248286 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0226 15:44:12.248289 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0226 15:44:12.248292 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0226 15:44:12.248295 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0226 15:44:12.248403 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF0226 15:44:12.250972 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-26T15:44:11Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":4,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 1m20s restarting failed 
container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8cf7bf0e49be4282c641d1e48be50a327bb418475701bfde61f4249724709e11\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T15:42:21Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7ff4ef3cac1d6f77bf9c90ee9a0f1d8fca15084e93afdb4e4e0048cbfe904f19\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7ff4ef3cac1d6f77bf9c90ee9a0f1d8fca15084e93afdb4e4e0048cbfe904f19\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-26T15:42:19Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-26T15:42:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"n
ame\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-26T15:42:18Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T15:44:22Z is after 2025-08-24T17:21:41Z" Feb 26 15:44:22 crc kubenswrapper[4907]: I0226 15:44:22.047532 4907 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-26T15:44:19Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-26T15:44:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-26T15:44:19Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1e574efe4067ea713788905c2bd40d7ff4ed75353c577df5ee8ca730d5037434\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\
":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T15:44:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T15:44:22Z is after 2025-08-24T17:21:41Z" Feb 26 15:44:22 crc kubenswrapper[4907]: I0226 15:44:22.066506 4907 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-9gtgp" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ae882fbf-ac76-4363-a10c-60eaf80ee7c7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T15:44:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T15:44:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T15:44:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T15:44:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://78c4268a57d845c79f2bf6b5e3742785efea137f2b0b3c37cb1b6fc54274e30f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T15:44:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xl77m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-26T15:44:18Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-9gtgp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T15:44:22Z is after 2025-08-24T17:21:41Z" Feb 26 15:44:22 crc kubenswrapper[4907]: I0226 15:44:22.107885 4907 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-zsb5l" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"fd06f422-2c09-4da9-843c-75525df52517\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T15:44:19Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T15:44:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T15:44:19Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T15:44:19Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dbhj9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dbhj9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-26T15:44:19Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-zsb5l\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T15:44:22Z is after 2025-08-24T17:21:41Z" Feb 26 15:44:22 crc 
kubenswrapper[4907]: I0226 15:44:22.125634 4907 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 26 15:44:22 crc kubenswrapper[4907]: I0226 15:44:22.125657 4907 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 26 15:44:22 crc kubenswrapper[4907]: I0226 15:44:22.125657 4907 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 26 15:44:22 crc kubenswrapper[4907]: E0226 15:44:22.125736 4907 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 26 15:44:22 crc kubenswrapper[4907]: E0226 15:44:22.125871 4907 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 26 15:44:22 crc kubenswrapper[4907]: E0226 15:44:22.125955 4907 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 26 15:44:22 crc kubenswrapper[4907]: I0226 15:44:22.147458 4907 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-958vt" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"4569fec7-a859-4a9e-b9d9-34ccc7c6be9c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T15:44:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T15:44:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T15:44:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T15:44:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1c9c60e926f3c2412b5a8698e82e161e6e34373a3e6b471698cb521b9e494871\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T15:44:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets
/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8nhj9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-26T15:44:18Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-958vt\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T15:44:22Z is after 2025-08-24T17:21:41Z" Feb 26 15:44:22 crc kubenswrapper[4907]: I0226 15:44:22.186611 4907 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-s9f9w" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"432281c6-dcf8-4471-9801-9194000a9abd\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T15:44:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T15:44:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T15:44:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T15:44:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a75
1c325fc4b5b8668afd084530efeddd36543db3710b4d5ab525dc8e572bb1e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T15:44:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wrq6z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://00c5078cb42e7e369ed71d8867be75c4f1bf473eae40d151eacbeda76980196c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T15:44:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wrq6z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Runni
ng\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-26T15:44:18Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-s9f9w\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T15:44:22Z is after 2025-08-24T17:21:41Z" Feb 26 15:44:22 crc kubenswrapper[4907]: I0226 15:44:22.228069 4907 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-26T15:44:18Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-26T15:44:18Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod 
was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T15:44:22Z is after 2025-08-24T17:21:41Z" Feb 26 15:44:22 crc kubenswrapper[4907]: I0226 15:44:22.272302 4907 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-26T15:44:19Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-26T15:44:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-26T15:44:19Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b385be8ca84800beda307aea098ce9f4e640cd4b6c7bd2856c75b1a4193cb655\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T15:44:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://acf341c3480df31c1b94ef2f3feb5a3e7eef3fa85ef3292ad0e5ef70a4575cb3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T15:44:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T15:44:22Z is after 2025-08-24T17:21:41Z" Feb 26 15:44:22 crc kubenswrapper[4907]: I0226 15:44:22.311319 4907 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-26T15:44:18Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-26T15:44:18Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T15:44:22Z is after 2025-08-24T17:21:41Z" Feb 26 15:44:22 crc kubenswrapper[4907]: I0226 15:44:22.351453 4907 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-b2qgz" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3ab23cfe-46ea-420e-ba6c-38ac0d2804b0\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T15:44:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T15:44:18Z\\\",\\\"message\\\":\\\"containers with incomplete status: [bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T15:44:18Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T15:44:18Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2vfj6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://608b79bf33a420a12900e4bce6e593b17cfa7c3e9ebbcc9378833dce3a84e31d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://608b79bf33a420a12900e4bce6e593b17cfa7c3e9ebbcc9378833dce3a84e31d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-26T15:44:19Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-26T15:44:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Di
sabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2vfj6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e89433e3d1fc270f03f4dba736b947b987980198cfe9e4f66865ab6222ce82f4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e89433e3d1fc270f03f4dba736b947b987980198cfe9e4f66865ab6222ce82f4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-26T15:44:20Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-26T15:44:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2vfj6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodIn
itializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2vfj6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2vfj6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-
release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2vfj6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2vfj6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-26T15:44:18Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-b2qgz\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T15:44:22Z is after 2025-08-24T17:21:41Z" Feb 26 15:44:22 crc kubenswrapper[4907]: I0226 15:44:22.391129 4907 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-v5ng6" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"917eebf3-db36-47b8-af0a-b80d042fddab\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T15:44:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T15:44:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T15:44:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T15:44:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7f195a8a6d014276c4202f3995d294fe5026b640273192a6f463642b79d4ddda\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T15:44:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9lmpq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://178aa71969c1efffd1f234213afe3cf84ffc1f83
00112efb368309603695c3ee\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T15:44:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9lmpq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-26T15:44:18Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-v5ng6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T15:44:22Z is after 2025-08-24T17:21:41Z" Feb 26 15:44:22 crc kubenswrapper[4907]: I0226 15:44:22.691635 4907 generic.go:334] "Generic (PLEG): container finished" podID="3ab23cfe-46ea-420e-ba6c-38ac0d2804b0" containerID="e31f3856c094e119772c90aaa64b7decc756b6da339efc3d406daeaa8b274176" exitCode=0 Feb 26 15:44:22 crc kubenswrapper[4907]: I0226 15:44:22.691761 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-b2qgz" 
event={"ID":"3ab23cfe-46ea-420e-ba6c-38ac0d2804b0","Type":"ContainerDied","Data":"e31f3856c094e119772c90aaa64b7decc756b6da339efc3d406daeaa8b274176"} Feb 26 15:44:22 crc kubenswrapper[4907]: I0226 15:44:22.724466 4907 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-26T15:44:18Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-26T15:44:18Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T15:44:22Z is after 2025-08-24T17:21:41Z" Feb 26 15:44:22 crc kubenswrapper[4907]: I0226 15:44:22.738299 4907 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-s9f9w" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"432281c6-dcf8-4471-9801-9194000a9abd\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T15:44:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T15:44:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T15:44:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T15:44:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a751c325fc4b5b8668afd084530efeddd36543db3710b4d5ab525dc8e572bb1e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T15:44:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wrq6z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://00c5078cb42e7e369ed71d8867be75c4f1bf4
73eae40d151eacbeda76980196c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T15:44:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wrq6z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-26T15:44:18Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-s9f9w\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T15:44:22Z is after 2025-08-24T17:21:41Z" Feb 26 15:44:22 crc kubenswrapper[4907]: I0226 15:44:22.752070 4907 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-v5ng6" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"917eebf3-db36-47b8-af0a-b80d042fddab\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T15:44:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T15:44:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T15:44:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T15:44:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7f195a8a6d014276c4202f3995d294fe5026b640273192a6f463642b79d4ddda\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T15:44:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9lmpq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://178aa71969c1efffd1f234213afe3cf84ffc1f83
00112efb368309603695c3ee\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T15:44:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9lmpq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-26T15:44:18Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-v5ng6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T15:44:22Z is after 2025-08-24T17:21:41Z" Feb 26 15:44:22 crc kubenswrapper[4907]: I0226 15:44:22.767884 4907 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-26T15:44:19Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-26T15:44:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-26T15:44:19Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b385be8ca84800beda307aea098ce9f4e640cd4b6c7bd2856c75b1a4193cb655\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T15:44:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://acf341c3480df31c1b94ef2f3feb5a3e7eef3fa85ef3292ad0e5ef70a4575cb3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T15:44:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T15:44:22Z is after 2025-08-24T17:21:41Z" Feb 26 15:44:22 crc kubenswrapper[4907]: I0226 15:44:22.784612 4907 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-26T15:44:18Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-26T15:44:18Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T15:44:22Z is after 2025-08-24T17:21:41Z" Feb 26 15:44:22 crc kubenswrapper[4907]: I0226 15:44:22.801950 4907 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-b2qgz" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3ab23cfe-46ea-420e-ba6c-38ac0d2804b0\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T15:44:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T15:44:18Z\\\",\\\"message\\\":\\\"containers with incomplete status: [routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T15:44:18Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T15:44:18Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2vfj6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://608b79bf33a420a12900e4bce6e593b17cfa7c3e9ebbcc9378833dce3a84e31d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://608b79bf33a420a12900e4bce6e593b17cfa7c3e9ebbcc9378833dce3a84e31d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-26T15:44:19Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-26T15:44:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Di
sabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2vfj6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e89433e3d1fc270f03f4dba736b947b987980198cfe9e4f66865ab6222ce82f4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e89433e3d1fc270f03f4dba736b947b987980198cfe9e4f66865ab6222ce82f4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-26T15:44:20Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-26T15:44:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2vfj6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e31f3856c094e119772c90aaa64b7decc756b6da339efc3d406daeaa8b274176\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367
c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e31f3856c094e119772c90aaa64b7decc756b6da339efc3d406daeaa8b274176\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-26T15:44:22Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-26T15:44:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2vfj6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2vfj6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\
\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2vfj6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2vfj6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-26T15:44:18Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-b2qgz\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T15:44:22Z is after 2025-08-24T17:21:41Z" Feb 26 15:44:22 crc kubenswrapper[4907]: I0226 
15:44:22.812562 4907 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-9gtgp" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ae882fbf-ac76-4363-a10c-60eaf80ee7c7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T15:44:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T15:44:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T15:44:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T15:44:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://78c4268a57d845c79f2bf6b5e3742785efea137f2b0b3c37cb1b6fc54274e30f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T15:44:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xl77m\\\",\\\"readOnly\\\":t
rue,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-26T15:44:18Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-9gtgp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T15:44:22Z is after 2025-08-24T17:21:41Z" Feb 26 15:44:22 crc kubenswrapper[4907]: I0226 15:44:22.828669 4907 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-2gl5t" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"51024bd5-00ff-4e2f-927c-8c989b59d7be\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T15:44:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T15:44:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T15:44:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T15:44:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9a3cdc02208e8eab1e0c3c3f08a0759873ebfd63c98e64af187800d59a5b44da\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T15:44:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2fx5n\\\",\\\"readOnly\\\"
:true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-26T15:44:18Z\\\"}}\" for pod \"openshift-multus\"/\"multus-2gl5t\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T15:44:22Z is after 2025-08-24T17:21:41Z" Feb 26 15:44:22 crc kubenswrapper[4907]: I0226 15:44:22.851040 4907 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-vsvsw" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"49ee65e1-8667-4ad7-a403-c899f0cc6a70\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T15:44:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T15:44:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T15:44:18Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T15:44:18Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7hmb7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/v
ar/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7hmb7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7hmb7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7hmb7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},
{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7hmb7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7hmb7\\\",\\\"readOnly\\\":true,\\\
"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mount
Path\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7hmb7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7hmb7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b7621667d7c9c119893fe930093d4e1d2256a13aadc196023df28d1a78aef68c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\"
:{\\\"containerID\\\":\\\"cri-o://b7621667d7c9c119893fe930093d4e1d2256a13aadc196023df28d1a78aef68c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-26T15:44:18Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-26T15:44:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7hmb7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-26T15:44:18Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-vsvsw\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T15:44:22Z is after 2025-08-24T17:21:41Z" Feb 26 15:44:22 crc kubenswrapper[4907]: I0226 15:44:22.865005 4907 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-26T15:44:18Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-26T15:44:18Z\\\",\\\"message\\\":\\\"containers with 
unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T15:44:22Z is after 2025-08-24T17:21:41Z" Feb 26 15:44:22 crc kubenswrapper[4907]: I0226 15:44:22.877545 4907 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-26T15:44:21Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-26T15:44:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-26T15:44:21Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e9637349a18a137859d53c939993c64cd1275117aeab8d855be9498820d9ec46\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T15:44:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-02-26T15:44:22Z is after 2025-08-24T17:21:41Z" Feb 26 15:44:22 crc kubenswrapper[4907]: I0226 15:44:22.892020 4907 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"27c9ab80-fcc8-4c5a-9d89-c0504e0e6396\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T15:42:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T15:42:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T15:42:18Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T15:42:18Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T15:42:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bbc5e8c015ccc6b1a4740c955375e4f995f69ff1f1f698d8e2660ef451da6b8c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T15:42:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://64e8ac34f3cae799ba04d2bba51c22e4d99cf03261778fe3ba7a2320e661e727\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T15:42:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://42e24dea757f775f836c5c1fdb77c920db85f523bc0a35d2f2fb22e766274556\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T15:42:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a3c61b08bda7c918a3fa7b01e6f80515ee05a5746e189e829d2872c181b80c85\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a3c61b08bda7c918a3fa7b01e6f80515ee05a5746e189e829d2872c181b80c85\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-02-26T15:44:12Z\\\",\\\"message\\\":\\\"le observer\\\\nW0226 15:44:11.651017 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0226 15:44:11.651151 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0226 15:44:11.653054 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-1720683088/tls.crt::/tmp/serving-cert-1720683088/tls.key\\\\\\\"\\\\nI0226 15:44:12.242500 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0226 15:44:12.245173 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0226 15:44:12.245192 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0226 15:44:12.245214 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0226 15:44:12.245219 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0226 15:44:12.248257 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0226 15:44:12.248276 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0226 15:44:12.248281 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0226 15:44:12.248286 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0226 15:44:12.248289 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0226 15:44:12.248292 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0226 15:44:12.248295 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0226 15:44:12.248403 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF0226 15:44:12.250972 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-26T15:44:11Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":4,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 1m20s restarting failed 
container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8cf7bf0e49be4282c641d1e48be50a327bb418475701bfde61f4249724709e11\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T15:42:21Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7ff4ef3cac1d6f77bf9c90ee9a0f1d8fca15084e93afdb4e4e0048cbfe904f19\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7ff4ef3cac1d6f77bf9c90ee9a0f1d8fca15084e93afdb4e4e0048cbfe904f19\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-26T15:42:19Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-26T15:42:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"n
ame\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-26T15:42:18Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T15:44:22Z is after 2025-08-24T17:21:41Z" Feb 26 15:44:22 crc kubenswrapper[4907]: I0226 15:44:22.910525 4907 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-26T15:44:19Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-26T15:44:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-26T15:44:19Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1e574efe4067ea713788905c2bd40d7ff4ed75353c577df5ee8ca730d5037434\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\
":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T15:44:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T15:44:22Z is after 2025-08-24T17:21:41Z" Feb 26 15:44:22 crc kubenswrapper[4907]: I0226 15:44:22.946392 4907 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-zsb5l" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"fd06f422-2c09-4da9-843c-75525df52517\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T15:44:19Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T15:44:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T15:44:19Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T15:44:19Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dbhj9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dbhj9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-26T15:44:19Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-zsb5l\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T15:44:22Z is after 2025-08-24T17:21:41Z" Feb 26 15:44:22 crc kubenswrapper[4907]: I0226 15:44:22.963543 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/fd06f422-2c09-4da9-843c-75525df52517-metrics-certs\") pod \"network-metrics-daemon-zsb5l\" (UID: \"fd06f422-2c09-4da9-843c-75525df52517\") " pod="openshift-multus/network-metrics-daemon-zsb5l" Feb 26 15:44:22 crc kubenswrapper[4907]: E0226 15:44:22.963737 4907 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Feb 26 15:44:22 crc kubenswrapper[4907]: E0226 15:44:22.963790 4907 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/fd06f422-2c09-4da9-843c-75525df52517-metrics-certs podName:fd06f422-2c09-4da9-843c-75525df52517 nodeName:}" failed. No retries permitted until 2026-02-26 15:44:26.9637739 +0000 UTC m=+129.482335749 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/fd06f422-2c09-4da9-843c-75525df52517-metrics-certs") pod "network-metrics-daemon-zsb5l" (UID: "fd06f422-2c09-4da9-843c-75525df52517") : object "openshift-multus"/"metrics-daemon-secret" not registered Feb 26 15:44:22 crc kubenswrapper[4907]: I0226 15:44:22.985102 4907 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-958vt" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"4569fec7-a859-4a9e-b9d9-34ccc7c6be9c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T15:44:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T15:44:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T15:44:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T15:44:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1c9c60e926f3c2412b5a8698e82e161e6e34373a3e6b471698cb521b9e494871\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"st
artedAt\\\":\\\"2026-02-26T15:44:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8nhj9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-26T15:44:18Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-958vt\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T15:44:22Z is after 2025-08-24T17:21:41Z" Feb 26 15:44:23 crc kubenswrapper[4907]: I0226 15:44:23.126606 4907 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-zsb5l" Feb 26 15:44:23 crc kubenswrapper[4907]: E0226 15:44:23.126751 4907 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-zsb5l" podUID="fd06f422-2c09-4da9-843c-75525df52517" Feb 26 15:44:23 crc kubenswrapper[4907]: E0226 15:44:23.219934 4907 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
Feb 26 15:44:23 crc kubenswrapper[4907]: I0226 15:44:23.700100 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-vsvsw" event={"ID":"49ee65e1-8667-4ad7-a403-c899f0cc6a70","Type":"ContainerStarted","Data":"cc2b19d04bf2ef1455fa049ed09ef927305f1ec89b19b42f39b0d8c1397f69df"} Feb 26 15:44:23 crc kubenswrapper[4907]: I0226 15:44:23.703245 4907 generic.go:334] "Generic (PLEG): container finished" podID="3ab23cfe-46ea-420e-ba6c-38ac0d2804b0" containerID="bfc88f0a13f82a4a192745b9a3eac44fea007542c73923ca729d6fd6336c1851" exitCode=0 Feb 26 15:44:23 crc kubenswrapper[4907]: I0226 15:44:23.703296 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-b2qgz" event={"ID":"3ab23cfe-46ea-420e-ba6c-38ac0d2804b0","Type":"ContainerDied","Data":"bfc88f0a13f82a4a192745b9a3eac44fea007542c73923ca729d6fd6336c1851"} Feb 26 15:44:23 crc kubenswrapper[4907]: I0226 15:44:23.734268 4907 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-26T15:44:19Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-26T15:44:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-26T15:44:19Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b385be8ca84800beda307aea098ce9f4e640cd4b6c7bd2856c75b1a4193cb655\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T15:44:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://acf341c3480df31c1b94ef2f3feb5a3e7eef3fa85ef3292ad0e5ef70a4575cb3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T15:44:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T15:44:23Z is after 2025-08-24T17:21:41Z" Feb 26 15:44:23 crc kubenswrapper[4907]: I0226 15:44:23.752437 4907 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-26T15:44:18Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-26T15:44:18Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T15:44:23Z is after 2025-08-24T17:21:41Z" Feb 26 15:44:23 crc kubenswrapper[4907]: I0226 15:44:23.770343 4907 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-b2qgz" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3ab23cfe-46ea-420e-ba6c-38ac0d2804b0\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T15:44:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T15:44:18Z\\\",\\\"message\\\":\\\"containers with incomplete status: [whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T15:44:18Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T15:44:18Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2vfj6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://608b79bf33a420a12900e4bce6e593b17cfa7c3e9ebbcc9378833dce3a84e31d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://608b79bf33a420a12900e4bce6e593b17cfa7c3e9ebbcc9378833dce3a84e31d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-26T15:44:19Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-26T15:44:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Di
sabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2vfj6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e89433e3d1fc270f03f4dba736b947b987980198cfe9e4f66865ab6222ce82f4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e89433e3d1fc270f03f4dba736b947b987980198cfe9e4f66865ab6222ce82f4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-26T15:44:20Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-26T15:44:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2vfj6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e31f3856c094e119772c90aaa64b7decc756b6da339efc3d406daeaa8b274176\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367
c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e31f3856c094e119772c90aaa64b7decc756b6da339efc3d406daeaa8b274176\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-26T15:44:22Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-26T15:44:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2vfj6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bfc88f0a13f82a4a192745b9a3eac44fea007542c73923ca729d6fd6336c1851\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://bfc88f0a13f82a4a192745b9a3eac44fea007542c73923ca729d6fd6336c1851\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-26T15:44:22Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-26T15:44:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\
\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2vfj6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2vfj6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2vfj6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-26T15:44:18Z\\\"}}\" for pod 
\"openshift-multus\"/\"multus-additional-cni-plugins-b2qgz\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T15:44:23Z is after 2025-08-24T17:21:41Z" Feb 26 15:44:23 crc kubenswrapper[4907]: I0226 15:44:23.784482 4907 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-v5ng6" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"917eebf3-db36-47b8-af0a-b80d042fddab\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T15:44:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T15:44:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T15:44:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T15:44:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7f195a8a6d014276c4202f3995d294fe5026b640273192a6f463642b79d4ddda\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-pr
oxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T15:44:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9lmpq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://178aa71969c1efffd1f234213afe3cf84ffc1f8300112efb368309603695c3ee\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T15:44:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9lmpq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-26T15:44:18Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-v5ng6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 
2026-02-26T15:44:23Z is after 2025-08-24T17:21:41Z" Feb 26 15:44:23 crc kubenswrapper[4907]: I0226 15:44:23.797939 4907 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-26T15:44:18Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-26T15:44:18Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T15:44:23Z is after 2025-08-24T17:21:41Z" Feb 26 15:44:23 crc kubenswrapper[4907]: I0226 15:44:23.811983 4907 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-26T15:44:21Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-26T15:44:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-26T15:44:21Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e9637349a18a137859d53c939993c64cd1275117aeab8d855be9498820d9ec46\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T15:44:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-02-26T15:44:23Z is after 2025-08-24T17:21:41Z" Feb 26 15:44:23 crc kubenswrapper[4907]: I0226 15:44:23.829703 4907 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"27c9ab80-fcc8-4c5a-9d89-c0504e0e6396\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T15:42:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T15:42:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T15:42:18Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T15:42:18Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T15:42:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bbc5e8c015ccc6b1a4740c955375e4f995f69ff1f1f698d8e2660ef451da6b8c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T15:42:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://64e8ac34f3cae799ba04d2bba51c22e4d99cf03261778fe3ba7a2320e661e727\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T15:42:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://42e24dea757f775f836c5c1fdb77c920db85f523bc0a35d2f2fb22e766274556\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T15:42:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a3c61b08bda7c918a3fa7b01e6f80515ee05a5746e189e829d2872c181b80c85\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a3c61b08bda7c918a3fa7b01e6f80515ee05a5746e189e829d2872c181b80c85\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-02-26T15:44:12Z\\\",\\\"message\\\":\\\"le observer\\\\nW0226 15:44:11.651017 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0226 15:44:11.651151 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0226 15:44:11.653054 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-1720683088/tls.crt::/tmp/serving-cert-1720683088/tls.key\\\\\\\"\\\\nI0226 15:44:12.242500 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0226 15:44:12.245173 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0226 15:44:12.245192 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0226 15:44:12.245214 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0226 15:44:12.245219 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0226 15:44:12.248257 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0226 15:44:12.248276 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0226 15:44:12.248281 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0226 15:44:12.248286 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0226 15:44:12.248289 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0226 15:44:12.248292 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0226 15:44:12.248295 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0226 15:44:12.248403 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF0226 15:44:12.250972 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-26T15:44:11Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":4,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 1m20s restarting failed 
container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8cf7bf0e49be4282c641d1e48be50a327bb418475701bfde61f4249724709e11\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T15:42:21Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7ff4ef3cac1d6f77bf9c90ee9a0f1d8fca15084e93afdb4e4e0048cbfe904f19\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7ff4ef3cac1d6f77bf9c90ee9a0f1d8fca15084e93afdb4e4e0048cbfe904f19\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-26T15:42:19Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-26T15:42:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"n
ame\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-26T15:42:18Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T15:44:23Z is after 2025-08-24T17:21:41Z" Feb 26 15:44:23 crc kubenswrapper[4907]: I0226 15:44:23.845802 4907 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-26T15:44:19Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-26T15:44:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-26T15:44:19Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1e574efe4067ea713788905c2bd40d7ff4ed75353c577df5ee8ca730d5037434\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\
":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T15:44:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T15:44:23Z is after 2025-08-24T17:21:41Z" Feb 26 15:44:23 crc kubenswrapper[4907]: I0226 15:44:23.857696 4907 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-9gtgp" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ae882fbf-ac76-4363-a10c-60eaf80ee7c7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T15:44:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T15:44:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T15:44:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T15:44:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://78c4268a57d845c79f2bf6b5e3742785efea137f2b0b3c37cb1b6fc54274e30f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T15:44:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xl77m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-26T15:44:18Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-9gtgp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T15:44:23Z is after 2025-08-24T17:21:41Z" Feb 26 15:44:23 crc kubenswrapper[4907]: I0226 15:44:23.873690 4907 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-2gl5t" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"51024bd5-00ff-4e2f-927c-8c989b59d7be\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T15:44:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T15:44:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T15:44:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T15:44:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9a3cdc02208e8eab1e0c3c3f08a0759873ebfd63c98e64af187800d59a5b44da\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@s
ha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T15:44:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2fx5n\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"ph
ase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-26T15:44:18Z\\\"}}\" for pod \"openshift-multus\"/\"multus-2gl5t\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T15:44:23Z is after 2025-08-24T17:21:41Z" Feb 26 15:44:23 crc kubenswrapper[4907]: I0226 15:44:23.893461 4907 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-vsvsw" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"49ee65e1-8667-4ad7-a403-c899f0cc6a70\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T15:44:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T15:44:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T15:44:18Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T15:44:18Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7hmb7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7hmb7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7hmb7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7hmb7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7hmb7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7hmb7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"wa
iting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7hmb7\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7hmb7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b7621667d7c9c119893fe930093d4e1d2256a13aadc196023df28d1a78aef68c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b7621667d7c9c119893fe930093d4e1d2256a13aadc196023df28d1a78aef68c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-26T15:44:18Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-26T15:44:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mou
ntPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7hmb7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-26T15:44:18Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-vsvsw\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T15:44:23Z is after 2025-08-24T17:21:41Z" Feb 26 15:44:23 crc kubenswrapper[4907]: I0226 15:44:23.904717 4907 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-zsb5l" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"fd06f422-2c09-4da9-843c-75525df52517\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T15:44:19Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T15:44:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T15:44:19Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T15:44:19Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dbhj9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dbhj9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-26T15:44:19Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-zsb5l\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T15:44:23Z is after 2025-08-24T17:21:41Z" Feb 26 15:44:23 crc 
kubenswrapper[4907]: I0226 15:44:23.917447 4907 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-958vt" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"4569fec7-a859-4a9e-b9d9-34ccc7c6be9c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T15:44:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T15:44:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T15:44:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T15:44:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1c9c60e926f3c2412b5a8698e82e161e6e34373a3e6b471698cb521b9e494871\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T15:44:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8nhj9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\
"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-26T15:44:18Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-958vt\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T15:44:23Z is after 2025-08-24T17:21:41Z" Feb 26 15:44:23 crc kubenswrapper[4907]: I0226 15:44:23.932050 4907 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-26T15:44:18Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-26T15:44:18Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T15:44:23Z is after 2025-08-24T17:21:41Z" Feb 26 15:44:23 crc kubenswrapper[4907]: I0226 15:44:23.943980 4907 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-s9f9w" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"432281c6-dcf8-4471-9801-9194000a9abd\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T15:44:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T15:44:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T15:44:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T15:44:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a751c325fc4b5b8668afd084530efeddd36543db3710b4d5ab525dc8e572bb1e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T15:44:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wrq6z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://00c5078cb42e7e369ed71d8867be75c4f1bf4
73eae40d151eacbeda76980196c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T15:44:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wrq6z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-26T15:44:18Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-s9f9w\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T15:44:23Z is after 2025-08-24T17:21:41Z" Feb 26 15:44:24 crc kubenswrapper[4907]: I0226 15:44:24.127119 4907 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 26 15:44:24 crc kubenswrapper[4907]: I0226 15:44:24.127231 4907 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 26 15:44:24 crc kubenswrapper[4907]: E0226 15:44:24.127345 4907 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 26 15:44:24 crc kubenswrapper[4907]: I0226 15:44:24.127806 4907 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 26 15:44:24 crc kubenswrapper[4907]: E0226 15:44:24.127903 4907 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 26 15:44:24 crc kubenswrapper[4907]: E0226 15:44:24.127981 4907 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 26 15:44:24 crc kubenswrapper[4907]: I0226 15:44:24.710012 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-b2qgz" event={"ID":"3ab23cfe-46ea-420e-ba6c-38ac0d2804b0","Type":"ContainerStarted","Data":"73cba4d9193c3840f98e95371a1cda6f5264d73d631ef29664dfd1b0f9852b52"} Feb 26 15:44:24 crc kubenswrapper[4907]: I0226 15:44:24.730863 4907 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-26T15:44:18Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-26T15:44:18Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T15:44:24Z is after 2025-08-24T17:21:41Z" Feb 26 15:44:24 crc kubenswrapper[4907]: I0226 15:44:24.751629 4907 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-s9f9w" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"432281c6-dcf8-4471-9801-9194000a9abd\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T15:44:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T15:44:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T15:44:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T15:44:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a751c325fc4b5b8668afd084530efeddd36543db3710b4d5ab525dc8e572bb1e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T15:44:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wrq6z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://00c5078cb42e7e369ed71d8867be75c4f1bf4
73eae40d151eacbeda76980196c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T15:44:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wrq6z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-26T15:44:18Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-s9f9w\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T15:44:24Z is after 2025-08-24T17:21:41Z" Feb 26 15:44:24 crc kubenswrapper[4907]: I0226 15:44:24.772614 4907 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-26T15:44:19Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-26T15:44:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-26T15:44:19Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b385be8ca84800beda307aea098ce9f4e640cd4b6c7bd2856c75b1a4193cb655\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T15:44:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://acf341c3480df31c1b94ef2f3feb5a3e7eef3fa85ef3292ad0e5ef70a4575cb3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T15:44:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T15:44:24Z is after 2025-08-24T17:21:41Z" Feb 26 15:44:24 crc kubenswrapper[4907]: I0226 15:44:24.790241 4907 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-26T15:44:18Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-26T15:44:18Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T15:44:24Z is after 2025-08-24T17:21:41Z" Feb 26 15:44:24 crc kubenswrapper[4907]: I0226 15:44:24.804098 4907 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-b2qgz" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3ab23cfe-46ea-420e-ba6c-38ac0d2804b0\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T15:44:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T15:44:18Z\\\",\\\"message\\\":\\\"containers with incomplete status: [whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T15:44:18Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T15:44:18Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2vfj6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://608b79bf33a420a12900e4bce6e593b17cfa7c3e9ebbcc9378833dce3a84e31d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://608b79bf33a420a12900e4bce6e593b17cfa7c3e9ebbcc9378833dce3a84e31d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-26T15:44:19Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-26T15:44:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Di
sabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2vfj6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e89433e3d1fc270f03f4dba736b947b987980198cfe9e4f66865ab6222ce82f4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e89433e3d1fc270f03f4dba736b947b987980198cfe9e4f66865ab6222ce82f4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-26T15:44:20Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-26T15:44:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2vfj6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e31f3856c094e119772c90aaa64b7decc756b6da339efc3d406daeaa8b274176\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367
c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e31f3856c094e119772c90aaa64b7decc756b6da339efc3d406daeaa8b274176\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-26T15:44:22Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-26T15:44:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2vfj6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bfc88f0a13f82a4a192745b9a3eac44fea007542c73923ca729d6fd6336c1851\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://bfc88f0a13f82a4a192745b9a3eac44fea007542c73923ca729d6fd6336c1851\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-26T15:44:22Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-26T15:44:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\
\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2vfj6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://73cba4d9193c3840f98e95371a1cda6f5264d73d631ef29664dfd1b0f9852b52\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T15:44:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2vfj6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2vf
j6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-26T15:44:18Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-b2qgz\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T15:44:24Z is after 2025-08-24T17:21:41Z" Feb 26 15:44:24 crc kubenswrapper[4907]: I0226 15:44:24.814189 4907 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-v5ng6" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"917eebf3-db36-47b8-af0a-b80d042fddab\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T15:44:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T15:44:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T15:44:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T15:44:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7f195a8a6d014276c4202f3995d294fe5026b640273192a6f463642b79d4ddda\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426
f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T15:44:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9lmpq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://178aa71969c1efffd1f234213afe3cf84ffc1f8300112efb368309603695c3ee\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T15:44:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9lmpq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-26T15:44:18Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-v5ng6\": Internal error occurred: failed calling 
webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T15:44:24Z is after 2025-08-24T17:21:41Z" Feb 26 15:44:24 crc kubenswrapper[4907]: I0226 15:44:24.824813 4907 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-26T15:44:18Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-26T15:44:18Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T15:44:24Z is after 2025-08-24T17:21:41Z" Feb 26 15:44:24 crc kubenswrapper[4907]: I0226 15:44:24.837741 4907 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-26T15:44:21Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-26T15:44:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-26T15:44:21Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e9637349a18a137859d53c939993c64cd1275117aeab8d855be9498820d9ec46\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T15:44:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-02-26T15:44:24Z is after 2025-08-24T17:21:41Z" Feb 26 15:44:24 crc kubenswrapper[4907]: I0226 15:44:24.852380 4907 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"27c9ab80-fcc8-4c5a-9d89-c0504e0e6396\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T15:42:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T15:42:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T15:42:18Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T15:42:18Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T15:42:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bbc5e8c015ccc6b1a4740c955375e4f995f69ff1f1f698d8e2660ef451da6b8c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T15:42:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://64e8ac34f3cae799ba04d2bba51c22e4d99cf03261778fe3ba7a2320e661e727\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T15:42:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://42e24dea757f775f836c5c1fdb77c920db85f523bc0a35d2f2fb22e766274556\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T15:42:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a3c61b08bda7c918a3fa7b01e6f80515ee05a5746e189e829d2872c181b80c85\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a3c61b08bda7c918a3fa7b01e6f80515ee05a5746e189e829d2872c181b80c85\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-02-26T15:44:12Z\\\",\\\"message\\\":\\\"le observer\\\\nW0226 15:44:11.651017 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0226 15:44:11.651151 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0226 15:44:11.653054 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-1720683088/tls.crt::/tmp/serving-cert-1720683088/tls.key\\\\\\\"\\\\nI0226 15:44:12.242500 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0226 15:44:12.245173 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0226 15:44:12.245192 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0226 15:44:12.245214 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0226 15:44:12.245219 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0226 15:44:12.248257 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0226 15:44:12.248276 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0226 15:44:12.248281 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0226 15:44:12.248286 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0226 15:44:12.248289 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0226 15:44:12.248292 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0226 15:44:12.248295 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0226 15:44:12.248403 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF0226 15:44:12.250972 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-26T15:44:11Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":4,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 1m20s restarting failed 
container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8cf7bf0e49be4282c641d1e48be50a327bb418475701bfde61f4249724709e11\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T15:42:21Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7ff4ef3cac1d6f77bf9c90ee9a0f1d8fca15084e93afdb4e4e0048cbfe904f19\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7ff4ef3cac1d6f77bf9c90ee9a0f1d8fca15084e93afdb4e4e0048cbfe904f19\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-26T15:42:19Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-26T15:42:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"n
ame\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-26T15:42:18Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T15:44:24Z is after 2025-08-24T17:21:41Z" Feb 26 15:44:24 crc kubenswrapper[4907]: I0226 15:44:24.866052 4907 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-26T15:44:19Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-26T15:44:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-26T15:44:19Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1e574efe4067ea713788905c2bd40d7ff4ed75353c577df5ee8ca730d5037434\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\
":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T15:44:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T15:44:24Z is after 2025-08-24T17:21:41Z" Feb 26 15:44:24 crc kubenswrapper[4907]: I0226 15:44:24.880089 4907 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-9gtgp" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ae882fbf-ac76-4363-a10c-60eaf80ee7c7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T15:44:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T15:44:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T15:44:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T15:44:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://78c4268a57d845c79f2bf6b5e3742785efea137f2b0b3c37cb1b6fc54274e30f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T15:44:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xl77m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-26T15:44:18Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-9gtgp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T15:44:24Z is after 2025-08-24T17:21:41Z" Feb 26 15:44:24 crc kubenswrapper[4907]: I0226 15:44:24.896413 4907 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-2gl5t" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"51024bd5-00ff-4e2f-927c-8c989b59d7be\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T15:44:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T15:44:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T15:44:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T15:44:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9a3cdc02208e8eab1e0c3c3f08a0759873ebfd63c98e64af187800d59a5b44da\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@s
ha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T15:44:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2fx5n\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"ph
ase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-26T15:44:18Z\\\"}}\" for pod \"openshift-multus\"/\"multus-2gl5t\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T15:44:24Z is after 2025-08-24T17:21:41Z" Feb 26 15:44:24 crc kubenswrapper[4907]: I0226 15:44:24.915328 4907 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-vsvsw" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"49ee65e1-8667-4ad7-a403-c899f0cc6a70\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T15:44:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T15:44:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T15:44:18Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T15:44:18Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7hmb7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7hmb7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7hmb7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7hmb7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7hmb7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7hmb7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"wa
iting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7hmb7\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7hmb7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b7621667d7c9c119893fe930093d4e1d2256a13aadc196023df28d1a78aef68c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b7621667d7c9c119893fe930093d4e1d2256a13aadc196023df28d1a78aef68c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-26T15:44:18Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-26T15:44:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mou
ntPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7hmb7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-26T15:44:18Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-vsvsw\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T15:44:24Z is after 2025-08-24T17:21:41Z" Feb 26 15:44:24 crc kubenswrapper[4907]: I0226 15:44:24.925373 4907 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-zsb5l" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"fd06f422-2c09-4da9-843c-75525df52517\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T15:44:19Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T15:44:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T15:44:19Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T15:44:19Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dbhj9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dbhj9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-26T15:44:19Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-zsb5l\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T15:44:24Z is after 2025-08-24T17:21:41Z" Feb 26 15:44:24 crc 
kubenswrapper[4907]: I0226 15:44:24.935973 4907 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-958vt" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"4569fec7-a859-4a9e-b9d9-34ccc7c6be9c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T15:44:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T15:44:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T15:44:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T15:44:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1c9c60e926f3c2412b5a8698e82e161e6e34373a3e6b471698cb521b9e494871\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T15:44:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8nhj9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\
"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-26T15:44:18Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-958vt\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T15:44:24Z is after 2025-08-24T17:21:41Z" Feb 26 15:44:25 crc kubenswrapper[4907]: I0226 15:44:25.125885 4907 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-zsb5l" Feb 26 15:44:25 crc kubenswrapper[4907]: E0226 15:44:25.126280 4907 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-zsb5l" podUID="fd06f422-2c09-4da9-843c-75525df52517" Feb 26 15:44:25 crc kubenswrapper[4907]: I0226 15:44:25.719515 4907 generic.go:334] "Generic (PLEG): container finished" podID="3ab23cfe-46ea-420e-ba6c-38ac0d2804b0" containerID="73cba4d9193c3840f98e95371a1cda6f5264d73d631ef29664dfd1b0f9852b52" exitCode=0 Feb 26 15:44:25 crc kubenswrapper[4907]: I0226 15:44:25.719607 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-b2qgz" event={"ID":"3ab23cfe-46ea-420e-ba6c-38ac0d2804b0","Type":"ContainerDied","Data":"73cba4d9193c3840f98e95371a1cda6f5264d73d631ef29664dfd1b0f9852b52"} Feb 26 15:44:25 crc kubenswrapper[4907]: I0226 15:44:25.728552 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-vsvsw" event={"ID":"49ee65e1-8667-4ad7-a403-c899f0cc6a70","Type":"ContainerStarted","Data":"1e116520caca612676e0ffd45f87c01164c8b3c7e1b65b934126f251206d7901"} Feb 26 15:44:25 crc kubenswrapper[4907]: I0226 15:44:25.729569 4907 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-vsvsw" Feb 26 15:44:25 crc kubenswrapper[4907]: I0226 15:44:25.729672 4907 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-vsvsw" Feb 26 15:44:25 crc kubenswrapper[4907]: I0226 15:44:25.729694 4907 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-vsvsw" Feb 26 15:44:25 crc kubenswrapper[4907]: I0226 15:44:25.739081 4907 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-26T15:44:18Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-26T15:44:18Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T15:44:25Z is after 2025-08-24T17:21:41Z" Feb 26 15:44:25 crc kubenswrapper[4907]: I0226 15:44:25.755932 4907 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-s9f9w" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"432281c6-dcf8-4471-9801-9194000a9abd\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T15:44:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T15:44:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T15:44:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T15:44:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a751c325fc4b5b8668afd084530efeddd36543db3710b4d5ab525dc8e572bb1e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T15:44:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wrq6z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://00c5078cb42e7e369ed71d8867be75c4f1bf4
73eae40d151eacbeda76980196c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T15:44:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wrq6z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-26T15:44:18Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-s9f9w\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T15:44:25Z is after 2025-08-24T17:21:41Z" Feb 26 15:44:25 crc kubenswrapper[4907]: I0226 15:44:25.761305 4907 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-vsvsw" Feb 26 15:44:25 crc kubenswrapper[4907]: I0226 15:44:25.768325 4907 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-vsvsw" Feb 26 15:44:25 crc kubenswrapper[4907]: I0226 15:44:25.771889 4907 
status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-b2qgz" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3ab23cfe-46ea-420e-ba6c-38ac0d2804b0\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T15:44:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T15:44:18Z\\\",\\\"message\\\":\\\"containers with incomplete status: [whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T15:44:18Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T15:44:18Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2vfj6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://608b79bf33a420a12900e4bce6e593b17cfa7c3e9ebbcc9378833dce3a84e31d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://608b79bf33a420a12900e4bce6e593b17cfa7c3e9ebbcc9378833dce3a84e31d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-26T15:44:19Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-26T15:44:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Di
sabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2vfj6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e89433e3d1fc270f03f4dba736b947b987980198cfe9e4f66865ab6222ce82f4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e89433e3d1fc270f03f4dba736b947b987980198cfe9e4f66865ab6222ce82f4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-26T15:44:20Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-26T15:44:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2vfj6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e31f3856c094e119772c90aaa64b7decc756b6da339efc3d406daeaa8b274176\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367
c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e31f3856c094e119772c90aaa64b7decc756b6da339efc3d406daeaa8b274176\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-26T15:44:22Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-26T15:44:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2vfj6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bfc88f0a13f82a4a192745b9a3eac44fea007542c73923ca729d6fd6336c1851\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://bfc88f0a13f82a4a192745b9a3eac44fea007542c73923ca729d6fd6336c1851\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-26T15:44:22Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-26T15:44:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\
\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2vfj6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://73cba4d9193c3840f98e95371a1cda6f5264d73d631ef29664dfd1b0f9852b52\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://73cba4d9193c3840f98e95371a1cda6f5264d73d631ef29664dfd1b0f9852b52\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-26T15:44:25Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-26T15:44:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2vfj6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"c
nibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2vfj6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-26T15:44:18Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-b2qgz\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T15:44:25Z is after 2025-08-24T17:21:41Z" Feb 26 15:44:25 crc kubenswrapper[4907]: I0226 15:44:25.785173 4907 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-v5ng6" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"917eebf3-db36-47b8-af0a-b80d042fddab\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T15:44:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T15:44:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T15:44:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T15:44:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7f195a8a6d014276c4202f3995d294fe5026b640273192a6f463642b79d4ddda\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T15:44:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9lmpq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://178aa71969c1efffd1f234213afe3cf84ffc1f83
00112efb368309603695c3ee\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T15:44:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9lmpq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-26T15:44:18Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-v5ng6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T15:44:25Z is after 2025-08-24T17:21:41Z" Feb 26 15:44:25 crc kubenswrapper[4907]: I0226 15:44:25.801110 4907 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-26T15:44:19Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-26T15:44:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-26T15:44:19Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b385be8ca84800beda307aea098ce9f4e640cd4b6c7bd2856c75b1a4193cb655\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T15:44:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://acf341c3480df31c1b94ef2f3feb5a3e7eef3fa85ef3292ad0e5ef70a4575cb3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T15:44:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T15:44:25Z is after 2025-08-24T17:21:41Z" Feb 26 15:44:25 crc kubenswrapper[4907]: I0226 15:44:25.814623 4907 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-26T15:44:18Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-26T15:44:18Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T15:44:25Z is after 2025-08-24T17:21:41Z" Feb 26 15:44:25 crc kubenswrapper[4907]: I0226 15:44:25.825895 4907 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-26T15:44:19Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-26T15:44:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-26T15:44:19Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1e574efe4067ea713788905c2bd40d7ff4ed75353c577df5ee8ca730d5037434\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T15:44:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-02-26T15:44:25Z is after 2025-08-24T17:21:41Z" Feb 26 15:44:25 crc kubenswrapper[4907]: I0226 15:44:25.838440 4907 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-9gtgp" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ae882fbf-ac76-4363-a10c-60eaf80ee7c7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T15:44:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T15:44:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T15:44:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T15:44:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://78c4268a57d845c79f2bf6b5e3742785efea137f2b0b3c37cb1b6fc54274e30f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T15:44:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\
",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xl77m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-26T15:44:18Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-9gtgp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T15:44:25Z is after 2025-08-24T17:21:41Z" Feb 26 15:44:25 crc kubenswrapper[4907]: I0226 15:44:25.854018 4907 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-2gl5t" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"51024bd5-00ff-4e2f-927c-8c989b59d7be\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T15:44:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T15:44:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T15:44:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T15:44:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9a3cdc02208e8eab1e0c3c3f08a0759873ebfd63c98e64af187800d59a5b44da\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T15:44:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2fx5n\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-26T15:44:18Z\\\"}}\" for pod \"openshift-multus\"/\"multus-2gl5t\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T15:44:25Z is after 2025-08-24T17:21:41Z" Feb 26 15:44:25 crc kubenswrapper[4907]: I0226 15:44:25.875400 4907 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-vsvsw" err="failed to patch 
status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"49ee65e1-8667-4ad7-a403-c899f0cc6a70\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T15:44:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T15:44:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T15:44:18Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T15:44:18Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7hmb7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7hmb7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7hmb7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7hmb7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7hmb7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7hmb7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"wa
iting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7hmb7\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7hmb7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b7621667d7c9c119893fe930093d4e1d2256a13aadc196023df28d1a78aef68c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b7621667d7c9c119893fe930093d4e1d2256a13aadc196023df28d1a78aef68c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-26T15:44:18Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-26T15:44:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mou
ntPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7hmb7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-26T15:44:18Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-vsvsw\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T15:44:25Z is after 2025-08-24T17:21:41Z" Feb 26 15:44:25 crc kubenswrapper[4907]: I0226 15:44:25.889968 4907 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-26T15:44:18Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-26T15:44:18Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T15:44:25Z is after 2025-08-24T17:21:41Z" Feb 26 15:44:25 crc kubenswrapper[4907]: I0226 15:44:25.895643 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 26 15:44:25 crc kubenswrapper[4907]: I0226 15:44:25.895783 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 26 15:44:25 crc kubenswrapper[4907]: I0226 15:44:25.895805 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 26 15:44:25 crc kubenswrapper[4907]: I0226 15:44:25.895833 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 26 15:44:25 crc kubenswrapper[4907]: I0226 15:44:25.895853 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 26 15:44:25 crc kubenswrapper[4907]: E0226 15:44:25.895957 4907 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Feb 26 15:44:25 crc kubenswrapper[4907]: E0226 15:44:25.895971 4907 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: 
object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Feb 26 15:44:25 crc kubenswrapper[4907]: E0226 15:44:25.895980 4907 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Feb 26 15:44:25 crc kubenswrapper[4907]: E0226 15:44:25.896021 4907 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2026-02-26 15:44:33.896008151 +0000 UTC m=+136.414570000 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Feb 26 15:44:25 crc kubenswrapper[4907]: E0226 15:44:25.896550 4907 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-26 15:44:33.896538935 +0000 UTC m=+136.415100784 (durationBeforeRetry 8s). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 26 15:44:25 crc kubenswrapper[4907]: E0226 15:44:25.896619 4907 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Feb 26 15:44:25 crc kubenswrapper[4907]: E0226 15:44:25.896644 4907 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-02-26 15:44:33.896638568 +0000 UTC m=+136.415200417 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Feb 26 15:44:25 crc kubenswrapper[4907]: E0226 15:44:25.896670 4907 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Feb 26 15:44:25 crc kubenswrapper[4907]: E0226 15:44:25.896688 4907 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. 
No retries permitted until 2026-02-26 15:44:33.896682999 +0000 UTC m=+136.415244848 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Feb 26 15:44:25 crc kubenswrapper[4907]: E0226 15:44:25.896721 4907 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Feb 26 15:44:25 crc kubenswrapper[4907]: E0226 15:44:25.896729 4907 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Feb 26 15:44:25 crc kubenswrapper[4907]: E0226 15:44:25.896736 4907 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Feb 26 15:44:25 crc kubenswrapper[4907]: E0226 15:44:25.896753 4907 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2026-02-26 15:44:33.896748211 +0000 UTC m=+136.415310060 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Feb 26 15:44:25 crc kubenswrapper[4907]: I0226 15:44:25.903798 4907 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-26T15:44:21Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-26T15:44:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-26T15:44:21Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e9637349a18a137859d53c939993c64cd1275117aeab8d855be9498820d9ec46\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T15:44:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-scri
pt\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T15:44:25Z is after 2025-08-24T17:21:41Z" Feb 26 15:44:25 crc kubenswrapper[4907]: I0226 15:44:25.922135 4907 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"27c9ab80-fcc8-4c5a-9d89-c0504e0e6396\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T15:42:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T15:42:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T15:42:18Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T15:42:18Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T15:42:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bbc5e8c015ccc6b1a4740c955375e4f995f69ff1f1f698d8e2660ef451da6b8c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T15:42:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://64e8ac34f3cae799ba04d2bba51c22e4d99cf03261778fe3ba7a2320e661e727\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T15:42:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://42e24dea757f775f836c5c1fdb77c920db85f523bc0a35d2f2fb22e766274556\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T15:42:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a3c61b08bda7c918a3fa7b01e6f80515ee05a5746e189e829d2872c181b80c85\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a3c61b08bda7c918a3fa7b01e6f80515ee05a5746e189e829d2872c181b80c85\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-02-26T15:44:12Z\\\",\\\"message\\\":\\\"le observer\\\\nW0226 15:44:11.651017 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0226 15:44:11.651151 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0226 15:44:11.653054 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-1720683088/tls.crt::/tmp/serving-cert-1720683088/tls.key\\\\\\\"\\\\nI0226 15:44:12.242500 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0226 15:44:12.245173 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0226 15:44:12.245192 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0226 15:44:12.245214 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0226 15:44:12.245219 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0226 15:44:12.248257 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0226 15:44:12.248276 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0226 15:44:12.248281 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0226 15:44:12.248286 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0226 15:44:12.248289 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0226 15:44:12.248292 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0226 15:44:12.248295 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0226 15:44:12.248403 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF0226 15:44:12.250972 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-26T15:44:11Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":4,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 1m20s restarting failed 
container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8cf7bf0e49be4282c641d1e48be50a327bb418475701bfde61f4249724709e11\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T15:42:21Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7ff4ef3cac1d6f77bf9c90ee9a0f1d8fca15084e93afdb4e4e0048cbfe904f19\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7ff4ef3cac1d6f77bf9c90ee9a0f1d8fca15084e93afdb4e4e0048cbfe904f19\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-26T15:42:19Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-26T15:42:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"n
ame\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-26T15:42:18Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T15:44:25Z is after 2025-08-24T17:21:41Z" Feb 26 15:44:25 crc kubenswrapper[4907]: I0226 15:44:25.935558 4907 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-zsb5l" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"fd06f422-2c09-4da9-843c-75525df52517\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T15:44:19Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T15:44:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T15:44:19Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T15:44:19Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dbhj9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dbhj9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-26T15:44:19Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-zsb5l\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T15:44:25Z is after 2025-08-24T17:21:41Z" Feb 26 15:44:25 crc 
kubenswrapper[4907]: I0226 15:44:25.945303 4907 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-958vt" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"4569fec7-a859-4a9e-b9d9-34ccc7c6be9c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T15:44:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T15:44:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T15:44:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T15:44:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1c9c60e926f3c2412b5a8698e82e161e6e34373a3e6b471698cb521b9e494871\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T15:44:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8nhj9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\
"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-26T15:44:18Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-958vt\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T15:44:25Z is after 2025-08-24T17:21:41Z" Feb 26 15:44:25 crc kubenswrapper[4907]: I0226 15:44:25.955834 4907 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-26T15:44:18Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-26T15:44:18Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T15:44:25Z is after 2025-08-24T17:21:41Z" Feb 26 15:44:25 crc kubenswrapper[4907]: I0226 15:44:25.966178 4907 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-s9f9w" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"432281c6-dcf8-4471-9801-9194000a9abd\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T15:44:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T15:44:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T15:44:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T15:44:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a751c325fc4b5b8668afd084530efeddd36543db3710b4d5ab525dc8e572bb1e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T15:44:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wrq6z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://00c5078cb42e7e369ed71d8867be75c4f1bf4
73eae40d151eacbeda76980196c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T15:44:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wrq6z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-26T15:44:18Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-s9f9w\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T15:44:25Z is after 2025-08-24T17:21:41Z" Feb 26 15:44:25 crc kubenswrapper[4907]: I0226 15:44:25.977469 4907 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-26T15:44:18Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-26T15:44:18Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T15:44:25Z is after 2025-08-24T17:21:41Z" Feb 26 15:44:25 crc kubenswrapper[4907]: I0226 15:44:25.989927 4907 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-b2qgz" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3ab23cfe-46ea-420e-ba6c-38ac0d2804b0\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T15:44:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T15:44:18Z\\\",\\\"message\\\":\\\"containers with incomplete status: 
[whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T15:44:18Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T15:44:18Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2vfj6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://608b79bf33a420a12900e4bce6e593b17cfa7c3e9ebbcc9378833dce3a84e31d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containe
rID\\\":\\\"cri-o://608b79bf33a420a12900e4bce6e593b17cfa7c3e9ebbcc9378833dce3a84e31d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-26T15:44:19Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-26T15:44:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2vfj6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e89433e3d1fc270f03f4dba736b947b987980198cfe9e4f66865ab6222ce82f4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e89433e3d1fc270f03f4dba736b947b987980198cfe9e4f66865ab6222ce82f4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-26T15:44:20Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-26T15:44:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-
allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2vfj6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e31f3856c094e119772c90aaa64b7decc756b6da339efc3d406daeaa8b274176\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e31f3856c094e119772c90aaa64b7decc756b6da339efc3d406daeaa8b274176\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-26T15:44:22Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-26T15:44:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2vfj6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bfc88f0a13f82a4a192745b9a3eac44fea007542c73923ca729d6fd6336c1851\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":f
alse,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://bfc88f0a13f82a4a192745b9a3eac44fea007542c73923ca729d6fd6336c1851\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-26T15:44:22Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-26T15:44:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2vfj6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://73cba4d9193c3840f98e95371a1cda6f5264d73d631ef29664dfd1b0f9852b52\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://73cba4d9193c3840f98e95371a1cda6f5264d73d631ef29664dfd1b0f9852b52\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-26T15:44:25Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-26T15:44:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\
\\"kube-api-access-2vfj6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2vfj6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-26T15:44:18Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-b2qgz\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T15:44:25Z is after 2025-08-24T17:21:41Z" Feb 26 15:44:26 crc kubenswrapper[4907]: I0226 15:44:26.002353 4907 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-v5ng6" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"917eebf3-db36-47b8-af0a-b80d042fddab\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T15:44:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T15:44:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T15:44:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T15:44:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7f195a8a6d014276c4202f3995d294fe5026b640273192a6f463642b79d4ddda\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T15:44:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9lmpq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://178aa71969c1efffd1f234213afe3cf84ffc1f83
00112efb368309603695c3ee\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T15:44:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9lmpq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-26T15:44:18Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-v5ng6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T15:44:26Z is after 2025-08-24T17:21:41Z" Feb 26 15:44:26 crc kubenswrapper[4907]: I0226 15:44:26.014008 4907 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-26T15:44:19Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-26T15:44:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-26T15:44:19Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b385be8ca84800beda307aea098ce9f4e640cd4b6c7bd2856c75b1a4193cb655\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T15:44:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://acf341c3480df31c1b94ef2f3feb5a3e7eef3fa85ef3292ad0e5ef70a4575cb3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T15:44:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T15:44:26Z is after 2025-08-24T17:21:41Z" Feb 26 15:44:26 crc kubenswrapper[4907]: I0226 15:44:26.026824 4907 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"27c9ab80-fcc8-4c5a-9d89-c0504e0e6396\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T15:42:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T15:42:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T15:42:18Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T15:42:18Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T15:42:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bbc5e8c015ccc6b1a4740c955375e4f995f69ff1f1f698d8e2660ef451da6b8c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T15:42:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://64e8ac34f3cae799ba04d2bba51c22e4d99cf03261778fe3ba7a2320e661e727\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"
restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T15:42:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://42e24dea757f775f836c5c1fdb77c920db85f523bc0a35d2f2fb22e766274556\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T15:42:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a3c61b08bda7c918a3fa7b01e6f80515ee05a5746e189e829d2872c181b80c85\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a3c61b08bda7c918a3fa7b01e6f80515ee05a5746e189e829d2872c181b80c85\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-02-26T15:44:12Z\\\",\\\"message\\\":\\\"le observer\\\\nW0226 15:44:11.651017 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0226 15:44:11.651151 1 builder.go:304] check-endpoints version 
4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0226 15:44:11.653054 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1720683088/tls.crt::/tmp/serving-cert-1720683088/tls.key\\\\\\\"\\\\nI0226 15:44:12.242500 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0226 15:44:12.245173 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0226 15:44:12.245192 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0226 15:44:12.245214 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0226 15:44:12.245219 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0226 15:44:12.248257 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0226 15:44:12.248276 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0226 15:44:12.248281 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0226 15:44:12.248286 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0226 15:44:12.248289 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0226 15:44:12.248292 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0226 15:44:12.248295 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0226 15:44:12.248403 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF0226 15:44:12.250972 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-26T15:44:11Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":4,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 1m20s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8cf7bf0e49be4282c641d1e48be50a327bb418475701bfde61f4249724709e11\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T15:42:21Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7ff4ef3cac1d6f77bf9c90ee9a0f1d8fca15084e93afdb4e4e0048cbfe904f19\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7
ff4ef3cac1d6f77bf9c90ee9a0f1d8fca15084e93afdb4e4e0048cbfe904f19\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-26T15:42:19Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-26T15:42:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-26T15:42:18Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T15:44:26Z is after 2025-08-24T17:21:41Z" Feb 26 15:44:26 crc kubenswrapper[4907]: I0226 15:44:26.039624 4907 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-26T15:44:19Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-26T15:44:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-26T15:44:19Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1e574efe4067ea713788905c2bd40d7ff4ed75353c577df5ee8ca730d5037434\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T15:44:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-02-26T15:44:26Z is after 2025-08-24T17:21:41Z" Feb 26 15:44:26 crc kubenswrapper[4907]: I0226 15:44:26.048859 4907 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-9gtgp" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ae882fbf-ac76-4363-a10c-60eaf80ee7c7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T15:44:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T15:44:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T15:44:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T15:44:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://78c4268a57d845c79f2bf6b5e3742785efea137f2b0b3c37cb1b6fc54274e30f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T15:44:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\
",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xl77m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-26T15:44:18Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-9gtgp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T15:44:26Z is after 2025-08-24T17:21:41Z" Feb 26 15:44:26 crc kubenswrapper[4907]: I0226 15:44:26.063542 4907 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-2gl5t" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"51024bd5-00ff-4e2f-927c-8c989b59d7be\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T15:44:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T15:44:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T15:44:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T15:44:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9a3cdc02208e8eab1e0c3c3f08a0759873ebfd63c98e64af187800d59a5b44da\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T15:44:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2fx5n\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-26T15:44:18Z\\\"}}\" for pod \"openshift-multus\"/\"multus-2gl5t\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T15:44:26Z is after 2025-08-24T17:21:41Z" Feb 26 15:44:26 crc kubenswrapper[4907]: I0226 15:44:26.087447 4907 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-vsvsw" err="failed to patch 
status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"49ee65e1-8667-4ad7-a403-c899f0cc6a70\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T15:44:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T15:44:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T15:44:18Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T15:44:18Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c70ed6854442dfb329171dc5c454c036c020cb91e1f6595eb3fbe2d95704d52d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T15:44:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7hmb7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://67439cebe8e10e13db8af6bc74e152eb562382fb3b2f026ba3cbfe42e3b4c921\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T15:44:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7hmb7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://800657f54374550b21f96594e9c9ce4e7dff28c5c09061192a95bb8a668ebbea\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T15:44:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7hmb7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9e7470d80d872846d4d91e9070becfa3496dca8af1b315e637c34edce0dcd57b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T15:44:20Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7hmb7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://17760db3d112b908ad1389e3c28c244e756ef06ec2b4f170e4f52e17f9a75a89\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T15:44:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7hmb7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://eca4b7a72754f7457c608969c5319a498c526ab128b28400d2aed5d0413ff487\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T15:44:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7hmb7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1e116520caca612676e0ffd45f87c01164c8b3c7e1b65b934126f251206d7901\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T15:44:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnl
y\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7hmb7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://cc2b19d04bf2ef1455fa049ed09ef927305f1ec89b19b42f39b0d8c1397f69df\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b
17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T15:44:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7hmb7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b7621667d7c9c119893fe930093d4e1d2256a13aadc196023df28d1a78aef68c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b7621667d7c9c119893fe930093d4e1d2256a13aadc196023df28d1a78aef68c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-26T15:44:18Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-26T15:44:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7hmb7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Run
ning\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-26T15:44:18Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-vsvsw\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T15:44:26Z is after 2025-08-24T17:21:41Z" Feb 26 15:44:26 crc kubenswrapper[4907]: I0226 15:44:26.098206 4907 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-26T15:44:18Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-26T15:44:18Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T15:44:26Z is after 2025-08-24T17:21:41Z" Feb 26 15:44:26 crc kubenswrapper[4907]: I0226 15:44:26.114900 4907 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-26T15:44:21Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-26T15:44:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-26T15:44:21Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e9637349a18a137859d53c939993c64cd1275117aeab8d855be9498820d9ec46\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T15:44:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-02-26T15:44:26Z is after 2025-08-24T17:21:41Z" Feb 26 15:44:26 crc kubenswrapper[4907]: I0226 15:44:26.125174 4907 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-zsb5l" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"fd06f422-2c09-4da9-843c-75525df52517\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T15:44:19Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T15:44:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T15:44:19Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T15:44:19Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dbhj9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dbhj9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-26T15:44:19Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-zsb5l\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T15:44:26Z is after 2025-08-24T17:21:41Z" Feb 26 15:44:26 crc 
kubenswrapper[4907]: I0226 15:44:26.126276 4907 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 26 15:44:26 crc kubenswrapper[4907]: I0226 15:44:26.126308 4907 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 26 15:44:26 crc kubenswrapper[4907]: I0226 15:44:26.126331 4907 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 26 15:44:26 crc kubenswrapper[4907]: E0226 15:44:26.126387 4907 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 26 15:44:26 crc kubenswrapper[4907]: E0226 15:44:26.126476 4907 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 26 15:44:26 crc kubenswrapper[4907]: E0226 15:44:26.126568 4907 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 26 15:44:26 crc kubenswrapper[4907]: I0226 15:44:26.136788 4907 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-958vt" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"4569fec7-a859-4a9e-b9d9-34ccc7c6be9c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T15:44:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T15:44:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T15:44:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T15:44:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1c9c60e926f3c2412b5a8698e82e161e6e34373a3e6b471698cb521b9e494871\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T15:44:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets
/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8nhj9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-26T15:44:18Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-958vt\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T15:44:26Z is after 2025-08-24T17:21:41Z" Feb 26 15:44:26 crc kubenswrapper[4907]: I0226 15:44:26.159557 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 26 15:44:26 crc kubenswrapper[4907]: I0226 15:44:26.159625 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 26 15:44:26 crc kubenswrapper[4907]: I0226 15:44:26.159635 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 26 15:44:26 crc kubenswrapper[4907]: I0226 15:44:26.159654 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 26 15:44:26 crc kubenswrapper[4907]: I0226 15:44:26.159665 4907 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-26T15:44:26Z","lastTransitionTime":"2026-02-26T15:44:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 26 15:44:26 crc kubenswrapper[4907]: E0226 15:44:26.171930 4907 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"7800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"24148068Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"8\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"24608868Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-02-26T15:44:26Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-26T15:44:26Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-26T15:44:26Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-26T15:44:26Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-26T15:44:26Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-26T15:44:26Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-26T15:44:26Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-26T15:44:26Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"16aec221-b9ec-4b79-ac12-986d05cb9b8b\\\",\\\"systemUUID\\\":\\\"7af7b453-01c3-4b8b-8c30-b1df8ce070ce\\\"},\\\"runtimeHan
dlers\\\":[{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":false},\\\"name\\\":\\\"runc\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"crun\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"\\\"}]}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T15:44:26Z is after 2025-08-24T17:21:41Z" Feb 26 15:44:26 crc kubenswrapper[4907]: I0226 15:44:26.175332 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 26 15:44:26 crc kubenswrapper[4907]: I0226 15:44:26.175374 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 26 15:44:26 crc kubenswrapper[4907]: I0226 15:44:26.175385 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 26 15:44:26 crc kubenswrapper[4907]: I0226 15:44:26.175400 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 26 15:44:26 crc kubenswrapper[4907]: I0226 15:44:26.175411 4907 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-26T15:44:26Z","lastTransitionTime":"2026-02-26T15:44:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"}
Feb 26 15:44:26 crc kubenswrapper[4907]: E0226 15:44:26.233245 4907 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"7800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"24148068Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"8\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"24608868Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-02-26T15:44:26Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-26T15:44:26Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-26T15:44:26Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-26T15:44:26Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-26T15:44:26Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-26T15:44:26Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-26T15:44:26Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-26T15:44:26Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"16aec221-b9ec-4b79-ac12-986d05cb9b8b\\\",\\\"systemUUID\\\":\\\"7af7b453-01c3-4b8b-8c30-b1df8ce070ce\\\"},\\\"runtimeHan
dlers\\\":[{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":false},\\\"name\\\":\\\"runc\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"crun\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"\\\"}]}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T15:44:26Z is after 2025-08-24T17:21:41Z" Feb 26 15:44:26 crc kubenswrapper[4907]: I0226 15:44:26.257299 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 26 15:44:26 crc kubenswrapper[4907]: I0226 15:44:26.257342 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 26 15:44:26 crc kubenswrapper[4907]: I0226 15:44:26.257353 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 26 15:44:26 crc kubenswrapper[4907]: I0226 15:44:26.257371 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 26 15:44:26 crc kubenswrapper[4907]: I0226 15:44:26.257384 4907 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-26T15:44:26Z","lastTransitionTime":"2026-02-26T15:44:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 26 15:44:26 crc kubenswrapper[4907]: E0226 15:44:26.270868 4907 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"7800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"24148068Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"8\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"24608868Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-02-26T15:44:26Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-26T15:44:26Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-26T15:44:26Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-26T15:44:26Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-26T15:44:26Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-26T15:44:26Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-26T15:44:26Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-26T15:44:26Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"16aec221-b9ec-4b79-ac12-986d05cb9b8b\\\",\\\"systemUUID\\\":\\\"7af7b453-01c3-4b8b-8c30-b1df8ce070ce\\\"},\\\"runtimeHan
dlers\\\":[{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":false},\\\"name\\\":\\\"runc\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"crun\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"\\\"}]}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T15:44:26Z is after 2025-08-24T17:21:41Z" Feb 26 15:44:26 crc kubenswrapper[4907]: E0226 15:44:26.271030 4907 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Feb 26 15:44:26 crc kubenswrapper[4907]: I0226 15:44:26.739508 4907 generic.go:334] "Generic (PLEG): container finished" podID="3ab23cfe-46ea-420e-ba6c-38ac0d2804b0" containerID="bf00572269494256a1a7b40277ce094962baaa145f2147dde7870e4c19b8f688" exitCode=0 Feb 26 15:44:26 crc kubenswrapper[4907]: I0226 15:44:26.739696 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-b2qgz" event={"ID":"3ab23cfe-46ea-420e-ba6c-38ac0d2804b0","Type":"ContainerDied","Data":"bf00572269494256a1a7b40277ce094962baaa145f2147dde7870e4c19b8f688"} Feb 26 15:44:26 crc kubenswrapper[4907]: I0226 15:44:26.755980 4907 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-26T15:44:19Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-26T15:44:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-26T15:44:19Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b385be8ca84800beda307aea098ce9f4e640cd4b6c7bd2856c75b1a4193cb655\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T15:44:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://acf341c3480df31c1b94ef2f3feb5a3e7eef3fa85ef3292ad0e5ef70a4575cb3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T15:44:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T15:44:26Z is after 2025-08-24T17:21:41Z" Feb 26 15:44:26 crc kubenswrapper[4907]: I0226 15:44:26.768717 4907 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-26T15:44:18Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-26T15:44:18Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T15:44:26Z is after 2025-08-24T17:21:41Z" Feb 26 15:44:26 crc kubenswrapper[4907]: I0226 15:44:26.783556 4907 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-b2qgz" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3ab23cfe-46ea-420e-ba6c-38ac0d2804b0\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T15:44:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T15:44:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T15:44:18Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T15:44:18Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2vfj6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://608b79bf33a420a12900e4bce6e593b17cfa7c3e9ebbcc9378833dce3a84e31d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://608b79bf33a420a12900e4bce6e593b17cfa7c3e9ebbcc9378833dce3a84e31d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-26T15:44:19Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-26T15:44:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Di
sabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2vfj6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e89433e3d1fc270f03f4dba736b947b987980198cfe9e4f66865ab6222ce82f4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e89433e3d1fc270f03f4dba736b947b987980198cfe9e4f66865ab6222ce82f4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-26T15:44:20Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-26T15:44:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2vfj6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e31f3856c094e119772c90aaa64b7decc756b6da339efc3d406daeaa8b274176\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367
c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e31f3856c094e119772c90aaa64b7decc756b6da339efc3d406daeaa8b274176\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-26T15:44:22Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-26T15:44:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2vfj6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bfc88f0a13f82a4a192745b9a3eac44fea007542c73923ca729d6fd6336c1851\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://bfc88f0a13f82a4a192745b9a3eac44fea007542c73923ca729d6fd6336c1851\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-26T15:44:22Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-26T15:44:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\
\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2vfj6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://73cba4d9193c3840f98e95371a1cda6f5264d73d631ef29664dfd1b0f9852b52\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://73cba4d9193c3840f98e95371a1cda6f5264d73d631ef29664dfd1b0f9852b52\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-26T15:44:25Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-26T15:44:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2vfj6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bf00572269494256a1a7b40277ce094962baaa145f2147dde7870e4c19b8f688\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"
ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://bf00572269494256a1a7b40277ce094962baaa145f2147dde7870e4c19b8f688\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-26T15:44:26Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-26T15:44:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2vfj6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-26T15:44:18Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-b2qgz\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T15:44:26Z is after 2025-08-24T17:21:41Z" Feb 26 15:44:26 crc kubenswrapper[4907]: I0226 15:44:26.796813 4907 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-v5ng6" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"917eebf3-db36-47b8-af0a-b80d042fddab\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T15:44:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T15:44:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T15:44:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T15:44:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7f195a8a6d014276c4202f3995d294fe5026b640273192a6f463642b79d4ddda\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T15:44:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9lmpq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://178aa71969c1efffd1f234213afe3cf84ffc1f83
00112efb368309603695c3ee\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T15:44:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9lmpq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-26T15:44:18Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-v5ng6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T15:44:26Z is after 2025-08-24T17:21:41Z" Feb 26 15:44:26 crc kubenswrapper[4907]: I0226 15:44:26.810333 4907 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-26T15:44:18Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-26T15:44:18Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T15:44:26Z is after 2025-08-24T17:21:41Z" Feb 26 15:44:26 crc kubenswrapper[4907]: I0226 15:44:26.821979 4907 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-26T15:44:21Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-26T15:44:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-26T15:44:21Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e9637349a18a137859d53c939993c64cd1275117aeab8d855be9498820d9ec46\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T15:44:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-02-26T15:44:26Z is after 2025-08-24T17:21:41Z" Feb 26 15:44:26 crc kubenswrapper[4907]: I0226 15:44:26.837022 4907 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"27c9ab80-fcc8-4c5a-9d89-c0504e0e6396\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T15:42:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T15:42:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T15:42:18Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T15:42:18Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T15:42:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bbc5e8c015ccc6b1a4740c955375e4f995f69ff1f1f698d8e2660ef451da6b8c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T15:42:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://64e8ac34f3cae799ba04d2bba51c22e4d99cf03261778fe3ba7a2320e661e727\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T15:42:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://42e24dea757f775f836c5c1fdb77c920db85f523bc0a35d2f2fb22e766274556\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T15:42:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a3c61b08bda7c918a3fa7b01e6f80515ee05a5746e189e829d2872c181b80c85\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a3c61b08bda7c918a3fa7b01e6f80515ee05a5746e189e829d2872c181b80c85\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-02-26T15:44:12Z\\\",\\\"message\\\":\\\"le observer\\\\nW0226 15:44:11.651017 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0226 15:44:11.651151 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0226 15:44:11.653054 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-1720683088/tls.crt::/tmp/serving-cert-1720683088/tls.key\\\\\\\"\\\\nI0226 15:44:12.242500 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0226 15:44:12.245173 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0226 15:44:12.245192 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0226 15:44:12.245214 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0226 15:44:12.245219 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0226 15:44:12.248257 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0226 15:44:12.248276 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0226 15:44:12.248281 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0226 15:44:12.248286 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0226 15:44:12.248289 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0226 15:44:12.248292 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0226 15:44:12.248295 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0226 15:44:12.248403 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF0226 15:44:12.250972 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-26T15:44:11Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":4,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 1m20s restarting failed 
container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8cf7bf0e49be4282c641d1e48be50a327bb418475701bfde61f4249724709e11\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T15:42:21Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7ff4ef3cac1d6f77bf9c90ee9a0f1d8fca15084e93afdb4e4e0048cbfe904f19\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7ff4ef3cac1d6f77bf9c90ee9a0f1d8fca15084e93afdb4e4e0048cbfe904f19\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-26T15:42:19Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-26T15:42:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"n
ame\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-26T15:42:18Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T15:44:26Z is after 2025-08-24T17:21:41Z" Feb 26 15:44:26 crc kubenswrapper[4907]: I0226 15:44:26.851582 4907 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-26T15:44:19Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-26T15:44:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-26T15:44:19Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1e574efe4067ea713788905c2bd40d7ff4ed75353c577df5ee8ca730d5037434\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\
":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T15:44:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T15:44:26Z is after 2025-08-24T17:21:41Z" Feb 26 15:44:26 crc kubenswrapper[4907]: I0226 15:44:26.862230 4907 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-9gtgp" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ae882fbf-ac76-4363-a10c-60eaf80ee7c7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T15:44:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T15:44:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T15:44:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T15:44:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://78c4268a57d845c79f2bf6b5e3742785efea137f2b0b3c37cb1b6fc54274e30f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T15:44:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xl77m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-26T15:44:18Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-9gtgp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T15:44:26Z is after 2025-08-24T17:21:41Z" Feb 26 15:44:26 crc kubenswrapper[4907]: I0226 15:44:26.880777 4907 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-2gl5t" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"51024bd5-00ff-4e2f-927c-8c989b59d7be\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T15:44:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T15:44:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T15:44:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T15:44:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9a3cdc02208e8eab1e0c3c3f08a0759873ebfd63c98e64af187800d59a5b44da\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@s
ha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T15:44:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2fx5n\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"ph
ase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-26T15:44:18Z\\\"}}\" for pod \"openshift-multus\"/\"multus-2gl5t\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T15:44:26Z is after 2025-08-24T17:21:41Z" Feb 26 15:44:26 crc kubenswrapper[4907]: I0226 15:44:26.907731 4907 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-vsvsw" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"49ee65e1-8667-4ad7-a403-c899f0cc6a70\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T15:44:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T15:44:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T15:44:18Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T15:44:18Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c70ed6854442dfb329171dc5c454c036c020cb91e1f6595eb3fbe2d95704d52d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T15:44:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7hmb7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://67439cebe8e10e13db8af6bc74e152eb562382fb3b2f026ba3cbfe42e3b4c921\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T15:44:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7hmb7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://800657f54374550b21f96594e9c9ce4e7dff28c5c09061192a95bb8a668ebbea\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T15:44:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7hmb7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9e7470d80d872846d4d91e9070becfa3496dca8af1b315e637c34edce0dcd57b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T15:44:20Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7hmb7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://17760db3d112b908ad1389e3c28c244e756ef06ec2b4f170e4f52e17f9a75a89\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T15:44:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7hmb7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://eca4b7a72754f7457c608969c5319a498c526ab128b28400d2aed5d0413ff487\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T15:44:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7hmb7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1e116520caca612676e0ffd45f87c01164c8b3c7e1b65b934126f251206d7901\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T15:44:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnl
y\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7hmb7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://cc2b19d04bf2ef1455fa049ed09ef927305f1ec89b19b42f39b0d8c1397f69df\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b
17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T15:44:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7hmb7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b7621667d7c9c119893fe930093d4e1d2256a13aadc196023df28d1a78aef68c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b7621667d7c9c119893fe930093d4e1d2256a13aadc196023df28d1a78aef68c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-26T15:44:18Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-26T15:44:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7hmb7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Run
ning\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-26T15:44:18Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-vsvsw\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T15:44:26Z is after 2025-08-24T17:21:41Z" Feb 26 15:44:26 crc kubenswrapper[4907]: I0226 15:44:26.919879 4907 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-zsb5l" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"fd06f422-2c09-4da9-843c-75525df52517\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T15:44:19Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T15:44:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T15:44:19Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T15:44:19Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dbhj9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dbhj9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-26T15:44:19Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-zsb5l\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T15:44:26Z is after 2025-08-24T17:21:41Z" Feb 26 15:44:26 crc 
kubenswrapper[4907]: I0226 15:44:26.931170 4907 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-958vt" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"4569fec7-a859-4a9e-b9d9-34ccc7c6be9c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T15:44:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T15:44:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T15:44:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T15:44:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1c9c60e926f3c2412b5a8698e82e161e6e34373a3e6b471698cb521b9e494871\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T15:44:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8nhj9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\
"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-26T15:44:18Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-958vt\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T15:44:26Z is after 2025-08-24T17:21:41Z" Feb 26 15:44:26 crc kubenswrapper[4907]: I0226 15:44:26.943914 4907 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-26T15:44:18Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-26T15:44:18Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T15:44:26Z is after 2025-08-24T17:21:41Z" Feb 26 15:44:26 crc kubenswrapper[4907]: I0226 15:44:26.954451 4907 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-s9f9w" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"432281c6-dcf8-4471-9801-9194000a9abd\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T15:44:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T15:44:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T15:44:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T15:44:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a751c325fc4b5b8668afd084530efeddd36543db3710b4d5ab525dc8e572bb1e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T15:44:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wrq6z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://00c5078cb42e7e369ed71d8867be75c4f1bf4
73eae40d151eacbeda76980196c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T15:44:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wrq6z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-26T15:44:18Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-s9f9w\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T15:44:26Z is after 2025-08-24T17:21:41Z" Feb 26 15:44:27 crc kubenswrapper[4907]: I0226 15:44:27.007212 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/fd06f422-2c09-4da9-843c-75525df52517-metrics-certs\") pod \"network-metrics-daemon-zsb5l\" (UID: \"fd06f422-2c09-4da9-843c-75525df52517\") " pod="openshift-multus/network-metrics-daemon-zsb5l" Feb 26 15:44:27 crc kubenswrapper[4907]: E0226 15:44:27.007371 
4907 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Feb 26 15:44:27 crc kubenswrapper[4907]: E0226 15:44:27.007441 4907 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/fd06f422-2c09-4da9-843c-75525df52517-metrics-certs podName:fd06f422-2c09-4da9-843c-75525df52517 nodeName:}" failed. No retries permitted until 2026-02-26 15:44:35.007427104 +0000 UTC m=+137.525988953 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/fd06f422-2c09-4da9-843c-75525df52517-metrics-certs") pod "network-metrics-daemon-zsb5l" (UID: "fd06f422-2c09-4da9-843c-75525df52517") : object "openshift-multus"/"metrics-daemon-secret" not registered Feb 26 15:44:27 crc kubenswrapper[4907]: I0226 15:44:27.126233 4907 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-zsb5l" Feb 26 15:44:27 crc kubenswrapper[4907]: E0226 15:44:27.126373 4907 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-zsb5l" podUID="fd06f422-2c09-4da9-843c-75525df52517" Feb 26 15:44:27 crc kubenswrapper[4907]: I0226 15:44:27.749525 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-b2qgz" event={"ID":"3ab23cfe-46ea-420e-ba6c-38ac0d2804b0","Type":"ContainerStarted","Data":"fa50b3ce686f099f6b9ed4dcb642c118a6294d2e92cfdbf59339d106c9052d1a"} Feb 26 15:44:27 crc kubenswrapper[4907]: I0226 15:44:27.772728 4907 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-vsvsw" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"49ee65e1-8667-4ad7-a403-c899f0cc6a70\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T15:44:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T15:44:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T15:44:18Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T15:44:18Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c70ed6854442dfb329171dc5c454c036c020cb91e1f6595eb3fbe2d95704d52d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T15:44:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7hmb7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://67439cebe8e10e13db8af6bc74e152eb562382fb3b2f026ba3cbfe42e3b4c921\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T15:44:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7hmb7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://800657f54374550b21f96594e9c9ce4e7dff28c5c09061192a95bb8a668ebbea\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T15:44:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7hmb7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9e7470d80d872846d4d91e9070becfa3496dca8af1b315e637c34edce0dcd57b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T15:44:20Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7hmb7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://17760db3d112b908ad1389e3c28c244e756ef06ec2b4f170e4f52e17f9a75a89\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T15:44:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7hmb7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://eca4b7a72754f7457c608969c5319a498c526ab128b28400d2aed5d0413ff487\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T15:44:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7hmb7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1e116520caca612676e0ffd45f87c01164c8b3c7e1b65b934126f251206d7901\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T15:44:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnl
y\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7hmb7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://cc2b19d04bf2ef1455fa049ed09ef927305f1ec89b19b42f39b0d8c1397f69df\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b
17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T15:44:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7hmb7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b7621667d7c9c119893fe930093d4e1d2256a13aadc196023df28d1a78aef68c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b7621667d7c9c119893fe930093d4e1d2256a13aadc196023df28d1a78aef68c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-26T15:44:18Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-26T15:44:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7hmb7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Run
ning\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-26T15:44:18Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-vsvsw\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T15:44:27Z is after 2025-08-24T17:21:41Z" Feb 26 15:44:27 crc kubenswrapper[4907]: I0226 15:44:27.787929 4907 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-26T15:44:18Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-26T15:44:18Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T15:44:27Z is after 2025-08-24T17:21:41Z" Feb 26 15:44:27 crc kubenswrapper[4907]: I0226 15:44:27.799931 4907 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-26T15:44:21Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-26T15:44:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-26T15:44:21Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e9637349a18a137859d53c939993c64cd1275117aeab8d855be9498820d9ec46\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T15:44:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-02-26T15:44:27Z is after 2025-08-24T17:21:41Z" Feb 26 15:44:27 crc kubenswrapper[4907]: I0226 15:44:27.811315 4907 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"27c9ab80-fcc8-4c5a-9d89-c0504e0e6396\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T15:42:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T15:42:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T15:42:18Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T15:42:18Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T15:42:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bbc5e8c015ccc6b1a4740c955375e4f995f69ff1f1f698d8e2660ef451da6b8c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T15:42:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://64e8ac34f3cae799ba04d2bba51c22e4d99cf03261778fe3ba7a2320e661e727\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T15:42:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://42e24dea757f775f836c5c1fdb77c920db85f523bc0a35d2f2fb22e766274556\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T15:42:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a3c61b08bda7c918a3fa7b01e6f80515ee05a5746e189e829d2872c181b80c85\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a3c61b08bda7c918a3fa7b01e6f80515ee05a5746e189e829d2872c181b80c85\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-02-26T15:44:12Z\\\",\\\"message\\\":\\\"le observer\\\\nW0226 15:44:11.651017 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0226 15:44:11.651151 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0226 15:44:11.653054 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-1720683088/tls.crt::/tmp/serving-cert-1720683088/tls.key\\\\\\\"\\\\nI0226 15:44:12.242500 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0226 15:44:12.245173 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0226 15:44:12.245192 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0226 15:44:12.245214 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0226 15:44:12.245219 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0226 15:44:12.248257 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0226 15:44:12.248276 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0226 15:44:12.248281 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0226 15:44:12.248286 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0226 15:44:12.248289 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0226 15:44:12.248292 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0226 15:44:12.248295 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0226 15:44:12.248403 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF0226 15:44:12.250972 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-26T15:44:11Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":4,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 1m20s restarting failed 
container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8cf7bf0e49be4282c641d1e48be50a327bb418475701bfde61f4249724709e11\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T15:42:21Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7ff4ef3cac1d6f77bf9c90ee9a0f1d8fca15084e93afdb4e4e0048cbfe904f19\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7ff4ef3cac1d6f77bf9c90ee9a0f1d8fca15084e93afdb4e4e0048cbfe904f19\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-26T15:42:19Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-26T15:42:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"n
ame\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-26T15:42:18Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T15:44:27Z is after 2025-08-24T17:21:41Z" Feb 26 15:44:27 crc kubenswrapper[4907]: I0226 15:44:27.824026 4907 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-26T15:44:19Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-26T15:44:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-26T15:44:19Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1e574efe4067ea713788905c2bd40d7ff4ed75353c577df5ee8ca730d5037434\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\
":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T15:44:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T15:44:27Z is after 2025-08-24T17:21:41Z" Feb 26 15:44:27 crc kubenswrapper[4907]: I0226 15:44:27.834184 4907 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-9gtgp" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ae882fbf-ac76-4363-a10c-60eaf80ee7c7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T15:44:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T15:44:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T15:44:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T15:44:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://78c4268a57d845c79f2bf6b5e3742785efea137f2b0b3c37cb1b6fc54274e30f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T15:44:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xl77m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-26T15:44:18Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-9gtgp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T15:44:27Z is after 2025-08-24T17:21:41Z" Feb 26 15:44:27 crc kubenswrapper[4907]: I0226 15:44:27.843895 4907 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-2gl5t" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"51024bd5-00ff-4e2f-927c-8c989b59d7be\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T15:44:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T15:44:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T15:44:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T15:44:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9a3cdc02208e8eab1e0c3c3f08a0759873ebfd63c98e64af187800d59a5b44da\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@s
ha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T15:44:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2fx5n\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"ph
ase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-26T15:44:18Z\\\"}}\" for pod \"openshift-multus\"/\"multus-2gl5t\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T15:44:27Z is after 2025-08-24T17:21:41Z" Feb 26 15:44:27 crc kubenswrapper[4907]: I0226 15:44:27.852667 4907 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-zsb5l" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"fd06f422-2c09-4da9-843c-75525df52517\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T15:44:19Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T15:44:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T15:44:19Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T15:44:19Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dbhj9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dbhj9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-26T15:44:19Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-zsb5l\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T15:44:27Z is after 2025-08-24T17:21:41Z" Feb 26 15:44:27 crc 
kubenswrapper[4907]: I0226 15:44:27.860957 4907 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-958vt" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"4569fec7-a859-4a9e-b9d9-34ccc7c6be9c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T15:44:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T15:44:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T15:44:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T15:44:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1c9c60e926f3c2412b5a8698e82e161e6e34373a3e6b471698cb521b9e494871\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T15:44:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8nhj9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\
"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-26T15:44:18Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-958vt\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T15:44:27Z is after 2025-08-24T17:21:41Z" Feb 26 15:44:27 crc kubenswrapper[4907]: I0226 15:44:27.870804 4907 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-26T15:44:18Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-26T15:44:18Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T15:44:27Z is after 2025-08-24T17:21:41Z" Feb 26 15:44:27 crc kubenswrapper[4907]: I0226 15:44:27.879543 4907 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-s9f9w" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"432281c6-dcf8-4471-9801-9194000a9abd\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T15:44:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T15:44:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T15:44:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T15:44:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a751c325fc4b5b8668afd084530efeddd36543db3710b4d5ab525dc8e572bb1e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T15:44:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wrq6z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://00c5078cb42e7e369ed71d8867be75c4f1bf4
73eae40d151eacbeda76980196c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T15:44:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wrq6z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-26T15:44:18Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-s9f9w\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T15:44:27Z is after 2025-08-24T17:21:41Z" Feb 26 15:44:27 crc kubenswrapper[4907]: I0226 15:44:27.889265 4907 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-26T15:44:19Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-26T15:44:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-26T15:44:19Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b385be8ca84800beda307aea098ce9f4e640cd4b6c7bd2856c75b1a4193cb655\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T15:44:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://acf341c3480df31c1b94ef2f3feb5a3e7eef3fa85ef3292ad0e5ef70a4575cb3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T15:44:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T15:44:27Z is after 2025-08-24T17:21:41Z" Feb 26 15:44:27 crc kubenswrapper[4907]: I0226 15:44:27.899168 4907 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-26T15:44:18Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-26T15:44:18Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T15:44:27Z is after 2025-08-24T17:21:41Z" Feb 26 15:44:27 crc kubenswrapper[4907]: I0226 15:44:27.914111 4907 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-b2qgz" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3ab23cfe-46ea-420e-ba6c-38ac0d2804b0\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T15:44:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T15:44:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T15:44:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T15:44:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fa50b3ce686f099f6b9ed4dcb642c118a6294d2e92cfdbf59339d106c9052d1a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T15:44:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2vfj6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://608b79bf33a420a12900e4bce6e593b17cfa7c3e9ebbcc9378833dce3a84e31d\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://608b79bf33a420a12900e4bce6e593b17cfa7c3e9ebbcc9378833dce3a84e31d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-26T15:44:19Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-26T15:44:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2vfj6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e89433e3d1fc270f03f4dba736b947b987980198cfe9e4f66865ab6222ce82f4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e89433e3d1fc270f03f4dba736b947b987980198cfe9e4f66865ab6222ce82f4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-26T15:44:20Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-26T15:44:19Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2vfj6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e31f3856c094e119772c90aaa64b7decc756b6da339efc3d406daeaa8b274176\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e31f3856c094e119772c90aaa64b7decc756b6da339efc3d406daeaa8b274176\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-26T15:44:22Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-26T15:44:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2vfj6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bfc88
f0a13f82a4a192745b9a3eac44fea007542c73923ca729d6fd6336c1851\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://bfc88f0a13f82a4a192745b9a3eac44fea007542c73923ca729d6fd6336c1851\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-26T15:44:22Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-26T15:44:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2vfj6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://73cba4d9193c3840f98e95371a1cda6f5264d73d631ef29664dfd1b0f9852b52\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://73cba4d9193c3840f98e95371a1cda6f5264d73d631ef29664dfd1b0f9852b52\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-26T15:44:25Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2026-02-26T15:44:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2vfj6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bf00572269494256a1a7b40277ce094962baaa145f2147dde7870e4c19b8f688\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://bf00572269494256a1a7b40277ce094962baaa145f2147dde7870e4c19b8f688\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-26T15:44:26Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-26T15:44:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2vfj6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-26T15:44:18Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-b2qgz\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T15:44:27Z is after 2025-08-24T17:21:41Z" Feb 26 15:44:27 crc kubenswrapper[4907]: I0226 15:44:27.926107 4907 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-v5ng6" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"917eebf3-db36-47b8-af0a-b80d042fddab\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T15:44:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T15:44:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T15:44:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T15:44:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7f195a8a6d014276c4202f3995d294fe5026b640273192a6f463642b79d4ddda\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"runn
ing\\\":{\\\"startedAt\\\":\\\"2026-02-26T15:44:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9lmpq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://178aa71969c1efffd1f234213afe3cf84ffc1f8300112efb368309603695c3ee\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T15:44:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9lmpq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-26T15:44:18Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-v5ng6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T15:44:27Z is after 2025-08-24T17:21:41Z" Feb 26 15:44:28 crc kubenswrapper[4907]: 
I0226 15:44:28.126181 4907 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 26 15:44:28 crc kubenswrapper[4907]: E0226 15:44:28.126330 4907 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 26 15:44:28 crc kubenswrapper[4907]: I0226 15:44:28.126353 4907 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 26 15:44:28 crc kubenswrapper[4907]: E0226 15:44:28.126465 4907 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 26 15:44:28 crc kubenswrapper[4907]: I0226 15:44:28.126544 4907 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 26 15:44:28 crc kubenswrapper[4907]: E0226 15:44:28.126659 4907 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 26 15:44:28 crc kubenswrapper[4907]: I0226 15:44:28.139131 4907 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-26T15:44:19Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-26T15:44:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-26T15:44:19Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b385be8ca84800beda307aea098ce9f4e640cd4b6c7bd2856c75b1a4193cb655\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T15:44:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":
\\\"cri-o://acf341c3480df31c1b94ef2f3feb5a3e7eef3fa85ef3292ad0e5ef70a4575cb3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T15:44:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T15:44:28Z is after 2025-08-24T17:21:41Z" Feb 26 15:44:28 crc kubenswrapper[4907]: I0226 15:44:28.152494 4907 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-26T15:44:18Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-26T15:44:18Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T15:44:28Z is after 2025-08-24T17:21:41Z" Feb 26 15:44:28 crc kubenswrapper[4907]: I0226 15:44:28.170642 4907 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-b2qgz" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3ab23cfe-46ea-420e-ba6c-38ac0d2804b0\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T15:44:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T15:44:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T15:44:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T15:44:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fa50b3ce686f099f6b9ed4dcb642c118a6294d2e92cfdbf59339d106c9052d1a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T15:44:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2vfj6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://608b79bf33a420a12900e4bce6e593b17cfa7c3e9ebbcc9378833dce3a84e31d\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://608b79bf33a420a12900e4bce6e593b17cfa7c3e9ebbcc9378833dce3a84e31d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-26T15:44:19Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-26T15:44:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2vfj6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e89433e3d1fc270f03f4dba736b947b987980198cfe9e4f66865ab6222ce82f4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e89433e3d1fc270f03f4dba736b947b987980198cfe9e4f66865ab6222ce82f4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-26T15:44:20Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-26T15:44:19Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2vfj6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e31f3856c094e119772c90aaa64b7decc756b6da339efc3d406daeaa8b274176\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e31f3856c094e119772c90aaa64b7decc756b6da339efc3d406daeaa8b274176\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-26T15:44:22Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-26T15:44:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2vfj6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bfc88
f0a13f82a4a192745b9a3eac44fea007542c73923ca729d6fd6336c1851\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://bfc88f0a13f82a4a192745b9a3eac44fea007542c73923ca729d6fd6336c1851\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-26T15:44:22Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-26T15:44:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2vfj6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://73cba4d9193c3840f98e95371a1cda6f5264d73d631ef29664dfd1b0f9852b52\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://73cba4d9193c3840f98e95371a1cda6f5264d73d631ef29664dfd1b0f9852b52\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-26T15:44:25Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2026-02-26T15:44:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2vfj6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bf00572269494256a1a7b40277ce094962baaa145f2147dde7870e4c19b8f688\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://bf00572269494256a1a7b40277ce094962baaa145f2147dde7870e4c19b8f688\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-26T15:44:26Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-26T15:44:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2vfj6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-26T15:44:18Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-b2qgz\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T15:44:28Z is after 2025-08-24T17:21:41Z" Feb 26 15:44:28 crc kubenswrapper[4907]: I0226 15:44:28.191131 4907 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-v5ng6" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"917eebf3-db36-47b8-af0a-b80d042fddab\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T15:44:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T15:44:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T15:44:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T15:44:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7f195a8a6d014276c4202f3995d294fe5026b640273192a6f463642b79d4ddda\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"runn
ing\\\":{\\\"startedAt\\\":\\\"2026-02-26T15:44:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9lmpq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://178aa71969c1efffd1f234213afe3cf84ffc1f8300112efb368309603695c3ee\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T15:44:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9lmpq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-26T15:44:18Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-v5ng6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T15:44:28Z is after 2025-08-24T17:21:41Z" Feb 26 15:44:28 crc kubenswrapper[4907]: 
E0226 15:44:28.220718 4907 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" Feb 26 15:44:28 crc kubenswrapper[4907]: I0226 15:44:28.223632 4907 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-vsvsw" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"49ee65e1-8667-4ad7-a403-c899f0cc6a70\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T15:44:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T15:44:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T15:44:18Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T15:44:18Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c70ed6854442dfb329171dc5c454c036c020cb91e1f6595eb3fbe2d95704d52d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T15:44:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7hmb7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://67439cebe8e10e13db8af6bc74e152eb562382fb3b2f026ba3cbfe42e3b4c921\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T15:44:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7hmb7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://800657f54374550b21f96594e9c9ce4e7dff28c5c09061192a95bb8a668ebbea\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T15:44:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7hmb7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9e7470d80d872846d4d91e9070becfa3496dca8af1b315e637c34edce0dcd57b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T15:44:20Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7hmb7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://17760db3d112b908ad1389e3c28c244e756ef06ec2b4f170e4f52e17f9a75a89\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T15:44:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7hmb7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://eca4b7a72754f7457c608969c5319a498c526ab128b28400d2aed5d0413ff487\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T15:44:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7hmb7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1e116520caca612676e0ffd45f87c01164c8b3c7e1b65b934126f251206d7901\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T15:44:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnl
y\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7hmb7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://cc2b19d04bf2ef1455fa049ed09ef927305f1ec89b19b42f39b0d8c1397f69df\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b
17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T15:44:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7hmb7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b7621667d7c9c119893fe930093d4e1d2256a13aadc196023df28d1a78aef68c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b7621667d7c9c119893fe930093d4e1d2256a13aadc196023df28d1a78aef68c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-26T15:44:18Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-26T15:44:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7hmb7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Run
ning\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-26T15:44:18Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-vsvsw\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T15:44:28Z is after 2025-08-24T17:21:41Z" Feb 26 15:44:28 crc kubenswrapper[4907]: I0226 15:44:28.240814 4907 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-26T15:44:18Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-26T15:44:18Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T15:44:28Z is after 2025-08-24T17:21:41Z" Feb 26 15:44:28 crc kubenswrapper[4907]: I0226 15:44:28.256175 4907 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-26T15:44:21Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-26T15:44:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-26T15:44:21Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e9637349a18a137859d53c939993c64cd1275117aeab8d855be9498820d9ec46\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T15:44:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-02-26T15:44:28Z is after 2025-08-24T17:21:41Z" Feb 26 15:44:28 crc kubenswrapper[4907]: I0226 15:44:28.274408 4907 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"27c9ab80-fcc8-4c5a-9d89-c0504e0e6396\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T15:42:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T15:42:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T15:42:18Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T15:42:18Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T15:42:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bbc5e8c015ccc6b1a4740c955375e4f995f69ff1f1f698d8e2660ef451da6b8c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T15:42:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://64e8ac34f3cae799ba04d2bba51c22e4d99cf03261778fe3ba7a2320e661e727\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T15:42:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://42e24dea757f775f836c5c1fdb77c920db85f523bc0a35d2f2fb22e766274556\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T15:42:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a3c61b08bda7c918a3fa7b01e6f80515ee05a5746e189e829d2872c181b80c85\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a3c61b08bda7c918a3fa7b01e6f80515ee05a5746e189e829d2872c181b80c85\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-02-26T15:44:12Z\\\",\\\"message\\\":\\\"le observer\\\\nW0226 15:44:11.651017 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0226 15:44:11.651151 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0226 15:44:11.653054 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-1720683088/tls.crt::/tmp/serving-cert-1720683088/tls.key\\\\\\\"\\\\nI0226 15:44:12.242500 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0226 15:44:12.245173 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0226 15:44:12.245192 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0226 15:44:12.245214 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0226 15:44:12.245219 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0226 15:44:12.248257 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0226 15:44:12.248276 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0226 15:44:12.248281 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0226 15:44:12.248286 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0226 15:44:12.248289 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0226 15:44:12.248292 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0226 15:44:12.248295 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0226 15:44:12.248403 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF0226 15:44:12.250972 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-26T15:44:11Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":4,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 1m20s restarting failed 
container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8cf7bf0e49be4282c641d1e48be50a327bb418475701bfde61f4249724709e11\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T15:42:21Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7ff4ef3cac1d6f77bf9c90ee9a0f1d8fca15084e93afdb4e4e0048cbfe904f19\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7ff4ef3cac1d6f77bf9c90ee9a0f1d8fca15084e93afdb4e4e0048cbfe904f19\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-26T15:42:19Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-26T15:42:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"n
ame\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-26T15:42:18Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T15:44:28Z is after 2025-08-24T17:21:41Z" Feb 26 15:44:28 crc kubenswrapper[4907]: I0226 15:44:28.290556 4907 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-26T15:44:19Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-26T15:44:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-26T15:44:19Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1e574efe4067ea713788905c2bd40d7ff4ed75353c577df5ee8ca730d5037434\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\
":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T15:44:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T15:44:28Z is after 2025-08-24T17:21:41Z" Feb 26 15:44:28 crc kubenswrapper[4907]: I0226 15:44:28.299248 4907 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-9gtgp" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ae882fbf-ac76-4363-a10c-60eaf80ee7c7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T15:44:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T15:44:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T15:44:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T15:44:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://78c4268a57d845c79f2bf6b5e3742785efea137f2b0b3c37cb1b6fc54274e30f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T15:44:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xl77m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-26T15:44:18Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-9gtgp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T15:44:28Z is after 2025-08-24T17:21:41Z" Feb 26 15:44:28 crc kubenswrapper[4907]: I0226 15:44:28.313870 4907 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-2gl5t" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"51024bd5-00ff-4e2f-927c-8c989b59d7be\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T15:44:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T15:44:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T15:44:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T15:44:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9a3cdc02208e8eab1e0c3c3f08a0759873ebfd63c98e64af187800d59a5b44da\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@s
ha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T15:44:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2fx5n\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"ph
ase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-26T15:44:18Z\\\"}}\" for pod \"openshift-multus\"/\"multus-2gl5t\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T15:44:28Z is after 2025-08-24T17:21:41Z" Feb 26 15:44:28 crc kubenswrapper[4907]: I0226 15:44:28.329170 4907 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-zsb5l" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"fd06f422-2c09-4da9-843c-75525df52517\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T15:44:19Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T15:44:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T15:44:19Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T15:44:19Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dbhj9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dbhj9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-26T15:44:19Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-zsb5l\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T15:44:28Z is after 2025-08-24T17:21:41Z" Feb 26 15:44:28 crc 
kubenswrapper[4907]: I0226 15:44:28.341375 4907 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-958vt" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"4569fec7-a859-4a9e-b9d9-34ccc7c6be9c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T15:44:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T15:44:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T15:44:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T15:44:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1c9c60e926f3c2412b5a8698e82e161e6e34373a3e6b471698cb521b9e494871\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T15:44:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8nhj9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\
"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-26T15:44:18Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-958vt\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T15:44:28Z is after 2025-08-24T17:21:41Z" Feb 26 15:44:28 crc kubenswrapper[4907]: I0226 15:44:28.355565 4907 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-26T15:44:18Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-26T15:44:18Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T15:44:28Z is after 2025-08-24T17:21:41Z" Feb 26 15:44:28 crc kubenswrapper[4907]: I0226 15:44:28.368029 4907 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-s9f9w" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"432281c6-dcf8-4471-9801-9194000a9abd\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T15:44:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T15:44:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T15:44:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T15:44:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a751c325fc4b5b8668afd084530efeddd36543db3710b4d5ab525dc8e572bb1e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T15:44:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wrq6z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://00c5078cb42e7e369ed71d8867be75c4f1bf4
73eae40d151eacbeda76980196c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T15:44:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wrq6z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-26T15:44:18Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-s9f9w\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T15:44:28Z is after 2025-08-24T17:21:41Z" Feb 26 15:44:28 crc kubenswrapper[4907]: I0226 15:44:28.756647 4907 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-vsvsw_49ee65e1-8667-4ad7-a403-c899f0cc6a70/ovnkube-controller/0.log" Feb 26 15:44:28 crc kubenswrapper[4907]: I0226 15:44:28.762411 4907 generic.go:334] "Generic (PLEG): container finished" podID="49ee65e1-8667-4ad7-a403-c899f0cc6a70" 
containerID="1e116520caca612676e0ffd45f87c01164c8b3c7e1b65b934126f251206d7901" exitCode=1 Feb 26 15:44:28 crc kubenswrapper[4907]: I0226 15:44:28.762501 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-vsvsw" event={"ID":"49ee65e1-8667-4ad7-a403-c899f0cc6a70","Type":"ContainerDied","Data":"1e116520caca612676e0ffd45f87c01164c8b3c7e1b65b934126f251206d7901"} Feb 26 15:44:28 crc kubenswrapper[4907]: I0226 15:44:28.764078 4907 scope.go:117] "RemoveContainer" containerID="1e116520caca612676e0ffd45f87c01164c8b3c7e1b65b934126f251206d7901" Feb 26 15:44:28 crc kubenswrapper[4907]: I0226 15:44:28.777996 4907 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-958vt" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"4569fec7-a859-4a9e-b9d9-34ccc7c6be9c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T15:44:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T15:44:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T15:44:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T15:44:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1c9c60e926f3c2412b5a8698e82e161e6e34373a3e6b471698cb521b9e494871\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\
\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T15:44:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8nhj9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-26T15:44:18Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-958vt\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T15:44:28Z is after 2025-08-24T17:21:41Z" Feb 26 15:44:28 crc kubenswrapper[4907]: I0226 15:44:28.802559 4907 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-26T15:44:18Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-26T15:44:18Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T15:44:28Z is after 2025-08-24T17:21:41Z" Feb 26 15:44:28 crc kubenswrapper[4907]: I0226 15:44:28.819310 4907 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-s9f9w" err="failed to patch 
status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"432281c6-dcf8-4471-9801-9194000a9abd\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T15:44:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T15:44:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T15:44:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T15:44:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a751c325fc4b5b8668afd084530efeddd36543db3710b4d5ab525dc8e572bb1e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T15:44:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wrq6z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://00c5078cb42e7e369ed71d8867be75
c4f1bf473eae40d151eacbeda76980196c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T15:44:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wrq6z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-26T15:44:18Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-s9f9w\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T15:44:28Z is after 2025-08-24T17:21:41Z" Feb 26 15:44:28 crc kubenswrapper[4907]: I0226 15:44:28.838958 4907 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-26T15:44:19Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-26T15:44:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-26T15:44:19Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b385be8ca84800beda307aea098ce9f4e640cd4b6c7bd2856c75b1a4193cb655\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T15:44:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://acf341c3480df31c1b94ef2f3feb5a3e7eef3fa85ef3292ad0e5ef70a4575cb3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T15:44:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T15:44:28Z is after 2025-08-24T17:21:41Z" Feb 26 15:44:28 crc kubenswrapper[4907]: I0226 15:44:28.857410 4907 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-26T15:44:18Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-26T15:44:18Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T15:44:28Z is after 2025-08-24T17:21:41Z" Feb 26 15:44:28 crc kubenswrapper[4907]: I0226 15:44:28.874439 4907 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-b2qgz" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3ab23cfe-46ea-420e-ba6c-38ac0d2804b0\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T15:44:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T15:44:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T15:44:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T15:44:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fa50b3ce686f099f6b9ed4dcb642c118a6294d2e92cfdbf59339d106c9052d1a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T15:44:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2vfj6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://608b79bf33a420a12900e4bce6e593b17cfa7c3e9ebbcc9378833dce3a84e31d\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://608b79bf33a420a12900e4bce6e593b17cfa7c3e9ebbcc9378833dce3a84e31d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-26T15:44:19Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-26T15:44:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2vfj6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e89433e3d1fc270f03f4dba736b947b987980198cfe9e4f66865ab6222ce82f4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e89433e3d1fc270f03f4dba736b947b987980198cfe9e4f66865ab6222ce82f4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-26T15:44:20Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-26T15:44:19Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2vfj6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e31f3856c094e119772c90aaa64b7decc756b6da339efc3d406daeaa8b274176\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e31f3856c094e119772c90aaa64b7decc756b6da339efc3d406daeaa8b274176\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-26T15:44:22Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-26T15:44:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2vfj6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bfc88
f0a13f82a4a192745b9a3eac44fea007542c73923ca729d6fd6336c1851\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://bfc88f0a13f82a4a192745b9a3eac44fea007542c73923ca729d6fd6336c1851\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-26T15:44:22Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-26T15:44:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2vfj6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://73cba4d9193c3840f98e95371a1cda6f5264d73d631ef29664dfd1b0f9852b52\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://73cba4d9193c3840f98e95371a1cda6f5264d73d631ef29664dfd1b0f9852b52\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-26T15:44:25Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2026-02-26T15:44:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2vfj6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bf00572269494256a1a7b40277ce094962baaa145f2147dde7870e4c19b8f688\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://bf00572269494256a1a7b40277ce094962baaa145f2147dde7870e4c19b8f688\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-26T15:44:26Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-26T15:44:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2vfj6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-26T15:44:18Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-b2qgz\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T15:44:28Z is after 2025-08-24T17:21:41Z" Feb 26 15:44:28 crc kubenswrapper[4907]: I0226 15:44:28.889817 4907 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-v5ng6" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"917eebf3-db36-47b8-af0a-b80d042fddab\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T15:44:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T15:44:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T15:44:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T15:44:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7f195a8a6d014276c4202f3995d294fe5026b640273192a6f463642b79d4ddda\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"runn
ing\\\":{\\\"startedAt\\\":\\\"2026-02-26T15:44:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9lmpq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://178aa71969c1efffd1f234213afe3cf84ffc1f8300112efb368309603695c3ee\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T15:44:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9lmpq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-26T15:44:18Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-v5ng6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T15:44:28Z is after 2025-08-24T17:21:41Z" Feb 26 15:44:28 crc kubenswrapper[4907]: 
I0226 15:44:28.914044 4907 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"27c9ab80-fcc8-4c5a-9d89-c0504e0e6396\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T15:42:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T15:42:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T15:42:18Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T15:42:18Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T15:42:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bbc5e8c015ccc6b1a4740c955375e4f995f69ff1f1f698d8e2660ef451da6b8c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T15:42:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\
\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://64e8ac34f3cae799ba04d2bba51c22e4d99cf03261778fe3ba7a2320e661e727\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T15:42:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://42e24dea757f775f836c5c1fdb77c920db85f523bc0a35d2f2fb22e766274556\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T15:42:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a3c61b08bda7c918a3fa7b01e6f80515ee05a5746e189e829d2872c181b80c85\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc27
6e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a3c61b08bda7c918a3fa7b01e6f80515ee05a5746e189e829d2872c181b80c85\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-02-26T15:44:12Z\\\",\\\"message\\\":\\\"le observer\\\\nW0226 15:44:11.651017 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0226 15:44:11.651151 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0226 15:44:11.653054 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1720683088/tls.crt::/tmp/serving-cert-1720683088/tls.key\\\\\\\"\\\\nI0226 15:44:12.242500 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0226 15:44:12.245173 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0226 15:44:12.245192 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0226 15:44:12.245214 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0226 15:44:12.245219 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0226 15:44:12.248257 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0226 15:44:12.248276 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0226 15:44:12.248281 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0226 15:44:12.248286 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' 
detected.\\\\nW0226 15:44:12.248289 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0226 15:44:12.248292 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0226 15:44:12.248295 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0226 15:44:12.248403 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF0226 15:44:12.250972 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-26T15:44:11Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":4,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 1m20s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8cf7bf0e49be4282c641d1e48be50a327bb418475701bfde61f4249724709e11\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T15:42:21Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContai
nerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7ff4ef3cac1d6f77bf9c90ee9a0f1d8fca15084e93afdb4e4e0048cbfe904f19\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7ff4ef3cac1d6f77bf9c90ee9a0f1d8fca15084e93afdb4e4e0048cbfe904f19\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-26T15:42:19Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-26T15:42:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-26T15:42:18Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T15:44:28Z is after 2025-08-24T17:21:41Z" Feb 26 15:44:28 crc kubenswrapper[4907]: I0226 15:44:28.933539 4907 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-26T15:44:19Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-26T15:44:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-26T15:44:19Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1e574efe4067ea713788905c2bd40d7ff4ed75353c577df5ee8ca730d5037434\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T15:44:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-02-26T15:44:28Z is after 2025-08-24T17:21:41Z" Feb 26 15:44:28 crc kubenswrapper[4907]: I0226 15:44:28.947620 4907 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-9gtgp" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ae882fbf-ac76-4363-a10c-60eaf80ee7c7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T15:44:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T15:44:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T15:44:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T15:44:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://78c4268a57d845c79f2bf6b5e3742785efea137f2b0b3c37cb1b6fc54274e30f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T15:44:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\
",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xl77m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-26T15:44:18Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-9gtgp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T15:44:28Z is after 2025-08-24T17:21:41Z" Feb 26 15:44:28 crc kubenswrapper[4907]: I0226 15:44:28.968076 4907 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-2gl5t" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"51024bd5-00ff-4e2f-927c-8c989b59d7be\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T15:44:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T15:44:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T15:44:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T15:44:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9a3cdc02208e8eab1e0c3c3f08a0759873ebfd63c98e64af187800d59a5b44da\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T15:44:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2fx5n\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-26T15:44:18Z\\\"}}\" for pod \"openshift-multus\"/\"multus-2gl5t\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T15:44:28Z is after 2025-08-24T17:21:41Z" Feb 26 15:44:28 crc kubenswrapper[4907]: I0226 15:44:28.992699 4907 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-vsvsw" err="failed to patch 
status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"49ee65e1-8667-4ad7-a403-c899f0cc6a70\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T15:44:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T15:44:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T15:44:18Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T15:44:18Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c70ed6854442dfb329171dc5c454c036c020cb91e1f6595eb3fbe2d95704d52d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T15:44:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7hmb7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://67439cebe8e10e13db8af6bc74e152eb562382fb3b2f026ba3cbfe42e3b4c921\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T15:44:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7hmb7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://800657f54374550b21f96594e9c9ce4e7dff28c5c09061192a95bb8a668ebbea\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T15:44:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7hmb7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9e7470d80d872846d4d91e9070becfa3496dca8af1b315e637c34edce0dcd57b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T15:44:20Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7hmb7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://17760db3d112b908ad1389e3c28c244e756ef06ec2b4f170e4f52e17f9a75a89\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T15:44:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7hmb7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://eca4b7a72754f7457c608969c5319a498c526ab128b28400d2aed5d0413ff487\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T15:44:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7hmb7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1e116520caca612676e0ffd45f87c01164c8b3c7e1b65b934126f251206d7901\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1e116520caca612676e0ffd45f87c01164c8b3c7e1b65b934126f251206d7901\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-02-26T15:44:28Z\\\",\\\"message\\\":\\\"6 15:44:28.320924 6668 handler.go:190] Sending *v1.Pod event handler 3 for removal\\\\nI0226 15:44:28.320964 6668 handler.go:190] Sending *v1.EgressIP event handler 8 for removal\\\\nI0226 15:44:28.321005 6668 handler.go:190] 
Sending *v1.Node event handler 7 for removal\\\\nI0226 15:44:28.321034 6668 handler.go:190] Sending *v1.Node event handler 2 for removal\\\\nI0226 15:44:28.321068 6668 handler.go:208] Removed *v1.EgressIP event handler 8\\\\nI0226 15:44:28.321068 6668 handler.go:208] Removed *v1.Pod event handler 3\\\\nI0226 15:44:28.321082 6668 handler.go:208] Removed *v1.Node event handler 7\\\\nI0226 15:44:28.321084 6668 handler.go:208] Removed *v1.Pod event handler 6\\\\nI0226 15:44:28.321125 6668 handler.go:190] Sending *v1.Namespace event handler 1 for removal\\\\nI0226 15:44:28.321156 6668 handler.go:190] Sending *v1.Namespace event handler 5 for removal\\\\nI0226 15:44:28.321194 6668 handler.go:190] Sending *v1.EgressFirewall event handler 9 for removal\\\\nI0226 15:44:28.321236 6668 factory.go:656] Stopping watch factory\\\\nI0226 15:44:28.321271 6668 ovnkube.go:599] Stopped ovnkube\\\\nI0226 15:44:28.321319 6668 handler.go:208] Removed *v1.EgressFirewall event handler 9\\\\nI0226 15:44:28.321273 6668 handler.go:208] Removed *v1.Namespace event handler 1\\\\nI0226 15:44:28.321280 6668 handler.go:208] Removed *v1.Node event handler 2\\\\nI0226 
15\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-26T15:44:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"k
ube-api-access-7hmb7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://cc2b19d04bf2ef1455fa049ed09ef927305f1ec89b19b42f39b0d8c1397f69df\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T15:44:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7hmb7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b7621667d7c9c119893fe930093d4e1d2256a13aadc196023df28d1a78aef68c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b7621667d7c9c119893fe930093d4e1d2256a13aadc196023df28d1a78aef68c\
\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-26T15:44:18Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-26T15:44:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7hmb7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-26T15:44:18Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-vsvsw\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T15:44:28Z is after 2025-08-24T17:21:41Z" Feb 26 15:44:29 crc kubenswrapper[4907]: I0226 15:44:29.003530 4907 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-26T15:44:18Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-26T15:44:18Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T15:44:29Z is after 2025-08-24T17:21:41Z" Feb 26 15:44:29 crc kubenswrapper[4907]: I0226 15:44:29.017057 4907 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-26T15:44:21Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-26T15:44:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-26T15:44:21Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e9637349a18a137859d53c939993c64cd1275117aeab8d855be9498820d9ec46\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T15:44:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-02-26T15:44:29Z is after 2025-08-24T17:21:41Z" Feb 26 15:44:29 crc kubenswrapper[4907]: I0226 15:44:29.028602 4907 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-zsb5l" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"fd06f422-2c09-4da9-843c-75525df52517\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T15:44:19Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T15:44:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T15:44:19Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T15:44:19Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dbhj9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dbhj9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-26T15:44:19Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-zsb5l\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T15:44:29Z is after 2025-08-24T17:21:41Z" Feb 26 15:44:29 crc 
kubenswrapper[4907]: I0226 15:44:29.126272 4907 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-zsb5l" Feb 26 15:44:29 crc kubenswrapper[4907]: E0226 15:44:29.126410 4907 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-zsb5l" podUID="fd06f422-2c09-4da9-843c-75525df52517" Feb 26 15:44:29 crc kubenswrapper[4907]: I0226 15:44:29.774797 4907 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-vsvsw_49ee65e1-8667-4ad7-a403-c899f0cc6a70/ovnkube-controller/0.log" Feb 26 15:44:29 crc kubenswrapper[4907]: I0226 15:44:29.780608 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-vsvsw" event={"ID":"49ee65e1-8667-4ad7-a403-c899f0cc6a70","Type":"ContainerStarted","Data":"4d55419b62d561963ddf391be91eac8ed1ae59e1cd1364aa55293460cd574d3e"} Feb 26 15:44:29 crc kubenswrapper[4907]: I0226 15:44:29.781210 4907 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-vsvsw" Feb 26 15:44:29 crc kubenswrapper[4907]: I0226 15:44:29.792876 4907 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-958vt" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"4569fec7-a859-4a9e-b9d9-34ccc7c6be9c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T15:44:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T15:44:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T15:44:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T15:44:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1c9c60e926f3c2412b5a8698e82e161e6e34373a3e6b471698cb521b9e494871\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T15:44:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8nhj9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-26T15:44:18Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-958vt\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T15:44:29Z is after 2025-08-24T17:21:41Z" Feb 26 15:44:29 crc kubenswrapper[4907]: I0226 15:44:29.807473 4907 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-26T15:44:18Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-26T15:44:18Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T15:44:29Z is after 2025-08-24T17:21:41Z" Feb 26 15:44:29 crc kubenswrapper[4907]: I0226 15:44:29.824702 4907 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-s9f9w" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"432281c6-dcf8-4471-9801-9194000a9abd\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T15:44:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T15:44:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T15:44:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T15:44:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a751c325fc4b5b8668afd084530efeddd36543db3710b4d5ab525dc8e572bb1e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T15:44:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wrq6z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://00c5078cb42e7e369ed71d8867be75c4f1bf4
73eae40d151eacbeda76980196c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T15:44:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wrq6z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-26T15:44:18Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-s9f9w\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T15:44:29Z is after 2025-08-24T17:21:41Z" Feb 26 15:44:29 crc kubenswrapper[4907]: I0226 15:44:29.845252 4907 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-26T15:44:19Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-26T15:44:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-26T15:44:19Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b385be8ca84800beda307aea098ce9f4e640cd4b6c7bd2856c75b1a4193cb655\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T15:44:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://acf341c3480df31c1b94ef2f3feb5a3e7eef3fa85ef3292ad0e5ef70a4575cb3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T15:44:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T15:44:29Z is after 2025-08-24T17:21:41Z" Feb 26 15:44:29 crc kubenswrapper[4907]: I0226 15:44:29.858155 4907 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-26T15:44:18Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-26T15:44:18Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T15:44:29Z is after 2025-08-24T17:21:41Z" Feb 26 15:44:29 crc kubenswrapper[4907]: I0226 15:44:29.880071 4907 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-b2qgz" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3ab23cfe-46ea-420e-ba6c-38ac0d2804b0\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T15:44:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T15:44:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T15:44:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T15:44:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fa50b3ce686f099f6b9ed4dcb642c118a6294d2e92cfdbf59339d106c9052d1a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T15:44:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2vfj6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://608b79bf33a420a12900e4bce6e593b17cfa7c3e9ebbcc9378833dce3a84e31d\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://608b79bf33a420a12900e4bce6e593b17cfa7c3e9ebbcc9378833dce3a84e31d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-26T15:44:19Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-26T15:44:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2vfj6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e89433e3d1fc270f03f4dba736b947b987980198cfe9e4f66865ab6222ce82f4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e89433e3d1fc270f03f4dba736b947b987980198cfe9e4f66865ab6222ce82f4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-26T15:44:20Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-26T15:44:19Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2vfj6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e31f3856c094e119772c90aaa64b7decc756b6da339efc3d406daeaa8b274176\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e31f3856c094e119772c90aaa64b7decc756b6da339efc3d406daeaa8b274176\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-26T15:44:22Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-26T15:44:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2vfj6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bfc88
f0a13f82a4a192745b9a3eac44fea007542c73923ca729d6fd6336c1851\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://bfc88f0a13f82a4a192745b9a3eac44fea007542c73923ca729d6fd6336c1851\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-26T15:44:22Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-26T15:44:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2vfj6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://73cba4d9193c3840f98e95371a1cda6f5264d73d631ef29664dfd1b0f9852b52\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://73cba4d9193c3840f98e95371a1cda6f5264d73d631ef29664dfd1b0f9852b52\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-26T15:44:25Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2026-02-26T15:44:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2vfj6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bf00572269494256a1a7b40277ce094962baaa145f2147dde7870e4c19b8f688\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://bf00572269494256a1a7b40277ce094962baaa145f2147dde7870e4c19b8f688\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-26T15:44:26Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-26T15:44:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2vfj6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-26T15:44:18Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-b2qgz\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T15:44:29Z is after 2025-08-24T17:21:41Z" Feb 26 15:44:29 crc kubenswrapper[4907]: I0226 15:44:29.895246 4907 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-v5ng6" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"917eebf3-db36-47b8-af0a-b80d042fddab\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T15:44:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T15:44:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T15:44:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T15:44:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7f195a8a6d014276c4202f3995d294fe5026b640273192a6f463642b79d4ddda\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"runn
ing\\\":{\\\"startedAt\\\":\\\"2026-02-26T15:44:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9lmpq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://178aa71969c1efffd1f234213afe3cf84ffc1f8300112efb368309603695c3ee\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T15:44:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9lmpq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-26T15:44:18Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-v5ng6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T15:44:29Z is after 2025-08-24T17:21:41Z" Feb 26 15:44:29 crc kubenswrapper[4907]: 
I0226 15:44:29.916623 4907 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"27c9ab80-fcc8-4c5a-9d89-c0504e0e6396\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T15:42:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T15:42:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T15:42:18Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T15:42:18Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T15:42:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bbc5e8c015ccc6b1a4740c955375e4f995f69ff1f1f698d8e2660ef451da6b8c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T15:42:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\
\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://64e8ac34f3cae799ba04d2bba51c22e4d99cf03261778fe3ba7a2320e661e727\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T15:42:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://42e24dea757f775f836c5c1fdb77c920db85f523bc0a35d2f2fb22e766274556\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T15:42:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a3c61b08bda7c918a3fa7b01e6f80515ee05a5746e189e829d2872c181b80c85\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc27
6e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a3c61b08bda7c918a3fa7b01e6f80515ee05a5746e189e829d2872c181b80c85\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-02-26T15:44:12Z\\\",\\\"message\\\":\\\"le observer\\\\nW0226 15:44:11.651017 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0226 15:44:11.651151 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0226 15:44:11.653054 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1720683088/tls.crt::/tmp/serving-cert-1720683088/tls.key\\\\\\\"\\\\nI0226 15:44:12.242500 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0226 15:44:12.245173 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0226 15:44:12.245192 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0226 15:44:12.245214 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0226 15:44:12.245219 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0226 15:44:12.248257 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0226 15:44:12.248276 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0226 15:44:12.248281 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0226 15:44:12.248286 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' 
detected.\\\\nW0226 15:44:12.248289 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0226 15:44:12.248292 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0226 15:44:12.248295 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0226 15:44:12.248403 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF0226 15:44:12.250972 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-26T15:44:11Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":4,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 1m20s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8cf7bf0e49be4282c641d1e48be50a327bb418475701bfde61f4249724709e11\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T15:42:21Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContai
nerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7ff4ef3cac1d6f77bf9c90ee9a0f1d8fca15084e93afdb4e4e0048cbfe904f19\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7ff4ef3cac1d6f77bf9c90ee9a0f1d8fca15084e93afdb4e4e0048cbfe904f19\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-26T15:42:19Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-26T15:42:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-26T15:42:18Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T15:44:29Z is after 2025-08-24T17:21:41Z" Feb 26 15:44:29 crc kubenswrapper[4907]: I0226 15:44:29.931766 4907 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-26T15:44:19Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-26T15:44:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-26T15:44:19Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1e574efe4067ea713788905c2bd40d7ff4ed75353c577df5ee8ca730d5037434\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T15:44:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-02-26T15:44:29Z is after 2025-08-24T17:21:41Z" Feb 26 15:44:29 crc kubenswrapper[4907]: I0226 15:44:29.943290 4907 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-9gtgp" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ae882fbf-ac76-4363-a10c-60eaf80ee7c7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T15:44:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T15:44:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T15:44:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T15:44:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://78c4268a57d845c79f2bf6b5e3742785efea137f2b0b3c37cb1b6fc54274e30f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T15:44:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\
",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xl77m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-26T15:44:18Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-9gtgp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T15:44:29Z is after 2025-08-24T17:21:41Z" Feb 26 15:44:29 crc kubenswrapper[4907]: I0226 15:44:29.958430 4907 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-2gl5t" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"51024bd5-00ff-4e2f-927c-8c989b59d7be\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T15:44:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T15:44:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T15:44:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T15:44:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9a3cdc02208e8eab1e0c3c3f08a0759873ebfd63c98e64af187800d59a5b44da\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T15:44:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2fx5n\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-26T15:44:18Z\\\"}}\" for pod \"openshift-multus\"/\"multus-2gl5t\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T15:44:29Z is after 2025-08-24T17:21:41Z" Feb 26 15:44:29 crc kubenswrapper[4907]: I0226 15:44:29.994099 4907 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-vsvsw" err="failed to patch 
status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"49ee65e1-8667-4ad7-a403-c899f0cc6a70\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T15:44:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T15:44:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T15:44:18Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T15:44:18Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c70ed6854442dfb329171dc5c454c036c020cb91e1f6595eb3fbe2d95704d52d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T15:44:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7hmb7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://67439cebe8e10e13db8af6bc74e152eb562382fb3b2f026ba3cbfe42e3b4c921\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T15:44:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7hmb7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://800657f54374550b21f96594e9c9ce4e7dff28c5c09061192a95bb8a668ebbea\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T15:44:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7hmb7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9e7470d80d872846d4d91e9070becfa3496dca8af1b315e637c34edce0dcd57b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T15:44:20Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7hmb7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://17760db3d112b908ad1389e3c28c244e756ef06ec2b4f170e4f52e17f9a75a89\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T15:44:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7hmb7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://eca4b7a72754f7457c608969c5319a498c526ab128b28400d2aed5d0413ff487\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T15:44:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7hmb7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4d55419b62d561963ddf391be91eac8ed1ae59e1cd1364aa55293460cd574d3e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1e116520caca612676e0ffd45f87c01164c8b3c7e1b65b934126f251206d7901\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-02-26T15:44:28Z\\\",\\\"message\\\":\\\"6 15:44:28.320924 6668 handler.go:190] Sending *v1.Pod event handler 3 for removal\\\\nI0226 15:44:28.320964 6668 handler.go:190] Sending *v1.EgressIP event handler 8 for removal\\\\nI0226 15:44:28.321005 6668 handler.go:190] Sending *v1.Node event handler 7 for removal\\\\nI0226 15:44:28.321034 6668 handler.go:190] Sending *v1.Node event handler 
2 for removal\\\\nI0226 15:44:28.321068 6668 handler.go:208] Removed *v1.EgressIP event handler 8\\\\nI0226 15:44:28.321068 6668 handler.go:208] Removed *v1.Pod event handler 3\\\\nI0226 15:44:28.321082 6668 handler.go:208] Removed *v1.Node event handler 7\\\\nI0226 15:44:28.321084 6668 handler.go:208] Removed *v1.Pod event handler 6\\\\nI0226 15:44:28.321125 6668 handler.go:190] Sending *v1.Namespace event handler 1 for removal\\\\nI0226 15:44:28.321156 6668 handler.go:190] Sending *v1.Namespace event handler 5 for removal\\\\nI0226 15:44:28.321194 6668 handler.go:190] Sending *v1.EgressFirewall event handler 9 for removal\\\\nI0226 15:44:28.321236 6668 factory.go:656] Stopping watch factory\\\\nI0226 15:44:28.321271 6668 ovnkube.go:599] Stopped ovnkube\\\\nI0226 15:44:28.321319 6668 handler.go:208] Removed *v1.EgressFirewall event handler 9\\\\nI0226 15:44:28.321273 6668 handler.go:208] Removed *v1.Namespace event handler 1\\\\nI0226 15:44:28.321280 6668 handler.go:208] Removed *v1.Node event handler 2\\\\nI0226 
15\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-26T15:44:25Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T15:44:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\
":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7hmb7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://cc2b19d04bf2ef1455fa049ed09ef927305f1ec89b19b42f39b0d8c1397f69df\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T15:44:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7hmb7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b7621667d7c9c119893fe930093d4e1d2256a13aadc196023df28d1a78aef68c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\
\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b7621667d7c9c119893fe930093d4e1d2256a13aadc196023df28d1a78aef68c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-26T15:44:18Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-26T15:44:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7hmb7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-26T15:44:18Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-vsvsw\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T15:44:29Z is after 2025-08-24T17:21:41Z" Feb 26 15:44:30 crc kubenswrapper[4907]: I0226 15:44:30.011882 4907 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-26T15:44:18Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-26T15:44:18Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T15:44:30Z is after 2025-08-24T17:21:41Z" Feb 26 15:44:30 crc kubenswrapper[4907]: I0226 15:44:30.027550 4907 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-26T15:44:21Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-26T15:44:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-26T15:44:21Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e9637349a18a137859d53c939993c64cd1275117aeab8d855be9498820d9ec46\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T15:44:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-02-26T15:44:30Z is after 2025-08-24T17:21:41Z" Feb 26 15:44:30 crc kubenswrapper[4907]: I0226 15:44:30.039282 4907 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-zsb5l" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"fd06f422-2c09-4da9-843c-75525df52517\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T15:44:19Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T15:44:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T15:44:19Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T15:44:19Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dbhj9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dbhj9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-26T15:44:19Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-zsb5l\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T15:44:30Z is after 2025-08-24T17:21:41Z" Feb 26 15:44:30 crc 
kubenswrapper[4907]: I0226 15:44:30.126741 4907 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 26 15:44:30 crc kubenswrapper[4907]: I0226 15:44:30.126772 4907 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 26 15:44:30 crc kubenswrapper[4907]: I0226 15:44:30.126799 4907 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 26 15:44:30 crc kubenswrapper[4907]: E0226 15:44:30.126903 4907 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 26 15:44:30 crc kubenswrapper[4907]: E0226 15:44:30.127018 4907 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 26 15:44:30 crc kubenswrapper[4907]: E0226 15:44:30.127189 4907 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 26 15:44:30 crc kubenswrapper[4907]: I0226 15:44:30.785734 4907 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-vsvsw_49ee65e1-8667-4ad7-a403-c899f0cc6a70/ovnkube-controller/1.log" Feb 26 15:44:30 crc kubenswrapper[4907]: I0226 15:44:30.786572 4907 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-vsvsw_49ee65e1-8667-4ad7-a403-c899f0cc6a70/ovnkube-controller/0.log" Feb 26 15:44:30 crc kubenswrapper[4907]: I0226 15:44:30.790492 4907 generic.go:334] "Generic (PLEG): container finished" podID="49ee65e1-8667-4ad7-a403-c899f0cc6a70" containerID="4d55419b62d561963ddf391be91eac8ed1ae59e1cd1364aa55293460cd574d3e" exitCode=1 Feb 26 15:44:30 crc kubenswrapper[4907]: I0226 15:44:30.790544 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-vsvsw" event={"ID":"49ee65e1-8667-4ad7-a403-c899f0cc6a70","Type":"ContainerDied","Data":"4d55419b62d561963ddf391be91eac8ed1ae59e1cd1364aa55293460cd574d3e"} Feb 26 15:44:30 crc kubenswrapper[4907]: I0226 15:44:30.790627 4907 scope.go:117] "RemoveContainer" containerID="1e116520caca612676e0ffd45f87c01164c8b3c7e1b65b934126f251206d7901" Feb 26 15:44:30 crc kubenswrapper[4907]: I0226 15:44:30.792112 4907 scope.go:117] "RemoveContainer" containerID="4d55419b62d561963ddf391be91eac8ed1ae59e1cd1364aa55293460cd574d3e" Feb 26 15:44:30 crc kubenswrapper[4907]: E0226 15:44:30.792389 4907 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 10s restarting failed container=ovnkube-controller pod=ovnkube-node-vsvsw_openshift-ovn-kubernetes(49ee65e1-8667-4ad7-a403-c899f0cc6a70)\"" pod="openshift-ovn-kubernetes/ovnkube-node-vsvsw" podUID="49ee65e1-8667-4ad7-a403-c899f0cc6a70" Feb 26 15:44:30 crc 
kubenswrapper[4907]: I0226 15:44:30.815212 4907 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-26T15:44:19Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-26T15:44:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-26T15:44:19Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b385be8ca84800beda307aea098ce9f4e640cd4b6c7bd2856c75b1a4193cb655\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T15:44:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://acf341c3480df31c1b94ef2f3feb5a3e7eef3fa85ef3292ad0e5ef70a4575cb3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art
-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T15:44:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T15:44:30Z is after 2025-08-24T17:21:41Z" Feb 26 15:44:30 crc kubenswrapper[4907]: I0226 15:44:30.834513 4907 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-26T15:44:18Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-26T15:44:18Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T15:44:30Z is after 2025-08-24T17:21:41Z" Feb 26 15:44:30 crc kubenswrapper[4907]: I0226 15:44:30.851899 4907 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-b2qgz" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3ab23cfe-46ea-420e-ba6c-38ac0d2804b0\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T15:44:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T15:44:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T15:44:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T15:44:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fa50b3ce686f099f6b9ed4dcb642c118a6294d2e92cfdbf59339d106c9052d1a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T15:44:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2vfj6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://608b79bf33a420a12900e4bce6e593b17cfa7c3e9ebbcc9378833dce3a84e31d\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://608b79bf33a420a12900e4bce6e593b17cfa7c3e9ebbcc9378833dce3a84e31d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-26T15:44:19Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-26T15:44:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2vfj6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e89433e3d1fc270f03f4dba736b947b987980198cfe9e4f66865ab6222ce82f4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e89433e3d1fc270f03f4dba736b947b987980198cfe9e4f66865ab6222ce82f4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-26T15:44:20Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-26T15:44:19Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2vfj6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e31f3856c094e119772c90aaa64b7decc756b6da339efc3d406daeaa8b274176\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e31f3856c094e119772c90aaa64b7decc756b6da339efc3d406daeaa8b274176\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-26T15:44:22Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-26T15:44:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2vfj6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bfc88
f0a13f82a4a192745b9a3eac44fea007542c73923ca729d6fd6336c1851\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://bfc88f0a13f82a4a192745b9a3eac44fea007542c73923ca729d6fd6336c1851\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-26T15:44:22Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-26T15:44:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2vfj6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://73cba4d9193c3840f98e95371a1cda6f5264d73d631ef29664dfd1b0f9852b52\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://73cba4d9193c3840f98e95371a1cda6f5264d73d631ef29664dfd1b0f9852b52\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-26T15:44:25Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2026-02-26T15:44:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2vfj6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bf00572269494256a1a7b40277ce094962baaa145f2147dde7870e4c19b8f688\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://bf00572269494256a1a7b40277ce094962baaa145f2147dde7870e4c19b8f688\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-26T15:44:26Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-26T15:44:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2vfj6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-26T15:44:18Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-b2qgz\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T15:44:30Z is after 2025-08-24T17:21:41Z" Feb 26 15:44:30 crc kubenswrapper[4907]: I0226 15:44:30.865114 4907 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-v5ng6" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"917eebf3-db36-47b8-af0a-b80d042fddab\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T15:44:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T15:44:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T15:44:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T15:44:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7f195a8a6d014276c4202f3995d294fe5026b640273192a6f463642b79d4ddda\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"runn
ing\\\":{\\\"startedAt\\\":\\\"2026-02-26T15:44:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9lmpq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://178aa71969c1efffd1f234213afe3cf84ffc1f8300112efb368309603695c3ee\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T15:44:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9lmpq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-26T15:44:18Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-v5ng6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T15:44:30Z is after 2025-08-24T17:21:41Z" Feb 26 15:44:30 crc kubenswrapper[4907]: 
I0226 15:44:30.882715 4907 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-2gl5t" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"51024bd5-00ff-4e2f-927c-8c989b59d7be\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T15:44:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T15:44:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T15:44:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T15:44:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9a3cdc02208e8eab1e0c3c3f08a0759873ebfd63c98e64af187800d59a5b44da\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T15:44:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus
/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2fx5n\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-26T15:44:18Z\\\"}}\" for pod \"openshift-multus\"/\"multus-2gl5t\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T15:44:30Z is after 2025-08-24T17:21:41Z" Feb 26 15:44:30 crc kubenswrapper[4907]: I0226 
15:44:30.903489 4907 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-vsvsw" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"49ee65e1-8667-4ad7-a403-c899f0cc6a70\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T15:44:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T15:44:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T15:44:18Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T15:44:18Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c70ed6854442dfb329171dc5c454c036c020cb91e1f6595eb3fbe2d95704d52d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T15:44:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7hmb7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://67439cebe8e10e13db8af6bc74e152eb562382fb3b2f026ba3cbfe42e3b4c921\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T15:44:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7hmb7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://800657f54374550b21f96594e9c9ce4e7dff28c5c09061192a95bb8a668ebbea\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T15:44:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7hmb7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9e7470d80d872846d4d91e9070becfa3496dca8af1b315e637c34edce0dcd57b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T15:44:20Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7hmb7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://17760db3d112b908ad1389e3c28c244e756ef06ec2b4f170e4f52e17f9a75a89\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T15:44:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7hmb7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://eca4b7a72754f7457c608969c5319a498c526ab128b28400d2aed5d0413ff487\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T15:44:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7hmb7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4d55419b62d561963ddf391be91eac8ed1ae59e1cd1364aa55293460cd574d3e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1e116520caca612676e0ffd45f87c01164c8b3c7e1b65b934126f251206d7901\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-02-26T15:44:28Z\\\",\\\"message\\\":\\\"6 15:44:28.320924 6668 handler.go:190] Sending *v1.Pod event handler 3 for removal\\\\nI0226 15:44:28.320964 6668 handler.go:190] Sending *v1.EgressIP event handler 8 for removal\\\\nI0226 15:44:28.321005 6668 handler.go:190] Sending *v1.Node event handler 7 for removal\\\\nI0226 15:44:28.321034 6668 handler.go:190] Sending *v1.Node event handler 
2 for removal\\\\nI0226 15:44:28.321068 6668 handler.go:208] Removed *v1.EgressIP event handler 8\\\\nI0226 15:44:28.321068 6668 handler.go:208] Removed *v1.Pod event handler 3\\\\nI0226 15:44:28.321082 6668 handler.go:208] Removed *v1.Node event handler 7\\\\nI0226 15:44:28.321084 6668 handler.go:208] Removed *v1.Pod event handler 6\\\\nI0226 15:44:28.321125 6668 handler.go:190] Sending *v1.Namespace event handler 1 for removal\\\\nI0226 15:44:28.321156 6668 handler.go:190] Sending *v1.Namespace event handler 5 for removal\\\\nI0226 15:44:28.321194 6668 handler.go:190] Sending *v1.EgressFirewall event handler 9 for removal\\\\nI0226 15:44:28.321236 6668 factory.go:656] Stopping watch factory\\\\nI0226 15:44:28.321271 6668 ovnkube.go:599] Stopped ovnkube\\\\nI0226 15:44:28.321319 6668 handler.go:208] Removed *v1.EgressFirewall event handler 9\\\\nI0226 15:44:28.321273 6668 handler.go:208] Removed *v1.Namespace event handler 1\\\\nI0226 15:44:28.321280 6668 handler.go:208] Removed *v1.Node event handler 2\\\\nI0226 15\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-26T15:44:25Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4d55419b62d561963ddf391be91eac8ed1ae59e1cd1364aa55293460cd574d3e\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-02-26T15:44:30Z\\\",\\\"message\\\":\\\" k8s.ovn.org/owner:openshift-operator-lifecycle-manager/catalog-operator-metrics]} name:Service_openshift-operator-lifecycle-manager/catalog-operator-metrics_TCP_cluster options:{GoMap:map[event:false hairpin_snat_ip:169.254.0.5 fd69::5 neighbor_responder:none reject:true skip_snat:false]} protocol:{GoSet:[tcp]} selection_fields:{GoSet:[]} vips:{GoMap:map[10.217.5.204:8443:]}] Rows:[] Columns:[] Mutations:[] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {78f6184b-c7cf-436d-8cbb-4b31f8af75e8}] Until: Durable:\\\\u003cnil\\\\u003e 
Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:}]\\\\nI0226 15:44:29.594454 6827 reflector.go:311] Stopping reflector *v1.AdminPolicyBasedExternalRoute (0s) from github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/adminpolicybasedroute/v1/apis/informers/externalversions/factory.go:140\\\\nI0226 15:44:29.594754 6827 reflector.go:311] Stopping reflector *v1.NetworkAttachmentDefinition (0s) from github.com/k8snetworkplumbingwg/network-attachment-definition-client/pkg/client/informers/externalversions/factory.go:117\\\\nI0226 15:44:29.595204 6827 ovnkube.go:599] Stopped ovnkube\\\\nI0226 15:44:29.595272 6827 metrics.go:553] Stopping metrics server at address \\\\\\\"127.0.0.1:29103\\\\\\\"\\\\nF0226 15:44:29.595399 6827 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create admin \\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-26T15:44:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net
.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7hmb7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://cc2b19d04bf2ef1455fa049ed09ef927305f1ec89b19b42f39b0d8c1397f69df\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T15:44:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernet
es.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7hmb7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b7621667d7c9c119893fe930093d4e1d2256a13aadc196023df28d1a78aef68c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b7621667d7c9c119893fe930093d4e1d2256a13aadc196023df28d1a78aef68c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-26T15:44:18Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-26T15:44:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7hmb7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-26T15:44:18Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-vsvsw\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T15:44:30Z is after 2025-08-24T17:21:41Z" Feb 26 15:44:30 crc kubenswrapper[4907]: I0226 15:44:30.915893 4907 status_manager.go:875] "Failed to update status 
for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-26T15:44:18Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-26T15:44:18Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T15:44:30Z is after 2025-08-24T17:21:41Z" Feb 26 15:44:30 crc kubenswrapper[4907]: I0226 15:44:30.932582 4907 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-26T15:44:21Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-26T15:44:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-26T15:44:21Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e9637349a18a137859d53c939993c64cd1275117aeab8d855be9498820d9ec46\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T15:44:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-02-26T15:44:30Z is after 2025-08-24T17:21:41Z" Feb 26 15:44:30 crc kubenswrapper[4907]: I0226 15:44:30.948472 4907 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"27c9ab80-fcc8-4c5a-9d89-c0504e0e6396\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T15:42:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T15:42:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T15:42:18Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T15:42:18Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T15:42:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bbc5e8c015ccc6b1a4740c955375e4f995f69ff1f1f698d8e2660ef451da6b8c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T15:42:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://64e8ac34f3cae799ba04d2bba51c22e4d99cf03261778fe3ba7a2320e661e727\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T15:42:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://42e24dea757f775f836c5c1fdb77c920db85f523bc0a35d2f2fb22e766274556\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T15:42:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a3c61b08bda7c918a3fa7b01e6f80515ee05a5746e189e829d2872c181b80c85\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a3c61b08bda7c918a3fa7b01e6f80515ee05a5746e189e829d2872c181b80c85\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-02-26T15:44:12Z\\\",\\\"message\\\":\\\"le observer\\\\nW0226 15:44:11.651017 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0226 15:44:11.651151 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0226 15:44:11.653054 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-1720683088/tls.crt::/tmp/serving-cert-1720683088/tls.key\\\\\\\"\\\\nI0226 15:44:12.242500 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0226 15:44:12.245173 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0226 15:44:12.245192 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0226 15:44:12.245214 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0226 15:44:12.245219 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0226 15:44:12.248257 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0226 15:44:12.248276 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0226 15:44:12.248281 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0226 15:44:12.248286 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0226 15:44:12.248289 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0226 15:44:12.248292 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0226 15:44:12.248295 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0226 15:44:12.248403 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF0226 15:44:12.250972 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-26T15:44:11Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":4,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 1m20s restarting failed 
container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8cf7bf0e49be4282c641d1e48be50a327bb418475701bfde61f4249724709e11\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T15:42:21Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7ff4ef3cac1d6f77bf9c90ee9a0f1d8fca15084e93afdb4e4e0048cbfe904f19\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7ff4ef3cac1d6f77bf9c90ee9a0f1d8fca15084e93afdb4e4e0048cbfe904f19\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-26T15:42:19Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-26T15:42:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"n
ame\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-26T15:42:18Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T15:44:30Z is after 2025-08-24T17:21:41Z" Feb 26 15:44:30 crc kubenswrapper[4907]: I0226 15:44:30.966679 4907 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-26T15:44:19Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-26T15:44:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-26T15:44:19Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1e574efe4067ea713788905c2bd40d7ff4ed75353c577df5ee8ca730d5037434\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\
":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T15:44:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T15:44:30Z is after 2025-08-24T17:21:41Z" Feb 26 15:44:30 crc kubenswrapper[4907]: I0226 15:44:30.980401 4907 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-9gtgp" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ae882fbf-ac76-4363-a10c-60eaf80ee7c7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T15:44:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T15:44:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T15:44:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T15:44:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://78c4268a57d845c79f2bf6b5e3742785efea137f2b0b3c37cb1b6fc54274e30f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T15:44:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xl77m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-26T15:44:18Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-9gtgp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T15:44:30Z is after 2025-08-24T17:21:41Z" Feb 26 15:44:30 crc kubenswrapper[4907]: I0226 15:44:30.994656 4907 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-zsb5l" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"fd06f422-2c09-4da9-843c-75525df52517\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T15:44:19Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T15:44:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T15:44:19Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T15:44:19Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dbhj9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dbhj9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-26T15:44:19Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-zsb5l\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T15:44:30Z is after 2025-08-24T17:21:41Z" Feb 26 15:44:31 crc 
kubenswrapper[4907]: I0226 15:44:31.008231 4907 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-958vt" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"4569fec7-a859-4a9e-b9d9-34ccc7c6be9c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T15:44:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T15:44:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T15:44:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T15:44:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1c9c60e926f3c2412b5a8698e82e161e6e34373a3e6b471698cb521b9e494871\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T15:44:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8nhj9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\
"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-26T15:44:18Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-958vt\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T15:44:31Z is after 2025-08-24T17:21:41Z" Feb 26 15:44:31 crc kubenswrapper[4907]: I0226 15:44:31.021256 4907 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-s9f9w" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"432281c6-dcf8-4471-9801-9194000a9abd\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T15:44:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T15:44:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T15:44:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T15:44:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a751c325fc4b5b8668afd084530efeddd36543db3710b4d5ab525dc8e572bb1e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:24
2b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T15:44:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wrq6z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://00c5078cb42e7e369ed71d8867be75c4f1bf473eae40d151eacbeda76980196c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T15:44:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wrq6z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-26T15:44:18Z\
\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-s9f9w\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T15:44:31Z is after 2025-08-24T17:21:41Z" Feb 26 15:44:31 crc kubenswrapper[4907]: I0226 15:44:31.059389 4907 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-26T15:44:18Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-26T15:44:18Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T15:44:31Z is after 2025-08-24T17:21:41Z" Feb 26 15:44:31 crc kubenswrapper[4907]: I0226 15:44:31.126348 4907 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-zsb5l" Feb 26 15:44:31 crc kubenswrapper[4907]: E0226 15:44:31.126637 4907 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-zsb5l" podUID="fd06f422-2c09-4da9-843c-75525df52517" Feb 26 15:44:31 crc kubenswrapper[4907]: I0226 15:44:31.796218 4907 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-vsvsw_49ee65e1-8667-4ad7-a403-c899f0cc6a70/ovnkube-controller/1.log" Feb 26 15:44:31 crc kubenswrapper[4907]: I0226 15:44:31.807173 4907 scope.go:117] "RemoveContainer" containerID="4d55419b62d561963ddf391be91eac8ed1ae59e1cd1364aa55293460cd574d3e" Feb 26 15:44:31 crc kubenswrapper[4907]: E0226 15:44:31.807388 4907 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 10s restarting failed container=ovnkube-controller pod=ovnkube-node-vsvsw_openshift-ovn-kubernetes(49ee65e1-8667-4ad7-a403-c899f0cc6a70)\"" pod="openshift-ovn-kubernetes/ovnkube-node-vsvsw" podUID="49ee65e1-8667-4ad7-a403-c899f0cc6a70" Feb 26 15:44:31 crc kubenswrapper[4907]: I0226 15:44:31.822898 4907 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-26T15:44:18Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-26T15:44:18Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T15:44:31Z is after 2025-08-24T17:21:41Z" Feb 26 15:44:31 crc kubenswrapper[4907]: I0226 15:44:31.836743 4907 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-s9f9w" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"432281c6-dcf8-4471-9801-9194000a9abd\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T15:44:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T15:44:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T15:44:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T15:44:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a751c325fc4b5b8668afd084530efeddd36543db3710b4d5ab525dc8e572bb1e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T15:44:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wrq6z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://00c5078cb42e7e369ed71d8867be75c4f1bf4
73eae40d151eacbeda76980196c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T15:44:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wrq6z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-26T15:44:18Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-s9f9w\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T15:44:31Z is after 2025-08-24T17:21:41Z" Feb 26 15:44:31 crc kubenswrapper[4907]: I0226 15:44:31.853879 4907 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-26T15:44:19Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-26T15:44:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-26T15:44:19Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b385be8ca84800beda307aea098ce9f4e640cd4b6c7bd2856c75b1a4193cb655\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T15:44:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://acf341c3480df31c1b94ef2f3feb5a3e7eef3fa85ef3292ad0e5ef70a4575cb3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T15:44:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T15:44:31Z is after 2025-08-24T17:21:41Z" Feb 26 15:44:31 crc kubenswrapper[4907]: I0226 15:44:31.873148 4907 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-26T15:44:18Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-26T15:44:18Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T15:44:31Z is after 2025-08-24T17:21:41Z" Feb 26 15:44:31 crc kubenswrapper[4907]: I0226 15:44:31.896042 4907 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-b2qgz" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3ab23cfe-46ea-420e-ba6c-38ac0d2804b0\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T15:44:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T15:44:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T15:44:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T15:44:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fa50b3ce686f099f6b9ed4dcb642c118a6294d2e92cfdbf59339d106c9052d1a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T15:44:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2vfj6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://608b79bf33a420a12900e4bce6e593b17cfa7c3e9ebbcc9378833dce3a84e31d\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://608b79bf33a420a12900e4bce6e593b17cfa7c3e9ebbcc9378833dce3a84e31d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-26T15:44:19Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-26T15:44:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2vfj6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e89433e3d1fc270f03f4dba736b947b987980198cfe9e4f66865ab6222ce82f4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e89433e3d1fc270f03f4dba736b947b987980198cfe9e4f66865ab6222ce82f4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-26T15:44:20Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-26T15:44:19Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2vfj6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e31f3856c094e119772c90aaa64b7decc756b6da339efc3d406daeaa8b274176\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e31f3856c094e119772c90aaa64b7decc756b6da339efc3d406daeaa8b274176\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-26T15:44:22Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-26T15:44:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2vfj6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bfc88
f0a13f82a4a192745b9a3eac44fea007542c73923ca729d6fd6336c1851\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://bfc88f0a13f82a4a192745b9a3eac44fea007542c73923ca729d6fd6336c1851\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-26T15:44:22Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-26T15:44:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2vfj6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://73cba4d9193c3840f98e95371a1cda6f5264d73d631ef29664dfd1b0f9852b52\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://73cba4d9193c3840f98e95371a1cda6f5264d73d631ef29664dfd1b0f9852b52\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-26T15:44:25Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2026-02-26T15:44:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2vfj6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bf00572269494256a1a7b40277ce094962baaa145f2147dde7870e4c19b8f688\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://bf00572269494256a1a7b40277ce094962baaa145f2147dde7870e4c19b8f688\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-26T15:44:26Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-26T15:44:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2vfj6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-26T15:44:18Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-b2qgz\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T15:44:31Z is after 2025-08-24T17:21:41Z" Feb 26 15:44:31 crc kubenswrapper[4907]: I0226 15:44:31.909725 4907 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-v5ng6" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"917eebf3-db36-47b8-af0a-b80d042fddab\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T15:44:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T15:44:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T15:44:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T15:44:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7f195a8a6d014276c4202f3995d294fe5026b640273192a6f463642b79d4ddda\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"runn
ing\\\":{\\\"startedAt\\\":\\\"2026-02-26T15:44:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9lmpq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://178aa71969c1efffd1f234213afe3cf84ffc1f8300112efb368309603695c3ee\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T15:44:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9lmpq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-26T15:44:18Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-v5ng6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T15:44:31Z is after 2025-08-24T17:21:41Z" Feb 26 15:44:31 crc kubenswrapper[4907]: 
I0226 15:44:31.924786 4907 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"27c9ab80-fcc8-4c5a-9d89-c0504e0e6396\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T15:42:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T15:42:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T15:42:18Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T15:42:18Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T15:42:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bbc5e8c015ccc6b1a4740c955375e4f995f69ff1f1f698d8e2660ef451da6b8c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T15:42:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\
\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://64e8ac34f3cae799ba04d2bba51c22e4d99cf03261778fe3ba7a2320e661e727\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T15:42:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://42e24dea757f775f836c5c1fdb77c920db85f523bc0a35d2f2fb22e766274556\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T15:42:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a3c61b08bda7c918a3fa7b01e6f80515ee05a5746e189e829d2872c181b80c85\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc27
6e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a3c61b08bda7c918a3fa7b01e6f80515ee05a5746e189e829d2872c181b80c85\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-02-26T15:44:12Z\\\",\\\"message\\\":\\\"le observer\\\\nW0226 15:44:11.651017 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0226 15:44:11.651151 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0226 15:44:11.653054 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1720683088/tls.crt::/tmp/serving-cert-1720683088/tls.key\\\\\\\"\\\\nI0226 15:44:12.242500 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0226 15:44:12.245173 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0226 15:44:12.245192 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0226 15:44:12.245214 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0226 15:44:12.245219 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0226 15:44:12.248257 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0226 15:44:12.248276 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0226 15:44:12.248281 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0226 15:44:12.248286 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' 
detected.\\\\nW0226 15:44:12.248289 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0226 15:44:12.248292 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0226 15:44:12.248295 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0226 15:44:12.248403 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF0226 15:44:12.250972 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-26T15:44:11Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":4,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 1m20s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8cf7bf0e49be4282c641d1e48be50a327bb418475701bfde61f4249724709e11\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T15:42:21Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContai
nerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7ff4ef3cac1d6f77bf9c90ee9a0f1d8fca15084e93afdb4e4e0048cbfe904f19\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7ff4ef3cac1d6f77bf9c90ee9a0f1d8fca15084e93afdb4e4e0048cbfe904f19\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-26T15:42:19Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-26T15:42:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-26T15:42:18Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T15:44:31Z is after 2025-08-24T17:21:41Z" Feb 26 15:44:31 crc kubenswrapper[4907]: I0226 15:44:31.941646 4907 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-26T15:44:19Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-26T15:44:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-26T15:44:19Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1e574efe4067ea713788905c2bd40d7ff4ed75353c577df5ee8ca730d5037434\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T15:44:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-02-26T15:44:31Z is after 2025-08-24T17:21:41Z" Feb 26 15:44:31 crc kubenswrapper[4907]: I0226 15:44:31.955024 4907 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-9gtgp" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ae882fbf-ac76-4363-a10c-60eaf80ee7c7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T15:44:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T15:44:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T15:44:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T15:44:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://78c4268a57d845c79f2bf6b5e3742785efea137f2b0b3c37cb1b6fc54274e30f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T15:44:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\
",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xl77m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-26T15:44:18Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-9gtgp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T15:44:31Z is after 2025-08-24T17:21:41Z" Feb 26 15:44:31 crc kubenswrapper[4907]: I0226 15:44:31.969309 4907 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-2gl5t" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"51024bd5-00ff-4e2f-927c-8c989b59d7be\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T15:44:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T15:44:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T15:44:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T15:44:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9a3cdc02208e8eab1e0c3c3f08a0759873ebfd63c98e64af187800d59a5b44da\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T15:44:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2fx5n\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-26T15:44:18Z\\\"}}\" for pod \"openshift-multus\"/\"multus-2gl5t\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T15:44:31Z is after 2025-08-24T17:21:41Z" Feb 26 15:44:31 crc kubenswrapper[4907]: I0226 15:44:31.985562 4907 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-vsvsw" err="failed to patch 
status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"49ee65e1-8667-4ad7-a403-c899f0cc6a70\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T15:44:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T15:44:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T15:44:18Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T15:44:18Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c70ed6854442dfb329171dc5c454c036c020cb91e1f6595eb3fbe2d95704d52d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T15:44:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7hmb7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://67439cebe8e10e13db8af6bc74e152eb562382fb3b2f026ba3cbfe42e3b4c921\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T15:44:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7hmb7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://800657f54374550b21f96594e9c9ce4e7dff28c5c09061192a95bb8a668ebbea\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T15:44:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7hmb7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9e7470d80d872846d4d91e9070becfa3496dca8af1b315e637c34edce0dcd57b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T15:44:20Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7hmb7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://17760db3d112b908ad1389e3c28c244e756ef06ec2b4f170e4f52e17f9a75a89\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T15:44:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7hmb7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://eca4b7a72754f7457c608969c5319a498c526ab128b28400d2aed5d0413ff487\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T15:44:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7hmb7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4d55419b62d561963ddf391be91eac8ed1ae59e1cd1364aa55293460cd574d3e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4d55419b62d561963ddf391be91eac8ed1ae59e1cd1364aa55293460cd574d3e\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-02-26T15:44:30Z\\\",\\\"message\\\":\\\" k8s.ovn.org/owner:openshift-operator-lifecycle-manager/catalog-operator-metrics]} name:Service_openshift-operator-lifecycle-manager/catalog-operator-metrics_TCP_cluster options:{GoMap:map[event:false hairpin_snat_ip:169.254.0.5 fd69::5 neighbor_responder:none reject:true skip_snat:false]} protocol:{GoSet:[tcp]} selection_fields:{GoSet:[]} 
vips:{GoMap:map[10.217.5.204:8443:]}] Rows:[] Columns:[] Mutations:[] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {78f6184b-c7cf-436d-8cbb-4b31f8af75e8}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:}]\\\\nI0226 15:44:29.594454 6827 reflector.go:311] Stopping reflector *v1.AdminPolicyBasedExternalRoute (0s) from github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/adminpolicybasedroute/v1/apis/informers/externalversions/factory.go:140\\\\nI0226 15:44:29.594754 6827 reflector.go:311] Stopping reflector *v1.NetworkAttachmentDefinition (0s) from github.com/k8snetworkplumbingwg/network-attachment-definition-client/pkg/client/informers/externalversions/factory.go:117\\\\nI0226 15:44:29.595204 6827 ovnkube.go:599] Stopped ovnkube\\\\nI0226 15:44:29.595272 6827 metrics.go:553] Stopping metrics server at address \\\\\\\"127.0.0.1:29103\\\\\\\"\\\\nF0226 15:44:29.595399 6827 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create admin \\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-26T15:44:28Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed container=ovnkube-controller 
pod=ovnkube-node-vsvsw_openshift-ovn-kubernetes(49ee65e1-8667-4ad7-a403-c899f0cc6a70)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7hmb7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://cc2b19d04bf2ef1455fa049ed09ef927305f1ec89b19b42f39b0d8c1397f69df\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T15:44:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7hmb7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b7621667d7c9c119893fe930093d4e1d2256a13aadc196023df28d1a78aef68c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b7621667d7c9c11989
3fe930093d4e1d2256a13aadc196023df28d1a78aef68c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-26T15:44:18Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-26T15:44:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7hmb7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-26T15:44:18Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-vsvsw\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T15:44:31Z is after 2025-08-24T17:21:41Z" Feb 26 15:44:31 crc kubenswrapper[4907]: I0226 15:44:31.996420 4907 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-26T15:44:18Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-26T15:44:18Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T15:44:31Z is after 2025-08-24T17:21:41Z" Feb 26 15:44:32 crc kubenswrapper[4907]: I0226 15:44:32.006208 4907 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-26T15:44:21Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-26T15:44:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-26T15:44:21Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e9637349a18a137859d53c939993c64cd1275117aeab8d855be9498820d9ec46\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T15:44:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-02-26T15:44:32Z is after 2025-08-24T17:21:41Z" Feb 26 15:44:32 crc kubenswrapper[4907]: I0226 15:44:32.014818 4907 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-zsb5l" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"fd06f422-2c09-4da9-843c-75525df52517\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T15:44:19Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T15:44:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T15:44:19Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T15:44:19Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dbhj9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dbhj9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-26T15:44:19Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-zsb5l\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T15:44:32Z is after 2025-08-24T17:21:41Z" Feb 26 15:44:32 crc 
kubenswrapper[4907]: I0226 15:44:32.022898 4907 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-958vt" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"4569fec7-a859-4a9e-b9d9-34ccc7c6be9c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T15:44:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T15:44:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T15:44:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T15:44:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1c9c60e926f3c2412b5a8698e82e161e6e34373a3e6b471698cb521b9e494871\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T15:44:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8nhj9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\
"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-26T15:44:18Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-958vt\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T15:44:32Z is after 2025-08-24T17:21:41Z" Feb 26 15:44:32 crc kubenswrapper[4907]: I0226 15:44:32.125986 4907 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 26 15:44:32 crc kubenswrapper[4907]: I0226 15:44:32.126065 4907 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 26 15:44:32 crc kubenswrapper[4907]: E0226 15:44:32.126503 4907 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 26 15:44:32 crc kubenswrapper[4907]: I0226 15:44:32.126688 4907 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 26 15:44:32 crc kubenswrapper[4907]: I0226 15:44:32.126809 4907 scope.go:117] "RemoveContainer" containerID="a3c61b08bda7c918a3fa7b01e6f80515ee05a5746e189e829d2872c181b80c85" Feb 26 15:44:32 crc kubenswrapper[4907]: E0226 15:44:32.126886 4907 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 26 15:44:32 crc kubenswrapper[4907]: E0226 15:44:32.127005 4907 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-apiserver-check-endpoints\" with CrashLoopBackOff: \"back-off 1m20s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\"" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" Feb 26 15:44:32 crc kubenswrapper[4907]: E0226 15:44:32.127014 4907 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 26 15:44:33 crc kubenswrapper[4907]: I0226 15:44:33.125859 4907 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-zsb5l" Feb 26 15:44:33 crc kubenswrapper[4907]: E0226 15:44:33.126074 4907 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-zsb5l" podUID="fd06f422-2c09-4da9-843c-75525df52517" Feb 26 15:44:33 crc kubenswrapper[4907]: E0226 15:44:33.222787 4907 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" Feb 26 15:44:33 crc kubenswrapper[4907]: I0226 15:44:33.976651 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 26 15:44:33 crc kubenswrapper[4907]: E0226 15:44:33.976874 4907 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-26 15:44:49.976834453 +0000 UTC m=+152.495396332 (durationBeforeRetry 16s). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 26 15:44:33 crc kubenswrapper[4907]: I0226 15:44:33.977561 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 26 15:44:33 crc kubenswrapper[4907]: E0226 15:44:33.977842 4907 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Feb 26 15:44:33 crc kubenswrapper[4907]: I0226 15:44:33.977878 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 26 15:44:33 crc kubenswrapper[4907]: I0226 15:44:33.977963 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 26 
15:44:33 crc kubenswrapper[4907]: E0226 15:44:33.977899 4907 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Feb 26 15:44:33 crc kubenswrapper[4907]: E0226 15:44:33.978041 4907 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Feb 26 15:44:33 crc kubenswrapper[4907]: I0226 15:44:33.978056 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 26 15:44:33 crc kubenswrapper[4907]: E0226 15:44:33.978116 4907 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2026-02-26 15:44:49.978094768 +0000 UTC m=+152.496656657 (durationBeforeRetry 16s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Feb 26 15:44:33 crc kubenswrapper[4907]: E0226 15:44:33.978225 4907 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Feb 26 15:44:33 crc kubenswrapper[4907]: E0226 15:44:33.978257 4907 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Feb 26 15:44:33 crc kubenswrapper[4907]: E0226 15:44:33.978278 4907 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Feb 26 15:44:33 crc kubenswrapper[4907]: E0226 15:44:33.978344 4907 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2026-02-26 15:44:49.978322344 +0000 UTC m=+152.496884233 (durationBeforeRetry 16s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Feb 26 15:44:33 crc kubenswrapper[4907]: E0226 15:44:33.978446 4907 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Feb 26 15:44:33 crc kubenswrapper[4907]: E0226 15:44:33.978506 4907 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-02-26 15:44:49.978484838 +0000 UTC m=+152.497046727 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Feb 26 15:44:33 crc kubenswrapper[4907]: E0226 15:44:33.977994 4907 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Feb 26 15:44:33 crc kubenswrapper[4907]: E0226 15:44:33.978567 4907 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-02-26 15:44:49.97855242 +0000 UTC m=+152.497114339 (durationBeforeRetry 16s). 
Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Feb 26 15:44:34 crc kubenswrapper[4907]: I0226 15:44:34.125980 4907 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 26 15:44:34 crc kubenswrapper[4907]: I0226 15:44:34.126065 4907 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 26 15:44:34 crc kubenswrapper[4907]: I0226 15:44:34.126166 4907 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 26 15:44:34 crc kubenswrapper[4907]: E0226 15:44:34.127944 4907 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 26 15:44:34 crc kubenswrapper[4907]: E0226 15:44:34.128042 4907 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 26 15:44:34 crc kubenswrapper[4907]: E0226 15:44:34.127837 4907 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 26 15:44:35 crc kubenswrapper[4907]: I0226 15:44:35.103307 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/fd06f422-2c09-4da9-843c-75525df52517-metrics-certs\") pod \"network-metrics-daemon-zsb5l\" (UID: \"fd06f422-2c09-4da9-843c-75525df52517\") " pod="openshift-multus/network-metrics-daemon-zsb5l" Feb 26 15:44:35 crc kubenswrapper[4907]: E0226 15:44:35.103626 4907 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Feb 26 15:44:35 crc kubenswrapper[4907]: E0226 15:44:35.103703 4907 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/fd06f422-2c09-4da9-843c-75525df52517-metrics-certs podName:fd06f422-2c09-4da9-843c-75525df52517 nodeName:}" failed. No retries permitted until 2026-02-26 15:44:51.103678911 +0000 UTC m=+153.622240800 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/fd06f422-2c09-4da9-843c-75525df52517-metrics-certs") pod "network-metrics-daemon-zsb5l" (UID: "fd06f422-2c09-4da9-843c-75525df52517") : object "openshift-multus"/"metrics-daemon-secret" not registered Feb 26 15:44:35 crc kubenswrapper[4907]: I0226 15:44:35.125833 4907 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-zsb5l" Feb 26 15:44:35 crc kubenswrapper[4907]: E0226 15:44:35.126021 4907 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-zsb5l" podUID="fd06f422-2c09-4da9-843c-75525df52517" Feb 26 15:44:36 crc kubenswrapper[4907]: I0226 15:44:36.126185 4907 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 26 15:44:36 crc kubenswrapper[4907]: E0226 15:44:36.126801 4907 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 26 15:44:36 crc kubenswrapper[4907]: I0226 15:44:36.126258 4907 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 26 15:44:36 crc kubenswrapper[4907]: E0226 15:44:36.126983 4907 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 26 15:44:36 crc kubenswrapper[4907]: I0226 15:44:36.126258 4907 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 26 15:44:36 crc kubenswrapper[4907]: E0226 15:44:36.127184 4907 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 26 15:44:36 crc kubenswrapper[4907]: I0226 15:44:36.432360 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 26 15:44:36 crc kubenswrapper[4907]: I0226 15:44:36.432674 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 26 15:44:36 crc kubenswrapper[4907]: I0226 15:44:36.433711 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 26 15:44:36 crc kubenswrapper[4907]: I0226 15:44:36.433926 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 26 15:44:36 crc kubenswrapper[4907]: I0226 15:44:36.434038 4907 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-26T15:44:36Z","lastTransitionTime":"2026-02-26T15:44:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 26 15:44:36 crc kubenswrapper[4907]: E0226 15:44:36.448349 4907 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"7800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"24148068Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"8\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"24608868Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-02-26T15:44:36Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-26T15:44:36Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-26T15:44:36Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-26T15:44:36Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-26T15:44:36Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-26T15:44:36Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-26T15:44:36Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-26T15:44:36Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"16aec221-b9ec-4b79-ac12-986d05cb9b8b\\\",\\\"systemUUID\\\":\\\"7af7b453-01c3-4b8b-8c30-b1df8ce070ce\\\"},\\\"runtimeHan
dlers\\\":[{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":false},\\\"name\\\":\\\"runc\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"crun\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"\\\"}]}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T15:44:36Z is after 2025-08-24T17:21:41Z" Feb 26 15:44:36 crc kubenswrapper[4907]: I0226 15:44:36.451762 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 26 15:44:36 crc kubenswrapper[4907]: I0226 15:44:36.451889 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 26 15:44:36 crc kubenswrapper[4907]: I0226 15:44:36.451963 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 26 15:44:36 crc kubenswrapper[4907]: I0226 15:44:36.452029 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 26 15:44:36 crc kubenswrapper[4907]: I0226 15:44:36.452088 4907 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-26T15:44:36Z","lastTransitionTime":"2026-02-26T15:44:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 26 15:44:36 crc kubenswrapper[4907]: E0226 15:44:36.505156 4907 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"7800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"24148068Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"8\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"24608868Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-02-26T15:44:36Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-26T15:44:36Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-26T15:44:36Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-26T15:44:36Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-26T15:44:36Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-26T15:44:36Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-26T15:44:36Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-26T15:44:36Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"16aec221-b9ec-4b79-ac12-986d05cb9b8b\\\",\\\"systemUUID\\\":\\\"7af7b453-01c3-4b8b-8c30-b1df8ce070ce\\\"},\\\"runtimeHan
dlers\\\":[{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":false},\\\"name\\\":\\\"runc\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"crun\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"\\\"}]}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T15:44:36Z is after 2025-08-24T17:21:41Z" Feb 26 15:44:36 crc kubenswrapper[4907]: I0226 15:44:36.508676 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 26 15:44:36 crc kubenswrapper[4907]: I0226 15:44:36.508712 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 26 15:44:36 crc kubenswrapper[4907]: I0226 15:44:36.508720 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 26 15:44:36 crc kubenswrapper[4907]: I0226 15:44:36.508734 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 26 15:44:36 crc kubenswrapper[4907]: I0226 15:44:36.508744 4907 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-26T15:44:36Z","lastTransitionTime":"2026-02-26T15:44:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 26 15:44:36 crc kubenswrapper[4907]: E0226 15:44:36.524055 4907 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"7800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"24148068Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"8\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"24608868Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-02-26T15:44:36Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-26T15:44:36Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-26T15:44:36Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-26T15:44:36Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-26T15:44:36Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-26T15:44:36Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-26T15:44:36Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-26T15:44:36Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"16aec221-b9ec-4b79-ac12-986d05cb9b8b\\\",\\\"systemUUID\\\":\\\"7af7b453-01c3-4b8b-8c30-b1df8ce070ce\\\"},\\\"runtimeHan
dlers\\\":[{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":false},\\\"name\\\":\\\"runc\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"crun\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"\\\"}]}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T15:44:36Z is after 2025-08-24T17:21:41Z" Feb 26 15:44:36 crc kubenswrapper[4907]: E0226 15:44:36.524285 4907 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Feb 26 15:44:37 crc kubenswrapper[4907]: I0226 15:44:37.125707 4907 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-zsb5l" Feb 26 15:44:37 crc kubenswrapper[4907]: E0226 15:44:37.125890 4907 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-zsb5l" podUID="fd06f422-2c09-4da9-843c-75525df52517" Feb 26 15:44:38 crc kubenswrapper[4907]: I0226 15:44:38.126422 4907 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 26 15:44:38 crc kubenswrapper[4907]: I0226 15:44:38.126445 4907 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 26 15:44:38 crc kubenswrapper[4907]: E0226 15:44:38.126729 4907 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 26 15:44:38 crc kubenswrapper[4907]: I0226 15:44:38.126499 4907 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 26 15:44:38 crc kubenswrapper[4907]: E0226 15:44:38.126899 4907 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 26 15:44:38 crc kubenswrapper[4907]: E0226 15:44:38.127029 4907 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 26 15:44:38 crc kubenswrapper[4907]: I0226 15:44:38.145816 4907 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-9gtgp" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ae882fbf-ac76-4363-a10c-60eaf80ee7c7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T15:44:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T15:44:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T15:44:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T15:44:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://78c4268a57d845c79f2bf6b5e3742785efea137f2b0b3c37cb1b6fc54274e30f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T15:44:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"n
ame\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xl77m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-26T15:44:18Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-9gtgp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T15:44:38Z is after 2025-08-24T17:21:41Z" Feb 26 15:44:38 crc kubenswrapper[4907]: I0226 15:44:38.165444 4907 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-2gl5t" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"51024bd5-00ff-4e2f-927c-8c989b59d7be\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T15:44:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T15:44:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T15:44:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T15:44:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9a3cdc02208e8eab1e0c3c3f08a0759873ebfd63c98e64af187800d59a5b44da\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T15:44:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2fx5n\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-26T15:44:18Z\\\"}}\" for pod \"openshift-multus\"/\"multus-2gl5t\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T15:44:38Z is after 2025-08-24T17:21:41Z" Feb 26 15:44:38 crc kubenswrapper[4907]: I0226 15:44:38.189436 4907 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-vsvsw" err="failed to patch 
status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"49ee65e1-8667-4ad7-a403-c899f0cc6a70\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T15:44:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T15:44:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T15:44:18Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T15:44:18Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c70ed6854442dfb329171dc5c454c036c020cb91e1f6595eb3fbe2d95704d52d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T15:44:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7hmb7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://67439cebe8e10e13db8af6bc74e152eb562382fb3b2f026ba3cbfe42e3b4c921\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T15:44:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7hmb7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://800657f54374550b21f96594e9c9ce4e7dff28c5c09061192a95bb8a668ebbea\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T15:44:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7hmb7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9e7470d80d872846d4d91e9070becfa3496dca8af1b315e637c34edce0dcd57b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T15:44:20Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7hmb7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://17760db3d112b908ad1389e3c28c244e756ef06ec2b4f170e4f52e17f9a75a89\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T15:44:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7hmb7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://eca4b7a72754f7457c608969c5319a498c526ab128b28400d2aed5d0413ff487\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T15:44:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7hmb7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4d55419b62d561963ddf391be91eac8ed1ae59e1cd1364aa55293460cd574d3e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4d55419b62d561963ddf391be91eac8ed1ae59e1cd1364aa55293460cd574d3e\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-02-26T15:44:30Z\\\",\\\"message\\\":\\\" k8s.ovn.org/owner:openshift-operator-lifecycle-manager/catalog-operator-metrics]} name:Service_openshift-operator-lifecycle-manager/catalog-operator-metrics_TCP_cluster options:{GoMap:map[event:false hairpin_snat_ip:169.254.0.5 fd69::5 neighbor_responder:none reject:true skip_snat:false]} protocol:{GoSet:[tcp]} selection_fields:{GoSet:[]} 
vips:{GoMap:map[10.217.5.204:8443:]}] Rows:[] Columns:[] Mutations:[] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {78f6184b-c7cf-436d-8cbb-4b31f8af75e8}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:}]\\\\nI0226 15:44:29.594454 6827 reflector.go:311] Stopping reflector *v1.AdminPolicyBasedExternalRoute (0s) from github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/adminpolicybasedroute/v1/apis/informers/externalversions/factory.go:140\\\\nI0226 15:44:29.594754 6827 reflector.go:311] Stopping reflector *v1.NetworkAttachmentDefinition (0s) from github.com/k8snetworkplumbingwg/network-attachment-definition-client/pkg/client/informers/externalversions/factory.go:117\\\\nI0226 15:44:29.595204 6827 ovnkube.go:599] Stopped ovnkube\\\\nI0226 15:44:29.595272 6827 metrics.go:553] Stopping metrics server at address \\\\\\\"127.0.0.1:29103\\\\\\\"\\\\nF0226 15:44:29.595399 6827 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create admin \\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-26T15:44:28Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed container=ovnkube-controller 
pod=ovnkube-node-vsvsw_openshift-ovn-kubernetes(49ee65e1-8667-4ad7-a403-c899f0cc6a70)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7hmb7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://cc2b19d04bf2ef1455fa049ed09ef927305f1ec89b19b42f39b0d8c1397f69df\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T15:44:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7hmb7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b7621667d7c9c119893fe930093d4e1d2256a13aadc196023df28d1a78aef68c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b7621667d7c9c11989
3fe930093d4e1d2256a13aadc196023df28d1a78aef68c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-26T15:44:18Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-26T15:44:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7hmb7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-26T15:44:18Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-vsvsw\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T15:44:38Z is after 2025-08-24T17:21:41Z" Feb 26 15:44:38 crc kubenswrapper[4907]: I0226 15:44:38.207781 4907 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-26T15:44:18Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-26T15:44:18Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T15:44:38Z is after 2025-08-24T17:21:41Z" Feb 26 15:44:38 crc kubenswrapper[4907]: E0226 15:44:38.223412 4907 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
Feb 26 15:44:38 crc kubenswrapper[4907]: I0226 15:44:38.227887 4907 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-26T15:44:21Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-26T15:44:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-26T15:44:21Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e9637349a18a137859d53c939993c64cd1275117aeab8d855be9498820d9ec46\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T15:44:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error 
occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T15:44:38Z is after 2025-08-24T17:21:41Z" Feb 26 15:44:38 crc kubenswrapper[4907]: I0226 15:44:38.250978 4907 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"27c9ab80-fcc8-4c5a-9d89-c0504e0e6396\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T15:42:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T15:42:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T15:42:18Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T15:42:18Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T15:42:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bbc5e8c015ccc6b1a4740c955375e4f995f69ff1f1f698d8e2660ef451da6b8c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T15:42:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://64e8ac34f3cae799ba04d2bba51c22e4d99cf03261778fe3ba7a2320e661e727\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T15:42:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://42e24dea757f775f836c5c1fdb77c920db85f523bc0a35d2f2fb22e766274556\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T15:42:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a3c61b08bda7c918a3fa7b01e6f80515ee05a5746e189e829d2872c181b80c85\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a3c61b08bda7c918a3fa7b01e6f80515ee05a5746e189e829d2872c181b80c85\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-02-26T15:44:12Z\\\",\\\"message\\\":\\\"le observer\\\\nW0226 15:44:11.651017 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0226 15:44:11.651151 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0226 15:44:11.653054 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-1720683088/tls.crt::/tmp/serving-cert-1720683088/tls.key\\\\\\\"\\\\nI0226 15:44:12.242500 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0226 15:44:12.245173 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0226 15:44:12.245192 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0226 15:44:12.245214 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0226 15:44:12.245219 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0226 15:44:12.248257 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0226 15:44:12.248276 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0226 15:44:12.248281 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0226 15:44:12.248286 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0226 15:44:12.248289 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0226 15:44:12.248292 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0226 15:44:12.248295 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0226 15:44:12.248403 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF0226 15:44:12.250972 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-26T15:44:11Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":4,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 1m20s restarting failed 
container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8cf7bf0e49be4282c641d1e48be50a327bb418475701bfde61f4249724709e11\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T15:42:21Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7ff4ef3cac1d6f77bf9c90ee9a0f1d8fca15084e93afdb4e4e0048cbfe904f19\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7ff4ef3cac1d6f77bf9c90ee9a0f1d8fca15084e93afdb4e4e0048cbfe904f19\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-26T15:42:19Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-26T15:42:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"n
ame\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-26T15:42:18Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T15:44:38Z is after 2025-08-24T17:21:41Z" Feb 26 15:44:38 crc kubenswrapper[4907]: I0226 15:44:38.270254 4907 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-26T15:44:19Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-26T15:44:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-26T15:44:19Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1e574efe4067ea713788905c2bd40d7ff4ed75353c577df5ee8ca730d5037434\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\
":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T15:44:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T15:44:38Z is after 2025-08-24T17:21:41Z" Feb 26 15:44:38 crc kubenswrapper[4907]: I0226 15:44:38.286748 4907 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-zsb5l" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"fd06f422-2c09-4da9-843c-75525df52517\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T15:44:19Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T15:44:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T15:44:19Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T15:44:19Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dbhj9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dbhj9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-26T15:44:19Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-zsb5l\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T15:44:38Z is after 2025-08-24T17:21:41Z" Feb 26 15:44:38 crc kubenswrapper[4907]: I0226 15:44:38.298554 4907 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-958vt" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"4569fec7-a859-4a9e-b9d9-34ccc7c6be9c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T15:44:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T15:44:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T15:44:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T15:44:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1c9c60e926f3c2412b5a8698e82e161e6e34373a3e6b471698cb521b9e494871\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T
15:44:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8nhj9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-26T15:44:18Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-958vt\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T15:44:38Z is after 2025-08-24T17:21:41Z" Feb 26 15:44:38 crc kubenswrapper[4907]: I0226 15:44:38.316420 4907 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-26T15:44:18Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-26T15:44:18Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T15:44:38Z is after 2025-08-24T17:21:41Z" Feb 26 15:44:38 crc kubenswrapper[4907]: I0226 15:44:38.330318 4907 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-s9f9w" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"432281c6-dcf8-4471-9801-9194000a9abd\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T15:44:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T15:44:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T15:44:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T15:44:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a751c325fc4b5b8668afd084530efeddd36543db3710b4d5ab525dc8e572bb1e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T15:44:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wrq6z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://00c5078cb42e7e369ed71d8867be75c4f1bf4
73eae40d151eacbeda76980196c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T15:44:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wrq6z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-26T15:44:18Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-s9f9w\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T15:44:38Z is after 2025-08-24T17:21:41Z" Feb 26 15:44:38 crc kubenswrapper[4907]: I0226 15:44:38.347509 4907 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-v5ng6" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"917eebf3-db36-47b8-af0a-b80d042fddab\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T15:44:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T15:44:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T15:44:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T15:44:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7f195a8a6d014276c4202f3995d294fe5026b640273192a6f463642b79d4ddda\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T15:44:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9lmpq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://178aa71969c1efffd1f234213afe3cf84ffc1f83
00112efb368309603695c3ee\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T15:44:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9lmpq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-26T15:44:18Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-v5ng6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T15:44:38Z is after 2025-08-24T17:21:41Z" Feb 26 15:44:38 crc kubenswrapper[4907]: I0226 15:44:38.362662 4907 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-26T15:44:19Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-26T15:44:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-26T15:44:19Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b385be8ca84800beda307aea098ce9f4e640cd4b6c7bd2856c75b1a4193cb655\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T15:44:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://acf341c3480df31c1b94ef2f3feb5a3e7eef3fa85ef3292ad0e5ef70a4575cb3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T15:44:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T15:44:38Z is after 2025-08-24T17:21:41Z" Feb 26 15:44:38 crc kubenswrapper[4907]: I0226 15:44:38.380751 4907 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-26T15:44:18Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-26T15:44:18Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T15:44:38Z is after 2025-08-24T17:21:41Z" Feb 26 15:44:38 crc kubenswrapper[4907]: I0226 15:44:38.404298 4907 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-b2qgz" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3ab23cfe-46ea-420e-ba6c-38ac0d2804b0\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T15:44:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T15:44:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T15:44:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T15:44:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fa50b3ce686f099f6b9ed4dcb642c118a6294d2e92cfdbf59339d106c9052d1a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T15:44:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2vfj6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://608b79bf33a420a12900e4bce6e593b17cfa7c3e9ebbcc9378833dce3a84e31d\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://608b79bf33a420a12900e4bce6e593b17cfa7c3e9ebbcc9378833dce3a84e31d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-26T15:44:19Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-26T15:44:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2vfj6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e89433e3d1fc270f03f4dba736b947b987980198cfe9e4f66865ab6222ce82f4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e89433e3d1fc270f03f4dba736b947b987980198cfe9e4f66865ab6222ce82f4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-26T15:44:20Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-26T15:44:19Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2vfj6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e31f3856c094e119772c90aaa64b7decc756b6da339efc3d406daeaa8b274176\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e31f3856c094e119772c90aaa64b7decc756b6da339efc3d406daeaa8b274176\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-26T15:44:22Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-26T15:44:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2vfj6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bfc88
f0a13f82a4a192745b9a3eac44fea007542c73923ca729d6fd6336c1851\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://bfc88f0a13f82a4a192745b9a3eac44fea007542c73923ca729d6fd6336c1851\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-26T15:44:22Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-26T15:44:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2vfj6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://73cba4d9193c3840f98e95371a1cda6f5264d73d631ef29664dfd1b0f9852b52\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://73cba4d9193c3840f98e95371a1cda6f5264d73d631ef29664dfd1b0f9852b52\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-26T15:44:25Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2026-02-26T15:44:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2vfj6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bf00572269494256a1a7b40277ce094962baaa145f2147dde7870e4c19b8f688\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://bf00572269494256a1a7b40277ce094962baaa145f2147dde7870e4c19b8f688\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-26T15:44:26Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-26T15:44:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2vfj6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-26T15:44:18Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-b2qgz\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T15:44:38Z is after 2025-08-24T17:21:41Z" Feb 26 15:44:39 crc kubenswrapper[4907]: I0226 15:44:39.126173 4907 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-zsb5l" Feb 26 15:44:39 crc kubenswrapper[4907]: E0226 15:44:39.126340 4907 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-zsb5l" podUID="fd06f422-2c09-4da9-843c-75525df52517" Feb 26 15:44:40 crc kubenswrapper[4907]: I0226 15:44:40.126450 4907 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 26 15:44:40 crc kubenswrapper[4907]: I0226 15:44:40.126527 4907 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 26 15:44:40 crc kubenswrapper[4907]: I0226 15:44:40.126697 4907 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 26 15:44:40 crc kubenswrapper[4907]: E0226 15:44:40.126740 4907 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 26 15:44:40 crc kubenswrapper[4907]: E0226 15:44:40.126891 4907 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 26 15:44:40 crc kubenswrapper[4907]: E0226 15:44:40.127001 4907 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 26 15:44:41 crc kubenswrapper[4907]: I0226 15:44:41.126010 4907 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-zsb5l" Feb 26 15:44:41 crc kubenswrapper[4907]: E0226 15:44:41.126340 4907 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-zsb5l" podUID="fd06f422-2c09-4da9-843c-75525df52517" Feb 26 15:44:42 crc kubenswrapper[4907]: I0226 15:44:42.125819 4907 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 26 15:44:42 crc kubenswrapper[4907]: I0226 15:44:42.125879 4907 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 26 15:44:42 crc kubenswrapper[4907]: I0226 15:44:42.125839 4907 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 26 15:44:42 crc kubenswrapper[4907]: E0226 15:44:42.126130 4907 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 26 15:44:42 crc kubenswrapper[4907]: E0226 15:44:42.126243 4907 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 26 15:44:42 crc kubenswrapper[4907]: E0226 15:44:42.126391 4907 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 26 15:44:42 crc kubenswrapper[4907]: I0226 15:44:42.135466 4907 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-controller-manager/kube-controller-manager-crc"] Feb 26 15:44:43 crc kubenswrapper[4907]: I0226 15:44:43.125758 4907 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-zsb5l" Feb 26 15:44:43 crc kubenswrapper[4907]: E0226 15:44:43.125898 4907 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-zsb5l" podUID="fd06f422-2c09-4da9-843c-75525df52517" Feb 26 15:44:43 crc kubenswrapper[4907]: E0226 15:44:43.224827 4907 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" Feb 26 15:44:44 crc kubenswrapper[4907]: I0226 15:44:44.125949 4907 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 26 15:44:44 crc kubenswrapper[4907]: I0226 15:44:44.126153 4907 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 26 15:44:44 crc kubenswrapper[4907]: E0226 15:44:44.126167 4907 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 26 15:44:44 crc kubenswrapper[4907]: E0226 15:44:44.126273 4907 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 26 15:44:44 crc kubenswrapper[4907]: I0226 15:44:44.126715 4907 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 26 15:44:44 crc kubenswrapper[4907]: E0226 15:44:44.126841 4907 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 26 15:44:44 crc kubenswrapper[4907]: I0226 15:44:44.127358 4907 scope.go:117] "RemoveContainer" containerID="4d55419b62d561963ddf391be91eac8ed1ae59e1cd1364aa55293460cd574d3e" Feb 26 15:44:44 crc kubenswrapper[4907]: I0226 15:44:44.144027 4907 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-scheduler/openshift-kube-scheduler-crc"] Feb 26 15:44:44 crc kubenswrapper[4907]: I0226 15:44:44.850071 4907 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-vsvsw_49ee65e1-8667-4ad7-a403-c899f0cc6a70/ovnkube-controller/1.log" Feb 26 15:44:44 crc kubenswrapper[4907]: I0226 15:44:44.852546 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-vsvsw" event={"ID":"49ee65e1-8667-4ad7-a403-c899f0cc6a70","Type":"ContainerStarted","Data":"ed7db8f0288f2b3a14da208935b54a6702d7b68a6ec301250f2ebb9519354f9e"} Feb 26 15:44:44 crc kubenswrapper[4907]: I0226 15:44:44.866354 4907 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"27c9ab80-fcc8-4c5a-9d89-c0504e0e6396\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T15:42:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T15:42:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T15:42:18Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T15:42:18Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T15:42:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bbc5e8c015ccc6b1a4740c955375e4f995f69ff1f1f698d8e2660ef451da6b8c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T15:42:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://64e8ac34f3cae799ba04d2bba51c22e4d99cf03261778fe3ba7a2320e661e727\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"
restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T15:42:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://42e24dea757f775f836c5c1fdb77c920db85f523bc0a35d2f2fb22e766274556\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T15:42:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a3c61b08bda7c918a3fa7b01e6f80515ee05a5746e189e829d2872c181b80c85\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a3c61b08bda7c918a3fa7b01e6f80515ee05a5746e189e829d2872c181b80c85\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-02-26T15:44:12Z\\\",\\\"message\\\":\\\"le observer\\\\nW0226 15:44:11.651017 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0226 15:44:11.651151 1 builder.go:304] check-endpoints version 
4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0226 15:44:11.653054 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1720683088/tls.crt::/tmp/serving-cert-1720683088/tls.key\\\\\\\"\\\\nI0226 15:44:12.242500 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0226 15:44:12.245173 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0226 15:44:12.245192 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0226 15:44:12.245214 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0226 15:44:12.245219 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0226 15:44:12.248257 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0226 15:44:12.248276 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0226 15:44:12.248281 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0226 15:44:12.248286 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0226 15:44:12.248289 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0226 15:44:12.248292 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0226 15:44:12.248295 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0226 15:44:12.248403 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF0226 15:44:12.250972 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-26T15:44:11Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":4,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 1m20s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8cf7bf0e49be4282c641d1e48be50a327bb418475701bfde61f4249724709e11\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T15:42:21Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7ff4ef3cac1d6f77bf9c90ee9a0f1d8fca15084e93afdb4e4e0048cbfe904f19\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7
ff4ef3cac1d6f77bf9c90ee9a0f1d8fca15084e93afdb4e4e0048cbfe904f19\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-26T15:42:19Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-26T15:42:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-26T15:42:18Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T15:44:44Z is after 2025-08-24T17:21:41Z" Feb 26 15:44:44 crc kubenswrapper[4907]: I0226 15:44:44.878006 4907 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-26T15:44:19Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-26T15:44:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-26T15:44:19Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1e574efe4067ea713788905c2bd40d7ff4ed75353c577df5ee8ca730d5037434\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T15:44:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-02-26T15:44:44Z is after 2025-08-24T17:21:41Z" Feb 26 15:44:44 crc kubenswrapper[4907]: I0226 15:44:44.886737 4907 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-9gtgp" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ae882fbf-ac76-4363-a10c-60eaf80ee7c7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T15:44:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T15:44:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T15:44:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T15:44:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://78c4268a57d845c79f2bf6b5e3742785efea137f2b0b3c37cb1b6fc54274e30f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T15:44:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\
",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xl77m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-26T15:44:18Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-9gtgp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T15:44:44Z is after 2025-08-24T17:21:41Z" Feb 26 15:44:44 crc kubenswrapper[4907]: I0226 15:44:44.898118 4907 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-2gl5t" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"51024bd5-00ff-4e2f-927c-8c989b59d7be\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T15:44:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T15:44:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T15:44:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T15:44:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9a3cdc02208e8eab1e0c3c3f08a0759873ebfd63c98e64af187800d59a5b44da\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T15:44:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2fx5n\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-26T15:44:18Z\\\"}}\" for pod \"openshift-multus\"/\"multus-2gl5t\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T15:44:44Z is after 2025-08-24T17:21:41Z" Feb 26 15:44:44 crc kubenswrapper[4907]: I0226 15:44:44.917780 4907 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-vsvsw" err="failed to patch 
status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"49ee65e1-8667-4ad7-a403-c899f0cc6a70\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T15:44:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T15:44:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T15:44:18Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T15:44:18Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c70ed6854442dfb329171dc5c454c036c020cb91e1f6595eb3fbe2d95704d52d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T15:44:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7hmb7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://67439cebe8e10e13db8af6bc74e152eb562382fb3b2f026ba3cbfe42e3b4c921\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T15:44:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7hmb7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://800657f54374550b21f96594e9c9ce4e7dff28c5c09061192a95bb8a668ebbea\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T15:44:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7hmb7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9e7470d80d872846d4d91e9070becfa3496dca8af1b315e637c34edce0dcd57b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T15:44:20Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7hmb7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://17760db3d112b908ad1389e3c28c244e756ef06ec2b4f170e4f52e17f9a75a89\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T15:44:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7hmb7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://eca4b7a72754f7457c608969c5319a498c526ab128b28400d2aed5d0413ff487\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T15:44:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7hmb7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ed7db8f0288f2b3a14da208935b54a6702d7b68a6ec301250f2ebb9519354f9e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4d55419b62d561963ddf391be91eac8ed1ae59e1cd1364aa55293460cd574d3e\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-02-26T15:44:30Z\\\",\\\"message\\\":\\\" k8s.ovn.org/owner:openshift-operator-lifecycle-manager/catalog-operator-metrics]} name:Service_openshift-operator-lifecycle-manager/catalog-operator-metrics_TCP_cluster options:{GoMap:map[event:false hairpin_snat_ip:169.254.0.5 fd69::5 neighbor_responder:none reject:true skip_snat:false]} protocol:{GoSet:[tcp]} selection_fields:{GoSet:[]} 
vips:{GoMap:map[10.217.5.204:8443:]}] Rows:[] Columns:[] Mutations:[] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {78f6184b-c7cf-436d-8cbb-4b31f8af75e8}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:}]\\\\nI0226 15:44:29.594454 6827 reflector.go:311] Stopping reflector *v1.AdminPolicyBasedExternalRoute (0s) from github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/adminpolicybasedroute/v1/apis/informers/externalversions/factory.go:140\\\\nI0226 15:44:29.594754 6827 reflector.go:311] Stopping reflector *v1.NetworkAttachmentDefinition (0s) from github.com/k8snetworkplumbingwg/network-attachment-definition-client/pkg/client/informers/externalversions/factory.go:117\\\\nI0226 15:44:29.595204 6827 ovnkube.go:599] Stopped ovnkube\\\\nI0226 15:44:29.595272 6827 metrics.go:553] Stopping metrics server at address \\\\\\\"127.0.0.1:29103\\\\\\\"\\\\nF0226 15:44:29.595399 6827 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create admin 
\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-26T15:44:28Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T15:44:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":
\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7hmb7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://cc2b19d04bf2ef1455fa049ed09ef927305f1ec89b19b42f39b0d8c1397f69df\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T15:44:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7hmb7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b7621667d7c9c119893fe930093d4e1d2256a13aadc196023df28d1a78aef68c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\"
:true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b7621667d7c9c119893fe930093d4e1d2256a13aadc196023df28d1a78aef68c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-26T15:44:18Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-26T15:44:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7hmb7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-26T15:44:18Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-vsvsw\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T15:44:44Z is after 2025-08-24T17:21:41Z" Feb 26 15:44:44 crc kubenswrapper[4907]: I0226 15:44:44.929844 4907 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-26T15:44:18Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-26T15:44:18Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T15:44:44Z is after 2025-08-24T17:21:41Z" Feb 26 15:44:44 crc kubenswrapper[4907]: I0226 15:44:44.942102 4907 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-26T15:44:21Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-26T15:44:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-26T15:44:21Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e9637349a18a137859d53c939993c64cd1275117aeab8d855be9498820d9ec46\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T15:44:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-02-26T15:44:44Z is after 2025-08-24T17:21:41Z" Feb 26 15:44:44 crc kubenswrapper[4907]: I0226 15:44:44.958238 4907 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-zsb5l" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"fd06f422-2c09-4da9-843c-75525df52517\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T15:44:19Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T15:44:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T15:44:19Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T15:44:19Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dbhj9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dbhj9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-26T15:44:19Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-zsb5l\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T15:44:44Z is after 2025-08-24T17:21:41Z" Feb 26 15:44:44 crc 
kubenswrapper[4907]: I0226 15:44:44.970050 4907 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-958vt" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"4569fec7-a859-4a9e-b9d9-34ccc7c6be9c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T15:44:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T15:44:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T15:44:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T15:44:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1c9c60e926f3c2412b5a8698e82e161e6e34373a3e6b471698cb521b9e494871\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T15:44:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8nhj9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\
"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-26T15:44:18Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-958vt\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T15:44:44Z is after 2025-08-24T17:21:41Z" Feb 26 15:44:44 crc kubenswrapper[4907]: I0226 15:44:44.983694 4907 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2e5aef55-fc68-4c1c-92e1-41a202917e84\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T15:42:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T15:42:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T15:43:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T15:43:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T15:42:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5033366771e6954e4bdd280702ad5d080a1306e8fbfa2e99a0221a3865c13ed6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\
":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://62c4450c857a205706fb8639ca0bf473be68a81f8e70a989080e74e6fb9795c8\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-02-26T15:42:50Z\\\",\\\"message\\\":\\\"+ timeout 3m /bin/bash -exuo pipefail -c 'while [ -n \\\\\\\"$(ss -Htanop \\\\\\\\( sport = 10357 \\\\\\\\))\\\\\\\" ]; do sleep 1; done'\\\\n++ ss -Htanop '(' sport = 10357 ')'\\\\n+ '[' -n '' ']'\\\\n+ exec cluster-policy-controller start --config=/etc/kubernetes/static-pod-resources/configmaps/cluster-policy-controller-config/config.yaml --kubeconfig=/etc/kubernetes/static-pod-resources/configmaps/controller-manager-kubeconfig/kubeconfig --namespace=openshift-kube-controller-manager -v=2\\\\nI0226 15:42:20.262653 1 leaderelection.go:121] The leader election gives 4 retries and allows for 30s of clock skew. The kube-apiserver downtime tolerance is 78s. Worst non-graceful lease acquisition is 2m43s. 
Worst graceful lease acquisition is {26s}.\\\\nI0226 15:42:20.264750 1 observer_polling.go:159] Starting file observer\\\\nI0226 15:42:20.297295 1 builder.go:298] cluster-policy-controller version 4.18.0-202501230001.p0.g5fd8525.assembly.stream.el9-5fd8525-5fd852525909ce6eab52972ba9ce8fcf56528eb9\\\\nI0226 15:42:20.301511 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/etc/kubernetes/static-pod-resources/secrets/serving-cert/tls.crt::/etc/kubernetes/static-pod-resources/secrets/serving-cert/tls.key\\\\\\\"\\\\nF0226 15:42:50.781187 1 cmd.go:179] failed checking apiserver connectivity: Get \\\\\\\"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/openshift-kube-controller-manager/leases/cluster-policy-controller-lock\\\\\\\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T15:42:49Z is after 2026-02-23T05:33:13Z\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-26T15:42:19Z\\\"}},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T15:42:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b4592db3d17945a9ed96383e96902333033b03f395da93754ffbca7d15b1e633\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\
":{\\\"startedAt\\\":\\\"2026-02-26T15:42:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ac01de0d4759557a4502a3c742ecae613068311f796904e35769463f9a277620\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T15:42:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4e11dad962ef019f41cac623fb986f909a7c58377cd8d52e58ec300f7cc4cbb2\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T15:42:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-d
ir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-26T15:42:18Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T15:44:44Z is after 2025-08-24T17:21:41Z" Feb 26 15:44:44 crc kubenswrapper[4907]: I0226 15:44:44.996572 4907 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-26T15:44:18Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-26T15:44:18Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T15:44:44Z is after 2025-08-24T17:21:41Z" Feb 26 15:44:45 crc kubenswrapper[4907]: I0226 15:44:45.008705 4907 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-s9f9w" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"432281c6-dcf8-4471-9801-9194000a9abd\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T15:44:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T15:44:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T15:44:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T15:44:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a751c325fc4b5b8668afd084530efeddd36543db3710b4d5ab525dc8e572bb1e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T15:44:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wrq6z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://00c5078cb42e7e369ed71d8867be75c4f1bf4
73eae40d151eacbeda76980196c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T15:44:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wrq6z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-26T15:44:18Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-s9f9w\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T15:44:45Z is after 2025-08-24T17:21:41Z" Feb 26 15:44:45 crc kubenswrapper[4907]: I0226 15:44:45.021850 4907 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"87fcecd2-771a-4669-a303-2f74cf7ac919\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T15:42:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T15:42:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T15:43:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T15:43:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T15:42:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://aa0a0c55e7d739a2c76f82d2886d67e4aa4334b873445cb317782b057f7afa65\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T15:42:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ba401e1eedaa38b967c1b76dc8ee8221684e36e0f152a24131706adc0346bb2d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha
256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T15:42:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e44c81cef61f4aecc15b45d6bbb7f3552588a1f0256042998c5a2f158c3879c8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T15:42:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://81ae0a80fac56ae4b446c60d3478f3b6e4a448314ac78ad45840c7c09c232f0d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"conta
inerID\\\":\\\"cri-o://81ae0a80fac56ae4b446c60d3478f3b6e4a448314ac78ad45840c7c09c232f0d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-26T15:42:19Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-26T15:42:19Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-26T15:42:18Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T15:44:45Z is after 2025-08-24T17:21:41Z" Feb 26 15:44:45 crc kubenswrapper[4907]: I0226 15:44:45.040769 4907 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-26T15:44:18Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-26T15:44:18Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T15:44:45Z is after 2025-08-24T17:21:41Z" Feb 26 15:44:45 crc kubenswrapper[4907]: I0226 15:44:45.058262 4907 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-b2qgz" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3ab23cfe-46ea-420e-ba6c-38ac0d2804b0\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T15:44:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T15:44:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T15:44:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T15:44:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fa50b3ce686f099f6b9ed4dcb642c118a6294d2e92cfdbf59339d106c9052d1a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T15:44:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2vfj6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://608b79bf33a420a12900e4bce6e593b17cfa7c3e9ebbcc9378833dce3a84e31d\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://608b79bf33a420a12900e4bce6e593b17cfa7c3e9ebbcc9378833dce3a84e31d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-26T15:44:19Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-26T15:44:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2vfj6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e89433e3d1fc270f03f4dba736b947b987980198cfe9e4f66865ab6222ce82f4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e89433e3d1fc270f03f4dba736b947b987980198cfe9e4f66865ab6222ce82f4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-26T15:44:20Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-26T15:44:19Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2vfj6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e31f3856c094e119772c90aaa64b7decc756b6da339efc3d406daeaa8b274176\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e31f3856c094e119772c90aaa64b7decc756b6da339efc3d406daeaa8b274176\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-26T15:44:22Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-26T15:44:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2vfj6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bfc88
f0a13f82a4a192745b9a3eac44fea007542c73923ca729d6fd6336c1851\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://bfc88f0a13f82a4a192745b9a3eac44fea007542c73923ca729d6fd6336c1851\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-26T15:44:22Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-26T15:44:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2vfj6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://73cba4d9193c3840f98e95371a1cda6f5264d73d631ef29664dfd1b0f9852b52\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://73cba4d9193c3840f98e95371a1cda6f5264d73d631ef29664dfd1b0f9852b52\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-26T15:44:25Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2026-02-26T15:44:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2vfj6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bf00572269494256a1a7b40277ce094962baaa145f2147dde7870e4c19b8f688\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://bf00572269494256a1a7b40277ce094962baaa145f2147dde7870e4c19b8f688\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-26T15:44:26Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-26T15:44:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2vfj6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-26T15:44:18Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-b2qgz\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T15:44:45Z is after 2025-08-24T17:21:41Z" Feb 26 15:44:45 crc kubenswrapper[4907]: I0226 15:44:45.069532 4907 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-v5ng6" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"917eebf3-db36-47b8-af0a-b80d042fddab\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T15:44:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T15:44:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T15:44:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T15:44:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7f195a8a6d014276c4202f3995d294fe5026b640273192a6f463642b79d4ddda\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"runn
ing\\\":{\\\"startedAt\\\":\\\"2026-02-26T15:44:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9lmpq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://178aa71969c1efffd1f234213afe3cf84ffc1f8300112efb368309603695c3ee\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T15:44:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9lmpq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-26T15:44:18Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-v5ng6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T15:44:45Z is after 2025-08-24T17:21:41Z" Feb 26 15:44:45 crc kubenswrapper[4907]: 
I0226 15:44:45.082319 4907 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-26T15:44:19Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-26T15:44:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-26T15:44:19Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b385be8ca84800beda307aea098ce9f4e640cd4b6c7bd2856c75b1a4193cb655\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T15:44:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://acf341c3480df31c1b94ef2f3feb5a3e7eef3fa85ef3292ad0e5ef70a4575cb3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd
47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T15:44:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T15:44:45Z is after 2025-08-24T17:21:41Z" Feb 26 15:44:45 crc kubenswrapper[4907]: I0226 15:44:45.125960 4907 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-zsb5l" Feb 26 15:44:45 crc kubenswrapper[4907]: E0226 15:44:45.126083 4907 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-zsb5l" podUID="fd06f422-2c09-4da9-843c-75525df52517" Feb 26 15:44:45 crc kubenswrapper[4907]: I0226 15:44:45.866776 4907 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-vsvsw_49ee65e1-8667-4ad7-a403-c899f0cc6a70/ovnkube-controller/2.log" Feb 26 15:44:45 crc kubenswrapper[4907]: I0226 15:44:45.868166 4907 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-vsvsw_49ee65e1-8667-4ad7-a403-c899f0cc6a70/ovnkube-controller/1.log" Feb 26 15:44:45 crc kubenswrapper[4907]: I0226 15:44:45.873392 4907 generic.go:334] "Generic (PLEG): container finished" podID="49ee65e1-8667-4ad7-a403-c899f0cc6a70" containerID="ed7db8f0288f2b3a14da208935b54a6702d7b68a6ec301250f2ebb9519354f9e" exitCode=1 Feb 26 15:44:45 crc kubenswrapper[4907]: I0226 15:44:45.873467 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-vsvsw" event={"ID":"49ee65e1-8667-4ad7-a403-c899f0cc6a70","Type":"ContainerDied","Data":"ed7db8f0288f2b3a14da208935b54a6702d7b68a6ec301250f2ebb9519354f9e"} Feb 26 15:44:45 crc kubenswrapper[4907]: I0226 15:44:45.873550 4907 scope.go:117] "RemoveContainer" containerID="4d55419b62d561963ddf391be91eac8ed1ae59e1cd1364aa55293460cd574d3e" Feb 26 15:44:45 crc kubenswrapper[4907]: I0226 15:44:45.875178 4907 scope.go:117] "RemoveContainer" containerID="ed7db8f0288f2b3a14da208935b54a6702d7b68a6ec301250f2ebb9519354f9e" Feb 26 15:44:45 crc kubenswrapper[4907]: E0226 15:44:45.875583 4907 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 20s restarting failed container=ovnkube-controller pod=ovnkube-node-vsvsw_openshift-ovn-kubernetes(49ee65e1-8667-4ad7-a403-c899f0cc6a70)\"" pod="openshift-ovn-kubernetes/ovnkube-node-vsvsw" podUID="49ee65e1-8667-4ad7-a403-c899f0cc6a70" Feb 26 15:44:45 crc kubenswrapper[4907]: I0226 
15:44:45.895280 4907 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-v5ng6" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"917eebf3-db36-47b8-af0a-b80d042fddab\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T15:44:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T15:44:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T15:44:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T15:44:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7f195a8a6d014276c4202f3995d294fe5026b640273192a6f463642b79d4ddda\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T15:44:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\
\":\\\"kube-api-access-9lmpq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://178aa71969c1efffd1f234213afe3cf84ffc1f8300112efb368309603695c3ee\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T15:44:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9lmpq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-26T15:44:18Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-v5ng6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T15:44:45Z is after 2025-08-24T17:21:41Z" Feb 26 15:44:45 crc kubenswrapper[4907]: I0226 15:44:45.922050 4907 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-26T15:44:19Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-26T15:44:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-26T15:44:19Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b385be8ca84800beda307aea098ce9f4e640cd4b6c7bd2856c75b1a4193cb655\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T15:44:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://acf341c3480df31c1b94ef2f3feb5a3e7eef3fa85ef3292ad0e5ef70a4575cb3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T15:44:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T15:44:45Z is after 2025-08-24T17:21:41Z" Feb 26 15:44:45 crc kubenswrapper[4907]: I0226 15:44:45.940836 4907 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-26T15:44:18Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-26T15:44:18Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T15:44:45Z is after 2025-08-24T17:21:41Z" Feb 26 15:44:45 crc kubenswrapper[4907]: I0226 15:44:45.958132 4907 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-b2qgz" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3ab23cfe-46ea-420e-ba6c-38ac0d2804b0\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T15:44:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T15:44:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T15:44:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T15:44:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fa50b3ce686f099f6b9ed4dcb642c118a6294d2e92cfdbf59339d106c9052d1a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T15:44:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2vfj6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://608b79bf33a420a12900e4bce6e593b17cfa7c3e9ebbcc9378833dce3a84e31d\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://608b79bf33a420a12900e4bce6e593b17cfa7c3e9ebbcc9378833dce3a84e31d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-26T15:44:19Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-26T15:44:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2vfj6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e89433e3d1fc270f03f4dba736b947b987980198cfe9e4f66865ab6222ce82f4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e89433e3d1fc270f03f4dba736b947b987980198cfe9e4f66865ab6222ce82f4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-26T15:44:20Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-26T15:44:19Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2vfj6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e31f3856c094e119772c90aaa64b7decc756b6da339efc3d406daeaa8b274176\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e31f3856c094e119772c90aaa64b7decc756b6da339efc3d406daeaa8b274176\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-26T15:44:22Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-26T15:44:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2vfj6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bfc88
f0a13f82a4a192745b9a3eac44fea007542c73923ca729d6fd6336c1851\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://bfc88f0a13f82a4a192745b9a3eac44fea007542c73923ca729d6fd6336c1851\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-26T15:44:22Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-26T15:44:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2vfj6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://73cba4d9193c3840f98e95371a1cda6f5264d73d631ef29664dfd1b0f9852b52\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://73cba4d9193c3840f98e95371a1cda6f5264d73d631ef29664dfd1b0f9852b52\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-26T15:44:25Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2026-02-26T15:44:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2vfj6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bf00572269494256a1a7b40277ce094962baaa145f2147dde7870e4c19b8f688\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://bf00572269494256a1a7b40277ce094962baaa145f2147dde7870e4c19b8f688\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-26T15:44:26Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-26T15:44:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2vfj6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-26T15:44:18Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-b2qgz\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T15:44:45Z is after 2025-08-24T17:21:41Z" Feb 26 15:44:45 crc kubenswrapper[4907]: I0226 15:44:45.974448 4907 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-9gtgp" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ae882fbf-ac76-4363-a10c-60eaf80ee7c7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T15:44:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T15:44:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T15:44:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T15:44:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://78c4268a57d845c79f2bf6b5e3742785efea137f2b0b3c37cb1b6fc54274e30f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"
2026-02-26T15:44:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xl77m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-26T15:44:18Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-9gtgp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T15:44:45Z is after 2025-08-24T17:21:41Z" Feb 26 15:44:45 crc kubenswrapper[4907]: I0226 15:44:45.993042 4907 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-2gl5t" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"51024bd5-00ff-4e2f-927c-8c989b59d7be\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T15:44:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T15:44:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T15:44:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T15:44:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9a3cdc02208e8eab1e0c3c3f08a0759873ebfd63c98e64af187800d59a5b44da\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T15:44:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2fx5n\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-26T15:44:18Z\\\"}}\" for pod \"openshift-multus\"/\"multus-2gl5t\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T15:44:45Z is after 2025-08-24T17:21:41Z" Feb 26 15:44:46 crc kubenswrapper[4907]: I0226 15:44:46.024923 4907 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-vsvsw" err="failed to patch 
status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"49ee65e1-8667-4ad7-a403-c899f0cc6a70\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T15:44:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T15:44:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T15:44:18Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T15:44:18Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c70ed6854442dfb329171dc5c454c036c020cb91e1f6595eb3fbe2d95704d52d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T15:44:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7hmb7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://67439cebe8e10e13db8af6bc74e152eb562382fb3b2f026ba3cbfe42e3b4c921\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T15:44:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7hmb7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://800657f54374550b21f96594e9c9ce4e7dff28c5c09061192a95bb8a668ebbea\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T15:44:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7hmb7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9e7470d80d872846d4d91e9070becfa3496dca8af1b315e637c34edce0dcd57b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T15:44:20Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7hmb7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://17760db3d112b908ad1389e3c28c244e756ef06ec2b4f170e4f52e17f9a75a89\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T15:44:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7hmb7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://eca4b7a72754f7457c608969c5319a498c526ab128b28400d2aed5d0413ff487\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T15:44:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7hmb7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ed7db8f0288f2b3a14da208935b54a6702d7b68a6ec301250f2ebb9519354f9e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4d55419b62d561963ddf391be91eac8ed1ae59e1cd1364aa55293460cd574d3e\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-02-26T15:44:30Z\\\",\\\"message\\\":\\\" k8s.ovn.org/owner:openshift-operator-lifecycle-manager/catalog-operator-metrics]} name:Service_openshift-operator-lifecycle-manager/catalog-operator-metrics_TCP_cluster options:{GoMap:map[event:false hairpin_snat_ip:169.254.0.5 fd69::5 neighbor_responder:none reject:true skip_snat:false]} protocol:{GoSet:[tcp]} selection_fields:{GoSet:[]} 
vips:{GoMap:map[10.217.5.204:8443:]}] Rows:[] Columns:[] Mutations:[] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {78f6184b-c7cf-436d-8cbb-4b31f8af75e8}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:}]\\\\nI0226 15:44:29.594454 6827 reflector.go:311] Stopping reflector *v1.AdminPolicyBasedExternalRoute (0s) from github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/adminpolicybasedroute/v1/apis/informers/externalversions/factory.go:140\\\\nI0226 15:44:29.594754 6827 reflector.go:311] Stopping reflector *v1.NetworkAttachmentDefinition (0s) from github.com/k8snetworkplumbingwg/network-attachment-definition-client/pkg/client/informers/externalversions/factory.go:117\\\\nI0226 15:44:29.595204 6827 ovnkube.go:599] Stopped ovnkube\\\\nI0226 15:44:29.595272 6827 metrics.go:553] Stopping metrics server at address \\\\\\\"127.0.0.1:29103\\\\\\\"\\\\nF0226 15:44:29.595399 6827 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create admin \\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-26T15:44:28Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ed7db8f0288f2b3a14da208935b54a6702d7b68a6ec301250f2ebb9519354f9e\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-02-26T15:44:45Z\\\",\\\"message\\\":\\\"org/kind:Service k8s.ovn.org/owner:openshift-authentication/oauth-openshift]} name:Service_openshift-authentication/oauth-openshift_TCP_cluster options:{GoMap:map[event:false hairpin_snat_ip:169.254.0.5 fd69::5 neighbor_responder:none reject:true skip_snat:false]} protocol:{GoSet:[tcp]} selection_fields:{GoSet:[]} vips:{GoMap:map[10.217.4.222:443:]}] Rows:[] Columns:[] Mutations:[] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == 
{c0c2f725-e461-454e-a88c-c8350d62e1ef}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:}]\\\\nF0226 15:44:45.034086 7002 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create admin network policy controller, err: could not add Event Handler for anpInformer during admin network policy controller initialization, handler {0x1fcc6e0 0x1fcc3c0 0x1fcc360} was not added to shared informer because it has stopped already, failed to start node network controller: failed to start default node network controller: failed to set node crc annotations: Internal error occurred: failed calling webhook \\\\\\\"node.network-node-identity.openshift.io\\\\\\\": failed to call webhook: Post \\\\\\\"https://127.0.0.1:9743/node?timeout=10s\\\\\\\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T15:44:45Z is after 
2025-0\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-26T15:44:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\
\\"kube-api-access-7hmb7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://cc2b19d04bf2ef1455fa049ed09ef927305f1ec89b19b42f39b0d8c1397f69df\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T15:44:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7hmb7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b7621667d7c9c119893fe930093d4e1d2256a13aadc196023df28d1a78aef68c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b7621667d7c9c119893fe930093d4e1d2256a13aadc196023df28d1a78aef
68c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-26T15:44:18Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-26T15:44:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7hmb7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-26T15:44:18Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-vsvsw\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T15:44:46Z is after 2025-08-24T17:21:41Z" Feb 26 15:44:46 crc kubenswrapper[4907]: I0226 15:44:46.043052 4907 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-26T15:44:18Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-26T15:44:18Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T15:44:46Z is after 2025-08-24T17:21:41Z" Feb 26 15:44:46 crc kubenswrapper[4907]: I0226 15:44:46.059044 4907 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-26T15:44:21Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-26T15:44:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-26T15:44:21Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e9637349a18a137859d53c939993c64cd1275117aeab8d855be9498820d9ec46\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T15:44:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-02-26T15:44:46Z is after 2025-08-24T17:21:41Z" Feb 26 15:44:46 crc kubenswrapper[4907]: I0226 15:44:46.076145 4907 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"27c9ab80-fcc8-4c5a-9d89-c0504e0e6396\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T15:42:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T15:42:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T15:42:18Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T15:42:18Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T15:42:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bbc5e8c015ccc6b1a4740c955375e4f995f69ff1f1f698d8e2660ef451da6b8c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T15:42:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://64e8ac34f3cae799ba04d2bba51c22e4d99cf03261778fe3ba7a2320e661e727\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T15:42:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://42e24dea757f775f836c5c1fdb77c920db85f523bc0a35d2f2fb22e766274556\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T15:42:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a3c61b08bda7c918a3fa7b01e6f80515ee05a5746e189e829d2872c181b80c85\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a3c61b08bda7c918a3fa7b01e6f80515ee05a5746e189e829d2872c181b80c85\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-02-26T15:44:12Z\\\",\\\"message\\\":\\\"le observer\\\\nW0226 15:44:11.651017 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0226 15:44:11.651151 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0226 15:44:11.653054 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-1720683088/tls.crt::/tmp/serving-cert-1720683088/tls.key\\\\\\\"\\\\nI0226 15:44:12.242500 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0226 15:44:12.245173 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0226 15:44:12.245192 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0226 15:44:12.245214 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0226 15:44:12.245219 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0226 15:44:12.248257 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0226 15:44:12.248276 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0226 15:44:12.248281 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0226 15:44:12.248286 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0226 15:44:12.248289 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0226 15:44:12.248292 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0226 15:44:12.248295 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0226 15:44:12.248403 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF0226 15:44:12.250972 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-26T15:44:11Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":4,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 1m20s restarting failed 
container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8cf7bf0e49be4282c641d1e48be50a327bb418475701bfde61f4249724709e11\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T15:42:21Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7ff4ef3cac1d6f77bf9c90ee9a0f1d8fca15084e93afdb4e4e0048cbfe904f19\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7ff4ef3cac1d6f77bf9c90ee9a0f1d8fca15084e93afdb4e4e0048cbfe904f19\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-26T15:42:19Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-26T15:42:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"n
ame\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-26T15:42:18Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T15:44:46Z is after 2025-08-24T17:21:41Z" Feb 26 15:44:46 crc kubenswrapper[4907]: I0226 15:44:46.091107 4907 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-26T15:44:19Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-26T15:44:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-26T15:44:19Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1e574efe4067ea713788905c2bd40d7ff4ed75353c577df5ee8ca730d5037434\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\
":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T15:44:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T15:44:46Z is after 2025-08-24T17:21:41Z" Feb 26 15:44:46 crc kubenswrapper[4907]: I0226 15:44:46.105980 4907 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-zsb5l" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"fd06f422-2c09-4da9-843c-75525df52517\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T15:44:19Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T15:44:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T15:44:19Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T15:44:19Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dbhj9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dbhj9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-26T15:44:19Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-zsb5l\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T15:44:46Z is after 2025-08-24T17:21:41Z" Feb 26 15:44:46 crc kubenswrapper[4907]: I0226 15:44:46.122328 4907 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-958vt" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"4569fec7-a859-4a9e-b9d9-34ccc7c6be9c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T15:44:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T15:44:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T15:44:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T15:44:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1c9c60e926f3c2412b5a8698e82e161e6e34373a3e6b471698cb521b9e494871\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T
15:44:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8nhj9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-26T15:44:18Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-958vt\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T15:44:46Z is after 2025-08-24T17:21:41Z" Feb 26 15:44:46 crc kubenswrapper[4907]: I0226 15:44:46.126156 4907 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 26 15:44:46 crc kubenswrapper[4907]: I0226 15:44:46.126167 4907 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 26 15:44:46 crc kubenswrapper[4907]: I0226 15:44:46.126267 4907 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 26 15:44:46 crc kubenswrapper[4907]: E0226 15:44:46.126413 4907 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 26 15:44:46 crc kubenswrapper[4907]: E0226 15:44:46.126660 4907 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 26 15:44:46 crc kubenswrapper[4907]: E0226 15:44:46.127010 4907 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 26 15:44:46 crc kubenswrapper[4907]: I0226 15:44:46.127409 4907 scope.go:117] "RemoveContainer" containerID="a3c61b08bda7c918a3fa7b01e6f80515ee05a5746e189e829d2872c181b80c85" Feb 26 15:44:46 crc kubenswrapper[4907]: E0226 15:44:46.127679 4907 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-apiserver-check-endpoints\" with CrashLoopBackOff: \"back-off 1m20s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\"" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" Feb 26 15:44:46 crc kubenswrapper[4907]: I0226 15:44:46.144338 4907 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-26T15:44:18Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-26T15:44:18Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T15:44:46Z is after 2025-08-24T17:21:41Z" Feb 26 15:44:46 crc kubenswrapper[4907]: I0226 15:44:46.166359 4907 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-s9f9w" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"432281c6-dcf8-4471-9801-9194000a9abd\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T15:44:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T15:44:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T15:44:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T15:44:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a751c325fc4b5b8668afd084530efeddd36543db3710b4d5ab525dc8e572bb1e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T15:44:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wrq6z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://00c5078cb42e7e369ed71d8867be75c4f1bf4
73eae40d151eacbeda76980196c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T15:44:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wrq6z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-26T15:44:18Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-s9f9w\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T15:44:46Z is after 2025-08-24T17:21:41Z" Feb 26 15:44:46 crc kubenswrapper[4907]: I0226 15:44:46.187162 4907 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"87fcecd2-771a-4669-a303-2f74cf7ac919\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T15:42:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T15:42:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T15:43:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T15:43:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T15:42:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://aa0a0c55e7d739a2c76f82d2886d67e4aa4334b873445cb317782b057f7afa65\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T15:42:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ba401e1eedaa38b967c1b76dc8ee8221684e36e0f152a24131706adc0346bb2d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha
256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T15:42:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e44c81cef61f4aecc15b45d6bbb7f3552588a1f0256042998c5a2f158c3879c8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T15:42:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://81ae0a80fac56ae4b446c60d3478f3b6e4a448314ac78ad45840c7c09c232f0d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"conta
inerID\\\":\\\"cri-o://81ae0a80fac56ae4b446c60d3478f3b6e4a448314ac78ad45840c7c09c232f0d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-26T15:42:19Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-26T15:42:19Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-26T15:42:18Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T15:44:46Z is after 2025-08-24T17:21:41Z" Feb 26 15:44:46 crc kubenswrapper[4907]: I0226 15:44:46.216108 4907 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"2e5aef55-fc68-4c1c-92e1-41a202917e84\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T15:42:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T15:42:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T15:43:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T15:43:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T15:42:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5033366771e6954e4bdd280702ad5d080a1306e8fbfa2e99a0221a3865c13ed6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://62c4450c857a205706fb8639ca0bf473be68a81f8e70a989080e74e6fb9795c8\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-02-26T15:42:50Z\\\",\\\"message\\\":\\\"+ timeout 3m /bin/bash -exuo pipefail -c 'while [ -n \\\\\\\"$(ss -Htanop \\\\\\\\( sport = 10357 \\\\\\\\))\\\\\\\" ]; do sleep 1; done'\\\\n++ ss -Htanop '(' sport = 10357 ')'\\\\n+ '[' -n '' ']'\\\\n+ exec cluster-policy-controller start --config=/etc/kubernetes/static-pod-resources/configmaps/cluster-policy-controller-config/config.yaml --kubeconfig=/etc/kubernetes/static-pod-resources/configmaps/controller-manager-kubeconfig/kubeconfig --namespace=openshift-kube-controller-manager 
-v=2\\\\nI0226 15:42:20.262653 1 leaderelection.go:121] The leader election gives 4 retries and allows for 30s of clock skew. The kube-apiserver downtime tolerance is 78s. Worst non-graceful lease acquisition is 2m43s. Worst graceful lease acquisition is {26s}.\\\\nI0226 15:42:20.264750 1 observer_polling.go:159] Starting file observer\\\\nI0226 15:42:20.297295 1 builder.go:298] cluster-policy-controller version 4.18.0-202501230001.p0.g5fd8525.assembly.stream.el9-5fd8525-5fd852525909ce6eab52972ba9ce8fcf56528eb9\\\\nI0226 15:42:20.301511 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/etc/kubernetes/static-pod-resources/secrets/serving-cert/tls.crt::/etc/kubernetes/static-pod-resources/secrets/serving-cert/tls.key\\\\\\\"\\\\nF0226 15:42:50.781187 1 cmd.go:179] failed checking apiserver connectivity: Get \\\\\\\"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/openshift-kube-controller-manager/leases/cluster-policy-controller-lock\\\\\\\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T15:42:49Z is after 
2026-02-23T05:33:13Z\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-26T15:42:19Z\\\"}},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T15:42:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b4592db3d17945a9ed96383e96902333033b03f395da93754ffbca7d15b1e633\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T15:42:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ac01de0d4759557a4502a3c742ecae613068311f796904e35769463f9a277620\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T15:42:19Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4e11dad962ef019f41cac623fb986f909a7c58377cd8d52e58ec300f7cc4cbb2\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T15:42:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-26T15:42:18Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T15:44:46Z is after 2025-08-24T17:21:41Z" Feb 26 15:44:46 crc kubenswrapper[4907]: I0226 15:44:46.646573 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 26 15:44:46 crc kubenswrapper[4907]: I0226 15:44:46.646978 4907 kubelet_node_status.go:724] 
"Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 26 15:44:46 crc kubenswrapper[4907]: I0226 15:44:46.647223 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 26 15:44:46 crc kubenswrapper[4907]: I0226 15:44:46.647419 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 26 15:44:46 crc kubenswrapper[4907]: I0226 15:44:46.647632 4907 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-26T15:44:46Z","lastTransitionTime":"2026-02-26T15:44:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 26 15:44:46 crc kubenswrapper[4907]: E0226 15:44:46.670291 4907 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"7800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"24148068Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"8\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"24608868Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-02-26T15:44:46Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-26T15:44:46Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-26T15:44:46Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-26T15:44:46Z\\\",\\\"message\\\":\\\"kubelet has no disk 
pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-26T15:44:46Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-26T15:44:46Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-26T15:44:46Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-26T15:44:46Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2
ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9810067
4616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.
io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a07
2c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa73
83b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"16aec221-b9ec-4b79-ac12-986d05cb9b8b\\\",\\\"systemUUID\\\":\\\"7af7b453-01c3-4b8b-8c30-b1df8ce070ce\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T15:44:46Z is after 2025-08-24T17:21:41Z" Feb 26 15:44:46 crc kubenswrapper[4907]: I0226 15:44:46.675662 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 26 15:44:46 crc kubenswrapper[4907]: I0226 15:44:46.675722 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 26 15:44:46 crc kubenswrapper[4907]: I0226 15:44:46.675746 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 26 15:44:46 crc kubenswrapper[4907]: I0226 15:44:46.675775 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 26 15:44:46 crc kubenswrapper[4907]: I0226 15:44:46.675798 4907 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-26T15:44:46Z","lastTransitionTime":"2026-02-26T15:44:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 26 15:44:46 crc kubenswrapper[4907]: E0226 15:44:46.699150 4907 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"7800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"24148068Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"8\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"24608868Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-02-26T15:44:46Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-26T15:44:46Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-26T15:44:46Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-26T15:44:46Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-26T15:44:46Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-26T15:44:46Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID 
available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-26T15:44:46Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-26T15:44:46Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"
registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\
"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb617
3ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"reg
istry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@s
ha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"16aec221-b9ec-4b79-ac12-986d05cb9b8b\\\",\\\"systemUUID\\\":\\\"7af7b453-01c3-4b8b-8c30-b1df8ce070ce\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T15:44:46Z is after 2025-08-24T17:21:41Z" Feb 26 15:44:46 crc kubenswrapper[4907]: I0226 15:44:46.704177 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 26 15:44:46 crc kubenswrapper[4907]: I0226 15:44:46.704237 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 26 15:44:46 crc kubenswrapper[4907]: I0226 15:44:46.704262 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 26 15:44:46 crc kubenswrapper[4907]: I0226 15:44:46.704291 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 26 15:44:46 crc kubenswrapper[4907]: I0226 15:44:46.704313 4907 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-26T15:44:46Z","lastTransitionTime":"2026-02-26T15:44:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 26 15:44:46 crc kubenswrapper[4907]: E0226 15:44:46.724658 4907 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"7800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"24148068Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"8\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"24608868Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-02-26T15:44:46Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-26T15:44:46Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-26T15:44:46Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-26T15:44:46Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-26T15:44:46Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-26T15:44:46Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-26T15:44:46Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-26T15:44:46Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"16aec221-b9ec-4b79-ac12-986d05cb9b8b\\\",\\\"systemUUID\\\":\\\"7af7b453-01c3-4b8b-8c30-b1df8ce070ce\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T15:44:46Z is after 2025-08-24T17:21:41Z" Feb 26 15:44:46 crc kubenswrapper[4907]: I0226 15:44:46.729425 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 26 15:44:46 crc kubenswrapper[4907]: I0226 15:44:46.729481 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 26 15:44:46 crc kubenswrapper[4907]: I0226 15:44:46.729504 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 26 15:44:46 crc kubenswrapper[4907]: I0226 15:44:46.729531 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 26 15:44:46 crc kubenswrapper[4907]: I0226 15:44:46.729552 4907 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-26T15:44:46Z","lastTransitionTime":"2026-02-26T15:44:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 26 15:44:46 crc kubenswrapper[4907]: E0226 15:44:46.751053 4907 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"7800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"24148068Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"8\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"24608868Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-02-26T15:44:46Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-26T15:44:46Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-26T15:44:46Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-26T15:44:46Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-26T15:44:46Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-26T15:44:46Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-26T15:44:46Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-26T15:44:46Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"16aec221-b9ec-4b79-ac12-986d05cb9b8b\\\",\\\"systemUUID\\\":\\\"7af7b453-01c3-4b8b-8c30-b1df8ce070ce\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T15:44:46Z is after 2025-08-24T17:21:41Z" Feb 26 15:44:46 crc kubenswrapper[4907]: I0226 15:44:46.755794 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 26 15:44:46 crc kubenswrapper[4907]: I0226 15:44:46.755831 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 26 15:44:46 crc kubenswrapper[4907]: I0226 15:44:46.755847 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 26 15:44:46 crc kubenswrapper[4907]: I0226 15:44:46.755868 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 26 15:44:46 crc kubenswrapper[4907]: I0226 15:44:46.755882 4907 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-26T15:44:46Z","lastTransitionTime":"2026-02-26T15:44:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 26 15:44:46 crc kubenswrapper[4907]: E0226 15:44:46.771407 4907 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"7800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"24148068Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"8\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"24608868Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-02-26T15:44:46Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-26T15:44:46Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-26T15:44:46Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-26T15:44:46Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-26T15:44:46Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-26T15:44:46Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-26T15:44:46Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-26T15:44:46Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"16aec221-b9ec-4b79-ac12-986d05cb9b8b\\\",\\\"systemUUID\\\":\\\"7af7b453-01c3-4b8b-8c30-b1df8ce070ce\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T15:44:46Z is after 2025-08-24T17:21:41Z" Feb 26 15:44:46 crc kubenswrapper[4907]: E0226 15:44:46.771682 4907 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Feb 26 15:44:46 crc kubenswrapper[4907]: I0226 15:44:46.879121 4907 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-vsvsw_49ee65e1-8667-4ad7-a403-c899f0cc6a70/ovnkube-controller/2.log" Feb 26 15:44:47 crc kubenswrapper[4907]: I0226 15:44:47.126675 4907 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-zsb5l" Feb 26 15:44:47 crc kubenswrapper[4907]: E0226 15:44:47.126886 4907 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-zsb5l" podUID="fd06f422-2c09-4da9-843c-75525df52517" Feb 26 15:44:48 crc kubenswrapper[4907]: I0226 15:44:48.126616 4907 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 26 15:44:48 crc kubenswrapper[4907]: E0226 15:44:48.126949 4907 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 26 15:44:48 crc kubenswrapper[4907]: I0226 15:44:48.126698 4907 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 26 15:44:48 crc kubenswrapper[4907]: E0226 15:44:48.127485 4907 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 26 15:44:48 crc kubenswrapper[4907]: I0226 15:44:48.126637 4907 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 26 15:44:48 crc kubenswrapper[4907]: E0226 15:44:48.127714 4907 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 26 15:44:48 crc kubenswrapper[4907]: I0226 15:44:48.148783 4907 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-26T15:44:21Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-26T15:44:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-26T15:44:21Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e9637349a18a137859d53c939993c64cd1275117aeab8d855be9498820d9ec46\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T15:44:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveRead
Only\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T15:44:48Z is after 2025-08-24T17:21:41Z" Feb 26 15:44:48 crc kubenswrapper[4907]: I0226 15:44:48.167215 4907 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"27c9ab80-fcc8-4c5a-9d89-c0504e0e6396\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T15:42:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T15:42:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T15:42:18Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T15:42:18Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T15:42:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bbc5e8c015ccc6b1a4740c955375e4f995f69ff1f1f698d8e2660ef451da6b8c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T15:42:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://64e8ac34f3cae799ba04d2bba51c22e4d99cf03261778fe3ba7a2320e661e727\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T15:42:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://42e24dea757f775f836c5c1fdb77c920db85f523bc0a35d2f2fb22e766274556\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T15:42:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a3c61b08bda7c918a3fa7b01e6f80515ee05a5746e189e829d2872c181b80c85\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a3c61b08bda7c918a3fa7b01e6f80515ee05a5746e189e829d2872c181b80c85\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-02-26T15:44:12Z\\\",\\\"message\\\":\\\"le observer\\\\nW0226 15:44:11.651017 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0226 15:44:11.651151 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0226 15:44:11.653054 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-1720683088/tls.crt::/tmp/serving-cert-1720683088/tls.key\\\\\\\"\\\\nI0226 15:44:12.242500 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0226 15:44:12.245173 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0226 15:44:12.245192 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0226 15:44:12.245214 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0226 15:44:12.245219 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0226 15:44:12.248257 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0226 15:44:12.248276 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0226 15:44:12.248281 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0226 15:44:12.248286 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0226 15:44:12.248289 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0226 15:44:12.248292 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0226 15:44:12.248295 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0226 15:44:12.248403 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF0226 15:44:12.250972 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-26T15:44:11Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":4,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 1m20s restarting failed 
container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8cf7bf0e49be4282c641d1e48be50a327bb418475701bfde61f4249724709e11\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T15:42:21Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7ff4ef3cac1d6f77bf9c90ee9a0f1d8fca15084e93afdb4e4e0048cbfe904f19\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7ff4ef3cac1d6f77bf9c90ee9a0f1d8fca15084e93afdb4e4e0048cbfe904f19\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-26T15:42:19Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-26T15:42:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"n
ame\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-26T15:42:18Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T15:44:48Z is after 2025-08-24T17:21:41Z" Feb 26 15:44:48 crc kubenswrapper[4907]: I0226 15:44:48.186791 4907 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-26T15:44:19Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-26T15:44:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-26T15:44:19Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1e574efe4067ea713788905c2bd40d7ff4ed75353c577df5ee8ca730d5037434\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\
":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T15:44:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T15:44:48Z is after 2025-08-24T17:21:41Z" Feb 26 15:44:48 crc kubenswrapper[4907]: I0226 15:44:48.201378 4907 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-9gtgp" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ae882fbf-ac76-4363-a10c-60eaf80ee7c7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T15:44:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T15:44:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T15:44:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T15:44:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://78c4268a57d845c79f2bf6b5e3742785efea137f2b0b3c37cb1b6fc54274e30f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T15:44:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xl77m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-26T15:44:18Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-9gtgp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T15:44:48Z is after 2025-08-24T17:21:41Z" Feb 26 15:44:48 crc kubenswrapper[4907]: I0226 15:44:48.219485 4907 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-2gl5t" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"51024bd5-00ff-4e2f-927c-8c989b59d7be\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T15:44:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T15:44:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T15:44:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T15:44:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9a3cdc02208e8eab1e0c3c3f08a0759873ebfd63c98e64af187800d59a5b44da\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@s
ha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T15:44:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2fx5n\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"ph
ase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-26T15:44:18Z\\\"}}\" for pod \"openshift-multus\"/\"multus-2gl5t\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T15:44:48Z is after 2025-08-24T17:21:41Z" Feb 26 15:44:48 crc kubenswrapper[4907]: E0226 15:44:48.226708 4907 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" Feb 26 15:44:48 crc kubenswrapper[4907]: I0226 15:44:48.243193 4907 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-vsvsw" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"49ee65e1-8667-4ad7-a403-c899f0cc6a70\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T15:44:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T15:44:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T15:44:18Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T15:44:18Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c70ed6854442dfb329171dc5c454c036c020cb91e1f6595eb3fbe2d95704d52d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T15:44:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7hmb7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://67439cebe8e10e13db8af6bc74e152eb562382fb3b2f026ba3cbfe42e3b4c921\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T15:44:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7hmb7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://800657f54374550b21f96594e9c9ce4e7dff28c5c09061192a95bb8a668ebbea\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T15:44:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7hmb7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9e7470d80d872846d4d91e9070becfa3496dca8af1b315e637c34edce0dcd57b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d20994829
19d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T15:44:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7hmb7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://17760db3d112b908ad1389e3c28c244e756ef06ec2b4f170e4f52e17f9a75a89\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T15:44:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7hmb7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://eca4b7a72754f7457c608969c5319a498c526ab128b28400d2aed5d0413ff487\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cd
d47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T15:44:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7hmb7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ed7db8f0288f2b3a14da208935b54a6702d7b68a6ec301250f2ebb9519354f9e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4d55419b62d561963ddf391be91eac8ed1ae59e1cd1364aa55293460cd574d3e\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-02-26T15:44:30Z\\\",\\\"message\\\":\\\" k8s.ovn.org/owner:openshift-operator-lifecycle-manager/catalog-operator-metrics]} 
name:Service_openshift-operator-lifecycle-manager/catalog-operator-metrics_TCP_cluster options:{GoMap:map[event:false hairpin_snat_ip:169.254.0.5 fd69::5 neighbor_responder:none reject:true skip_snat:false]} protocol:{GoSet:[tcp]} selection_fields:{GoSet:[]} vips:{GoMap:map[10.217.5.204:8443:]}] Rows:[] Columns:[] Mutations:[] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {78f6184b-c7cf-436d-8cbb-4b31f8af75e8}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:}]\\\\nI0226 15:44:29.594454 6827 reflector.go:311] Stopping reflector *v1.AdminPolicyBasedExternalRoute (0s) from github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/adminpolicybasedroute/v1/apis/informers/externalversions/factory.go:140\\\\nI0226 15:44:29.594754 6827 reflector.go:311] Stopping reflector *v1.NetworkAttachmentDefinition (0s) from github.com/k8snetworkplumbingwg/network-attachment-definition-client/pkg/client/informers/externalversions/factory.go:117\\\\nI0226 15:44:29.595204 6827 ovnkube.go:599] Stopped ovnkube\\\\nI0226 15:44:29.595272 6827 metrics.go:553] Stopping metrics server at address \\\\\\\"127.0.0.1:29103\\\\\\\"\\\\nF0226 15:44:29.595399 6827 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create admin \\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-26T15:44:28Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ed7db8f0288f2b3a14da208935b54a6702d7b68a6ec301250f2ebb9519354f9e\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-02-26T15:44:45Z\\\",\\\"message\\\":\\\"org/kind:Service k8s.ovn.org/owner:openshift-authentication/oauth-openshift]} name:Service_openshift-authentication/oauth-openshift_TCP_cluster options:{GoMap:map[event:false hairpin_snat_ip:169.254.0.5 fd69::5 
neighbor_responder:none reject:true skip_snat:false]} protocol:{GoSet:[tcp]} selection_fields:{GoSet:[]} vips:{GoMap:map[10.217.4.222:443:]}] Rows:[] Columns:[] Mutations:[] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {c0c2f725-e461-454e-a88c-c8350d62e1ef}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:}]\\\\nF0226 15:44:45.034086 7002 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create admin network policy controller, err: could not add Event Handler for anpInformer during admin network policy controller initialization, handler {0x1fcc6e0 0x1fcc3c0 0x1fcc360} was not added to shared informer because it has stopped already, failed to start node network controller: failed to start default node network controller: failed to set node crc annotations: Internal error occurred: failed calling webhook \\\\\\\"node.network-node-identity.openshift.io\\\\\\\": failed to call webhook: Post \\\\\\\"https://127.0.0.1:9743/node?timeout=10s\\\\\\\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T15:44:45Z is after 
2025-0\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-26T15:44:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\
\\"kube-api-access-7hmb7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://cc2b19d04bf2ef1455fa049ed09ef927305f1ec89b19b42f39b0d8c1397f69df\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T15:44:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7hmb7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b7621667d7c9c119893fe930093d4e1d2256a13aadc196023df28d1a78aef68c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b7621667d7c9c119893fe930093d4e1d2256a13aadc196023df28d1a78aef
68c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-26T15:44:18Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-26T15:44:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7hmb7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-26T15:44:18Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-vsvsw\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T15:44:48Z is after 2025-08-24T17:21:41Z" Feb 26 15:44:48 crc kubenswrapper[4907]: I0226 15:44:48.258624 4907 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-26T15:44:18Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-26T15:44:18Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T15:44:48Z is after 2025-08-24T17:21:41Z" Feb 26 15:44:48 crc kubenswrapper[4907]: I0226 15:44:48.273532 4907 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-zsb5l" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"fd06f422-2c09-4da9-843c-75525df52517\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T15:44:19Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T15:44:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T15:44:19Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T15:44:19Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dbhj9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dbhj9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-26T15:44:19Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-zsb5l\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T15:44:48Z is after 2025-08-24T17:21:41Z" Feb 26 15:44:48 crc 
kubenswrapper[4907]: I0226 15:44:48.287399 4907 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-958vt" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"4569fec7-a859-4a9e-b9d9-34ccc7c6be9c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T15:44:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T15:44:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T15:44:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T15:44:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1c9c60e926f3c2412b5a8698e82e161e6e34373a3e6b471698cb521b9e494871\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T15:44:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8nhj9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\
"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-26T15:44:18Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-958vt\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T15:44:48Z is after 2025-08-24T17:21:41Z" Feb 26 15:44:48 crc kubenswrapper[4907]: I0226 15:44:48.301500 4907 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"87fcecd2-771a-4669-a303-2f74cf7ac919\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T15:42:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T15:42:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T15:43:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T15:43:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T15:42:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://aa0a0c55e7d739a2c76f82d2886d67e4aa4334b873445cb317782b057f7afa65\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"qu
ay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T15:42:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ba401e1eedaa38b967c1b76dc8ee8221684e36e0f152a24131706adc0346bb2d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T15:42:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e44c81cef61f4aecc15b45d6bbb7f3552588a1f0256042998c5a2f158c3879c8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T15:42:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\
"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://81ae0a80fac56ae4b446c60d3478f3b6e4a448314ac78ad45840c7c09c232f0d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://81ae0a80fac56ae4b446c60d3478f3b6e4a448314ac78ad45840c7c09c232f0d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-26T15:42:19Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-26T15:42:19Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-26T15:42:18Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T15:44:48Z is after 2025-08-24T17:21:41Z" Feb 26 15:44:48 crc kubenswrapper[4907]: I0226 15:44:48.316350 4907 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"2e5aef55-fc68-4c1c-92e1-41a202917e84\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T15:42:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T15:42:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T15:43:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T15:43:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T15:42:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5033366771e6954e4bdd280702ad5d080a1306e8fbfa2e99a0221a3865c13ed6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://62c4450c857a205706fb8639ca0bf473be68a81f8e70a989080e74e6fb9795c8\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-02-26T15:42:50Z\\\",\\\"message\\\":\\\"+ timeout 3m /bin/bash -exuo pipefail -c 'while [ -n \\\\\\\"$(ss -Htanop \\\\\\\\( sport = 10357 \\\\\\\\))\\\\\\\" ]; do sleep 1; done'\\\\n++ ss -Htanop '(' sport = 10357 ')'\\\\n+ '[' -n '' ']'\\\\n+ exec cluster-policy-controller start --config=/etc/kubernetes/static-pod-resources/configmaps/cluster-policy-controller-config/config.yaml --kubeconfig=/etc/kubernetes/static-pod-resources/configmaps/controller-manager-kubeconfig/kubeconfig --namespace=openshift-kube-controller-manager 
-v=2\\\\nI0226 15:42:20.262653 1 leaderelection.go:121] The leader election gives 4 retries and allows for 30s of clock skew. The kube-apiserver downtime tolerance is 78s. Worst non-graceful lease acquisition is 2m43s. Worst graceful lease acquisition is {26s}.\\\\nI0226 15:42:20.264750 1 observer_polling.go:159] Starting file observer\\\\nI0226 15:42:20.297295 1 builder.go:298] cluster-policy-controller version 4.18.0-202501230001.p0.g5fd8525.assembly.stream.el9-5fd8525-5fd852525909ce6eab52972ba9ce8fcf56528eb9\\\\nI0226 15:42:20.301511 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/etc/kubernetes/static-pod-resources/secrets/serving-cert/tls.crt::/etc/kubernetes/static-pod-resources/secrets/serving-cert/tls.key\\\\\\\"\\\\nF0226 15:42:50.781187 1 cmd.go:179] failed checking apiserver connectivity: Get \\\\\\\"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/openshift-kube-controller-manager/leases/cluster-policy-controller-lock\\\\\\\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T15:42:49Z is after 
2026-02-23T05:33:13Z\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-26T15:42:19Z\\\"}},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T15:42:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b4592db3d17945a9ed96383e96902333033b03f395da93754ffbca7d15b1e633\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T15:42:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ac01de0d4759557a4502a3c742ecae613068311f796904e35769463f9a277620\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T15:42:19Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4e11dad962ef019f41cac623fb986f909a7c58377cd8d52e58ec300f7cc4cbb2\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T15:42:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-26T15:42:18Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T15:44:48Z is after 2025-08-24T17:21:41Z" Feb 26 15:44:48 crc kubenswrapper[4907]: I0226 15:44:48.330966 4907 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-26T15:44:18Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-26T15:44:18Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T15:44:48Z is after 2025-08-24T17:21:41Z" Feb 26 15:44:48 crc kubenswrapper[4907]: I0226 15:44:48.343635 4907 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-s9f9w" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"432281c6-dcf8-4471-9801-9194000a9abd\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T15:44:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T15:44:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T15:44:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T15:44:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a751c325fc4b5b8668afd084530efeddd36543db3710b4d5ab525dc8e572bb1e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T15:44:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wrq6z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://00c5078cb42e7e369ed71d8867be75c4f1bf4
73eae40d151eacbeda76980196c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T15:44:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wrq6z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-26T15:44:18Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-s9f9w\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T15:44:48Z is after 2025-08-24T17:21:41Z" Feb 26 15:44:48 crc kubenswrapper[4907]: I0226 15:44:48.359439 4907 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-26T15:44:19Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-26T15:44:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-26T15:44:19Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b385be8ca84800beda307aea098ce9f4e640cd4b6c7bd2856c75b1a4193cb655\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T15:44:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://acf341c3480df31c1b94ef2f3feb5a3e7eef3fa85ef3292ad0e5ef70a4575cb3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T15:44:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T15:44:48Z is after 2025-08-24T17:21:41Z" Feb 26 15:44:48 crc kubenswrapper[4907]: I0226 15:44:48.375493 4907 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-26T15:44:18Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-26T15:44:18Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T15:44:48Z is after 2025-08-24T17:21:41Z" Feb 26 15:44:48 crc kubenswrapper[4907]: I0226 15:44:48.391250 4907 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-b2qgz" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3ab23cfe-46ea-420e-ba6c-38ac0d2804b0\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T15:44:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T15:44:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T15:44:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T15:44:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fa50b3ce686f099f6b9ed4dcb642c118a6294d2e92cfdbf59339d106c9052d1a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T15:44:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2vfj6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://608b79bf33a420a12900e4bce6e593b17cfa7c3e9ebbcc9378833dce3a84e31d\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://608b79bf33a420a12900e4bce6e593b17cfa7c3e9ebbcc9378833dce3a84e31d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-26T15:44:19Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-26T15:44:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2vfj6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e89433e3d1fc270f03f4dba736b947b987980198cfe9e4f66865ab6222ce82f4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e89433e3d1fc270f03f4dba736b947b987980198cfe9e4f66865ab6222ce82f4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-26T15:44:20Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-26T15:44:19Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2vfj6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e31f3856c094e119772c90aaa64b7decc756b6da339efc3d406daeaa8b274176\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e31f3856c094e119772c90aaa64b7decc756b6da339efc3d406daeaa8b274176\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-26T15:44:22Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-26T15:44:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2vfj6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bfc88
f0a13f82a4a192745b9a3eac44fea007542c73923ca729d6fd6336c1851\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://bfc88f0a13f82a4a192745b9a3eac44fea007542c73923ca729d6fd6336c1851\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-26T15:44:22Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-26T15:44:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2vfj6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://73cba4d9193c3840f98e95371a1cda6f5264d73d631ef29664dfd1b0f9852b52\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://73cba4d9193c3840f98e95371a1cda6f5264d73d631ef29664dfd1b0f9852b52\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-26T15:44:25Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2026-02-26T15:44:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2vfj6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bf00572269494256a1a7b40277ce094962baaa145f2147dde7870e4c19b8f688\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://bf00572269494256a1a7b40277ce094962baaa145f2147dde7870e4c19b8f688\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-26T15:44:26Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-26T15:44:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2vfj6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-26T15:44:18Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-b2qgz\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T15:44:48Z is after 2025-08-24T17:21:41Z" Feb 26 15:44:48 crc kubenswrapper[4907]: I0226 15:44:48.409195 4907 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-v5ng6" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"917eebf3-db36-47b8-af0a-b80d042fddab\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T15:44:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T15:44:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T15:44:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T15:44:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7f195a8a6d014276c4202f3995d294fe5026b640273192a6f463642b79d4ddda\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"runn
ing\\\":{\\\"startedAt\\\":\\\"2026-02-26T15:44:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9lmpq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://178aa71969c1efffd1f234213afe3cf84ffc1f8300112efb368309603695c3ee\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T15:44:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9lmpq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-26T15:44:18Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-v5ng6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T15:44:48Z is after 2025-08-24T17:21:41Z" Feb 26 15:44:48 crc kubenswrapper[4907]: 
I0226 15:44:48.495711 4907 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-vsvsw" Feb 26 15:44:48 crc kubenswrapper[4907]: I0226 15:44:48.496726 4907 scope.go:117] "RemoveContainer" containerID="ed7db8f0288f2b3a14da208935b54a6702d7b68a6ec301250f2ebb9519354f9e" Feb 26 15:44:48 crc kubenswrapper[4907]: E0226 15:44:48.496912 4907 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 20s restarting failed container=ovnkube-controller pod=ovnkube-node-vsvsw_openshift-ovn-kubernetes(49ee65e1-8667-4ad7-a403-c899f0cc6a70)\"" pod="openshift-ovn-kubernetes/ovnkube-node-vsvsw" podUID="49ee65e1-8667-4ad7-a403-c899f0cc6a70" Feb 26 15:44:48 crc kubenswrapper[4907]: I0226 15:44:48.516649 4907 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-958vt" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"4569fec7-a859-4a9e-b9d9-34ccc7c6be9c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T15:44:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T15:44:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T15:44:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T15:44:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1c9c60e926f3c2412b5a8698e82e161e6e34373
a3e6b471698cb521b9e494871\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T15:44:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8nhj9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-26T15:44:18Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-958vt\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T15:44:48Z is after 2025-08-24T17:21:41Z" Feb 26 15:44:48 crc kubenswrapper[4907]: I0226 15:44:48.534258 4907 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"87fcecd2-771a-4669-a303-2f74cf7ac919\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T15:42:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T15:42:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T15:43:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T15:43:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T15:42:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://aa0a0c55e7d739a2c76f82d2886d67e4aa4334b873445cb317782b057f7afa65\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T15:42:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ba401e1eedaa38b967c1b76dc8ee8221684e36e0f152a24131706adc0346bb2d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha
256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T15:42:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e44c81cef61f4aecc15b45d6bbb7f3552588a1f0256042998c5a2f158c3879c8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T15:42:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://81ae0a80fac56ae4b446c60d3478f3b6e4a448314ac78ad45840c7c09c232f0d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"conta
inerID\\\":\\\"cri-o://81ae0a80fac56ae4b446c60d3478f3b6e4a448314ac78ad45840c7c09c232f0d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-26T15:42:19Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-26T15:42:19Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-26T15:42:18Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T15:44:48Z is after 2025-08-24T17:21:41Z" Feb 26 15:44:48 crc kubenswrapper[4907]: I0226 15:44:48.551718 4907 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"2e5aef55-fc68-4c1c-92e1-41a202917e84\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T15:42:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T15:42:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T15:43:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T15:43:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T15:42:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5033366771e6954e4bdd280702ad5d080a1306e8fbfa2e99a0221a3865c13ed6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://62c4450c857a205706fb8639ca0bf473be68a81f8e70a989080e74e6fb9795c8\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-02-26T15:42:50Z\\\",\\\"message\\\":\\\"+ timeout 3m /bin/bash -exuo pipefail -c 'while [ -n \\\\\\\"$(ss -Htanop \\\\\\\\( sport = 10357 \\\\\\\\))\\\\\\\" ]; do sleep 1; done'\\\\n++ ss -Htanop '(' sport = 10357 ')'\\\\n+ '[' -n '' ']'\\\\n+ exec cluster-policy-controller start --config=/etc/kubernetes/static-pod-resources/configmaps/cluster-policy-controller-config/config.yaml --kubeconfig=/etc/kubernetes/static-pod-resources/configmaps/controller-manager-kubeconfig/kubeconfig --namespace=openshift-kube-controller-manager 
-v=2\\\\nI0226 15:42:20.262653 1 leaderelection.go:121] The leader election gives 4 retries and allows for 30s of clock skew. The kube-apiserver downtime tolerance is 78s. Worst non-graceful lease acquisition is 2m43s. Worst graceful lease acquisition is {26s}.\\\\nI0226 15:42:20.264750 1 observer_polling.go:159] Starting file observer\\\\nI0226 15:42:20.297295 1 builder.go:298] cluster-policy-controller version 4.18.0-202501230001.p0.g5fd8525.assembly.stream.el9-5fd8525-5fd852525909ce6eab52972ba9ce8fcf56528eb9\\\\nI0226 15:42:20.301511 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/etc/kubernetes/static-pod-resources/secrets/serving-cert/tls.crt::/etc/kubernetes/static-pod-resources/secrets/serving-cert/tls.key\\\\\\\"\\\\nF0226 15:42:50.781187 1 cmd.go:179] failed checking apiserver connectivity: Get \\\\\\\"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/openshift-kube-controller-manager/leases/cluster-policy-controller-lock\\\\\\\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T15:42:49Z is after 
2026-02-23T05:33:13Z\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-26T15:42:19Z\\\"}},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T15:42:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b4592db3d17945a9ed96383e96902333033b03f395da93754ffbca7d15b1e633\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T15:42:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ac01de0d4759557a4502a3c742ecae613068311f796904e35769463f9a277620\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T15:42:19Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4e11dad962ef019f41cac623fb986f909a7c58377cd8d52e58ec300f7cc4cbb2\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T15:42:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-26T15:42:18Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T15:44:48Z is after 2025-08-24T17:21:41Z" Feb 26 15:44:48 crc kubenswrapper[4907]: I0226 15:44:48.567787 4907 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-26T15:44:18Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-26T15:44:18Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T15:44:48Z is after 2025-08-24T17:21:41Z" Feb 26 15:44:48 crc kubenswrapper[4907]: I0226 15:44:48.583183 4907 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-s9f9w" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"432281c6-dcf8-4471-9801-9194000a9abd\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T15:44:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T15:44:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T15:44:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T15:44:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a751c325fc4b5b8668afd084530efeddd36543db3710b4d5ab525dc8e572bb1e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T15:44:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wrq6z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://00c5078cb42e7e369ed71d8867be75c4f1bf4
73eae40d151eacbeda76980196c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T15:44:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wrq6z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-26T15:44:18Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-s9f9w\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T15:44:48Z is after 2025-08-24T17:21:41Z" Feb 26 15:44:48 crc kubenswrapper[4907]: I0226 15:44:48.600006 4907 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-26T15:44:19Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-26T15:44:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-26T15:44:19Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b385be8ca84800beda307aea098ce9f4e640cd4b6c7bd2856c75b1a4193cb655\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T15:44:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://acf341c3480df31c1b94ef2f3feb5a3e7eef3fa85ef3292ad0e5ef70a4575cb3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T15:44:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T15:44:48Z is after 2025-08-24T17:21:41Z" Feb 26 15:44:48 crc kubenswrapper[4907]: I0226 15:44:48.617634 4907 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-26T15:44:18Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-26T15:44:18Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T15:44:48Z is after 2025-08-24T17:21:41Z" Feb 26 15:44:48 crc kubenswrapper[4907]: I0226 15:44:48.638557 4907 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-b2qgz" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3ab23cfe-46ea-420e-ba6c-38ac0d2804b0\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T15:44:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T15:44:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T15:44:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T15:44:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fa50b3ce686f099f6b9ed4dcb642c118a6294d2e92cfdbf59339d106c9052d1a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T15:44:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2vfj6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://608b79bf33a420a12900e4bce6e593b17cfa7c3e9ebbcc9378833dce3a84e31d\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://608b79bf33a420a12900e4bce6e593b17cfa7c3e9ebbcc9378833dce3a84e31d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-26T15:44:19Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-26T15:44:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2vfj6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e89433e3d1fc270f03f4dba736b947b987980198cfe9e4f66865ab6222ce82f4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e89433e3d1fc270f03f4dba736b947b987980198cfe9e4f66865ab6222ce82f4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-26T15:44:20Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-26T15:44:19Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2vfj6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e31f3856c094e119772c90aaa64b7decc756b6da339efc3d406daeaa8b274176\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e31f3856c094e119772c90aaa64b7decc756b6da339efc3d406daeaa8b274176\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-26T15:44:22Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-26T15:44:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2vfj6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bfc88
f0a13f82a4a192745b9a3eac44fea007542c73923ca729d6fd6336c1851\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://bfc88f0a13f82a4a192745b9a3eac44fea007542c73923ca729d6fd6336c1851\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-26T15:44:22Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-26T15:44:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2vfj6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://73cba4d9193c3840f98e95371a1cda6f5264d73d631ef29664dfd1b0f9852b52\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://73cba4d9193c3840f98e95371a1cda6f5264d73d631ef29664dfd1b0f9852b52\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-26T15:44:25Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2026-02-26T15:44:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2vfj6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bf00572269494256a1a7b40277ce094962baaa145f2147dde7870e4c19b8f688\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://bf00572269494256a1a7b40277ce094962baaa145f2147dde7870e4c19b8f688\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-26T15:44:26Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-26T15:44:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2vfj6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-26T15:44:18Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-b2qgz\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T15:44:48Z is after 2025-08-24T17:21:41Z" Feb 26 15:44:48 crc kubenswrapper[4907]: I0226 15:44:48.657456 4907 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-v5ng6" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"917eebf3-db36-47b8-af0a-b80d042fddab\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T15:44:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T15:44:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T15:44:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T15:44:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7f195a8a6d014276c4202f3995d294fe5026b640273192a6f463642b79d4ddda\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"runn
ing\\\":{\\\"startedAt\\\":\\\"2026-02-26T15:44:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9lmpq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://178aa71969c1efffd1f234213afe3cf84ffc1f8300112efb368309603695c3ee\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T15:44:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9lmpq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-26T15:44:18Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-v5ng6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T15:44:48Z is after 2025-08-24T17:21:41Z" Feb 26 15:44:48 crc kubenswrapper[4907]: 
I0226 15:44:48.688033 4907 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-vsvsw" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"49ee65e1-8667-4ad7-a403-c899f0cc6a70\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T15:44:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T15:44:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T15:44:18Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T15:44:18Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c70ed6854442dfb329171dc5c454c036c020cb91e1f6595eb3fbe2d95704d52d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T15:44:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7hmb7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://67439cebe8e10e13db8af6bc74e152eb562382fb3b2f026ba3cbfe42e3b4c921\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T15:44:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7hmb7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://800657f54374550b21f96594e9c9ce4e7dff28c5c09061192a95bb8a668ebbea\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T15:44:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7hmb7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9e7470d80d872846d4d91e9070becfa3496dca8af1b315e637c34edce0dcd57b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T15:44:20Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7hmb7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://17760db3d112b908ad1389e3c28c244e756ef06ec2b4f170e4f52e17f9a75a89\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T15:44:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7hmb7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://eca4b7a72754f7457c608969c5319a498c526ab128b28400d2aed5d0413ff487\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T15:44:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7hmb7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ed7db8f0288f2b3a14da208935b54a6702d7b68a6ec301250f2ebb9519354f9e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ed7db8f0288f2b3a14da208935b54a6702d7b68a6ec301250f2ebb9519354f9e\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-02-26T15:44:45Z\\\",\\\"message\\\":\\\"org/kind:Service k8s.ovn.org/owner:openshift-authentication/oauth-openshift]} name:Service_openshift-authentication/oauth-openshift_TCP_cluster options:{GoMap:map[event:false hairpin_snat_ip:169.254.0.5 fd69::5 neighbor_responder:none reject:true skip_snat:false]} protocol:{GoSet:[tcp]} selection_fields:{GoSet:[]} 
vips:{GoMap:map[10.217.4.222:443:]}] Rows:[] Columns:[] Mutations:[] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {c0c2f725-e461-454e-a88c-c8350d62e1ef}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:}]\\\\nF0226 15:44:45.034086 7002 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create admin network policy controller, err: could not add Event Handler for anpInformer during admin network policy controller initialization, handler {0x1fcc6e0 0x1fcc3c0 0x1fcc360} was not added to shared informer because it has stopped already, failed to start node network controller: failed to start default node network controller: failed to set node crc annotations: Internal error occurred: failed calling webhook \\\\\\\"node.network-node-identity.openshift.io\\\\\\\": failed to call webhook: Post \\\\\\\"https://127.0.0.1:9743/node?timeout=10s\\\\\\\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T15:44:45Z is after 2025-0\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-26T15:44:44Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 20s restarting failed container=ovnkube-controller 
pod=ovnkube-node-vsvsw_openshift-ovn-kubernetes(49ee65e1-8667-4ad7-a403-c899f0cc6a70)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7hmb7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://cc2b19d04bf2ef1455fa049ed09ef927305f1ec89b19b42f39b0d8c1397f69df\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T15:44:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7hmb7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b7621667d7c9c119893fe930093d4e1d2256a13aadc196023df28d1a78aef68c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b7621667d7c9c11989
3fe930093d4e1d2256a13aadc196023df28d1a78aef68c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-26T15:44:18Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-26T15:44:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7hmb7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-26T15:44:18Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-vsvsw\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T15:44:48Z is after 2025-08-24T17:21:41Z" Feb 26 15:44:48 crc kubenswrapper[4907]: I0226 15:44:48.703186 4907 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-26T15:44:18Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-26T15:44:18Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T15:44:48Z is after 2025-08-24T17:21:41Z" Feb 26 15:44:48 crc kubenswrapper[4907]: I0226 15:44:48.719102 4907 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-26T15:44:21Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-26T15:44:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-26T15:44:21Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e9637349a18a137859d53c939993c64cd1275117aeab8d855be9498820d9ec46\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T15:44:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-02-26T15:44:48Z is after 2025-08-24T17:21:41Z" Feb 26 15:44:48 crc kubenswrapper[4907]: I0226 15:44:48.736333 4907 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"27c9ab80-fcc8-4c5a-9d89-c0504e0e6396\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T15:42:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T15:42:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T15:42:18Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T15:42:18Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T15:42:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bbc5e8c015ccc6b1a4740c955375e4f995f69ff1f1f698d8e2660ef451da6b8c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T15:42:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://64e8ac34f3cae799ba04d2bba51c22e4d99cf03261778fe3ba7a2320e661e727\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T15:42:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://42e24dea757f775f836c5c1fdb77c920db85f523bc0a35d2f2fb22e766274556\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T15:42:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a3c61b08bda7c918a3fa7b01e6f80515ee05a5746e189e829d2872c181b80c85\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a3c61b08bda7c918a3fa7b01e6f80515ee05a5746e189e829d2872c181b80c85\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-02-26T15:44:12Z\\\",\\\"message\\\":\\\"le observer\\\\nW0226 15:44:11.651017 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0226 15:44:11.651151 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0226 15:44:11.653054 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-1720683088/tls.crt::/tmp/serving-cert-1720683088/tls.key\\\\\\\"\\\\nI0226 15:44:12.242500 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0226 15:44:12.245173 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0226 15:44:12.245192 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0226 15:44:12.245214 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0226 15:44:12.245219 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0226 15:44:12.248257 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0226 15:44:12.248276 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0226 15:44:12.248281 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0226 15:44:12.248286 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0226 15:44:12.248289 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0226 15:44:12.248292 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0226 15:44:12.248295 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0226 15:44:12.248403 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF0226 15:44:12.250972 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-26T15:44:11Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":4,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 1m20s restarting failed 
container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8cf7bf0e49be4282c641d1e48be50a327bb418475701bfde61f4249724709e11\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T15:42:21Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7ff4ef3cac1d6f77bf9c90ee9a0f1d8fca15084e93afdb4e4e0048cbfe904f19\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7ff4ef3cac1d6f77bf9c90ee9a0f1d8fca15084e93afdb4e4e0048cbfe904f19\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-26T15:42:19Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-26T15:42:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"n
ame\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-26T15:42:18Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T15:44:48Z is after 2025-08-24T17:21:41Z" Feb 26 15:44:48 crc kubenswrapper[4907]: I0226 15:44:48.755503 4907 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-26T15:44:19Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-26T15:44:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-26T15:44:19Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1e574efe4067ea713788905c2bd40d7ff4ed75353c577df5ee8ca730d5037434\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\
":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T15:44:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T15:44:48Z is after 2025-08-24T17:21:41Z" Feb 26 15:44:48 crc kubenswrapper[4907]: I0226 15:44:48.768888 4907 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-9gtgp" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ae882fbf-ac76-4363-a10c-60eaf80ee7c7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T15:44:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T15:44:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T15:44:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T15:44:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://78c4268a57d845c79f2bf6b5e3742785efea137f2b0b3c37cb1b6fc54274e30f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T15:44:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xl77m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-26T15:44:18Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-9gtgp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T15:44:48Z is after 2025-08-24T17:21:41Z" Feb 26 15:44:48 crc kubenswrapper[4907]: I0226 15:44:48.784573 4907 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-2gl5t" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"51024bd5-00ff-4e2f-927c-8c989b59d7be\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T15:44:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T15:44:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T15:44:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T15:44:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9a3cdc02208e8eab1e0c3c3f08a0759873ebfd63c98e64af187800d59a5b44da\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@s
ha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T15:44:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2fx5n\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"ph
ase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-26T15:44:18Z\\\"}}\" for pod \"openshift-multus\"/\"multus-2gl5t\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T15:44:48Z is after 2025-08-24T17:21:41Z" Feb 26 15:44:48 crc kubenswrapper[4907]: I0226 15:44:48.796689 4907 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-zsb5l" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"fd06f422-2c09-4da9-843c-75525df52517\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T15:44:19Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T15:44:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T15:44:19Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T15:44:19Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dbhj9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dbhj9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-26T15:44:19Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-zsb5l\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T15:44:48Z is after 2025-08-24T17:21:41Z" Feb 26 15:44:49 crc 
kubenswrapper[4907]: I0226 15:44:49.126475 4907 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-zsb5l" Feb 26 15:44:49 crc kubenswrapper[4907]: E0226 15:44:49.126642 4907 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-zsb5l" podUID="fd06f422-2c09-4da9-843c-75525df52517" Feb 26 15:44:50 crc kubenswrapper[4907]: I0226 15:44:50.073448 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 26 15:44:50 crc kubenswrapper[4907]: E0226 15:44:50.073667 4907 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-26 15:45:22.073640711 +0000 UTC m=+184.592202560 (durationBeforeRetry 32s). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 26 15:44:50 crc kubenswrapper[4907]: I0226 15:44:50.073907 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 26 15:44:50 crc kubenswrapper[4907]: I0226 15:44:50.073929 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 26 15:44:50 crc kubenswrapper[4907]: I0226 15:44:50.073961 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 26 15:44:50 crc kubenswrapper[4907]: I0226 15:44:50.073981 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: 
\"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 26 15:44:50 crc kubenswrapper[4907]: E0226 15:44:50.074052 4907 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Feb 26 15:44:50 crc kubenswrapper[4907]: E0226 15:44:50.074107 4907 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-02-26 15:45:22.074099475 +0000 UTC m=+184.592661324 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Feb 26 15:44:50 crc kubenswrapper[4907]: E0226 15:44:50.074105 4907 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Feb 26 15:44:50 crc kubenswrapper[4907]: E0226 15:44:50.074105 4907 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Feb 26 15:44:50 crc kubenswrapper[4907]: E0226 15:44:50.074130 4907 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Feb 26 15:44:50 crc kubenswrapper[4907]: E0226 
15:44:50.074226 4907 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Feb 26 15:44:50 crc kubenswrapper[4907]: E0226 15:44:50.074210 4907 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-02-26 15:45:22.074187557 +0000 UTC m=+184.592749486 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Feb 26 15:44:50 crc kubenswrapper[4907]: E0226 15:44:50.074268 4907 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2026-02-26 15:45:22.074262129 +0000 UTC m=+184.592823978 (durationBeforeRetry 32s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Feb 26 15:44:50 crc kubenswrapper[4907]: E0226 15:44:50.074289 4907 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Feb 26 15:44:50 crc kubenswrapper[4907]: E0226 15:44:50.074328 4907 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Feb 26 15:44:50 crc kubenswrapper[4907]: E0226 15:44:50.074340 4907 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Feb 26 15:44:50 crc kubenswrapper[4907]: E0226 15:44:50.074414 4907 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2026-02-26 15:45:22.074393193 +0000 UTC m=+184.592955102 (durationBeforeRetry 32s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Feb 26 15:44:50 crc kubenswrapper[4907]: I0226 15:44:50.126498 4907 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 26 15:44:50 crc kubenswrapper[4907]: I0226 15:44:50.126515 4907 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 26 15:44:50 crc kubenswrapper[4907]: E0226 15:44:50.126789 4907 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 26 15:44:50 crc kubenswrapper[4907]: I0226 15:44:50.126834 4907 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 26 15:44:50 crc kubenswrapper[4907]: E0226 15:44:50.126977 4907 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 26 15:44:50 crc kubenswrapper[4907]: E0226 15:44:50.127183 4907 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 26 15:44:51 crc kubenswrapper[4907]: I0226 15:44:51.126062 4907 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-zsb5l" Feb 26 15:44:51 crc kubenswrapper[4907]: E0226 15:44:51.126264 4907 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-zsb5l" podUID="fd06f422-2c09-4da9-843c-75525df52517" Feb 26 15:44:51 crc kubenswrapper[4907]: I0226 15:44:51.186198 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/fd06f422-2c09-4da9-843c-75525df52517-metrics-certs\") pod \"network-metrics-daemon-zsb5l\" (UID: \"fd06f422-2c09-4da9-843c-75525df52517\") " pod="openshift-multus/network-metrics-daemon-zsb5l" Feb 26 15:44:51 crc kubenswrapper[4907]: E0226 15:44:51.186427 4907 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Feb 26 15:44:51 crc kubenswrapper[4907]: E0226 15:44:51.186516 4907 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/fd06f422-2c09-4da9-843c-75525df52517-metrics-certs podName:fd06f422-2c09-4da9-843c-75525df52517 nodeName:}" failed. No retries permitted until 2026-02-26 15:45:23.186495015 +0000 UTC m=+185.705056864 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/fd06f422-2c09-4da9-843c-75525df52517-metrics-certs") pod "network-metrics-daemon-zsb5l" (UID: "fd06f422-2c09-4da9-843c-75525df52517") : object "openshift-multus"/"metrics-daemon-secret" not registered Feb 26 15:44:52 crc kubenswrapper[4907]: I0226 15:44:52.125516 4907 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 26 15:44:52 crc kubenswrapper[4907]: I0226 15:44:52.125661 4907 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 26 15:44:52 crc kubenswrapper[4907]: I0226 15:44:52.125566 4907 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 26 15:44:52 crc kubenswrapper[4907]: E0226 15:44:52.125821 4907 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 26 15:44:52 crc kubenswrapper[4907]: E0226 15:44:52.125924 4907 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 26 15:44:52 crc kubenswrapper[4907]: E0226 15:44:52.126022 4907 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 26 15:44:53 crc kubenswrapper[4907]: I0226 15:44:53.126330 4907 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-zsb5l" Feb 26 15:44:53 crc kubenswrapper[4907]: E0226 15:44:53.126516 4907 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-zsb5l" podUID="fd06f422-2c09-4da9-843c-75525df52517" Feb 26 15:44:53 crc kubenswrapper[4907]: E0226 15:44:53.227959 4907 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" Feb 26 15:44:54 crc kubenswrapper[4907]: I0226 15:44:54.125663 4907 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 26 15:44:54 crc kubenswrapper[4907]: E0226 15:44:54.125784 4907 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 26 15:44:54 crc kubenswrapper[4907]: I0226 15:44:54.125669 4907 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 26 15:44:54 crc kubenswrapper[4907]: E0226 15:44:54.125857 4907 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 26 15:44:54 crc kubenswrapper[4907]: I0226 15:44:54.125671 4907 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 26 15:44:54 crc kubenswrapper[4907]: E0226 15:44:54.125904 4907 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 26 15:44:55 crc kubenswrapper[4907]: I0226 15:44:55.126422 4907 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-zsb5l" Feb 26 15:44:55 crc kubenswrapper[4907]: E0226 15:44:55.126655 4907 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-zsb5l" podUID="fd06f422-2c09-4da9-843c-75525df52517" Feb 26 15:44:56 crc kubenswrapper[4907]: I0226 15:44:56.126014 4907 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 26 15:44:56 crc kubenswrapper[4907]: E0226 15:44:56.126264 4907 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 26 15:44:56 crc kubenswrapper[4907]: I0226 15:44:56.126845 4907 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 26 15:44:56 crc kubenswrapper[4907]: E0226 15:44:56.127125 4907 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 26 15:44:56 crc kubenswrapper[4907]: I0226 15:44:56.127748 4907 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 26 15:44:56 crc kubenswrapper[4907]: E0226 15:44:56.127917 4907 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 26 15:44:56 crc kubenswrapper[4907]: I0226 15:44:56.786017 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 26 15:44:56 crc kubenswrapper[4907]: I0226 15:44:56.786084 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 26 15:44:56 crc kubenswrapper[4907]: I0226 15:44:56.786108 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 26 15:44:56 crc kubenswrapper[4907]: I0226 15:44:56.786138 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 26 15:44:56 crc kubenswrapper[4907]: I0226 15:44:56.786160 4907 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-26T15:44:56Z","lastTransitionTime":"2026-02-26T15:44:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 26 15:44:56 crc kubenswrapper[4907]: E0226 15:44:56.807939 4907 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"7800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"24148068Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"8\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"24608868Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-02-26T15:44:56Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-26T15:44:56Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-26T15:44:56Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-26T15:44:56Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-26T15:44:56Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-26T15:44:56Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-26T15:44:56Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-26T15:44:56Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"16aec221-b9ec-4b79-ac12-986d05cb9b8b\\\",\\\"systemUUID\\\":\\\"7af7b453-01c3-4b8b-8c30-b1df8ce070ce\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T15:44:56Z is after 2025-08-24T17:21:41Z" Feb 26 15:44:56 crc kubenswrapper[4907]: I0226 15:44:56.813344 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 26 15:44:56 crc kubenswrapper[4907]: I0226 15:44:56.813417 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 26 15:44:56 crc kubenswrapper[4907]: I0226 15:44:56.813435 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 26 15:44:56 crc kubenswrapper[4907]: I0226 15:44:56.813460 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 26 15:44:56 crc kubenswrapper[4907]: I0226 15:44:56.813476 4907 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-26T15:44:56Z","lastTransitionTime":"2026-02-26T15:44:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 26 15:44:56 crc kubenswrapper[4907]: E0226 15:44:56.834141 4907 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"7800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"24148068Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"8\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"24608868Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-02-26T15:44:56Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-26T15:44:56Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-26T15:44:56Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-26T15:44:56Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-26T15:44:56Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-26T15:44:56Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-26T15:44:56Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-26T15:44:56Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"16aec221-b9ec-4b79-ac12-986d05cb9b8b\\\",\\\"systemUUID\\\":\\\"7af7b453-01c3-4b8b-8c30-b1df8ce070ce\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T15:44:56Z is after 2025-08-24T17:21:41Z" Feb 26 15:44:56 crc kubenswrapper[4907]: I0226 15:44:56.839088 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 26 15:44:56 crc kubenswrapper[4907]: I0226 15:44:56.839220 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 26 15:44:56 crc kubenswrapper[4907]: I0226 15:44:56.839241 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 26 15:44:56 crc kubenswrapper[4907]: I0226 15:44:56.839269 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 26 15:44:56 crc kubenswrapper[4907]: I0226 15:44:56.839287 4907 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-26T15:44:56Z","lastTransitionTime":"2026-02-26T15:44:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 26 15:44:56 crc kubenswrapper[4907]: E0226 15:44:56.859387 4907 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"7800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"24148068Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"8\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"24608868Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-02-26T15:44:56Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-26T15:44:56Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-26T15:44:56Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-26T15:44:56Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-26T15:44:56Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-26T15:44:56Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-26T15:44:56Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-26T15:44:56Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"16aec221-b9ec-4b79-ac12-986d05cb9b8b\\\",\\\"systemUUID\\\":\\\"7af7b453-01c3-4b8b-8c30-b1df8ce070ce\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T15:44:56Z is after 2025-08-24T17:21:41Z" Feb 26 15:44:56 crc kubenswrapper[4907]: I0226 15:44:56.864548 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 26 15:44:56 crc kubenswrapper[4907]: I0226 15:44:56.864649 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 26 15:44:56 crc kubenswrapper[4907]: I0226 15:44:56.864674 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 26 15:44:56 crc kubenswrapper[4907]: I0226 15:44:56.864705 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 26 15:44:56 crc kubenswrapper[4907]: I0226 15:44:56.864729 4907 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-26T15:44:56Z","lastTransitionTime":"2026-02-26T15:44:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 26 15:44:56 crc kubenswrapper[4907]: E0226 15:44:56.883666 4907 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"7800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"24148068Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"8\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"24608868Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-02-26T15:44:56Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-26T15:44:56Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-26T15:44:56Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-26T15:44:56Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-26T15:44:56Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-26T15:44:56Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-26T15:44:56Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-26T15:44:56Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"16aec221-b9ec-4b79-ac12-986d05cb9b8b\\\",\\\"systemUUID\\\":\\\"7af7b453-01c3-4b8b-8c30-b1df8ce070ce\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T15:44:56Z is after 2025-08-24T17:21:41Z" Feb 26 15:44:56 crc kubenswrapper[4907]: I0226 15:44:56.888802 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 26 15:44:56 crc kubenswrapper[4907]: I0226 15:44:56.888874 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 26 15:44:56 crc kubenswrapper[4907]: I0226 15:44:56.888893 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 26 15:44:56 crc kubenswrapper[4907]: I0226 15:44:56.888929 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 26 15:44:56 crc kubenswrapper[4907]: I0226 15:44:56.888947 4907 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-26T15:44:56Z","lastTransitionTime":"2026-02-26T15:44:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 26 15:44:56 crc kubenswrapper[4907]: E0226 15:44:56.913023 4907 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"7800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"24148068Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"8\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"24608868Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-02-26T15:44:56Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-26T15:44:56Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-26T15:44:56Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-26T15:44:56Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-26T15:44:56Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-26T15:44:56Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-26T15:44:56Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-26T15:44:56Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"16aec221-b9ec-4b79-ac12-986d05cb9b8b\\\",\\\"systemUUID\\\":\\\"7af7b453-01c3-4b8b-8c30-b1df8ce070ce\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T15:44:56Z is after 2025-08-24T17:21:41Z" Feb 26 15:44:56 crc kubenswrapper[4907]: E0226 15:44:56.913177 4907 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Feb 26 15:44:57 crc kubenswrapper[4907]: I0226 15:44:57.125948 4907 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-zsb5l" Feb 26 15:44:57 crc kubenswrapper[4907]: E0226 15:44:57.126159 4907 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-zsb5l" podUID="fd06f422-2c09-4da9-843c-75525df52517" Feb 26 15:44:58 crc kubenswrapper[4907]: I0226 15:44:58.126550 4907 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 26 15:44:58 crc kubenswrapper[4907]: I0226 15:44:58.126661 4907 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 26 15:44:58 crc kubenswrapper[4907]: E0226 15:44:58.126804 4907 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 26 15:44:58 crc kubenswrapper[4907]: I0226 15:44:58.126625 4907 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 26 15:44:58 crc kubenswrapper[4907]: E0226 15:44:58.127037 4907 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 26 15:44:58 crc kubenswrapper[4907]: E0226 15:44:58.127185 4907 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 26 15:44:58 crc kubenswrapper[4907]: I0226 15:44:58.128120 4907 scope.go:117] "RemoveContainer" containerID="a3c61b08bda7c918a3fa7b01e6f80515ee05a5746e189e829d2872c181b80c85" Feb 26 15:44:58 crc kubenswrapper[4907]: E0226 15:44:58.128448 4907 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-apiserver-check-endpoints\" with CrashLoopBackOff: \"back-off 1m20s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\"" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" Feb 26 15:44:58 crc kubenswrapper[4907]: I0226 15:44:58.146400 4907 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-9gtgp" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ae882fbf-ac76-4363-a10c-60eaf80ee7c7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T15:44:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T15:44:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T15:44:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T15:44:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://78c4268a57d845c79f2bf6b5e3742785efea137f2b0b3c37
cb1b6fc54274e30f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T15:44:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xl77m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-26T15:44:18Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-9gtgp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T15:44:58Z is after 2025-08-24T17:21:41Z" Feb 26 15:44:58 crc kubenswrapper[4907]: I0226 15:44:58.168539 4907 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-2gl5t" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"51024bd5-00ff-4e2f-927c-8c989b59d7be\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T15:44:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T15:44:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T15:44:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T15:44:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9a3cdc02208e8eab1e0c3c3f08a0759873ebfd63c98e64af187800d59a5b44da\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T15:44:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2fx5n\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-26T15:44:18Z\\\"}}\" for pod \"openshift-multus\"/\"multus-2gl5t\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T15:44:58Z is after 2025-08-24T17:21:41Z" Feb 26 15:44:58 crc kubenswrapper[4907]: I0226 15:44:58.202367 4907 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-vsvsw" err="failed to patch 
status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"49ee65e1-8667-4ad7-a403-c899f0cc6a70\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T15:44:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T15:44:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T15:44:18Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T15:44:18Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c70ed6854442dfb329171dc5c454c036c020cb91e1f6595eb3fbe2d95704d52d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T15:44:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7hmb7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://67439cebe8e10e13db8af6bc74e152eb562382fb3b2f026ba3cbfe42e3b4c921\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T15:44:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7hmb7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://800657f54374550b21f96594e9c9ce4e7dff28c5c09061192a95bb8a668ebbea\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T15:44:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7hmb7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9e7470d80d872846d4d91e9070becfa3496dca8af1b315e637c34edce0dcd57b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T15:44:20Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7hmb7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://17760db3d112b908ad1389e3c28c244e756ef06ec2b4f170e4f52e17f9a75a89\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T15:44:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7hmb7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://eca4b7a72754f7457c608969c5319a498c526ab128b28400d2aed5d0413ff487\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T15:44:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7hmb7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ed7db8f0288f2b3a14da208935b54a6702d7b68a6ec301250f2ebb9519354f9e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ed7db8f0288f2b3a14da208935b54a6702d7b68a6ec301250f2ebb9519354f9e\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-02-26T15:44:45Z\\\",\\\"message\\\":\\\"org/kind:Service k8s.ovn.org/owner:openshift-authentication/oauth-openshift]} name:Service_openshift-authentication/oauth-openshift_TCP_cluster options:{GoMap:map[event:false hairpin_snat_ip:169.254.0.5 fd69::5 neighbor_responder:none reject:true skip_snat:false]} protocol:{GoSet:[tcp]} selection_fields:{GoSet:[]} 
vips:{GoMap:map[10.217.4.222:443:]}] Rows:[] Columns:[] Mutations:[] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {c0c2f725-e461-454e-a88c-c8350d62e1ef}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:}]\\\\nF0226 15:44:45.034086 7002 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create admin network policy controller, err: could not add Event Handler for anpInformer during admin network policy controller initialization, handler {0x1fcc6e0 0x1fcc3c0 0x1fcc360} was not added to shared informer because it has stopped already, failed to start node network controller: failed to start default node network controller: failed to set node crc annotations: Internal error occurred: failed calling webhook \\\\\\\"node.network-node-identity.openshift.io\\\\\\\": failed to call webhook: Post \\\\\\\"https://127.0.0.1:9743/node?timeout=10s\\\\\\\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T15:44:45Z is after 2025-0\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-26T15:44:44Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 20s restarting failed container=ovnkube-controller 
pod=ovnkube-node-vsvsw_openshift-ovn-kubernetes(49ee65e1-8667-4ad7-a403-c899f0cc6a70)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7hmb7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://cc2b19d04bf2ef1455fa049ed09ef927305f1ec89b19b42f39b0d8c1397f69df\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T15:44:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7hmb7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b7621667d7c9c119893fe930093d4e1d2256a13aadc196023df28d1a78aef68c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b7621667d7c9c11989
3fe930093d4e1d2256a13aadc196023df28d1a78aef68c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-26T15:44:18Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-26T15:44:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7hmb7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-26T15:44:18Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-vsvsw\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T15:44:58Z is after 2025-08-24T17:21:41Z" Feb 26 15:44:58 crc kubenswrapper[4907]: I0226 15:44:58.223730 4907 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-26T15:44:18Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-26T15:44:18Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T15:44:58Z is after 2025-08-24T17:21:41Z" Feb 26 15:44:58 crc kubenswrapper[4907]: E0226 15:44:58.229049 4907 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
Feb 26 15:44:58 crc kubenswrapper[4907]: I0226 15:44:58.248231 4907 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-26T15:44:21Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-26T15:44:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-26T15:44:21Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e9637349a18a137859d53c939993c64cd1275117aeab8d855be9498820d9ec46\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T15:44:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error 
occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T15:44:58Z is after 2025-08-24T17:21:41Z" Feb 26 15:44:58 crc kubenswrapper[4907]: I0226 15:44:58.270562 4907 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"27c9ab80-fcc8-4c5a-9d89-c0504e0e6396\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T15:42:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T15:42:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T15:42:18Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T15:42:18Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T15:42:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bbc5e8c015ccc6b1a4740c955375e4f995f69ff1f1f698d8e2660ef451da6b8c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T15:42:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://64e8ac34f3cae799ba04d2bba51c22e4d99cf03261778fe3ba7a2320e661e727\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T15:42:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://42e24dea757f775f836c5c1fdb77c920db85f523bc0a35d2f2fb22e766274556\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T15:42:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a3c61b08bda7c918a3fa7b01e6f80515ee05a5746e189e829d2872c181b80c85\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a3c61b08bda7c918a3fa7b01e6f80515ee05a5746e189e829d2872c181b80c85\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-02-26T15:44:12Z\\\",\\\"message\\\":\\\"le observer\\\\nW0226 15:44:11.651017 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0226 15:44:11.651151 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0226 15:44:11.653054 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-1720683088/tls.crt::/tmp/serving-cert-1720683088/tls.key\\\\\\\"\\\\nI0226 15:44:12.242500 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0226 15:44:12.245173 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0226 15:44:12.245192 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0226 15:44:12.245214 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0226 15:44:12.245219 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0226 15:44:12.248257 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0226 15:44:12.248276 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0226 15:44:12.248281 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0226 15:44:12.248286 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0226 15:44:12.248289 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0226 15:44:12.248292 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0226 15:44:12.248295 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0226 15:44:12.248403 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF0226 15:44:12.250972 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-26T15:44:11Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":4,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 1m20s restarting failed 
container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8cf7bf0e49be4282c641d1e48be50a327bb418475701bfde61f4249724709e11\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T15:42:21Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7ff4ef3cac1d6f77bf9c90ee9a0f1d8fca15084e93afdb4e4e0048cbfe904f19\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7ff4ef3cac1d6f77bf9c90ee9a0f1d8fca15084e93afdb4e4e0048cbfe904f19\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-26T15:42:19Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-26T15:42:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"n
ame\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-26T15:42:18Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T15:44:58Z is after 2025-08-24T17:21:41Z" Feb 26 15:44:58 crc kubenswrapper[4907]: I0226 15:44:58.286464 4907 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-26T15:44:19Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-26T15:44:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-26T15:44:19Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1e574efe4067ea713788905c2bd40d7ff4ed75353c577df5ee8ca730d5037434\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\
":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T15:44:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T15:44:58Z is after 2025-08-24T17:21:41Z" Feb 26 15:44:58 crc kubenswrapper[4907]: I0226 15:44:58.300289 4907 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-zsb5l" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"fd06f422-2c09-4da9-843c-75525df52517\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T15:44:19Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T15:44:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T15:44:19Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T15:44:19Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dbhj9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dbhj9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-26T15:44:19Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-zsb5l\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T15:44:58Z is after 2025-08-24T17:21:41Z" Feb 26 15:44:58 crc kubenswrapper[4907]: I0226 15:44:58.315475 4907 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-958vt" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"4569fec7-a859-4a9e-b9d9-34ccc7c6be9c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T15:44:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T15:44:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T15:44:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T15:44:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1c9c60e926f3c2412b5a8698e82e161e6e34373a3e6b471698cb521b9e494871\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T
15:44:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8nhj9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-26T15:44:18Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-958vt\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T15:44:58Z is after 2025-08-24T17:21:41Z" Feb 26 15:44:58 crc kubenswrapper[4907]: I0226 15:44:58.332493 4907 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-26T15:44:18Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-26T15:44:18Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T15:44:58Z is after 2025-08-24T17:21:41Z" Feb 26 15:44:58 crc kubenswrapper[4907]: I0226 15:44:58.347236 4907 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-s9f9w" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"432281c6-dcf8-4471-9801-9194000a9abd\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T15:44:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T15:44:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T15:44:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T15:44:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a751c325fc4b5b8668afd084530efeddd36543db3710b4d5ab525dc8e572bb1e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T15:44:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wrq6z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://00c5078cb42e7e369ed71d8867be75c4f1bf4
73eae40d151eacbeda76980196c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T15:44:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wrq6z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-26T15:44:18Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-s9f9w\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T15:44:58Z is after 2025-08-24T17:21:41Z" Feb 26 15:44:58 crc kubenswrapper[4907]: I0226 15:44:58.362787 4907 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"87fcecd2-771a-4669-a303-2f74cf7ac919\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T15:42:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T15:42:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T15:43:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T15:43:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T15:42:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://aa0a0c55e7d739a2c76f82d2886d67e4aa4334b873445cb317782b057f7afa65\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T15:42:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ba401e1eedaa38b967c1b76dc8ee8221684e36e0f152a24131706adc0346bb2d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha
256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T15:42:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e44c81cef61f4aecc15b45d6bbb7f3552588a1f0256042998c5a2f158c3879c8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T15:42:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://81ae0a80fac56ae4b446c60d3478f3b6e4a448314ac78ad45840c7c09c232f0d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"conta
inerID\\\":\\\"cri-o://81ae0a80fac56ae4b446c60d3478f3b6e4a448314ac78ad45840c7c09c232f0d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-26T15:42:19Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-26T15:42:19Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-26T15:42:18Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T15:44:58Z is after 2025-08-24T17:21:41Z" Feb 26 15:44:58 crc kubenswrapper[4907]: I0226 15:44:58.379846 4907 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"2e5aef55-fc68-4c1c-92e1-41a202917e84\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T15:42:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T15:42:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T15:43:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T15:43:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T15:42:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5033366771e6954e4bdd280702ad5d080a1306e8fbfa2e99a0221a3865c13ed6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://62c4450c857a205706fb8639ca0bf473be68a81f8e70a989080e74e6fb9795c8\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-02-26T15:42:50Z\\\",\\\"message\\\":\\\"+ timeout 3m /bin/bash -exuo pipefail -c 'while [ -n \\\\\\\"$(ss -Htanop \\\\\\\\( sport = 10357 \\\\\\\\))\\\\\\\" ]; do sleep 1; done'\\\\n++ ss -Htanop '(' sport = 10357 ')'\\\\n+ '[' -n '' ']'\\\\n+ exec cluster-policy-controller start --config=/etc/kubernetes/static-pod-resources/configmaps/cluster-policy-controller-config/config.yaml --kubeconfig=/etc/kubernetes/static-pod-resources/configmaps/controller-manager-kubeconfig/kubeconfig --namespace=openshift-kube-controller-manager 
-v=2\\\\nI0226 15:42:20.262653 1 leaderelection.go:121] The leader election gives 4 retries and allows for 30s of clock skew. The kube-apiserver downtime tolerance is 78s. Worst non-graceful lease acquisition is 2m43s. Worst graceful lease acquisition is {26s}.\\\\nI0226 15:42:20.264750 1 observer_polling.go:159] Starting file observer\\\\nI0226 15:42:20.297295 1 builder.go:298] cluster-policy-controller version 4.18.0-202501230001.p0.g5fd8525.assembly.stream.el9-5fd8525-5fd852525909ce6eab52972ba9ce8fcf56528eb9\\\\nI0226 15:42:20.301511 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/etc/kubernetes/static-pod-resources/secrets/serving-cert/tls.crt::/etc/kubernetes/static-pod-resources/secrets/serving-cert/tls.key\\\\\\\"\\\\nF0226 15:42:50.781187 1 cmd.go:179] failed checking apiserver connectivity: Get \\\\\\\"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/openshift-kube-controller-manager/leases/cluster-policy-controller-lock\\\\\\\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T15:42:49Z is after 
2026-02-23T05:33:13Z\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-26T15:42:19Z\\\"}},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T15:42:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b4592db3d17945a9ed96383e96902333033b03f395da93754ffbca7d15b1e633\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T15:42:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ac01de0d4759557a4502a3c742ecae613068311f796904e35769463f9a277620\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T15:42:19Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4e11dad962ef019f41cac623fb986f909a7c58377cd8d52e58ec300f7cc4cbb2\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T15:42:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-26T15:42:18Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T15:44:58Z is after 2025-08-24T17:21:41Z" Feb 26 15:44:58 crc kubenswrapper[4907]: I0226 15:44:58.396343 4907 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-v5ng6" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"917eebf3-db36-47b8-af0a-b80d042fddab\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T15:44:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T15:44:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T15:44:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T15:44:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7f195a8a6d014276c4202f3995d294fe5026b640273192a6f463642b79d4ddda\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T15:44:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9lmpq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://178aa71969c1efffd1f234213afe3cf84ffc1f83
00112efb368309603695c3ee\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T15:44:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9lmpq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-26T15:44:18Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-v5ng6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T15:44:58Z is after 2025-08-24T17:21:41Z" Feb 26 15:44:58 crc kubenswrapper[4907]: I0226 15:44:58.416453 4907 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-26T15:44:19Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-26T15:44:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-26T15:44:19Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b385be8ca84800beda307aea098ce9f4e640cd4b6c7bd2856c75b1a4193cb655\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T15:44:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://acf341c3480df31c1b94ef2f3feb5a3e7eef3fa85ef3292ad0e5ef70a4575cb3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T15:44:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T15:44:58Z is after 2025-08-24T17:21:41Z" Feb 26 15:44:58 crc kubenswrapper[4907]: I0226 15:44:58.437165 4907 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-26T15:44:18Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-26T15:44:18Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T15:44:58Z is after 2025-08-24T17:21:41Z" Feb 26 15:44:58 crc kubenswrapper[4907]: I0226 15:44:58.461715 4907 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-b2qgz" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3ab23cfe-46ea-420e-ba6c-38ac0d2804b0\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T15:44:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T15:44:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T15:44:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T15:44:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fa50b3ce686f099f6b9ed4dcb642c118a6294d2e92cfdbf59339d106c9052d1a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T15:44:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2vfj6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://608b79bf33a420a12900e4bce6e593b17cfa7c3e9ebbcc9378833dce3a84e31d\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://608b79bf33a420a12900e4bce6e593b17cfa7c3e9ebbcc9378833dce3a84e31d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-26T15:44:19Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-26T15:44:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2vfj6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e89433e3d1fc270f03f4dba736b947b987980198cfe9e4f66865ab6222ce82f4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e89433e3d1fc270f03f4dba736b947b987980198cfe9e4f66865ab6222ce82f4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-26T15:44:20Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-26T15:44:19Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2vfj6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e31f3856c094e119772c90aaa64b7decc756b6da339efc3d406daeaa8b274176\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e31f3856c094e119772c90aaa64b7decc756b6da339efc3d406daeaa8b274176\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-26T15:44:22Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-26T15:44:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2vfj6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bfc88
f0a13f82a4a192745b9a3eac44fea007542c73923ca729d6fd6336c1851\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://bfc88f0a13f82a4a192745b9a3eac44fea007542c73923ca729d6fd6336c1851\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-26T15:44:22Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-26T15:44:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2vfj6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://73cba4d9193c3840f98e95371a1cda6f5264d73d631ef29664dfd1b0f9852b52\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://73cba4d9193c3840f98e95371a1cda6f5264d73d631ef29664dfd1b0f9852b52\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-26T15:44:25Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2026-02-26T15:44:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2vfj6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bf00572269494256a1a7b40277ce094962baaa145f2147dde7870e4c19b8f688\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://bf00572269494256a1a7b40277ce094962baaa145f2147dde7870e4c19b8f688\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-26T15:44:26Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-26T15:44:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2vfj6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-26T15:44:18Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-b2qgz\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T15:44:58Z is after 2025-08-24T17:21:41Z" Feb 26 15:44:59 crc kubenswrapper[4907]: I0226 15:44:59.125844 4907 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-zsb5l" Feb 26 15:44:59 crc kubenswrapper[4907]: E0226 15:44:59.126097 4907 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-zsb5l" podUID="fd06f422-2c09-4da9-843c-75525df52517" Feb 26 15:45:00 crc kubenswrapper[4907]: I0226 15:45:00.126723 4907 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 26 15:45:00 crc kubenswrapper[4907]: I0226 15:45:00.126818 4907 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 26 15:45:00 crc kubenswrapper[4907]: I0226 15:45:00.126899 4907 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 26 15:45:00 crc kubenswrapper[4907]: E0226 15:45:00.128270 4907 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 26 15:45:00 crc kubenswrapper[4907]: E0226 15:45:00.128391 4907 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 26 15:45:00 crc kubenswrapper[4907]: E0226 15:45:00.128433 4907 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 26 15:45:01 crc kubenswrapper[4907]: I0226 15:45:01.125923 4907 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-zsb5l" Feb 26 15:45:01 crc kubenswrapper[4907]: E0226 15:45:01.126231 4907 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-zsb5l" podUID="fd06f422-2c09-4da9-843c-75525df52517" Feb 26 15:45:02 crc kubenswrapper[4907]: I0226 15:45:02.126202 4907 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 26 15:45:02 crc kubenswrapper[4907]: I0226 15:45:02.126334 4907 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 26 15:45:02 crc kubenswrapper[4907]: E0226 15:45:02.126532 4907 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 26 15:45:02 crc kubenswrapper[4907]: I0226 15:45:02.126729 4907 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 26 15:45:02 crc kubenswrapper[4907]: E0226 15:45:02.127418 4907 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 26 15:45:02 crc kubenswrapper[4907]: E0226 15:45:02.127524 4907 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 26 15:45:02 crc kubenswrapper[4907]: I0226 15:45:02.128024 4907 scope.go:117] "RemoveContainer" containerID="ed7db8f0288f2b3a14da208935b54a6702d7b68a6ec301250f2ebb9519354f9e" Feb 26 15:45:02 crc kubenswrapper[4907]: E0226 15:45:02.128352 4907 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 20s restarting failed container=ovnkube-controller pod=ovnkube-node-vsvsw_openshift-ovn-kubernetes(49ee65e1-8667-4ad7-a403-c899f0cc6a70)\"" pod="openshift-ovn-kubernetes/ovnkube-node-vsvsw" podUID="49ee65e1-8667-4ad7-a403-c899f0cc6a70" Feb 26 15:45:03 crc kubenswrapper[4907]: I0226 15:45:03.126074 4907 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-zsb5l" Feb 26 15:45:03 crc kubenswrapper[4907]: E0226 15:45:03.126665 4907 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-zsb5l" podUID="fd06f422-2c09-4da9-843c-75525df52517" Feb 26 15:45:03 crc kubenswrapper[4907]: E0226 15:45:03.230913 4907 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" Feb 26 15:45:04 crc kubenswrapper[4907]: I0226 15:45:04.125721 4907 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 26 15:45:04 crc kubenswrapper[4907]: I0226 15:45:04.125797 4907 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 26 15:45:04 crc kubenswrapper[4907]: I0226 15:45:04.125798 4907 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 26 15:45:04 crc kubenswrapper[4907]: E0226 15:45:04.126168 4907 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 26 15:45:04 crc kubenswrapper[4907]: E0226 15:45:04.126325 4907 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 26 15:45:04 crc kubenswrapper[4907]: E0226 15:45:04.126436 4907 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 26 15:45:04 crc kubenswrapper[4907]: I0226 15:45:04.138563 4907 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-config-operator/kube-rbac-proxy-crio-crc"] Feb 26 15:45:05 crc kubenswrapper[4907]: I0226 15:45:05.125911 4907 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-zsb5l" Feb 26 15:45:05 crc kubenswrapper[4907]: E0226 15:45:05.126278 4907 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-zsb5l" podUID="fd06f422-2c09-4da9-843c-75525df52517" Feb 26 15:45:06 crc kubenswrapper[4907]: I0226 15:45:06.126523 4907 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 26 15:45:06 crc kubenswrapper[4907]: I0226 15:45:06.126522 4907 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 26 15:45:06 crc kubenswrapper[4907]: I0226 15:45:06.126723 4907 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 26 15:45:06 crc kubenswrapper[4907]: E0226 15:45:06.126893 4907 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 26 15:45:06 crc kubenswrapper[4907]: E0226 15:45:06.127313 4907 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 26 15:45:06 crc kubenswrapper[4907]: E0226 15:45:06.127675 4907 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 26 15:45:06 crc kubenswrapper[4907]: I0226 15:45:06.926361 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 26 15:45:06 crc kubenswrapper[4907]: I0226 15:45:06.926438 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 26 15:45:06 crc kubenswrapper[4907]: I0226 15:45:06.926460 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 26 15:45:06 crc kubenswrapper[4907]: I0226 15:45:06.926490 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 26 15:45:06 crc kubenswrapper[4907]: I0226 15:45:06.926511 4907 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-26T15:45:06Z","lastTransitionTime":"2026-02-26T15:45:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 26 15:45:06 crc kubenswrapper[4907]: E0226 15:45:06.946668 4907 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"7800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"24148068Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"8\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"24608868Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-02-26T15:45:06Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-26T15:45:06Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-26T15:45:06Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-26T15:45:06Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-26T15:45:06Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-26T15:45:06Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID 
available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-26T15:45:06Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-26T15:45:06Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"
registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\
"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb617
3ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"reg
istry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@s
ha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"16aec221-b9ec-4b79-ac12-986d05cb9b8b\\\",\\\"systemUUID\\\":\\\"7af7b453-01c3-4b8b-8c30-b1df8ce070ce\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T15:45:06Z is after 2025-08-24T17:21:41Z" Feb 26 15:45:06 crc kubenswrapper[4907]: I0226 15:45:06.951481 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 26 15:45:06 crc kubenswrapper[4907]: I0226 15:45:06.951899 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 26 15:45:06 crc kubenswrapper[4907]: I0226 15:45:06.951922 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 26 15:45:06 crc kubenswrapper[4907]: I0226 15:45:06.951943 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 26 15:45:06 crc kubenswrapper[4907]: I0226 15:45:06.951958 4907 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-26T15:45:06Z","lastTransitionTime":"2026-02-26T15:45:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 26 15:45:06 crc kubenswrapper[4907]: I0226 15:45:06.955527 4907 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-2gl5t_51024bd5-00ff-4e2f-927c-8c989b59d7be/kube-multus/0.log" Feb 26 15:45:06 crc kubenswrapper[4907]: I0226 15:45:06.955609 4907 generic.go:334] "Generic (PLEG): container finished" podID="51024bd5-00ff-4e2f-927c-8c989b59d7be" containerID="9a3cdc02208e8eab1e0c3c3f08a0759873ebfd63c98e64af187800d59a5b44da" exitCode=1 Feb 26 15:45:06 crc kubenswrapper[4907]: I0226 15:45:06.955649 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-2gl5t" event={"ID":"51024bd5-00ff-4e2f-927c-8c989b59d7be","Type":"ContainerDied","Data":"9a3cdc02208e8eab1e0c3c3f08a0759873ebfd63c98e64af187800d59a5b44da"} Feb 26 15:45:06 crc kubenswrapper[4907]: I0226 15:45:06.956168 4907 scope.go:117] "RemoveContainer" containerID="9a3cdc02208e8eab1e0c3c3f08a0759873ebfd63c98e64af187800d59a5b44da" Feb 26 15:45:06 crc kubenswrapper[4907]: I0226 15:45:06.971952 4907 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-26T15:44:18Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-26T15:44:18Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T15:45:06Z is after 2025-08-24T17:21:41Z" Feb 26 15:45:06 crc kubenswrapper[4907]: E0226 15:45:06.978029 4907 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status 
\"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"7800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"24148068Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"8\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"24608868Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-02-26T15:45:06Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-26T15:45:06Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-26T15:45:06Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-26T15:45:06Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-26T15:45:06Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-26T15:45:06Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-26T15:45:06Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-26T15:45:06Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"16aec221-b9ec-4b79-ac12-986d05cb9b8b\\\",\\\"systemUUID\\\":\\\"7af7b453-01c3-4b8b-8c30-b1df8ce070ce\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T15:45:06Z is after 2025-08-24T17:21:41Z" Feb 26 15:45:06 crc kubenswrapper[4907]: I0226 15:45:06.982438 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 26 15:45:06 crc kubenswrapper[4907]: I0226 15:45:06.982498 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 26 15:45:06 crc kubenswrapper[4907]: I0226 15:45:06.982520 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 26 15:45:06 crc kubenswrapper[4907]: I0226 15:45:06.982548 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 26 15:45:06 crc kubenswrapper[4907]: I0226 15:45:06.982571 4907 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-26T15:45:06Z","lastTransitionTime":"2026-02-26T15:45:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 26 15:45:06 crc kubenswrapper[4907]: I0226 15:45:06.996661 4907 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-s9f9w" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"432281c6-dcf8-4471-9801-9194000a9abd\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T15:44:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T15:44:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T15:44:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T15:44:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a751c325fc4b5b8668afd084530efeddd36543db3710b4d5ab525dc8e572bb1e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T15:44:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled
\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wrq6z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://00c5078cb42e7e369ed71d8867be75c4f1bf473eae40d151eacbeda76980196c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T15:44:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wrq6z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-26T15:44:18Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-s9f9w\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T15:45:06Z is after 2025-08-24T17:21:41Z" Feb 26 15:45:07 crc kubenswrapper[4907]: E0226 15:45:07.003538 4907 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status 
\"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"7800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"24148068Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"8\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"24608868Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-02-26T15:45:06Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-26T15:45:06Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-26T15:45:06Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-26T15:45:06Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-26T15:45:06Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-26T15:45:06Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-26T15:45:06Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-26T15:45:06Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"16aec221-b9ec-4b79-ac12-986d05cb9b8b\\\",\\\"systemUUID\\\":\\\"7af7b453-01c3-4b8b-8c30-b1df8ce070ce\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T15:45:07Z is after 2025-08-24T17:21:41Z" Feb 26 15:45:07 crc kubenswrapper[4907]: I0226 15:45:07.007925 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 26 15:45:07 crc kubenswrapper[4907]: I0226 15:45:07.008123 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 26 15:45:07 crc kubenswrapper[4907]: I0226 15:45:07.008221 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 26 15:45:07 crc kubenswrapper[4907]: I0226 15:45:07.008322 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 26 15:45:07 crc kubenswrapper[4907]: I0226 15:45:07.008417 4907 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-26T15:45:07Z","lastTransitionTime":"2026-02-26T15:45:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 26 15:45:07 crc kubenswrapper[4907]: I0226 15:45:07.016465 4907 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"87fcecd2-771a-4669-a303-2f74cf7ac919\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T15:42:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T15:42:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T15:43:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T15:43:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T15:42:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://aa0a0c55e7d739a2c76f82d2886d67e4aa4334b873445cb317782b057f7afa65\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T15:42:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ba401e1eedaa38b967c1b76dc8ee82
21684e36e0f152a24131706adc0346bb2d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T15:42:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e44c81cef61f4aecc15b45d6bbb7f3552588a1f0256042998c5a2f158c3879c8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T15:42:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://81ae0a80fac56ae4b446c60d3478f3b6e4a448314ac78ad45840c7c09c232f0d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art
-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://81ae0a80fac56ae4b446c60d3478f3b6e4a448314ac78ad45840c7c09c232f0d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-26T15:42:19Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-26T15:42:19Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-26T15:42:18Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T15:45:07Z is after 2025-08-24T17:21:41Z" Feb 26 15:45:07 crc kubenswrapper[4907]: E0226 15:45:07.026812 4907 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"7800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"24148068Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"8\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"24608868Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-02-26T15:45:07Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-26T15:45:07Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory 
available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-26T15:45:07Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-26T15:45:07Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-26T15:45:07Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-26T15:45:07Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-26T15:45:07Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-26T15:45:07Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"16aec221-b9ec-4b79-ac12-986d05cb9b8b\\\",\\\"systemUUID\\\":\\\"7af7b453-01c3-4b8b-8c30-b1df8ce070ce\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T15:45:07Z is after 2025-08-24T17:21:41Z" Feb 26 15:45:07 crc kubenswrapper[4907]: I0226 15:45:07.031933 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 26 15:45:07 crc kubenswrapper[4907]: I0226 15:45:07.032129 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 26 15:45:07 crc kubenswrapper[4907]: I0226 15:45:07.032203 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 26 15:45:07 crc kubenswrapper[4907]: I0226 15:45:07.032266 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 26 15:45:07 crc kubenswrapper[4907]: I0226 15:45:07.032327 4907 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-26T15:45:07Z","lastTransitionTime":"2026-02-26T15:45:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 26 15:45:07 crc kubenswrapper[4907]: I0226 15:45:07.035449 4907 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2e5aef55-fc68-4c1c-92e1-41a202917e84\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T15:42:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T15:42:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T15:43:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T15:43:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T15:42:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5033366771e6954e4bdd280702ad5d080a1306e8fbfa2e99a0221a3865c13ed6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://62c4450c857a205706fb8639ca0bf473be68a81f8e70a989080e74e6fb9795c8\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-02-26T15:42:50Z\\\",\\\"message\\\":\\\"+ timeout 3m /bin/bash -exuo pipefail -c 'while [ -n \\\\\\\"$(ss -Htanop \\\\\\\\( sport = 10357 \\\\\\\\))\\\\\\\" ]; do sleep 1; done'\\\\n++ ss -Htanop '(' sport = 10357 ')'\\\\n+ '[' -n '' ']'\\\\n+ exec cluster-policy-controller start 
--config=/etc/kubernetes/static-pod-resources/configmaps/cluster-policy-controller-config/config.yaml --kubeconfig=/etc/kubernetes/static-pod-resources/configmaps/controller-manager-kubeconfig/kubeconfig --namespace=openshift-kube-controller-manager -v=2\\\\nI0226 15:42:20.262653 1 leaderelection.go:121] The leader election gives 4 retries and allows for 30s of clock skew. The kube-apiserver downtime tolerance is 78s. Worst non-graceful lease acquisition is 2m43s. Worst graceful lease acquisition is {26s}.\\\\nI0226 15:42:20.264750 1 observer_polling.go:159] Starting file observer\\\\nI0226 15:42:20.297295 1 builder.go:298] cluster-policy-controller version 4.18.0-202501230001.p0.g5fd8525.assembly.stream.el9-5fd8525-5fd852525909ce6eab52972ba9ce8fcf56528eb9\\\\nI0226 15:42:20.301511 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/etc/kubernetes/static-pod-resources/secrets/serving-cert/tls.crt::/etc/kubernetes/static-pod-resources/secrets/serving-cert/tls.key\\\\\\\"\\\\nF0226 15:42:50.781187 1 cmd.go:179] failed checking apiserver connectivity: Get \\\\\\\"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/openshift-kube-controller-manager/leases/cluster-policy-controller-lock\\\\\\\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T15:42:49Z is after 
2026-02-23T05:33:13Z\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-26T15:42:19Z\\\"}},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T15:42:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b4592db3d17945a9ed96383e96902333033b03f395da93754ffbca7d15b1e633\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T15:42:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ac01de0d4759557a4502a3c742ecae613068311f796904e35769463f9a277620\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T15:42:19Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4e11dad962ef019f41cac623fb986f909a7c58377cd8d52e58ec300f7cc4cbb2\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T15:42:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-26T15:42:18Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T15:45:07Z is after 2025-08-24T17:21:41Z" Feb 26 15:45:07 crc kubenswrapper[4907]: I0226 15:45:07.049533 4907 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-v5ng6" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"917eebf3-db36-47b8-af0a-b80d042fddab\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T15:44:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T15:44:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T15:44:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T15:44:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7f195a8a6d014276c4202f3995d294fe5026b640273192a6f463642b79d4ddda\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T15:44:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9lmpq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://178aa71969c1efffd1f234213afe3cf84ffc1f83
00112efb368309603695c3ee\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T15:44:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9lmpq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-26T15:44:18Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-v5ng6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T15:45:07Z is after 2025-08-24T17:21:41Z" Feb 26 15:45:07 crc kubenswrapper[4907]: E0226 15:45:07.051722 4907 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status 
\"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"7800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"24148068Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"8\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"24608868Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-02-26T15:45:07Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-26T15:45:07Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-26T15:45:07Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-26T15:45:07Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-26T15:45:07Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-26T15:45:07Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-26T15:45:07Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-26T15:45:07Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"16aec221-b9ec-4b79-ac12-986d05cb9b8b\\\",\\\"systemUUID\\\":\\\"7af7b453-01c3-4b8b-8c30-b1df8ce070ce\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T15:45:07Z is after 2025-08-24T17:21:41Z" Feb 26 15:45:07 crc kubenswrapper[4907]: E0226 15:45:07.052223 4907 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Feb 26 15:45:07 crc kubenswrapper[4907]: I0226 15:45:07.071717 4907 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-26T15:44:19Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-26T15:44:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-26T15:44:19Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b385be8ca84800beda307aea098ce9f4e640cd4b6c7bd2856c75b1a4193cb655\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T15:44:18Z\\\"}},\\\"volumeMounts\\\
":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://acf341c3480df31c1b94ef2f3feb5a3e7eef3fa85ef3292ad0e5ef70a4575cb3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T15:44:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T15:45:07Z is after 2025-08-24T17:21:41Z" Feb 26 15:45:07 crc kubenswrapper[4907]: I0226 15:45:07.085127 4907 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" 
err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-26T15:44:18Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-26T15:44:18Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T15:45:07Z is after 2025-08-24T17:21:41Z" Feb 26 15:45:07 crc kubenswrapper[4907]: I0226 15:45:07.102193 4907 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-b2qgz" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3ab23cfe-46ea-420e-ba6c-38ac0d2804b0\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T15:44:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T15:44:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T15:44:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T15:44:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fa50b3ce686f099f6b9ed4dcb642c118a6294d2e92cfdbf59339d106c9052d1a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T15:44:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2vfj6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://608b79bf33a420a12900e4bce6e593b17cfa7c3e9ebbcc9378833dce3a84e31d\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://608b79bf33a420a12900e4bce6e593b17cfa7c3e9ebbcc9378833dce3a84e31d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-26T15:44:19Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-26T15:44:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2vfj6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e89433e3d1fc270f03f4dba736b947b987980198cfe9e4f66865ab6222ce82f4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e89433e3d1fc270f03f4dba736b947b987980198cfe9e4f66865ab6222ce82f4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-26T15:44:20Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-26T15:44:19Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2vfj6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e31f3856c094e119772c90aaa64b7decc756b6da339efc3d406daeaa8b274176\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e31f3856c094e119772c90aaa64b7decc756b6da339efc3d406daeaa8b274176\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-26T15:44:22Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-26T15:44:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2vfj6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bfc88
f0a13f82a4a192745b9a3eac44fea007542c73923ca729d6fd6336c1851\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://bfc88f0a13f82a4a192745b9a3eac44fea007542c73923ca729d6fd6336c1851\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-26T15:44:22Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-26T15:44:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2vfj6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://73cba4d9193c3840f98e95371a1cda6f5264d73d631ef29664dfd1b0f9852b52\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://73cba4d9193c3840f98e95371a1cda6f5264d73d631ef29664dfd1b0f9852b52\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-26T15:44:25Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2026-02-26T15:44:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2vfj6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bf00572269494256a1a7b40277ce094962baaa145f2147dde7870e4c19b8f688\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://bf00572269494256a1a7b40277ce094962baaa145f2147dde7870e4c19b8f688\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-26T15:44:26Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-26T15:44:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2vfj6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-26T15:44:18Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-b2qgz\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T15:45:07Z is after 2025-08-24T17:21:41Z" Feb 26 15:45:07 crc kubenswrapper[4907]: I0226 15:45:07.115965 4907 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-9gtgp" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ae882fbf-ac76-4363-a10c-60eaf80ee7c7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T15:44:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T15:44:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T15:44:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T15:44:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://78c4268a57d845c79f2bf6b5e3742785efea137f2b0b3c37cb1b6fc54274e30f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"
2026-02-26T15:44:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xl77m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-26T15:44:18Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-9gtgp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T15:45:07Z is after 2025-08-24T17:21:41Z" Feb 26 15:45:07 crc kubenswrapper[4907]: I0226 15:45:07.126123 4907 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-zsb5l" Feb 26 15:45:07 crc kubenswrapper[4907]: E0226 15:45:07.126282 4907 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-zsb5l" podUID="fd06f422-2c09-4da9-843c-75525df52517" Feb 26 15:45:07 crc kubenswrapper[4907]: I0226 15:45:07.129833 4907 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-2gl5t" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"51024bd5-00ff-4e2f-927c-8c989b59d7be\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T15:44:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T15:44:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T15:45:06Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T15:45:06Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9a3cdc02208e8eab1e0c3c3f08a0759873ebfd63c98e64af187800d59a5b44da\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9a3cdc02208e8eab1e0c3c3f08a0759873ebfd63c98e64af187800d59a5b44da\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-02-26T15:45:06Z\\\",\\\"message\\\":\\\"2026-02-26T15:44:20+00:00 [cnibincopy] Successfully copied files in /usr/src/multus-cni/rhel9/bin/ to /host/opt/cni/bin/upgrade_b8a74968-e1a4-4746-b2bd-e84f4f6ec044\\\\n2026-02-26T15:44:20+00:00 [cnibincopy] Successfully moved files in /host/opt/cni/bin/upgrade_b8a74968-e1a4-4746-b2bd-e84f4f6ec044 to /host/opt/cni/bin/\\\\n2026-02-26T15:44:21Z [verbose] multus-daemon started\\\\n2026-02-26T15:44:21Z [verbose] Readiness Indicator file check\\\\n2026-02-26T15:45:06Z [error] have you checked that your default network is ready? still waiting for readinessindicatorfile @ /host/run/multus/cni/net.d/10-ovn-kubernetes.conf. 
pollimmediate error: timed out waiting for the condition\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-26T15:44:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2fx5n\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\
\":\\\"2026-02-26T15:44:18Z\\\"}}\" for pod \"openshift-multus\"/\"multus-2gl5t\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T15:45:07Z is after 2025-08-24T17:21:41Z" Feb 26 15:45:07 crc kubenswrapper[4907]: I0226 15:45:07.150723 4907 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-vsvsw" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"49ee65e1-8667-4ad7-a403-c899f0cc6a70\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T15:44:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T15:44:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T15:44:18Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T15:44:18Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c70ed6854442dfb329171dc5c454c036c020cb91e1f6595eb3fbe2d95704d52d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T15:44:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7hmb7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://67439cebe8e10e13db8af6bc74e152eb562382fb3b2f026ba3cbfe42e3b4c921\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T15:44:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7hmb7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://800657f54374550b21f96594e9c9ce4e7dff28c5c09061192a95bb8a668ebbea\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T15:44:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7hmb7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9e7470d80d872846d4d91e9070becfa3496dca8af1b315e637c34edce0dcd57b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T15:44:20Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7hmb7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://17760db3d112b908ad1389e3c28c244e756ef06ec2b4f170e4f52e17f9a75a89\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T15:44:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7hmb7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://eca4b7a72754f7457c608969c5319a498c526ab128b28400d2aed5d0413ff487\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T15:44:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7hmb7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ed7db8f0288f2b3a14da208935b54a6702d7b68a6ec301250f2ebb9519354f9e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ed7db8f0288f2b3a14da208935b54a6702d7b68a6ec301250f2ebb9519354f9e\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-02-26T15:44:45Z\\\",\\\"message\\\":\\\"org/kind:Service k8s.ovn.org/owner:openshift-authentication/oauth-openshift]} name:Service_openshift-authentication/oauth-openshift_TCP_cluster options:{GoMap:map[event:false hairpin_snat_ip:169.254.0.5 fd69::5 neighbor_responder:none reject:true skip_snat:false]} protocol:{GoSet:[tcp]} selection_fields:{GoSet:[]} 
vips:{GoMap:map[10.217.4.222:443:]}] Rows:[] Columns:[] Mutations:[] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {c0c2f725-e461-454e-a88c-c8350d62e1ef}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:}]\\\\nF0226 15:44:45.034086 7002 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create admin network policy controller, err: could not add Event Handler for anpInformer during admin network policy controller initialization, handler {0x1fcc6e0 0x1fcc3c0 0x1fcc360} was not added to shared informer because it has stopped already, failed to start node network controller: failed to start default node network controller: failed to set node crc annotations: Internal error occurred: failed calling webhook \\\\\\\"node.network-node-identity.openshift.io\\\\\\\": failed to call webhook: Post \\\\\\\"https://127.0.0.1:9743/node?timeout=10s\\\\\\\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T15:44:45Z is after 2025-0\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-26T15:44:44Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 20s restarting failed container=ovnkube-controller 
pod=ovnkube-node-vsvsw_openshift-ovn-kubernetes(49ee65e1-8667-4ad7-a403-c899f0cc6a70)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7hmb7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://cc2b19d04bf2ef1455fa049ed09ef927305f1ec89b19b42f39b0d8c1397f69df\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T15:44:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7hmb7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b7621667d7c9c119893fe930093d4e1d2256a13aadc196023df28d1a78aef68c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b7621667d7c9c11989
3fe930093d4e1d2256a13aadc196023df28d1a78aef68c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-26T15:44:18Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-26T15:44:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7hmb7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-26T15:44:18Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-vsvsw\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T15:45:07Z is after 2025-08-24T17:21:41Z" Feb 26 15:45:07 crc kubenswrapper[4907]: I0226 15:45:07.163223 4907 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-26T15:44:18Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-26T15:44:18Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T15:45:07Z is after 2025-08-24T17:21:41Z" Feb 26 15:45:07 crc kubenswrapper[4907]: I0226 15:45:07.175203 4907 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-26T15:44:21Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-26T15:44:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-26T15:44:21Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e9637349a18a137859d53c939993c64cd1275117aeab8d855be9498820d9ec46\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T15:44:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-02-26T15:45:07Z is after 2025-08-24T17:21:41Z" Feb 26 15:45:07 crc kubenswrapper[4907]: I0226 15:45:07.186053 4907 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2c14dd1f-1741-447b-ad4f-ce34e0d5bd63\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T15:42:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T15:42:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T15:42:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T15:42:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T15:42:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2413429d3f7edf75cdb8cd2cb7fe17b4f9c5017c7a2926764186e1d65e44228d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-crio\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T15:42:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"hostIP\\\":\\\"192.168.1
26.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1f3f4eb948df3626824724fd4883ad9e04fb96bb8f74f33a8367a1d6f1dc9ae8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1f3f4eb948df3626824724fd4883ad9e04fb96bb8f74f33a8367a1d6f1dc9ae8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-26T15:42:19Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-26T15:42:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-26T15:42:18Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"kube-rbac-proxy-crio-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T15:45:07Z is after 2025-08-24T17:21:41Z" Feb 26 15:45:07 crc kubenswrapper[4907]: I0226 15:45:07.205670 4907 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"27c9ab80-fcc8-4c5a-9d89-c0504e0e6396\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T15:42:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T15:42:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T15:42:18Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T15:42:18Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T15:42:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bbc5e8c015ccc6b1a4740c955375e4f995f69ff1f1f698d8e2660ef451da6b8c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T15:42:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",
\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://64e8ac34f3cae799ba04d2bba51c22e4d99cf03261778fe3ba7a2320e661e727\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T15:42:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://42e24dea757f775f836c5c1fdb77c920db85f523bc0a35d2f2fb22e766274556\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T15:42:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a3c61b08bda7c918a3fa7b01e6f80515ee05a5746e189e829d2872c181b80c85\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e277
53fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a3c61b08bda7c918a3fa7b01e6f80515ee05a5746e189e829d2872c181b80c85\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-02-26T15:44:12Z\\\",\\\"message\\\":\\\"le observer\\\\nW0226 15:44:11.651017 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0226 15:44:11.651151 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0226 15:44:11.653054 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1720683088/tls.crt::/tmp/serving-cert-1720683088/tls.key\\\\\\\"\\\\nI0226 15:44:12.242500 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0226 15:44:12.245173 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0226 15:44:12.245192 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0226 15:44:12.245214 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0226 15:44:12.245219 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0226 15:44:12.248257 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0226 15:44:12.248276 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0226 15:44:12.248281 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0226 15:44:12.248286 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0226 15:44:12.248289 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0226 15:44:12.248292 1 secure_serving.go:69] 
Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0226 15:44:12.248295 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0226 15:44:12.248403 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF0226 15:44:12.250972 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-26T15:44:11Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":4,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 1m20s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8cf7bf0e49be4282c641d1e48be50a327bb418475701bfde61f4249724709e11\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T15:42:21Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7ff4ef3cac1d6f77bf9c90ee9a0f1d8fca15084e93afdb4e4e0048cbfe904f19\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7ff4ef3cac1d6f77bf9c90ee9a0f1d8fca15084e93afdb4e4e0048cbfe904f19\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-26T15:42:19Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-26T15:42:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-26T15:42:18Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T15:45:07Z is after 2025-08-24T17:21:41Z" Feb 26 15:45:07 crc kubenswrapper[4907]: I0226 15:45:07.222451 4907 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-26T15:44:19Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-26T15:44:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-26T15:44:19Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1e574efe4067ea713788905c2bd40d7ff4ed75353c577df5ee8ca730d5037434\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T15:44:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-02-26T15:45:07Z is after 2025-08-24T17:21:41Z" Feb 26 15:45:07 crc kubenswrapper[4907]: I0226 15:45:07.237485 4907 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-zsb5l" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"fd06f422-2c09-4da9-843c-75525df52517\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T15:44:19Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T15:44:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T15:44:19Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T15:44:19Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dbhj9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dbhj9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-26T15:44:19Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-zsb5l\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T15:45:07Z is after 2025-08-24T17:21:41Z" Feb 26 15:45:07 crc 
kubenswrapper[4907]: I0226 15:45:07.247640 4907 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-958vt" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"4569fec7-a859-4a9e-b9d9-34ccc7c6be9c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T15:44:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T15:44:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T15:44:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T15:44:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1c9c60e926f3c2412b5a8698e82e161e6e34373a3e6b471698cb521b9e494871\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T15:44:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8nhj9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\
"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-26T15:44:18Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-958vt\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T15:45:07Z is after 2025-08-24T17:21:41Z" Feb 26 15:45:07 crc kubenswrapper[4907]: I0226 15:45:07.964085 4907 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-2gl5t_51024bd5-00ff-4e2f-927c-8c989b59d7be/kube-multus/0.log" Feb 26 15:45:07 crc kubenswrapper[4907]: I0226 15:45:07.964174 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-2gl5t" event={"ID":"51024bd5-00ff-4e2f-927c-8c989b59d7be","Type":"ContainerStarted","Data":"e822f482000a6645405c4c5b3b74d28302ababcc6de59c9d2f392c08d1fd092f"} Feb 26 15:45:07 crc kubenswrapper[4907]: I0226 15:45:07.985771 4907 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-958vt" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"4569fec7-a859-4a9e-b9d9-34ccc7c6be9c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T15:44:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T15:44:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T15:44:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T15:44:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1c9c60e926f3c2412b5a8698e82e161e6e34373a3e6b471698cb521b9e494871\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T15:44:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8nhj9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-26T15:44:18Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-958vt\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T15:45:07Z is after 2025-08-24T17:21:41Z" Feb 26 15:45:08 crc kubenswrapper[4907]: I0226 15:45:08.005944 4907 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"87fcecd2-771a-4669-a303-2f74cf7ac919\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T15:42:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T15:42:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T15:43:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T15:43:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T15:42:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://aa0a0c55e7d739a2c76f82d2886d67e4aa4334b873445cb317782b057f7afa65\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler
\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T15:42:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ba401e1eedaa38b967c1b76dc8ee8221684e36e0f152a24131706adc0346bb2d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T15:42:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e44c81cef61f4aecc15b45d6bbb7f3552588a1f0256042998c5a2f158c3879c8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T15:42:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\
\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://81ae0a80fac56ae4b446c60d3478f3b6e4a448314ac78ad45840c7c09c232f0d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://81ae0a80fac56ae4b446c60d3478f3b6e4a448314ac78ad45840c7c09c232f0d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-26T15:42:19Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-26T15:42:19Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-26T15:42:18Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T15:45:08Z is after 2025-08-24T17:21:41Z" Feb 26 15:45:08 crc kubenswrapper[4907]: I0226 15:45:08.027829 4907 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"2e5aef55-fc68-4c1c-92e1-41a202917e84\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T15:42:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T15:42:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T15:43:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T15:43:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T15:42:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5033366771e6954e4bdd280702ad5d080a1306e8fbfa2e99a0221a3865c13ed6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://62c4450c857a205706fb8639ca0bf473be68a81f8e70a989080e74e6fb9795c8\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-02-26T15:42:50Z\\\",\\\"message\\\":\\\"+ timeout 3m /bin/bash -exuo pipefail -c 'while [ -n \\\\\\\"$(ss -Htanop \\\\\\\\( sport = 10357 \\\\\\\\))\\\\\\\" ]; do sleep 1; done'\\\\n++ ss -Htanop '(' sport = 10357 ')'\\\\n+ '[' -n '' ']'\\\\n+ exec cluster-policy-controller start --config=/etc/kubernetes/static-pod-resources/configmaps/cluster-policy-controller-config/config.yaml --kubeconfig=/etc/kubernetes/static-pod-resources/configmaps/controller-manager-kubeconfig/kubeconfig --namespace=openshift-kube-controller-manager 
-v=2\\\\nI0226 15:42:20.262653 1 leaderelection.go:121] The leader election gives 4 retries and allows for 30s of clock skew. The kube-apiserver downtime tolerance is 78s. Worst non-graceful lease acquisition is 2m43s. Worst graceful lease acquisition is {26s}.\\\\nI0226 15:42:20.264750 1 observer_polling.go:159] Starting file observer\\\\nI0226 15:42:20.297295 1 builder.go:298] cluster-policy-controller version 4.18.0-202501230001.p0.g5fd8525.assembly.stream.el9-5fd8525-5fd852525909ce6eab52972ba9ce8fcf56528eb9\\\\nI0226 15:42:20.301511 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/etc/kubernetes/static-pod-resources/secrets/serving-cert/tls.crt::/etc/kubernetes/static-pod-resources/secrets/serving-cert/tls.key\\\\\\\"\\\\nF0226 15:42:50.781187 1 cmd.go:179] failed checking apiserver connectivity: Get \\\\\\\"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/openshift-kube-controller-manager/leases/cluster-policy-controller-lock\\\\\\\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T15:42:49Z is after 
2026-02-23T05:33:13Z\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-26T15:42:19Z\\\"}},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T15:42:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b4592db3d17945a9ed96383e96902333033b03f395da93754ffbca7d15b1e633\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T15:42:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ac01de0d4759557a4502a3c742ecae613068311f796904e35769463f9a277620\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T15:42:19Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4e11dad962ef019f41cac623fb986f909a7c58377cd8d52e58ec300f7cc4cbb2\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T15:42:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-26T15:42:18Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T15:45:08Z is after 2025-08-24T17:21:41Z" Feb 26 15:45:08 crc kubenswrapper[4907]: I0226 15:45:08.048316 4907 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-26T15:44:18Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-26T15:44:18Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T15:45:08Z is after 2025-08-24T17:21:41Z" Feb 26 15:45:08 crc kubenswrapper[4907]: I0226 15:45:08.067453 4907 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-s9f9w" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"432281c6-dcf8-4471-9801-9194000a9abd\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T15:44:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T15:44:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T15:44:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T15:44:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a751c325fc4b5b8668afd084530efeddd36543db3710b4d5ab525dc8e572bb1e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T15:44:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wrq6z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://00c5078cb42e7e369ed71d8867be75c4f1bf4
73eae40d151eacbeda76980196c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T15:44:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wrq6z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-26T15:44:18Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-s9f9w\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T15:45:08Z is after 2025-08-24T17:21:41Z" Feb 26 15:45:08 crc kubenswrapper[4907]: I0226 15:45:08.092088 4907 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-26T15:44:19Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-26T15:44:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-26T15:44:19Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b385be8ca84800beda307aea098ce9f4e640cd4b6c7bd2856c75b1a4193cb655\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T15:44:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://acf341c3480df31c1b94ef2f3feb5a3e7eef3fa85ef3292ad0e5ef70a4575cb3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T15:44:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T15:45:08Z is after 2025-08-24T17:21:41Z" Feb 26 15:45:08 crc kubenswrapper[4907]: I0226 15:45:08.112919 4907 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-26T15:44:18Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-26T15:44:18Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T15:45:08Z is after 2025-08-24T17:21:41Z" Feb 26 15:45:08 crc kubenswrapper[4907]: I0226 15:45:08.126576 4907 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 26 15:45:08 crc kubenswrapper[4907]: I0226 15:45:08.126632 4907 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 26 15:45:08 crc kubenswrapper[4907]: E0226 15:45:08.126782 4907 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 26 15:45:08 crc kubenswrapper[4907]: I0226 15:45:08.126825 4907 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 26 15:45:08 crc kubenswrapper[4907]: E0226 15:45:08.126995 4907 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 26 15:45:08 crc kubenswrapper[4907]: E0226 15:45:08.127103 4907 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 26 15:45:08 crc kubenswrapper[4907]: I0226 15:45:08.139365 4907 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-b2qgz" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3ab23cfe-46ea-420e-ba6c-38ac0d2804b0\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T15:44:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T15:44:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T15:44:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T15:44:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fa50b3ce686f099f6b9ed4dcb642c118a6294d2e92cfdbf59339d106c9052d1a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T15:44:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",
\\\"name\\\":\\\"kube-api-access-2vfj6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://608b79bf33a420a12900e4bce6e593b17cfa7c3e9ebbcc9378833dce3a84e31d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://608b79bf33a420a12900e4bce6e593b17cfa7c3e9ebbcc9378833dce3a84e31d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-26T15:44:19Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-26T15:44:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2vfj6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e89433e3d1fc270f03f4dba736b947b987980198cfe9e4f66865ab6222ce82f4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,
\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e89433e3d1fc270f03f4dba736b947b987980198cfe9e4f66865ab6222ce82f4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-26T15:44:20Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-26T15:44:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2vfj6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e31f3856c094e119772c90aaa64b7decc756b6da339efc3d406daeaa8b274176\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e31f3856c094e119772c90aaa64b7decc756b6da339efc3d406daeaa8b274176\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-26T15:44:22Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-26T15:44:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os
-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2vfj6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bfc88f0a13f82a4a192745b9a3eac44fea007542c73923ca729d6fd6336c1851\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://bfc88f0a13f82a4a192745b9a3eac44fea007542c73923ca729d6fd6336c1851\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-26T15:44:22Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-26T15:44:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2vfj6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://73cba4d9193c3840f98e95371a1cda6f5264d73d631ef29664dfd1b0f9852b52\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},
\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://73cba4d9193c3840f98e95371a1cda6f5264d73d631ef29664dfd1b0f9852b52\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-26T15:44:25Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-26T15:44:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2vfj6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bf00572269494256a1a7b40277ce094962baaa145f2147dde7870e4c19b8f688\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://bf00572269494256a1a7b40277ce094962baaa145f2147dde7870e4c19b8f688\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-26T15:44:26Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-26T15:44:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2vfj6\\\",\\\"readOnly\\
\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-26T15:44:18Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-b2qgz\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T15:45:08Z is after 2025-08-24T17:21:41Z" Feb 26 15:45:08 crc kubenswrapper[4907]: I0226 15:45:08.157670 4907 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-v5ng6" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"917eebf3-db36-47b8-af0a-b80d042fddab\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T15:44:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T15:44:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T15:44:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T15:44:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7f195a8a6d014276c4202f3995d294fe5026b640273192a6f463642b79d4ddda\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdea
f3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T15:44:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9lmpq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://178aa71969c1efffd1f234213afe3cf84ffc1f8300112efb368309603695c3ee\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T15:44:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9lmpq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-26T15:44:18Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-v5ng6\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T15:45:08Z is after 2025-08-24T17:21:41Z" Feb 26 15:45:08 crc kubenswrapper[4907]: I0226 15:45:08.176288 4907 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-26T15:44:18Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-26T15:44:18Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T15:45:08Z is after 2025-08-24T17:21:41Z" Feb 26 15:45:08 crc kubenswrapper[4907]: I0226 15:45:08.192936 4907 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-26T15:44:21Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-26T15:44:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-26T15:44:21Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e9637349a18a137859d53c939993c64cd1275117aeab8d855be9498820d9ec46\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T15:44:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-02-26T15:45:08Z is after 2025-08-24T17:21:41Z" Feb 26 15:45:08 crc kubenswrapper[4907]: I0226 15:45:08.209133 4907 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2c14dd1f-1741-447b-ad4f-ce34e0d5bd63\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T15:42:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T15:42:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T15:42:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T15:42:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T15:42:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2413429d3f7edf75cdb8cd2cb7fe17b4f9c5017c7a2926764186e1d65e44228d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-crio\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T15:42:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"hostIP\\\":\\\"192.168.1
26.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1f3f4eb948df3626824724fd4883ad9e04fb96bb8f74f33a8367a1d6f1dc9ae8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1f3f4eb948df3626824724fd4883ad9e04fb96bb8f74f33a8367a1d6f1dc9ae8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-26T15:42:19Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-26T15:42:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-26T15:42:18Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"kube-rbac-proxy-crio-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T15:45:08Z is after 2025-08-24T17:21:41Z" Feb 26 15:45:08 crc kubenswrapper[4907]: E0226 15:45:08.232176 4907 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
Feb 26 15:45:08 crc kubenswrapper[4907]: I0226 15:45:08.238181 4907 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"27c9ab80-fcc8-4c5a-9d89-c0504e0e6396\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T15:42:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T15:42:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T15:42:18Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T15:42:18Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T15:42:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bbc5e8c015ccc6b1a4740c955375e4f995f69ff1f1f698d8e2660ef451da6b8c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T15:42:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://64e8ac34f3cae799ba04d2bba51c22e4d99cf03261778fe3ba7a2320e661e727\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T15:42:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://42e24dea757f775f836c5c1fdb77c920db85f523bc0a35d2f2fb22e766274556\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T15:42:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a3c61b08bda7c918a3fa7b01e6f80515ee05a5746e189e829d2872c181b80c85\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a3c61b08bda7c918a3fa7b01e6f80515ee05a5746e189e829d2872c181b80c85\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-02-26T15:44:12Z\\\",\\\"message\\\":\\\"le observer\\\\nW0226 15:44:11.651017 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0226 15:44:11.651151 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0226 15:44:11.653054 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-1720683088/tls.crt::/tmp/serving-cert-1720683088/tls.key\\\\\\\"\\\\nI0226 15:44:12.242500 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0226 15:44:12.245173 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0226 15:44:12.245192 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0226 15:44:12.245214 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0226 15:44:12.245219 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0226 15:44:12.248257 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0226 15:44:12.248276 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0226 15:44:12.248281 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0226 15:44:12.248286 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0226 15:44:12.248289 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0226 15:44:12.248292 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0226 15:44:12.248295 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0226 15:44:12.248403 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF0226 15:44:12.250972 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-26T15:44:11Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":4,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 1m20s restarting failed 
container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8cf7bf0e49be4282c641d1e48be50a327bb418475701bfde61f4249724709e11\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T15:42:21Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7ff4ef3cac1d6f77bf9c90ee9a0f1d8fca15084e93afdb4e4e0048cbfe904f19\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7ff4ef3cac1d6f77bf9c90ee9a0f1d8fca15084e93afdb4e4e0048cbfe904f19\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-26T15:42:19Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-26T15:42:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"n
ame\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-26T15:42:18Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T15:45:08Z is after 2025-08-24T17:21:41Z" Feb 26 15:45:08 crc kubenswrapper[4907]: I0226 15:45:08.260743 4907 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-26T15:44:19Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-26T15:44:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-26T15:44:19Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1e574efe4067ea713788905c2bd40d7ff4ed75353c577df5ee8ca730d5037434\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\
":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T15:44:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T15:45:08Z is after 2025-08-24T17:21:41Z" Feb 26 15:45:08 crc kubenswrapper[4907]: I0226 15:45:08.277023 4907 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-9gtgp" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ae882fbf-ac76-4363-a10c-60eaf80ee7c7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T15:44:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T15:44:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T15:44:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T15:44:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://78c4268a57d845c79f2bf6b5e3742785efea137f2b0b3c37cb1b6fc54274e30f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T15:44:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xl77m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-26T15:44:18Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-9gtgp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T15:45:08Z is after 2025-08-24T17:21:41Z" Feb 26 15:45:08 crc kubenswrapper[4907]: I0226 15:45:08.293697 4907 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-2gl5t" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"51024bd5-00ff-4e2f-927c-8c989b59d7be\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T15:44:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T15:44:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T15:45:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T15:45:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e822f482000a6645405c4c5b3b74d28302ababcc6de59c9d2f392c08d1fd092f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@s
ha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9a3cdc02208e8eab1e0c3c3f08a0759873ebfd63c98e64af187800d59a5b44da\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-02-26T15:45:06Z\\\",\\\"message\\\":\\\"2026-02-26T15:44:20+00:00 [cnibincopy] Successfully copied files in /usr/src/multus-cni/rhel9/bin/ to /host/opt/cni/bin/upgrade_b8a74968-e1a4-4746-b2bd-e84f4f6ec044\\\\n2026-02-26T15:44:20+00:00 [cnibincopy] Successfully moved files in /host/opt/cni/bin/upgrade_b8a74968-e1a4-4746-b2bd-e84f4f6ec044 to /host/opt/cni/bin/\\\\n2026-02-26T15:44:21Z [verbose] multus-daemon started\\\\n2026-02-26T15:44:21Z [verbose] Readiness Indicator file check\\\\n2026-02-26T15:45:06Z [error] have you checked that your default network is ready? still waiting for readinessindicatorfile @ /host/run/multus/cni/net.d/10-ovn-kubernetes.conf. pollimmediate error: timed out waiting for the condition\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-26T15:44:18Z\\\"}},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T15:45:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"ho
st-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2fx5n\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-26T15:44:18Z\\\"}}\" for pod \"openshift-multus\"/\"multus-2gl5t\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T15:45:08Z is after 2025-08-24T17:21:41Z" Feb 26 15:45:08 crc kubenswrapper[4907]: I0226 15:45:08.325571 4907 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-vsvsw" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"49ee65e1-8667-4ad7-a403-c899f0cc6a70\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T15:44:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T15:44:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T15:44:18Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T15:44:18Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c70ed6854442dfb329171dc5c454c036c020cb91e1f6595eb3fbe2d95704d52d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T15:44:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7hmb7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://67439cebe8e10e13db8af6bc74e152eb562382fb3b2f026ba3cbfe42e3b4c921\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T15:44:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7hmb7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://800657f54374550b21f96594e9c9ce4e7dff28c5c09061192a95bb8a668ebbea\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T15:44:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7hmb7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9e7470d80d872846d4d91e9070becfa3496dca8af1b315e637c34edce0dcd57b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T15:44:20Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7hmb7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://17760db3d112b908ad1389e3c28c244e756ef06ec2b4f170e4f52e17f9a75a89\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T15:44:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7hmb7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://eca4b7a72754f7457c608969c5319a498c526ab128b28400d2aed5d0413ff487\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T15:44:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7hmb7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ed7db8f0288f2b3a14da208935b54a6702d7b68a6ec301250f2ebb9519354f9e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ed7db8f0288f2b3a14da208935b54a6702d7b68a6ec301250f2ebb9519354f9e\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-02-26T15:44:45Z\\\",\\\"message\\\":\\\"org/kind:Service k8s.ovn.org/owner:openshift-authentication/oauth-openshift]} name:Service_openshift-authentication/oauth-openshift_TCP_cluster options:{GoMap:map[event:false hairpin_snat_ip:169.254.0.5 fd69::5 neighbor_responder:none reject:true skip_snat:false]} protocol:{GoSet:[tcp]} selection_fields:{GoSet:[]} 
vips:{GoMap:map[10.217.4.222:443:]}] Rows:[] Columns:[] Mutations:[] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {c0c2f725-e461-454e-a88c-c8350d62e1ef}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:}]\\\\nF0226 15:44:45.034086 7002 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create admin network policy controller, err: could not add Event Handler for anpInformer during admin network policy controller initialization, handler {0x1fcc6e0 0x1fcc3c0 0x1fcc360} was not added to shared informer because it has stopped already, failed to start node network controller: failed to start default node network controller: failed to set node crc annotations: Internal error occurred: failed calling webhook \\\\\\\"node.network-node-identity.openshift.io\\\\\\\": failed to call webhook: Post \\\\\\\"https://127.0.0.1:9743/node?timeout=10s\\\\\\\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T15:44:45Z is after 2025-0\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-26T15:44:44Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 20s restarting failed container=ovnkube-controller 
pod=ovnkube-node-vsvsw_openshift-ovn-kubernetes(49ee65e1-8667-4ad7-a403-c899f0cc6a70)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7hmb7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://cc2b19d04bf2ef1455fa049ed09ef927305f1ec89b19b42f39b0d8c1397f69df\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T15:44:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7hmb7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b7621667d7c9c119893fe930093d4e1d2256a13aadc196023df28d1a78aef68c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b7621667d7c9c11989
3fe930093d4e1d2256a13aadc196023df28d1a78aef68c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-26T15:44:18Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-26T15:44:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7hmb7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-26T15:44:18Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-vsvsw\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T15:45:08Z is after 2025-08-24T17:21:41Z" Feb 26 15:45:08 crc kubenswrapper[4907]: I0226 15:45:08.339313 4907 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-zsb5l" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"fd06f422-2c09-4da9-843c-75525df52517\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T15:44:19Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T15:44:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T15:44:19Z\\\",\\\"message\\\":\\\"containers 
with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T15:44:19Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dbhj9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dbhj9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-26T15:44:19Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-zsb5l\": Internal error 
occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T15:45:08Z is after 2025-08-24T17:21:41Z" Feb 26 15:45:08 crc kubenswrapper[4907]: I0226 15:45:08.354423 4907 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"87fcecd2-771a-4669-a303-2f74cf7ac919\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T15:42:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T15:42:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T15:43:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T15:43:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T15:42:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://aa0a0c55e7d739a2c76f82d2886d67e4aa4334b873445cb317782b057f7afa65\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T15:42:20Z\\\"}},\\\"volumeMou
nts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ba401e1eedaa38b967c1b76dc8ee8221684e36e0f152a24131706adc0346bb2d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T15:42:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e44c81cef61f4aecc15b45d6bbb7f3552588a1f0256042998c5a2f158c3879c8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T15:42:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://81ae0a80
fac56ae4b446c60d3478f3b6e4a448314ac78ad45840c7c09c232f0d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://81ae0a80fac56ae4b446c60d3478f3b6e4a448314ac78ad45840c7c09c232f0d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-26T15:42:19Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-26T15:42:19Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-26T15:42:18Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T15:45:08Z is after 2025-08-24T17:21:41Z" Feb 26 15:45:08 crc kubenswrapper[4907]: I0226 15:45:08.367518 4907 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"2e5aef55-fc68-4c1c-92e1-41a202917e84\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T15:42:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T15:42:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T15:43:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T15:43:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T15:42:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5033366771e6954e4bdd280702ad5d080a1306e8fbfa2e99a0221a3865c13ed6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://62c4450c857a205706fb8639ca0bf473be68a81f8e70a989080e74e6fb9795c8\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-02-26T15:42:50Z\\\",\\\"message\\\":\\\"+ timeout 3m /bin/bash -exuo pipefail -c 'while [ -n \\\\\\\"$(ss -Htanop \\\\\\\\( sport = 10357 \\\\\\\\))\\\\\\\" ]; do sleep 1; done'\\\\n++ ss -Htanop '(' sport = 10357 ')'\\\\n+ '[' -n '' ']'\\\\n+ exec cluster-policy-controller start --config=/etc/kubernetes/static-pod-resources/configmaps/cluster-policy-controller-config/config.yaml --kubeconfig=/etc/kubernetes/static-pod-resources/configmaps/controller-manager-kubeconfig/kubeconfig --namespace=openshift-kube-controller-manager 
-v=2\\\\nI0226 15:42:20.262653 1 leaderelection.go:121] The leader election gives 4 retries and allows for 30s of clock skew. The kube-apiserver downtime tolerance is 78s. Worst non-graceful lease acquisition is 2m43s. Worst graceful lease acquisition is {26s}.\\\\nI0226 15:42:20.264750 1 observer_polling.go:159] Starting file observer\\\\nI0226 15:42:20.297295 1 builder.go:298] cluster-policy-controller version 4.18.0-202501230001.p0.g5fd8525.assembly.stream.el9-5fd8525-5fd852525909ce6eab52972ba9ce8fcf56528eb9\\\\nI0226 15:42:20.301511 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/etc/kubernetes/static-pod-resources/secrets/serving-cert/tls.crt::/etc/kubernetes/static-pod-resources/secrets/serving-cert/tls.key\\\\\\\"\\\\nF0226 15:42:50.781187 1 cmd.go:179] failed checking apiserver connectivity: Get \\\\\\\"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/openshift-kube-controller-manager/leases/cluster-policy-controller-lock\\\\\\\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T15:42:49Z is after 
2026-02-23T05:33:13Z\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-26T15:42:19Z\\\"}},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T15:42:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b4592db3d17945a9ed96383e96902333033b03f395da93754ffbca7d15b1e633\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T15:42:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ac01de0d4759557a4502a3c742ecae613068311f796904e35769463f9a277620\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T15:42:19Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4e11dad962ef019f41cac623fb986f909a7c58377cd8d52e58ec300f7cc4cbb2\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T15:42:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-26T15:42:18Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T15:45:08Z is after 2025-08-24T17:21:41Z" Feb 26 15:45:08 crc kubenswrapper[4907]: I0226 15:45:08.381351 4907 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-26T15:44:18Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-26T15:44:18Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T15:45:08Z is after 2025-08-24T17:21:41Z" Feb 26 15:45:08 crc kubenswrapper[4907]: I0226 15:45:08.390920 4907 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-s9f9w" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"432281c6-dcf8-4471-9801-9194000a9abd\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T15:44:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T15:44:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T15:44:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T15:44:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a751c325fc4b5b8668afd084530efeddd36543db3710b4d5ab525dc8e572bb1e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T15:44:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wrq6z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://00c5078cb42e7e369ed71d8867be75c4f1bf4
73eae40d151eacbeda76980196c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T15:44:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wrq6z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-26T15:44:18Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-s9f9w\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T15:45:08Z is after 2025-08-24T17:21:41Z" Feb 26 15:45:08 crc kubenswrapper[4907]: I0226 15:45:08.405065 4907 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-26T15:44:19Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-26T15:44:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-26T15:44:19Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b385be8ca84800beda307aea098ce9f4e640cd4b6c7bd2856c75b1a4193cb655\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T15:44:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://acf341c3480df31c1b94ef2f3feb5a3e7eef3fa85ef3292ad0e5ef70a4575cb3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T15:44:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T15:45:08Z is after 2025-08-24T17:21:41Z" Feb 26 15:45:08 crc kubenswrapper[4907]: I0226 15:45:08.418991 4907 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-26T15:44:18Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-26T15:44:18Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T15:45:08Z is after 2025-08-24T17:21:41Z" Feb 26 15:45:08 crc kubenswrapper[4907]: I0226 15:45:08.433351 4907 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-b2qgz" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3ab23cfe-46ea-420e-ba6c-38ac0d2804b0\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T15:44:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T15:44:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T15:44:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T15:44:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fa50b3ce686f099f6b9ed4dcb642c118a6294d2e92cfdbf59339d106c9052d1a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T15:44:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2vfj6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://608b79bf33a420a12900e4bce6e593b17cfa7c3e9ebbcc9378833dce3a84e31d\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://608b79bf33a420a12900e4bce6e593b17cfa7c3e9ebbcc9378833dce3a84e31d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-26T15:44:19Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-26T15:44:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2vfj6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e89433e3d1fc270f03f4dba736b947b987980198cfe9e4f66865ab6222ce82f4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e89433e3d1fc270f03f4dba736b947b987980198cfe9e4f66865ab6222ce82f4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-26T15:44:20Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-26T15:44:19Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2vfj6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e31f3856c094e119772c90aaa64b7decc756b6da339efc3d406daeaa8b274176\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e31f3856c094e119772c90aaa64b7decc756b6da339efc3d406daeaa8b274176\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-26T15:44:22Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-26T15:44:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2vfj6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bfc88
f0a13f82a4a192745b9a3eac44fea007542c73923ca729d6fd6336c1851\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://bfc88f0a13f82a4a192745b9a3eac44fea007542c73923ca729d6fd6336c1851\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-26T15:44:22Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-26T15:44:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2vfj6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://73cba4d9193c3840f98e95371a1cda6f5264d73d631ef29664dfd1b0f9852b52\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://73cba4d9193c3840f98e95371a1cda6f5264d73d631ef29664dfd1b0f9852b52\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-26T15:44:25Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2026-02-26T15:44:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2vfj6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bf00572269494256a1a7b40277ce094962baaa145f2147dde7870e4c19b8f688\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://bf00572269494256a1a7b40277ce094962baaa145f2147dde7870e4c19b8f688\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-26T15:44:26Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-26T15:44:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2vfj6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-26T15:44:18Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-b2qgz\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T15:45:08Z is after 2025-08-24T17:21:41Z" Feb 26 15:45:08 crc kubenswrapper[4907]: I0226 15:45:08.444187 4907 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-v5ng6" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"917eebf3-db36-47b8-af0a-b80d042fddab\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T15:44:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T15:44:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T15:44:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T15:44:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7f195a8a6d014276c4202f3995d294fe5026b640273192a6f463642b79d4ddda\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"runn
ing\\\":{\\\"startedAt\\\":\\\"2026-02-26T15:44:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9lmpq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://178aa71969c1efffd1f234213afe3cf84ffc1f8300112efb368309603695c3ee\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T15:44:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9lmpq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-26T15:44:18Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-v5ng6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T15:45:08Z is after 2025-08-24T17:21:41Z" Feb 26 15:45:08 crc kubenswrapper[4907]: 
I0226 15:45:08.461612 4907 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-vsvsw" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"49ee65e1-8667-4ad7-a403-c899f0cc6a70\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T15:44:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T15:44:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T15:44:18Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T15:44:18Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c70ed6854442dfb329171dc5c454c036c020cb91e1f6595eb3fbe2d95704d52d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T15:44:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7hmb7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://67439cebe8e10e13db8af6bc74e152eb562382fb3b2f026ba3cbfe42e3b4c921\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T15:44:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7hmb7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://800657f54374550b21f96594e9c9ce4e7dff28c5c09061192a95bb8a668ebbea\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T15:44:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7hmb7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9e7470d80d872846d4d91e9070becfa3496dca8af1b315e637c34edce0dcd57b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T15:44:20Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7hmb7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://17760db3d112b908ad1389e3c28c244e756ef06ec2b4f170e4f52e17f9a75a89\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T15:44:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7hmb7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://eca4b7a72754f7457c608969c5319a498c526ab128b28400d2aed5d0413ff487\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T15:44:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7hmb7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ed7db8f0288f2b3a14da208935b54a6702d7b68a6ec301250f2ebb9519354f9e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ed7db8f0288f2b3a14da208935b54a6702d7b68a6ec301250f2ebb9519354f9e\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-02-26T15:44:45Z\\\",\\\"message\\\":\\\"org/kind:Service k8s.ovn.org/owner:openshift-authentication/oauth-openshift]} name:Service_openshift-authentication/oauth-openshift_TCP_cluster options:{GoMap:map[event:false hairpin_snat_ip:169.254.0.5 fd69::5 neighbor_responder:none reject:true skip_snat:false]} protocol:{GoSet:[tcp]} selection_fields:{GoSet:[]} 
vips:{GoMap:map[10.217.4.222:443:]}] Rows:[] Columns:[] Mutations:[] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {c0c2f725-e461-454e-a88c-c8350d62e1ef}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:}]\\\\nF0226 15:44:45.034086 7002 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create admin network policy controller, err: could not add Event Handler for anpInformer during admin network policy controller initialization, handler {0x1fcc6e0 0x1fcc3c0 0x1fcc360} was not added to shared informer because it has stopped already, failed to start node network controller: failed to start default node network controller: failed to set node crc annotations: Internal error occurred: failed calling webhook \\\\\\\"node.network-node-identity.openshift.io\\\\\\\": failed to call webhook: Post \\\\\\\"https://127.0.0.1:9743/node?timeout=10s\\\\\\\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T15:44:45Z is after 2025-0\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-26T15:44:44Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 20s restarting failed container=ovnkube-controller 
pod=ovnkube-node-vsvsw_openshift-ovn-kubernetes(49ee65e1-8667-4ad7-a403-c899f0cc6a70)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7hmb7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://cc2b19d04bf2ef1455fa049ed09ef927305f1ec89b19b42f39b0d8c1397f69df\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T15:44:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7hmb7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b7621667d7c9c119893fe930093d4e1d2256a13aadc196023df28d1a78aef68c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b7621667d7c9c11989
3fe930093d4e1d2256a13aadc196023df28d1a78aef68c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-26T15:44:18Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-26T15:44:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7hmb7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-26T15:44:18Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-vsvsw\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T15:45:08Z is after 2025-08-24T17:21:41Z" Feb 26 15:45:08 crc kubenswrapper[4907]: I0226 15:45:08.477215 4907 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-26T15:44:18Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-26T15:44:18Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T15:45:08Z is after 2025-08-24T17:21:41Z" Feb 26 15:45:08 crc kubenswrapper[4907]: I0226 15:45:08.488719 4907 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-26T15:44:21Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-26T15:44:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-26T15:44:21Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e9637349a18a137859d53c939993c64cd1275117aeab8d855be9498820d9ec46\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T15:44:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-02-26T15:45:08Z is after 2025-08-24T17:21:41Z" Feb 26 15:45:08 crc kubenswrapper[4907]: I0226 15:45:08.501116 4907 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2c14dd1f-1741-447b-ad4f-ce34e0d5bd63\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T15:42:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T15:42:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T15:42:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T15:42:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T15:42:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2413429d3f7edf75cdb8cd2cb7fe17b4f9c5017c7a2926764186e1d65e44228d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-crio\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T15:42:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"hostIP\\\":\\\"192.168.1
26.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1f3f4eb948df3626824724fd4883ad9e04fb96bb8f74f33a8367a1d6f1dc9ae8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1f3f4eb948df3626824724fd4883ad9e04fb96bb8f74f33a8367a1d6f1dc9ae8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-26T15:42:19Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-26T15:42:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-26T15:42:18Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"kube-rbac-proxy-crio-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T15:45:08Z is after 2025-08-24T17:21:41Z" Feb 26 15:45:08 crc kubenswrapper[4907]: I0226 15:45:08.519689 4907 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"27c9ab80-fcc8-4c5a-9d89-c0504e0e6396\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T15:42:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T15:42:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T15:42:18Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T15:42:18Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T15:42:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bbc5e8c015ccc6b1a4740c955375e4f995f69ff1f1f698d8e2660ef451da6b8c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T15:42:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",
\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://64e8ac34f3cae799ba04d2bba51c22e4d99cf03261778fe3ba7a2320e661e727\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T15:42:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://42e24dea757f775f836c5c1fdb77c920db85f523bc0a35d2f2fb22e766274556\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T15:42:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a3c61b08bda7c918a3fa7b01e6f80515ee05a5746e189e829d2872c181b80c85\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e277
53fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a3c61b08bda7c918a3fa7b01e6f80515ee05a5746e189e829d2872c181b80c85\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-02-26T15:44:12Z\\\",\\\"message\\\":\\\"le observer\\\\nW0226 15:44:11.651017 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0226 15:44:11.651151 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0226 15:44:11.653054 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1720683088/tls.crt::/tmp/serving-cert-1720683088/tls.key\\\\\\\"\\\\nI0226 15:44:12.242500 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0226 15:44:12.245173 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0226 15:44:12.245192 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0226 15:44:12.245214 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0226 15:44:12.245219 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0226 15:44:12.248257 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0226 15:44:12.248276 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0226 15:44:12.248281 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0226 15:44:12.248286 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0226 15:44:12.248289 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0226 15:44:12.248292 1 secure_serving.go:69] 
Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0226 15:44:12.248295 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0226 15:44:12.248403 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF0226 15:44:12.250972 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-26T15:44:11Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":4,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 1m20s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8cf7bf0e49be4282c641d1e48be50a327bb418475701bfde61f4249724709e11\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T15:42:21Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7ff4ef3cac1d6f77bf9c90ee9a0f1d8fca15084e93afdb4e4e0048cbfe904f19\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7ff4ef3cac1d6f77bf9c90ee9a0f1d8fca15084e93afdb4e4e0048cbfe904f19\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-26T15:42:19Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-26T15:42:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-26T15:42:18Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T15:45:08Z is after 2025-08-24T17:21:41Z" Feb 26 15:45:08 crc kubenswrapper[4907]: I0226 15:45:08.531371 4907 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-26T15:44:19Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-26T15:44:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-26T15:44:19Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1e574efe4067ea713788905c2bd40d7ff4ed75353c577df5ee8ca730d5037434\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T15:44:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-02-26T15:45:08Z is after 2025-08-24T17:21:41Z" Feb 26 15:45:08 crc kubenswrapper[4907]: I0226 15:45:08.543371 4907 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-9gtgp" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ae882fbf-ac76-4363-a10c-60eaf80ee7c7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T15:44:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T15:44:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T15:44:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T15:44:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://78c4268a57d845c79f2bf6b5e3742785efea137f2b0b3c37cb1b6fc54274e30f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T15:44:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\
",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xl77m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-26T15:44:18Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-9gtgp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T15:45:08Z is after 2025-08-24T17:21:41Z" Feb 26 15:45:08 crc kubenswrapper[4907]: I0226 15:45:08.559121 4907 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-2gl5t" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"51024bd5-00ff-4e2f-927c-8c989b59d7be\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T15:44:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T15:44:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T15:45:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T15:45:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e822f482000a6645405c4c5b3b74d28302ababcc6de59c9d2f392c08d1fd092f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9a3cdc02208e8eab1e0c3c3f08a0759873ebfd63c98e64af187800d59a5b44da\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-02-26T15:45:06Z\\\",\\\"message\\\":\\\"2026-02-26T15:44:20+00:00 [cnibincopy] Successfully copied files in /usr/src/multus-cni/rhel9/bin/ to /host/opt/cni/bin/upgrade_b8a74968-e1a4-4746-b2bd-e84f4f6ec044\\\\n2026-02-26T15:44:20+00:00 [cnibincopy] Successfully moved files in /host/opt/cni/bin/upgrade_b8a74968-e1a4-4746-b2bd-e84f4f6ec044 to /host/opt/cni/bin/\\\\n2026-02-26T15:44:21Z [verbose] multus-daemon started\\\\n2026-02-26T15:44:21Z [verbose] 
Readiness Indicator file check\\\\n2026-02-26T15:45:06Z [error] have you checked that your default network is ready? still waiting for readinessindicatorfile @ /host/run/multus/cni/net.d/10-ovn-kubernetes.conf. pollimmediate error: timed out waiting for the condition\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-26T15:44:18Z\\\"}},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T15:45:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/
var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2fx5n\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-26T15:44:18Z\\\"}}\" for pod \"openshift-multus\"/\"multus-2gl5t\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T15:45:08Z is after 2025-08-24T17:21:41Z" Feb 26 15:45:08 crc kubenswrapper[4907]: I0226 15:45:08.573367 4907 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-zsb5l" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"fd06f422-2c09-4da9-843c-75525df52517\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T15:44:19Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T15:44:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T15:44:19Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T15:44:19Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dbhj9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dbhj9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-26T15:44:19Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-zsb5l\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T15:45:08Z is after 2025-08-24T17:21:41Z" Feb 26 15:45:08 crc kubenswrapper[4907]: I0226 15:45:08.582228 4907 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-958vt" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"4569fec7-a859-4a9e-b9d9-34ccc7c6be9c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T15:44:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T15:44:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T15:44:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T15:44:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1c9c60e926f3c2412b5a8698e82e161e6e34373a3e6b471698cb521b9e494871\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T
15:44:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8nhj9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-26T15:44:18Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-958vt\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T15:45:08Z is after 2025-08-24T17:21:41Z" Feb 26 15:45:09 crc kubenswrapper[4907]: I0226 15:45:09.126442 4907 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-zsb5l" Feb 26 15:45:09 crc kubenswrapper[4907]: E0226 15:45:09.126638 4907 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-zsb5l" podUID="fd06f422-2c09-4da9-843c-75525df52517" Feb 26 15:45:09 crc kubenswrapper[4907]: I0226 15:45:09.140023 4907 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-etcd/etcd-crc"] Feb 26 15:45:10 crc kubenswrapper[4907]: I0226 15:45:10.125824 4907 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 26 15:45:10 crc kubenswrapper[4907]: I0226 15:45:10.125958 4907 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 26 15:45:10 crc kubenswrapper[4907]: E0226 15:45:10.125989 4907 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 26 15:45:10 crc kubenswrapper[4907]: E0226 15:45:10.126159 4907 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 26 15:45:10 crc kubenswrapper[4907]: I0226 15:45:10.126362 4907 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 26 15:45:10 crc kubenswrapper[4907]: E0226 15:45:10.126446 4907 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 26 15:45:11 crc kubenswrapper[4907]: I0226 15:45:11.126470 4907 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-zsb5l" Feb 26 15:45:11 crc kubenswrapper[4907]: E0226 15:45:11.127714 4907 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-zsb5l" podUID="fd06f422-2c09-4da9-843c-75525df52517" Feb 26 15:45:12 crc kubenswrapper[4907]: I0226 15:45:12.126543 4907 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 26 15:45:12 crc kubenswrapper[4907]: E0226 15:45:12.126767 4907 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 26 15:45:12 crc kubenswrapper[4907]: I0226 15:45:12.127238 4907 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 26 15:45:12 crc kubenswrapper[4907]: E0226 15:45:12.127439 4907 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 26 15:45:12 crc kubenswrapper[4907]: I0226 15:45:12.127819 4907 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 26 15:45:12 crc kubenswrapper[4907]: E0226 15:45:12.128006 4907 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 26 15:45:13 crc kubenswrapper[4907]: I0226 15:45:13.126305 4907 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-zsb5l" Feb 26 15:45:13 crc kubenswrapper[4907]: E0226 15:45:13.126533 4907 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-zsb5l" podUID="fd06f422-2c09-4da9-843c-75525df52517" Feb 26 15:45:13 crc kubenswrapper[4907]: I0226 15:45:13.127340 4907 scope.go:117] "RemoveContainer" containerID="a3c61b08bda7c918a3fa7b01e6f80515ee05a5746e189e829d2872c181b80c85" Feb 26 15:45:13 crc kubenswrapper[4907]: E0226 15:45:13.127640 4907 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-apiserver-check-endpoints\" with CrashLoopBackOff: \"back-off 1m20s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\"" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" Feb 26 15:45:13 crc kubenswrapper[4907]: E0226 15:45:13.233916 4907 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" Feb 26 15:45:14 crc kubenswrapper[4907]: I0226 15:45:14.126395 4907 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 26 15:45:14 crc kubenswrapper[4907]: I0226 15:45:14.126421 4907 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 26 15:45:14 crc kubenswrapper[4907]: E0226 15:45:14.126662 4907 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 26 15:45:14 crc kubenswrapper[4907]: E0226 15:45:14.126714 4907 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 26 15:45:14 crc kubenswrapper[4907]: I0226 15:45:14.127183 4907 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 26 15:45:14 crc kubenswrapper[4907]: E0226 15:45:14.127283 4907 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 26 15:45:15 crc kubenswrapper[4907]: I0226 15:45:15.125881 4907 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-zsb5l" Feb 26 15:45:15 crc kubenswrapper[4907]: E0226 15:45:15.126513 4907 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-zsb5l" podUID="fd06f422-2c09-4da9-843c-75525df52517" Feb 26 15:45:16 crc kubenswrapper[4907]: I0226 15:45:16.126398 4907 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 26 15:45:16 crc kubenswrapper[4907]: I0226 15:45:16.126505 4907 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 26 15:45:16 crc kubenswrapper[4907]: I0226 15:45:16.127032 4907 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 26 15:45:16 crc kubenswrapper[4907]: E0226 15:45:16.127223 4907 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 26 15:45:16 crc kubenswrapper[4907]: E0226 15:45:16.127343 4907 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 26 15:45:16 crc kubenswrapper[4907]: E0226 15:45:16.127485 4907 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 26 15:45:16 crc kubenswrapper[4907]: I0226 15:45:16.127579 4907 scope.go:117] "RemoveContainer" containerID="ed7db8f0288f2b3a14da208935b54a6702d7b68a6ec301250f2ebb9519354f9e" Feb 26 15:45:17 crc kubenswrapper[4907]: I0226 15:45:17.000990 4907 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-vsvsw_49ee65e1-8667-4ad7-a403-c899f0cc6a70/ovnkube-controller/2.log" Feb 26 15:45:17 crc kubenswrapper[4907]: I0226 15:45:17.003866 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-vsvsw" event={"ID":"49ee65e1-8667-4ad7-a403-c899f0cc6a70","Type":"ContainerStarted","Data":"51787a0de7c6993ba3bfd70265cc1718966209053e9703f5fc5b039f3d78abae"} Feb 26 15:45:17 crc kubenswrapper[4907]: I0226 15:45:17.004630 4907 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-vsvsw" Feb 26 15:45:17 crc kubenswrapper[4907]: I0226 15:45:17.017915 4907 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-958vt" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"4569fec7-a859-4a9e-b9d9-34ccc7c6be9c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T15:44:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T15:44:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T15:44:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T15:44:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1c9c60e926f3c2412b5a8698e82e161e6e34373a3e6b471698cb521b9e494871\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T15:44:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8nhj9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-26T15:44:18Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-958vt\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T15:45:17Z is after 2025-08-24T17:21:41Z" Feb 26 15:45:17 crc kubenswrapper[4907]: I0226 15:45:17.035579 4907 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-s9f9w" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"432281c6-dcf8-4471-9801-9194000a9abd\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T15:44:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T15:44:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T15:44:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T15:44:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a751c325fc4b5b8668afd084530efeddd36543db3710b4d5ab525dc8e572bb1e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d7
93426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T15:44:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wrq6z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://00c5078cb42e7e369ed71d8867be75c4f1bf473eae40d151eacbeda76980196c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T15:44:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wrq6z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-26T15:44:18Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-s9f9w\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T15:45:17Z is after 2025-08-24T17:21:41Z" Feb 26 15:45:17 crc kubenswrapper[4907]: I0226 15:45:17.051335 4907 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"87fcecd2-771a-4669-a303-2f74cf7ac919\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T15:42:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T15:42:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T15:43:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T15:43:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T15:42:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://aa0a0c55e7d739a2c76f82d2886d67e4aa4334b873445cb317782b057f7afa65\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T15:42:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/
etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ba401e1eedaa38b967c1b76dc8ee8221684e36e0f152a24131706adc0346bb2d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T15:42:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e44c81cef61f4aecc15b45d6bbb7f3552588a1f0256042998c5a2f158c3879c8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T15:42:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://81ae0a80fac56ae4b446c60d3478f3b6e4a448314
ac78ad45840c7c09c232f0d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://81ae0a80fac56ae4b446c60d3478f3b6e4a448314ac78ad45840c7c09c232f0d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-26T15:42:19Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-26T15:42:19Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-26T15:42:18Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T15:45:17Z is after 2025-08-24T17:21:41Z" Feb 26 15:45:17 crc kubenswrapper[4907]: I0226 15:45:17.070629 4907 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"087bfdc5-a69f-41c0-912b-10827f34927b\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T15:42:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T15:42:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T15:42:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T15:42:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T15:42:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2d6cb50daf3d05a3e4b4427361206adaeb990478e437b697db9a2716fbc0a3e0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T15:42:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7a65767b486307851169c93586cffb785a0977b0ca654dc7bc6fd38ce349d5f0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T15:42:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b642a813d8a9d885593d5dd495ed461119f14e1c1937844b64196bb55dd67e24\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T15:42:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://91e03a798a371431d5f0e490e8ffe260ea101ae6a41f56f9ee2d37c2ed255f1d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T15:42:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d111022be1d13de640f2fe6f3683455c1defed82f3c06fb63c8b84d2feea1182\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T15:42:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c3c8fe6e74e5efb27449fa26c2e705a62d8fb1b6f74e1ed787fbd7c37e711699\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c3c8fe6e74e5efb27449fa26c2e705a62d8fb1b6f74e1ed787fbd7c37e711699\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2026-02-26T15:42:19Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-26T15:42:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a0933fb54ef30c16899ff47ed6fa9c7836452ad420e970de1c0b7408c0bb3886\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a0933fb54ef30c16899ff47ed6fa9c7836452ad420e970de1c0b7408c0bb3886\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-26T15:42:20Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-26T15:42:20Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://ca0bc422f98a960703843a6e090851bd3b091b08d31aeb875ef10cbb6e9c830a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ca0bc422f98a960703843a6e090851bd3b091b08d31aeb875ef10cbb6e9c830a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-26T15:42:21Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-26T15:42:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-26T15:42:18Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T15:45:17Z is after 2025-08-24T17:21:41Z" Feb 26 15:45:17 crc kubenswrapper[4907]: I0226 15:45:17.072224 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 26 15:45:17 crc kubenswrapper[4907]: I0226 15:45:17.072288 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 26 15:45:17 crc kubenswrapper[4907]: I0226 15:45:17.072304 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 26 15:45:17 crc kubenswrapper[4907]: I0226 15:45:17.072330 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 26 15:45:17 crc kubenswrapper[4907]: I0226 15:45:17.072347 4907 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-26T15:45:17Z","lastTransitionTime":"2026-02-26T15:45:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 26 15:45:17 crc kubenswrapper[4907]: I0226 15:45:17.085114 4907 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2e5aef55-fc68-4c1c-92e1-41a202917e84\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T15:42:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T15:42:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T15:43:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T15:43:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T15:42:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5033366771e6954e4bdd280702ad5d080a1306e8fbfa2e99a0221a3865c13ed6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://62c4450c857a205706fb8639ca0bf473be68a81f8e70a989080e74e6fb9795c8\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-02-26T15:42:50Z\\\",\\\"message\\\":\\\"+ timeout 3m /bin/bash -exuo pipefail -c 'while [ -n \\\\\\\"$(ss -Htanop \\\\\\\\( sport = 10357 \\\\\\\\))\\\\\\\" ]; do sleep 1; done'\\\\n++ ss -Htanop '(' sport = 10357 ')'\\\\n+ '[' -n '' ']'\\\\n+ exec cluster-policy-controller start 
--config=/etc/kubernetes/static-pod-resources/configmaps/cluster-policy-controller-config/config.yaml --kubeconfig=/etc/kubernetes/static-pod-resources/configmaps/controller-manager-kubeconfig/kubeconfig --namespace=openshift-kube-controller-manager -v=2\\\\nI0226 15:42:20.262653 1 leaderelection.go:121] The leader election gives 4 retries and allows for 30s of clock skew. The kube-apiserver downtime tolerance is 78s. Worst non-graceful lease acquisition is 2m43s. Worst graceful lease acquisition is {26s}.\\\\nI0226 15:42:20.264750 1 observer_polling.go:159] Starting file observer\\\\nI0226 15:42:20.297295 1 builder.go:298] cluster-policy-controller version 4.18.0-202501230001.p0.g5fd8525.assembly.stream.el9-5fd8525-5fd852525909ce6eab52972ba9ce8fcf56528eb9\\\\nI0226 15:42:20.301511 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/etc/kubernetes/static-pod-resources/secrets/serving-cert/tls.crt::/etc/kubernetes/static-pod-resources/secrets/serving-cert/tls.key\\\\\\\"\\\\nF0226 15:42:50.781187 1 cmd.go:179] failed checking apiserver connectivity: Get \\\\\\\"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/openshift-kube-controller-manager/leases/cluster-policy-controller-lock\\\\\\\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T15:42:49Z is after 
2026-02-23T05:33:13Z\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-26T15:42:19Z\\\"}},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T15:42:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b4592db3d17945a9ed96383e96902333033b03f395da93754ffbca7d15b1e633\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T15:42:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ac01de0d4759557a4502a3c742ecae613068311f796904e35769463f9a277620\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T15:42:19Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4e11dad962ef019f41cac623fb986f909a7c58377cd8d52e58ec300f7cc4cbb2\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T15:42:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-26T15:42:18Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T15:45:17Z is after 2025-08-24T17:21:41Z" Feb 26 15:45:17 crc kubenswrapper[4907]: E0226 15:45:17.095217 4907 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status 
\"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"7800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"24148068Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"8\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"24608868Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-02-26T15:45:17Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-26T15:45:17Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-26T15:45:17Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-26T15:45:17Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-26T15:45:17Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-26T15:45:17Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-26T15:45:17Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-26T15:45:17Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"16aec221-b9ec-4b79-ac12-986d05cb9b8b\\\",\\\"systemUUID\\\":\\\"7af7b453-01c3-4b8b-8c30-b1df8ce070ce\\\"},\\\"runtimeHan
dlers\\\":[{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":false},\\\"name\\\":\\\"runc\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"crun\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"\\\"}]}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T15:45:17Z is after 2025-08-24T17:21:41Z" Feb 26 15:45:17 crc kubenswrapper[4907]: I0226 15:45:17.100429 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 26 15:45:17 crc kubenswrapper[4907]: I0226 15:45:17.100478 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 26 15:45:17 crc kubenswrapper[4907]: I0226 15:45:17.100489 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 26 15:45:17 crc kubenswrapper[4907]: I0226 15:45:17.100510 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 26 15:45:17 crc kubenswrapper[4907]: I0226 15:45:17.100522 4907 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-26T15:45:17Z","lastTransitionTime":"2026-02-26T15:45:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 26 15:45:17 crc kubenswrapper[4907]: I0226 15:45:17.105744 4907 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-26T15:44:18Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-26T15:44:18Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T15:45:17Z is after 2025-08-24T17:21:41Z" Feb 26 15:45:17 crc kubenswrapper[4907]: E0226 15:45:17.117548 4907 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"7800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"24148068Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"8\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"24608868Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-02-26T15:45:17Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-26T15:45:17Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory 
available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-26T15:45:17Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-26T15:45:17Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-26T15:45:17Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-26T15:45:17Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-26T15:45:17Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-26T15:45:17Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"16aec221-b9ec-4b79-ac12-986d05cb9b8b\\\",\\\"systemUUID\\\":\\\"7af7b453-01c3-4b8b-8c30-b1df8ce070ce\\\"},\\\"runtimeHan
dlers\\\":[{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":false},\\\"name\\\":\\\"runc\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"crun\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"\\\"}]}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T15:45:17Z is after 2025-08-24T17:21:41Z" Feb 26 15:45:17 crc kubenswrapper[4907]: I0226 15:45:17.123080 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 26 15:45:17 crc kubenswrapper[4907]: I0226 15:45:17.123113 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 26 15:45:17 crc kubenswrapper[4907]: I0226 15:45:17.123121 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 26 15:45:17 crc kubenswrapper[4907]: I0226 15:45:17.123135 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 26 15:45:17 crc kubenswrapper[4907]: I0226 15:45:17.123149 4907 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-26T15:45:17Z","lastTransitionTime":"2026-02-26T15:45:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 26 15:45:17 crc kubenswrapper[4907]: I0226 15:45:17.123134 4907 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-26T15:44:19Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-26T15:44:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-26T15:44:19Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b385be8ca84800beda307aea098ce9f4e640cd4b6c7bd2856c75b1a4193cb655\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T15:44:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://acf341c3480df31c1b94ef2f3feb5a3e7eef3fa85ef3292ad0e5ef70a4575cb3\\\",\\\
"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T15:44:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T15:45:17Z is after 2025-08-24T17:21:41Z" Feb 26 15:45:17 crc kubenswrapper[4907]: I0226 15:45:17.127142 4907 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-zsb5l" Feb 26 15:45:17 crc kubenswrapper[4907]: E0226 15:45:17.127327 4907 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-zsb5l" podUID="fd06f422-2c09-4da9-843c-75525df52517" Feb 26 15:45:17 crc kubenswrapper[4907]: E0226 15:45:17.134297 4907 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"7800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"24148068Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"8\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"24608868Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-02-26T15:45:17Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-26T15:45:17Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-26T15:45:17Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-26T15:45:17Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-26T15:45:17Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-26T15:45:17Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-26T15:45:17Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-26T15:45:17Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"16aec221-b9ec-4b79-ac12-986d05cb9b8b\\\",\\\"systemUUID\\\":\\\"7af7b453-01c3-4b8b-8c30-b1df8ce070ce\\\"},\\\"runtimeHan
dlers\\\":[{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":false},\\\"name\\\":\\\"runc\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"crun\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"\\\"}]}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T15:45:17Z is after 2025-08-24T17:21:41Z" Feb 26 15:45:17 crc kubenswrapper[4907]: I0226 15:45:17.138289 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 26 15:45:17 crc kubenswrapper[4907]: I0226 15:45:17.138331 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 26 15:45:17 crc kubenswrapper[4907]: I0226 15:45:17.138347 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 26 15:45:17 crc kubenswrapper[4907]: I0226 15:45:17.138371 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 26 15:45:17 crc kubenswrapper[4907]: I0226 15:45:17.138389 4907 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-26T15:45:17Z","lastTransitionTime":"2026-02-26T15:45:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 26 15:45:17 crc kubenswrapper[4907]: I0226 15:45:17.142539 4907 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-26T15:44:18Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-26T15:44:18Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T15:45:17Z is after 2025-08-24T17:21:41Z" Feb 26 15:45:17 crc kubenswrapper[4907]: E0226 15:45:17.152339 4907 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"7800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"24148068Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"8\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"24608868Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-02-26T15:45:17Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-26T15:45:17Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-26T15:45:17Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-26T15:45:17Z\\\",\\\"message\\\":\\\"kubelet has 
no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-26T15:45:17Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-26T15:45:17Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-26T15:45:17Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-26T15:45:17Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c6
9fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256
:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\
\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737
e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909
bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"16aec221-b9ec-4b79-ac12-986d05cb9b8b\\\",\\\"systemUUID\\\":\\\"7af7b453-01c3-4b8b-8c30-b1df8ce070ce\\\"},\\\"runtimeHandlers\\\":[{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":false},\\\"name\\\":\\\"runc\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"crun\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"\\\"}]}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T15:45:17Z is after 2025-08-24T17:21:41Z" Feb 26 15:45:17 crc kubenswrapper[4907]: I0226 15:45:17.159367 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 26 15:45:17 crc kubenswrapper[4907]: I0226 15:45:17.159425 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 26 15:45:17 crc kubenswrapper[4907]: I0226 15:45:17.159442 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 26 15:45:17 crc kubenswrapper[4907]: I0226 15:45:17.159477 4907 
kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 26 15:45:17 crc kubenswrapper[4907]: I0226 15:45:17.159492 4907 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-26T15:45:17Z","lastTransitionTime":"2026-02-26T15:45:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 26 15:45:17 crc kubenswrapper[4907]: I0226 15:45:17.162526 4907 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-b2qgz" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3ab23cfe-46ea-420e-ba6c-38ac0d2804b0\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T15:44:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T15:44:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T15:44:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T15:44:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fa50b3ce686f099f6b9ed4dcb642c118a6294d2e92cfdbf59339d106c9052d1a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1a
fba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T15:44:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2vfj6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://608b79bf33a420a12900e4bce6e593b17cfa7c3e9ebbcc9378833dce3a84e31d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://608b79bf33a420a12900e4bce6e593b17cfa7c3e9ebbcc9378833dce3a84e31d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-26T15:44:19Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-26T15:44:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2vfj6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\
\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e89433e3d1fc270f03f4dba736b947b987980198cfe9e4f66865ab6222ce82f4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e89433e3d1fc270f03f4dba736b947b987980198cfe9e4f66865ab6222ce82f4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-26T15:44:20Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-26T15:44:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2vfj6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e31f3856c094e119772c90aaa64b7decc756b6da339efc3d406daeaa8b274176\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{
\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e31f3856c094e119772c90aaa64b7decc756b6da339efc3d406daeaa8b274176\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-26T15:44:22Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-26T15:44:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2vfj6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bfc88f0a13f82a4a192745b9a3eac44fea007542c73923ca729d6fd6336c1851\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://bfc88f0a13f82a4a192745b9a3eac44fea007542c73923ca729d6fd6336c1851\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-26T15:44:22Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-26T15:44:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2vfj6\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://73cba4d9193c3840f98e95371a1cda6f5264d73d631ef29664dfd1b0f9852b52\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://73cba4d9193c3840f98e95371a1cda6f5264d73d631ef29664dfd1b0f9852b52\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-26T15:44:25Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-26T15:44:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2vfj6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bf00572269494256a1a7b40277ce094962baaa145f2147dde7870e4c19b8f688\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://bf00572269494256a1a7b40277ce094962baaa145f2147dde7
870e4c19b8f688\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-26T15:44:26Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-26T15:44:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2vfj6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-26T15:44:18Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-b2qgz\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T15:45:17Z is after 2025-08-24T17:21:41Z" Feb 26 15:45:17 crc kubenswrapper[4907]: I0226 15:45:17.173866 4907 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-v5ng6" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"917eebf3-db36-47b8-af0a-b80d042fddab\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T15:44:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T15:44:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T15:44:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T15:44:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7f195a8a6d014276c4202f3995d294fe5026b640273192a6f463642b79d4ddda\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T15:44:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9lmpq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://178aa71969c1efffd1f234213afe3cf84ffc1f83
00112efb368309603695c3ee\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T15:44:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9lmpq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-26T15:44:18Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-v5ng6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T15:45:17Z is after 2025-08-24T17:21:41Z" Feb 26 15:45:17 crc kubenswrapper[4907]: E0226 15:45:17.176757 4907 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status 
\"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"7800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"24148068Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"8\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"24608868Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-02-26T15:45:17Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-26T15:45:17Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-26T15:45:17Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-26T15:45:17Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-26T15:45:17Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-26T15:45:17Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-26T15:45:17Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-26T15:45:17Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"16aec221-b9ec-4b79-ac12-986d05cb9b8b\\\",\\\"systemUUID\\\":\\\"7af7b453-01c3-4b8b-8c30-b1df8ce070ce\\\"},\\\"runtimeHan
dlers\\\":[{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":false},\\\"name\\\":\\\"runc\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"crun\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"\\\"}]}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T15:45:17Z is after 2025-08-24T17:21:41Z" Feb 26 15:45:17 crc kubenswrapper[4907]: E0226 15:45:17.177014 4907 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Feb 26 15:45:17 crc kubenswrapper[4907]: I0226 15:45:17.189645 4907 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-2gl5t" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"51024bd5-00ff-4e2f-927c-8c989b59d7be\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T15:44:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T15:44:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T15:45:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T15:45:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e822f482000a6645405c4c5b3b74d28302ababcc6de59c9d2f392c08d1fd092f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9a3cdc02208e8eab1e0c3c3f08a0759873ebfd63c98e64af187800d59a5b44da\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-02-26T15:45:06Z\\\",\\\"message\\\":\\\"2026-02-26T15:44:20+00:00 [cnibincopy] Successfully copied files in /usr/src/multus-cni/rhel9/bin/ to /host/opt/cni/bin/upgrade_b8a74968-e1a4-4746-b2bd-e84f4f6ec044\\\\n2026-02-26T15:44:20+00:00 [cnibincopy] Successfully moved files in /host/opt/cni/bin/upgrade_b8a74968-e1a4-4746-b2bd-e84f4f6ec044 to /host/opt/cni/bin/\\\\n2026-02-26T15:44:21Z [verbose] multus-daemon started\\\\n2026-02-26T15:44:21Z [verbose] 
Readiness Indicator file check\\\\n2026-02-26T15:45:06Z [error] have you checked that your default network is ready? still waiting for readinessindicatorfile @ /host/run/multus/cni/net.d/10-ovn-kubernetes.conf. pollimmediate error: timed out waiting for the condition\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-26T15:44:18Z\\\"}},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T15:45:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/
var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2fx5n\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-26T15:44:18Z\\\"}}\" for pod \"openshift-multus\"/\"multus-2gl5t\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T15:45:17Z is after 2025-08-24T17:21:41Z" Feb 26 15:45:17 crc kubenswrapper[4907]: I0226 15:45:17.210359 4907 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-vsvsw" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"49ee65e1-8667-4ad7-a403-c899f0cc6a70\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T15:44:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T15:44:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T15:44:18Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T15:44:18Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c70ed6854442dfb329171dc5c454c036c020cb91e1f6595eb3fbe2d95704d52d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T15:44:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7hmb7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://67439cebe8e10e13db8af6bc74e152eb562382fb3b2f026ba3cbfe42e3b4c921\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T15:44:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7hmb7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://800657f54374550b21f96594e9c9ce4e7dff28c5c09061192a95bb8a668ebbea\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T15:44:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7hmb7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9e7470d80d872846d4d91e9070becfa3496dca8af1b315e637c34edce0dcd57b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d20994829
19d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T15:44:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7hmb7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://17760db3d112b908ad1389e3c28c244e756ef06ec2b4f170e4f52e17f9a75a89\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T15:44:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7hmb7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://eca4b7a72754f7457c608969c5319a498c526ab128b28400d2aed5d0413ff487\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cd
d47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T15:44:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7hmb7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://51787a0de7c6993ba3bfd70265cc1718966209053e9703f5fc5b039f3d78abae\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ed7db8f0288f2b3a14da208935b54a6702d7b68a6ec301250f2ebb9519354f9e\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-02-26T15:44:45Z\\\",\\\"message\\\":\\\"org/kind:Service k8s.ovn.org/owner:openshift-authentication/oauth-openshift]} 
name:Service_openshift-authentication/oauth-openshift_TCP_cluster options:{GoMap:map[event:false hairpin_snat_ip:169.254.0.5 fd69::5 neighbor_responder:none reject:true skip_snat:false]} protocol:{GoSet:[tcp]} selection_fields:{GoSet:[]} vips:{GoMap:map[10.217.4.222:443:]}] Rows:[] Columns:[] Mutations:[] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {c0c2f725-e461-454e-a88c-c8350d62e1ef}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:}]\\\\nF0226 15:44:45.034086 7002 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create admin network policy controller, err: could not add Event Handler for anpInformer during admin network policy controller initialization, handler {0x1fcc6e0 0x1fcc3c0 0x1fcc360} was not added to shared informer because it has stopped already, failed to start node network controller: failed to start default node network controller: failed to set node crc annotations: Internal error occurred: failed calling webhook \\\\\\\"node.network-node-identity.openshift.io\\\\\\\": failed to call webhook: Post \\\\\\\"https://127.0.0.1:9743/node?timeout=10s\\\\\\\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T15:44:45Z is after 
2025-0\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-26T15:44:44Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T15:45:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"nam
e\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7hmb7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://cc2b19d04bf2ef1455fa049ed09ef927305f1ec89b19b42f39b0d8c1397f69df\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T15:44:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7hmb7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b7621667d7c9c119893fe930093d4e1d2256a13aadc196023df28d1a78aef68c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"rea
dy\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b7621667d7c9c119893fe930093d4e1d2256a13aadc196023df28d1a78aef68c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-26T15:44:18Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-26T15:44:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7hmb7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-26T15:44:18Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-vsvsw\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T15:45:17Z is after 2025-08-24T17:21:41Z" Feb 26 15:45:17 crc kubenswrapper[4907]: I0226 15:45:17.220973 4907 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-26T15:44:18Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-26T15:44:18Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T15:45:17Z is after 2025-08-24T17:21:41Z" Feb 26 15:45:17 crc kubenswrapper[4907]: I0226 15:45:17.230666 4907 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-26T15:44:21Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-26T15:44:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-26T15:44:21Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e9637349a18a137859d53c939993c64cd1275117aeab8d855be9498820d9ec46\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T15:44:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-02-26T15:45:17Z is after 2025-08-24T17:21:41Z" Feb 26 15:45:17 crc kubenswrapper[4907]: I0226 15:45:17.240897 4907 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2c14dd1f-1741-447b-ad4f-ce34e0d5bd63\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T15:42:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T15:42:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T15:42:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T15:42:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T15:42:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2413429d3f7edf75cdb8cd2cb7fe17b4f9c5017c7a2926764186e1d65e44228d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-crio\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T15:42:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"hostIP\\\":\\\"192.168.1
26.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1f3f4eb948df3626824724fd4883ad9e04fb96bb8f74f33a8367a1d6f1dc9ae8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1f3f4eb948df3626824724fd4883ad9e04fb96bb8f74f33a8367a1d6f1dc9ae8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-26T15:42:19Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-26T15:42:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-26T15:42:18Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"kube-rbac-proxy-crio-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T15:45:17Z is after 2025-08-24T17:21:41Z" Feb 26 15:45:17 crc kubenswrapper[4907]: I0226 15:45:17.256054 4907 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"27c9ab80-fcc8-4c5a-9d89-c0504e0e6396\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T15:42:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T15:42:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T15:42:18Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T15:42:18Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T15:42:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bbc5e8c015ccc6b1a4740c955375e4f995f69ff1f1f698d8e2660ef451da6b8c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T15:42:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",
\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://64e8ac34f3cae799ba04d2bba51c22e4d99cf03261778fe3ba7a2320e661e727\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T15:42:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://42e24dea757f775f836c5c1fdb77c920db85f523bc0a35d2f2fb22e766274556\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T15:42:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a3c61b08bda7c918a3fa7b01e6f80515ee05a5746e189e829d2872c181b80c85\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e277
53fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a3c61b08bda7c918a3fa7b01e6f80515ee05a5746e189e829d2872c181b80c85\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-02-26T15:44:12Z\\\",\\\"message\\\":\\\"le observer\\\\nW0226 15:44:11.651017 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0226 15:44:11.651151 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0226 15:44:11.653054 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1720683088/tls.crt::/tmp/serving-cert-1720683088/tls.key\\\\\\\"\\\\nI0226 15:44:12.242500 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0226 15:44:12.245173 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0226 15:44:12.245192 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0226 15:44:12.245214 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0226 15:44:12.245219 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0226 15:44:12.248257 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0226 15:44:12.248276 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0226 15:44:12.248281 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0226 15:44:12.248286 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0226 15:44:12.248289 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0226 15:44:12.248292 1 secure_serving.go:69] 
Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0226 15:44:12.248295 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0226 15:44:12.248403 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF0226 15:44:12.250972 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-26T15:44:11Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":4,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 1m20s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8cf7bf0e49be4282c641d1e48be50a327bb418475701bfde61f4249724709e11\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T15:42:21Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7ff4ef3cac1d6f77bf9c90ee9a0f1d8fca15084e93afdb4e4e0048cbfe904f19\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7ff4ef3cac1d6f77bf9c90ee9a0f1d8fca15084e93afdb4e4e0048cbfe904f19\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-26T15:42:19Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-26T15:42:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-26T15:42:18Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T15:45:17Z is after 2025-08-24T17:21:41Z" Feb 26 15:45:17 crc kubenswrapper[4907]: I0226 15:45:17.269166 4907 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-26T15:44:19Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-26T15:44:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-26T15:44:19Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1e574efe4067ea713788905c2bd40d7ff4ed75353c577df5ee8ca730d5037434\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T15:44:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-02-26T15:45:17Z is after 2025-08-24T17:21:41Z" Feb 26 15:45:17 crc kubenswrapper[4907]: I0226 15:45:17.278763 4907 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-9gtgp" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ae882fbf-ac76-4363-a10c-60eaf80ee7c7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T15:44:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T15:44:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T15:44:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T15:44:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://78c4268a57d845c79f2bf6b5e3742785efea137f2b0b3c37cb1b6fc54274e30f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T15:44:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\
",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xl77m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-26T15:44:18Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-9gtgp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T15:45:17Z is after 2025-08-24T17:21:41Z" Feb 26 15:45:17 crc kubenswrapper[4907]: I0226 15:45:17.288782 4907 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-zsb5l" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"fd06f422-2c09-4da9-843c-75525df52517\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T15:44:19Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T15:44:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T15:44:19Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T15:44:19Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dbhj9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dbhj9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-26T15:44:19Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-zsb5l\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T15:45:17Z is after 2025-08-24T17:21:41Z" Feb 26 15:45:18 crc kubenswrapper[4907]: I0226 15:45:18.010052 4907 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-vsvsw_49ee65e1-8667-4ad7-a403-c899f0cc6a70/ovnkube-controller/3.log" Feb 26 15:45:18 crc kubenswrapper[4907]: I0226 15:45:18.010879 4907 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-vsvsw_49ee65e1-8667-4ad7-a403-c899f0cc6a70/ovnkube-controller/2.log" Feb 26 15:45:18 crc kubenswrapper[4907]: I0226 15:45:18.014548 4907 generic.go:334] "Generic (PLEG): container finished" podID="49ee65e1-8667-4ad7-a403-c899f0cc6a70" containerID="51787a0de7c6993ba3bfd70265cc1718966209053e9703f5fc5b039f3d78abae" exitCode=1 Feb 26 15:45:18 crc kubenswrapper[4907]: I0226 15:45:18.014626 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-vsvsw" event={"ID":"49ee65e1-8667-4ad7-a403-c899f0cc6a70","Type":"ContainerDied","Data":"51787a0de7c6993ba3bfd70265cc1718966209053e9703f5fc5b039f3d78abae"} Feb 26 15:45:18 crc kubenswrapper[4907]: I0226 15:45:18.014673 4907 scope.go:117] "RemoveContainer" containerID="ed7db8f0288f2b3a14da208935b54a6702d7b68a6ec301250f2ebb9519354f9e" Feb 26 15:45:18 crc kubenswrapper[4907]: I0226 15:45:18.015644 4907 scope.go:117] "RemoveContainer" containerID="51787a0de7c6993ba3bfd70265cc1718966209053e9703f5fc5b039f3d78abae" Feb 26 15:45:18 crc kubenswrapper[4907]: E0226 15:45:18.015938 4907 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 40s restarting failed container=ovnkube-controller 
pod=ovnkube-node-vsvsw_openshift-ovn-kubernetes(49ee65e1-8667-4ad7-a403-c899f0cc6a70)\"" pod="openshift-ovn-kubernetes/ovnkube-node-vsvsw" podUID="49ee65e1-8667-4ad7-a403-c899f0cc6a70" Feb 26 15:45:18 crc kubenswrapper[4907]: I0226 15:45:18.033005 4907 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"87fcecd2-771a-4669-a303-2f74cf7ac919\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T15:42:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T15:42:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T15:43:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T15:43:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T15:42:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://aa0a0c55e7d739a2c76f82d2886d67e4aa4334b873445cb317782b057f7afa65\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T15:42:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mou
ntPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ba401e1eedaa38b967c1b76dc8ee8221684e36e0f152a24131706adc0346bb2d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T15:42:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e44c81cef61f4aecc15b45d6bbb7f3552588a1f0256042998c5a2f158c3879c8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T15:42:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://81ae0a80fac56ae4b446c60d3478f3b6e4a448314ac78ad45840c7c09c232f0d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.
0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://81ae0a80fac56ae4b446c60d3478f3b6e4a448314ac78ad45840c7c09c232f0d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-26T15:42:19Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-26T15:42:19Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-26T15:42:18Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T15:45:18Z is after 2025-08-24T17:21:41Z" Feb 26 15:45:18 crc kubenswrapper[4907]: I0226 15:45:18.063023 4907 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"087bfdc5-a69f-41c0-912b-10827f34927b\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T15:42:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T15:42:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T15:42:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T15:42:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T15:42:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2d6cb50daf3d05a3e4b4427361206adaeb990478e437b697db9a2716fbc0a3e0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T15:42:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7a65767b486307851169c93586cffb785a0977b0ca654dc7bc6fd38ce349d5f0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T15:42:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b642a813d8a9d885593d5dd495ed461119f14e1c1937844b64196bb55dd67e24\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T15:42:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://91e03a798a371431d5f0e490e8ffe260ea101ae6a41f56f9ee2d37c2ed255f1d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T15:42:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d111022be1d13de640f2fe6f3683455c1defed82f3c06fb63c8b84d2feea1182\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T15:42:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c3c8fe6e74e5efb27449fa26c2e705a62d8fb1b6f74e1ed787fbd7c37e711699\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c3c8fe6e74e5efb27449fa26c2e705a62d8fb1b6f74e1ed787fbd7c37e711699\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2026-02-26T15:42:19Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-26T15:42:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a0933fb54ef30c16899ff47ed6fa9c7836452ad420e970de1c0b7408c0bb3886\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a0933fb54ef30c16899ff47ed6fa9c7836452ad420e970de1c0b7408c0bb3886\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-26T15:42:20Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-26T15:42:20Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://ca0bc422f98a960703843a6e090851bd3b091b08d31aeb875ef10cbb6e9c830a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ca0bc422f98a960703843a6e090851bd3b091b08d31aeb875ef10cbb6e9c830a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-26T15:42:21Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-26T15:42:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-26T15:42:18Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T15:45:18Z is after 2025-08-24T17:21:41Z" Feb 26 15:45:18 crc kubenswrapper[4907]: I0226 15:45:18.082308 4907 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2e5aef55-fc68-4c1c-92e1-41a202917e84\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T15:42:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T15:42:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T15:43:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T15:43:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T15:42:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5033366771e6954e4bdd280702ad5d080a1306e8fbfa2e99a0221a3865c13ed6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"qua
y.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://62c4450c857a205706fb8639ca0bf473be68a81f8e70a989080e74e6fb9795c8\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-02-26T15:42:50Z\\\",\\\"message\\\":\\\"+ timeout 3m /bin/bash -exuo pipefail -c 'while [ -n \\\\\\\"$(ss -Htanop \\\\\\\\( sport = 10357 \\\\\\\\))\\\\\\\" ]; do sleep 1; done'\\\\n++ ss -Htanop '(' sport = 10357 ')'\\\\n+ '[' -n '' ']'\\\\n+ exec cluster-policy-controller start --config=/etc/kubernetes/static-pod-resources/configmaps/cluster-policy-controller-config/config.yaml --kubeconfig=/etc/kubernetes/static-pod-resources/configmaps/controller-manager-kubeconfig/kubeconfig --namespace=openshift-kube-controller-manager -v=2\\\\nI0226 15:42:20.262653 1 leaderelection.go:121] The leader election gives 4 retries and allows for 30s of clock skew. The kube-apiserver downtime tolerance is 78s. Worst non-graceful lease acquisition is 2m43s. 
Worst graceful lease acquisition is {26s}.\\\\nI0226 15:42:20.264750 1 observer_polling.go:159] Starting file observer\\\\nI0226 15:42:20.297295 1 builder.go:298] cluster-policy-controller version 4.18.0-202501230001.p0.g5fd8525.assembly.stream.el9-5fd8525-5fd852525909ce6eab52972ba9ce8fcf56528eb9\\\\nI0226 15:42:20.301511 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/etc/kubernetes/static-pod-resources/secrets/serving-cert/tls.crt::/etc/kubernetes/static-pod-resources/secrets/serving-cert/tls.key\\\\\\\"\\\\nF0226 15:42:50.781187 1 cmd.go:179] failed checking apiserver connectivity: Get \\\\\\\"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/openshift-kube-controller-manager/leases/cluster-policy-controller-lock\\\\\\\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T15:42:49Z is after 2026-02-23T05:33:13Z\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-26T15:42:19Z\\\"}},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T15:42:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b4592db3d17945a9ed96383e96902333033b03f395da93754ffbca7d15b1e633\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\
":{\\\"startedAt\\\":\\\"2026-02-26T15:42:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ac01de0d4759557a4502a3c742ecae613068311f796904e35769463f9a277620\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T15:42:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4e11dad962ef019f41cac623fb986f909a7c58377cd8d52e58ec300f7cc4cbb2\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T15:42:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-d
ir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-26T15:42:18Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T15:45:18Z is after 2025-08-24T17:21:41Z" Feb 26 15:45:18 crc kubenswrapper[4907]: I0226 15:45:18.096293 4907 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-26T15:44:18Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-26T15:44:18Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T15:45:18Z is after 2025-08-24T17:21:41Z" Feb 26 15:45:18 crc kubenswrapper[4907]: I0226 15:45:18.110480 4907 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-s9f9w" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"432281c6-dcf8-4471-9801-9194000a9abd\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T15:44:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T15:44:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T15:44:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T15:44:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a751c325fc4b5b8668afd084530efeddd36543db3710b4d5ab525dc8e572bb1e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T15:44:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wrq6z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://00c5078cb42e7e369ed71d8867be75c4f1bf4
73eae40d151eacbeda76980196c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T15:44:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wrq6z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-26T15:44:18Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-s9f9w\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T15:45:18Z is after 2025-08-24T17:21:41Z" Feb 26 15:45:18 crc kubenswrapper[4907]: I0226 15:45:18.126135 4907 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 26 15:45:18 crc kubenswrapper[4907]: E0226 15:45:18.126462 4907 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 26 15:45:18 crc kubenswrapper[4907]: I0226 15:45:18.126830 4907 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 26 15:45:18 crc kubenswrapper[4907]: E0226 15:45:18.127052 4907 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 26 15:45:18 crc kubenswrapper[4907]: I0226 15:45:18.127451 4907 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 26 15:45:18 crc kubenswrapper[4907]: E0226 15:45:18.130331 4907 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 26 15:45:18 crc kubenswrapper[4907]: I0226 15:45:18.131547 4907 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-26T15:44:19Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-26T15:44:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-26T15:44:19Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b385be8ca84800beda307aea098ce9f4e640cd4b6c7bd2856c75b1a4193cb655\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T15:44:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\
\\"cri-o://acf341c3480df31c1b94ef2f3feb5a3e7eef3fa85ef3292ad0e5ef70a4575cb3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T15:44:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T15:45:18Z is after 2025-08-24T17:21:41Z" Feb 26 15:45:18 crc kubenswrapper[4907]: I0226 15:45:18.146252 4907 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-26T15:44:18Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-26T15:44:18Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T15:45:18Z is after 2025-08-24T17:21:41Z" Feb 26 15:45:18 crc kubenswrapper[4907]: I0226 15:45:18.162086 4907 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-b2qgz" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3ab23cfe-46ea-420e-ba6c-38ac0d2804b0\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T15:44:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T15:44:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T15:44:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T15:44:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fa50b3ce686f099f6b9ed4dcb642c118a6294d2e92cfdbf59339d106c9052d1a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T15:44:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2vfj6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://608b79bf33a420a12900e4bce6e593b17cfa7c3e9ebbcc9378833dce3a84e31d\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://608b79bf33a420a12900e4bce6e593b17cfa7c3e9ebbcc9378833dce3a84e31d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-26T15:44:19Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-26T15:44:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2vfj6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e89433e3d1fc270f03f4dba736b947b987980198cfe9e4f66865ab6222ce82f4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e89433e3d1fc270f03f4dba736b947b987980198cfe9e4f66865ab6222ce82f4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-26T15:44:20Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-26T15:44:19Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2vfj6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e31f3856c094e119772c90aaa64b7decc756b6da339efc3d406daeaa8b274176\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e31f3856c094e119772c90aaa64b7decc756b6da339efc3d406daeaa8b274176\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-26T15:44:22Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-26T15:44:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2vfj6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bfc88
f0a13f82a4a192745b9a3eac44fea007542c73923ca729d6fd6336c1851\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://bfc88f0a13f82a4a192745b9a3eac44fea007542c73923ca729d6fd6336c1851\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-26T15:44:22Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-26T15:44:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2vfj6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://73cba4d9193c3840f98e95371a1cda6f5264d73d631ef29664dfd1b0f9852b52\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://73cba4d9193c3840f98e95371a1cda6f5264d73d631ef29664dfd1b0f9852b52\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-26T15:44:25Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2026-02-26T15:44:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2vfj6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bf00572269494256a1a7b40277ce094962baaa145f2147dde7870e4c19b8f688\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://bf00572269494256a1a7b40277ce094962baaa145f2147dde7870e4c19b8f688\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-26T15:44:26Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-26T15:44:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2vfj6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-26T15:44:18Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-b2qgz\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T15:45:18Z is after 2025-08-24T17:21:41Z" Feb 26 15:45:18 crc kubenswrapper[4907]: I0226 15:45:18.175972 4907 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-v5ng6" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"917eebf3-db36-47b8-af0a-b80d042fddab\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T15:44:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T15:44:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T15:44:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T15:44:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7f195a8a6d014276c4202f3995d294fe5026b640273192a6f463642b79d4ddda\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"runn
ing\\\":{\\\"startedAt\\\":\\\"2026-02-26T15:44:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9lmpq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://178aa71969c1efffd1f234213afe3cf84ffc1f8300112efb368309603695c3ee\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T15:44:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9lmpq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-26T15:44:18Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-v5ng6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T15:45:18Z is after 2025-08-24T17:21:41Z" Feb 26 15:45:18 crc kubenswrapper[4907]: 
I0226 15:45:18.190130 4907 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-26T15:44:21Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-26T15:44:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-26T15:44:21Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e9637349a18a137859d53c939993c64cd1275117aeab8d855be9498820d9ec46\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T15:44:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T15:45:18Z is after 2025-08-24T17:21:41Z" Feb 26 15:45:18 crc kubenswrapper[4907]: I0226 15:45:18.209974 4907 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2c14dd1f-1741-447b-ad4f-ce34e0d5bd63\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T15:42:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T15:42:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T15:42:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T15:42:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T15:42:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2413429d3f7edf75cdb8cd2cb7fe17b4f9c5017c7a2926764186e1d65e44228d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-crio\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T15:42:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPat
h\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1f3f4eb948df3626824724fd4883ad9e04fb96bb8f74f33a8367a1d6f1dc9ae8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1f3f4eb948df3626824724fd4883ad9e04fb96bb8f74f33a8367a1d6f1dc9ae8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-26T15:42:19Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-26T15:42:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-26T15:42:18Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"kube-rbac-proxy-crio-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T15:45:18Z is after 2025-08-24T17:21:41Z" Feb 26 15:45:18 crc kubenswrapper[4907]: I0226 15:45:18.226794 4907 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"27c9ab80-fcc8-4c5a-9d89-c0504e0e6396\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T15:42:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T15:42:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T15:42:18Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T15:42:18Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T15:42:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bbc5e8c015ccc6b1a4740c955375e4f995f69ff1f1f698d8e2660ef451da6b8c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T15:42:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",
\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://64e8ac34f3cae799ba04d2bba51c22e4d99cf03261778fe3ba7a2320e661e727\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T15:42:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://42e24dea757f775f836c5c1fdb77c920db85f523bc0a35d2f2fb22e766274556\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T15:42:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a3c61b08bda7c918a3fa7b01e6f80515ee05a5746e189e829d2872c181b80c85\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e277
53fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a3c61b08bda7c918a3fa7b01e6f80515ee05a5746e189e829d2872c181b80c85\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-02-26T15:44:12Z\\\",\\\"message\\\":\\\"le observer\\\\nW0226 15:44:11.651017 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0226 15:44:11.651151 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0226 15:44:11.653054 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1720683088/tls.crt::/tmp/serving-cert-1720683088/tls.key\\\\\\\"\\\\nI0226 15:44:12.242500 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0226 15:44:12.245173 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0226 15:44:12.245192 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0226 15:44:12.245214 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0226 15:44:12.245219 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0226 15:44:12.248257 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0226 15:44:12.248276 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0226 15:44:12.248281 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0226 15:44:12.248286 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0226 15:44:12.248289 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0226 15:44:12.248292 1 secure_serving.go:69] 
Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0226 15:44:12.248295 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0226 15:44:12.248403 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF0226 15:44:12.250972 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-26T15:44:11Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":4,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 1m20s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8cf7bf0e49be4282c641d1e48be50a327bb418475701bfde61f4249724709e11\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T15:42:21Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7ff4ef3cac1d6f77bf9c90ee9a0f1d8fca15084e93afdb4e4e0048cbfe904f19\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7ff4ef3cac1d6f77bf9c90ee9a0f1d8fca15084e93afdb4e4e0048cbfe904f19\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-26T15:42:19Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-26T15:42:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-26T15:42:18Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T15:45:18Z is after 2025-08-24T17:21:41Z" Feb 26 15:45:18 crc kubenswrapper[4907]: E0226 15:45:18.234966 4907 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
Feb 26 15:45:18 crc kubenswrapper[4907]: I0226 15:45:18.243792 4907 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-26T15:44:19Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-26T15:44:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-26T15:44:19Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1e574efe4067ea713788905c2bd40d7ff4ed75353c577df5ee8ca730d5037434\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T15:44:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod 
\"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T15:45:18Z is after 2025-08-24T17:21:41Z" Feb 26 15:45:18 crc kubenswrapper[4907]: I0226 15:45:18.256219 4907 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-9gtgp" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ae882fbf-ac76-4363-a10c-60eaf80ee7c7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T15:44:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T15:44:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T15:44:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T15:44:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://78c4268a57d845c79f2bf6b5e3742785efea137f2b0b3c37cb1b6fc54274e30f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":t
rue,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T15:44:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xl77m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-26T15:44:18Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-9gtgp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T15:45:18Z is after 2025-08-24T17:21:41Z" Feb 26 15:45:18 crc kubenswrapper[4907]: I0226 15:45:18.274498 4907 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-2gl5t" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"51024bd5-00ff-4e2f-927c-8c989b59d7be\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T15:44:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T15:44:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T15:45:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T15:45:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e822f482000a6645405c4c5b3b74d28302ababcc6de59c9d2f392c08d1fd092f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9a3cdc02208e8eab1e0c3c3f08a0759873ebfd63c98e64af187800d59a5b44da\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-02-26T15:45:06Z\\\",\\\"message\\\":\\\"2026-02-26T15:44:20+00:00 [cnibincopy] Successfully copied files in /usr/src/multus-cni/rhel9/bin/ to /host/opt/cni/bin/upgrade_b8a74968-e1a4-4746-b2bd-e84f4f6ec044\\\\n2026-02-26T15:44:20+00:00 [cnibincopy] Successfully moved files in /host/opt/cni/bin/upgrade_b8a74968-e1a4-4746-b2bd-e84f4f6ec044 to /host/opt/cni/bin/\\\\n2026-02-26T15:44:21Z [verbose] multus-daemon started\\\\n2026-02-26T15:44:21Z [verbose] 
Readiness Indicator file check\\\\n2026-02-26T15:45:06Z [error] have you checked that your default network is ready? still waiting for readinessindicatorfile @ /host/run/multus/cni/net.d/10-ovn-kubernetes.conf. pollimmediate error: timed out waiting for the condition\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-26T15:44:18Z\\\"}},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T15:45:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/
var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2fx5n\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-26T15:44:18Z\\\"}}\" for pod \"openshift-multus\"/\"multus-2gl5t\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T15:45:18Z is after 2025-08-24T17:21:41Z" Feb 26 15:45:18 crc kubenswrapper[4907]: I0226 15:45:18.300692 4907 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-vsvsw" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"49ee65e1-8667-4ad7-a403-c899f0cc6a70\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T15:44:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T15:44:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T15:44:18Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T15:44:18Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c70ed6854442dfb329171dc5c454c036c020cb91e1f6595eb3fbe2d95704d52d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T15:44:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7hmb7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://67439cebe8e10e13db8af6bc74e152eb562382fb3b2f026ba3cbfe42e3b4c921\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T15:44:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7hmb7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://800657f54374550b21f96594e9c9ce4e7dff28c5c09061192a95bb8a668ebbea\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T15:44:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7hmb7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9e7470d80d872846d4d91e9070becfa3496dca8af1b315e637c34edce0dcd57b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d20994829
19d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T15:44:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7hmb7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://17760db3d112b908ad1389e3c28c244e756ef06ec2b4f170e4f52e17f9a75a89\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T15:44:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7hmb7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://eca4b7a72754f7457c608969c5319a498c526ab128b28400d2aed5d0413ff487\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cd
d47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T15:44:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7hmb7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://51787a0de7c6993ba3bfd70265cc1718966209053e9703f5fc5b039f3d78abae\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ed7db8f0288f2b3a14da208935b54a6702d7b68a6ec301250f2ebb9519354f9e\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-02-26T15:44:45Z\\\",\\\"message\\\":\\\"org/kind:Service k8s.ovn.org/owner:openshift-authentication/oauth-openshift]} 
name:Service_openshift-authentication/oauth-openshift_TCP_cluster options:{GoMap:map[event:false hairpin_snat_ip:169.254.0.5 fd69::5 neighbor_responder:none reject:true skip_snat:false]} protocol:{GoSet:[tcp]} selection_fields:{GoSet:[]} vips:{GoMap:map[10.217.4.222:443:]}] Rows:[] Columns:[] Mutations:[] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {c0c2f725-e461-454e-a88c-c8350d62e1ef}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:}]\\\\nF0226 15:44:45.034086 7002 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create admin network policy controller, err: could not add Event Handler for anpInformer during admin network policy controller initialization, handler {0x1fcc6e0 0x1fcc3c0 0x1fcc360} was not added to shared informer because it has stopped already, failed to start node network controller: failed to start default node network controller: failed to set node crc annotations: Internal error occurred: failed calling webhook \\\\\\\"node.network-node-identity.openshift.io\\\\\\\": failed to call webhook: Post \\\\\\\"https://127.0.0.1:9743/node?timeout=10s\\\\\\\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T15:44:45Z is after 2025-0\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-26T15:44:44Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://51787a0de7c6993ba3bfd70265cc1718966209053e9703f5fc5b039f3d78abae\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-02-26T15:45:17Z\\\",\\\"message\\\":\\\"shift-image-registry/node-ca-9gtgp\\\\nI0226 15:45:17.094851 7329 obj_retry.go:365] Adding new object: *v1.Pod openshift-image-registry/node-ca-9gtgp\\\\nI0226 15:45:17.094858 7329 ovn.go:134] Ensuring zone 
local for Pod openshift-image-registry/node-ca-9gtgp in node crc\\\\nI0226 15:45:17.094867 7329 obj_retry.go:386] Retry successful for *v1.Pod openshift-image-registry/node-ca-9gtgp after 0 failed attempt(s)\\\\nI0226 15:45:17.094873 7329 default_network_controller.go:776] Recording success event on pod openshift-image-registry/node-ca-9gtgp\\\\nF0226 15:45:17.094874 7329 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create admin network policy controller, err: could not add Event Handler for anpInformer during admin network policy controller initialization, handler {0x1fcc6e0 0x1fcc3c0 0x1fcc360} was not added to shared informer because it has stopped already, failed to start node network controller: failed to start default node network controller: failed to set node crc annotations: Internal error occurred: failed calling webhook \\\\\\\"node.network-node-identity.openshift.io\\\\\\\": failed to call webhook: Post \\\\\\\"https://127.0.0.1:9743/node?timeout=10s\\\\\\\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T15:45:17Z 
\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-26T15:45:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kub
e-api-access-7hmb7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://cc2b19d04bf2ef1455fa049ed09ef927305f1ec89b19b42f39b0d8c1397f69df\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T15:44:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7hmb7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b7621667d7c9c119893fe930093d4e1d2256a13aadc196023df28d1a78aef68c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b7621667d7c9c119893fe930093d4e1d2256a13aadc196023df28d1a78aef68c\\\
",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-26T15:44:18Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-26T15:44:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7hmb7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-26T15:44:18Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-vsvsw\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T15:45:18Z is after 2025-08-24T17:21:41Z" Feb 26 15:45:18 crc kubenswrapper[4907]: I0226 15:45:18.314111 4907 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-26T15:44:18Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-26T15:44:18Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T15:45:18Z is after 2025-08-24T17:21:41Z" Feb 26 15:45:18 crc kubenswrapper[4907]: I0226 15:45:18.327962 4907 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-zsb5l" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"fd06f422-2c09-4da9-843c-75525df52517\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T15:44:19Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T15:44:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T15:44:19Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T15:44:19Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dbhj9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dbhj9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-26T15:44:19Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-zsb5l\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T15:45:18Z is after 2025-08-24T17:21:41Z" Feb 26 15:45:18 crc 
kubenswrapper[4907]: I0226 15:45:18.336941 4907 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-958vt" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"4569fec7-a859-4a9e-b9d9-34ccc7c6be9c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T15:44:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T15:44:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T15:44:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T15:44:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1c9c60e926f3c2412b5a8698e82e161e6e34373a3e6b471698cb521b9e494871\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T15:44:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8nhj9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\
"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-26T15:44:18Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-958vt\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T15:45:18Z is after 2025-08-24T17:21:41Z" Feb 26 15:45:18 crc kubenswrapper[4907]: I0226 15:45:18.355220 4907 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-958vt" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"4569fec7-a859-4a9e-b9d9-34ccc7c6be9c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T15:44:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T15:44:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T15:44:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T15:44:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1c9c60e926f3c2412b5a8698e82e161e6e34373a3e6b471698cb521b9e494871\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f
6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T15:44:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8nhj9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-26T15:44:18Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-958vt\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T15:45:18Z is after 2025-08-24T17:21:41Z" Feb 26 15:45:18 crc kubenswrapper[4907]: I0226 15:45:18.367453 4907 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"87fcecd2-771a-4669-a303-2f74cf7ac919\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T15:42:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T15:42:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T15:43:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T15:43:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T15:42:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://aa0a0c55e7d739a2c76f82d2886d67e4aa4334b873445cb317782b057f7afa65\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T15:42:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ba401e1eedaa38b967c1b76dc8ee8221684e36e0f152a24131706adc0346bb2d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha
256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T15:42:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e44c81cef61f4aecc15b45d6bbb7f3552588a1f0256042998c5a2f158c3879c8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T15:42:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://81ae0a80fac56ae4b446c60d3478f3b6e4a448314ac78ad45840c7c09c232f0d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"conta
inerID\\\":\\\"cri-o://81ae0a80fac56ae4b446c60d3478f3b6e4a448314ac78ad45840c7c09c232f0d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-26T15:42:19Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-26T15:42:19Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-26T15:42:18Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T15:45:18Z is after 2025-08-24T17:21:41Z" Feb 26 15:45:18 crc kubenswrapper[4907]: I0226 15:45:18.389086 4907 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"087bfdc5-a69f-41c0-912b-10827f34927b\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T15:42:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T15:42:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T15:42:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T15:42:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T15:42:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2d6cb50daf3d05a3e4b4427361206adaeb990478e437b697db9a2716fbc0a3e0\\\",\\\"image\\\":\\\"quay.io/op
enshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T15:42:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7a65767b486307851169c93586cffb785a0977b0ca654dc7bc6fd38ce349d5f0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T15:42:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b642a813d8a9d885593d5dd495ed461119f14e1c1937844b64196bb55dd67e24\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T15:42:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://91e03a798a371431d5f0e490e8ffe260ea101ae6a41f56f9ee2d37c2ed255f1d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T15:42:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d111022be1d13de640f2fe6f3683455c1defed82f3c06fb63c8b84d2feea1182\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T15:42:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\
\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c3c8fe6e74e5efb27449fa26c2e705a62d8fb1b6f74e1ed787fbd7c37e711699\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c3c8fe6e74e5efb27449fa26c2e705a62d8fb1b6f74e1ed787fbd7c37e711699\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-26T15:42:19Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-26T15:42:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a0933fb54ef30c16899ff47ed6fa9c7836452ad420e970de1c0b7408c0bb3886\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a0933fb54ef30c16899ff47ed6fa9c7836452ad420e970de1c0b7408c0bb3886\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-26T15:42:20Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-26T15:42:20Z\\\"}}},{\\\"co
ntainerID\\\":\\\"cri-o://ca0bc422f98a960703843a6e090851bd3b091b08d31aeb875ef10cbb6e9c830a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ca0bc422f98a960703843a6e090851bd3b091b08d31aeb875ef10cbb6e9c830a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-26T15:42:21Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-26T15:42:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-26T15:42:18Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T15:45:18Z is after 2025-08-24T17:21:41Z" Feb 26 15:45:18 crc kubenswrapper[4907]: I0226 15:45:18.407915 4907 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"2e5aef55-fc68-4c1c-92e1-41a202917e84\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T15:42:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T15:42:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T15:43:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T15:43:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T15:42:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5033366771e6954e4bdd280702ad5d080a1306e8fbfa2e99a0221a3865c13ed6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://62c4450c857a205706fb8639ca0bf473be68a81f8e70a989080e74e6fb9795c8\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-02-26T15:42:50Z\\\",\\\"message\\\":\\\"+ timeout 3m /bin/bash -exuo pipefail -c 'while [ -n \\\\\\\"$(ss -Htanop \\\\\\\\( sport = 10357 \\\\\\\\))\\\\\\\" ]; do sleep 1; done'\\\\n++ ss -Htanop '(' sport = 10357 ')'\\\\n+ '[' -n '' ']'\\\\n+ exec cluster-policy-controller start --config=/etc/kubernetes/static-pod-resources/configmaps/cluster-policy-controller-config/config.yaml --kubeconfig=/etc/kubernetes/static-pod-resources/configmaps/controller-manager-kubeconfig/kubeconfig --namespace=openshift-kube-controller-manager 
-v=2\\\\nI0226 15:42:20.262653 1 leaderelection.go:121] The leader election gives 4 retries and allows for 30s of clock skew. The kube-apiserver downtime tolerance is 78s. Worst non-graceful lease acquisition is 2m43s. Worst graceful lease acquisition is {26s}.\\\\nI0226 15:42:20.264750 1 observer_polling.go:159] Starting file observer\\\\nI0226 15:42:20.297295 1 builder.go:298] cluster-policy-controller version 4.18.0-202501230001.p0.g5fd8525.assembly.stream.el9-5fd8525-5fd852525909ce6eab52972ba9ce8fcf56528eb9\\\\nI0226 15:42:20.301511 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/etc/kubernetes/static-pod-resources/secrets/serving-cert/tls.crt::/etc/kubernetes/static-pod-resources/secrets/serving-cert/tls.key\\\\\\\"\\\\nF0226 15:42:50.781187 1 cmd.go:179] failed checking apiserver connectivity: Get \\\\\\\"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/openshift-kube-controller-manager/leases/cluster-policy-controller-lock\\\\\\\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T15:42:49Z is after 
2026-02-23T05:33:13Z\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-26T15:42:19Z\\\"}},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T15:42:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b4592db3d17945a9ed96383e96902333033b03f395da93754ffbca7d15b1e633\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T15:42:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ac01de0d4759557a4502a3c742ecae613068311f796904e35769463f9a277620\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T15:42:19Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4e11dad962ef019f41cac623fb986f909a7c58377cd8d52e58ec300f7cc4cbb2\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T15:42:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-26T15:42:18Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T15:45:18Z is after 2025-08-24T17:21:41Z" Feb 26 15:45:18 crc kubenswrapper[4907]: I0226 15:45:18.424630 4907 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-26T15:44:18Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-26T15:44:18Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T15:45:18Z is after 2025-08-24T17:21:41Z" Feb 26 15:45:18 crc kubenswrapper[4907]: I0226 15:45:18.438307 4907 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-s9f9w" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"432281c6-dcf8-4471-9801-9194000a9abd\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T15:44:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T15:44:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T15:44:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T15:44:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a751c325fc4b5b8668afd084530efeddd36543db3710b4d5ab525dc8e572bb1e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T15:44:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wrq6z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://00c5078cb42e7e369ed71d8867be75c4f1bf4
73eae40d151eacbeda76980196c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T15:44:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wrq6z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-26T15:44:18Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-s9f9w\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T15:45:18Z is after 2025-08-24T17:21:41Z" Feb 26 15:45:18 crc kubenswrapper[4907]: I0226 15:45:18.454555 4907 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-26T15:44:19Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-26T15:44:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-26T15:44:19Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b385be8ca84800beda307aea098ce9f4e640cd4b6c7bd2856c75b1a4193cb655\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T15:44:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://acf341c3480df31c1b94ef2f3feb5a3e7eef3fa85ef3292ad0e5ef70a4575cb3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T15:44:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T15:45:18Z is after 2025-08-24T17:21:41Z" Feb 26 15:45:18 crc kubenswrapper[4907]: I0226 15:45:18.470002 4907 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-26T15:44:18Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-26T15:44:18Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T15:45:18Z is after 2025-08-24T17:21:41Z" Feb 26 15:45:18 crc kubenswrapper[4907]: I0226 15:45:18.483518 4907 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-b2qgz" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3ab23cfe-46ea-420e-ba6c-38ac0d2804b0\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T15:44:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T15:44:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T15:44:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T15:44:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fa50b3ce686f099f6b9ed4dcb642c118a6294d2e92cfdbf59339d106c9052d1a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T15:44:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2vfj6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://608b79bf33a420a12900e4bce6e593b17cfa7c3e9ebbcc9378833dce3a84e31d\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://608b79bf33a420a12900e4bce6e593b17cfa7c3e9ebbcc9378833dce3a84e31d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-26T15:44:19Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-26T15:44:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2vfj6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e89433e3d1fc270f03f4dba736b947b987980198cfe9e4f66865ab6222ce82f4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e89433e3d1fc270f03f4dba736b947b987980198cfe9e4f66865ab6222ce82f4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-26T15:44:20Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-26T15:44:19Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2vfj6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e31f3856c094e119772c90aaa64b7decc756b6da339efc3d406daeaa8b274176\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e31f3856c094e119772c90aaa64b7decc756b6da339efc3d406daeaa8b274176\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-26T15:44:22Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-26T15:44:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2vfj6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bfc88
f0a13f82a4a192745b9a3eac44fea007542c73923ca729d6fd6336c1851\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://bfc88f0a13f82a4a192745b9a3eac44fea007542c73923ca729d6fd6336c1851\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-26T15:44:22Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-26T15:44:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2vfj6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://73cba4d9193c3840f98e95371a1cda6f5264d73d631ef29664dfd1b0f9852b52\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://73cba4d9193c3840f98e95371a1cda6f5264d73d631ef29664dfd1b0f9852b52\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-26T15:44:25Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2026-02-26T15:44:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2vfj6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bf00572269494256a1a7b40277ce094962baaa145f2147dde7870e4c19b8f688\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://bf00572269494256a1a7b40277ce094962baaa145f2147dde7870e4c19b8f688\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-26T15:44:26Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-26T15:44:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2vfj6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-26T15:44:18Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-b2qgz\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T15:45:18Z is after 2025-08-24T17:21:41Z" Feb 26 15:45:18 crc kubenswrapper[4907]: I0226 15:45:18.499082 4907 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-v5ng6" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"917eebf3-db36-47b8-af0a-b80d042fddab\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T15:44:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T15:44:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T15:44:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T15:44:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7f195a8a6d014276c4202f3995d294fe5026b640273192a6f463642b79d4ddda\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"runn
ing\\\":{\\\"startedAt\\\":\\\"2026-02-26T15:44:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9lmpq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://178aa71969c1efffd1f234213afe3cf84ffc1f8300112efb368309603695c3ee\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T15:44:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9lmpq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-26T15:44:18Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-v5ng6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T15:45:18Z is after 2025-08-24T17:21:41Z" Feb 26 15:45:18 crc kubenswrapper[4907]: 
I0226 15:45:18.528096 4907 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-vsvsw" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"49ee65e1-8667-4ad7-a403-c899f0cc6a70\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T15:44:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T15:44:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T15:44:18Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T15:44:18Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c70ed6854442dfb329171dc5c454c036c020cb91e1f6595eb3fbe2d95704d52d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T15:44:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7hmb7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://67439cebe8e10e13db8af6bc74e152eb562382fb3b2f026ba3cbfe42e3b4c921\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T15:44:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7hmb7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://800657f54374550b21f96594e9c9ce4e7dff28c5c09061192a95bb8a668ebbea\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T15:44:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7hmb7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9e7470d80d872846d4d91e9070becfa3496dca8af1b315e637c34edce0dcd57b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T15:44:20Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7hmb7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://17760db3d112b908ad1389e3c28c244e756ef06ec2b4f170e4f52e17f9a75a89\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T15:44:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7hmb7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://eca4b7a72754f7457c608969c5319a498c526ab128b28400d2aed5d0413ff487\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T15:44:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7hmb7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://51787a0de7c6993ba3bfd70265cc1718966209053e9703f5fc5b039f3d78abae\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ed7db8f0288f2b3a14da208935b54a6702d7b68a6ec301250f2ebb9519354f9e\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-02-26T15:44:45Z\\\",\\\"message\\\":\\\"org/kind:Service k8s.ovn.org/owner:openshift-authentication/oauth-openshift]} name:Service_openshift-authentication/oauth-openshift_TCP_cluster options:{GoMap:map[event:false hairpin_snat_ip:169.254.0.5 fd69::5 neighbor_responder:none reject:true skip_snat:false]} protocol:{GoSet:[tcp]} selection_fields:{GoSet:[]} 
vips:{GoMap:map[10.217.4.222:443:]}] Rows:[] Columns:[] Mutations:[] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {c0c2f725-e461-454e-a88c-c8350d62e1ef}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:}]\\\\nF0226 15:44:45.034086 7002 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create admin network policy controller, err: could not add Event Handler for anpInformer during admin network policy controller initialization, handler {0x1fcc6e0 0x1fcc3c0 0x1fcc360} was not added to shared informer because it has stopped already, failed to start node network controller: failed to start default node network controller: failed to set node crc annotations: Internal error occurred: failed calling webhook \\\\\\\"node.network-node-identity.openshift.io\\\\\\\": failed to call webhook: Post \\\\\\\"https://127.0.0.1:9743/node?timeout=10s\\\\\\\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T15:44:45Z is after 2025-0\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-26T15:44:44Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://51787a0de7c6993ba3bfd70265cc1718966209053e9703f5fc5b039f3d78abae\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-02-26T15:45:17Z\\\",\\\"message\\\":\\\"shift-image-registry/node-ca-9gtgp\\\\nI0226 15:45:17.094851 7329 obj_retry.go:365] Adding new object: *v1.Pod openshift-image-registry/node-ca-9gtgp\\\\nI0226 15:45:17.094858 7329 ovn.go:134] Ensuring zone local for Pod openshift-image-registry/node-ca-9gtgp in node crc\\\\nI0226 15:45:17.094867 7329 obj_retry.go:386] Retry successful for *v1.Pod openshift-image-registry/node-ca-9gtgp after 0 failed attempt(s)\\\\nI0226 15:45:17.094873 7329 
default_network_controller.go:776] Recording success event on pod openshift-image-registry/node-ca-9gtgp\\\\nF0226 15:45:17.094874 7329 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create admin network policy controller, err: could not add Event Handler for anpInformer during admin network policy controller initialization, handler {0x1fcc6e0 0x1fcc3c0 0x1fcc360} was not added to shared informer because it has stopped already, failed to start node network controller: failed to start default node network controller: failed to set node crc annotations: Internal error occurred: failed calling webhook \\\\\\\"node.network-node-identity.openshift.io\\\\\\\": failed to call webhook: Post \\\\\\\"https://127.0.0.1:9743/node?timeout=10s\\\\\\\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T15:45:17Z \\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-26T15:45:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"h
ost-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7hmb7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://cc2b19d04bf2ef1455fa049ed09ef927305f1ec89b19b42f39b0d8c1397f69df\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T15:44:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\
\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7hmb7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b7621667d7c9c119893fe930093d4e1d2256a13aadc196023df28d1a78aef68c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b7621667d7c9c119893fe930093d4e1d2256a13aadc196023df28d1a78aef68c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-26T15:44:18Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-26T15:44:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7hmb7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-26T15:44:18Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-vsvsw\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T15:45:18Z is after 2025-08-24T17:21:41Z" Feb 26 15:45:18 crc kubenswrapper[4907]: I0226 15:45:18.549085 
4907 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-26T15:44:18Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-26T15:44:18Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T15:45:18Z is after 2025-08-24T17:21:41Z" Feb 26 15:45:18 crc kubenswrapper[4907]: I0226 15:45:18.564049 4907 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-26T15:44:21Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-26T15:44:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-26T15:44:21Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e9637349a18a137859d53c939993c64cd1275117aeab8d855be9498820d9ec46\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T15:44:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-02-26T15:45:18Z is after 2025-08-24T17:21:41Z" Feb 26 15:45:18 crc kubenswrapper[4907]: I0226 15:45:18.582246 4907 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2c14dd1f-1741-447b-ad4f-ce34e0d5bd63\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T15:42:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T15:42:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T15:42:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T15:42:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T15:42:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2413429d3f7edf75cdb8cd2cb7fe17b4f9c5017c7a2926764186e1d65e44228d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-crio\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T15:42:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"hostIP\\\":\\\"192.168.1
26.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1f3f4eb948df3626824724fd4883ad9e04fb96bb8f74f33a8367a1d6f1dc9ae8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1f3f4eb948df3626824724fd4883ad9e04fb96bb8f74f33a8367a1d6f1dc9ae8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-26T15:42:19Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-26T15:42:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-26T15:42:18Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"kube-rbac-proxy-crio-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T15:45:18Z is after 2025-08-24T17:21:41Z" Feb 26 15:45:18 crc kubenswrapper[4907]: I0226 15:45:18.596781 4907 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"27c9ab80-fcc8-4c5a-9d89-c0504e0e6396\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T15:42:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T15:42:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T15:42:18Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T15:42:18Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T15:42:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bbc5e8c015ccc6b1a4740c955375e4f995f69ff1f1f698d8e2660ef451da6b8c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T15:42:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",
\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://64e8ac34f3cae799ba04d2bba51c22e4d99cf03261778fe3ba7a2320e661e727\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T15:42:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://42e24dea757f775f836c5c1fdb77c920db85f523bc0a35d2f2fb22e766274556\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T15:42:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a3c61b08bda7c918a3fa7b01e6f80515ee05a5746e189e829d2872c181b80c85\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e277
53fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a3c61b08bda7c918a3fa7b01e6f80515ee05a5746e189e829d2872c181b80c85\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-02-26T15:44:12Z\\\",\\\"message\\\":\\\"le observer\\\\nW0226 15:44:11.651017 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0226 15:44:11.651151 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0226 15:44:11.653054 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1720683088/tls.crt::/tmp/serving-cert-1720683088/tls.key\\\\\\\"\\\\nI0226 15:44:12.242500 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0226 15:44:12.245173 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0226 15:44:12.245192 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0226 15:44:12.245214 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0226 15:44:12.245219 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0226 15:44:12.248257 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0226 15:44:12.248276 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0226 15:44:12.248281 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0226 15:44:12.248286 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0226 15:44:12.248289 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0226 15:44:12.248292 1 secure_serving.go:69] 
Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0226 15:44:12.248295 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0226 15:44:12.248403 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF0226 15:44:12.250972 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-26T15:44:11Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":4,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 1m20s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8cf7bf0e49be4282c641d1e48be50a327bb418475701bfde61f4249724709e11\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T15:42:21Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7ff4ef3cac1d6f77bf9c90ee9a0f1d8fca15084e93afdb4e4e0048cbfe904f19\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7ff4ef3cac1d6f77bf9c90ee9a0f1d8fca15084e93afdb4e4e0048cbfe904f19\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-26T15:42:19Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-26T15:42:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-26T15:42:18Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T15:45:18Z is after 2025-08-24T17:21:41Z" Feb 26 15:45:18 crc kubenswrapper[4907]: I0226 15:45:18.612528 4907 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-26T15:44:19Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-26T15:44:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-26T15:44:19Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1e574efe4067ea713788905c2bd40d7ff4ed75353c577df5ee8ca730d5037434\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T15:44:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-02-26T15:45:18Z is after 2025-08-24T17:21:41Z" Feb 26 15:45:18 crc kubenswrapper[4907]: I0226 15:45:18.623705 4907 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-9gtgp" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ae882fbf-ac76-4363-a10c-60eaf80ee7c7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T15:44:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T15:44:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T15:44:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T15:44:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://78c4268a57d845c79f2bf6b5e3742785efea137f2b0b3c37cb1b6fc54274e30f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T15:44:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\
",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xl77m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-26T15:44:18Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-9gtgp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T15:45:18Z is after 2025-08-24T17:21:41Z" Feb 26 15:45:18 crc kubenswrapper[4907]: I0226 15:45:18.637841 4907 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-2gl5t" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"51024bd5-00ff-4e2f-927c-8c989b59d7be\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T15:44:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T15:44:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T15:45:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T15:45:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e822f482000a6645405c4c5b3b74d28302ababcc6de59c9d2f392c08d1fd092f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9a3cdc02208e8eab1e0c3c3f08a0759873ebfd63c98e64af187800d59a5b44da\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-02-26T15:45:06Z\\\",\\\"message\\\":\\\"2026-02-26T15:44:20+00:00 [cnibincopy] Successfully copied files in /usr/src/multus-cni/rhel9/bin/ to /host/opt/cni/bin/upgrade_b8a74968-e1a4-4746-b2bd-e84f4f6ec044\\\\n2026-02-26T15:44:20+00:00 [cnibincopy] Successfully moved files in /host/opt/cni/bin/upgrade_b8a74968-e1a4-4746-b2bd-e84f4f6ec044 to /host/opt/cni/bin/\\\\n2026-02-26T15:44:21Z [verbose] multus-daemon started\\\\n2026-02-26T15:44:21Z [verbose] 
Readiness Indicator file check\\\\n2026-02-26T15:45:06Z [error] have you checked that your default network is ready? still waiting for readinessindicatorfile @ /host/run/multus/cni/net.d/10-ovn-kubernetes.conf. pollimmediate error: timed out waiting for the condition\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-26T15:44:18Z\\\"}},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T15:45:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/
var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2fx5n\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-26T15:44:18Z\\\"}}\" for pod \"openshift-multus\"/\"multus-2gl5t\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T15:45:18Z is after 2025-08-24T17:21:41Z" Feb 26 15:45:18 crc kubenswrapper[4907]: I0226 15:45:18.648961 4907 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-zsb5l" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"fd06f422-2c09-4da9-843c-75525df52517\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T15:44:19Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T15:44:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T15:44:19Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T15:44:19Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dbhj9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dbhj9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-26T15:44:19Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-zsb5l\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T15:45:18Z is after 2025-08-24T17:21:41Z" Feb 26 15:45:19 crc kubenswrapper[4907]: I0226 15:45:19.021499 4907 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-vsvsw_49ee65e1-8667-4ad7-a403-c899f0cc6a70/ovnkube-controller/3.log" Feb 26 15:45:19 crc kubenswrapper[4907]: I0226 15:45:19.026227 4907 scope.go:117] "RemoveContainer" containerID="51787a0de7c6993ba3bfd70265cc1718966209053e9703f5fc5b039f3d78abae" Feb 26 15:45:19 crc kubenswrapper[4907]: E0226 15:45:19.026381 4907 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 40s restarting failed container=ovnkube-controller pod=ovnkube-node-vsvsw_openshift-ovn-kubernetes(49ee65e1-8667-4ad7-a403-c899f0cc6a70)\"" pod="openshift-ovn-kubernetes/ovnkube-node-vsvsw" podUID="49ee65e1-8667-4ad7-a403-c899f0cc6a70" Feb 26 15:45:19 crc kubenswrapper[4907]: I0226 15:45:19.044160 4907 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"2e5aef55-fc68-4c1c-92e1-41a202917e84\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T15:42:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T15:42:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T15:43:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T15:43:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T15:42:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5033366771e6954e4bdd280702ad5d080a1306e8fbfa2e99a0221a3865c13ed6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://62c4450c857a205706fb8639ca0bf473be68a81f8e70a989080e74e6fb9795c8\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-02-26T15:42:50Z\\\",\\\"message\\\":\\\"+ timeout 3m /bin/bash -exuo pipefail -c 'while [ -n \\\\\\\"$(ss -Htanop \\\\\\\\( sport = 10357 \\\\\\\\))\\\\\\\" ]; do sleep 1; done'\\\\n++ ss -Htanop '(' sport = 10357 ')'\\\\n+ '[' -n '' ']'\\\\n+ exec cluster-policy-controller start --config=/etc/kubernetes/static-pod-resources/configmaps/cluster-policy-controller-config/config.yaml --kubeconfig=/etc/kubernetes/static-pod-resources/configmaps/controller-manager-kubeconfig/kubeconfig --namespace=openshift-kube-controller-manager 
-v=2\\\\nI0226 15:42:20.262653 1 leaderelection.go:121] The leader election gives 4 retries and allows for 30s of clock skew. The kube-apiserver downtime tolerance is 78s. Worst non-graceful lease acquisition is 2m43s. Worst graceful lease acquisition is {26s}.\\\\nI0226 15:42:20.264750 1 observer_polling.go:159] Starting file observer\\\\nI0226 15:42:20.297295 1 builder.go:298] cluster-policy-controller version 4.18.0-202501230001.p0.g5fd8525.assembly.stream.el9-5fd8525-5fd852525909ce6eab52972ba9ce8fcf56528eb9\\\\nI0226 15:42:20.301511 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/etc/kubernetes/static-pod-resources/secrets/serving-cert/tls.crt::/etc/kubernetes/static-pod-resources/secrets/serving-cert/tls.key\\\\\\\"\\\\nF0226 15:42:50.781187 1 cmd.go:179] failed checking apiserver connectivity: Get \\\\\\\"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/openshift-kube-controller-manager/leases/cluster-policy-controller-lock\\\\\\\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T15:42:49Z is after 
2026-02-23T05:33:13Z\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-26T15:42:19Z\\\"}},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T15:42:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b4592db3d17945a9ed96383e96902333033b03f395da93754ffbca7d15b1e633\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T15:42:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ac01de0d4759557a4502a3c742ecae613068311f796904e35769463f9a277620\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T15:42:19Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4e11dad962ef019f41cac623fb986f909a7c58377cd8d52e58ec300f7cc4cbb2\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T15:42:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-26T15:42:18Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T15:45:19Z is after 2025-08-24T17:21:41Z" Feb 26 15:45:19 crc kubenswrapper[4907]: I0226 15:45:19.055742 4907 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-26T15:44:18Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-26T15:44:18Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T15:45:19Z is after 2025-08-24T17:21:41Z" Feb 26 15:45:19 crc kubenswrapper[4907]: I0226 15:45:19.069137 4907 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-s9f9w" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"432281c6-dcf8-4471-9801-9194000a9abd\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T15:44:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T15:44:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T15:44:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T15:44:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a751c325fc4b5b8668afd084530efeddd36543db3710b4d5ab525dc8e572bb1e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T15:44:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wrq6z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://00c5078cb42e7e369ed71d8867be75c4f1bf4
73eae40d151eacbeda76980196c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T15:44:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wrq6z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-26T15:44:18Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-s9f9w\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T15:45:19Z is after 2025-08-24T17:21:41Z" Feb 26 15:45:19 crc kubenswrapper[4907]: I0226 15:45:19.085951 4907 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"87fcecd2-771a-4669-a303-2f74cf7ac919\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T15:42:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T15:42:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T15:43:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T15:43:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T15:42:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://aa0a0c55e7d739a2c76f82d2886d67e4aa4334b873445cb317782b057f7afa65\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T15:42:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ba401e1eedaa38b967c1b76dc8ee8221684e36e0f152a24131706adc0346bb2d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha
256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T15:42:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e44c81cef61f4aecc15b45d6bbb7f3552588a1f0256042998c5a2f158c3879c8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T15:42:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://81ae0a80fac56ae4b446c60d3478f3b6e4a448314ac78ad45840c7c09c232f0d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"conta
inerID\\\":\\\"cri-o://81ae0a80fac56ae4b446c60d3478f3b6e4a448314ac78ad45840c7c09c232f0d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-26T15:42:19Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-26T15:42:19Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-26T15:42:18Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T15:45:19Z is after 2025-08-24T17:21:41Z" Feb 26 15:45:19 crc kubenswrapper[4907]: I0226 15:45:19.109954 4907 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"087bfdc5-a69f-41c0-912b-10827f34927b\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T15:42:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T15:42:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T15:42:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T15:42:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T15:42:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2d6cb50daf3d05a3e4b4427361206adaeb990478e437b697db9a2716fbc0a3e0\\\",\\\"image\\\":\\\"quay.io/op
enshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T15:42:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7a65767b486307851169c93586cffb785a0977b0ca654dc7bc6fd38ce349d5f0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T15:42:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b642a813d8a9d885593d5dd495ed461119f14e1c1937844b64196bb55dd67e24\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T15:42:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://91e03a798a371431d5f0e490e8ffe260ea101ae6a41f56f9ee2d37c2ed255f1d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T15:42:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d111022be1d13de640f2fe6f3683455c1defed82f3c06fb63c8b84d2feea1182\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T15:42:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\
\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c3c8fe6e74e5efb27449fa26c2e705a62d8fb1b6f74e1ed787fbd7c37e711699\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c3c8fe6e74e5efb27449fa26c2e705a62d8fb1b6f74e1ed787fbd7c37e711699\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-26T15:42:19Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-26T15:42:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a0933fb54ef30c16899ff47ed6fa9c7836452ad420e970de1c0b7408c0bb3886\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a0933fb54ef30c16899ff47ed6fa9c7836452ad420e970de1c0b7408c0bb3886\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-26T15:42:20Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-26T15:42:20Z\\\"}}},{\\\"co
ntainerID\\\":\\\"cri-o://ca0bc422f98a960703843a6e090851bd3b091b08d31aeb875ef10cbb6e9c830a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ca0bc422f98a960703843a6e090851bd3b091b08d31aeb875ef10cbb6e9c830a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-26T15:42:21Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-26T15:42:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-26T15:42:18Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T15:45:19Z is after 2025-08-24T17:21:41Z" Feb 26 15:45:19 crc kubenswrapper[4907]: I0226 15:45:19.127943 4907 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-zsb5l" Feb 26 15:45:19 crc kubenswrapper[4907]: E0226 15:45:19.128240 4907 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-zsb5l" podUID="fd06f422-2c09-4da9-843c-75525df52517" Feb 26 15:45:19 crc kubenswrapper[4907]: I0226 15:45:19.160369 4907 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-b2qgz" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3ab23cfe-46ea-420e-ba6c-38ac0d2804b0\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T15:44:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T15:44:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T15:44:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T15:44:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fa50b3ce686f099f6b9ed4dcb642c118a6294d2e92cfdbf59339d106c9052d1a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-relea
se-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T15:44:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2vfj6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://608b79bf33a420a12900e4bce6e593b17cfa7c3e9ebbcc9378833dce3a84e31d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://608b79bf33a420a12900e4bce6e593b17cfa7c3e9ebbcc9378833dce3a84e31d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-26T15:44:19Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-26T15:44:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2vfj6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e89433e3d1
fc270f03f4dba736b947b987980198cfe9e4f66865ab6222ce82f4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e89433e3d1fc270f03f4dba736b947b987980198cfe9e4f66865ab6222ce82f4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-26T15:44:20Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-26T15:44:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2vfj6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e31f3856c094e119772c90aaa64b7decc756b6da339efc3d406daeaa8b274176\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e31f3856c0
94e119772c90aaa64b7decc756b6da339efc3d406daeaa8b274176\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-26T15:44:22Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-26T15:44:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2vfj6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bfc88f0a13f82a4a192745b9a3eac44fea007542c73923ca729d6fd6336c1851\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://bfc88f0a13f82a4a192745b9a3eac44fea007542c73923ca729d6fd6336c1851\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-26T15:44:22Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-26T15:44:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2vfj6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disable
d\\\"}]},{\\\"containerID\\\":\\\"cri-o://73cba4d9193c3840f98e95371a1cda6f5264d73d631ef29664dfd1b0f9852b52\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://73cba4d9193c3840f98e95371a1cda6f5264d73d631ef29664dfd1b0f9852b52\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-26T15:44:25Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-26T15:44:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2vfj6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bf00572269494256a1a7b40277ce094962baaa145f2147dde7870e4c19b8f688\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://bf00572269494256a1a7b40277ce094962baaa145f2147dde7870e4c19b8f688\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2
026-02-26T15:44:26Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-26T15:44:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2vfj6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-26T15:44:18Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-b2qgz\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T15:45:19Z is after 2025-08-24T17:21:41Z" Feb 26 15:45:19 crc kubenswrapper[4907]: I0226 15:45:19.186348 4907 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-v5ng6" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"917eebf3-db36-47b8-af0a-b80d042fddab\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T15:44:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T15:44:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T15:44:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T15:44:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7f195a8a6d014276c4202f3995d294fe5026b640273192a6f463642b79d4ddda\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T15:44:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9lmpq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://178aa71969c1efffd1f234213afe3cf84ffc1f83
00112efb368309603695c3ee\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T15:44:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9lmpq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-26T15:44:18Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-v5ng6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T15:45:19Z is after 2025-08-24T17:21:41Z" Feb 26 15:45:19 crc kubenswrapper[4907]: I0226 15:45:19.204252 4907 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-26T15:44:19Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-26T15:44:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-26T15:44:19Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b385be8ca84800beda307aea098ce9f4e640cd4b6c7bd2856c75b1a4193cb655\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T15:44:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://acf341c3480df31c1b94ef2f3feb5a3e7eef3fa85ef3292ad0e5ef70a4575cb3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T15:44:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T15:45:19Z is after 2025-08-24T17:21:41Z" Feb 26 15:45:19 crc kubenswrapper[4907]: I0226 15:45:19.219018 4907 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-26T15:44:18Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-26T15:44:18Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T15:45:19Z is after 2025-08-24T17:21:41Z" Feb 26 15:45:19 crc kubenswrapper[4907]: I0226 15:45:19.233191 4907 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-26T15:44:19Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-26T15:44:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-26T15:44:19Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1e574efe4067ea713788905c2bd40d7ff4ed75353c577df5ee8ca730d5037434\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T15:44:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-02-26T15:45:19Z is after 2025-08-24T17:21:41Z" Feb 26 15:45:19 crc kubenswrapper[4907]: I0226 15:45:19.244748 4907 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-9gtgp" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ae882fbf-ac76-4363-a10c-60eaf80ee7c7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T15:44:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T15:44:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T15:44:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T15:44:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://78c4268a57d845c79f2bf6b5e3742785efea137f2b0b3c37cb1b6fc54274e30f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T15:44:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\
",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xl77m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-26T15:44:18Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-9gtgp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T15:45:19Z is after 2025-08-24T17:21:41Z" Feb 26 15:45:19 crc kubenswrapper[4907]: I0226 15:45:19.258883 4907 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-2gl5t" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"51024bd5-00ff-4e2f-927c-8c989b59d7be\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T15:44:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T15:44:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T15:45:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T15:45:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e822f482000a6645405c4c5b3b74d28302ababcc6de59c9d2f392c08d1fd092f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9a3cdc02208e8eab1e0c3c3f08a0759873ebfd63c98e64af187800d59a5b44da\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-02-26T15:45:06Z\\\",\\\"message\\\":\\\"2026-02-26T15:44:20+00:00 [cnibincopy] Successfully copied files in /usr/src/multus-cni/rhel9/bin/ to /host/opt/cni/bin/upgrade_b8a74968-e1a4-4746-b2bd-e84f4f6ec044\\\\n2026-02-26T15:44:20+00:00 [cnibincopy] Successfully moved files in /host/opt/cni/bin/upgrade_b8a74968-e1a4-4746-b2bd-e84f4f6ec044 to /host/opt/cni/bin/\\\\n2026-02-26T15:44:21Z [verbose] multus-daemon started\\\\n2026-02-26T15:44:21Z [verbose] 
Readiness Indicator file check\\\\n2026-02-26T15:45:06Z [error] have you checked that your default network is ready? still waiting for readinessindicatorfile @ /host/run/multus/cni/net.d/10-ovn-kubernetes.conf. pollimmediate error: timed out waiting for the condition\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-26T15:44:18Z\\\"}},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T15:45:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/
var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2fx5n\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-26T15:44:18Z\\\"}}\" for pod \"openshift-multus\"/\"multus-2gl5t\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T15:45:19Z is after 2025-08-24T17:21:41Z" Feb 26 15:45:19 crc kubenswrapper[4907]: I0226 15:45:19.276293 4907 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-vsvsw" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"49ee65e1-8667-4ad7-a403-c899f0cc6a70\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T15:44:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T15:44:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T15:44:18Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T15:44:18Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c70ed6854442dfb329171dc5c454c036c020cb91e1f6595eb3fbe2d95704d52d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T15:44:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7hmb7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://67439cebe8e10e13db8af6bc74e152eb562382fb3b2f026ba3cbfe42e3b4c921\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T15:44:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7hmb7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://800657f54374550b21f96594e9c9ce4e7dff28c5c09061192a95bb8a668ebbea\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T15:44:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7hmb7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9e7470d80d872846d4d91e9070becfa3496dca8af1b315e637c34edce0dcd57b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d20994829
19d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T15:44:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7hmb7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://17760db3d112b908ad1389e3c28c244e756ef06ec2b4f170e4f52e17f9a75a89\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T15:44:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7hmb7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://eca4b7a72754f7457c608969c5319a498c526ab128b28400d2aed5d0413ff487\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cd
d47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T15:44:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7hmb7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://51787a0de7c6993ba3bfd70265cc1718966209053e9703f5fc5b039f3d78abae\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://51787a0de7c6993ba3bfd70265cc1718966209053e9703f5fc5b039f3d78abae\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-02-26T15:45:17Z\\\",\\\"message\\\":\\\"shift-image-registry/node-ca-9gtgp\\\\nI0226 15:45:17.094851 7329 obj_retry.go:365] 
Adding new object: *v1.Pod openshift-image-registry/node-ca-9gtgp\\\\nI0226 15:45:17.094858 7329 ovn.go:134] Ensuring zone local for Pod openshift-image-registry/node-ca-9gtgp in node crc\\\\nI0226 15:45:17.094867 7329 obj_retry.go:386] Retry successful for *v1.Pod openshift-image-registry/node-ca-9gtgp after 0 failed attempt(s)\\\\nI0226 15:45:17.094873 7329 default_network_controller.go:776] Recording success event on pod openshift-image-registry/node-ca-9gtgp\\\\nF0226 15:45:17.094874 7329 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create admin network policy controller, err: could not add Event Handler for anpInformer during admin network policy controller initialization, handler {0x1fcc6e0 0x1fcc3c0 0x1fcc360} was not added to shared informer because it has stopped already, failed to start node network controller: failed to start default node network controller: failed to set node crc annotations: Internal error occurred: failed calling webhook \\\\\\\"node.network-node-identity.openshift.io\\\\\\\": failed to call webhook: Post \\\\\\\"https://127.0.0.1:9743/node?timeout=10s\\\\\\\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T15:45:17Z \\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-26T15:45:16Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 40s restarting failed container=ovnkube-controller 
pod=ovnkube-node-vsvsw_openshift-ovn-kubernetes(49ee65e1-8667-4ad7-a403-c899f0cc6a70)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7hmb7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://cc2b19d04bf2ef1455fa049ed09ef927305f1ec89b19b42f39b0d8c1397f69df\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T15:44:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7hmb7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b7621667d7c9c119893fe930093d4e1d2256a13aadc196023df28d1a78aef68c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b7621667d7c9c11989
3fe930093d4e1d2256a13aadc196023df28d1a78aef68c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-26T15:44:18Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-26T15:44:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7hmb7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-26T15:44:18Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-vsvsw\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T15:45:19Z is after 2025-08-24T17:21:41Z" Feb 26 15:45:19 crc kubenswrapper[4907]: I0226 15:45:19.289240 4907 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-26T15:44:18Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-26T15:44:18Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T15:45:19Z is after 2025-08-24T17:21:41Z" Feb 26 15:45:19 crc kubenswrapper[4907]: I0226 15:45:19.301022 4907 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-26T15:44:21Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-26T15:44:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-26T15:44:21Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e9637349a18a137859d53c939993c64cd1275117aeab8d855be9498820d9ec46\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T15:44:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-02-26T15:45:19Z is after 2025-08-24T17:21:41Z" Feb 26 15:45:19 crc kubenswrapper[4907]: I0226 15:45:19.313498 4907 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2c14dd1f-1741-447b-ad4f-ce34e0d5bd63\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T15:42:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T15:42:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T15:42:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T15:42:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T15:42:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2413429d3f7edf75cdb8cd2cb7fe17b4f9c5017c7a2926764186e1d65e44228d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-crio\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T15:42:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"hostIP\\\":\\\"192.168.1
26.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1f3f4eb948df3626824724fd4883ad9e04fb96bb8f74f33a8367a1d6f1dc9ae8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1f3f4eb948df3626824724fd4883ad9e04fb96bb8f74f33a8367a1d6f1dc9ae8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-26T15:42:19Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-26T15:42:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-26T15:42:18Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"kube-rbac-proxy-crio-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T15:45:19Z is after 2025-08-24T17:21:41Z" Feb 26 15:45:19 crc kubenswrapper[4907]: I0226 15:45:19.328494 4907 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"27c9ab80-fcc8-4c5a-9d89-c0504e0e6396\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T15:42:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T15:42:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T15:42:18Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T15:42:18Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T15:42:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bbc5e8c015ccc6b1a4740c955375e4f995f69ff1f1f698d8e2660ef451da6b8c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T15:42:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",
\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://64e8ac34f3cae799ba04d2bba51c22e4d99cf03261778fe3ba7a2320e661e727\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T15:42:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://42e24dea757f775f836c5c1fdb77c920db85f523bc0a35d2f2fb22e766274556\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T15:42:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a3c61b08bda7c918a3fa7b01e6f80515ee05a5746e189e829d2872c181b80c85\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e277
53fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a3c61b08bda7c918a3fa7b01e6f80515ee05a5746e189e829d2872c181b80c85\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-02-26T15:44:12Z\\\",\\\"message\\\":\\\"le observer\\\\nW0226 15:44:11.651017 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0226 15:44:11.651151 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0226 15:44:11.653054 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1720683088/tls.crt::/tmp/serving-cert-1720683088/tls.key\\\\\\\"\\\\nI0226 15:44:12.242500 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0226 15:44:12.245173 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0226 15:44:12.245192 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0226 15:44:12.245214 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0226 15:44:12.245219 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0226 15:44:12.248257 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0226 15:44:12.248276 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0226 15:44:12.248281 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0226 15:44:12.248286 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0226 15:44:12.248289 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0226 15:44:12.248292 1 secure_serving.go:69] 
Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0226 15:44:12.248295 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0226 15:44:12.248403 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF0226 15:44:12.250972 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-26T15:44:11Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":4,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 1m20s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8cf7bf0e49be4282c641d1e48be50a327bb418475701bfde61f4249724709e11\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T15:42:21Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7ff4ef3cac1d6f77bf9c90ee9a0f1d8fca15084e93afdb4e4e0048cbfe904f19\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7ff4ef3cac1d6f77bf9c90ee9a0f1d8fca15084e93afdb4e4e0048cbfe904f19\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-26T15:42:19Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-26T15:42:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-26T15:42:18Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T15:45:19Z is after 2025-08-24T17:21:41Z" Feb 26 15:45:19 crc kubenswrapper[4907]: I0226 15:45:19.339075 4907 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-zsb5l" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"fd06f422-2c09-4da9-843c-75525df52517\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T15:44:19Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T15:44:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T15:44:19Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T15:44:19Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dbhj9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dbhj9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-26T15:44:19Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-zsb5l\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T15:45:19Z is after 2025-08-24T17:21:41Z" Feb 26 15:45:19 crc 
kubenswrapper[4907]: I0226 15:45:19.349802 4907 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-958vt" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"4569fec7-a859-4a9e-b9d9-34ccc7c6be9c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T15:44:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T15:44:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T15:44:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T15:44:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1c9c60e926f3c2412b5a8698e82e161e6e34373a3e6b471698cb521b9e494871\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T15:44:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8nhj9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\
"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-26T15:44:18Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-958vt\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T15:45:19Z is after 2025-08-24T17:21:41Z" Feb 26 15:45:20 crc kubenswrapper[4907]: I0226 15:45:20.126139 4907 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 26 15:45:20 crc kubenswrapper[4907]: I0226 15:45:20.126196 4907 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 26 15:45:20 crc kubenswrapper[4907]: E0226 15:45:20.126382 4907 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 26 15:45:20 crc kubenswrapper[4907]: E0226 15:45:20.126616 4907 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 26 15:45:20 crc kubenswrapper[4907]: I0226 15:45:20.126911 4907 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 26 15:45:20 crc kubenswrapper[4907]: E0226 15:45:20.127066 4907 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 26 15:45:21 crc kubenswrapper[4907]: I0226 15:45:21.126275 4907 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-zsb5l" Feb 26 15:45:21 crc kubenswrapper[4907]: E0226 15:45:21.126468 4907 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-zsb5l" podUID="fd06f422-2c09-4da9-843c-75525df52517" Feb 26 15:45:22 crc kubenswrapper[4907]: I0226 15:45:22.126176 4907 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 26 15:45:22 crc kubenswrapper[4907]: E0226 15:45:22.126689 4907 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 26 15:45:22 crc kubenswrapper[4907]: I0226 15:45:22.126223 4907 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 26 15:45:22 crc kubenswrapper[4907]: E0226 15:45:22.126863 4907 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 26 15:45:22 crc kubenswrapper[4907]: I0226 15:45:22.126184 4907 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 26 15:45:22 crc kubenswrapper[4907]: E0226 15:45:22.127017 4907 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 26 15:45:22 crc kubenswrapper[4907]: I0226 15:45:22.145697 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 26 15:45:22 crc kubenswrapper[4907]: I0226 15:45:22.145854 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 26 15:45:22 crc kubenswrapper[4907]: I0226 15:45:22.145909 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 26 15:45:22 crc kubenswrapper[4907]: I0226 15:45:22.145955 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 26 15:45:22 crc kubenswrapper[4907]: I0226 15:45:22.145989 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 26 15:45:22 crc kubenswrapper[4907]: E0226 15:45:22.146151 4907 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Feb 26 15:45:22 crc kubenswrapper[4907]: E0226 15:45:22.146175 4907 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Feb 26 15:45:22 crc kubenswrapper[4907]: E0226 15:45:22.146194 4907 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Feb 26 15:45:22 crc kubenswrapper[4907]: E0226 15:45:22.146257 4907 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2026-02-26 15:46:26.146235274 +0000 UTC m=+248.664797163 (durationBeforeRetry 1m4s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Feb 26 15:45:22 crc kubenswrapper[4907]: E0226 15:45:22.146526 4907 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-26 15:46:26.146504831 +0000 UTC m=+248.665066720 (durationBeforeRetry 1m4s). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 26 15:45:22 crc kubenswrapper[4907]: E0226 15:45:22.146553 4907 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Feb 26 15:45:22 crc kubenswrapper[4907]: E0226 15:45:22.146630 4907 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-02-26 15:46:26.146615474 +0000 UTC m=+248.665177333 (durationBeforeRetry 1m4s). 
Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Feb 26 15:45:22 crc kubenswrapper[4907]: E0226 15:45:22.146721 4907 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Feb 26 15:45:22 crc kubenswrapper[4907]: E0226 15:45:22.146756 4907 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Feb 26 15:45:22 crc kubenswrapper[4907]: E0226 15:45:22.146751 4907 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Feb 26 15:45:22 crc kubenswrapper[4907]: E0226 15:45:22.146777 4907 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Feb 26 15:45:22 crc kubenswrapper[4907]: E0226 15:45:22.146863 4907 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2026-02-26 15:46:26.146839051 +0000 UTC m=+248.665400940 (durationBeforeRetry 1m4s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Feb 26 15:45:22 crc kubenswrapper[4907]: E0226 15:45:22.146899 4907 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-02-26 15:46:26.146881582 +0000 UTC m=+248.665443471 (durationBeforeRetry 1m4s). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Feb 26 15:45:23 crc kubenswrapper[4907]: I0226 15:45:23.125579 4907 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-zsb5l" Feb 26 15:45:23 crc kubenswrapper[4907]: E0226 15:45:23.125835 4907 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-zsb5l" podUID="fd06f422-2c09-4da9-843c-75525df52517" Feb 26 15:45:23 crc kubenswrapper[4907]: E0226 15:45:23.236462 4907 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" Feb 26 15:45:23 crc kubenswrapper[4907]: I0226 15:45:23.257383 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/fd06f422-2c09-4da9-843c-75525df52517-metrics-certs\") pod \"network-metrics-daemon-zsb5l\" (UID: \"fd06f422-2c09-4da9-843c-75525df52517\") " pod="openshift-multus/network-metrics-daemon-zsb5l" Feb 26 15:45:23 crc kubenswrapper[4907]: E0226 15:45:23.257714 4907 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Feb 26 15:45:23 crc kubenswrapper[4907]: E0226 15:45:23.257848 4907 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/fd06f422-2c09-4da9-843c-75525df52517-metrics-certs podName:fd06f422-2c09-4da9-843c-75525df52517 nodeName:}" failed. No retries permitted until 2026-02-26 15:46:27.257815042 +0000 UTC m=+249.776376931 (durationBeforeRetry 1m4s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/fd06f422-2c09-4da9-843c-75525df52517-metrics-certs") pod "network-metrics-daemon-zsb5l" (UID: "fd06f422-2c09-4da9-843c-75525df52517") : object "openshift-multus"/"metrics-daemon-secret" not registered Feb 26 15:45:24 crc kubenswrapper[4907]: I0226 15:45:24.126174 4907 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 26 15:45:24 crc kubenswrapper[4907]: E0226 15:45:24.126447 4907 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 26 15:45:24 crc kubenswrapper[4907]: I0226 15:45:24.126809 4907 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 26 15:45:24 crc kubenswrapper[4907]: I0226 15:45:24.128075 4907 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 26 15:45:24 crc kubenswrapper[4907]: E0226 15:45:24.128235 4907 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 26 15:45:24 crc kubenswrapper[4907]: E0226 15:45:24.128983 4907 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 26 15:45:25 crc kubenswrapper[4907]: I0226 15:45:25.126082 4907 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-zsb5l" Feb 26 15:45:25 crc kubenswrapper[4907]: E0226 15:45:25.126307 4907 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-zsb5l" podUID="fd06f422-2c09-4da9-843c-75525df52517" Feb 26 15:45:26 crc kubenswrapper[4907]: I0226 15:45:26.126662 4907 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 26 15:45:26 crc kubenswrapper[4907]: I0226 15:45:26.126738 4907 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 26 15:45:26 crc kubenswrapper[4907]: I0226 15:45:26.126994 4907 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 26 15:45:26 crc kubenswrapper[4907]: E0226 15:45:26.127977 4907 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 26 15:45:26 crc kubenswrapper[4907]: E0226 15:45:26.128240 4907 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 26 15:45:26 crc kubenswrapper[4907]: E0226 15:45:26.128829 4907 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 26 15:45:27 crc kubenswrapper[4907]: I0226 15:45:27.125802 4907 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-zsb5l" Feb 26 15:45:27 crc kubenswrapper[4907]: E0226 15:45:27.126324 4907 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-zsb5l" podUID="fd06f422-2c09-4da9-843c-75525df52517" Feb 26 15:45:27 crc kubenswrapper[4907]: I0226 15:45:27.273306 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 26 15:45:27 crc kubenswrapper[4907]: I0226 15:45:27.273370 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 26 15:45:27 crc kubenswrapper[4907]: I0226 15:45:27.273387 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 26 15:45:27 crc kubenswrapper[4907]: I0226 15:45:27.273412 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 26 15:45:27 crc kubenswrapper[4907]: I0226 15:45:27.273430 4907 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-26T15:45:27Z","lastTransitionTime":"2026-02-26T15:45:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 26 15:45:27 crc kubenswrapper[4907]: E0226 15:45:27.293054 4907 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"7800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"24148068Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"8\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"24608868Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-02-26T15:45:27Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-26T15:45:27Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-26T15:45:27Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-26T15:45:27Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-26T15:45:27Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-26T15:45:27Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-26T15:45:27Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-26T15:45:27Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"16aec221-b9ec-4b79-ac12-986d05cb9b8b\\\",\\\"systemUUID\\\":\\\"7af7b453-01c3-4b8b-8c30-b1df8ce070ce\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T15:45:27Z is after 2025-08-24T17:21:41Z" Feb 26 15:45:27 crc kubenswrapper[4907]: I0226 15:45:27.297140 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 26 15:45:27 crc kubenswrapper[4907]: I0226 15:45:27.297346 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 26 15:45:27 crc kubenswrapper[4907]: I0226 15:45:27.297517 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 26 15:45:27 crc kubenswrapper[4907]: I0226 15:45:27.297728 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 26 15:45:27 crc kubenswrapper[4907]: I0226 15:45:27.297921 4907 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-26T15:45:27Z","lastTransitionTime":"2026-02-26T15:45:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 26 15:45:27 crc kubenswrapper[4907]: E0226 15:45:27.320333 4907 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"7800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"24148068Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"8\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"24608868Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-02-26T15:45:27Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-26T15:45:27Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-26T15:45:27Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-26T15:45:27Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-26T15:45:27Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-26T15:45:27Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-26T15:45:27Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-26T15:45:27Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"16aec221-b9ec-4b79-ac12-986d05cb9b8b\\\",\\\"systemUUID\\\":\\\"7af7b453-01c3-4b8b-8c30-b1df8ce070ce\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T15:45:27Z is after 2025-08-24T17:21:41Z" Feb 26 15:45:27 crc kubenswrapper[4907]: I0226 15:45:27.325292 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 26 15:45:27 crc kubenswrapper[4907]: I0226 15:45:27.325527 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 26 15:45:27 crc kubenswrapper[4907]: I0226 15:45:27.325755 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 26 15:45:27 crc kubenswrapper[4907]: I0226 15:45:27.325954 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 26 15:45:27 crc kubenswrapper[4907]: I0226 15:45:27.326349 4907 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-26T15:45:27Z","lastTransitionTime":"2026-02-26T15:45:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 26 15:45:27 crc kubenswrapper[4907]: E0226 15:45:27.348829 4907 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"7800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"24148068Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"8\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"24608868Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-02-26T15:45:27Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-26T15:45:27Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-26T15:45:27Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-26T15:45:27Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-26T15:45:27Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-26T15:45:27Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-26T15:45:27Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-26T15:45:27Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"16aec221-b9ec-4b79-ac12-986d05cb9b8b\\\",\\\"systemUUID\\\":\\\"7af7b453-01c3-4b8b-8c30-b1df8ce070ce\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T15:45:27Z is after 2025-08-24T17:21:41Z" Feb 26 15:45:27 crc kubenswrapper[4907]: I0226 15:45:27.354136 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 26 15:45:27 crc kubenswrapper[4907]: I0226 15:45:27.354199 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 26 15:45:27 crc kubenswrapper[4907]: I0226 15:45:27.354220 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 26 15:45:27 crc kubenswrapper[4907]: I0226 15:45:27.354249 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 26 15:45:27 crc kubenswrapper[4907]: I0226 15:45:27.354265 4907 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-26T15:45:27Z","lastTransitionTime":"2026-02-26T15:45:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 26 15:45:27 crc kubenswrapper[4907]: E0226 15:45:27.376738 4907 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"7800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"24148068Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"8\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"24608868Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-02-26T15:45:27Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-26T15:45:27Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-26T15:45:27Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-26T15:45:27Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-26T15:45:27Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-26T15:45:27Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-26T15:45:27Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-26T15:45:27Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"16aec221-b9ec-4b79-ac12-986d05cb9b8b\\\",\\\"systemUUID\\\":\\\"7af7b453-01c3-4b8b-8c30-b1df8ce070ce\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T15:45:27Z is after 2025-08-24T17:21:41Z" Feb 26 15:45:27 crc kubenswrapper[4907]: I0226 15:45:27.382119 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 26 15:45:27 crc kubenswrapper[4907]: I0226 15:45:27.382176 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 26 15:45:27 crc kubenswrapper[4907]: I0226 15:45:27.382194 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 26 15:45:27 crc kubenswrapper[4907]: I0226 15:45:27.382216 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 26 15:45:27 crc kubenswrapper[4907]: I0226 15:45:27.382236 4907 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-26T15:45:27Z","lastTransitionTime":"2026-02-26T15:45:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 26 15:45:27 crc kubenswrapper[4907]: E0226 15:45:27.403450 4907 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"7800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"24148068Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"8\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"24608868Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-02-26T15:45:27Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-26T15:45:27Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-26T15:45:27Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-26T15:45:27Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-26T15:45:27Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-26T15:45:27Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-26T15:45:27Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-26T15:45:27Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"16aec221-b9ec-4b79-ac12-986d05cb9b8b\\\",\\\"systemUUID\\\":\\\"7af7b453-01c3-4b8b-8c30-b1df8ce070ce\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T15:45:27Z is after 2025-08-24T17:21:41Z" Feb 26 15:45:27 crc kubenswrapper[4907]: E0226 15:45:27.403716 4907 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Feb 26 15:45:28 crc kubenswrapper[4907]: I0226 15:45:28.126666 4907 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 26 15:45:28 crc kubenswrapper[4907]: I0226 15:45:28.126857 4907 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 26 15:45:28 crc kubenswrapper[4907]: E0226 15:45:28.126900 4907 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 26 15:45:28 crc kubenswrapper[4907]: I0226 15:45:28.126681 4907 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 26 15:45:28 crc kubenswrapper[4907]: E0226 15:45:28.127140 4907 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 26 15:45:28 crc kubenswrapper[4907]: E0226 15:45:28.127271 4907 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 26 15:45:28 crc kubenswrapper[4907]: I0226 15:45:28.129862 4907 scope.go:117] "RemoveContainer" containerID="a3c61b08bda7c918a3fa7b01e6f80515ee05a5746e189e829d2872c181b80c85" Feb 26 15:45:28 crc kubenswrapper[4907]: E0226 15:45:28.130366 4907 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-apiserver-check-endpoints\" with CrashLoopBackOff: \"back-off 1m20s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\"" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" Feb 26 15:45:28 crc kubenswrapper[4907]: I0226 15:45:28.153650 4907 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"87fcecd2-771a-4669-a303-2f74cf7ac919\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T15:42:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T15:42:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T15:43:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T15:43:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T15:42:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://aa0a0c55e7d739a2c76f82d2886d67e4aa4334b873445cb317782b057f7afa65\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T15:42:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ba401e1eedaa38b967c1b76dc8ee8221684e36e0f152a24131706adc0346bb2d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha
256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T15:42:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e44c81cef61f4aecc15b45d6bbb7f3552588a1f0256042998c5a2f158c3879c8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T15:42:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://81ae0a80fac56ae4b446c60d3478f3b6e4a448314ac78ad45840c7c09c232f0d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"conta
inerID\\\":\\\"cri-o://81ae0a80fac56ae4b446c60d3478f3b6e4a448314ac78ad45840c7c09c232f0d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-26T15:42:19Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-26T15:42:19Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-26T15:42:18Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T15:45:28Z is after 2025-08-24T17:21:41Z" Feb 26 15:45:28 crc kubenswrapper[4907]: I0226 15:45:28.188081 4907 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"087bfdc5-a69f-41c0-912b-10827f34927b\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T15:42:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T15:42:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T15:42:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T15:42:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T15:42:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2d6cb50daf3d05a3e4b4427361206adaeb990478e437b697db9a2716fbc0a3e0\\\",\\\"image\\\":\\\"quay.io/op
enshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T15:42:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7a65767b486307851169c93586cffb785a0977b0ca654dc7bc6fd38ce349d5f0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T15:42:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b642a813d8a9d885593d5dd495ed461119f14e1c1937844b64196bb55dd67e24\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T15:42:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://91e03a798a371431d5f0e490e8ffe260ea101ae6a41f56f9ee2d37c2ed255f1d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T15:42:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d111022be1d13de640f2fe6f3683455c1defed82f3c06fb63c8b84d2feea1182\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T15:42:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\
\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c3c8fe6e74e5efb27449fa26c2e705a62d8fb1b6f74e1ed787fbd7c37e711699\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c3c8fe6e74e5efb27449fa26c2e705a62d8fb1b6f74e1ed787fbd7c37e711699\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-26T15:42:19Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-26T15:42:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a0933fb54ef30c16899ff47ed6fa9c7836452ad420e970de1c0b7408c0bb3886\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a0933fb54ef30c16899ff47ed6fa9c7836452ad420e970de1c0b7408c0bb3886\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-26T15:42:20Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-26T15:42:20Z\\\"}}},{\\\"co
ntainerID\\\":\\\"cri-o://ca0bc422f98a960703843a6e090851bd3b091b08d31aeb875ef10cbb6e9c830a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ca0bc422f98a960703843a6e090851bd3b091b08d31aeb875ef10cbb6e9c830a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-26T15:42:21Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-26T15:42:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-26T15:42:18Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T15:45:28Z is after 2025-08-24T17:21:41Z" Feb 26 15:45:28 crc kubenswrapper[4907]: I0226 15:45:28.211002 4907 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"2e5aef55-fc68-4c1c-92e1-41a202917e84\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T15:42:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T15:42:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T15:43:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T15:43:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T15:42:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5033366771e6954e4bdd280702ad5d080a1306e8fbfa2e99a0221a3865c13ed6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://62c4450c857a205706fb8639ca0bf473be68a81f8e70a989080e74e6fb9795c8\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-02-26T15:42:50Z\\\",\\\"message\\\":\\\"+ timeout 3m /bin/bash -exuo pipefail -c 'while [ -n \\\\\\\"$(ss -Htanop \\\\\\\\( sport = 10357 \\\\\\\\))\\\\\\\" ]; do sleep 1; done'\\\\n++ ss -Htanop '(' sport = 10357 ')'\\\\n+ '[' -n '' ']'\\\\n+ exec cluster-policy-controller start --config=/etc/kubernetes/static-pod-resources/configmaps/cluster-policy-controller-config/config.yaml --kubeconfig=/etc/kubernetes/static-pod-resources/configmaps/controller-manager-kubeconfig/kubeconfig --namespace=openshift-kube-controller-manager 
-v=2\\\\nI0226 15:42:20.262653 1 leaderelection.go:121] The leader election gives 4 retries and allows for 30s of clock skew. The kube-apiserver downtime tolerance is 78s. Worst non-graceful lease acquisition is 2m43s. Worst graceful lease acquisition is {26s}.\\\\nI0226 15:42:20.264750 1 observer_polling.go:159] Starting file observer\\\\nI0226 15:42:20.297295 1 builder.go:298] cluster-policy-controller version 4.18.0-202501230001.p0.g5fd8525.assembly.stream.el9-5fd8525-5fd852525909ce6eab52972ba9ce8fcf56528eb9\\\\nI0226 15:42:20.301511 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/etc/kubernetes/static-pod-resources/secrets/serving-cert/tls.crt::/etc/kubernetes/static-pod-resources/secrets/serving-cert/tls.key\\\\\\\"\\\\nF0226 15:42:50.781187 1 cmd.go:179] failed checking apiserver connectivity: Get \\\\\\\"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/openshift-kube-controller-manager/leases/cluster-policy-controller-lock\\\\\\\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T15:42:49Z is after 
2026-02-23T05:33:13Z\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-26T15:42:19Z\\\"}},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T15:42:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b4592db3d17945a9ed96383e96902333033b03f395da93754ffbca7d15b1e633\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T15:42:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ac01de0d4759557a4502a3c742ecae613068311f796904e35769463f9a277620\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T15:42:19Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4e11dad962ef019f41cac623fb986f909a7c58377cd8d52e58ec300f7cc4cbb2\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T15:42:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-26T15:42:18Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T15:45:28Z is after 2025-08-24T17:21:41Z" Feb 26 15:45:28 crc kubenswrapper[4907]: E0226 15:45:28.238166 4907 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in 
/etc/kubernetes/cni/net.d/. Has your network provider started?" Feb 26 15:45:28 crc kubenswrapper[4907]: I0226 15:45:28.287156 4907 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-s9f9w" podStartSLOduration=133.287132277 podStartE2EDuration="2m13.287132277s" podCreationTimestamp="2026-02-26 15:43:15 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-26 15:45:28.265993756 +0000 UTC m=+190.784555615" watchObservedRunningTime="2026-02-26 15:45:28.287132277 +0000 UTC m=+190.805694156" Feb 26 15:45:28 crc kubenswrapper[4907]: I0226 15:45:28.359913 4907 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/machine-config-daemon-v5ng6" podStartSLOduration=133.359892751 podStartE2EDuration="2m13.359892751s" podCreationTimestamp="2026-02-26 15:43:15 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-26 15:45:28.359798398 +0000 UTC m=+190.878360277" watchObservedRunningTime="2026-02-26 15:45:28.359892751 +0000 UTC m=+190.878454610" Feb 26 15:45:28 crc kubenswrapper[4907]: I0226 15:45:28.360083 4907 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/multus-additional-cni-plugins-b2qgz" podStartSLOduration=133.360077945 podStartE2EDuration="2m13.360077945s" podCreationTimestamp="2026-02-26 15:43:15 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-26 15:45:28.344718817 +0000 UTC m=+190.863280706" watchObservedRunningTime="2026-02-26 15:45:28.360077945 +0000 UTC m=+190.878639804" Feb 26 15:45:28 crc kubenswrapper[4907]: I0226 15:45:28.427929 4907 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" podStartSLOduration=24.427902088 podStartE2EDuration="24.427902088s" podCreationTimestamp="2026-02-26 15:45:04 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-26 15:45:28.408162033 +0000 UTC m=+190.926723892" watchObservedRunningTime="2026-02-26 15:45:28.427902088 +0000 UTC m=+190.946463947" Feb 26 15:45:28 crc kubenswrapper[4907]: I0226 15:45:28.461950 4907 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/node-ca-9gtgp" podStartSLOduration=134.461923822 podStartE2EDuration="2m14.461923822s" podCreationTimestamp="2026-02-26 15:43:14 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-26 15:45:28.459315433 +0000 UTC m=+190.977877322" watchObservedRunningTime="2026-02-26 15:45:28.461923822 +0000 UTC m=+190.980485711" Feb 26 15:45:28 crc kubenswrapper[4907]: I0226 15:45:28.501439 4907 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/multus-2gl5t" podStartSLOduration=133.501415191 podStartE2EDuration="2m13.501415191s" podCreationTimestamp="2026-02-26 15:43:15 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-26 15:45:28.475445122 +0000 UTC m=+190.994006981" watchObservedRunningTime="2026-02-26 15:45:28.501415191 +0000 UTC m=+191.019977060" Feb 26 15:45:28 crc kubenswrapper[4907]: I0226 15:45:28.534851 4907 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-dns/node-resolver-958vt" podStartSLOduration=134.53483208 podStartE2EDuration="2m14.53483208s" podCreationTimestamp="2026-02-26 15:43:14 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" 
observedRunningTime="2026-02-26 15:45:28.534646545 +0000 UTC m=+191.053208424" watchObservedRunningTime="2026-02-26 15:45:28.53483208 +0000 UTC m=+191.053393929" Feb 26 15:45:29 crc kubenswrapper[4907]: I0226 15:45:29.125661 4907 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-zsb5l" Feb 26 15:45:29 crc kubenswrapper[4907]: E0226 15:45:29.125905 4907 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-zsb5l" podUID="fd06f422-2c09-4da9-843c-75525df52517" Feb 26 15:45:30 crc kubenswrapper[4907]: I0226 15:45:30.126294 4907 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 26 15:45:30 crc kubenswrapper[4907]: I0226 15:45:30.126441 4907 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 26 15:45:30 crc kubenswrapper[4907]: I0226 15:45:30.126510 4907 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 26 15:45:30 crc kubenswrapper[4907]: E0226 15:45:30.127108 4907 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 26 15:45:30 crc kubenswrapper[4907]: E0226 15:45:30.127298 4907 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 26 15:45:30 crc kubenswrapper[4907]: E0226 15:45:30.127436 4907 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 26 15:45:31 crc kubenswrapper[4907]: I0226 15:45:31.126550 4907 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-zsb5l" Feb 26 15:45:31 crc kubenswrapper[4907]: E0226 15:45:31.126773 4907 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-zsb5l" podUID="fd06f422-2c09-4da9-843c-75525df52517" Feb 26 15:45:31 crc kubenswrapper[4907]: I0226 15:45:31.128122 4907 scope.go:117] "RemoveContainer" containerID="51787a0de7c6993ba3bfd70265cc1718966209053e9703f5fc5b039f3d78abae" Feb 26 15:45:31 crc kubenswrapper[4907]: E0226 15:45:31.128411 4907 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 40s restarting failed container=ovnkube-controller pod=ovnkube-node-vsvsw_openshift-ovn-kubernetes(49ee65e1-8667-4ad7-a403-c899f0cc6a70)\"" pod="openshift-ovn-kubernetes/ovnkube-node-vsvsw" podUID="49ee65e1-8667-4ad7-a403-c899f0cc6a70" Feb 26 15:45:32 crc kubenswrapper[4907]: I0226 15:45:32.125834 4907 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 26 15:45:32 crc kubenswrapper[4907]: E0226 15:45:32.125981 4907 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 26 15:45:32 crc kubenswrapper[4907]: I0226 15:45:32.126069 4907 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 26 15:45:32 crc kubenswrapper[4907]: E0226 15:45:32.126248 4907 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 26 15:45:32 crc kubenswrapper[4907]: I0226 15:45:32.126427 4907 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 26 15:45:32 crc kubenswrapper[4907]: E0226 15:45:32.126744 4907 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 26 15:45:33 crc kubenswrapper[4907]: I0226 15:45:33.126210 4907 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-zsb5l" Feb 26 15:45:33 crc kubenswrapper[4907]: E0226 15:45:33.126840 4907 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-zsb5l" podUID="fd06f422-2c09-4da9-843c-75525df52517" Feb 26 15:45:33 crc kubenswrapper[4907]: E0226 15:45:33.239473 4907 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" Feb 26 15:45:34 crc kubenswrapper[4907]: I0226 15:45:34.126469 4907 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 26 15:45:34 crc kubenswrapper[4907]: I0226 15:45:34.126512 4907 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 26 15:45:34 crc kubenswrapper[4907]: I0226 15:45:34.126470 4907 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 26 15:45:34 crc kubenswrapper[4907]: E0226 15:45:34.126661 4907 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 26 15:45:34 crc kubenswrapper[4907]: E0226 15:45:34.126716 4907 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 26 15:45:34 crc kubenswrapper[4907]: E0226 15:45:34.126800 4907 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 26 15:45:35 crc kubenswrapper[4907]: I0226 15:45:35.126285 4907 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-zsb5l" Feb 26 15:45:35 crc kubenswrapper[4907]: E0226 15:45:35.126833 4907 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-zsb5l" podUID="fd06f422-2c09-4da9-843c-75525df52517" Feb 26 15:45:36 crc kubenswrapper[4907]: I0226 15:45:36.125760 4907 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 26 15:45:36 crc kubenswrapper[4907]: I0226 15:45:36.125760 4907 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 26 15:45:36 crc kubenswrapper[4907]: E0226 15:45:36.125994 4907 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 26 15:45:36 crc kubenswrapper[4907]: E0226 15:45:36.126141 4907 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 26 15:45:36 crc kubenswrapper[4907]: I0226 15:45:36.126789 4907 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 26 15:45:36 crc kubenswrapper[4907]: E0226 15:45:36.126996 4907 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 26 15:45:37 crc kubenswrapper[4907]: I0226 15:45:37.126320 4907 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-zsb5l" Feb 26 15:45:37 crc kubenswrapper[4907]: E0226 15:45:37.126534 4907 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-zsb5l" podUID="fd06f422-2c09-4da9-843c-75525df52517" Feb 26 15:45:37 crc kubenswrapper[4907]: I0226 15:45:37.504676 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 26 15:45:37 crc kubenswrapper[4907]: I0226 15:45:37.504999 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 26 15:45:37 crc kubenswrapper[4907]: I0226 15:45:37.505101 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 26 15:45:37 crc kubenswrapper[4907]: I0226 15:45:37.505203 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 26 15:45:37 crc kubenswrapper[4907]: I0226 15:45:37.505289 4907 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-26T15:45:37Z","lastTransitionTime":"2026-02-26T15:45:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 26 15:45:37 crc kubenswrapper[4907]: I0226 15:45:37.545072 4907 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-cluster-version/cluster-version-operator-5c965bbfc6-hv82g"] Feb 26 15:45:37 crc kubenswrapper[4907]: I0226 15:45:37.545624 4907 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-hv82g" Feb 26 15:45:37 crc kubenswrapper[4907]: I0226 15:45:37.549972 4907 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-version"/"openshift-service-ca.crt" Feb 26 15:45:37 crc kubenswrapper[4907]: I0226 15:45:37.550360 4907 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-version"/"default-dockercfg-gxtc4" Feb 26 15:45:37 crc kubenswrapper[4907]: I0226 15:45:37.550639 4907 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-version"/"kube-root-ca.crt" Feb 26 15:45:37 crc kubenswrapper[4907]: I0226 15:45:37.551330 4907 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-version"/"cluster-version-operator-serving-cert" Feb 26 15:45:37 crc kubenswrapper[4907]: I0226 15:45:37.562691 4907 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" podStartSLOduration=53.562658803 podStartE2EDuration="53.562658803s" podCreationTimestamp="2026-02-26 15:44:44 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-26 15:45:37.56252477 +0000 UTC m=+200.081086619" watchObservedRunningTime="2026-02-26 15:45:37.562658803 +0000 UTC m=+200.081220642" Feb 26 15:45:37 crc kubenswrapper[4907]: I0226 15:45:37.586876 4907 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-etcd/etcd-crc" podStartSLOduration=28.586860007 podStartE2EDuration="28.586860007s" podCreationTimestamp="2026-02-26 15:45:09 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-26 15:45:37.586693233 +0000 UTC m=+200.105255082" watchObservedRunningTime="2026-02-26 15:45:37.586860007 +0000 UTC m=+200.105421856" Feb 26 
15:45:37 crc kubenswrapper[4907]: I0226 15:45:37.603187 4907 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-controller-manager/kube-controller-manager-crc" podStartSLOduration=55.603169975 podStartE2EDuration="55.603169975s" podCreationTimestamp="2026-02-26 15:44:42 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-26 15:45:37.602385836 +0000 UTC m=+200.120947695" watchObservedRunningTime="2026-02-26 15:45:37.603169975 +0000 UTC m=+200.121731824" Feb 26 15:45:37 crc kubenswrapper[4907]: I0226 15:45:37.625486 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/afe3fd33-6423-410c-85dc-b040976b8eed-serving-cert\") pod \"cluster-version-operator-5c965bbfc6-hv82g\" (UID: \"afe3fd33-6423-410c-85dc-b040976b8eed\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-hv82g" Feb 26 15:45:37 crc kubenswrapper[4907]: I0226 15:45:37.625544 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/afe3fd33-6423-410c-85dc-b040976b8eed-kube-api-access\") pod \"cluster-version-operator-5c965bbfc6-hv82g\" (UID: \"afe3fd33-6423-410c-85dc-b040976b8eed\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-hv82g" Feb 26 15:45:37 crc kubenswrapper[4907]: I0226 15:45:37.625569 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-cvo-updatepayloads\" (UniqueName: \"kubernetes.io/host-path/afe3fd33-6423-410c-85dc-b040976b8eed-etc-cvo-updatepayloads\") pod \"cluster-version-operator-5c965bbfc6-hv82g\" (UID: \"afe3fd33-6423-410c-85dc-b040976b8eed\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-hv82g" Feb 26 15:45:37 crc kubenswrapper[4907]: I0226 
15:45:37.625585 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-ssl-certs\" (UniqueName: \"kubernetes.io/host-path/afe3fd33-6423-410c-85dc-b040976b8eed-etc-ssl-certs\") pod \"cluster-version-operator-5c965bbfc6-hv82g\" (UID: \"afe3fd33-6423-410c-85dc-b040976b8eed\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-hv82g" Feb 26 15:45:37 crc kubenswrapper[4907]: I0226 15:45:37.625665 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/afe3fd33-6423-410c-85dc-b040976b8eed-service-ca\") pod \"cluster-version-operator-5c965bbfc6-hv82g\" (UID: \"afe3fd33-6423-410c-85dc-b040976b8eed\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-hv82g" Feb 26 15:45:37 crc kubenswrapper[4907]: I0226 15:45:37.727144 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-cvo-updatepayloads\" (UniqueName: \"kubernetes.io/host-path/afe3fd33-6423-410c-85dc-b040976b8eed-etc-cvo-updatepayloads\") pod \"cluster-version-operator-5c965bbfc6-hv82g\" (UID: \"afe3fd33-6423-410c-85dc-b040976b8eed\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-hv82g" Feb 26 15:45:37 crc kubenswrapper[4907]: I0226 15:45:37.727565 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/afe3fd33-6423-410c-85dc-b040976b8eed-service-ca\") pod \"cluster-version-operator-5c965bbfc6-hv82g\" (UID: \"afe3fd33-6423-410c-85dc-b040976b8eed\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-hv82g" Feb 26 15:45:37 crc kubenswrapper[4907]: I0226 15:45:37.728764 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-ssl-certs\" (UniqueName: \"kubernetes.io/host-path/afe3fd33-6423-410c-85dc-b040976b8eed-etc-ssl-certs\") pod 
\"cluster-version-operator-5c965bbfc6-hv82g\" (UID: \"afe3fd33-6423-410c-85dc-b040976b8eed\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-hv82g" Feb 26 15:45:37 crc kubenswrapper[4907]: I0226 15:45:37.727512 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-cvo-updatepayloads\" (UniqueName: \"kubernetes.io/host-path/afe3fd33-6423-410c-85dc-b040976b8eed-etc-cvo-updatepayloads\") pod \"cluster-version-operator-5c965bbfc6-hv82g\" (UID: \"afe3fd33-6423-410c-85dc-b040976b8eed\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-hv82g" Feb 26 15:45:37 crc kubenswrapper[4907]: I0226 15:45:37.728864 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/afe3fd33-6423-410c-85dc-b040976b8eed-serving-cert\") pod \"cluster-version-operator-5c965bbfc6-hv82g\" (UID: \"afe3fd33-6423-410c-85dc-b040976b8eed\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-hv82g" Feb 26 15:45:37 crc kubenswrapper[4907]: I0226 15:45:37.728920 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-ssl-certs\" (UniqueName: \"kubernetes.io/host-path/afe3fd33-6423-410c-85dc-b040976b8eed-etc-ssl-certs\") pod \"cluster-version-operator-5c965bbfc6-hv82g\" (UID: \"afe3fd33-6423-410c-85dc-b040976b8eed\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-hv82g" Feb 26 15:45:37 crc kubenswrapper[4907]: I0226 15:45:37.728946 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/afe3fd33-6423-410c-85dc-b040976b8eed-kube-api-access\") pod \"cluster-version-operator-5c965bbfc6-hv82g\" (UID: \"afe3fd33-6423-410c-85dc-b040976b8eed\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-hv82g" Feb 26 15:45:37 crc kubenswrapper[4907]: I0226 15:45:37.729547 4907 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/afe3fd33-6423-410c-85dc-b040976b8eed-service-ca\") pod \"cluster-version-operator-5c965bbfc6-hv82g\" (UID: \"afe3fd33-6423-410c-85dc-b040976b8eed\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-hv82g" Feb 26 15:45:37 crc kubenswrapper[4907]: I0226 15:45:37.735280 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/afe3fd33-6423-410c-85dc-b040976b8eed-serving-cert\") pod \"cluster-version-operator-5c965bbfc6-hv82g\" (UID: \"afe3fd33-6423-410c-85dc-b040976b8eed\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-hv82g" Feb 26 15:45:37 crc kubenswrapper[4907]: I0226 15:45:37.749396 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/afe3fd33-6423-410c-85dc-b040976b8eed-kube-api-access\") pod \"cluster-version-operator-5c965bbfc6-hv82g\" (UID: \"afe3fd33-6423-410c-85dc-b040976b8eed\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-hv82g" Feb 26 15:45:37 crc kubenswrapper[4907]: I0226 15:45:37.860463 4907 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-hv82g" Feb 26 15:45:38 crc kubenswrapper[4907]: I0226 15:45:38.093175 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-hv82g" event={"ID":"afe3fd33-6423-410c-85dc-b040976b8eed","Type":"ContainerStarted","Data":"0a25228a7142713736189df7335176791569cf383c8d0bf2fe0a2b2379dac1ac"} Feb 26 15:45:38 crc kubenswrapper[4907]: I0226 15:45:38.093250 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-hv82g" event={"ID":"afe3fd33-6423-410c-85dc-b040976b8eed","Type":"ContainerStarted","Data":"1d750e014735eec5366d50d5c29e6062135a891d3c8a5727de2013f9219d7892"} Feb 26 15:45:38 crc kubenswrapper[4907]: I0226 15:45:38.117878 4907 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-hv82g" podStartSLOduration=143.117849385 podStartE2EDuration="2m23.117849385s" podCreationTimestamp="2026-02-26 15:43:15 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-26 15:45:38.116469842 +0000 UTC m=+200.635031731" watchObservedRunningTime="2026-02-26 15:45:38.117849385 +0000 UTC m=+200.636411304" Feb 26 15:45:38 crc kubenswrapper[4907]: I0226 15:45:38.126455 4907 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 26 15:45:38 crc kubenswrapper[4907]: E0226 15:45:38.128326 4907 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 26 15:45:38 crc kubenswrapper[4907]: I0226 15:45:38.128765 4907 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 26 15:45:38 crc kubenswrapper[4907]: I0226 15:45:38.128842 4907 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 26 15:45:38 crc kubenswrapper[4907]: E0226 15:45:38.128966 4907 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 26 15:45:38 crc kubenswrapper[4907]: E0226 15:45:38.129292 4907 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 26 15:45:38 crc kubenswrapper[4907]: I0226 15:45:38.198473 4907 certificate_manager.go:356] kubernetes.io/kubelet-serving: Rotating certificates Feb 26 15:45:38 crc kubenswrapper[4907]: I0226 15:45:38.208968 4907 reflector.go:368] Caches populated for *v1.CertificateSigningRequest from k8s.io/client-go/tools/watch/informerwatcher.go:146 Feb 26 15:45:38 crc kubenswrapper[4907]: E0226 15:45:38.240493 4907 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" Feb 26 15:45:39 crc kubenswrapper[4907]: I0226 15:45:39.125939 4907 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-zsb5l" Feb 26 15:45:39 crc kubenswrapper[4907]: E0226 15:45:39.126145 4907 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-zsb5l" podUID="fd06f422-2c09-4da9-843c-75525df52517" Feb 26 15:45:40 crc kubenswrapper[4907]: I0226 15:45:40.125853 4907 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 26 15:45:40 crc kubenswrapper[4907]: I0226 15:45:40.125907 4907 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 26 15:45:40 crc kubenswrapper[4907]: E0226 15:45:40.126027 4907 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 26 15:45:40 crc kubenswrapper[4907]: I0226 15:45:40.126060 4907 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 26 15:45:40 crc kubenswrapper[4907]: E0226 15:45:40.126125 4907 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 26 15:45:40 crc kubenswrapper[4907]: E0226 15:45:40.126192 4907 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 26 15:45:41 crc kubenswrapper[4907]: I0226 15:45:41.127436 4907 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-zsb5l" Feb 26 15:45:41 crc kubenswrapper[4907]: E0226 15:45:41.127579 4907 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-zsb5l" podUID="fd06f422-2c09-4da9-843c-75525df52517" Feb 26 15:45:42 crc kubenswrapper[4907]: I0226 15:45:42.125706 4907 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 26 15:45:42 crc kubenswrapper[4907]: I0226 15:45:42.125831 4907 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 26 15:45:42 crc kubenswrapper[4907]: I0226 15:45:42.125707 4907 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 26 15:45:42 crc kubenswrapper[4907]: E0226 15:45:42.125827 4907 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 26 15:45:42 crc kubenswrapper[4907]: E0226 15:45:42.125898 4907 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 26 15:45:42 crc kubenswrapper[4907]: E0226 15:45:42.126046 4907 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 26 15:45:43 crc kubenswrapper[4907]: I0226 15:45:43.126273 4907 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-zsb5l" Feb 26 15:45:43 crc kubenswrapper[4907]: E0226 15:45:43.126788 4907 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-zsb5l" podUID="fd06f422-2c09-4da9-843c-75525df52517" Feb 26 15:45:43 crc kubenswrapper[4907]: I0226 15:45:43.126935 4907 scope.go:117] "RemoveContainer" containerID="a3c61b08bda7c918a3fa7b01e6f80515ee05a5746e189e829d2872c181b80c85" Feb 26 15:45:43 crc kubenswrapper[4907]: E0226 15:45:43.241685 4907 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
Feb 26 15:45:44 crc kubenswrapper[4907]: I0226 15:45:44.114697 4907 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/4.log" Feb 26 15:45:44 crc kubenswrapper[4907]: I0226 15:45:44.116846 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"3cfea1638c0926e3aba947161db48db309efe614e7b082a3896c2c6cfc93ffb7"} Feb 26 15:45:44 crc kubenswrapper[4907]: I0226 15:45:44.117351 4907 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-apiserver/kube-apiserver-crc" Feb 26 15:45:44 crc kubenswrapper[4907]: I0226 15:45:44.126175 4907 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 26 15:45:44 crc kubenswrapper[4907]: I0226 15:45:44.126281 4907 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 26 15:45:44 crc kubenswrapper[4907]: E0226 15:45:44.126421 4907 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 26 15:45:44 crc kubenswrapper[4907]: I0226 15:45:44.126508 4907 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 26 15:45:44 crc kubenswrapper[4907]: E0226 15:45:44.126724 4907 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 26 15:45:44 crc kubenswrapper[4907]: E0226 15:45:44.126772 4907 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 26 15:45:44 crc kubenswrapper[4907]: I0226 15:45:44.127768 4907 scope.go:117] "RemoveContainer" containerID="51787a0de7c6993ba3bfd70265cc1718966209053e9703f5fc5b039f3d78abae" Feb 26 15:45:44 crc kubenswrapper[4907]: E0226 15:45:44.128014 4907 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 40s restarting failed container=ovnkube-controller pod=ovnkube-node-vsvsw_openshift-ovn-kubernetes(49ee65e1-8667-4ad7-a403-c899f0cc6a70)\"" pod="openshift-ovn-kubernetes/ovnkube-node-vsvsw" podUID="49ee65e1-8667-4ad7-a403-c899f0cc6a70" Feb 26 15:45:44 crc kubenswrapper[4907]: I0226 15:45:44.150617 4907 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-apiserver/kube-apiserver-crc" podStartSLOduration=87.15057572 podStartE2EDuration="1m27.15057572s" podCreationTimestamp="2026-02-26 15:44:17 +0000 
UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-26 15:45:44.150343585 +0000 UTC m=+206.668905444" watchObservedRunningTime="2026-02-26 15:45:44.15057572 +0000 UTC m=+206.669137609" Feb 26 15:45:45 crc kubenswrapper[4907]: I0226 15:45:45.126497 4907 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-zsb5l" Feb 26 15:45:45 crc kubenswrapper[4907]: E0226 15:45:45.126767 4907 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-zsb5l" podUID="fd06f422-2c09-4da9-843c-75525df52517" Feb 26 15:45:46 crc kubenswrapper[4907]: I0226 15:45:46.126270 4907 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 26 15:45:46 crc kubenswrapper[4907]: I0226 15:45:46.126357 4907 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 26 15:45:46 crc kubenswrapper[4907]: I0226 15:45:46.126288 4907 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 26 15:45:46 crc kubenswrapper[4907]: E0226 15:45:46.126475 4907 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 26 15:45:46 crc kubenswrapper[4907]: E0226 15:45:46.126666 4907 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 26 15:45:46 crc kubenswrapper[4907]: E0226 15:45:46.126820 4907 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 26 15:45:47 crc kubenswrapper[4907]: I0226 15:45:47.125814 4907 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-zsb5l" Feb 26 15:45:47 crc kubenswrapper[4907]: E0226 15:45:47.126035 4907 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-zsb5l" podUID="fd06f422-2c09-4da9-843c-75525df52517" Feb 26 15:45:48 crc kubenswrapper[4907]: I0226 15:45:48.125662 4907 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 26 15:45:48 crc kubenswrapper[4907]: I0226 15:45:48.125667 4907 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 26 15:45:48 crc kubenswrapper[4907]: I0226 15:45:48.125734 4907 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 26 15:45:48 crc kubenswrapper[4907]: E0226 15:45:48.127939 4907 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 26 15:45:48 crc kubenswrapper[4907]: E0226 15:45:48.128171 4907 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 26 15:45:48 crc kubenswrapper[4907]: E0226 15:45:48.128254 4907 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 26 15:45:48 crc kubenswrapper[4907]: E0226 15:45:48.242726 4907 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" Feb 26 15:45:49 crc kubenswrapper[4907]: I0226 15:45:49.126344 4907 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-zsb5l" Feb 26 15:45:49 crc kubenswrapper[4907]: E0226 15:45:49.126879 4907 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-zsb5l" podUID="fd06f422-2c09-4da9-843c-75525df52517" Feb 26 15:45:50 crc kubenswrapper[4907]: I0226 15:45:50.125720 4907 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 26 15:45:50 crc kubenswrapper[4907]: I0226 15:45:50.125759 4907 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 26 15:45:50 crc kubenswrapper[4907]: E0226 15:45:50.125865 4907 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 26 15:45:50 crc kubenswrapper[4907]: E0226 15:45:50.125938 4907 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 26 15:45:50 crc kubenswrapper[4907]: I0226 15:45:50.125751 4907 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 26 15:45:50 crc kubenswrapper[4907]: E0226 15:45:50.126440 4907 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 26 15:45:51 crc kubenswrapper[4907]: I0226 15:45:51.125859 4907 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-zsb5l" Feb 26 15:45:51 crc kubenswrapper[4907]: E0226 15:45:51.126108 4907 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-zsb5l" podUID="fd06f422-2c09-4da9-843c-75525df52517" Feb 26 15:45:52 crc kubenswrapper[4907]: I0226 15:45:52.125975 4907 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 26 15:45:52 crc kubenswrapper[4907]: I0226 15:45:52.125986 4907 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 26 15:45:52 crc kubenswrapper[4907]: E0226 15:45:52.126411 4907 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 26 15:45:52 crc kubenswrapper[4907]: I0226 15:45:52.126761 4907 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 26 15:45:52 crc kubenswrapper[4907]: E0226 15:45:52.126289 4907 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 26 15:45:52 crc kubenswrapper[4907]: E0226 15:45:52.126952 4907 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 26 15:45:53 crc kubenswrapper[4907]: I0226 15:45:53.125703 4907 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-zsb5l" Feb 26 15:45:53 crc kubenswrapper[4907]: E0226 15:45:53.125906 4907 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-zsb5l" podUID="fd06f422-2c09-4da9-843c-75525df52517" Feb 26 15:45:53 crc kubenswrapper[4907]: I0226 15:45:53.150651 4907 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-2gl5t_51024bd5-00ff-4e2f-927c-8c989b59d7be/kube-multus/1.log" Feb 26 15:45:53 crc kubenswrapper[4907]: I0226 15:45:53.151486 4907 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-2gl5t_51024bd5-00ff-4e2f-927c-8c989b59d7be/kube-multus/0.log" Feb 26 15:45:53 crc kubenswrapper[4907]: I0226 15:45:53.151574 4907 generic.go:334] "Generic (PLEG): container finished" podID="51024bd5-00ff-4e2f-927c-8c989b59d7be" containerID="e822f482000a6645405c4c5b3b74d28302ababcc6de59c9d2f392c08d1fd092f" exitCode=1 Feb 26 15:45:53 crc kubenswrapper[4907]: I0226 15:45:53.151666 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-2gl5t" event={"ID":"51024bd5-00ff-4e2f-927c-8c989b59d7be","Type":"ContainerDied","Data":"e822f482000a6645405c4c5b3b74d28302ababcc6de59c9d2f392c08d1fd092f"} Feb 26 15:45:53 crc kubenswrapper[4907]: I0226 15:45:53.151740 4907 scope.go:117] "RemoveContainer" containerID="9a3cdc02208e8eab1e0c3c3f08a0759873ebfd63c98e64af187800d59a5b44da" Feb 26 15:45:53 crc kubenswrapper[4907]: I0226 15:45:53.152392 4907 scope.go:117] "RemoveContainer" containerID="e822f482000a6645405c4c5b3b74d28302ababcc6de59c9d2f392c08d1fd092f" Feb 26 15:45:53 crc kubenswrapper[4907]: E0226 15:45:53.152744 4907 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-multus\" with CrashLoopBackOff: \"back-off 10s restarting failed container=kube-multus pod=multus-2gl5t_openshift-multus(51024bd5-00ff-4e2f-927c-8c989b59d7be)\"" pod="openshift-multus/multus-2gl5t" podUID="51024bd5-00ff-4e2f-927c-8c989b59d7be" Feb 26 15:45:53 crc kubenswrapper[4907]: E0226 15:45:53.244541 4907 kubelet.go:2916] "Container runtime network not ready" 
networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" Feb 26 15:45:54 crc kubenswrapper[4907]: I0226 15:45:54.125868 4907 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 26 15:45:54 crc kubenswrapper[4907]: I0226 15:45:54.125887 4907 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 26 15:45:54 crc kubenswrapper[4907]: I0226 15:45:54.125887 4907 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 26 15:45:54 crc kubenswrapper[4907]: E0226 15:45:54.126261 4907 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 26 15:45:54 crc kubenswrapper[4907]: E0226 15:45:54.126066 4907 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 26 15:45:54 crc kubenswrapper[4907]: E0226 15:45:54.126707 4907 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 26 15:45:54 crc kubenswrapper[4907]: I0226 15:45:54.157909 4907 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-2gl5t_51024bd5-00ff-4e2f-927c-8c989b59d7be/kube-multus/1.log" Feb 26 15:45:55 crc kubenswrapper[4907]: I0226 15:45:55.126504 4907 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-zsb5l" Feb 26 15:45:55 crc kubenswrapper[4907]: E0226 15:45:55.126774 4907 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-zsb5l" podUID="fd06f422-2c09-4da9-843c-75525df52517" Feb 26 15:45:56 crc kubenswrapper[4907]: I0226 15:45:56.126072 4907 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 26 15:45:56 crc kubenswrapper[4907]: I0226 15:45:56.126110 4907 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 26 15:45:56 crc kubenswrapper[4907]: I0226 15:45:56.126265 4907 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 26 15:45:56 crc kubenswrapper[4907]: E0226 15:45:56.126351 4907 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 26 15:45:56 crc kubenswrapper[4907]: E0226 15:45:56.126527 4907 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 26 15:45:56 crc kubenswrapper[4907]: E0226 15:45:56.127386 4907 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 26 15:45:56 crc kubenswrapper[4907]: I0226 15:45:56.127838 4907 scope.go:117] "RemoveContainer" containerID="51787a0de7c6993ba3bfd70265cc1718966209053e9703f5fc5b039f3d78abae" Feb 26 15:45:56 crc kubenswrapper[4907]: E0226 15:45:56.128172 4907 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 40s restarting failed container=ovnkube-controller pod=ovnkube-node-vsvsw_openshift-ovn-kubernetes(49ee65e1-8667-4ad7-a403-c899f0cc6a70)\"" pod="openshift-ovn-kubernetes/ovnkube-node-vsvsw" podUID="49ee65e1-8667-4ad7-a403-c899f0cc6a70" Feb 26 15:45:57 crc kubenswrapper[4907]: I0226 15:45:57.125954 4907 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-zsb5l" Feb 26 15:45:57 crc kubenswrapper[4907]: E0226 15:45:57.126154 4907 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-zsb5l" podUID="fd06f422-2c09-4da9-843c-75525df52517" Feb 26 15:45:57 crc kubenswrapper[4907]: I0226 15:45:57.163673 4907 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-apiserver/kube-apiserver-crc" Feb 26 15:45:58 crc kubenswrapper[4907]: I0226 15:45:58.126408 4907 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 26 15:45:58 crc kubenswrapper[4907]: I0226 15:45:58.126486 4907 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 26 15:45:58 crc kubenswrapper[4907]: I0226 15:45:58.126451 4907 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 26 15:45:58 crc kubenswrapper[4907]: E0226 15:45:58.129914 4907 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 26 15:45:58 crc kubenswrapper[4907]: E0226 15:45:58.129807 4907 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 26 15:45:58 crc kubenswrapper[4907]: E0226 15:45:58.130170 4907 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 26 15:45:58 crc kubenswrapper[4907]: E0226 15:45:58.245344 4907 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" Feb 26 15:45:59 crc kubenswrapper[4907]: I0226 15:45:59.126367 4907 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-zsb5l" Feb 26 15:45:59 crc kubenswrapper[4907]: E0226 15:45:59.126639 4907 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-zsb5l" podUID="fd06f422-2c09-4da9-843c-75525df52517" Feb 26 15:46:00 crc kubenswrapper[4907]: I0226 15:46:00.129926 4907 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 26 15:46:00 crc kubenswrapper[4907]: E0226 15:46:00.130061 4907 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 26 15:46:00 crc kubenswrapper[4907]: I0226 15:46:00.129935 4907 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 26 15:46:00 crc kubenswrapper[4907]: E0226 15:46:00.130154 4907 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 26 15:46:00 crc kubenswrapper[4907]: I0226 15:46:00.130408 4907 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 26 15:46:00 crc kubenswrapper[4907]: E0226 15:46:00.130467 4907 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 26 15:46:01 crc kubenswrapper[4907]: I0226 15:46:01.126046 4907 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-zsb5l" Feb 26 15:46:01 crc kubenswrapper[4907]: E0226 15:46:01.126272 4907 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-zsb5l" podUID="fd06f422-2c09-4da9-843c-75525df52517" Feb 26 15:46:02 crc kubenswrapper[4907]: I0226 15:46:02.126555 4907 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 26 15:46:02 crc kubenswrapper[4907]: I0226 15:46:02.126693 4907 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 26 15:46:02 crc kubenswrapper[4907]: I0226 15:46:02.126630 4907 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 26 15:46:02 crc kubenswrapper[4907]: E0226 15:46:02.126870 4907 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 26 15:46:02 crc kubenswrapper[4907]: E0226 15:46:02.126986 4907 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 26 15:46:02 crc kubenswrapper[4907]: E0226 15:46:02.127100 4907 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 26 15:46:03 crc kubenswrapper[4907]: I0226 15:46:03.126286 4907 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-zsb5l" Feb 26 15:46:03 crc kubenswrapper[4907]: E0226 15:46:03.126491 4907 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-zsb5l" podUID="fd06f422-2c09-4da9-843c-75525df52517" Feb 26 15:46:03 crc kubenswrapper[4907]: E0226 15:46:03.246353 4907 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" Feb 26 15:46:04 crc kubenswrapper[4907]: I0226 15:46:04.126113 4907 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 26 15:46:04 crc kubenswrapper[4907]: I0226 15:46:04.126145 4907 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 26 15:46:04 crc kubenswrapper[4907]: E0226 15:46:04.126377 4907 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 26 15:46:04 crc kubenswrapper[4907]: I0226 15:46:04.126362 4907 scope.go:117] "RemoveContainer" containerID="e822f482000a6645405c4c5b3b74d28302ababcc6de59c9d2f392c08d1fd092f" Feb 26 15:46:04 crc kubenswrapper[4907]: I0226 15:46:04.126174 4907 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 26 15:46:04 crc kubenswrapper[4907]: E0226 15:46:04.126767 4907 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 26 15:46:04 crc kubenswrapper[4907]: E0226 15:46:04.126774 4907 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 26 15:46:05 crc kubenswrapper[4907]: I0226 15:46:05.126267 4907 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-zsb5l" Feb 26 15:46:05 crc kubenswrapper[4907]: E0226 15:46:05.126794 4907 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-zsb5l" podUID="fd06f422-2c09-4da9-843c-75525df52517" Feb 26 15:46:05 crc kubenswrapper[4907]: I0226 15:46:05.206891 4907 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-2gl5t_51024bd5-00ff-4e2f-927c-8c989b59d7be/kube-multus/1.log" Feb 26 15:46:05 crc kubenswrapper[4907]: I0226 15:46:05.206966 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-2gl5t" event={"ID":"51024bd5-00ff-4e2f-927c-8c989b59d7be","Type":"ContainerStarted","Data":"7485dceccdb2068136cd7e452af5b857fbf4a0321439464c6d537dffff0f08bb"} Feb 26 15:46:06 crc kubenswrapper[4907]: I0226 15:46:06.126017 4907 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 26 15:46:06 crc kubenswrapper[4907]: I0226 15:46:06.126076 4907 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 26 15:46:06 crc kubenswrapper[4907]: I0226 15:46:06.126103 4907 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 26 15:46:06 crc kubenswrapper[4907]: E0226 15:46:06.126219 4907 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 26 15:46:06 crc kubenswrapper[4907]: E0226 15:46:06.126341 4907 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 26 15:46:06 crc kubenswrapper[4907]: E0226 15:46:06.126436 4907 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 26 15:46:07 crc kubenswrapper[4907]: I0226 15:46:07.126403 4907 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-zsb5l" Feb 26 15:46:07 crc kubenswrapper[4907]: E0226 15:46:07.126655 4907 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-zsb5l" podUID="fd06f422-2c09-4da9-843c-75525df52517" Feb 26 15:46:08 crc kubenswrapper[4907]: I0226 15:46:08.126209 4907 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 26 15:46:08 crc kubenswrapper[4907]: I0226 15:46:08.126739 4907 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 26 15:46:08 crc kubenswrapper[4907]: I0226 15:46:08.128409 4907 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 26 15:46:08 crc kubenswrapper[4907]: E0226 15:46:08.128565 4907 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 26 15:46:08 crc kubenswrapper[4907]: E0226 15:46:08.129102 4907 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 26 15:46:08 crc kubenswrapper[4907]: E0226 15:46:08.129358 4907 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 26 15:46:08 crc kubenswrapper[4907]: E0226 15:46:08.247745 4907 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" Feb 26 15:46:09 crc kubenswrapper[4907]: I0226 15:46:09.126200 4907 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-zsb5l" Feb 26 15:46:09 crc kubenswrapper[4907]: E0226 15:46:09.126951 4907 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-zsb5l" podUID="fd06f422-2c09-4da9-843c-75525df52517" Feb 26 15:46:09 crc kubenswrapper[4907]: I0226 15:46:09.126977 4907 scope.go:117] "RemoveContainer" containerID="51787a0de7c6993ba3bfd70265cc1718966209053e9703f5fc5b039f3d78abae" Feb 26 15:46:10 crc kubenswrapper[4907]: I0226 15:46:10.125762 4907 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 26 15:46:10 crc kubenswrapper[4907]: I0226 15:46:10.125787 4907 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 26 15:46:10 crc kubenswrapper[4907]: E0226 15:46:10.125890 4907 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 26 15:46:10 crc kubenswrapper[4907]: I0226 15:46:10.125943 4907 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 26 15:46:10 crc kubenswrapper[4907]: E0226 15:46:10.125998 4907 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 26 15:46:10 crc kubenswrapper[4907]: E0226 15:46:10.126077 4907 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 26 15:46:10 crc kubenswrapper[4907]: I0226 15:46:10.223675 4907 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-vsvsw_49ee65e1-8667-4ad7-a403-c899f0cc6a70/ovnkube-controller/3.log" Feb 26 15:46:10 crc kubenswrapper[4907]: I0226 15:46:10.226188 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-vsvsw" event={"ID":"49ee65e1-8667-4ad7-a403-c899f0cc6a70","Type":"ContainerStarted","Data":"117dc082982a8b3a3318c864792eff748b564107aeddf5f1ef19f61923a7e1d3"} Feb 26 15:46:10 crc kubenswrapper[4907]: I0226 15:46:10.226583 4907 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-vsvsw" Feb 26 15:46:10 crc kubenswrapper[4907]: I0226 15:46:10.252874 4907 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ovn-kubernetes/ovnkube-node-vsvsw" podStartSLOduration=175.252856667 podStartE2EDuration="2m55.252856667s" podCreationTimestamp="2026-02-26 15:43:15 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-26 15:46:10.251784252 +0000 UTC m=+232.770346101" watchObservedRunningTime="2026-02-26 15:46:10.252856667 +0000 UTC m=+232.771418516" Feb 26 15:46:10 crc kubenswrapper[4907]: I0226 15:46:10.412864 4907 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-multus/network-metrics-daemon-zsb5l"] Feb 26 15:46:10 crc kubenswrapper[4907]: I0226 15:46:10.413034 4907 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-zsb5l" Feb 26 15:46:10 crc kubenswrapper[4907]: E0226 15:46:10.413210 4907 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-zsb5l" podUID="fd06f422-2c09-4da9-843c-75525df52517" Feb 26 15:46:12 crc kubenswrapper[4907]: I0226 15:46:12.125653 4907 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 26 15:46:12 crc kubenswrapper[4907]: E0226 15:46:12.126129 4907 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 26 15:46:12 crc kubenswrapper[4907]: I0226 15:46:12.125712 4907 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 26 15:46:12 crc kubenswrapper[4907]: E0226 15:46:12.126394 4907 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 26 15:46:12 crc kubenswrapper[4907]: I0226 15:46:12.125837 4907 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-zsb5l" Feb 26 15:46:12 crc kubenswrapper[4907]: E0226 15:46:12.126487 4907 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-zsb5l" podUID="fd06f422-2c09-4da9-843c-75525df52517" Feb 26 15:46:12 crc kubenswrapper[4907]: I0226 15:46:12.125776 4907 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 26 15:46:12 crc kubenswrapper[4907]: E0226 15:46:12.126714 4907 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 26 15:46:14 crc kubenswrapper[4907]: I0226 15:46:14.126662 4907 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 26 15:46:14 crc kubenswrapper[4907]: I0226 15:46:14.126750 4907 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-zsb5l" Feb 26 15:46:14 crc kubenswrapper[4907]: I0226 15:46:14.126750 4907 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 26 15:46:14 crc kubenswrapper[4907]: I0226 15:46:14.127239 4907 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 26 15:46:14 crc kubenswrapper[4907]: I0226 15:46:14.129708 4907 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"metrics-daemon-sa-dockercfg-d427c" Feb 26 15:46:14 crc kubenswrapper[4907]: I0226 15:46:14.129752 4907 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-diagnostics"/"openshift-service-ca.crt" Feb 26 15:46:14 crc kubenswrapper[4907]: I0226 15:46:14.129818 4907 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-console"/"networking-console-plugin" Feb 26 15:46:14 crc kubenswrapper[4907]: I0226 15:46:14.130340 4907 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"metrics-daemon-secret" Feb 26 15:46:14 crc kubenswrapper[4907]: I0226 15:46:14.130531 4907 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-network-console"/"networking-console-plugin-cert" Feb 26 15:46:14 crc kubenswrapper[4907]: I0226 15:46:14.135679 4907 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-diagnostics"/"kube-root-ca.crt" Feb 26 15:46:18 crc kubenswrapper[4907]: I0226 15:46:18.188519 4907 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeReady" Feb 26 15:46:18 crc kubenswrapper[4907]: I0226 15:46:18.241733 4907 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6576b87f9c-96swm"] Feb 26 15:46:18 crc kubenswrapper[4907]: I0226 15:46:18.242695 4907 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-96swm" Feb 26 15:46:18 crc kubenswrapper[4907]: I0226 15:46:18.244297 4907 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-controller-manager/controller-manager-879f6c89f-p9vbb"] Feb 26 15:46:18 crc kubenswrapper[4907]: I0226 15:46:18.244896 4907 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-879f6c89f-p9vbb" Feb 26 15:46:18 crc kubenswrapper[4907]: I0226 15:46:18.246875 4907 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-api/machine-api-operator-5694c8668f-hdqkt"] Feb 26 15:46:18 crc kubenswrapper[4907]: I0226 15:46:18.247620 4907 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-api/machine-api-operator-5694c8668f-hdqkt" Feb 26 15:46:18 crc kubenswrapper[4907]: I0226 15:46:18.250191 4907 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"serving-cert" Feb 26 15:46:18 crc kubenswrapper[4907]: I0226 15:46:18.250274 4907 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"client-ca" Feb 26 15:46:18 crc kubenswrapper[4907]: I0226 15:46:18.250484 4907 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"config" Feb 26 15:46:18 crc kubenswrapper[4907]: I0226 15:46:18.250533 4907 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"serving-cert" Feb 26 15:46:18 crc kubenswrapper[4907]: I0226 15:46:18.251195 4907 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"openshift-service-ca.crt" Feb 26 15:46:18 crc kubenswrapper[4907]: I0226 15:46:18.251698 4907 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"kube-root-ca.crt" Feb 
26 15:46:18 crc kubenswrapper[4907]: I0226 15:46:18.251748 4907 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"route-controller-manager-sa-dockercfg-h2zr2" Feb 26 15:46:18 crc kubenswrapper[4907]: I0226 15:46:18.251912 4907 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"machine-api-operator-images" Feb 26 15:46:18 crc kubenswrapper[4907]: I0226 15:46:18.252082 4907 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"kube-root-ca.crt" Feb 26 15:46:18 crc kubenswrapper[4907]: I0226 15:46:18.256223 4907 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"kube-rbac-proxy" Feb 26 15:46:18 crc kubenswrapper[4907]: I0226 15:46:18.256570 4907 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-service-ca.crt" Feb 26 15:46:18 crc kubenswrapper[4907]: I0226 15:46:18.256911 4907 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-apiserver/apiserver-76f77b778f-5tc4m"] Feb 26 15:46:18 crc kubenswrapper[4907]: I0226 15:46:18.258165 4907 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-apiserver/apiserver-76f77b778f-5tc4m" Feb 26 15:46:18 crc kubenswrapper[4907]: I0226 15:46:18.256947 4907 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"machine-api-operator-tls" Feb 26 15:46:18 crc kubenswrapper[4907]: I0226 15:46:18.259675 4907 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-oauth-apiserver/apiserver-7bbb656c7d-vnrdg"] Feb 26 15:46:18 crc kubenswrapper[4907]: I0226 15:46:18.259829 4907 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"kube-root-ca.crt" Feb 26 15:46:18 crc kubenswrapper[4907]: I0226 15:46:18.260301 4907 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-vnrdg" Feb 26 15:46:18 crc kubenswrapper[4907]: I0226 15:46:18.257115 4907 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"openshift-service-ca.crt" Feb 26 15:46:18 crc kubenswrapper[4907]: I0226 15:46:18.261810 4907 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"client-ca" Feb 26 15:46:18 crc kubenswrapper[4907]: I0226 15:46:18.262184 4907 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"kube-root-ca.crt" Feb 26 15:46:18 crc kubenswrapper[4907]: I0226 15:46:18.262496 4907 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"openshift-controller-manager-sa-dockercfg-msq4c" Feb 26 15:46:18 crc kubenswrapper[4907]: I0226 15:46:18.263251 4907 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-authentication/oauth-openshift-558db77b4-lr7kc"] Feb 26 15:46:18 crc kubenswrapper[4907]: I0226 15:46:18.263553 4907 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"machine-api-operator-dockercfg-mfbb7" Feb 26 15:46:18 crc kubenswrapper[4907]: I0226 15:46:18.263993 4907 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"config" Feb 26 15:46:18 crc kubenswrapper[4907]: I0226 15:46:18.264259 4907 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"config" Feb 26 15:46:18 crc kubenswrapper[4907]: I0226 15:46:18.264492 4907 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"audit-1" Feb 26 15:46:18 crc kubenswrapper[4907]: I0226 15:46:18.265009 4907 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"image-import-ca" Feb 26 15:46:18 crc kubenswrapper[4907]: I0226 15:46:18.265314 4907 reflector.go:368] Caches populated for *v1.ConfigMap 
from object-"openshift-apiserver"/"etcd-serving-ca" Feb 26 15:46:18 crc kubenswrapper[4907]: I0226 15:46:18.268516 4907 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-authentication/oauth-openshift-558db77b4-lr7kc" Feb 26 15:46:18 crc kubenswrapper[4907]: I0226 15:46:18.270189 4907 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-cluster-machine-approver/machine-approver-56656f9798-m7jwg"] Feb 26 15:46:18 crc kubenswrapper[4907]: I0226 15:46:18.272828 4907 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-wgl2p"] Feb 26 15:46:18 crc kubenswrapper[4907]: I0226 15:46:18.273208 4907 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-xvqkl"] Feb 26 15:46:18 crc kubenswrapper[4907]: I0226 15:46:18.273667 4907 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-xvqkl" Feb 26 15:46:18 crc kubenswrapper[4907]: I0226 15:46:18.273875 4907 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-machine-approver/machine-approver-56656f9798-m7jwg" Feb 26 15:46:18 crc kubenswrapper[4907]: I0226 15:46:18.274028 4907 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-wgl2p" Feb 26 15:46:18 crc kubenswrapper[4907]: I0226 15:46:18.276573 4907 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-config-operator/openshift-config-operator-7777fb866f-2f4gk"] Feb 26 15:46:18 crc kubenswrapper[4907]: I0226 15:46:18.277239 4907 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-config-operator/openshift-config-operator-7777fb866f-2f4gk" Feb 26 15:46:18 crc kubenswrapper[4907]: I0226 15:46:18.277826 4907 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"serving-cert" Feb 26 15:46:18 crc kubenswrapper[4907]: I0226 15:46:18.283785 4907 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"trusted-ca-bundle" Feb 26 15:46:18 crc kubenswrapper[4907]: I0226 15:46:18.284018 4907 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"oauth-apiserver-sa-dockercfg-6r2bq" Feb 26 15:46:18 crc kubenswrapper[4907]: I0226 15:46:18.295769 4907 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"kube-root-ca.crt" Feb 26 15:46:18 crc kubenswrapper[4907]: I0226 15:46:18.298331 4907 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"openshift-service-ca.crt" Feb 26 15:46:18 crc kubenswrapper[4907]: I0226 15:46:18.298564 4907 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-session" Feb 26 15:46:18 crc kubenswrapper[4907]: I0226 15:46:18.300882 4907 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"etcd-client" Feb 26 15:46:18 crc kubenswrapper[4907]: I0226 15:46:18.301353 4907 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"openshift-apiserver-sa-dockercfg-djjff" Feb 26 15:46:18 crc kubenswrapper[4907]: I0226 15:46:18.325760 4907 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"v4-0-config-system-service-ca" Feb 26 15:46:18 crc kubenswrapper[4907]: I0226 15:46:18.325933 4907 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"audit-1" Feb 26 15:46:18 crc kubenswrapper[4907]: I0226 15:46:18.326009 4907 
reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"encryption-config-1" Feb 26 15:46:18 crc kubenswrapper[4907]: I0226 15:46:18.326074 4907 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"etcd-serving-ca" Feb 26 15:46:18 crc kubenswrapper[4907]: I0226 15:46:18.311375 4907 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-error" Feb 26 15:46:18 crc kubenswrapper[4907]: I0226 15:46:18.326452 4907 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"encryption-config-1" Feb 26 15:46:18 crc kubenswrapper[4907]: I0226 15:46:18.327735 4907 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-console-operator/console-operator-58897d9998-sjflz"] Feb 26 15:46:18 crc kubenswrapper[4907]: I0226 15:46:18.328123 4907 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-samples-operator"/"cluster-samples-operator-dockercfg-xpp9w" Feb 26 15:46:18 crc kubenswrapper[4907]: I0226 15:46:18.328237 4907 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"etcd-client" Feb 26 15:46:18 crc kubenswrapper[4907]: I0226 15:46:18.328351 4907 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-console/console-f9d7485db-9lx5z"] Feb 26 15:46:18 crc kubenswrapper[4907]: I0226 15:46:18.328497 4907 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"openshift-service-ca.crt" Feb 26 15:46:18 crc kubenswrapper[4907]: I0226 15:46:18.328715 4907 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-router-certs" Feb 26 15:46:18 crc kubenswrapper[4907]: I0226 15:46:18.328817 4907 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-f9d7485db-9lx5z" Feb 26 15:46:18 crc kubenswrapper[4907]: I0226 15:46:18.328842 4907 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console-operator/console-operator-58897d9998-sjflz" Feb 26 15:46:18 crc kubenswrapper[4907]: I0226 15:46:18.328942 4907 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"openshift-service-ca.crt" Feb 26 15:46:18 crc kubenswrapper[4907]: I0226 15:46:18.329009 4907 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver-operator"/"openshift-apiserver-operator-config" Feb 26 15:46:18 crc kubenswrapper[4907]: I0226 15:46:18.329487 4907 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"kube-rbac-proxy" Feb 26 15:46:18 crc kubenswrapper[4907]: I0226 15:46:18.329970 4907 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-samples-operator"/"kube-root-ca.crt" Feb 26 15:46:18 crc kubenswrapper[4907]: I0226 15:46:18.330231 4907 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"audit" Feb 26 15:46:18 crc kubenswrapper[4907]: I0226 15:46:18.330519 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/774f91c0-0433-43a5-8b33-18a5253ba0a3-config\") pod \"openshift-apiserver-operator-796bbdcf4f-wgl2p\" (UID: \"774f91c0-0433-43a5-8b33-18a5253ba0a3\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-wgl2p" Feb 26 15:46:18 crc kubenswrapper[4907]: I0226 15:46:18.330547 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/05bd4fd2-624b-4b9c-b6a7-74cfce90e1d7-config\") pod \"machine-approver-56656f9798-m7jwg\" (UID: 
\"05bd4fd2-624b-4b9c-b6a7-74cfce90e1d7\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-m7jwg" Feb 26 15:46:18 crc kubenswrapper[4907]: I0226 15:46:18.330567 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/ca3ab95b-79df-45b9-9ada-c7c713e2e3e6-v4-0-config-system-cliconfig\") pod \"oauth-openshift-558db77b4-lr7kc\" (UID: \"ca3ab95b-79df-45b9-9ada-c7c713e2e3e6\") " pod="openshift-authentication/oauth-openshift-558db77b4-lr7kc" Feb 26 15:46:18 crc kubenswrapper[4907]: I0226 15:46:18.330586 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/54942a44-6e66-4757-8106-bbe836a2d8ca-config\") pod \"route-controller-manager-6576b87f9c-96swm\" (UID: \"54942a44-6e66-4757-8106-bbe836a2d8ca\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-96swm" Feb 26 15:46:18 crc kubenswrapper[4907]: I0226 15:46:18.330621 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2m5mf\" (UniqueName: \"kubernetes.io/projected/f1a111d0-85de-4328-90ac-9b9af3edbc49-kube-api-access-2m5mf\") pod \"apiserver-76f77b778f-5tc4m\" (UID: \"f1a111d0-85de-4328-90ac-9b9af3edbc49\") " pod="openshift-apiserver/apiserver-76f77b778f-5tc4m" Feb 26 15:46:18 crc kubenswrapper[4907]: I0226 15:46:18.330637 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/bbef2e1f-1be1-4624-804c-45892231df1e-proxy-ca-bundles\") pod \"controller-manager-879f6c89f-p9vbb\" (UID: \"bbef2e1f-1be1-4624-804c-45892231df1e\") " pod="openshift-controller-manager/controller-manager-879f6c89f-p9vbb" Feb 26 15:46:18 crc kubenswrapper[4907]: I0226 15:46:18.330652 4907 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/ca3ab95b-79df-45b9-9ada-c7c713e2e3e6-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-558db77b4-lr7kc\" (UID: \"ca3ab95b-79df-45b9-9ada-c7c713e2e3e6\") " pod="openshift-authentication/oauth-openshift-558db77b4-lr7kc" Feb 26 15:46:18 crc kubenswrapper[4907]: I0226 15:46:18.330666 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lzv9t\" (UniqueName: \"kubernetes.io/projected/ca3ab95b-79df-45b9-9ada-c7c713e2e3e6-kube-api-access-lzv9t\") pod \"oauth-openshift-558db77b4-lr7kc\" (UID: \"ca3ab95b-79df-45b9-9ada-c7c713e2e3e6\") " pod="openshift-authentication/oauth-openshift-558db77b4-lr7kc" Feb 26 15:46:18 crc kubenswrapper[4907]: I0226 15:46:18.330683 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/489d8c16-01bf-466b-a863-a3c8594d8b88-config\") pod \"machine-api-operator-5694c8668f-hdqkt\" (UID: \"489d8c16-01bf-466b-a863-a3c8594d8b88\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-hdqkt" Feb 26 15:46:18 crc kubenswrapper[4907]: I0226 15:46:18.330696 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-46vwj\" (UniqueName: \"kubernetes.io/projected/54942a44-6e66-4757-8106-bbe836a2d8ca-kube-api-access-46vwj\") pod \"route-controller-manager-6576b87f9c-96swm\" (UID: \"54942a44-6e66-4757-8106-bbe836a2d8ca\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-96swm" Feb 26 15:46:18 crc kubenswrapper[4907]: I0226 15:46:18.330710 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit\" (UniqueName: \"kubernetes.io/configmap/f1a111d0-85de-4328-90ac-9b9af3edbc49-audit\") pod 
\"apiserver-76f77b778f-5tc4m\" (UID: \"f1a111d0-85de-4328-90ac-9b9af3edbc49\") " pod="openshift-apiserver/apiserver-76f77b778f-5tc4m" Feb 26 15:46:18 crc kubenswrapper[4907]: I0226 15:46:18.330728 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nmbrj\" (UniqueName: \"kubernetes.io/projected/774f91c0-0433-43a5-8b33-18a5253ba0a3-kube-api-access-nmbrj\") pod \"openshift-apiserver-operator-796bbdcf4f-wgl2p\" (UID: \"774f91c0-0433-43a5-8b33-18a5253ba0a3\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-wgl2p" Feb 26 15:46:18 crc kubenswrapper[4907]: I0226 15:46:18.330744 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/1c26ef74-f7b8-4cc3-ae04-783bfa2b38b4-audit-policies\") pod \"apiserver-7bbb656c7d-vnrdg\" (UID: \"1c26ef74-f7b8-4cc3-ae04-783bfa2b38b4\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-vnrdg" Feb 26 15:46:18 crc kubenswrapper[4907]: I0226 15:46:18.330757 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1c26ef74-f7b8-4cc3-ae04-783bfa2b38b4-serving-cert\") pod \"apiserver-7bbb656c7d-vnrdg\" (UID: \"1c26ef74-f7b8-4cc3-ae04-783bfa2b38b4\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-vnrdg" Feb 26 15:46:18 crc kubenswrapper[4907]: I0226 15:46:18.330771 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-72gcp\" (UniqueName: \"kubernetes.io/projected/1dc6224f-2bf9-4c28-a6df-30a177430c08-kube-api-access-72gcp\") pod \"cluster-samples-operator-665b6dd947-xvqkl\" (UID: \"1dc6224f-2bf9-4c28-a6df-30a177430c08\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-xvqkl" Feb 26 15:46:18 crc kubenswrapper[4907]: I0226 15:46:18.330796 4907 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9gf4p\" (UniqueName: \"kubernetes.io/projected/0383e657-c434-43b2-878b-314ce5a2339e-kube-api-access-9gf4p\") pod \"openshift-config-operator-7777fb866f-2f4gk\" (UID: \"0383e657-c434-43b2-878b-314ce5a2339e\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-2f4gk" Feb 26 15:46:18 crc kubenswrapper[4907]: I0226 15:46:18.330811 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/f1a111d0-85de-4328-90ac-9b9af3edbc49-trusted-ca-bundle\") pod \"apiserver-76f77b778f-5tc4m\" (UID: \"f1a111d0-85de-4328-90ac-9b9af3edbc49\") " pod="openshift-apiserver/apiserver-76f77b778f-5tc4m" Feb 26 15:46:18 crc kubenswrapper[4907]: I0226 15:46:18.330833 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/ca3ab95b-79df-45b9-9ada-c7c713e2e3e6-audit-dir\") pod \"oauth-openshift-558db77b4-lr7kc\" (UID: \"ca3ab95b-79df-45b9-9ada-c7c713e2e3e6\") " pod="openshift-authentication/oauth-openshift-558db77b4-lr7kc" Feb 26 15:46:18 crc kubenswrapper[4907]: I0226 15:46:18.330851 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/f1a111d0-85de-4328-90ac-9b9af3edbc49-node-pullsecrets\") pod \"apiserver-76f77b778f-5tc4m\" (UID: \"f1a111d0-85de-4328-90ac-9b9af3edbc49\") " pod="openshift-apiserver/apiserver-76f77b778f-5tc4m" Feb 26 15:46:18 crc kubenswrapper[4907]: I0226 15:46:18.330866 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/bbef2e1f-1be1-4624-804c-45892231df1e-client-ca\") pod \"controller-manager-879f6c89f-p9vbb\" (UID: 
\"bbef2e1f-1be1-4624-804c-45892231df1e\") " pod="openshift-controller-manager/controller-manager-879f6c89f-p9vbb" Feb 26 15:46:18 crc kubenswrapper[4907]: I0226 15:46:18.330881 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/ca3ab95b-79df-45b9-9ada-c7c713e2e3e6-v4-0-config-user-idp-0-file-data\") pod \"oauth-openshift-558db77b4-lr7kc\" (UID: \"ca3ab95b-79df-45b9-9ada-c7c713e2e3e6\") " pod="openshift-authentication/oauth-openshift-558db77b4-lr7kc" Feb 26 15:46:18 crc kubenswrapper[4907]: I0226 15:46:18.330895 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/f1a111d0-85de-4328-90ac-9b9af3edbc49-etcd-serving-ca\") pod \"apiserver-76f77b778f-5tc4m\" (UID: \"f1a111d0-85de-4328-90ac-9b9af3edbc49\") " pod="openshift-apiserver/apiserver-76f77b778f-5tc4m" Feb 26 15:46:18 crc kubenswrapper[4907]: I0226 15:46:18.330918 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/ca3ab95b-79df-45b9-9ada-c7c713e2e3e6-v4-0-config-user-template-error\") pod \"oauth-openshift-558db77b4-lr7kc\" (UID: \"ca3ab95b-79df-45b9-9ada-c7c713e2e3e6\") " pod="openshift-authentication/oauth-openshift-558db77b4-lr7kc" Feb 26 15:46:18 crc kubenswrapper[4907]: I0226 15:46:18.330935 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-p8rz5\" (UniqueName: \"kubernetes.io/projected/05bd4fd2-624b-4b9c-b6a7-74cfce90e1d7-kube-api-access-p8rz5\") pod \"machine-approver-56656f9798-m7jwg\" (UID: \"05bd4fd2-624b-4b9c-b6a7-74cfce90e1d7\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-m7jwg" Feb 26 15:46:18 crc kubenswrapper[4907]: I0226 15:46:18.330951 4907 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/1c26ef74-f7b8-4cc3-ae04-783bfa2b38b4-audit-dir\") pod \"apiserver-7bbb656c7d-vnrdg\" (UID: \"1c26ef74-f7b8-4cc3-ae04-783bfa2b38b4\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-vnrdg" Feb 26 15:46:18 crc kubenswrapper[4907]: I0226 15:46:18.330966 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/ca3ab95b-79df-45b9-9ada-c7c713e2e3e6-v4-0-config-system-session\") pod \"oauth-openshift-558db77b4-lr7kc\" (UID: \"ca3ab95b-79df-45b9-9ada-c7c713e2e3e6\") " pod="openshift-authentication/oauth-openshift-558db77b4-lr7kc" Feb 26 15:46:18 crc kubenswrapper[4907]: I0226 15:46:18.330981 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/54942a44-6e66-4757-8106-bbe836a2d8ca-client-ca\") pod \"route-controller-manager-6576b87f9c-96swm\" (UID: \"54942a44-6e66-4757-8106-bbe836a2d8ca\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-96swm" Feb 26 15:46:18 crc kubenswrapper[4907]: I0226 15:46:18.330994 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"image-import-ca\" (UniqueName: \"kubernetes.io/configmap/f1a111d0-85de-4328-90ac-9b9af3edbc49-image-import-ca\") pod \"apiserver-76f77b778f-5tc4m\" (UID: \"f1a111d0-85de-4328-90ac-9b9af3edbc49\") " pod="openshift-apiserver/apiserver-76f77b778f-5tc4m" Feb 26 15:46:18 crc kubenswrapper[4907]: I0226 15:46:18.331008 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/05bd4fd2-624b-4b9c-b6a7-74cfce90e1d7-auth-proxy-config\") pod \"machine-approver-56656f9798-m7jwg\" (UID: 
\"05bd4fd2-624b-4b9c-b6a7-74cfce90e1d7\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-m7jwg" Feb 26 15:46:18 crc kubenswrapper[4907]: I0226 15:46:18.331023 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"machine-api-operator-tls\" (UniqueName: \"kubernetes.io/secret/489d8c16-01bf-466b-a863-a3c8594d8b88-machine-api-operator-tls\") pod \"machine-api-operator-5694c8668f-hdqkt\" (UID: \"489d8c16-01bf-466b-a863-a3c8594d8b88\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-hdqkt" Feb 26 15:46:18 crc kubenswrapper[4907]: I0226 15:46:18.331043 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/1c26ef74-f7b8-4cc3-ae04-783bfa2b38b4-trusted-ca-bundle\") pod \"apiserver-7bbb656c7d-vnrdg\" (UID: \"1c26ef74-f7b8-4cc3-ae04-783bfa2b38b4\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-vnrdg" Feb 26 15:46:18 crc kubenswrapper[4907]: I0226 15:46:18.331063 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wcg8q\" (UniqueName: \"kubernetes.io/projected/1c26ef74-f7b8-4cc3-ae04-783bfa2b38b4-kube-api-access-wcg8q\") pod \"apiserver-7bbb656c7d-vnrdg\" (UID: \"1c26ef74-f7b8-4cc3-ae04-783bfa2b38b4\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-vnrdg" Feb 26 15:46:18 crc kubenswrapper[4907]: I0226 15:46:18.331087 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/0383e657-c434-43b2-878b-314ce5a2339e-serving-cert\") pod \"openshift-config-operator-7777fb866f-2f4gk\" (UID: \"0383e657-c434-43b2-878b-314ce5a2339e\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-2f4gk" Feb 26 15:46:18 crc kubenswrapper[4907]: I0226 15:46:18.331102 4907 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f1a111d0-85de-4328-90ac-9b9af3edbc49-config\") pod \"apiserver-76f77b778f-5tc4m\" (UID: \"f1a111d0-85de-4328-90ac-9b9af3edbc49\") " pod="openshift-apiserver/apiserver-76f77b778f-5tc4m" Feb 26 15:46:18 crc kubenswrapper[4907]: I0226 15:46:18.331116 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/f1a111d0-85de-4328-90ac-9b9af3edbc49-serving-cert\") pod \"apiserver-76f77b778f-5tc4m\" (UID: \"f1a111d0-85de-4328-90ac-9b9af3edbc49\") " pod="openshift-apiserver/apiserver-76f77b778f-5tc4m" Feb 26 15:46:18 crc kubenswrapper[4907]: I0226 15:46:18.331131 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/f1a111d0-85de-4328-90ac-9b9af3edbc49-encryption-config\") pod \"apiserver-76f77b778f-5tc4m\" (UID: \"f1a111d0-85de-4328-90ac-9b9af3edbc49\") " pod="openshift-apiserver/apiserver-76f77b778f-5tc4m" Feb 26 15:46:18 crc kubenswrapper[4907]: I0226 15:46:18.331159 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/489d8c16-01bf-466b-a863-a3c8594d8b88-images\") pod \"machine-api-operator-5694c8668f-hdqkt\" (UID: \"489d8c16-01bf-466b-a863-a3c8594d8b88\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-hdqkt" Feb 26 15:46:18 crc kubenswrapper[4907]: I0226 15:46:18.331178 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/ca3ab95b-79df-45b9-9ada-c7c713e2e3e6-v4-0-config-user-template-login\") pod \"oauth-openshift-558db77b4-lr7kc\" (UID: \"ca3ab95b-79df-45b9-9ada-c7c713e2e3e6\") " 
pod="openshift-authentication/oauth-openshift-558db77b4-lr7kc" Feb 26 15:46:18 crc kubenswrapper[4907]: I0226 15:46:18.331196 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/1c26ef74-f7b8-4cc3-ae04-783bfa2b38b4-etcd-serving-ca\") pod \"apiserver-7bbb656c7d-vnrdg\" (UID: \"1c26ef74-f7b8-4cc3-ae04-783bfa2b38b4\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-vnrdg" Feb 26 15:46:18 crc kubenswrapper[4907]: I0226 15:46:18.331219 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vdtmp\" (UniqueName: \"kubernetes.io/projected/bbef2e1f-1be1-4624-804c-45892231df1e-kube-api-access-vdtmp\") pod \"controller-manager-879f6c89f-p9vbb\" (UID: \"bbef2e1f-1be1-4624-804c-45892231df1e\") " pod="openshift-controller-manager/controller-manager-879f6c89f-p9vbb" Feb 26 15:46:18 crc kubenswrapper[4907]: I0226 15:46:18.331232 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/54942a44-6e66-4757-8106-bbe836a2d8ca-serving-cert\") pod \"route-controller-manager-6576b87f9c-96swm\" (UID: \"54942a44-6e66-4757-8106-bbe836a2d8ca\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-96swm" Feb 26 15:46:18 crc kubenswrapper[4907]: I0226 15:46:18.331251 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/bbef2e1f-1be1-4624-804c-45892231df1e-config\") pod \"controller-manager-879f6c89f-p9vbb\" (UID: \"bbef2e1f-1be1-4624-804c-45892231df1e\") " pod="openshift-controller-manager/controller-manager-879f6c89f-p9vbb" Feb 26 15:46:18 crc kubenswrapper[4907]: I0226 15:46:18.331268 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/ca3ab95b-79df-45b9-9ada-c7c713e2e3e6-v4-0-config-system-router-certs\") pod \"oauth-openshift-558db77b4-lr7kc\" (UID: \"ca3ab95b-79df-45b9-9ada-c7c713e2e3e6\") " pod="openshift-authentication/oauth-openshift-558db77b4-lr7kc" Feb 26 15:46:18 crc kubenswrapper[4907]: I0226 15:46:18.331284 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"available-featuregates\" (UniqueName: \"kubernetes.io/empty-dir/0383e657-c434-43b2-878b-314ce5a2339e-available-featuregates\") pod \"openshift-config-operator-7777fb866f-2f4gk\" (UID: \"0383e657-c434-43b2-878b-314ce5a2339e\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-2f4gk" Feb 26 15:46:18 crc kubenswrapper[4907]: I0226 15:46:18.331291 4907 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"kube-root-ca.crt" Feb 26 15:46:18 crc kubenswrapper[4907]: I0226 15:46:18.331298 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/ca3ab95b-79df-45b9-9ada-c7c713e2e3e6-v4-0-config-system-ocp-branding-template\") pod \"oauth-openshift-558db77b4-lr7kc\" (UID: \"ca3ab95b-79df-45b9-9ada-c7c713e2e3e6\") " pod="openshift-authentication/oauth-openshift-558db77b4-lr7kc" Feb 26 15:46:18 crc kubenswrapper[4907]: I0226 15:46:18.331313 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/1c26ef74-f7b8-4cc3-ae04-783bfa2b38b4-etcd-client\") pod \"apiserver-7bbb656c7d-vnrdg\" (UID: \"1c26ef74-f7b8-4cc3-ae04-783bfa2b38b4\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-vnrdg" Feb 26 15:46:18 crc kubenswrapper[4907]: I0226 15:46:18.331328 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for 
volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/ca3ab95b-79df-45b9-9ada-c7c713e2e3e6-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-558db77b4-lr7kc\" (UID: \"ca3ab95b-79df-45b9-9ada-c7c713e2e3e6\") " pod="openshift-authentication/oauth-openshift-558db77b4-lr7kc" Feb 26 15:46:18 crc kubenswrapper[4907]: I0226 15:46:18.331342 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/f1a111d0-85de-4328-90ac-9b9af3edbc49-audit-dir\") pod \"apiserver-76f77b778f-5tc4m\" (UID: \"f1a111d0-85de-4328-90ac-9b9af3edbc49\") " pod="openshift-apiserver/apiserver-76f77b778f-5tc4m" Feb 26 15:46:18 crc kubenswrapper[4907]: I0226 15:46:18.331361 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/774f91c0-0433-43a5-8b33-18a5253ba0a3-serving-cert\") pod \"openshift-apiserver-operator-796bbdcf4f-wgl2p\" (UID: \"774f91c0-0433-43a5-8b33-18a5253ba0a3\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-wgl2p" Feb 26 15:46:18 crc kubenswrapper[4907]: I0226 15:46:18.331375 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/ca3ab95b-79df-45b9-9ada-c7c713e2e3e6-v4-0-config-system-service-ca\") pod \"oauth-openshift-558db77b4-lr7kc\" (UID: \"ca3ab95b-79df-45b9-9ada-c7c713e2e3e6\") " pod="openshift-authentication/oauth-openshift-558db77b4-lr7kc" Feb 26 15:46:18 crc kubenswrapper[4907]: I0226 15:46:18.331394 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/f1a111d0-85de-4328-90ac-9b9af3edbc49-etcd-client\") pod \"apiserver-76f77b778f-5tc4m\" (UID: \"f1a111d0-85de-4328-90ac-9b9af3edbc49\") " 
pod="openshift-apiserver/apiserver-76f77b778f-5tc4m" Feb 26 15:46:18 crc kubenswrapper[4907]: I0226 15:46:18.331408 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"machine-approver-tls\" (UniqueName: \"kubernetes.io/secret/05bd4fd2-624b-4b9c-b6a7-74cfce90e1d7-machine-approver-tls\") pod \"machine-approver-56656f9798-m7jwg\" (UID: \"05bd4fd2-624b-4b9c-b6a7-74cfce90e1d7\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-m7jwg" Feb 26 15:46:18 crc kubenswrapper[4907]: I0226 15:46:18.331422 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/1dc6224f-2bf9-4c28-a6df-30a177430c08-samples-operator-tls\") pod \"cluster-samples-operator-665b6dd947-xvqkl\" (UID: \"1dc6224f-2bf9-4c28-a6df-30a177430c08\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-xvqkl" Feb 26 15:46:18 crc kubenswrapper[4907]: I0226 15:46:18.331438 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hnc2d\" (UniqueName: \"kubernetes.io/projected/489d8c16-01bf-466b-a863-a3c8594d8b88-kube-api-access-hnc2d\") pod \"machine-api-operator-5694c8668f-hdqkt\" (UID: \"489d8c16-01bf-466b-a863-a3c8594d8b88\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-hdqkt" Feb 26 15:46:18 crc kubenswrapper[4907]: I0226 15:46:18.331453 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/bbef2e1f-1be1-4624-804c-45892231df1e-serving-cert\") pod \"controller-manager-879f6c89f-p9vbb\" (UID: \"bbef2e1f-1be1-4624-804c-45892231df1e\") " pod="openshift-controller-manager/controller-manager-879f6c89f-p9vbb" Feb 26 15:46:18 crc kubenswrapper[4907]: I0226 15:46:18.331470 4907 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/ca3ab95b-79df-45b9-9ada-c7c713e2e3e6-v4-0-config-system-serving-cert\") pod \"oauth-openshift-558db77b4-lr7kc\" (UID: \"ca3ab95b-79df-45b9-9ada-c7c713e2e3e6\") " pod="openshift-authentication/oauth-openshift-558db77b4-lr7kc" Feb 26 15:46:18 crc kubenswrapper[4907]: I0226 15:46:18.331482 4907 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"v4-0-config-system-trusted-ca-bundle" Feb 26 15:46:18 crc kubenswrapper[4907]: I0226 15:46:18.331570 4907 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver-operator"/"openshift-service-ca.crt" Feb 26 15:46:18 crc kubenswrapper[4907]: I0226 15:46:18.331485 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/1c26ef74-f7b8-4cc3-ae04-783bfa2b38b4-encryption-config\") pod \"apiserver-7bbb656c7d-vnrdg\" (UID: \"1c26ef74-f7b8-4cc3-ae04-783bfa2b38b4\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-vnrdg" Feb 26 15:46:18 crc kubenswrapper[4907]: I0226 15:46:18.331705 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/ca3ab95b-79df-45b9-9ada-c7c713e2e3e6-audit-policies\") pod \"oauth-openshift-558db77b4-lr7kc\" (UID: \"ca3ab95b-79df-45b9-9ada-c7c713e2e3e6\") " pod="openshift-authentication/oauth-openshift-558db77b4-lr7kc" Feb 26 15:46:18 crc kubenswrapper[4907]: I0226 15:46:18.331737 4907 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"openshift-service-ca.crt" Feb 26 15:46:18 crc kubenswrapper[4907]: I0226 15:46:18.335510 4907 reflector.go:368] Caches populated for *v1.Secret from 
object-"openshift-apiserver-operator"/"openshift-apiserver-operator-dockercfg-xtcjv" Feb 26 15:46:18 crc kubenswrapper[4907]: I0226 15:46:18.336096 4907 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-login" Feb 26 15:46:18 crc kubenswrapper[4907]: I0226 15:46:18.336194 4907 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-samples-operator"/"samples-operator-tls" Feb 26 15:46:18 crc kubenswrapper[4907]: I0226 15:46:18.336271 4907 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-idp-0-file-data" Feb 26 15:46:18 crc kubenswrapper[4907]: I0226 15:46:18.336347 4907 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-samples-operator"/"openshift-service-ca.crt" Feb 26 15:46:18 crc kubenswrapper[4907]: I0226 15:46:18.336570 4907 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-serving-cert" Feb 26 15:46:18 crc kubenswrapper[4907]: I0226 15:46:18.337493 4907 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-global-ca" Feb 26 15:46:18 crc kubenswrapper[4907]: I0226 15:46:18.338010 4907 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"kube-root-ca.crt" Feb 26 15:46:18 crc kubenswrapper[4907]: I0226 15:46:18.338175 4907 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-config-operator"/"kube-root-ca.crt" Feb 26 15:46:18 crc kubenswrapper[4907]: I0226 15:46:18.338306 4907 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver-operator"/"kube-root-ca.crt" Feb 26 15:46:18 crc kubenswrapper[4907]: I0226 15:46:18.338556 4907 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver-operator"/"openshift-apiserver-operator-serving-cert" Feb 26 15:46:18 crc 
kubenswrapper[4907]: I0226 15:46:18.338789 4907 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-machine-approver"/"machine-approver-tls" Feb 26 15:46:18 crc kubenswrapper[4907]: I0226 15:46:18.338939 4907 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-config-operator"/"openshift-config-operator-dockercfg-7pc5z" Feb 26 15:46:18 crc kubenswrapper[4907]: I0226 15:46:18.340373 4907 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"machine-approver-config" Feb 26 15:46:18 crc kubenswrapper[4907]: I0226 15:46:18.340533 4907 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"oauth-openshift-dockercfg-znhcc" Feb 26 15:46:18 crc kubenswrapper[4907]: I0226 15:46:18.340958 4907 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"serving-cert" Feb 26 15:46:18 crc kubenswrapper[4907]: I0226 15:46:18.341834 4907 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"v4-0-config-system-cliconfig" Feb 26 15:46:18 crc kubenswrapper[4907]: I0226 15:46:18.342038 4907 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-provider-selection" Feb 26 15:46:18 crc kubenswrapper[4907]: I0226 15:46:18.342174 4907 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-config-operator"/"config-operator-serving-cert" Feb 26 15:46:18 crc kubenswrapper[4907]: I0226 15:46:18.342266 4907 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-config-operator"/"openshift-service-ca.crt" Feb 26 15:46:18 crc kubenswrapper[4907]: I0226 15:46:18.342284 4907 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-authentication-operator/authentication-operator-69f744f599-wtjfv"] Feb 26 15:46:18 crc kubenswrapper[4907]: I0226 15:46:18.342734 4907 util.go:30] "No sandbox for 
pod can be found. Need to start a new one" pod="openshift-authentication-operator/authentication-operator-69f744f599-wtjfv" Feb 26 15:46:18 crc kubenswrapper[4907]: I0226 15:46:18.346621 4907 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"openshift-service-ca.crt" Feb 26 15:46:18 crc kubenswrapper[4907]: I0226 15:46:18.346698 4907 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"console-serving-cert" Feb 26 15:46:18 crc kubenswrapper[4907]: I0226 15:46:18.346878 4907 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"console-config" Feb 26 15:46:18 crc kubenswrapper[4907]: I0226 15:46:18.346958 4907 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"oauth-serving-cert" Feb 26 15:46:18 crc kubenswrapper[4907]: I0226 15:46:18.347105 4907 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"console-dockercfg-f62pw" Feb 26 15:46:18 crc kubenswrapper[4907]: I0226 15:46:18.347185 4907 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-machine-approver"/"machine-approver-sa-dockercfg-nl2j4" Feb 26 15:46:18 crc kubenswrapper[4907]: I0226 15:46:18.347252 4907 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-4z8ql"] Feb 26 15:46:18 crc kubenswrapper[4907]: I0226 15:46:18.347727 4907 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-4z8ql" Feb 26 15:46:18 crc kubenswrapper[4907]: I0226 15:46:18.348016 4907 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-console/downloads-7954f5f757-wcgj6"] Feb 26 15:46:18 crc kubenswrapper[4907]: I0226 15:46:18.348503 4907 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/downloads-7954f5f757-wcgj6" Feb 26 15:46:18 crc kubenswrapper[4907]: I0226 15:46:18.350876 4907 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-dns-operator/dns-operator-744455d44c-tvpcl"] Feb 26 15:46:18 crc kubenswrapper[4907]: I0226 15:46:18.351417 4907 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns-operator/dns-operator-744455d44c-tvpcl" Feb 26 15:46:18 crc kubenswrapper[4907]: I0226 15:46:18.353177 4907 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"kube-root-ca.crt" Feb 26 15:46:18 crc kubenswrapper[4907]: I0226 15:46:18.353357 4907 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"openshift-service-ca.crt" Feb 26 15:46:18 crc kubenswrapper[4907]: I0226 15:46:18.353417 4907 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"service-ca" Feb 26 15:46:18 crc kubenswrapper[4907]: I0226 15:46:18.353642 4907 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"console-operator-config" Feb 26 15:46:18 crc kubenswrapper[4907]: I0226 15:46:18.358125 4907 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console-operator"/"console-operator-dockercfg-4xjcr" Feb 26 15:46:18 crc kubenswrapper[4907]: I0226 15:46:18.358300 4907 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"trusted-ca" Feb 26 15:46:18 crc kubenswrapper[4907]: I0226 15:46:18.358541 4907 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"service-ca-bundle" Feb 26 15:46:18 crc kubenswrapper[4907]: I0226 15:46:18.358542 4907 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"console-oauth-config" Feb 26 15:46:18 crc kubenswrapper[4907]: I0226 15:46:18.358779 4907 reflector.go:368] Caches populated for 
*v1.ConfigMap from object-"openshift-console-operator"/"kube-root-ca.crt" Feb 26 15:46:18 crc kubenswrapper[4907]: I0226 15:46:18.359086 4907 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"kube-root-ca.crt" Feb 26 15:46:18 crc kubenswrapper[4907]: I0226 15:46:18.359526 4907 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-ocp-branding-template" Feb 26 15:46:18 crc kubenswrapper[4907]: I0226 15:46:18.360800 4907 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console-operator"/"serving-cert" Feb 26 15:46:18 crc kubenswrapper[4907]: I0226 15:46:18.373856 4907 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication-operator"/"serving-cert" Feb 26 15:46:18 crc kubenswrapper[4907]: I0226 15:46:18.374121 4907 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"authentication-operator-config" Feb 26 15:46:18 crc kubenswrapper[4907]: I0226 15:46:18.378016 4907 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication-operator"/"authentication-operator-dockercfg-mz9bj" Feb 26 15:46:18 crc kubenswrapper[4907]: I0226 15:46:18.380583 4907 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-lknds"] Feb 26 15:46:18 crc kubenswrapper[4907]: I0226 15:46:18.381626 4907 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager-operator"/"openshift-controller-manager-operator-config" Feb 26 15:46:18 crc kubenswrapper[4907]: I0226 15:46:18.382065 4907 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-etcd-operator/etcd-operator-b45778765-8wmgt"] Feb 26 15:46:18 crc kubenswrapper[4907]: I0226 15:46:18.382429 4907 reflector.go:368] Caches populated for *v1.ConfigMap from 
object-"openshift-controller-manager-operator"/"openshift-service-ca.crt" Feb 26 15:46:18 crc kubenswrapper[4907]: I0226 15:46:18.383249 4907 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-etcd-operator/etcd-operator-b45778765-8wmgt" Feb 26 15:46:18 crc kubenswrapper[4907]: I0226 15:46:18.384286 4907 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager-operator"/"kube-root-ca.crt" Feb 26 15:46:18 crc kubenswrapper[4907]: I0226 15:46:18.405665 4907 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"trusted-ca-bundle" Feb 26 15:46:18 crc kubenswrapper[4907]: I0226 15:46:18.410944 4907 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"openshift-service-ca.crt" Feb 26 15:46:18 crc kubenswrapper[4907]: I0226 15:46:18.410982 4907 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns-operator"/"openshift-service-ca.crt" Feb 26 15:46:18 crc kubenswrapper[4907]: I0226 15:46:18.411251 4907 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-lknds" Feb 26 15:46:18 crc kubenswrapper[4907]: I0226 15:46:18.411482 4907 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager-operator"/"openshift-controller-manager-operator-dockercfg-vw8fw" Feb 26 15:46:18 crc kubenswrapper[4907]: I0226 15:46:18.411871 4907 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns-operator"/"metrics-tls" Feb 26 15:46:18 crc kubenswrapper[4907]: I0226 15:46:18.412844 4907 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"trusted-ca-bundle" Feb 26 15:46:18 crc kubenswrapper[4907]: I0226 15:46:18.412978 4907 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager-operator"/"openshift-controller-manager-operator-serving-cert" Feb 26 15:46:18 crc kubenswrapper[4907]: I0226 15:46:18.416653 4907 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"default-dockercfg-chnjx" Feb 26 15:46:18 crc kubenswrapper[4907]: I0226 15:46:18.416713 4907 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-image-registry/image-registry-697d97f7c8-kqtml"] Feb 26 15:46:18 crc kubenswrapper[4907]: I0226 15:46:18.417294 4907 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-dvcn5"] Feb 26 15:46:18 crc kubenswrapper[4907]: I0226 15:46:18.417682 4907 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-config-operator/machine-config-operator-74547568cd-4z9rn"] Feb 26 15:46:18 crc kubenswrapper[4907]: I0226 15:46:18.417769 4907 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns-operator"/"dns-operator-dockercfg-9mqw5" Feb 26 15:46:18 crc kubenswrapper[4907]: I0226 15:46:18.417799 4907 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/image-registry-697d97f7c8-kqtml" Feb 26 15:46:18 crc kubenswrapper[4907]: I0226 15:46:18.417909 4907 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns-operator"/"kube-root-ca.crt" Feb 26 15:46:18 crc kubenswrapper[4907]: I0226 15:46:18.418148 4907 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/marketplace-operator-79b997595-dvcn5" Feb 26 15:46:18 crc kubenswrapper[4907]: I0226 15:46:18.424707 4907 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-w9nx4"] Feb 26 15:46:18 crc kubenswrapper[4907]: I0226 15:46:18.425064 4907 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/machine-config-operator-74547568cd-4z9rn" Feb 26 15:46:18 crc kubenswrapper[4907]: I0226 15:46:18.425291 4907 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-ingress-operator/ingress-operator-5b745b69d9-gnh8z"] Feb 26 15:46:18 crc kubenswrapper[4907]: I0226 15:46:18.425618 4907 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-w9nx4" Feb 26 15:46:18 crc kubenswrapper[4907]: I0226 15:46:18.425726 4907 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"kube-root-ca.crt" Feb 26 15:46:18 crc kubenswrapper[4907]: I0226 15:46:18.428669 4907 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"trusted-ca-bundle" Feb 26 15:46:18 crc kubenswrapper[4907]: I0226 15:46:18.432206 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/36952148-e6b5-4c20-8016-3de7f571420e-service-ca-bundle\") pod \"authentication-operator-69f744f599-wtjfv\" (UID: \"36952148-e6b5-4c20-8016-3de7f571420e\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-wtjfv" Feb 26 15:46:18 crc kubenswrapper[4907]: I0226 15:46:18.432247 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/0383e657-c434-43b2-878b-314ce5a2339e-serving-cert\") pod \"openshift-config-operator-7777fb866f-2f4gk\" (UID: \"0383e657-c434-43b2-878b-314ce5a2339e\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-2f4gk" Feb 26 15:46:18 crc kubenswrapper[4907]: I0226 15:46:18.433351 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f1a111d0-85de-4328-90ac-9b9af3edbc49-config\") pod \"apiserver-76f77b778f-5tc4m\" (UID: \"f1a111d0-85de-4328-90ac-9b9af3edbc49\") " pod="openshift-apiserver/apiserver-76f77b778f-5tc4m" Feb 26 15:46:18 crc kubenswrapper[4907]: I0226 15:46:18.433382 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/f1a111d0-85de-4328-90ac-9b9af3edbc49-serving-cert\") pod 
\"apiserver-76f77b778f-5tc4m\" (UID: \"f1a111d0-85de-4328-90ac-9b9af3edbc49\") " pod="openshift-apiserver/apiserver-76f77b778f-5tc4m" Feb 26 15:46:18 crc kubenswrapper[4907]: I0226 15:46:18.433399 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/f1a111d0-85de-4328-90ac-9b9af3edbc49-encryption-config\") pod \"apiserver-76f77b778f-5tc4m\" (UID: \"f1a111d0-85de-4328-90ac-9b9af3edbc49\") " pod="openshift-apiserver/apiserver-76f77b778f-5tc4m" Feb 26 15:46:18 crc kubenswrapper[4907]: I0226 15:46:18.433432 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lclpq\" (UniqueName: \"kubernetes.io/projected/ef4c8a6a-c008-406e-8aed-2164e582f710-kube-api-access-lclpq\") pod \"cluster-image-registry-operator-dc59b4c8b-lknds\" (UID: \"ef4c8a6a-c008-406e-8aed-2164e582f710\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-lknds" Feb 26 15:46:18 crc kubenswrapper[4907]: I0226 15:46:18.433472 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/489d8c16-01bf-466b-a863-a3c8594d8b88-images\") pod \"machine-api-operator-5694c8668f-hdqkt\" (UID: \"489d8c16-01bf-466b-a863-a3c8594d8b88\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-hdqkt" Feb 26 15:46:18 crc kubenswrapper[4907]: I0226 15:46:18.435031 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f1a111d0-85de-4328-90ac-9b9af3edbc49-config\") pod \"apiserver-76f77b778f-5tc4m\" (UID: \"f1a111d0-85de-4328-90ac-9b9af3edbc49\") " pod="openshift-apiserver/apiserver-76f77b778f-5tc4m" Feb 26 15:46:18 crc kubenswrapper[4907]: I0226 15:46:18.436212 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-config\" (UniqueName: 
\"kubernetes.io/configmap/0eb5d98d-ab9b-47bd-b8ca-27340a4d6a4f-console-config\") pod \"console-f9d7485db-9lx5z\" (UID: \"0eb5d98d-ab9b-47bd-b8ca-27340a4d6a4f\") " pod="openshift-console/console-f9d7485db-9lx5z" Feb 26 15:46:18 crc kubenswrapper[4907]: I0226 15:46:18.436239 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/0eb5d98d-ab9b-47bd-b8ca-27340a4d6a4f-service-ca\") pod \"console-f9d7485db-9lx5z\" (UID: \"0eb5d98d-ab9b-47bd-b8ca-27340a4d6a4f\") " pod="openshift-console/console-f9d7485db-9lx5z" Feb 26 15:46:18 crc kubenswrapper[4907]: I0226 15:46:18.436260 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/ca3ab95b-79df-45b9-9ada-c7c713e2e3e6-v4-0-config-user-template-login\") pod \"oauth-openshift-558db77b4-lr7kc\" (UID: \"ca3ab95b-79df-45b9-9ada-c7c713e2e3e6\") " pod="openshift-authentication/oauth-openshift-558db77b4-lr7kc" Feb 26 15:46:18 crc kubenswrapper[4907]: I0226 15:46:18.436284 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/1c26ef74-f7b8-4cc3-ae04-783bfa2b38b4-etcd-serving-ca\") pod \"apiserver-7bbb656c7d-vnrdg\" (UID: \"1c26ef74-f7b8-4cc3-ae04-783bfa2b38b4\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-vnrdg" Feb 26 15:46:18 crc kubenswrapper[4907]: I0226 15:46:18.436300 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/ef4c8a6a-c008-406e-8aed-2164e582f710-trusted-ca\") pod \"cluster-image-registry-operator-dc59b4c8b-lknds\" (UID: \"ef4c8a6a-c008-406e-8aed-2164e582f710\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-lknds" Feb 26 15:46:18 crc kubenswrapper[4907]: I0226 15:46:18.436340 4907 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"kube-api-access-vdtmp\" (UniqueName: \"kubernetes.io/projected/bbef2e1f-1be1-4624-804c-45892231df1e-kube-api-access-vdtmp\") pod \"controller-manager-879f6c89f-p9vbb\" (UID: \"bbef2e1f-1be1-4624-804c-45892231df1e\") " pod="openshift-controller-manager/controller-manager-879f6c89f-p9vbb" Feb 26 15:46:18 crc kubenswrapper[4907]: I0226 15:46:18.436374 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/54942a44-6e66-4757-8106-bbe836a2d8ca-serving-cert\") pod \"route-controller-manager-6576b87f9c-96swm\" (UID: \"54942a44-6e66-4757-8106-bbe836a2d8ca\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-96swm" Feb 26 15:46:18 crc kubenswrapper[4907]: I0226 15:46:18.436396 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/bbef2e1f-1be1-4624-804c-45892231df1e-config\") pod \"controller-manager-879f6c89f-p9vbb\" (UID: \"bbef2e1f-1be1-4624-804c-45892231df1e\") " pod="openshift-controller-manager/controller-manager-879f6c89f-p9vbb" Feb 26 15:46:18 crc kubenswrapper[4907]: I0226 15:46:18.436412 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/ca3ab95b-79df-45b9-9ada-c7c713e2e3e6-v4-0-config-system-router-certs\") pod \"oauth-openshift-558db77b4-lr7kc\" (UID: \"ca3ab95b-79df-45b9-9ada-c7c713e2e3e6\") " pod="openshift-authentication/oauth-openshift-558db77b4-lr7kc" Feb 26 15:46:18 crc kubenswrapper[4907]: I0226 15:46:18.436432 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rfbmc\" (UniqueName: \"kubernetes.io/projected/2e969445-2d6b-4ea1-bd4b-3473a66e8c91-kube-api-access-rfbmc\") pod \"downloads-7954f5f757-wcgj6\" (UID: \"2e969445-2d6b-4ea1-bd4b-3473a66e8c91\") " 
pod="openshift-console/downloads-7954f5f757-wcgj6" Feb 26 15:46:18 crc kubenswrapper[4907]: I0226 15:46:18.436451 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"image-registry-operator-tls\" (UniqueName: \"kubernetes.io/secret/ef4c8a6a-c008-406e-8aed-2164e582f710-image-registry-operator-tls\") pod \"cluster-image-registry-operator-dc59b4c8b-lknds\" (UID: \"ef4c8a6a-c008-406e-8aed-2164e582f710\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-lknds" Feb 26 15:46:18 crc kubenswrapper[4907]: I0226 15:46:18.436467 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/0eb5d98d-ab9b-47bd-b8ca-27340a4d6a4f-oauth-serving-cert\") pod \"console-f9d7485db-9lx5z\" (UID: \"0eb5d98d-ab9b-47bd-b8ca-27340a4d6a4f\") " pod="openshift-console/console-f9d7485db-9lx5z" Feb 26 15:46:18 crc kubenswrapper[4907]: I0226 15:46:18.436484 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/ca3ab95b-79df-45b9-9ada-c7c713e2e3e6-v4-0-config-system-ocp-branding-template\") pod \"oauth-openshift-558db77b4-lr7kc\" (UID: \"ca3ab95b-79df-45b9-9ada-c7c713e2e3e6\") " pod="openshift-authentication/oauth-openshift-558db77b4-lr7kc" Feb 26 15:46:18 crc kubenswrapper[4907]: I0226 15:46:18.436502 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"available-featuregates\" (UniqueName: \"kubernetes.io/empty-dir/0383e657-c434-43b2-878b-314ce5a2339e-available-featuregates\") pod \"openshift-config-operator-7777fb866f-2f4gk\" (UID: \"0383e657-c434-43b2-878b-314ce5a2339e\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-2f4gk" Feb 26 15:46:18 crc kubenswrapper[4907]: I0226 15:46:18.436521 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for 
volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/ca3ab95b-79df-45b9-9ada-c7c713e2e3e6-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-558db77b4-lr7kc\" (UID: \"ca3ab95b-79df-45b9-9ada-c7c713e2e3e6\") " pod="openshift-authentication/oauth-openshift-558db77b4-lr7kc" Feb 26 15:46:18 crc kubenswrapper[4907]: I0226 15:46:18.436538 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/f1a111d0-85de-4328-90ac-9b9af3edbc49-audit-dir\") pod \"apiserver-76f77b778f-5tc4m\" (UID: \"f1a111d0-85de-4328-90ac-9b9af3edbc49\") " pod="openshift-apiserver/apiserver-76f77b778f-5tc4m" Feb 26 15:46:18 crc kubenswrapper[4907]: I0226 15:46:18.436556 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/1c26ef74-f7b8-4cc3-ae04-783bfa2b38b4-etcd-client\") pod \"apiserver-7bbb656c7d-vnrdg\" (UID: \"1c26ef74-f7b8-4cc3-ae04-783bfa2b38b4\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-vnrdg" Feb 26 15:46:18 crc kubenswrapper[4907]: I0226 15:46:18.436578 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/774f91c0-0433-43a5-8b33-18a5253ba0a3-serving-cert\") pod \"openshift-apiserver-operator-796bbdcf4f-wgl2p\" (UID: \"774f91c0-0433-43a5-8b33-18a5253ba0a3\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-wgl2p" Feb 26 15:46:18 crc kubenswrapper[4907]: I0226 15:46:18.436611 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/ca3ab95b-79df-45b9-9ada-c7c713e2e3e6-v4-0-config-system-service-ca\") pod \"oauth-openshift-558db77b4-lr7kc\" (UID: \"ca3ab95b-79df-45b9-9ada-c7c713e2e3e6\") " pod="openshift-authentication/oauth-openshift-558db77b4-lr7kc" Feb 26 15:46:18 
crc kubenswrapper[4907]: I0226 15:46:18.436627 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/f1a111d0-85de-4328-90ac-9b9af3edbc49-etcd-client\") pod \"apiserver-76f77b778f-5tc4m\" (UID: \"f1a111d0-85de-4328-90ac-9b9af3edbc49\") " pod="openshift-apiserver/apiserver-76f77b778f-5tc4m" Feb 26 15:46:18 crc kubenswrapper[4907]: I0226 15:46:18.436642 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"machine-approver-tls\" (UniqueName: \"kubernetes.io/secret/05bd4fd2-624b-4b9c-b6a7-74cfce90e1d7-machine-approver-tls\") pod \"machine-approver-56656f9798-m7jwg\" (UID: \"05bd4fd2-624b-4b9c-b6a7-74cfce90e1d7\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-m7jwg" Feb 26 15:46:18 crc kubenswrapper[4907]: I0226 15:46:18.436659 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/1dc6224f-2bf9-4c28-a6df-30a177430c08-samples-operator-tls\") pod \"cluster-samples-operator-665b6dd947-xvqkl\" (UID: \"1dc6224f-2bf9-4c28-a6df-30a177430c08\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-xvqkl" Feb 26 15:46:18 crc kubenswrapper[4907]: I0226 15:46:18.436678 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/0eb5d98d-ab9b-47bd-b8ca-27340a4d6a4f-console-serving-cert\") pod \"console-f9d7485db-9lx5z\" (UID: \"0eb5d98d-ab9b-47bd-b8ca-27340a4d6a4f\") " pod="openshift-console/console-f9d7485db-9lx5z" Feb 26 15:46:18 crc kubenswrapper[4907]: I0226 15:46:18.436697 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hnc2d\" (UniqueName: \"kubernetes.io/projected/489d8c16-01bf-466b-a863-a3c8594d8b88-kube-api-access-hnc2d\") pod \"machine-api-operator-5694c8668f-hdqkt\" (UID: 
\"489d8c16-01bf-466b-a863-a3c8594d8b88\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-hdqkt" Feb 26 15:46:18 crc kubenswrapper[4907]: I0226 15:46:18.436713 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/0eb5d98d-ab9b-47bd-b8ca-27340a4d6a4f-console-oauth-config\") pod \"console-f9d7485db-9lx5z\" (UID: \"0eb5d98d-ab9b-47bd-b8ca-27340a4d6a4f\") " pod="openshift-console/console-f9d7485db-9lx5z" Feb 26 15:46:18 crc kubenswrapper[4907]: I0226 15:46:18.436730 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/bbef2e1f-1be1-4624-804c-45892231df1e-serving-cert\") pod \"controller-manager-879f6c89f-p9vbb\" (UID: \"bbef2e1f-1be1-4624-804c-45892231df1e\") " pod="openshift-controller-manager/controller-manager-879f6c89f-p9vbb" Feb 26 15:46:18 crc kubenswrapper[4907]: I0226 15:46:18.436745 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/ca3ab95b-79df-45b9-9ada-c7c713e2e3e6-v4-0-config-system-serving-cert\") pod \"oauth-openshift-558db77b4-lr7kc\" (UID: \"ca3ab95b-79df-45b9-9ada-c7c713e2e3e6\") " pod="openshift-authentication/oauth-openshift-558db77b4-lr7kc" Feb 26 15:46:18 crc kubenswrapper[4907]: I0226 15:46:18.436763 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/ca3ab95b-79df-45b9-9ada-c7c713e2e3e6-audit-policies\") pod \"oauth-openshift-558db77b4-lr7kc\" (UID: \"ca3ab95b-79df-45b9-9ada-c7c713e2e3e6\") " pod="openshift-authentication/oauth-openshift-558db77b4-lr7kc" Feb 26 15:46:18 crc kubenswrapper[4907]: I0226 15:46:18.436779 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"encryption-config\" (UniqueName: 
\"kubernetes.io/secret/1c26ef74-f7b8-4cc3-ae04-783bfa2b38b4-encryption-config\") pod \"apiserver-7bbb656c7d-vnrdg\" (UID: \"1c26ef74-f7b8-4cc3-ae04-783bfa2b38b4\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-vnrdg" Feb 26 15:46:18 crc kubenswrapper[4907]: I0226 15:46:18.436793 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/774f91c0-0433-43a5-8b33-18a5253ba0a3-config\") pod \"openshift-apiserver-operator-796bbdcf4f-wgl2p\" (UID: \"774f91c0-0433-43a5-8b33-18a5253ba0a3\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-wgl2p" Feb 26 15:46:18 crc kubenswrapper[4907]: I0226 15:46:18.436809 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/05bd4fd2-624b-4b9c-b6a7-74cfce90e1d7-config\") pod \"machine-approver-56656f9798-m7jwg\" (UID: \"05bd4fd2-624b-4b9c-b6a7-74cfce90e1d7\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-m7jwg" Feb 26 15:46:18 crc kubenswrapper[4907]: I0226 15:46:18.436828 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/ca3ab95b-79df-45b9-9ada-c7c713e2e3e6-v4-0-config-system-cliconfig\") pod \"oauth-openshift-558db77b4-lr7kc\" (UID: \"ca3ab95b-79df-45b9-9ada-c7c713e2e3e6\") " pod="openshift-authentication/oauth-openshift-558db77b4-lr7kc" Feb 26 15:46:18 crc kubenswrapper[4907]: I0226 15:46:18.436843 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/54942a44-6e66-4757-8106-bbe836a2d8ca-config\") pod \"route-controller-manager-6576b87f9c-96swm\" (UID: \"54942a44-6e66-4757-8106-bbe836a2d8ca\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-96swm" Feb 26 15:46:18 crc kubenswrapper[4907]: I0226 15:46:18.436859 4907 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2m5mf\" (UniqueName: \"kubernetes.io/projected/f1a111d0-85de-4328-90ac-9b9af3edbc49-kube-api-access-2m5mf\") pod \"apiserver-76f77b778f-5tc4m\" (UID: \"f1a111d0-85de-4328-90ac-9b9af3edbc49\") " pod="openshift-apiserver/apiserver-76f77b778f-5tc4m" Feb 26 15:46:18 crc kubenswrapper[4907]: I0226 15:46:18.436875 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/bbef2e1f-1be1-4624-804c-45892231df1e-proxy-ca-bundles\") pod \"controller-manager-879f6c89f-p9vbb\" (UID: \"bbef2e1f-1be1-4624-804c-45892231df1e\") " pod="openshift-controller-manager/controller-manager-879f6c89f-p9vbb" Feb 26 15:46:18 crc kubenswrapper[4907]: I0226 15:46:18.436891 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/ca3ab95b-79df-45b9-9ada-c7c713e2e3e6-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-558db77b4-lr7kc\" (UID: \"ca3ab95b-79df-45b9-9ada-c7c713e2e3e6\") " pod="openshift-authentication/oauth-openshift-558db77b4-lr7kc" Feb 26 15:46:18 crc kubenswrapper[4907]: I0226 15:46:18.436905 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lzv9t\" (UniqueName: \"kubernetes.io/projected/ca3ab95b-79df-45b9-9ada-c7c713e2e3e6-kube-api-access-lzv9t\") pod \"oauth-openshift-558db77b4-lr7kc\" (UID: \"ca3ab95b-79df-45b9-9ada-c7c713e2e3e6\") " pod="openshift-authentication/oauth-openshift-558db77b4-lr7kc" Feb 26 15:46:18 crc kubenswrapper[4907]: I0226 15:46:18.436922 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/489d8c16-01bf-466b-a863-a3c8594d8b88-config\") pod \"machine-api-operator-5694c8668f-hdqkt\" (UID: \"489d8c16-01bf-466b-a863-a3c8594d8b88\") " 
pod="openshift-machine-api/machine-api-operator-5694c8668f-hdqkt" Feb 26 15:46:18 crc kubenswrapper[4907]: I0226 15:46:18.436939 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-46vwj\" (UniqueName: \"kubernetes.io/projected/54942a44-6e66-4757-8106-bbe836a2d8ca-kube-api-access-46vwj\") pod \"route-controller-manager-6576b87f9c-96swm\" (UID: \"54942a44-6e66-4757-8106-bbe836a2d8ca\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-96swm" Feb 26 15:46:18 crc kubenswrapper[4907]: I0226 15:46:18.436953 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit\" (UniqueName: \"kubernetes.io/configmap/f1a111d0-85de-4328-90ac-9b9af3edbc49-audit\") pod \"apiserver-76f77b778f-5tc4m\" (UID: \"f1a111d0-85de-4328-90ac-9b9af3edbc49\") " pod="openshift-apiserver/apiserver-76f77b778f-5tc4m" Feb 26 15:46:18 crc kubenswrapper[4907]: I0226 15:46:18.436970 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/ef4c8a6a-c008-406e-8aed-2164e582f710-bound-sa-token\") pod \"cluster-image-registry-operator-dc59b4c8b-lknds\" (UID: \"ef4c8a6a-c008-406e-8aed-2164e582f710\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-lknds" Feb 26 15:46:18 crc kubenswrapper[4907]: I0226 15:46:18.436987 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8f4c2\" (UniqueName: \"kubernetes.io/projected/36952148-e6b5-4c20-8016-3de7f571420e-kube-api-access-8f4c2\") pod \"authentication-operator-69f744f599-wtjfv\" (UID: \"36952148-e6b5-4c20-8016-3de7f571420e\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-wtjfv" Feb 26 15:46:18 crc kubenswrapper[4907]: I0226 15:46:18.437002 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-policies\" 
(UniqueName: \"kubernetes.io/configmap/1c26ef74-f7b8-4cc3-ae04-783bfa2b38b4-audit-policies\") pod \"apiserver-7bbb656c7d-vnrdg\" (UID: \"1c26ef74-f7b8-4cc3-ae04-783bfa2b38b4\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-vnrdg" Feb 26 15:46:18 crc kubenswrapper[4907]: I0226 15:46:18.437016 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1c26ef74-f7b8-4cc3-ae04-783bfa2b38b4-serving-cert\") pod \"apiserver-7bbb656c7d-vnrdg\" (UID: \"1c26ef74-f7b8-4cc3-ae04-783bfa2b38b4\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-vnrdg" Feb 26 15:46:18 crc kubenswrapper[4907]: I0226 15:46:18.440706 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-72gcp\" (UniqueName: \"kubernetes.io/projected/1dc6224f-2bf9-4c28-a6df-30a177430c08-kube-api-access-72gcp\") pod \"cluster-samples-operator-665b6dd947-xvqkl\" (UID: \"1dc6224f-2bf9-4c28-a6df-30a177430c08\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-xvqkl" Feb 26 15:46:18 crc kubenswrapper[4907]: I0226 15:46:18.440738 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cpn27\" (UniqueName: \"kubernetes.io/projected/0eb5d98d-ab9b-47bd-b8ca-27340a4d6a4f-kube-api-access-cpn27\") pod \"console-f9d7485db-9lx5z\" (UID: \"0eb5d98d-ab9b-47bd-b8ca-27340a4d6a4f\") " pod="openshift-console/console-f9d7485db-9lx5z" Feb 26 15:46:18 crc kubenswrapper[4907]: I0226 15:46:18.440758 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nmbrj\" (UniqueName: \"kubernetes.io/projected/774f91c0-0433-43a5-8b33-18a5253ba0a3-kube-api-access-nmbrj\") pod \"openshift-apiserver-operator-796bbdcf4f-wgl2p\" (UID: \"774f91c0-0433-43a5-8b33-18a5253ba0a3\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-wgl2p" Feb 26 15:46:18 crc 
kubenswrapper[4907]: I0226 15:46:18.440778 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9gf4p\" (UniqueName: \"kubernetes.io/projected/0383e657-c434-43b2-878b-314ce5a2339e-kube-api-access-9gf4p\") pod \"openshift-config-operator-7777fb866f-2f4gk\" (UID: \"0383e657-c434-43b2-878b-314ce5a2339e\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-2f4gk" Feb 26 15:46:18 crc kubenswrapper[4907]: I0226 15:46:18.440798 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/f1a111d0-85de-4328-90ac-9b9af3edbc49-trusted-ca-bundle\") pod \"apiserver-76f77b778f-5tc4m\" (UID: \"f1a111d0-85de-4328-90ac-9b9af3edbc49\") " pod="openshift-apiserver/apiserver-76f77b778f-5tc4m" Feb 26 15:46:18 crc kubenswrapper[4907]: I0226 15:46:18.440838 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/ca3ab95b-79df-45b9-9ada-c7c713e2e3e6-audit-dir\") pod \"oauth-openshift-558db77b4-lr7kc\" (UID: \"ca3ab95b-79df-45b9-9ada-c7c713e2e3e6\") " pod="openshift-authentication/oauth-openshift-558db77b4-lr7kc" Feb 26 15:46:18 crc kubenswrapper[4907]: I0226 15:46:18.440859 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/36952148-e6b5-4c20-8016-3de7f571420e-serving-cert\") pod \"authentication-operator-69f744f599-wtjfv\" (UID: \"36952148-e6b5-4c20-8016-3de7f571420e\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-wtjfv" Feb 26 15:46:18 crc kubenswrapper[4907]: I0226 15:46:18.440881 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/f1a111d0-85de-4328-90ac-9b9af3edbc49-node-pullsecrets\") pod \"apiserver-76f77b778f-5tc4m\" (UID: 
\"f1a111d0-85de-4328-90ac-9b9af3edbc49\") " pod="openshift-apiserver/apiserver-76f77b778f-5tc4m" Feb 26 15:46:18 crc kubenswrapper[4907]: I0226 15:46:18.440901 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/bbef2e1f-1be1-4624-804c-45892231df1e-client-ca\") pod \"controller-manager-879f6c89f-p9vbb\" (UID: \"bbef2e1f-1be1-4624-804c-45892231df1e\") " pod="openshift-controller-manager/controller-manager-879f6c89f-p9vbb" Feb 26 15:46:18 crc kubenswrapper[4907]: I0226 15:46:18.440937 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/ca3ab95b-79df-45b9-9ada-c7c713e2e3e6-v4-0-config-user-idp-0-file-data\") pod \"oauth-openshift-558db77b4-lr7kc\" (UID: \"ca3ab95b-79df-45b9-9ada-c7c713e2e3e6\") " pod="openshift-authentication/oauth-openshift-558db77b4-lr7kc" Feb 26 15:46:18 crc kubenswrapper[4907]: I0226 15:46:18.440959 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/f1a111d0-85de-4328-90ac-9b9af3edbc49-etcd-serving-ca\") pod \"apiserver-76f77b778f-5tc4m\" (UID: \"f1a111d0-85de-4328-90ac-9b9af3edbc49\") " pod="openshift-apiserver/apiserver-76f77b778f-5tc4m" Feb 26 15:46:18 crc kubenswrapper[4907]: I0226 15:46:18.440981 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/ca3ab95b-79df-45b9-9ada-c7c713e2e3e6-v4-0-config-user-template-error\") pod \"oauth-openshift-558db77b4-lr7kc\" (UID: \"ca3ab95b-79df-45b9-9ada-c7c713e2e3e6\") " pod="openshift-authentication/oauth-openshift-558db77b4-lr7kc" Feb 26 15:46:18 crc kubenswrapper[4907]: I0226 15:46:18.440998 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-p8rz5\" (UniqueName: 
\"kubernetes.io/projected/05bd4fd2-624b-4b9c-b6a7-74cfce90e1d7-kube-api-access-p8rz5\") pod \"machine-approver-56656f9798-m7jwg\" (UID: \"05bd4fd2-624b-4b9c-b6a7-74cfce90e1d7\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-m7jwg" Feb 26 15:46:18 crc kubenswrapper[4907]: I0226 15:46:18.441022 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/ca3ab95b-79df-45b9-9ada-c7c713e2e3e6-v4-0-config-system-session\") pod \"oauth-openshift-558db77b4-lr7kc\" (UID: \"ca3ab95b-79df-45b9-9ada-c7c713e2e3e6\") " pod="openshift-authentication/oauth-openshift-558db77b4-lr7kc" Feb 26 15:46:18 crc kubenswrapper[4907]: I0226 15:46:18.441038 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/54942a44-6e66-4757-8106-bbe836a2d8ca-client-ca\") pod \"route-controller-manager-6576b87f9c-96swm\" (UID: \"54942a44-6e66-4757-8106-bbe836a2d8ca\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-96swm" Feb 26 15:46:18 crc kubenswrapper[4907]: I0226 15:46:18.441054 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"image-import-ca\" (UniqueName: \"kubernetes.io/configmap/f1a111d0-85de-4328-90ac-9b9af3edbc49-image-import-ca\") pod \"apiserver-76f77b778f-5tc4m\" (UID: \"f1a111d0-85de-4328-90ac-9b9af3edbc49\") " pod="openshift-apiserver/apiserver-76f77b778f-5tc4m" Feb 26 15:46:18 crc kubenswrapper[4907]: I0226 15:46:18.441070 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/05bd4fd2-624b-4b9c-b6a7-74cfce90e1d7-auth-proxy-config\") pod \"machine-approver-56656f9798-m7jwg\" (UID: \"05bd4fd2-624b-4b9c-b6a7-74cfce90e1d7\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-m7jwg" Feb 26 15:46:18 crc kubenswrapper[4907]: I0226 
15:46:18.441086 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/1c26ef74-f7b8-4cc3-ae04-783bfa2b38b4-audit-dir\") pod \"apiserver-7bbb656c7d-vnrdg\" (UID: \"1c26ef74-f7b8-4cc3-ae04-783bfa2b38b4\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-vnrdg" Feb 26 15:46:18 crc kubenswrapper[4907]: I0226 15:46:18.441103 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"machine-api-operator-tls\" (UniqueName: \"kubernetes.io/secret/489d8c16-01bf-466b-a863-a3c8594d8b88-machine-api-operator-tls\") pod \"machine-api-operator-5694c8668f-hdqkt\" (UID: \"489d8c16-01bf-466b-a863-a3c8594d8b88\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-hdqkt" Feb 26 15:46:18 crc kubenswrapper[4907]: I0226 15:46:18.441118 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/36952148-e6b5-4c20-8016-3de7f571420e-trusted-ca-bundle\") pod \"authentication-operator-69f744f599-wtjfv\" (UID: \"36952148-e6b5-4c20-8016-3de7f571420e\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-wtjfv" Feb 26 15:46:18 crc kubenswrapper[4907]: I0226 15:46:18.441136 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/1c26ef74-f7b8-4cc3-ae04-783bfa2b38b4-trusted-ca-bundle\") pod \"apiserver-7bbb656c7d-vnrdg\" (UID: \"1c26ef74-f7b8-4cc3-ae04-783bfa2b38b4\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-vnrdg" Feb 26 15:46:18 crc kubenswrapper[4907]: I0226 15:46:18.441152 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/36952148-e6b5-4c20-8016-3de7f571420e-config\") pod \"authentication-operator-69f744f599-wtjfv\" (UID: 
\"36952148-e6b5-4c20-8016-3de7f571420e\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-wtjfv" Feb 26 15:46:18 crc kubenswrapper[4907]: I0226 15:46:18.441172 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wcg8q\" (UniqueName: \"kubernetes.io/projected/1c26ef74-f7b8-4cc3-ae04-783bfa2b38b4-kube-api-access-wcg8q\") pod \"apiserver-7bbb656c7d-vnrdg\" (UID: \"1c26ef74-f7b8-4cc3-ae04-783bfa2b38b4\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-vnrdg" Feb 26 15:46:18 crc kubenswrapper[4907]: I0226 15:46:18.441191 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/0eb5d98d-ab9b-47bd-b8ca-27340a4d6a4f-trusted-ca-bundle\") pod \"console-f9d7485db-9lx5z\" (UID: \"0eb5d98d-ab9b-47bd-b8ca-27340a4d6a4f\") " pod="openshift-console/console-f9d7485db-9lx5z" Feb 26 15:46:18 crc kubenswrapper[4907]: I0226 15:46:18.441951 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/f1a111d0-85de-4328-90ac-9b9af3edbc49-audit-dir\") pod \"apiserver-76f77b778f-5tc4m\" (UID: \"f1a111d0-85de-4328-90ac-9b9af3edbc49\") " pod="openshift-apiserver/apiserver-76f77b778f-5tc4m" Feb 26 15:46:18 crc kubenswrapper[4907]: I0226 15:46:18.443901 4907 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-storage-version-migrator/migrator-59844c95c7-gtnvm"] Feb 26 15:46:18 crc kubenswrapper[4907]: I0226 15:46:18.444450 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"images\" (UniqueName: \"kubernetes.io/configmap/489d8c16-01bf-466b-a863-a3c8594d8b88-images\") pod \"machine-api-operator-5694c8668f-hdqkt\" (UID: \"489d8c16-01bf-466b-a863-a3c8594d8b88\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-hdqkt" Feb 26 15:46:18 crc kubenswrapper[4907]: I0226 15:46:18.445736 4907 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/1c26ef74-f7b8-4cc3-ae04-783bfa2b38b4-etcd-serving-ca\") pod \"apiserver-7bbb656c7d-vnrdg\" (UID: \"1c26ef74-f7b8-4cc3-ae04-783bfa2b38b4\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-vnrdg" Feb 26 15:46:18 crc kubenswrapper[4907]: I0226 15:46:18.446377 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/f1a111d0-85de-4328-90ac-9b9af3edbc49-serving-cert\") pod \"apiserver-76f77b778f-5tc4m\" (UID: \"f1a111d0-85de-4328-90ac-9b9af3edbc49\") " pod="openshift-apiserver/apiserver-76f77b778f-5tc4m" Feb 26 15:46:18 crc kubenswrapper[4907]: I0226 15:46:18.446402 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/ca3ab95b-79df-45b9-9ada-c7c713e2e3e6-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-558db77b4-lr7kc\" (UID: \"ca3ab95b-79df-45b9-9ada-c7c713e2e3e6\") " pod="openshift-authentication/oauth-openshift-558db77b4-lr7kc" Feb 26 15:46:18 crc kubenswrapper[4907]: I0226 15:46:18.446936 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/54942a44-6e66-4757-8106-bbe836a2d8ca-serving-cert\") pod \"route-controller-manager-6576b87f9c-96swm\" (UID: \"54942a44-6e66-4757-8106-bbe836a2d8ca\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-96swm" Feb 26 15:46:18 crc kubenswrapper[4907]: I0226 15:46:18.447505 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/f1a111d0-85de-4328-90ac-9b9af3edbc49-etcd-serving-ca\") pod \"apiserver-76f77b778f-5tc4m\" (UID: \"f1a111d0-85de-4328-90ac-9b9af3edbc49\") " pod="openshift-apiserver/apiserver-76f77b778f-5tc4m" Feb 26 15:46:18 crc 
kubenswrapper[4907]: I0226 15:46:18.448372 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/ca3ab95b-79df-45b9-9ada-c7c713e2e3e6-v4-0-config-system-service-ca\") pod \"oauth-openshift-558db77b4-lr7kc\" (UID: \"ca3ab95b-79df-45b9-9ada-c7c713e2e3e6\") " pod="openshift-authentication/oauth-openshift-558db77b4-lr7kc" Feb 26 15:46:18 crc kubenswrapper[4907]: I0226 15:46:18.448395 4907 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-hztww"] Feb 26 15:46:18 crc kubenswrapper[4907]: I0226 15:46:18.448631 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/bbef2e1f-1be1-4624-804c-45892231df1e-config\") pod \"controller-manager-879f6c89f-p9vbb\" (UID: \"bbef2e1f-1be1-4624-804c-45892231df1e\") " pod="openshift-controller-manager/controller-manager-879f6c89f-p9vbb" Feb 26 15:46:18 crc kubenswrapper[4907]: I0226 15:46:18.448887 4907 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-8fhkw"] Feb 26 15:46:18 crc kubenswrapper[4907]: I0226 15:46:18.449031 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/1c26ef74-f7b8-4cc3-ae04-783bfa2b38b4-audit-policies\") pod \"apiserver-7bbb656c7d-vnrdg\" (UID: \"1c26ef74-f7b8-4cc3-ae04-783bfa2b38b4\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-vnrdg" Feb 26 15:46:18 crc kubenswrapper[4907]: I0226 15:46:18.449200 4907 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-ingress/router-default-5444994796-hqs2t"] Feb 26 15:46:18 crc kubenswrapper[4907]: I0226 15:46:18.449582 4907 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-ks676"] Feb 26 
15:46:18 crc kubenswrapper[4907]: I0226 15:46:18.449918 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/05bd4fd2-624b-4b9c-b6a7-74cfce90e1d7-config\") pod \"machine-approver-56656f9798-m7jwg\" (UID: \"05bd4fd2-624b-4b9c-b6a7-74cfce90e1d7\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-m7jwg" Feb 26 15:46:18 crc kubenswrapper[4907]: I0226 15:46:18.450027 4907 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-config-operator/machine-config-controller-84d6567774-rsw5p"] Feb 26 15:46:18 crc kubenswrapper[4907]: I0226 15:46:18.451819 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/ca3ab95b-79df-45b9-9ada-c7c713e2e3e6-v4-0-config-system-cliconfig\") pod \"oauth-openshift-558db77b4-lr7kc\" (UID: \"ca3ab95b-79df-45b9-9ada-c7c713e2e3e6\") " pod="openshift-authentication/oauth-openshift-558db77b4-lr7kc" Feb 26 15:46:18 crc kubenswrapper[4907]: I0226 15:46:18.452791 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/54942a44-6e66-4757-8106-bbe836a2d8ca-config\") pod \"route-controller-manager-6576b87f9c-96swm\" (UID: \"54942a44-6e66-4757-8106-bbe836a2d8ca\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-96swm" Feb 26 15:46:18 crc kubenswrapper[4907]: I0226 15:46:18.453892 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"available-featuregates\" (UniqueName: \"kubernetes.io/empty-dir/0383e657-c434-43b2-878b-314ce5a2339e-available-featuregates\") pod \"openshift-config-operator-7777fb866f-2f4gk\" (UID: \"0383e657-c434-43b2-878b-314ce5a2339e\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-2f4gk" Feb 26 15:46:18 crc kubenswrapper[4907]: I0226 15:46:18.454301 4907 operation_generator.go:637] "MountVolume.SetUp 
succeeded for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/ca3ab95b-79df-45b9-9ada-c7c713e2e3e6-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-558db77b4-lr7kc\" (UID: \"ca3ab95b-79df-45b9-9ada-c7c713e2e3e6\") " pod="openshift-authentication/oauth-openshift-558db77b4-lr7kc" Feb 26 15:46:18 crc kubenswrapper[4907]: I0226 15:46:18.454377 4907 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-ks676" Feb 26 15:46:18 crc kubenswrapper[4907]: I0226 15:46:18.454804 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1c26ef74-f7b8-4cc3-ae04-783bfa2b38b4-serving-cert\") pod \"apiserver-7bbb656c7d-vnrdg\" (UID: \"1c26ef74-f7b8-4cc3-ae04-783bfa2b38b4\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-vnrdg" Feb 26 15:46:18 crc kubenswrapper[4907]: I0226 15:46:18.454905 4907 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-gtnvm" Feb 26 15:46:18 crc kubenswrapper[4907]: I0226 15:46:18.455026 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/489d8c16-01bf-466b-a863-a3c8594d8b88-config\") pod \"machine-api-operator-5694c8668f-hdqkt\" (UID: \"489d8c16-01bf-466b-a863-a3c8594d8b88\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-hdqkt" Feb 26 15:46:18 crc kubenswrapper[4907]: I0226 15:46:18.455152 4907 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-hztww" Feb 26 15:46:18 crc kubenswrapper[4907]: I0226 15:46:18.455357 4907 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-z6m64"] Feb 26 15:46:18 crc kubenswrapper[4907]: I0226 15:46:18.454841 4907 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress-operator/ingress-operator-5b745b69d9-gnh8z" Feb 26 15:46:18 crc kubenswrapper[4907]: I0226 15:46:18.455614 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit\" (UniqueName: \"kubernetes.io/configmap/f1a111d0-85de-4328-90ac-9b9af3edbc49-audit\") pod \"apiserver-76f77b778f-5tc4m\" (UID: \"f1a111d0-85de-4328-90ac-9b9af3edbc49\") " pod="openshift-apiserver/apiserver-76f77b778f-5tc4m" Feb 26 15:46:18 crc kubenswrapper[4907]: I0226 15:46:18.455816 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/f1a111d0-85de-4328-90ac-9b9af3edbc49-trusted-ca-bundle\") pod \"apiserver-76f77b778f-5tc4m\" (UID: \"f1a111d0-85de-4328-90ac-9b9af3edbc49\") " pod="openshift-apiserver/apiserver-76f77b778f-5tc4m" Feb 26 15:46:18 crc kubenswrapper[4907]: I0226 15:46:18.455884 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/ca3ab95b-79df-45b9-9ada-c7c713e2e3e6-audit-dir\") pod \"oauth-openshift-558db77b4-lr7kc\" (UID: \"ca3ab95b-79df-45b9-9ada-c7c713e2e3e6\") " pod="openshift-authentication/oauth-openshift-558db77b4-lr7kc" Feb 26 15:46:18 crc kubenswrapper[4907]: I0226 15:46:18.455948 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/f1a111d0-85de-4328-90ac-9b9af3edbc49-node-pullsecrets\") pod \"apiserver-76f77b778f-5tc4m\" (UID: 
\"f1a111d0-85de-4328-90ac-9b9af3edbc49\") " pod="openshift-apiserver/apiserver-76f77b778f-5tc4m" Feb 26 15:46:18 crc kubenswrapper[4907]: I0226 15:46:18.456054 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/f1a111d0-85de-4328-90ac-9b9af3edbc49-etcd-client\") pod \"apiserver-76f77b778f-5tc4m\" (UID: \"f1a111d0-85de-4328-90ac-9b9af3edbc49\") " pod="openshift-apiserver/apiserver-76f77b778f-5tc4m" Feb 26 15:46:18 crc kubenswrapper[4907]: I0226 15:46:18.456169 4907 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-multus/multus-admission-controller-857f4d67dd-fd8f2"] Feb 26 15:46:18 crc kubenswrapper[4907]: I0226 15:46:18.456240 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/05bd4fd2-624b-4b9c-b6a7-74cfce90e1d7-auth-proxy-config\") pod \"machine-approver-56656f9798-m7jwg\" (UID: \"05bd4fd2-624b-4b9c-b6a7-74cfce90e1d7\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-m7jwg" Feb 26 15:46:18 crc kubenswrapper[4907]: I0226 15:46:18.456578 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/bbef2e1f-1be1-4624-804c-45892231df1e-client-ca\") pod \"controller-manager-879f6c89f-p9vbb\" (UID: \"bbef2e1f-1be1-4624-804c-45892231df1e\") " pod="openshift-controller-manager/controller-manager-879f6c89f-p9vbb" Feb 26 15:46:18 crc kubenswrapper[4907]: I0226 15:46:18.456745 4907 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-rlmpn"] Feb 26 15:46:18 crc kubenswrapper[4907]: I0226 15:46:18.457196 4907 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29535345-b7r88"] Feb 26 15:46:18 crc kubenswrapper[4907]: I0226 15:46:18.457498 4907 operation_generator.go:637] "MountVolume.SetUp succeeded 
for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/ca3ab95b-79df-45b9-9ada-c7c713e2e3e6-v4-0-config-system-router-certs\") pod \"oauth-openshift-558db77b4-lr7kc\" (UID: \"ca3ab95b-79df-45b9-9ada-c7c713e2e3e6\") " pod="openshift-authentication/oauth-openshift-558db77b4-lr7kc" Feb 26 15:46:18 crc kubenswrapper[4907]: I0226 15:46:18.457655 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/1c26ef74-f7b8-4cc3-ae04-783bfa2b38b4-audit-dir\") pod \"apiserver-7bbb656c7d-vnrdg\" (UID: \"1c26ef74-f7b8-4cc3-ae04-783bfa2b38b4\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-vnrdg" Feb 26 15:46:18 crc kubenswrapper[4907]: I0226 15:46:18.457796 4907 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29535345-b7r88" Feb 26 15:46:18 crc kubenswrapper[4907]: I0226 15:46:18.459075 4907 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/machine-config-controller-84d6567774-rsw5p" Feb 26 15:46:18 crc kubenswrapper[4907]: I0226 15:46:18.459296 4907 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-8fhkw" Feb 26 15:46:18 crc kubenswrapper[4907]: I0226 15:46:18.455381 4907 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ingress/router-default-5444994796-hqs2t" Feb 26 15:46:18 crc kubenswrapper[4907]: I0226 15:46:18.459549 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/ca3ab95b-79df-45b9-9ada-c7c713e2e3e6-v4-0-config-system-ocp-branding-template\") pod \"oauth-openshift-558db77b4-lr7kc\" (UID: \"ca3ab95b-79df-45b9-9ada-c7c713e2e3e6\") " pod="openshift-authentication/oauth-openshift-558db77b4-lr7kc" Feb 26 15:46:18 crc kubenswrapper[4907]: I0226 15:46:18.459804 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/774f91c0-0433-43a5-8b33-18a5253ba0a3-serving-cert\") pod \"openshift-apiserver-operator-796bbdcf4f-wgl2p\" (UID: \"774f91c0-0433-43a5-8b33-18a5253ba0a3\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-wgl2p" Feb 26 15:46:18 crc kubenswrapper[4907]: I0226 15:46:18.459931 4907 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-z6m64" Feb 26 15:46:18 crc kubenswrapper[4907]: I0226 15:46:18.460147 4907 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-admission-controller-857f4d67dd-fd8f2" Feb 26 15:46:18 crc kubenswrapper[4907]: I0226 15:46:18.460331 4907 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-rlmpn" Feb 26 15:46:18 crc kubenswrapper[4907]: I0226 15:46:18.461202 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/bbef2e1f-1be1-4624-804c-45892231df1e-proxy-ca-bundles\") pod \"controller-manager-879f6c89f-p9vbb\" (UID: \"bbef2e1f-1be1-4624-804c-45892231df1e\") " pod="openshift-controller-manager/controller-manager-879f6c89f-p9vbb" Feb 26 15:46:18 crc kubenswrapper[4907]: I0226 15:46:18.461570 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/f1a111d0-85de-4328-90ac-9b9af3edbc49-encryption-config\") pod \"apiserver-76f77b778f-5tc4m\" (UID: \"f1a111d0-85de-4328-90ac-9b9af3edbc49\") " pod="openshift-apiserver/apiserver-76f77b778f-5tc4m" Feb 26 15:46:18 crc kubenswrapper[4907]: I0226 15:46:18.461857 4907 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"openshift-service-ca.crt" Feb 26 15:46:18 crc kubenswrapper[4907]: I0226 15:46:18.462346 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/1c26ef74-f7b8-4cc3-ae04-783bfa2b38b4-trusted-ca-bundle\") pod \"apiserver-7bbb656c7d-vnrdg\" (UID: \"1c26ef74-f7b8-4cc3-ae04-783bfa2b38b4\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-vnrdg" Feb 26 15:46:18 crc kubenswrapper[4907]: I0226 15:46:18.462773 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"machine-approver-tls\" (UniqueName: \"kubernetes.io/secret/05bd4fd2-624b-4b9c-b6a7-74cfce90e1d7-machine-approver-tls\") pod \"machine-approver-56656f9798-m7jwg\" (UID: \"05bd4fd2-624b-4b9c-b6a7-74cfce90e1d7\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-m7jwg" Feb 26 15:46:18 crc kubenswrapper[4907]: I0226 15:46:18.463302 4907 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/ca3ab95b-79df-45b9-9ada-c7c713e2e3e6-v4-0-config-user-template-login\") pod \"oauth-openshift-558db77b4-lr7kc\" (UID: \"ca3ab95b-79df-45b9-9ada-c7c713e2e3e6\") " pod="openshift-authentication/oauth-openshift-558db77b4-lr7kc" Feb 26 15:46:18 crc kubenswrapper[4907]: I0226 15:46:18.464432 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"image-import-ca\" (UniqueName: \"kubernetes.io/configmap/f1a111d0-85de-4328-90ac-9b9af3edbc49-image-import-ca\") pod \"apiserver-76f77b778f-5tc4m\" (UID: \"f1a111d0-85de-4328-90ac-9b9af3edbc49\") " pod="openshift-apiserver/apiserver-76f77b778f-5tc4m" Feb 26 15:46:18 crc kubenswrapper[4907]: I0226 15:46:18.465049 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/ca3ab95b-79df-45b9-9ada-c7c713e2e3e6-audit-policies\") pod \"oauth-openshift-558db77b4-lr7kc\" (UID: \"ca3ab95b-79df-45b9-9ada-c7c713e2e3e6\") " pod="openshift-authentication/oauth-openshift-558db77b4-lr7kc" Feb 26 15:46:18 crc kubenswrapper[4907]: I0226 15:46:18.465648 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/774f91c0-0433-43a5-8b33-18a5253ba0a3-config\") pod \"openshift-apiserver-operator-796bbdcf4f-wgl2p\" (UID: \"774f91c0-0433-43a5-8b33-18a5253ba0a3\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-wgl2p" Feb 26 15:46:18 crc kubenswrapper[4907]: I0226 15:46:18.466009 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/1c26ef74-f7b8-4cc3-ae04-783bfa2b38b4-etcd-client\") pod \"apiserver-7bbb656c7d-vnrdg\" (UID: \"1c26ef74-f7b8-4cc3-ae04-783bfa2b38b4\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-vnrdg" Feb 26 15:46:18 crc kubenswrapper[4907]: 
I0226 15:46:18.466051 4907 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29535346-hhrww"] Feb 26 15:46:18 crc kubenswrapper[4907]: I0226 15:46:18.466583 4907 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29535346-hhrww" Feb 26 15:46:18 crc kubenswrapper[4907]: I0226 15:46:18.466950 4907 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-879f6c89f-p9vbb"] Feb 26 15:46:18 crc kubenswrapper[4907]: I0226 15:46:18.467373 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/ca3ab95b-79df-45b9-9ada-c7c713e2e3e6-v4-0-config-system-serving-cert\") pod \"oauth-openshift-558db77b4-lr7kc\" (UID: \"ca3ab95b-79df-45b9-9ada-c7c713e2e3e6\") " pod="openshift-authentication/oauth-openshift-558db77b4-lr7kc" Feb 26 15:46:18 crc kubenswrapper[4907]: I0226 15:46:18.467866 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/1c26ef74-f7b8-4cc3-ae04-783bfa2b38b4-encryption-config\") pod \"apiserver-7bbb656c7d-vnrdg\" (UID: \"1c26ef74-f7b8-4cc3-ae04-783bfa2b38b4\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-vnrdg" Feb 26 15:46:18 crc kubenswrapper[4907]: I0226 15:46:18.468701 4907 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-k2mmn"] Feb 26 15:46:18 crc kubenswrapper[4907]: I0226 15:46:18.469293 4907 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-etcd-operator"/"etcd-operator-serving-cert" Feb 26 15:46:18 crc kubenswrapper[4907]: I0226 15:46:18.469887 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"machine-api-operator-tls\" (UniqueName: 
\"kubernetes.io/secret/489d8c16-01bf-466b-a863-a3c8594d8b88-machine-api-operator-tls\") pod \"machine-api-operator-5694c8668f-hdqkt\" (UID: \"489d8c16-01bf-466b-a863-a3c8594d8b88\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-hdqkt" Feb 26 15:46:18 crc kubenswrapper[4907]: I0226 15:46:18.469078 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/ca3ab95b-79df-45b9-9ada-c7c713e2e3e6-v4-0-config-user-template-error\") pod \"oauth-openshift-558db77b4-lr7kc\" (UID: \"ca3ab95b-79df-45b9-9ada-c7c713e2e3e6\") " pod="openshift-authentication/oauth-openshift-558db77b4-lr7kc" Feb 26 15:46:18 crc kubenswrapper[4907]: I0226 15:46:18.471071 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/1dc6224f-2bf9-4c28-a6df-30a177430c08-samples-operator-tls\") pod \"cluster-samples-operator-665b6dd947-xvqkl\" (UID: \"1dc6224f-2bf9-4c28-a6df-30a177430c08\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-xvqkl" Feb 26 15:46:18 crc kubenswrapper[4907]: I0226 15:46:18.471381 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/ca3ab95b-79df-45b9-9ada-c7c713e2e3e6-v4-0-config-user-idp-0-file-data\") pod \"oauth-openshift-558db77b4-lr7kc\" (UID: \"ca3ab95b-79df-45b9-9ada-c7c713e2e3e6\") " pod="openshift-authentication/oauth-openshift-558db77b4-lr7kc" Feb 26 15:46:18 crc kubenswrapper[4907]: I0226 15:46:18.475663 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/bbef2e1f-1be1-4624-804c-45892231df1e-serving-cert\") pod \"controller-manager-879f6c89f-p9vbb\" (UID: \"bbef2e1f-1be1-4624-804c-45892231df1e\") " pod="openshift-controller-manager/controller-manager-879f6c89f-p9vbb" Feb 26 15:46:18 crc 
kubenswrapper[4907]: I0226 15:46:18.476102 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/ca3ab95b-79df-45b9-9ada-c7c713e2e3e6-v4-0-config-system-session\") pod \"oauth-openshift-558db77b4-lr7kc\" (UID: \"ca3ab95b-79df-45b9-9ada-c7c713e2e3e6\") " pod="openshift-authentication/oauth-openshift-558db77b4-lr7kc" Feb 26 15:46:18 crc kubenswrapper[4907]: I0226 15:46:18.476792 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/54942a44-6e66-4757-8106-bbe836a2d8ca-client-ca\") pod \"route-controller-manager-6576b87f9c-96swm\" (UID: \"54942a44-6e66-4757-8106-bbe836a2d8ca\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-96swm" Feb 26 15:46:18 crc kubenswrapper[4907]: I0226 15:46:18.483396 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/0383e657-c434-43b2-878b-314ce5a2339e-serving-cert\") pod \"openshift-config-operator-7777fb866f-2f4gk\" (UID: \"0383e657-c434-43b2-878b-314ce5a2339e\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-2f4gk" Feb 26 15:46:18 crc kubenswrapper[4907]: I0226 15:46:18.484402 4907 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"etcd-operator-config" Feb 26 15:46:18 crc kubenswrapper[4907]: I0226 15:46:18.489245 4907 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-api/machine-api-operator-5694c8668f-hdqkt"] Feb 26 15:46:18 crc kubenswrapper[4907]: I0226 15:46:18.489280 4907 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-service-ca/service-ca-9c57cc56f-z5rgk"] Feb 26 15:46:18 crc kubenswrapper[4907]: I0226 15:46:18.491057 4907 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29535344-fsndq"] Feb 26 15:46:18 crc kubenswrapper[4907]: I0226 
15:46:18.491279 4907 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-k2mmn" Feb 26 15:46:18 crc kubenswrapper[4907]: I0226 15:46:18.491660 4907 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-service-ca/service-ca-9c57cc56f-z5rgk" Feb 26 15:46:18 crc kubenswrapper[4907]: I0226 15:46:18.492986 4907 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-sw4qw"] Feb 26 15:46:18 crc kubenswrapper[4907]: I0226 15:46:18.494571 4907 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29535344-fsndq" Feb 26 15:46:18 crc kubenswrapper[4907]: I0226 15:46:18.494878 4907 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-apiserver/apiserver-76f77b778f-5tc4m"] Feb 26 15:46:18 crc kubenswrapper[4907]: I0226 15:46:18.494901 4907 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/catalog-operator-68c6474976-z86sf"] Feb 26 15:46:18 crc kubenswrapper[4907]: I0226 15:46:18.495117 4907 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-sw4qw" Feb 26 15:46:18 crc kubenswrapper[4907]: I0226 15:46:18.496856 4907 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6576b87f9c-96swm"] Feb 26 15:46:18 crc kubenswrapper[4907]: I0226 15:46:18.496891 4907 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-service-ca-operator/service-ca-operator-777779d784-gvx2f"] Feb 26 15:46:18 crc kubenswrapper[4907]: I0226 15:46:18.498197 4907 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-authentication/oauth-openshift-558db77b4-lr7kc"] Feb 26 15:46:18 crc kubenswrapper[4907]: I0226 15:46:18.498222 4907 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-authentication-operator/authentication-operator-69f744f599-wtjfv"] Feb 26 15:46:18 crc kubenswrapper[4907]: I0226 15:46:18.498236 4907 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-4z8ql"] Feb 26 15:46:18 crc kubenswrapper[4907]: I0226 15:46:18.498350 4907 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-service-ca-operator/service-ca-operator-777779d784-gvx2f" Feb 26 15:46:18 crc kubenswrapper[4907]: I0226 15:46:18.499168 4907 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-z86sf" Feb 26 15:46:18 crc kubenswrapper[4907]: I0226 15:46:18.500259 4907 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/downloads-7954f5f757-wcgj6"] Feb 26 15:46:18 crc kubenswrapper[4907]: I0226 15:46:18.505329 4907 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"etcd-ca-bundle" Feb 26 15:46:18 crc kubenswrapper[4907]: I0226 15:46:18.508957 4907 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-lknds"] Feb 26 15:46:18 crc kubenswrapper[4907]: I0226 15:46:18.512044 4907 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console-operator/console-operator-58897d9998-sjflz"] Feb 26 15:46:18 crc kubenswrapper[4907]: I0226 15:46:18.515961 4907 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-wgl2p"] Feb 26 15:46:18 crc kubenswrapper[4907]: I0226 15:46:18.517958 4907 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-oauth-apiserver/apiserver-7bbb656c7d-vnrdg"] Feb 26 15:46:18 crc kubenswrapper[4907]: I0226 15:46:18.521663 4907 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-dns/dns-default-6g628"] Feb 26 15:46:18 crc kubenswrapper[4907]: I0226 15:46:18.528959 4907 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-dns/dns-default-6g628" Feb 26 15:46:18 crc kubenswrapper[4907]: I0226 15:46:18.533820 4907 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-multus/multus-admission-controller-857f4d67dd-fd8f2"] Feb 26 15:46:18 crc kubenswrapper[4907]: I0226 15:46:18.534760 4907 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-vsvsw" Feb 26 15:46:18 crc kubenswrapper[4907]: I0226 15:46:18.534041 4907 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-etcd-operator"/"etcd-operator-dockercfg-r9srn" Feb 26 15:46:18 crc kubenswrapper[4907]: I0226 15:46:18.539033 4907 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-ks676"] Feb 26 15:46:18 crc kubenswrapper[4907]: I0226 15:46:18.539446 4907 patch_prober.go:28] interesting pod/machine-config-daemon-v5ng6 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 26 15:46:18 crc kubenswrapper[4907]: I0226 15:46:18.539528 4907 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-v5ng6" podUID="917eebf3-db36-47b8-af0a-b80d042fddab" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 26 15:46:18 crc kubenswrapper[4907]: I0226 15:46:18.543755 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/ef4c8a6a-c008-406e-8aed-2164e582f710-bound-sa-token\") pod \"cluster-image-registry-operator-dc59b4c8b-lknds\" (UID: \"ef4c8a6a-c008-406e-8aed-2164e582f710\") " 
pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-lknds" Feb 26 15:46:18 crc kubenswrapper[4907]: I0226 15:46:18.543789 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8f4c2\" (UniqueName: \"kubernetes.io/projected/36952148-e6b5-4c20-8016-3de7f571420e-kube-api-access-8f4c2\") pod \"authentication-operator-69f744f599-wtjfv\" (UID: \"36952148-e6b5-4c20-8016-3de7f571420e\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-wtjfv" Feb 26 15:46:18 crc kubenswrapper[4907]: I0226 15:46:18.543829 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cpn27\" (UniqueName: \"kubernetes.io/projected/0eb5d98d-ab9b-47bd-b8ca-27340a4d6a4f-kube-api-access-cpn27\") pod \"console-f9d7485db-9lx5z\" (UID: \"0eb5d98d-ab9b-47bd-b8ca-27340a4d6a4f\") " pod="openshift-console/console-f9d7485db-9lx5z" Feb 26 15:46:18 crc kubenswrapper[4907]: I0226 15:46:18.543866 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/36952148-e6b5-4c20-8016-3de7f571420e-serving-cert\") pod \"authentication-operator-69f744f599-wtjfv\" (UID: \"36952148-e6b5-4c20-8016-3de7f571420e\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-wtjfv" Feb 26 15:46:18 crc kubenswrapper[4907]: I0226 15:46:18.543867 4907 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-8fhkw"] Feb 26 15:46:18 crc kubenswrapper[4907]: I0226 15:46:18.543897 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/36952148-e6b5-4c20-8016-3de7f571420e-trusted-ca-bundle\") pod \"authentication-operator-69f744f599-wtjfv\" (UID: \"36952148-e6b5-4c20-8016-3de7f571420e\") " 
pod="openshift-authentication-operator/authentication-operator-69f744f599-wtjfv" Feb 26 15:46:18 crc kubenswrapper[4907]: I0226 15:46:18.543916 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/36952148-e6b5-4c20-8016-3de7f571420e-config\") pod \"authentication-operator-69f744f599-wtjfv\" (UID: \"36952148-e6b5-4c20-8016-3de7f571420e\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-wtjfv" Feb 26 15:46:18 crc kubenswrapper[4907]: I0226 15:46:18.543937 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/0eb5d98d-ab9b-47bd-b8ca-27340a4d6a4f-trusted-ca-bundle\") pod \"console-f9d7485db-9lx5z\" (UID: \"0eb5d98d-ab9b-47bd-b8ca-27340a4d6a4f\") " pod="openshift-console/console-f9d7485db-9lx5z" Feb 26 15:46:18 crc kubenswrapper[4907]: I0226 15:46:18.543956 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/36952148-e6b5-4c20-8016-3de7f571420e-service-ca-bundle\") pod \"authentication-operator-69f744f599-wtjfv\" (UID: \"36952148-e6b5-4c20-8016-3de7f571420e\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-wtjfv" Feb 26 15:46:18 crc kubenswrapper[4907]: I0226 15:46:18.543983 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lclpq\" (UniqueName: \"kubernetes.io/projected/ef4c8a6a-c008-406e-8aed-2164e582f710-kube-api-access-lclpq\") pod \"cluster-image-registry-operator-dc59b4c8b-lknds\" (UID: \"ef4c8a6a-c008-406e-8aed-2164e582f710\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-lknds" Feb 26 15:46:18 crc kubenswrapper[4907]: I0226 15:46:18.544006 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-config\" (UniqueName: 
\"kubernetes.io/configmap/0eb5d98d-ab9b-47bd-b8ca-27340a4d6a4f-console-config\") pod \"console-f9d7485db-9lx5z\" (UID: \"0eb5d98d-ab9b-47bd-b8ca-27340a4d6a4f\") " pod="openshift-console/console-f9d7485db-9lx5z" Feb 26 15:46:18 crc kubenswrapper[4907]: I0226 15:46:18.544021 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/0eb5d98d-ab9b-47bd-b8ca-27340a4d6a4f-service-ca\") pod \"console-f9d7485db-9lx5z\" (UID: \"0eb5d98d-ab9b-47bd-b8ca-27340a4d6a4f\") " pod="openshift-console/console-f9d7485db-9lx5z" Feb 26 15:46:18 crc kubenswrapper[4907]: I0226 15:46:18.544037 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/ef4c8a6a-c008-406e-8aed-2164e582f710-trusted-ca\") pod \"cluster-image-registry-operator-dc59b4c8b-lknds\" (UID: \"ef4c8a6a-c008-406e-8aed-2164e582f710\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-lknds" Feb 26 15:46:18 crc kubenswrapper[4907]: I0226 15:46:18.544070 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rfbmc\" (UniqueName: \"kubernetes.io/projected/2e969445-2d6b-4ea1-bd4b-3473a66e8c91-kube-api-access-rfbmc\") pod \"downloads-7954f5f757-wcgj6\" (UID: \"2e969445-2d6b-4ea1-bd4b-3473a66e8c91\") " pod="openshift-console/downloads-7954f5f757-wcgj6" Feb 26 15:46:18 crc kubenswrapper[4907]: I0226 15:46:18.544087 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"image-registry-operator-tls\" (UniqueName: \"kubernetes.io/secret/ef4c8a6a-c008-406e-8aed-2164e582f710-image-registry-operator-tls\") pod \"cluster-image-registry-operator-dc59b4c8b-lknds\" (UID: \"ef4c8a6a-c008-406e-8aed-2164e582f710\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-lknds" Feb 26 15:46:18 crc kubenswrapper[4907]: I0226 15:46:18.544105 4907 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/0eb5d98d-ab9b-47bd-b8ca-27340a4d6a4f-oauth-serving-cert\") pod \"console-f9d7485db-9lx5z\" (UID: \"0eb5d98d-ab9b-47bd-b8ca-27340a4d6a4f\") " pod="openshift-console/console-f9d7485db-9lx5z" Feb 26 15:46:18 crc kubenswrapper[4907]: I0226 15:46:18.544129 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/0eb5d98d-ab9b-47bd-b8ca-27340a4d6a4f-console-serving-cert\") pod \"console-f9d7485db-9lx5z\" (UID: \"0eb5d98d-ab9b-47bd-b8ca-27340a4d6a4f\") " pod="openshift-console/console-f9d7485db-9lx5z" Feb 26 15:46:18 crc kubenswrapper[4907]: I0226 15:46:18.544152 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/0eb5d98d-ab9b-47bd-b8ca-27340a4d6a4f-console-oauth-config\") pod \"console-f9d7485db-9lx5z\" (UID: \"0eb5d98d-ab9b-47bd-b8ca-27340a4d6a4f\") " pod="openshift-console/console-f9d7485db-9lx5z" Feb 26 15:46:18 crc kubenswrapper[4907]: I0226 15:46:18.544883 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/36952148-e6b5-4c20-8016-3de7f571420e-service-ca-bundle\") pod \"authentication-operator-69f744f599-wtjfv\" (UID: \"36952148-e6b5-4c20-8016-3de7f571420e\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-wtjfv" Feb 26 15:46:18 crc kubenswrapper[4907]: I0226 15:46:18.545132 4907 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-etcd-operator"/"etcd-client" Feb 26 15:46:18 crc kubenswrapper[4907]: I0226 15:46:18.545232 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/0eb5d98d-ab9b-47bd-b8ca-27340a4d6a4f-service-ca\") pod \"console-f9d7485db-9lx5z\" (UID: 
\"0eb5d98d-ab9b-47bd-b8ca-27340a4d6a4f\") " pod="openshift-console/console-f9d7485db-9lx5z" Feb 26 15:46:18 crc kubenswrapper[4907]: I0226 15:46:18.545271 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/0eb5d98d-ab9b-47bd-b8ca-27340a4d6a4f-oauth-serving-cert\") pod \"console-f9d7485db-9lx5z\" (UID: \"0eb5d98d-ab9b-47bd-b8ca-27340a4d6a4f\") " pod="openshift-console/console-f9d7485db-9lx5z" Feb 26 15:46:18 crc kubenswrapper[4907]: I0226 15:46:18.545748 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/0eb5d98d-ab9b-47bd-b8ca-27340a4d6a4f-console-config\") pod \"console-f9d7485db-9lx5z\" (UID: \"0eb5d98d-ab9b-47bd-b8ca-27340a4d6a4f\") " pod="openshift-console/console-f9d7485db-9lx5z" Feb 26 15:46:18 crc kubenswrapper[4907]: I0226 15:46:18.546706 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/36952148-e6b5-4c20-8016-3de7f571420e-config\") pod \"authentication-operator-69f744f599-wtjfv\" (UID: \"36952148-e6b5-4c20-8016-3de7f571420e\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-wtjfv" Feb 26 15:46:18 crc kubenswrapper[4907]: I0226 15:46:18.547774 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/0eb5d98d-ab9b-47bd-b8ca-27340a4d6a4f-trusted-ca-bundle\") pod \"console-f9d7485db-9lx5z\" (UID: \"0eb5d98d-ab9b-47bd-b8ca-27340a4d6a4f\") " pod="openshift-console/console-f9d7485db-9lx5z" Feb 26 15:46:18 crc kubenswrapper[4907]: I0226 15:46:18.547943 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/36952148-e6b5-4c20-8016-3de7f571420e-trusted-ca-bundle\") pod \"authentication-operator-69f744f599-wtjfv\" (UID: 
\"36952148-e6b5-4c20-8016-3de7f571420e\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-wtjfv" Feb 26 15:46:18 crc kubenswrapper[4907]: I0226 15:46:18.548033 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/0eb5d98d-ab9b-47bd-b8ca-27340a4d6a4f-console-oauth-config\") pod \"console-f9d7485db-9lx5z\" (UID: \"0eb5d98d-ab9b-47bd-b8ca-27340a4d6a4f\") " pod="openshift-console/console-f9d7485db-9lx5z" Feb 26 15:46:18 crc kubenswrapper[4907]: I0226 15:46:18.550516 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/0eb5d98d-ab9b-47bd-b8ca-27340a4d6a4f-console-serving-cert\") pod \"console-f9d7485db-9lx5z\" (UID: \"0eb5d98d-ab9b-47bd-b8ca-27340a4d6a4f\") " pod="openshift-console/console-f9d7485db-9lx5z" Feb 26 15:46:18 crc kubenswrapper[4907]: I0226 15:46:18.551087 4907 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-config-operator/openshift-config-operator-7777fb866f-2f4gk"] Feb 26 15:46:18 crc kubenswrapper[4907]: I0226 15:46:18.552668 4907 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-storage-version-migrator/migrator-59844c95c7-gtnvm"] Feb 26 15:46:18 crc kubenswrapper[4907]: I0226 15:46:18.553979 4907 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-w9nx4"] Feb 26 15:46:18 crc kubenswrapper[4907]: I0226 15:46:18.555049 4907 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-hztww"] Feb 26 15:46:18 crc kubenswrapper[4907]: I0226 15:46:18.556288 4907 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-xvqkl"] Feb 26 15:46:18 crc kubenswrapper[4907]: I0226 15:46:18.557068 4907 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/36952148-e6b5-4c20-8016-3de7f571420e-serving-cert\") pod \"authentication-operator-69f744f599-wtjfv\" (UID: \"36952148-e6b5-4c20-8016-3de7f571420e\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-wtjfv" Feb 26 15:46:18 crc kubenswrapper[4907]: I0226 15:46:18.557621 4907 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-z6m64"] Feb 26 15:46:18 crc kubenswrapper[4907]: I0226 15:46:18.559227 4907 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-dvcn5"] Feb 26 15:46:18 crc kubenswrapper[4907]: I0226 15:46:18.560649 4907 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-697d97f7c8-kqtml"] Feb 26 15:46:18 crc kubenswrapper[4907]: I0226 15:46:18.561710 4907 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-config-operator/machine-config-operator-74547568cd-4z9rn"] Feb 26 15:46:18 crc kubenswrapper[4907]: I0226 15:46:18.562819 4907 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-dns-operator/dns-operator-744455d44c-tvpcl"] Feb 26 15:46:18 crc kubenswrapper[4907]: I0226 15:46:18.564323 4907 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"etcd-service-ca-bundle" Feb 26 15:46:18 crc kubenswrapper[4907]: I0226 15:46:18.565495 4907 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-config-operator/machine-config-controller-84d6567774-rsw5p"] Feb 26 15:46:18 crc kubenswrapper[4907]: I0226 15:46:18.567061 4907 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-etcd-operator/etcd-operator-b45778765-8wmgt"] Feb 26 15:46:18 crc kubenswrapper[4907]: I0226 15:46:18.567793 4907 kubelet.go:2428] "SyncLoop UPDATE" source="api" 
pods=["openshift-ingress-operator/ingress-operator-5b745b69d9-gnh8z"] Feb 26 15:46:18 crc kubenswrapper[4907]: I0226 15:46:18.569021 4907 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-f9d7485db-9lx5z"] Feb 26 15:46:18 crc kubenswrapper[4907]: I0226 15:46:18.570124 4907 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-rlmpn"] Feb 26 15:46:18 crc kubenswrapper[4907]: I0226 15:46:18.571365 4907 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/catalog-operator-68c6474976-z86sf"] Feb 26 15:46:18 crc kubenswrapper[4907]: I0226 15:46:18.572430 4907 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29535344-fsndq"] Feb 26 15:46:18 crc kubenswrapper[4907]: I0226 15:46:18.573504 4907 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-service-ca/service-ca-9c57cc56f-z5rgk"] Feb 26 15:46:18 crc kubenswrapper[4907]: I0226 15:46:18.574729 4907 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-k2mmn"] Feb 26 15:46:18 crc kubenswrapper[4907]: I0226 15:46:18.575781 4907 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-service-ca-operator/service-ca-operator-777779d784-gvx2f"] Feb 26 15:46:18 crc kubenswrapper[4907]: I0226 15:46:18.576771 4907 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29535345-b7r88"] Feb 26 15:46:18 crc kubenswrapper[4907]: I0226 15:46:18.577857 4907 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-sw4qw"] Feb 26 15:46:18 crc kubenswrapper[4907]: I0226 15:46:18.579287 4907 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29535346-hhrww"] Feb 26 15:46:18 crc 
kubenswrapper[4907]: I0226 15:46:18.580441 4907 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-dns/dns-default-6g628"] Feb 26 15:46:18 crc kubenswrapper[4907]: I0226 15:46:18.582530 4907 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-ingress-canary/ingress-canary-hs7mv"] Feb 26 15:46:18 crc kubenswrapper[4907]: I0226 15:46:18.585195 4907 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-canary/ingress-canary-hs7mv"] Feb 26 15:46:18 crc kubenswrapper[4907]: I0226 15:46:18.585214 4907 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-config-operator/machine-config-server-k8lkj"] Feb 26 15:46:18 crc kubenswrapper[4907]: I0226 15:46:18.585607 4907 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress-canary/ingress-canary-hs7mv" Feb 26 15:46:18 crc kubenswrapper[4907]: I0226 15:46:18.585871 4907 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/machine-config-server-k8lkj" Feb 26 15:46:18 crc kubenswrapper[4907]: I0226 15:46:18.586121 4907 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["hostpath-provisioner/csi-hostpathplugin-l5fqj"] Feb 26 15:46:18 crc kubenswrapper[4907]: I0226 15:46:18.587402 4907 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="hostpath-provisioner/csi-hostpathplugin-l5fqj" Feb 26 15:46:18 crc kubenswrapper[4907]: I0226 15:46:18.587545 4907 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["hostpath-provisioner/csi-hostpathplugin-l5fqj"] Feb 26 15:46:18 crc kubenswrapper[4907]: I0226 15:46:18.592655 4907 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"trusted-ca" Feb 26 15:46:18 crc kubenswrapper[4907]: I0226 15:46:18.597567 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/ef4c8a6a-c008-406e-8aed-2164e582f710-trusted-ca\") pod \"cluster-image-registry-operator-dc59b4c8b-lknds\" (UID: \"ef4c8a6a-c008-406e-8aed-2164e582f710\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-lknds" Feb 26 15:46:18 crc kubenswrapper[4907]: I0226 15:46:18.606702 4907 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"cluster-image-registry-operator-dockercfg-m4qtx" Feb 26 15:46:18 crc kubenswrapper[4907]: I0226 15:46:18.624957 4907 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"image-registry-operator-tls" Feb 26 15:46:18 crc kubenswrapper[4907]: I0226 15:46:18.631264 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"image-registry-operator-tls\" (UniqueName: \"kubernetes.io/secret/ef4c8a6a-c008-406e-8aed-2164e582f710-image-registry-operator-tls\") pod \"cluster-image-registry-operator-dc59b4c8b-lknds\" (UID: \"ef4c8a6a-c008-406e-8aed-2164e582f710\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-lknds" Feb 26 15:46:18 crc kubenswrapper[4907]: I0226 15:46:18.645077 4907 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"installation-pull-secrets" Feb 26 15:46:18 crc kubenswrapper[4907]: I0226 15:46:18.664836 4907 reflector.go:368] Caches populated for 
*v1.Secret from object-"openshift-image-registry"/"registry-dockercfg-kzzsd" Feb 26 15:46:18 crc kubenswrapper[4907]: I0226 15:46:18.684448 4907 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"image-registry-tls" Feb 26 15:46:18 crc kubenswrapper[4907]: I0226 15:46:18.704681 4907 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-marketplace"/"kube-root-ca.crt" Feb 26 15:46:18 crc kubenswrapper[4907]: I0226 15:46:18.725260 4907 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"marketplace-operator-dockercfg-5nsgg" Feb 26 15:46:18 crc kubenswrapper[4907]: I0226 15:46:18.744990 4907 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"marketplace-operator-metrics" Feb 26 15:46:18 crc kubenswrapper[4907]: I0226 15:46:18.764561 4907 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-marketplace"/"openshift-service-ca.crt" Feb 26 15:46:18 crc kubenswrapper[4907]: I0226 15:46:18.790519 4907 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-marketplace"/"marketplace-trusted-ca" Feb 26 15:46:18 crc kubenswrapper[4907]: I0226 15:46:18.824885 4907 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"machine-config-operator-images" Feb 26 15:46:18 crc kubenswrapper[4907]: I0226 15:46:18.844741 4907 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-operator-dockercfg-98p87" Feb 26 15:46:18 crc kubenswrapper[4907]: I0226 15:46:18.864931 4907 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"mco-proxy-tls" Feb 26 15:46:18 crc kubenswrapper[4907]: I0226 15:46:18.884529 4907 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"control-plane-machine-set-operator-tls" Feb 26 15:46:18 crc 
kubenswrapper[4907]: I0226 15:46:18.905246 4907 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"control-plane-machine-set-operator-dockercfg-k9rxt" Feb 26 15:46:18 crc kubenswrapper[4907]: I0226 15:46:18.972874 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vdtmp\" (UniqueName: \"kubernetes.io/projected/bbef2e1f-1be1-4624-804c-45892231df1e-kube-api-access-vdtmp\") pod \"controller-manager-879f6c89f-p9vbb\" (UID: \"bbef2e1f-1be1-4624-804c-45892231df1e\") " pod="openshift-controller-manager/controller-manager-879f6c89f-p9vbb" Feb 26 15:46:18 crc kubenswrapper[4907]: I0226 15:46:18.978860 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2m5mf\" (UniqueName: \"kubernetes.io/projected/f1a111d0-85de-4328-90ac-9b9af3edbc49-kube-api-access-2m5mf\") pod \"apiserver-76f77b778f-5tc4m\" (UID: \"f1a111d0-85de-4328-90ac-9b9af3edbc49\") " pod="openshift-apiserver/apiserver-76f77b778f-5tc4m" Feb 26 15:46:19 crc kubenswrapper[4907]: I0226 15:46:19.001748 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lzv9t\" (UniqueName: \"kubernetes.io/projected/ca3ab95b-79df-45b9-9ada-c7c713e2e3e6-kube-api-access-lzv9t\") pod \"oauth-openshift-558db77b4-lr7kc\" (UID: \"ca3ab95b-79df-45b9-9ada-c7c713e2e3e6\") " pod="openshift-authentication/oauth-openshift-558db77b4-lr7kc" Feb 26 15:46:19 crc kubenswrapper[4907]: I0226 15:46:19.005357 4907 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-controller-manager-operator"/"kube-root-ca.crt" Feb 26 15:46:19 crc kubenswrapper[4907]: I0226 15:46:19.032217 4907 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-apiserver/apiserver-76f77b778f-5tc4m" Feb 26 15:46:19 crc kubenswrapper[4907]: I0226 15:46:19.041267 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-72gcp\" (UniqueName: \"kubernetes.io/projected/1dc6224f-2bf9-4c28-a6df-30a177430c08-kube-api-access-72gcp\") pod \"cluster-samples-operator-665b6dd947-xvqkl\" (UID: \"1dc6224f-2bf9-4c28-a6df-30a177430c08\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-xvqkl" Feb 26 15:46:19 crc kubenswrapper[4907]: I0226 15:46:19.070240 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nmbrj\" (UniqueName: \"kubernetes.io/projected/774f91c0-0433-43a5-8b33-18a5253ba0a3-kube-api-access-nmbrj\") pod \"openshift-apiserver-operator-796bbdcf4f-wgl2p\" (UID: \"774f91c0-0433-43a5-8b33-18a5253ba0a3\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-wgl2p" Feb 26 15:46:19 crc kubenswrapper[4907]: I0226 15:46:19.082004 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9gf4p\" (UniqueName: \"kubernetes.io/projected/0383e657-c434-43b2-878b-314ce5a2339e-kube-api-access-9gf4p\") pod \"openshift-config-operator-7777fb866f-2f4gk\" (UID: \"0383e657-c434-43b2-878b-314ce5a2339e\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-2f4gk" Feb 26 15:46:19 crc kubenswrapper[4907]: I0226 15:46:19.085781 4907 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator"/"openshift-service-ca.crt" Feb 26 15:46:19 crc kubenswrapper[4907]: I0226 15:46:19.095465 4907 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-authentication/oauth-openshift-558db77b4-lr7kc" Feb 26 15:46:19 crc kubenswrapper[4907]: I0226 15:46:19.116298 4907 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-xvqkl" Feb 26 15:46:19 crc kubenswrapper[4907]: I0226 15:46:19.125106 4907 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-scheduler-operator"/"kube-root-ca.crt" Feb 26 15:46:19 crc kubenswrapper[4907]: I0226 15:46:19.136581 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-46vwj\" (UniqueName: \"kubernetes.io/projected/54942a44-6e66-4757-8106-bbe836a2d8ca-kube-api-access-46vwj\") pod \"route-controller-manager-6576b87f9c-96swm\" (UID: \"54942a44-6e66-4757-8106-bbe836a2d8ca\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-96swm" Feb 26 15:46:19 crc kubenswrapper[4907]: I0226 15:46:19.146302 4907 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-controller-manager-operator"/"kube-controller-manager-operator-dockercfg-gkqpw" Feb 26 15:46:19 crc kubenswrapper[4907]: I0226 15:46:19.148775 4907 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-wgl2p" Feb 26 15:46:19 crc kubenswrapper[4907]: I0226 15:46:19.162131 4907 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-config-operator/openshift-config-operator-7777fb866f-2f4gk" Feb 26 15:46:19 crc kubenswrapper[4907]: I0226 15:46:19.165552 4907 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-controller-manager-operator"/"kube-controller-manager-operator-serving-cert" Feb 26 15:46:19 crc kubenswrapper[4907]: I0226 15:46:19.186393 4907 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-controller-manager-operator"/"kube-controller-manager-operator-config" Feb 26 15:46:19 crc kubenswrapper[4907]: I0226 15:46:19.201520 4907 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-96swm" Feb 26 15:46:19 crc kubenswrapper[4907]: I0226 15:46:19.206753 4907 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-storage-version-migrator"/"kube-storage-version-migrator-sa-dockercfg-5xfcg" Feb 26 15:46:19 crc kubenswrapper[4907]: I0226 15:46:19.224965 4907 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator"/"kube-root-ca.crt" Feb 26 15:46:19 crc kubenswrapper[4907]: I0226 15:46:19.243508 4907 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-879f6c89f-p9vbb" Feb 26 15:46:19 crc kubenswrapper[4907]: I0226 15:46:19.244872 4907 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-scheduler-operator"/"openshift-kube-scheduler-operator-dockercfg-qt55r" Feb 26 15:46:19 crc kubenswrapper[4907]: I0226 15:46:19.265965 4907 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-scheduler-operator"/"kube-scheduler-operator-serving-cert" Feb 26 15:46:19 crc kubenswrapper[4907]: I0226 15:46:19.286293 4907 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-scheduler-operator"/"openshift-kube-scheduler-operator-config" Feb 26 15:46:19 crc kubenswrapper[4907]: I0226 15:46:19.297179 4907 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-apiserver/apiserver-76f77b778f-5tc4m"] Feb 26 15:46:19 crc kubenswrapper[4907]: W0226 15:46:19.319014 4907 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf1a111d0_85de_4328_90ac_9b9af3edbc49.slice/crio-c431076fcfc0f87431e7bb611b688b34afebfc7a2d5350b0ea39f243068e003c WatchSource:0}: Error finding container c431076fcfc0f87431e7bb611b688b34afebfc7a2d5350b0ea39f243068e003c: Status 404 returned error can't find the 
container with id c431076fcfc0f87431e7bb611b688b34afebfc7a2d5350b0ea39f243068e003c Feb 26 15:46:19 crc kubenswrapper[4907]: I0226 15:46:19.323152 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hnc2d\" (UniqueName: \"kubernetes.io/projected/489d8c16-01bf-466b-a863-a3c8594d8b88-kube-api-access-hnc2d\") pod \"machine-api-operator-5694c8668f-hdqkt\" (UID: \"489d8c16-01bf-466b-a863-a3c8594d8b88\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-hdqkt" Feb 26 15:46:19 crc kubenswrapper[4907]: I0226 15:46:19.325827 4907 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"kube-root-ca.crt" Feb 26 15:46:19 crc kubenswrapper[4907]: I0226 15:46:19.335610 4907 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-xvqkl"] Feb 26 15:46:19 crc kubenswrapper[4907]: I0226 15:46:19.347052 4907 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"mcc-proxy-tls" Feb 26 15:46:19 crc kubenswrapper[4907]: I0226 15:46:19.370116 4907 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-apiserver-operator"/"kube-root-ca.crt" Feb 26 15:46:19 crc kubenswrapper[4907]: I0226 15:46:19.378175 4907 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-authentication/oauth-openshift-558db77b4-lr7kc"] Feb 26 15:46:19 crc kubenswrapper[4907]: I0226 15:46:19.386198 4907 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress"/"service-ca-bundle" Feb 26 15:46:19 crc kubenswrapper[4907]: I0226 15:46:19.387783 4907 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-config-operator/openshift-config-operator-7777fb866f-2f4gk"] Feb 26 15:46:19 crc kubenswrapper[4907]: I0226 15:46:19.404697 4907 reflector.go:368] Caches populated for *v1.ConfigMap from 
object-"openshift-ingress-operator"/"kube-root-ca.crt" Feb 26 15:46:19 crc kubenswrapper[4907]: I0226 15:46:19.407501 4907 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-wgl2p"] Feb 26 15:46:19 crc kubenswrapper[4907]: I0226 15:46:19.426919 4907 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"packageserver-service-cert" Feb 26 15:46:19 crc kubenswrapper[4907]: W0226 15:46:19.427106 4907 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod774f91c0_0433_43a5_8b33_18a5253ba0a3.slice/crio-fda6cd33b353ebc8bb6ad7b7bf73e7a1c3024b76e4911b2c1b903c911e7df687 WatchSource:0}: Error finding container fda6cd33b353ebc8bb6ad7b7bf73e7a1c3024b76e4911b2c1b903c911e7df687: Status 404 returned error can't find the container with id fda6cd33b353ebc8bb6ad7b7bf73e7a1c3024b76e4911b2c1b903c911e7df687 Feb 26 15:46:19 crc kubenswrapper[4907]: I0226 15:46:19.463166 4907 request.go:700] Waited for 1.002831553s due to client-side throttling, not priority and fairness, request: GET:https://api-int.crc.testing:6443/api/v1/namespaces/openshift-multus/secrets?fieldSelector=metadata.name%3Dmultus-admission-controller-secret&limit=500&resourceVersion=0 Feb 26 15:46:19 crc kubenswrapper[4907]: I0226 15:46:19.464983 4907 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"multus-admission-controller-secret" Feb 26 15:46:19 crc kubenswrapper[4907]: I0226 15:46:19.473633 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-p8rz5\" (UniqueName: \"kubernetes.io/projected/05bd4fd2-624b-4b9c-b6a7-74cfce90e1d7-kube-api-access-p8rz5\") pod \"machine-approver-56656f9798-m7jwg\" (UID: \"05bd4fd2-624b-4b9c-b6a7-74cfce90e1d7\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-m7jwg" Feb 26 15:46:19 crc kubenswrapper[4907]: 
I0226 15:46:19.485680 4907 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"package-server-manager-serving-cert" Feb 26 15:46:19 crc kubenswrapper[4907]: I0226 15:46:19.506713 4907 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t" Feb 26 15:46:19 crc kubenswrapper[4907]: I0226 15:46:19.525427 4907 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"pprof-cert" Feb 26 15:46:19 crc kubenswrapper[4907]: I0226 15:46:19.554426 4907 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"openshift-service-ca.crt" Feb 26 15:46:19 crc kubenswrapper[4907]: I0226 15:46:19.565522 4907 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config" Feb 26 15:46:19 crc kubenswrapper[4907]: I0226 15:46:19.582944 4907 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-machine-api/machine-api-operator-5694c8668f-hdqkt" Feb 26 15:46:19 crc kubenswrapper[4907]: I0226 15:46:19.585225 4907 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-controller-dockercfg-c2lfx" Feb 26 15:46:19 crc kubenswrapper[4907]: I0226 15:46:19.606412 4907 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-apiserver-operator"/"kube-apiserver-operator-dockercfg-x57mr" Feb 26 15:46:19 crc kubenswrapper[4907]: I0226 15:46:19.625049 4907 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-apiserver-operator"/"kube-apiserver-operator-serving-cert" Feb 26 15:46:19 crc kubenswrapper[4907]: I0226 15:46:19.645674 4907 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-apiserver-operator"/"kube-apiserver-operator-config" Feb 26 15:46:19 crc kubenswrapper[4907]: I0226 15:46:19.665688 4907 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-dockercfg-zdk86" Feb 26 15:46:19 crc kubenswrapper[4907]: I0226 15:46:19.687205 4907 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-certs-default" Feb 26 15:46:19 crc kubenswrapper[4907]: I0226 15:46:19.698573 4907 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-879f6c89f-p9vbb"] Feb 26 15:46:19 crc kubenswrapper[4907]: I0226 15:46:19.706190 4907 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-stats-default" Feb 26 15:46:19 crc kubenswrapper[4907]: I0226 15:46:19.721085 4907 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6576b87f9c-96swm"] Feb 26 15:46:19 crc kubenswrapper[4907]: I0226 15:46:19.725126 4907 reflector.go:368] Caches populated for *v1.Secret from 
object-"openshift-ingress"/"router-metrics-certs-default" Feb 26 15:46:19 crc kubenswrapper[4907]: I0226 15:46:19.734563 4907 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-machine-approver/machine-approver-56656f9798-m7jwg" Feb 26 15:46:19 crc kubenswrapper[4907]: I0226 15:46:19.744475 4907 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress"/"kube-root-ca.crt" Feb 26 15:46:19 crc kubenswrapper[4907]: W0226 15:46:19.746132 4907 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod54942a44_6e66_4757_8106_bbe836a2d8ca.slice/crio-82fead9ff16b326232a8cf7cec57cfb267f152ad0e74601d2af4a4c7cacd110d WatchSource:0}: Error finding container 82fead9ff16b326232a8cf7cec57cfb267f152ad0e74601d2af4a4c7cacd110d: Status 404 returned error can't find the container with id 82fead9ff16b326232a8cf7cec57cfb267f152ad0e74601d2af4a4c7cacd110d Feb 26 15:46:19 crc kubenswrapper[4907]: W0226 15:46:19.747580 4907 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podbbef2e1f_1be1_4624_804c_45892231df1e.slice/crio-cd42c579be0f111294d33e7a5d28454d1b6907a75ca9a4d06696d89d68848920 WatchSource:0}: Error finding container cd42c579be0f111294d33e7a5d28454d1b6907a75ca9a4d06696d89d68848920: Status 404 returned error can't find the container with id cd42c579be0f111294d33e7a5d28454d1b6907a75ca9a4d06696d89d68848920 Feb 26 15:46:19 crc kubenswrapper[4907]: I0226 15:46:19.766314 4907 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress"/"openshift-service-ca.crt" Feb 26 15:46:19 crc kubenswrapper[4907]: I0226 15:46:19.785098 4907 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-operator"/"ingress-operator-dockercfg-7lnqk" Feb 26 15:46:19 crc kubenswrapper[4907]: I0226 15:46:19.804747 4907 reflector.go:368] Caches populated 
for *v1.Secret from object-"openshift-ingress-operator"/"metrics-tls" Feb 26 15:46:19 crc kubenswrapper[4907]: I0226 15:46:19.808264 4907 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-api/machine-api-operator-5694c8668f-hdqkt"] Feb 26 15:46:19 crc kubenswrapper[4907]: W0226 15:46:19.821340 4907 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod489d8c16_01bf_466b_a863_a3c8594d8b88.slice/crio-4f6d67b600411553a6794f30d505508f6678d9184e1f53ded8de88dfa54cb691 WatchSource:0}: Error finding container 4f6d67b600411553a6794f30d505508f6678d9184e1f53ded8de88dfa54cb691: Status 404 returned error can't find the container with id 4f6d67b600411553a6794f30d505508f6678d9184e1f53ded8de88dfa54cb691 Feb 26 15:46:19 crc kubenswrapper[4907]: I0226 15:46:19.825724 4907 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-operator"/"openshift-service-ca.crt" Feb 26 15:46:19 crc kubenswrapper[4907]: I0226 15:46:19.856700 4907 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-operator"/"trusted-ca" Feb 26 15:46:19 crc kubenswrapper[4907]: I0226 15:46:19.864706 4907 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"olm-operator-serviceaccount-dockercfg-rq7zk" Feb 26 15:46:19 crc kubenswrapper[4907]: I0226 15:46:19.884775 4907 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"multus-ac-dockercfg-9lkdf" Feb 26 15:46:19 crc kubenswrapper[4907]: I0226 15:46:19.928396 4907 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Feb 26 15:46:19 crc kubenswrapper[4907]: I0226 15:46:19.934133 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wcg8q\" (UniqueName: \"kubernetes.io/projected/1c26ef74-f7b8-4cc3-ae04-783bfa2b38b4-kube-api-access-wcg8q\") pod 
\"apiserver-7bbb656c7d-vnrdg\" (UID: \"1c26ef74-f7b8-4cc3-ae04-783bfa2b38b4\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-vnrdg" Feb 26 15:46:19 crc kubenswrapper[4907]: I0226 15:46:19.946906 4907 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Feb 26 15:46:19 crc kubenswrapper[4907]: I0226 15:46:19.965957 4907 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-storage-version-migrator-operator"/"serving-cert" Feb 26 15:46:19 crc kubenswrapper[4907]: I0226 15:46:19.985432 4907 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-storage-version-migrator-operator"/"kube-storage-version-migrator-operator-dockercfg-2bh8d" Feb 26 15:46:20 crc kubenswrapper[4907]: I0226 15:46:20.003153 4907 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-vnrdg" Feb 26 15:46:20 crc kubenswrapper[4907]: I0226 15:46:20.005351 4907 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator-operator"/"openshift-service-ca.crt" Feb 26 15:46:20 crc kubenswrapper[4907]: I0226 15:46:20.024789 4907 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator-operator"/"config" Feb 26 15:46:20 crc kubenswrapper[4907]: I0226 15:46:20.045313 4907 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator-operator"/"kube-root-ca.crt" Feb 26 15:46:20 crc kubenswrapper[4907]: I0226 15:46:20.069799 4907 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca"/"openshift-service-ca.crt" Feb 26 15:46:20 crc kubenswrapper[4907]: I0226 15:46:20.086023 4907 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca"/"service-ca-dockercfg-pn86c" Feb 26 15:46:20 crc kubenswrapper[4907]: I0226 15:46:20.105278 4907 reflector.go:368] 
Caches populated for *v1.Secret from object-"openshift-service-ca"/"signing-key" Feb 26 15:46:20 crc kubenswrapper[4907]: I0226 15:46:20.126442 4907 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca"/"signing-cabundle" Feb 26 15:46:20 crc kubenswrapper[4907]: I0226 15:46:20.146088 4907 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca"/"kube-root-ca.crt" Feb 26 15:46:20 crc kubenswrapper[4907]: I0226 15:46:20.165249 4907 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"olm-operator-serving-cert" Feb 26 15:46:20 crc kubenswrapper[4907]: I0226 15:46:20.186915 4907 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca-operator"/"openshift-service-ca.crt" Feb 26 15:46:20 crc kubenswrapper[4907]: I0226 15:46:20.191721 4907 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-oauth-apiserver/apiserver-7bbb656c7d-vnrdg"] Feb 26 15:46:20 crc kubenswrapper[4907]: W0226 15:46:20.197304 4907 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod1c26ef74_f7b8_4cc3_ae04_783bfa2b38b4.slice/crio-2db5bf494c35caae9b48fdc4a0eb9eed08a28580ef3a9e57fd25d25616f37465 WatchSource:0}: Error finding container 2db5bf494c35caae9b48fdc4a0eb9eed08a28580ef3a9e57fd25d25616f37465: Status 404 returned error can't find the container with id 2db5bf494c35caae9b48fdc4a0eb9eed08a28580ef3a9e57fd25d25616f37465 Feb 26 15:46:20 crc kubenswrapper[4907]: I0226 15:46:20.204267 4907 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"catalog-operator-serving-cert" Feb 26 15:46:20 crc kubenswrapper[4907]: I0226 15:46:20.225446 4907 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca-operator"/"service-ca-operator-dockercfg-rg9jl" Feb 26 15:46:20 crc kubenswrapper[4907]: I0226 
15:46:20.245246 4907 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca-operator"/"serving-cert" Feb 26 15:46:20 crc kubenswrapper[4907]: I0226 15:46:20.264935 4907 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca-operator"/"service-ca-operator-config" Feb 26 15:46:20 crc kubenswrapper[4907]: I0226 15:46:20.279074 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-558db77b4-lr7kc" event={"ID":"ca3ab95b-79df-45b9-9ada-c7c713e2e3e6","Type":"ContainerStarted","Data":"e7a064d46f10da05acc9a52ec9b08660db6497072fc921c6fcb4b4f75a91b427"} Feb 26 15:46:20 crc kubenswrapper[4907]: I0226 15:46:20.279118 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-558db77b4-lr7kc" event={"ID":"ca3ab95b-79df-45b9-9ada-c7c713e2e3e6","Type":"ContainerStarted","Data":"76f1c57e1232c564d9ac0fc7831515e08af8edd2563f06b9c8cfa68af813c51e"} Feb 26 15:46:20 crc kubenswrapper[4907]: I0226 15:46:20.279426 4907 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-authentication/oauth-openshift-558db77b4-lr7kc" Feb 26 15:46:20 crc kubenswrapper[4907]: I0226 15:46:20.281395 4907 patch_prober.go:28] interesting pod/oauth-openshift-558db77b4-lr7kc container/oauth-openshift namespace/openshift-authentication: Readiness probe status=failure output="Get \"https://10.217.0.10:6443/healthz\": dial tcp 10.217.0.10:6443: connect: connection refused" start-of-body= Feb 26 15:46:20 crc kubenswrapper[4907]: I0226 15:46:20.281444 4907 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-authentication/oauth-openshift-558db77b4-lr7kc" podUID="ca3ab95b-79df-45b9-9ada-c7c713e2e3e6" containerName="oauth-openshift" probeResult="failure" output="Get \"https://10.217.0.10:6443/healthz\": dial tcp 10.217.0.10:6443: connect: connection refused" Feb 26 15:46:20 crc kubenswrapper[4907]: I0226 15:46:20.282546 4907 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-api/machine-api-operator-5694c8668f-hdqkt" event={"ID":"489d8c16-01bf-466b-a863-a3c8594d8b88","Type":"ContainerStarted","Data":"1384eea979a4feb244281d6c2066e6bf4c2fc045c17fbbf7085674a22f399a85"} Feb 26 15:46:20 crc kubenswrapper[4907]: I0226 15:46:20.282584 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-api/machine-api-operator-5694c8668f-hdqkt" event={"ID":"489d8c16-01bf-466b-a863-a3c8594d8b88","Type":"ContainerStarted","Data":"01fb3f9b59d4512f02352b62afe1a8e4bb308ed9ceb3b9f38eae64701208cfd6"} Feb 26 15:46:20 crc kubenswrapper[4907]: I0226 15:46:20.282635 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-api/machine-api-operator-5694c8668f-hdqkt" event={"ID":"489d8c16-01bf-466b-a863-a3c8594d8b88","Type":"ContainerStarted","Data":"4f6d67b600411553a6794f30d505508f6678d9184e1f53ded8de88dfa54cb691"} Feb 26 15:46:20 crc kubenswrapper[4907]: I0226 15:46:20.284628 4907 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca-operator"/"kube-root-ca.crt" Feb 26 15:46:20 crc kubenswrapper[4907]: I0226 15:46:20.291457 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-wgl2p" event={"ID":"774f91c0-0433-43a5-8b33-18a5253ba0a3","Type":"ContainerStarted","Data":"8251c762191ac77d3e64fa36ce6fb5c10f1359648989ed42e9fdcfaddad361e2"} Feb 26 15:46:20 crc kubenswrapper[4907]: I0226 15:46:20.291496 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-wgl2p" event={"ID":"774f91c0-0433-43a5-8b33-18a5253ba0a3","Type":"ContainerStarted","Data":"fda6cd33b353ebc8bb6ad7b7bf73e7a1c3024b76e4911b2c1b903c911e7df687"} Feb 26 15:46:20 crc kubenswrapper[4907]: I0226 15:46:20.296542 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-xvqkl" event={"ID":"1dc6224f-2bf9-4c28-a6df-30a177430c08","Type":"ContainerStarted","Data":"d9c5d35c2da7c0c72ec0450a13a4847f34682f920144393f9e0028d66015d79d"} Feb 26 15:46:20 crc kubenswrapper[4907]: I0226 15:46:20.296598 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-xvqkl" event={"ID":"1dc6224f-2bf9-4c28-a6df-30a177430c08","Type":"ContainerStarted","Data":"3fbe65bed0aaefa9d93ca9ad8c9f97eb9a1085ea3b4f7bb4b00931d95c887a03"} Feb 26 15:46:20 crc kubenswrapper[4907]: I0226 15:46:20.296624 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-xvqkl" event={"ID":"1dc6224f-2bf9-4c28-a6df-30a177430c08","Type":"ContainerStarted","Data":"f9f3078df427e584d361f4f8c92c3f0ff33fdf0b7f32b71ae71f7b58c72ad11d"} Feb 26 15:46:20 crc kubenswrapper[4907]: I0226 15:46:20.298688 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-879f6c89f-p9vbb" event={"ID":"bbef2e1f-1be1-4624-804c-45892231df1e","Type":"ContainerStarted","Data":"1b0eb3c56ccd014ace15a0c56f6e7ca89dca71b83fe0c0c42006cc66ad1f972c"} Feb 26 15:46:20 crc kubenswrapper[4907]: I0226 15:46:20.298714 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-879f6c89f-p9vbb" event={"ID":"bbef2e1f-1be1-4624-804c-45892231df1e","Type":"ContainerStarted","Data":"cd42c579be0f111294d33e7a5d28454d1b6907a75ca9a4d06696d89d68848920"} Feb 26 15:46:20 crc kubenswrapper[4907]: I0226 15:46:20.298894 4907 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-controller-manager/controller-manager-879f6c89f-p9vbb" Feb 26 15:46:20 crc kubenswrapper[4907]: I0226 15:46:20.300146 4907 patch_prober.go:28] interesting pod/controller-manager-879f6c89f-p9vbb container/controller-manager 
namespace/openshift-controller-manager: Readiness probe status=failure output="Get \"https://10.217.0.6:8443/healthz\": dial tcp 10.217.0.6:8443: connect: connection refused" start-of-body= Feb 26 15:46:20 crc kubenswrapper[4907]: I0226 15:46:20.300183 4907 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-controller-manager/controller-manager-879f6c89f-p9vbb" podUID="bbef2e1f-1be1-4624-804c-45892231df1e" containerName="controller-manager" probeResult="failure" output="Get \"https://10.217.0.6:8443/healthz\": dial tcp 10.217.0.6:8443: connect: connection refused" Feb 26 15:46:20 crc kubenswrapper[4907]: I0226 15:46:20.301160 4907 generic.go:334] "Generic (PLEG): container finished" podID="0383e657-c434-43b2-878b-314ce5a2339e" containerID="4133626312e7126303a9eea2dbb15f65bd35131949e9e3a99f32033bf4017617" exitCode=0 Feb 26 15:46:20 crc kubenswrapper[4907]: I0226 15:46:20.301714 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-config-operator/openshift-config-operator-7777fb866f-2f4gk" event={"ID":"0383e657-c434-43b2-878b-314ce5a2339e","Type":"ContainerDied","Data":"4133626312e7126303a9eea2dbb15f65bd35131949e9e3a99f32033bf4017617"} Feb 26 15:46:20 crc kubenswrapper[4907]: I0226 15:46:20.301743 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-config-operator/openshift-config-operator-7777fb866f-2f4gk" event={"ID":"0383e657-c434-43b2-878b-314ce5a2339e","Type":"ContainerStarted","Data":"b6982d33e03b132a3ff0e5a9169591a8a7f1d636bdf6af256b3417abcb922845"} Feb 26 15:46:20 crc kubenswrapper[4907]: I0226 15:46:20.304979 4907 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns"/"dns-default" Feb 26 15:46:20 crc kubenswrapper[4907]: I0226 15:46:20.305039 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-vnrdg" 
event={"ID":"1c26ef74-f7b8-4cc3-ae04-783bfa2b38b4","Type":"ContainerStarted","Data":"2db5bf494c35caae9b48fdc4a0eb9eed08a28580ef3a9e57fd25d25616f37465"} Feb 26 15:46:20 crc kubenswrapper[4907]: I0226 15:46:20.309339 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-machine-approver/machine-approver-56656f9798-m7jwg" event={"ID":"05bd4fd2-624b-4b9c-b6a7-74cfce90e1d7","Type":"ContainerStarted","Data":"50de9e4df20ad528c94657808201d7bb84c648ecb71f095e22a16d8b96d387f7"} Feb 26 15:46:20 crc kubenswrapper[4907]: I0226 15:46:20.309386 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-machine-approver/machine-approver-56656f9798-m7jwg" event={"ID":"05bd4fd2-624b-4b9c-b6a7-74cfce90e1d7","Type":"ContainerStarted","Data":"0abc77475b70de5e84e45ab61d7d61754c75698302f1dae59a58b868419e953d"} Feb 26 15:46:20 crc kubenswrapper[4907]: I0226 15:46:20.310901 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-96swm" event={"ID":"54942a44-6e66-4757-8106-bbe836a2d8ca","Type":"ContainerStarted","Data":"d2644c3d16f2880f068ed3473b9f1e9b0826ed05a9392b6c9676ae3feabfd916"} Feb 26 15:46:20 crc kubenswrapper[4907]: I0226 15:46:20.310928 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-96swm" event={"ID":"54942a44-6e66-4757-8106-bbe836a2d8ca","Type":"ContainerStarted","Data":"82fead9ff16b326232a8cf7cec57cfb267f152ad0e74601d2af4a4c7cacd110d"} Feb 26 15:46:20 crc kubenswrapper[4907]: I0226 15:46:20.311093 4907 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-96swm" Feb 26 15:46:20 crc kubenswrapper[4907]: I0226 15:46:20.312382 4907 generic.go:334] "Generic (PLEG): container finished" podID="f1a111d0-85de-4328-90ac-9b9af3edbc49" 
containerID="c09de37c95e9752af4044b374731b9cce74fd314e76270ba6c11b94ec9f66250" exitCode=0 Feb 26 15:46:20 crc kubenswrapper[4907]: I0226 15:46:20.312411 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver/apiserver-76f77b778f-5tc4m" event={"ID":"f1a111d0-85de-4328-90ac-9b9af3edbc49","Type":"ContainerDied","Data":"c09de37c95e9752af4044b374731b9cce74fd314e76270ba6c11b94ec9f66250"} Feb 26 15:46:20 crc kubenswrapper[4907]: I0226 15:46:20.312428 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver/apiserver-76f77b778f-5tc4m" event={"ID":"f1a111d0-85de-4328-90ac-9b9af3edbc49","Type":"ContainerStarted","Data":"c431076fcfc0f87431e7bb611b688b34afebfc7a2d5350b0ea39f243068e003c"} Feb 26 15:46:20 crc kubenswrapper[4907]: I0226 15:46:20.312910 4907 patch_prober.go:28] interesting pod/route-controller-manager-6576b87f9c-96swm container/route-controller-manager namespace/openshift-route-controller-manager: Readiness probe status=failure output="Get \"https://10.217.0.5:8443/healthz\": dial tcp 10.217.0.5:8443: connect: connection refused" start-of-body= Feb 26 15:46:20 crc kubenswrapper[4907]: I0226 15:46:20.312937 4907 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-96swm" podUID="54942a44-6e66-4757-8106-bbe836a2d8ca" containerName="route-controller-manager" probeResult="failure" output="Get \"https://10.217.0.5:8443/healthz\": dial tcp 10.217.0.5:8443: connect: connection refused" Feb 26 15:46:20 crc kubenswrapper[4907]: I0226 15:46:20.325932 4907 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns"/"dns-dockercfg-jwfmh" Feb 26 15:46:20 crc kubenswrapper[4907]: I0226 15:46:20.344815 4907 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns"/"dns-default-metrics-tls" Feb 26 15:46:20 crc kubenswrapper[4907]: I0226 15:46:20.384521 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for 
volume \"kube-api-access-cpn27\" (UniqueName: \"kubernetes.io/projected/0eb5d98d-ab9b-47bd-b8ca-27340a4d6a4f-kube-api-access-cpn27\") pod \"console-f9d7485db-9lx5z\" (UID: \"0eb5d98d-ab9b-47bd-b8ca-27340a4d6a4f\") " pod="openshift-console/console-f9d7485db-9lx5z" Feb 26 15:46:20 crc kubenswrapper[4907]: I0226 15:46:20.401965 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8f4c2\" (UniqueName: \"kubernetes.io/projected/36952148-e6b5-4c20-8016-3de7f571420e-kube-api-access-8f4c2\") pod \"authentication-operator-69f744f599-wtjfv\" (UID: \"36952148-e6b5-4c20-8016-3de7f571420e\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-wtjfv" Feb 26 15:46:20 crc kubenswrapper[4907]: I0226 15:46:20.420247 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/ef4c8a6a-c008-406e-8aed-2164e582f710-bound-sa-token\") pod \"cluster-image-registry-operator-dc59b4c8b-lknds\" (UID: \"ef4c8a6a-c008-406e-8aed-2164e582f710\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-lknds" Feb 26 15:46:20 crc kubenswrapper[4907]: I0226 15:46:20.443534 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lclpq\" (UniqueName: \"kubernetes.io/projected/ef4c8a6a-c008-406e-8aed-2164e582f710-kube-api-access-lclpq\") pod \"cluster-image-registry-operator-dc59b4c8b-lknds\" (UID: \"ef4c8a6a-c008-406e-8aed-2164e582f710\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-lknds" Feb 26 15:46:20 crc kubenswrapper[4907]: I0226 15:46:20.461338 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rfbmc\" (UniqueName: \"kubernetes.io/projected/2e969445-2d6b-4ea1-bd4b-3473a66e8c91-kube-api-access-rfbmc\") pod \"downloads-7954f5f757-wcgj6\" (UID: \"2e969445-2d6b-4ea1-bd4b-3473a66e8c91\") " pod="openshift-console/downloads-7954f5f757-wcgj6" Feb 26 
15:46:20 crc kubenswrapper[4907]: I0226 15:46:20.465691 4907 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-canary"/"openshift-service-ca.crt" Feb 26 15:46:20 crc kubenswrapper[4907]: I0226 15:46:20.483392 4907 request.go:700] Waited for 1.897537616s due to client-side throttling, not priority and fairness, request: GET:https://api-int.crc.testing:6443/api/v1/namespaces/openshift-ingress-canary/secrets?fieldSelector=metadata.name%3Ddefault-dockercfg-2llfx&limit=500&resourceVersion=0 Feb 26 15:46:20 crc kubenswrapper[4907]: I0226 15:46:20.485098 4907 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-canary"/"default-dockercfg-2llfx" Feb 26 15:46:20 crc kubenswrapper[4907]: I0226 15:46:20.505202 4907 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-canary"/"canary-serving-cert" Feb 26 15:46:20 crc kubenswrapper[4907]: I0226 15:46:20.526314 4907 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-canary"/"kube-root-ca.crt" Feb 26 15:46:20 crc kubenswrapper[4907]: I0226 15:46:20.546219 4907 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-server-tls" Feb 26 15:46:20 crc kubenswrapper[4907]: I0226 15:46:20.564861 4907 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-server-dockercfg-qx5rd" Feb 26 15:46:20 crc kubenswrapper[4907]: I0226 15:46:20.584492 4907 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"node-bootstrapper-token" Feb 26 15:46:20 crc kubenswrapper[4907]: I0226 15:46:20.605389 4907 reflector.go:368] Caches populated for *v1.ConfigMap from object-"hostpath-provisioner"/"kube-root-ca.crt" Feb 26 15:46:20 crc kubenswrapper[4907]: I0226 15:46:20.626164 4907 reflector.go:368] Caches populated for *v1.Secret from 
object-"hostpath-provisioner"/"csi-hostpath-provisioner-sa-dockercfg-qd74k" Feb 26 15:46:20 crc kubenswrapper[4907]: I0226 15:46:20.647351 4907 reflector.go:368] Caches populated for *v1.ConfigMap from object-"hostpath-provisioner"/"openshift-service-ca.crt" Feb 26 15:46:20 crc kubenswrapper[4907]: I0226 15:46:20.668538 4907 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-f9d7485db-9lx5z" Feb 26 15:46:20 crc kubenswrapper[4907]: I0226 15:46:20.681222 4907 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-authentication-operator/authentication-operator-69f744f599-wtjfv" Feb 26 15:46:20 crc kubenswrapper[4907]: I0226 15:46:20.696317 4907 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/downloads-7954f5f757-wcgj6" Feb 26 15:46:20 crc kubenswrapper[4907]: I0226 15:46:20.718398 4907 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-lknds" Feb 26 15:46:20 crc kubenswrapper[4907]: I0226 15:46:20.769697 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/0fefaf3e-d327-41f8-bbbe-94b051a63b19-ca-trust-extracted\") pod \"image-registry-697d97f7c8-kqtml\" (UID: \"0fefaf3e-d327-41f8-bbbe-94b051a63b19\") " pod="openshift-image-registry/image-registry-697d97f7c8-kqtml" Feb 26 15:46:20 crc kubenswrapper[4907]: I0226 15:46:20.769736 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e9aeee88-40a0-4c8a-aebf-680cf878f42e-config\") pod \"console-operator-58897d9998-sjflz\" (UID: \"e9aeee88-40a0-4c8a-aebf-680cf878f42e\") " pod="openshift-console-operator/console-operator-58897d9998-sjflz" Feb 26 15:46:20 crc kubenswrapper[4907]: I0226 15:46:20.769756 4907 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/f1e8a3f9-9de9-4181-869f-9fce597e6b5b-trusted-ca\") pod \"ingress-operator-5b745b69d9-gnh8z\" (UID: \"f1e8a3f9-9de9-4181-869f-9fce597e6b5b\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-gnh8z" Feb 26 15:46:20 crc kubenswrapper[4907]: I0226 15:46:20.769803 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dvt59\" (UniqueName: \"kubernetes.io/projected/2305f4ed-b155-4e30-b83c-7dde9bec7b28-kube-api-access-dvt59\") pod \"dns-operator-744455d44c-tvpcl\" (UID: \"2305f4ed-b155-4e30-b83c-7dde9bec7b28\") " pod="openshift-dns-operator/dns-operator-744455d44c-tvpcl" Feb 26 15:46:20 crc kubenswrapper[4907]: I0226 15:46:20.770832 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/0fefaf3e-d327-41f8-bbbe-94b051a63b19-bound-sa-token\") pod \"image-registry-697d97f7c8-kqtml\" (UID: \"0fefaf3e-d327-41f8-bbbe-94b051a63b19\") " pod="openshift-image-registry/image-registry-697d97f7c8-kqtml" Feb 26 15:46:20 crc kubenswrapper[4907]: I0226 15:46:20.770874 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nkzrs\" (UniqueName: \"kubernetes.io/projected/1c8904fd-8dd8-418a-b32a-eb1ccf934fec-kube-api-access-nkzrs\") pod \"openshift-controller-manager-operator-756b6f6bc6-4z8ql\" (UID: \"1c8904fd-8dd8-418a-b32a-eb1ccf934fec\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-4z8ql" Feb 26 15:46:20 crc kubenswrapper[4907]: I0226 15:46:20.770899 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/e9aeee88-40a0-4c8a-aebf-680cf878f42e-serving-cert\") pod 
\"console-operator-58897d9998-sjflz\" (UID: \"e9aeee88-40a0-4c8a-aebf-680cf878f42e\") " pod="openshift-console-operator/console-operator-58897d9998-sjflz" Feb 26 15:46:20 crc kubenswrapper[4907]: I0226 15:46:20.770919 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/317291a5-1f7f-4d5a-8779-7c769dae2bc5-auth-proxy-config\") pod \"machine-config-operator-74547568cd-4z9rn\" (UID: \"317291a5-1f7f-4d5a-8779-7c769dae2bc5\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-4z9rn" Feb 26 15:46:20 crc kubenswrapper[4907]: I0226 15:46:20.770997 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6s48n\" (UniqueName: \"kubernetes.io/projected/317291a5-1f7f-4d5a-8779-7c769dae2bc5-kube-api-access-6s48n\") pod \"machine-config-operator-74547568cd-4z9rn\" (UID: \"317291a5-1f7f-4d5a-8779-7c769dae2bc5\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-4z9rn" Feb 26 15:46:20 crc kubenswrapper[4907]: I0226 15:46:20.771078 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2cth8\" (UniqueName: \"kubernetes.io/projected/e9aeee88-40a0-4c8a-aebf-680cf878f42e-kube-api-access-2cth8\") pod \"console-operator-58897d9998-sjflz\" (UID: \"e9aeee88-40a0-4c8a-aebf-680cf878f42e\") " pod="openshift-console-operator/console-operator-58897d9998-sjflz" Feb 26 15:46:20 crc kubenswrapper[4907]: I0226 15:46:20.771566 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/ed605a31-991f-4fcc-a861-3bfe94c7b92c-serving-cert\") pod \"etcd-operator-b45778765-8wmgt\" (UID: \"ed605a31-991f-4fcc-a861-3bfe94c7b92c\") " pod="openshift-etcd-operator/etcd-operator-b45778765-8wmgt" Feb 26 15:46:20 crc kubenswrapper[4907]: I0226 
15:46:20.771819 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/f1e8a3f9-9de9-4181-869f-9fce597e6b5b-bound-sa-token\") pod \"ingress-operator-5b745b69d9-gnh8z\" (UID: \"f1e8a3f9-9de9-4181-869f-9fce597e6b5b\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-gnh8z" Feb 26 15:46:20 crc kubenswrapper[4907]: I0226 15:46:20.771848 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/0fefaf3e-d327-41f8-bbbe-94b051a63b19-registry-certificates\") pod \"image-registry-697d97f7c8-kqtml\" (UID: \"0fefaf3e-d327-41f8-bbbe-94b051a63b19\") " pod="openshift-image-registry/image-registry-697d97f7c8-kqtml" Feb 26 15:46:20 crc kubenswrapper[4907]: I0226 15:46:20.772172 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nkk62\" (UniqueName: \"kubernetes.io/projected/f1e8a3f9-9de9-4181-869f-9fce597e6b5b-kube-api-access-nkk62\") pod \"ingress-operator-5b745b69d9-gnh8z\" (UID: \"f1e8a3f9-9de9-4181-869f-9fce597e6b5b\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-gnh8z" Feb 26 15:46:20 crc kubenswrapper[4907]: I0226 15:46:20.772565 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/23df369e-238f-4fbc-99fa-b22c21011db0-marketplace-trusted-ca\") pod \"marketplace-operator-79b997595-dvcn5\" (UID: \"23df369e-238f-4fbc-99fa-b22c21011db0\") " pod="openshift-marketplace/marketplace-operator-79b997595-dvcn5" Feb 26 15:46:20 crc kubenswrapper[4907]: I0226 15:46:20.772662 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/f1e8a3f9-9de9-4181-869f-9fce597e6b5b-metrics-tls\") 
pod \"ingress-operator-5b745b69d9-gnh8z\" (UID: \"f1e8a3f9-9de9-4181-869f-9fce597e6b5b\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-gnh8z" Feb 26 15:46:20 crc kubenswrapper[4907]: I0226 15:46:20.773001 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/ed605a31-991f-4fcc-a861-3bfe94c7b92c-etcd-client\") pod \"etcd-operator-b45778765-8wmgt\" (UID: \"ed605a31-991f-4fcc-a861-3bfe94c7b92c\") " pod="openshift-etcd-operator/etcd-operator-b45778765-8wmgt" Feb 26 15:46:20 crc kubenswrapper[4907]: I0226 15:46:20.773044 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/0fefaf3e-d327-41f8-bbbe-94b051a63b19-trusted-ca\") pod \"image-registry-697d97f7c8-kqtml\" (UID: \"0fefaf3e-d327-41f8-bbbe-94b051a63b19\") " pod="openshift-image-registry/image-registry-697d97f7c8-kqtml" Feb 26 15:46:20 crc kubenswrapper[4907]: I0226 15:46:20.773066 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1c8904fd-8dd8-418a-b32a-eb1ccf934fec-serving-cert\") pod \"openshift-controller-manager-operator-756b6f6bc6-4z8ql\" (UID: \"1c8904fd-8dd8-418a-b32a-eb1ccf934fec\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-4z8ql" Feb 26 15:46:20 crc kubenswrapper[4907]: I0226 15:46:20.773143 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/e9aeee88-40a0-4c8a-aebf-680cf878f42e-trusted-ca\") pod \"console-operator-58897d9998-sjflz\" (UID: \"e9aeee88-40a0-4c8a-aebf-680cf878f42e\") " pod="openshift-console-operator/console-operator-58897d9998-sjflz" Feb 26 15:46:20 crc kubenswrapper[4907]: I0226 15:46:20.773337 4907 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/0fefaf3e-d327-41f8-bbbe-94b051a63b19-installation-pull-secrets\") pod \"image-registry-697d97f7c8-kqtml\" (UID: \"0fefaf3e-d327-41f8-bbbe-94b051a63b19\") " pod="openshift-image-registry/image-registry-697d97f7c8-kqtml" Feb 26 15:46:20 crc kubenswrapper[4907]: I0226 15:46:20.773398 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-kqtml\" (UID: \"0fefaf3e-d327-41f8-bbbe-94b051a63b19\") " pod="openshift-image-registry/image-registry-697d97f7c8-kqtml" Feb 26 15:46:20 crc kubenswrapper[4907]: I0226 15:46:20.775014 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/317291a5-1f7f-4d5a-8779-7c769dae2bc5-proxy-tls\") pod \"machine-config-operator-74547568cd-4z9rn\" (UID: \"317291a5-1f7f-4d5a-8779-7c769dae2bc5\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-4z9rn" Feb 26 15:46:20 crc kubenswrapper[4907]: I0226 15:46:20.775133 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-service-ca\" (UniqueName: \"kubernetes.io/configmap/ed605a31-991f-4fcc-a861-3bfe94c7b92c-etcd-service-ca\") pod \"etcd-operator-b45778765-8wmgt\" (UID: \"ed605a31-991f-4fcc-a861-3bfe94c7b92c\") " pod="openshift-etcd-operator/etcd-operator-b45778765-8wmgt" Feb 26 15:46:20 crc kubenswrapper[4907]: I0226 15:46:20.775255 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-g6kj7\" (UniqueName: \"kubernetes.io/projected/23df369e-238f-4fbc-99fa-b22c21011db0-kube-api-access-g6kj7\") pod 
\"marketplace-operator-79b997595-dvcn5\" (UID: \"23df369e-238f-4fbc-99fa-b22c21011db0\") " pod="openshift-marketplace/marketplace-operator-79b997595-dvcn5" Feb 26 15:46:20 crc kubenswrapper[4907]: I0226 15:46:20.775348 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"control-plane-machine-set-operator-tls\" (UniqueName: \"kubernetes.io/secret/af8aa9df-432b-40bd-847c-c3539b32cb59-control-plane-machine-set-operator-tls\") pod \"control-plane-machine-set-operator-78cbb6b69f-w9nx4\" (UID: \"af8aa9df-432b-40bd-847c-c3539b32cb59\") " pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-w9nx4" Feb 26 15:46:20 crc kubenswrapper[4907]: I0226 15:46:20.775445 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-ca\" (UniqueName: \"kubernetes.io/configmap/ed605a31-991f-4fcc-a861-3bfe94c7b92c-etcd-ca\") pod \"etcd-operator-b45778765-8wmgt\" (UID: \"ed605a31-991f-4fcc-a861-3bfe94c7b92c\") " pod="openshift-etcd-operator/etcd-operator-b45778765-8wmgt" Feb 26 15:46:20 crc kubenswrapper[4907]: I0226 15:46:20.775560 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-h4l8g\" (UniqueName: \"kubernetes.io/projected/0fefaf3e-d327-41f8-bbbe-94b051a63b19-kube-api-access-h4l8g\") pod \"image-registry-697d97f7c8-kqtml\" (UID: \"0fefaf3e-d327-41f8-bbbe-94b051a63b19\") " pod="openshift-image-registry/image-registry-697d97f7c8-kqtml" Feb 26 15:46:20 crc kubenswrapper[4907]: I0226 15:46:20.775709 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ed605a31-991f-4fcc-a861-3bfe94c7b92c-config\") pod \"etcd-operator-b45778765-8wmgt\" (UID: \"ed605a31-991f-4fcc-a861-3bfe94c7b92c\") " pod="openshift-etcd-operator/etcd-operator-b45778765-8wmgt" Feb 26 15:46:20 crc kubenswrapper[4907]: I0226 15:46:20.775802 4907 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/0fefaf3e-d327-41f8-bbbe-94b051a63b19-registry-tls\") pod \"image-registry-697d97f7c8-kqtml\" (UID: \"0fefaf3e-d327-41f8-bbbe-94b051a63b19\") " pod="openshift-image-registry/image-registry-697d97f7c8-kqtml" Feb 26 15:46:20 crc kubenswrapper[4907]: I0226 15:46:20.775926 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/23df369e-238f-4fbc-99fa-b22c21011db0-marketplace-operator-metrics\") pod \"marketplace-operator-79b997595-dvcn5\" (UID: \"23df369e-238f-4fbc-99fa-b22c21011db0\") " pod="openshift-marketplace/marketplace-operator-79b997595-dvcn5" Feb 26 15:46:20 crc kubenswrapper[4907]: I0226 15:46:20.776020 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/2305f4ed-b155-4e30-b83c-7dde9bec7b28-metrics-tls\") pod \"dns-operator-744455d44c-tvpcl\" (UID: \"2305f4ed-b155-4e30-b83c-7dde9bec7b28\") " pod="openshift-dns-operator/dns-operator-744455d44c-tvpcl" Feb 26 15:46:20 crc kubenswrapper[4907]: I0226 15:46:20.776088 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1c8904fd-8dd8-418a-b32a-eb1ccf934fec-config\") pod \"openshift-controller-manager-operator-756b6f6bc6-4z8ql\" (UID: \"1c8904fd-8dd8-418a-b32a-eb1ccf934fec\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-4z8ql" Feb 26 15:46:20 crc kubenswrapper[4907]: I0226 15:46:20.776169 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/317291a5-1f7f-4d5a-8779-7c769dae2bc5-images\") pod 
\"machine-config-operator-74547568cd-4z9rn\" (UID: \"317291a5-1f7f-4d5a-8779-7c769dae2bc5\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-4z9rn" Feb 26 15:46:20 crc kubenswrapper[4907]: I0226 15:46:20.776238 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-77796\" (UniqueName: \"kubernetes.io/projected/af8aa9df-432b-40bd-847c-c3539b32cb59-kube-api-access-77796\") pod \"control-plane-machine-set-operator-78cbb6b69f-w9nx4\" (UID: \"af8aa9df-432b-40bd-847c-c3539b32cb59\") " pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-w9nx4" Feb 26 15:46:20 crc kubenswrapper[4907]: I0226 15:46:20.776309 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-sbm8v\" (UniqueName: \"kubernetes.io/projected/ed605a31-991f-4fcc-a861-3bfe94c7b92c-kube-api-access-sbm8v\") pod \"etcd-operator-b45778765-8wmgt\" (UID: \"ed605a31-991f-4fcc-a861-3bfe94c7b92c\") " pod="openshift-etcd-operator/etcd-operator-b45778765-8wmgt" Feb 26 15:46:20 crc kubenswrapper[4907]: E0226 15:46:20.779697 4907 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-26 15:46:21.279681743 +0000 UTC m=+243.798243682 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-kqtml" (UID: "0fefaf3e-d327-41f8-bbbe-94b051a63b19") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 26 15:46:20 crc kubenswrapper[4907]: I0226 15:46:20.877911 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 26 15:46:20 crc kubenswrapper[4907]: I0226 15:46:20.878103 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rnmft\" (UniqueName: \"kubernetes.io/projected/c6986b68-4a8d-4677-bed1-493eb1a231c3-kube-api-access-rnmft\") pod \"auto-csr-approver-29535346-hhrww\" (UID: \"c6986b68-4a8d-4677-bed1-493eb1a231c3\") " pod="openshift-infra/auto-csr-approver-29535346-hhrww" Feb 26 15:46:20 crc kubenswrapper[4907]: I0226 15:46:20.878166 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/0fefaf3e-d327-41f8-bbbe-94b051a63b19-ca-trust-extracted\") pod \"image-registry-697d97f7c8-kqtml\" (UID: \"0fefaf3e-d327-41f8-bbbe-94b051a63b19\") " pod="openshift-image-registry/image-registry-697d97f7c8-kqtml" Feb 26 15:46:20 crc kubenswrapper[4907]: I0226 15:46:20.878190 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"mountpoint-dir\" (UniqueName: \"kubernetes.io/host-path/d01c15cd-3103-49df-afdd-e6f6d6f35716-mountpoint-dir\") pod 
\"csi-hostpathplugin-l5fqj\" (UID: \"d01c15cd-3103-49df-afdd-e6f6d6f35716\") " pod="hostpath-provisioner/csi-hostpathplugin-l5fqj" Feb 26 15:46:20 crc kubenswrapper[4907]: I0226 15:46:20.878214 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/f1e8a3f9-9de9-4181-869f-9fce597e6b5b-trusted-ca\") pod \"ingress-operator-5b745b69d9-gnh8z\" (UID: \"f1e8a3f9-9de9-4181-869f-9fce597e6b5b\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-gnh8z" Feb 26 15:46:20 crc kubenswrapper[4907]: I0226 15:46:20.878237 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/0dd74211-40c2-437c-9295-b69e709f81fe-secret-volume\") pod \"collect-profiles-29535345-b7r88\" (UID: \"0dd74211-40c2-437c-9295-b69e709f81fe\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29535345-b7r88" Feb 26 15:46:20 crc kubenswrapper[4907]: I0226 15:46:20.878260 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/a766dd26-3d8c-464c-b873-f03d3895b9d1-srv-cert\") pod \"olm-operator-6b444d44fb-sw4qw\" (UID: \"a766dd26-3d8c-464c-b873-f03d3895b9d1\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-sw4qw" Feb 26 15:46:20 crc kubenswrapper[4907]: I0226 15:46:20.878289 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dvt59\" (UniqueName: \"kubernetes.io/projected/2305f4ed-b155-4e30-b83c-7dde9bec7b28-kube-api-access-dvt59\") pod \"dns-operator-744455d44c-tvpcl\" (UID: \"2305f4ed-b155-4e30-b83c-7dde9bec7b28\") " pod="openshift-dns-operator/dns-operator-744455d44c-tvpcl" Feb 26 15:46:20 crc kubenswrapper[4907]: I0226 15:46:20.878317 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/a766dd26-3d8c-464c-b873-f03d3895b9d1-profile-collector-cert\") pod \"olm-operator-6b444d44fb-sw4qw\" (UID: \"a766dd26-3d8c-464c-b873-f03d3895b9d1\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-sw4qw" Feb 26 15:46:20 crc kubenswrapper[4907]: I0226 15:46:20.878340 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/6e6248c0-ae25-48db-9112-ffeb9f9ca6a2-metrics-tls\") pod \"dns-default-6g628\" (UID: \"6e6248c0-ae25-48db-9112-ffeb9f9ca6a2\") " pod="openshift-dns/dns-default-6g628" Feb 26 15:46:20 crc kubenswrapper[4907]: I0226 15:46:20.878384 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6lgl6\" (UniqueName: \"kubernetes.io/projected/6e6248c0-ae25-48db-9112-ffeb9f9ca6a2-kube-api-access-6lgl6\") pod \"dns-default-6g628\" (UID: \"6e6248c0-ae25-48db-9112-ffeb9f9ca6a2\") " pod="openshift-dns/dns-default-6g628" Feb 26 15:46:20 crc kubenswrapper[4907]: I0226 15:46:20.878406 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/0dd74211-40c2-437c-9295-b69e709f81fe-config-volume\") pod \"collect-profiles-29535345-b7r88\" (UID: \"0dd74211-40c2-437c-9295-b69e709f81fe\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29535345-b7r88" Feb 26 15:46:20 crc kubenswrapper[4907]: I0226 15:46:20.878427 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/87018111-567b-4b30-a141-f20b606728e9-config\") pod \"kube-apiserver-operator-766d6c64bb-8fhkw\" (UID: \"87018111-567b-4b30-a141-f20b606728e9\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-8fhkw" Feb 26 15:46:20 crc kubenswrapper[4907]: I0226 
15:46:20.878452 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qc9th\" (UniqueName: \"kubernetes.io/projected/2ec425f0-76a0-445f-8d38-a4f125da3312-kube-api-access-qc9th\") pod \"catalog-operator-68c6474976-z86sf\" (UID: \"2ec425f0-76a0-445f-8d38-a4f125da3312\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-z86sf" Feb 26 15:46:20 crc kubenswrapper[4907]: I0226 15:46:20.878513 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fb7xh\" (UniqueName: \"kubernetes.io/projected/05322669-16de-41ca-9ae9-3580b5cdda05-kube-api-access-fb7xh\") pod \"migrator-59844c95c7-gtnvm\" (UID: \"05322669-16de-41ca-9ae9-3580b5cdda05\") " pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-gtnvm" Feb 26 15:46:20 crc kubenswrapper[4907]: I0226 15:46:20.878552 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nkzrs\" (UniqueName: \"kubernetes.io/projected/1c8904fd-8dd8-418a-b32a-eb1ccf934fec-kube-api-access-nkzrs\") pod \"openshift-controller-manager-operator-756b6f6bc6-4z8ql\" (UID: \"1c8904fd-8dd8-418a-b32a-eb1ccf934fec\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-4z8ql" Feb 26 15:46:20 crc kubenswrapper[4907]: I0226 15:46:20.878578 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/317291a5-1f7f-4d5a-8779-7c769dae2bc5-auth-proxy-config\") pod \"machine-config-operator-74547568cd-4z9rn\" (UID: \"317291a5-1f7f-4d5a-8779-7c769dae2bc5\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-4z9rn" Feb 26 15:46:20 crc kubenswrapper[4907]: I0226 15:46:20.878623 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6zdhd\" (UniqueName: 
\"kubernetes.io/projected/b71208e6-41a8-44a3-a8fd-7380b1da6ffa-kube-api-access-6zdhd\") pod \"machine-config-controller-84d6567774-rsw5p\" (UID: \"b71208e6-41a8-44a3-a8fd-7380b1da6ffa\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-rsw5p" Feb 26 15:46:20 crc kubenswrapper[4907]: I0226 15:46:20.878645 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/d01c15cd-3103-49df-afdd-e6f6d6f35716-registration-dir\") pod \"csi-hostpathplugin-l5fqj\" (UID: \"d01c15cd-3103-49df-afdd-e6f6d6f35716\") " pod="hostpath-provisioner/csi-hostpathplugin-l5fqj" Feb 26 15:46:20 crc kubenswrapper[4907]: I0226 15:46:20.878681 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-667f5\" (UniqueName: \"kubernetes.io/projected/0dd74211-40c2-437c-9295-b69e709f81fe-kube-api-access-667f5\") pod \"collect-profiles-29535345-b7r88\" (UID: \"0dd74211-40c2-437c-9295-b69e709f81fe\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29535345-b7r88" Feb 26 15:46:20 crc kubenswrapper[4907]: I0226 15:46:20.878728 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/f1e8a3f9-9de9-4181-869f-9fce597e6b5b-bound-sa-token\") pod \"ingress-operator-5b745b69d9-gnh8z\" (UID: \"f1e8a3f9-9de9-4181-869f-9fce597e6b5b\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-gnh8z" Feb 26 15:46:20 crc kubenswrapper[4907]: I0226 15:46:20.878762 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/ed605a31-991f-4fcc-a861-3bfe94c7b92c-serving-cert\") pod \"etcd-operator-b45778765-8wmgt\" (UID: \"ed605a31-991f-4fcc-a861-3bfe94c7b92c\") " pod="openshift-etcd-operator/etcd-operator-b45778765-8wmgt" Feb 26 15:46:20 crc kubenswrapper[4907]: 
I0226 15:46:20.878786 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7kvq8\" (UniqueName: \"kubernetes.io/projected/f6e20f97-f90b-41d7-905e-f627e07b0dfb-kube-api-access-7kvq8\") pod \"ingress-canary-hs7mv\" (UID: \"f6e20f97-f90b-41d7-905e-f627e07b0dfb\") " pod="openshift-ingress-canary/ingress-canary-hs7mv" Feb 26 15:46:20 crc kubenswrapper[4907]: I0226 15:46:20.878823 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/0fefaf3e-d327-41f8-bbbe-94b051a63b19-registry-certificates\") pod \"image-registry-697d97f7c8-kqtml\" (UID: \"0fefaf3e-d327-41f8-bbbe-94b051a63b19\") " pod="openshift-image-registry/image-registry-697d97f7c8-kqtml" Feb 26 15:46:20 crc kubenswrapper[4907]: I0226 15:46:20.878851 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6aa04e41-18ce-4928-b012-ae804b9cfafc-config\") pod \"kube-controller-manager-operator-78b949d7b-ks676\" (UID: \"6aa04e41-18ce-4928-b012-ae804b9cfafc\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-ks676" Feb 26 15:46:20 crc kubenswrapper[4907]: I0226 15:46:20.879242 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/0fefaf3e-d327-41f8-bbbe-94b051a63b19-ca-trust-extracted\") pod \"image-registry-697d97f7c8-kqtml\" (UID: \"0fefaf3e-d327-41f8-bbbe-94b051a63b19\") " pod="openshift-image-registry/image-registry-697d97f7c8-kqtml" Feb 26 15:46:20 crc kubenswrapper[4907]: E0226 15:46:20.879329 4907 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. 
No retries permitted until 2026-02-26 15:46:21.379309758 +0000 UTC m=+243.897871617 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 26 15:46:20 crc kubenswrapper[4907]: I0226 15:46:20.880349 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/f1e8a3f9-9de9-4181-869f-9fce597e6b5b-trusted-ca\") pod \"ingress-operator-5b745b69d9-gnh8z\" (UID: \"f1e8a3f9-9de9-4181-869f-9fce597e6b5b\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-gnh8z" Feb 26 15:46:20 crc kubenswrapper[4907]: I0226 15:46:20.880625 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nkk62\" (UniqueName: \"kubernetes.io/projected/f1e8a3f9-9de9-4181-869f-9fce597e6b5b-kube-api-access-nkk62\") pod \"ingress-operator-5b745b69d9-gnh8z\" (UID: \"f1e8a3f9-9de9-4181-869f-9fce597e6b5b\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-gnh8z" Feb 26 15:46:20 crc kubenswrapper[4907]: I0226 15:46:20.880662 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/ed605a31-991f-4fcc-a861-3bfe94c7b92c-etcd-client\") pod \"etcd-operator-b45778765-8wmgt\" (UID: \"ed605a31-991f-4fcc-a861-3bfe94c7b92c\") " pod="openshift-etcd-operator/etcd-operator-b45778765-8wmgt" Feb 26 15:46:20 crc kubenswrapper[4907]: I0226 15:46:20.888462 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"registry-certificates\" (UniqueName: 
\"kubernetes.io/configmap/0fefaf3e-d327-41f8-bbbe-94b051a63b19-registry-certificates\") pod \"image-registry-697d97f7c8-kqtml\" (UID: \"0fefaf3e-d327-41f8-bbbe-94b051a63b19\") " pod="openshift-image-registry/image-registry-697d97f7c8-kqtml" Feb 26 15:46:20 crc kubenswrapper[4907]: I0226 15:46:20.890352 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"package-server-manager-serving-cert\" (UniqueName: \"kubernetes.io/secret/70ad9c23-ce1d-4b1a-979d-08d20761353e-package-server-manager-serving-cert\") pod \"package-server-manager-789f6589d5-rlmpn\" (UID: \"70ad9c23-ce1d-4b1a-979d-08d20761353e\") " pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-rlmpn" Feb 26 15:46:20 crc kubenswrapper[4907]: I0226 15:46:20.890458 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/14efe72b-80f8-4748-bcc9-e4f20a7eb28e-signing-cabundle\") pod \"service-ca-9c57cc56f-z5rgk\" (UID: \"14efe72b-80f8-4748-bcc9-e4f20a7eb28e\") " pod="openshift-service-ca/service-ca-9c57cc56f-z5rgk" Feb 26 15:46:20 crc kubenswrapper[4907]: I0226 15:46:20.890502 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-kqtml\" (UID: \"0fefaf3e-d327-41f8-bbbe-94b051a63b19\") " pod="openshift-image-registry/image-registry-697d97f7c8-kqtml" Feb 26 15:46:20 crc kubenswrapper[4907]: I0226 15:46:20.890527 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/80e13006-b114-4c3f-8669-62afc695914b-config\") pod \"service-ca-operator-777779d784-gvx2f\" (UID: \"80e13006-b114-4c3f-8669-62afc695914b\") " 
pod="openshift-service-ca-operator/service-ca-operator-777779d784-gvx2f" Feb 26 15:46:20 crc kubenswrapper[4907]: I0226 15:46:20.890570 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-v9n5h\" (UniqueName: \"kubernetes.io/projected/70ad9c23-ce1d-4b1a-979d-08d20761353e-kube-api-access-v9n5h\") pod \"package-server-manager-789f6589d5-rlmpn\" (UID: \"70ad9c23-ce1d-4b1a-979d-08d20761353e\") " pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-rlmpn" Feb 26 15:46:20 crc kubenswrapper[4907]: I0226 15:46:20.890638 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/f6e20f97-f90b-41d7-905e-f627e07b0dfb-cert\") pod \"ingress-canary-hs7mv\" (UID: \"f6e20f97-f90b-41d7-905e-f627e07b0dfb\") " pod="openshift-ingress-canary/ingress-canary-hs7mv" Feb 26 15:46:20 crc kubenswrapper[4907]: I0226 15:46:20.890661 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-service-ca\" (UniqueName: \"kubernetes.io/configmap/ed605a31-991f-4fcc-a861-3bfe94c7b92c-etcd-service-ca\") pod \"etcd-operator-b45778765-8wmgt\" (UID: \"ed605a31-991f-4fcc-a861-3bfe94c7b92c\") " pod="openshift-etcd-operator/etcd-operator-b45778765-8wmgt" Feb 26 15:46:20 crc kubenswrapper[4907]: I0226 15:46:20.890683 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/2ec425f0-76a0-445f-8d38-a4f125da3312-profile-collector-cert\") pod \"catalog-operator-68c6474976-z86sf\" (UID: \"2ec425f0-76a0-445f-8d38-a4f125da3312\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-z86sf" Feb 26 15:46:20 crc kubenswrapper[4907]: I0226 15:46:20.890723 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-g6kj7\" (UniqueName: 
\"kubernetes.io/projected/23df369e-238f-4fbc-99fa-b22c21011db0-kube-api-access-g6kj7\") pod \"marketplace-operator-79b997595-dvcn5\" (UID: \"23df369e-238f-4fbc-99fa-b22c21011db0\") " pod="openshift-marketplace/marketplace-operator-79b997595-dvcn5" Feb 26 15:46:20 crc kubenswrapper[4907]: I0226 15:46:20.890764 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-bootstrap-token\" (UniqueName: \"kubernetes.io/secret/ae456f0d-bf77-4e93-9bf2-c47c27b8eadf-node-bootstrap-token\") pod \"machine-config-server-k8lkj\" (UID: \"ae456f0d-bf77-4e93-9bf2-c47c27b8eadf\") " pod="openshift-machine-config-operator/machine-config-server-k8lkj" Feb 26 15:46:20 crc kubenswrapper[4907]: I0226 15:46:20.890788 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-md2jt\" (UniqueName: \"kubernetes.io/projected/def12a12-3cf0-4694-a957-3e69aa18f880-kube-api-access-md2jt\") pod \"router-default-5444994796-hqs2t\" (UID: \"def12a12-3cf0-4694-a957-3e69aa18f880\") " pod="openshift-ingress/router-default-5444994796-hqs2t" Feb 26 15:46:20 crc kubenswrapper[4907]: I0226 15:46:20.890810 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/8b678693-5390-4ce1-bf51-a2da37343241-serving-cert\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-hztww\" (UID: \"8b678693-5390-4ce1-bf51-a2da37343241\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-hztww" Feb 26 15:46:20 crc kubenswrapper[4907]: I0226 15:46:20.890847 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-h4l8g\" (UniqueName: \"kubernetes.io/projected/0fefaf3e-d327-41f8-bbbe-94b051a63b19-kube-api-access-h4l8g\") pod \"image-registry-697d97f7c8-kqtml\" (UID: \"0fefaf3e-d327-41f8-bbbe-94b051a63b19\") " 
pod="openshift-image-registry/image-registry-697d97f7c8-kqtml" Feb 26 15:46:20 crc kubenswrapper[4907]: I0226 15:46:20.890882 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ed605a31-991f-4fcc-a861-3bfe94c7b92c-config\") pod \"etcd-operator-b45778765-8wmgt\" (UID: \"ed605a31-991f-4fcc-a861-3bfe94c7b92c\") " pod="openshift-etcd-operator/etcd-operator-b45778765-8wmgt" Feb 26 15:46:20 crc kubenswrapper[4907]: I0226 15:46:20.890908 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rnlnc\" (UniqueName: \"kubernetes.io/projected/d01c15cd-3103-49df-afdd-e6f6d6f35716-kube-api-access-rnlnc\") pod \"csi-hostpathplugin-l5fqj\" (UID: \"d01c15cd-3103-49df-afdd-e6f6d6f35716\") " pod="hostpath-provisioner/csi-hostpathplugin-l5fqj" Feb 26 15:46:20 crc kubenswrapper[4907]: I0226 15:46:20.890928 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/14efe72b-80f8-4748-bcc9-e4f20a7eb28e-signing-key\") pod \"service-ca-9c57cc56f-z5rgk\" (UID: \"14efe72b-80f8-4748-bcc9-e4f20a7eb28e\") " pod="openshift-service-ca/service-ca-9c57cc56f-z5rgk" Feb 26 15:46:20 crc kubenswrapper[4907]: E0226 15:46:20.890939 4907 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-26 15:46:21.390923684 +0000 UTC m=+243.909485533 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-kqtml" (UID: "0fefaf3e-d327-41f8-bbbe-94b051a63b19") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 26 15:46:20 crc kubenswrapper[4907]: I0226 15:46:20.890962 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/def12a12-3cf0-4694-a957-3e69aa18f880-metrics-certs\") pod \"router-default-5444994796-hqs2t\" (UID: \"def12a12-3cf0-4694-a957-3e69aa18f880\") " pod="openshift-ingress/router-default-5444994796-hqs2t" Feb 26 15:46:20 crc kubenswrapper[4907]: I0226 15:46:20.891006 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/73dbdcc4-3b7d-4593-a8e1-b15f824b5670-apiservice-cert\") pod \"packageserver-d55dfcdfc-z6m64\" (UID: \"73dbdcc4-3b7d-4593-a8e1-b15f824b5670\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-z6m64" Feb 26 15:46:20 crc kubenswrapper[4907]: I0226 15:46:20.891046 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/101ef487-124a-40ce-bf7d-8b7efcab6765-serving-cert\") pod \"kube-storage-version-migrator-operator-b67b599dd-k2mmn\" (UID: \"101ef487-124a-40ce-bf7d-8b7efcab6765\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-k2mmn" Feb 26 15:46:20 crc kubenswrapper[4907]: I0226 15:46:20.891084 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"marketplace-operator-metrics\" (UniqueName: 
\"kubernetes.io/secret/23df369e-238f-4fbc-99fa-b22c21011db0-marketplace-operator-metrics\") pod \"marketplace-operator-79b997595-dvcn5\" (UID: \"23df369e-238f-4fbc-99fa-b22c21011db0\") " pod="openshift-marketplace/marketplace-operator-79b997595-dvcn5" Feb 26 15:46:20 crc kubenswrapper[4907]: I0226 15:46:20.891107 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/6aa04e41-18ce-4928-b012-ae804b9cfafc-kube-api-access\") pod \"kube-controller-manager-operator-78b949d7b-ks676\" (UID: \"6aa04e41-18ce-4928-b012-ae804b9cfafc\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-ks676" Feb 26 15:46:20 crc kubenswrapper[4907]: I0226 15:46:20.891128 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-l9mmk\" (UniqueName: \"kubernetes.io/projected/2a0a1c34-d485-449a-86c9-8c4631a023b5-kube-api-access-l9mmk\") pod \"multus-admission-controller-857f4d67dd-fd8f2\" (UID: \"2a0a1c34-d485-449a-86c9-8c4631a023b5\") " pod="openshift-multus/multus-admission-controller-857f4d67dd-fd8f2" Feb 26 15:46:20 crc kubenswrapper[4907]: I0226 15:46:20.891149 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9c5rq\" (UniqueName: \"kubernetes.io/projected/14efe72b-80f8-4748-bcc9-e4f20a7eb28e-kube-api-access-9c5rq\") pod \"service-ca-9c57cc56f-z5rgk\" (UID: \"14efe72b-80f8-4748-bcc9-e4f20a7eb28e\") " pod="openshift-service-ca/service-ca-9c57cc56f-z5rgk" Feb 26 15:46:20 crc kubenswrapper[4907]: I0226 15:46:20.891172 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/2a0a1c34-d485-449a-86c9-8c4631a023b5-webhook-certs\") pod \"multus-admission-controller-857f4d67dd-fd8f2\" (UID: \"2a0a1c34-d485-449a-86c9-8c4631a023b5\") " 
pod="openshift-multus/multus-admission-controller-857f4d67dd-fd8f2" Feb 26 15:46:20 crc kubenswrapper[4907]: I0226 15:46:20.891194 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"mcc-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/b71208e6-41a8-44a3-a8fd-7380b1da6ffa-mcc-auth-proxy-config\") pod \"machine-config-controller-84d6567774-rsw5p\" (UID: \"b71208e6-41a8-44a3-a8fd-7380b1da6ffa\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-rsw5p" Feb 26 15:46:20 crc kubenswrapper[4907]: I0226 15:46:20.891213 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/80e13006-b114-4c3f-8669-62afc695914b-serving-cert\") pod \"service-ca-operator-777779d784-gvx2f\" (UID: \"80e13006-b114-4c3f-8669-62afc695914b\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-gvx2f" Feb 26 15:46:20 crc kubenswrapper[4907]: I0226 15:46:20.891238 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1c8904fd-8dd8-418a-b32a-eb1ccf934fec-config\") pod \"openshift-controller-manager-operator-756b6f6bc6-4z8ql\" (UID: \"1c8904fd-8dd8-418a-b32a-eb1ccf934fec\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-4z8ql" Feb 26 15:46:20 crc kubenswrapper[4907]: I0226 15:46:20.891261 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/317291a5-1f7f-4d5a-8779-7c769dae2bc5-images\") pod \"machine-config-operator-74547568cd-4z9rn\" (UID: \"317291a5-1f7f-4d5a-8779-7c769dae2bc5\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-4z9rn" Feb 26 15:46:20 crc kubenswrapper[4907]: I0226 15:46:20.891284 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"kube-api-access-77796\" (UniqueName: \"kubernetes.io/projected/af8aa9df-432b-40bd-847c-c3539b32cb59-kube-api-access-77796\") pod \"control-plane-machine-set-operator-78cbb6b69f-w9nx4\" (UID: \"af8aa9df-432b-40bd-847c-c3539b32cb59\") " pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-w9nx4" Feb 26 15:46:20 crc kubenswrapper[4907]: I0226 15:46:20.891307 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"csi-data-dir\" (UniqueName: \"kubernetes.io/host-path/d01c15cd-3103-49df-afdd-e6f6d6f35716-csi-data-dir\") pod \"csi-hostpathplugin-l5fqj\" (UID: \"d01c15cd-3103-49df-afdd-e6f6d6f35716\") " pod="hostpath-provisioner/csi-hostpathplugin-l5fqj" Feb 26 15:46:20 crc kubenswrapper[4907]: I0226 15:46:20.891330 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hg4q6\" (UniqueName: \"kubernetes.io/projected/ae456f0d-bf77-4e93-9bf2-c47c27b8eadf-kube-api-access-hg4q6\") pod \"machine-config-server-k8lkj\" (UID: \"ae456f0d-bf77-4e93-9bf2-c47c27b8eadf\") " pod="openshift-machine-config-operator/machine-config-server-k8lkj" Feb 26 15:46:20 crc kubenswrapper[4907]: I0226 15:46:20.891356 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-sbm8v\" (UniqueName: \"kubernetes.io/projected/ed605a31-991f-4fcc-a861-3bfe94c7b92c-kube-api-access-sbm8v\") pod \"etcd-operator-b45778765-8wmgt\" (UID: \"ed605a31-991f-4fcc-a861-3bfe94c7b92c\") " pod="openshift-etcd-operator/etcd-operator-b45778765-8wmgt" Feb 26 15:46:20 crc kubenswrapper[4907]: I0226 15:46:20.891381 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"plugins-dir\" (UniqueName: \"kubernetes.io/host-path/d01c15cd-3103-49df-afdd-e6f6d6f35716-plugins-dir\") pod \"csi-hostpathplugin-l5fqj\" (UID: \"d01c15cd-3103-49df-afdd-e6f6d6f35716\") " pod="hostpath-provisioner/csi-hostpathplugin-l5fqj" Feb 26 
15:46:20 crc kubenswrapper[4907]: I0226 15:46:20.891404 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/def12a12-3cf0-4694-a957-3e69aa18f880-service-ca-bundle\") pod \"router-default-5444994796-hqs2t\" (UID: \"def12a12-3cf0-4694-a957-3e69aa18f880\") " pod="openshift-ingress/router-default-5444994796-hqs2t" Feb 26 15:46:20 crc kubenswrapper[4907]: I0226 15:46:20.891435 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e9aeee88-40a0-4c8a-aebf-680cf878f42e-config\") pod \"console-operator-58897d9998-sjflz\" (UID: \"e9aeee88-40a0-4c8a-aebf-680cf878f42e\") " pod="openshift-console-operator/console-operator-58897d9998-sjflz" Feb 26 15:46:20 crc kubenswrapper[4907]: I0226 15:46:20.891459 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8b678693-5390-4ce1-bf51-a2da37343241-config\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-hztww\" (UID: \"8b678693-5390-4ce1-bf51-a2da37343241\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-hztww" Feb 26 15:46:20 crc kubenswrapper[4907]: I0226 15:46:20.891481 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/def12a12-3cf0-4694-a957-3e69aa18f880-default-certificate\") pod \"router-default-5444994796-hqs2t\" (UID: \"def12a12-3cf0-4694-a957-3e69aa18f880\") " pod="openshift-ingress/router-default-5444994796-hqs2t" Feb 26 15:46:20 crc kubenswrapper[4907]: I0226 15:46:20.891519 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/6aa04e41-18ce-4928-b012-ae804b9cfafc-serving-cert\") pod 
\"kube-controller-manager-operator-78b949d7b-ks676\" (UID: \"6aa04e41-18ce-4928-b012-ae804b9cfafc\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-ks676" Feb 26 15:46:20 crc kubenswrapper[4907]: I0226 15:46:20.891551 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/87018111-567b-4b30-a141-f20b606728e9-serving-cert\") pod \"kube-apiserver-operator-766d6c64bb-8fhkw\" (UID: \"87018111-567b-4b30-a141-f20b606728e9\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-8fhkw" Feb 26 15:46:20 crc kubenswrapper[4907]: I0226 15:46:20.891573 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/0fefaf3e-d327-41f8-bbbe-94b051a63b19-bound-sa-token\") pod \"image-registry-697d97f7c8-kqtml\" (UID: \"0fefaf3e-d327-41f8-bbbe-94b051a63b19\") " pod="openshift-image-registry/image-registry-697d97f7c8-kqtml" Feb 26 15:46:20 crc kubenswrapper[4907]: I0226 15:46:20.891618 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/d01c15cd-3103-49df-afdd-e6f6d6f35716-socket-dir\") pod \"csi-hostpathplugin-l5fqj\" (UID: \"d01c15cd-3103-49df-afdd-e6f6d6f35716\") " pod="hostpath-provisioner/csi-hostpathplugin-l5fqj" Feb 26 15:46:20 crc kubenswrapper[4907]: I0226 15:46:20.891642 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/101ef487-124a-40ce-bf7d-8b7efcab6765-config\") pod \"kube-storage-version-migrator-operator-b67b599dd-k2mmn\" (UID: \"101ef487-124a-40ce-bf7d-8b7efcab6765\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-k2mmn" Feb 26 15:46:20 crc kubenswrapper[4907]: I0226 
15:46:20.891668 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/e9aeee88-40a0-4c8a-aebf-680cf878f42e-serving-cert\") pod \"console-operator-58897d9998-sjflz\" (UID: \"e9aeee88-40a0-4c8a-aebf-680cf878f42e\") " pod="openshift-console-operator/console-operator-58897d9998-sjflz" Feb 26 15:46:20 crc kubenswrapper[4907]: I0226 15:46:20.891690 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vb4qn\" (UniqueName: \"kubernetes.io/projected/73dbdcc4-3b7d-4593-a8e1-b15f824b5670-kube-api-access-vb4qn\") pod \"packageserver-d55dfcdfc-z6m64\" (UID: \"73dbdcc4-3b7d-4593-a8e1-b15f824b5670\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-z6m64" Feb 26 15:46:20 crc kubenswrapper[4907]: I0226 15:46:20.891739 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2cth8\" (UniqueName: \"kubernetes.io/projected/e9aeee88-40a0-4c8a-aebf-680cf878f42e-kube-api-access-2cth8\") pod \"console-operator-58897d9998-sjflz\" (UID: \"e9aeee88-40a0-4c8a-aebf-680cf878f42e\") " pod="openshift-console-operator/console-operator-58897d9998-sjflz" Feb 26 15:46:20 crc kubenswrapper[4907]: I0226 15:46:20.891762 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6s48n\" (UniqueName: \"kubernetes.io/projected/317291a5-1f7f-4d5a-8779-7c769dae2bc5-kube-api-access-6s48n\") pod \"machine-config-operator-74547568cd-4z9rn\" (UID: \"317291a5-1f7f-4d5a-8779-7c769dae2bc5\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-4z9rn" Feb 26 15:46:20 crc kubenswrapper[4907]: I0226 15:46:20.891855 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmpfs\" (UniqueName: \"kubernetes.io/empty-dir/73dbdcc4-3b7d-4593-a8e1-b15f824b5670-tmpfs\") pod \"packageserver-d55dfcdfc-z6m64\" (UID: 
\"73dbdcc4-3b7d-4593-a8e1-b15f824b5670\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-z6m64" Feb 26 15:46:20 crc kubenswrapper[4907]: I0226 15:46:20.891900 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/8b678693-5390-4ce1-bf51-a2da37343241-kube-api-access\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-hztww\" (UID: \"8b678693-5390-4ce1-bf51-a2da37343241\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-hztww" Feb 26 15:46:20 crc kubenswrapper[4907]: I0226 15:46:20.891986 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/23df369e-238f-4fbc-99fa-b22c21011db0-marketplace-trusted-ca\") pod \"marketplace-operator-79b997595-dvcn5\" (UID: \"23df369e-238f-4fbc-99fa-b22c21011db0\") " pod="openshift-marketplace/marketplace-operator-79b997595-dvcn5" Feb 26 15:46:20 crc kubenswrapper[4907]: I0226 15:46:20.892067 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/f1e8a3f9-9de9-4181-869f-9fce597e6b5b-metrics-tls\") pod \"ingress-operator-5b745b69d9-gnh8z\" (UID: \"f1e8a3f9-9de9-4181-869f-9fce597e6b5b\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-gnh8z" Feb 26 15:46:20 crc kubenswrapper[4907]: I0226 15:46:20.892108 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/0fefaf3e-d327-41f8-bbbe-94b051a63b19-trusted-ca\") pod \"image-registry-697d97f7c8-kqtml\" (UID: \"0fefaf3e-d327-41f8-bbbe-94b051a63b19\") " pod="openshift-image-registry/image-registry-697d97f7c8-kqtml" Feb 26 15:46:20 crc kubenswrapper[4907]: I0226 15:46:20.892130 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" 
(UniqueName: \"kubernetes.io/secret/1c8904fd-8dd8-418a-b32a-eb1ccf934fec-serving-cert\") pod \"openshift-controller-manager-operator-756b6f6bc6-4z8ql\" (UID: \"1c8904fd-8dd8-418a-b32a-eb1ccf934fec\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-4z8ql" Feb 26 15:46:20 crc kubenswrapper[4907]: I0226 15:46:20.892146 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/e9aeee88-40a0-4c8a-aebf-680cf878f42e-trusted-ca\") pod \"console-operator-58897d9998-sjflz\" (UID: \"e9aeee88-40a0-4c8a-aebf-680cf878f42e\") " pod="openshift-console-operator/console-operator-58897d9998-sjflz" Feb 26 15:46:20 crc kubenswrapper[4907]: I0226 15:46:20.892168 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/2ec425f0-76a0-445f-8d38-a4f125da3312-srv-cert\") pod \"catalog-operator-68c6474976-z86sf\" (UID: \"2ec425f0-76a0-445f-8d38-a4f125da3312\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-z86sf" Feb 26 15:46:20 crc kubenswrapper[4907]: I0226 15:46:20.892200 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/0fefaf3e-d327-41f8-bbbe-94b051a63b19-installation-pull-secrets\") pod \"image-registry-697d97f7c8-kqtml\" (UID: \"0fefaf3e-d327-41f8-bbbe-94b051a63b19\") " pod="openshift-image-registry/image-registry-697d97f7c8-kqtml" Feb 26 15:46:20 crc kubenswrapper[4907]: I0226 15:46:20.892276 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/317291a5-1f7f-4d5a-8779-7c769dae2bc5-proxy-tls\") pod \"machine-config-operator-74547568cd-4z9rn\" (UID: \"317291a5-1f7f-4d5a-8779-7c769dae2bc5\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-4z9rn" Feb 26 
15:46:20 crc kubenswrapper[4907]: I0226 15:46:20.892292 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/87018111-567b-4b30-a141-f20b606728e9-kube-api-access\") pod \"kube-apiserver-operator-766d6c64bb-8fhkw\" (UID: \"87018111-567b-4b30-a141-f20b606728e9\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-8fhkw" Feb 26 15:46:20 crc kubenswrapper[4907]: I0226 15:46:20.892334 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-ca\" (UniqueName: \"kubernetes.io/configmap/ed605a31-991f-4fcc-a861-3bfe94c7b92c-etcd-ca\") pod \"etcd-operator-b45778765-8wmgt\" (UID: \"ed605a31-991f-4fcc-a861-3bfe94c7b92c\") " pod="openshift-etcd-operator/etcd-operator-b45778765-8wmgt" Feb 26 15:46:20 crc kubenswrapper[4907]: I0226 15:46:20.892355 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"control-plane-machine-set-operator-tls\" (UniqueName: \"kubernetes.io/secret/af8aa9df-432b-40bd-847c-c3539b32cb59-control-plane-machine-set-operator-tls\") pod \"control-plane-machine-set-operator-78cbb6b69f-w9nx4\" (UID: \"af8aa9df-432b-40bd-847c-c3539b32cb59\") " pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-w9nx4" Feb 26 15:46:20 crc kubenswrapper[4907]: I0226 15:46:20.892377 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jjbpf\" (UniqueName: \"kubernetes.io/projected/a766dd26-3d8c-464c-b873-f03d3895b9d1-kube-api-access-jjbpf\") pod \"olm-operator-6b444d44fb-sw4qw\" (UID: \"a766dd26-3d8c-464c-b873-f03d3895b9d1\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-sw4qw" Feb 26 15:46:20 crc kubenswrapper[4907]: I0226 15:46:20.892395 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pv5x9\" (UniqueName: 
\"kubernetes.io/projected/101ef487-124a-40ce-bf7d-8b7efcab6765-kube-api-access-pv5x9\") pod \"kube-storage-version-migrator-operator-b67b599dd-k2mmn\" (UID: \"101ef487-124a-40ce-bf7d-8b7efcab6765\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-k2mmn" Feb 26 15:46:20 crc kubenswrapper[4907]: I0226 15:46:20.892423 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bhbhg\" (UniqueName: \"kubernetes.io/projected/1b0532e1-9350-435d-bb1f-72bb0931a2e8-kube-api-access-bhbhg\") pod \"auto-csr-approver-29535344-fsndq\" (UID: \"1b0532e1-9350-435d-bb1f-72bb0931a2e8\") " pod="openshift-infra/auto-csr-approver-29535344-fsndq" Feb 26 15:46:20 crc kubenswrapper[4907]: I0226 15:46:20.892439 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/def12a12-3cf0-4694-a957-3e69aa18f880-stats-auth\") pod \"router-default-5444994796-hqs2t\" (UID: \"def12a12-3cf0-4694-a957-3e69aa18f880\") " pod="openshift-ingress/router-default-5444994796-hqs2t" Feb 26 15:46:20 crc kubenswrapper[4907]: I0226 15:46:20.892473 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/73dbdcc4-3b7d-4593-a8e1-b15f824b5670-webhook-cert\") pod \"packageserver-d55dfcdfc-z6m64\" (UID: \"73dbdcc4-3b7d-4593-a8e1-b15f824b5670\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-z6m64" Feb 26 15:46:20 crc kubenswrapper[4907]: I0226 15:46:20.892491 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/0fefaf3e-d327-41f8-bbbe-94b051a63b19-registry-tls\") pod \"image-registry-697d97f7c8-kqtml\" (UID: \"0fefaf3e-d327-41f8-bbbe-94b051a63b19\") " pod="openshift-image-registry/image-registry-697d97f7c8-kqtml" Feb 
26 15:46:20 crc kubenswrapper[4907]: I0226 15:46:20.892507 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/b71208e6-41a8-44a3-a8fd-7380b1da6ffa-proxy-tls\") pod \"machine-config-controller-84d6567774-rsw5p\" (UID: \"b71208e6-41a8-44a3-a8fd-7380b1da6ffa\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-rsw5p" Feb 26 15:46:20 crc kubenswrapper[4907]: I0226 15:46:20.892534 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/2305f4ed-b155-4e30-b83c-7dde9bec7b28-metrics-tls\") pod \"dns-operator-744455d44c-tvpcl\" (UID: \"2305f4ed-b155-4e30-b83c-7dde9bec7b28\") " pod="openshift-dns-operator/dns-operator-744455d44c-tvpcl" Feb 26 15:46:20 crc kubenswrapper[4907]: I0226 15:46:20.892551 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/6e6248c0-ae25-48db-9112-ffeb9f9ca6a2-config-volume\") pod \"dns-default-6g628\" (UID: \"6e6248c0-ae25-48db-9112-ffeb9f9ca6a2\") " pod="openshift-dns/dns-default-6g628" Feb 26 15:46:20 crc kubenswrapper[4907]: I0226 15:46:20.892572 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"certs\" (UniqueName: \"kubernetes.io/secret/ae456f0d-bf77-4e93-9bf2-c47c27b8eadf-certs\") pod \"machine-config-server-k8lkj\" (UID: \"ae456f0d-bf77-4e93-9bf2-c47c27b8eadf\") " pod="openshift-machine-config-operator/machine-config-server-k8lkj" Feb 26 15:46:20 crc kubenswrapper[4907]: I0226 15:46:20.892634 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kqdkw\" (UniqueName: \"kubernetes.io/projected/80e13006-b114-4c3f-8669-62afc695914b-kube-api-access-kqdkw\") pod \"service-ca-operator-777779d784-gvx2f\" (UID: 
\"80e13006-b114-4c3f-8669-62afc695914b\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-gvx2f" Feb 26 15:46:20 crc kubenswrapper[4907]: I0226 15:46:20.899506 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-service-ca\" (UniqueName: \"kubernetes.io/configmap/ed605a31-991f-4fcc-a861-3bfe94c7b92c-etcd-service-ca\") pod \"etcd-operator-b45778765-8wmgt\" (UID: \"ed605a31-991f-4fcc-a861-3bfe94c7b92c\") " pod="openshift-etcd-operator/etcd-operator-b45778765-8wmgt" Feb 26 15:46:20 crc kubenswrapper[4907]: I0226 15:46:20.901002 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ed605a31-991f-4fcc-a861-3bfe94c7b92c-config\") pod \"etcd-operator-b45778765-8wmgt\" (UID: \"ed605a31-991f-4fcc-a861-3bfe94c7b92c\") " pod="openshift-etcd-operator/etcd-operator-b45778765-8wmgt" Feb 26 15:46:20 crc kubenswrapper[4907]: I0226 15:46:20.901302 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/23df369e-238f-4fbc-99fa-b22c21011db0-marketplace-trusted-ca\") pod \"marketplace-operator-79b997595-dvcn5\" (UID: \"23df369e-238f-4fbc-99fa-b22c21011db0\") " pod="openshift-marketplace/marketplace-operator-79b997595-dvcn5" Feb 26 15:46:20 crc kubenswrapper[4907]: I0226 15:46:20.901424 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/317291a5-1f7f-4d5a-8779-7c769dae2bc5-auth-proxy-config\") pod \"machine-config-operator-74547568cd-4z9rn\" (UID: \"317291a5-1f7f-4d5a-8779-7c769dae2bc5\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-4z9rn" Feb 26 15:46:20 crc kubenswrapper[4907]: I0226 15:46:20.901459 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e9aeee88-40a0-4c8a-aebf-680cf878f42e-config\") pod 
\"console-operator-58897d9998-sjflz\" (UID: \"e9aeee88-40a0-4c8a-aebf-680cf878f42e\") " pod="openshift-console-operator/console-operator-58897d9998-sjflz" Feb 26 15:46:20 crc kubenswrapper[4907]: I0226 15:46:20.902671 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-ca\" (UniqueName: \"kubernetes.io/configmap/ed605a31-991f-4fcc-a861-3bfe94c7b92c-etcd-ca\") pod \"etcd-operator-b45778765-8wmgt\" (UID: \"ed605a31-991f-4fcc-a861-3bfe94c7b92c\") " pod="openshift-etcd-operator/etcd-operator-b45778765-8wmgt" Feb 26 15:46:20 crc kubenswrapper[4907]: I0226 15:46:20.903009 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"images\" (UniqueName: \"kubernetes.io/configmap/317291a5-1f7f-4d5a-8779-7c769dae2bc5-images\") pod \"machine-config-operator-74547568cd-4z9rn\" (UID: \"317291a5-1f7f-4d5a-8779-7c769dae2bc5\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-4z9rn" Feb 26 15:46:20 crc kubenswrapper[4907]: I0226 15:46:20.925702 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1c8904fd-8dd8-418a-b32a-eb1ccf934fec-config\") pod \"openshift-controller-manager-operator-756b6f6bc6-4z8ql\" (UID: \"1c8904fd-8dd8-418a-b32a-eb1ccf934fec\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-4z8ql" Feb 26 15:46:20 crc kubenswrapper[4907]: I0226 15:46:20.931981 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/ed605a31-991f-4fcc-a861-3bfe94c7b92c-etcd-client\") pod \"etcd-operator-b45778765-8wmgt\" (UID: \"ed605a31-991f-4fcc-a861-3bfe94c7b92c\") " pod="openshift-etcd-operator/etcd-operator-b45778765-8wmgt" Feb 26 15:46:20 crc kubenswrapper[4907]: I0226 15:46:20.932838 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: 
\"kubernetes.io/configmap/0fefaf3e-d327-41f8-bbbe-94b051a63b19-trusted-ca\") pod \"image-registry-697d97f7c8-kqtml\" (UID: \"0fefaf3e-d327-41f8-bbbe-94b051a63b19\") " pod="openshift-image-registry/image-registry-697d97f7c8-kqtml" Feb 26 15:46:20 crc kubenswrapper[4907]: I0226 15:46:20.933127 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/e9aeee88-40a0-4c8a-aebf-680cf878f42e-trusted-ca\") pod \"console-operator-58897d9998-sjflz\" (UID: \"e9aeee88-40a0-4c8a-aebf-680cf878f42e\") " pod="openshift-console-operator/console-operator-58897d9998-sjflz" Feb 26 15:46:20 crc kubenswrapper[4907]: I0226 15:46:20.941181 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dvt59\" (UniqueName: \"kubernetes.io/projected/2305f4ed-b155-4e30-b83c-7dde9bec7b28-kube-api-access-dvt59\") pod \"dns-operator-744455d44c-tvpcl\" (UID: \"2305f4ed-b155-4e30-b83c-7dde9bec7b28\") " pod="openshift-dns-operator/dns-operator-744455d44c-tvpcl" Feb 26 15:46:20 crc kubenswrapper[4907]: I0226 15:46:20.941806 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/2305f4ed-b155-4e30-b83c-7dde9bec7b28-metrics-tls\") pod \"dns-operator-744455d44c-tvpcl\" (UID: \"2305f4ed-b155-4e30-b83c-7dde9bec7b28\") " pod="openshift-dns-operator/dns-operator-744455d44c-tvpcl" Feb 26 15:46:20 crc kubenswrapper[4907]: I0226 15:46:20.944819 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/0fefaf3e-d327-41f8-bbbe-94b051a63b19-installation-pull-secrets\") pod \"image-registry-697d97f7c8-kqtml\" (UID: \"0fefaf3e-d327-41f8-bbbe-94b051a63b19\") " pod="openshift-image-registry/image-registry-697d97f7c8-kqtml" Feb 26 15:46:20 crc kubenswrapper[4907]: I0226 15:46:20.945138 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"registry-tls\" (UniqueName: \"kubernetes.io/projected/0fefaf3e-d327-41f8-bbbe-94b051a63b19-registry-tls\") pod \"image-registry-697d97f7c8-kqtml\" (UID: \"0fefaf3e-d327-41f8-bbbe-94b051a63b19\") " pod="openshift-image-registry/image-registry-697d97f7c8-kqtml" Feb 26 15:46:20 crc kubenswrapper[4907]: I0226 15:46:20.946031 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/317291a5-1f7f-4d5a-8779-7c769dae2bc5-proxy-tls\") pod \"machine-config-operator-74547568cd-4z9rn\" (UID: \"317291a5-1f7f-4d5a-8779-7c769dae2bc5\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-4z9rn" Feb 26 15:46:20 crc kubenswrapper[4907]: I0226 15:46:20.950260 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nkk62\" (UniqueName: \"kubernetes.io/projected/f1e8a3f9-9de9-4181-869f-9fce597e6b5b-kube-api-access-nkk62\") pod \"ingress-operator-5b745b69d9-gnh8z\" (UID: \"f1e8a3f9-9de9-4181-869f-9fce597e6b5b\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-gnh8z" Feb 26 15:46:20 crc kubenswrapper[4907]: I0226 15:46:20.952342 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1c8904fd-8dd8-418a-b32a-eb1ccf934fec-serving-cert\") pod \"openshift-controller-manager-operator-756b6f6bc6-4z8ql\" (UID: \"1c8904fd-8dd8-418a-b32a-eb1ccf934fec\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-4z8ql" Feb 26 15:46:20 crc kubenswrapper[4907]: I0226 15:46:20.958680 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/ed605a31-991f-4fcc-a861-3bfe94c7b92c-serving-cert\") pod \"etcd-operator-b45778765-8wmgt\" (UID: \"ed605a31-991f-4fcc-a861-3bfe94c7b92c\") " pod="openshift-etcd-operator/etcd-operator-b45778765-8wmgt" Feb 26 15:46:20 crc kubenswrapper[4907]: I0226 
15:46:20.958756 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/f1e8a3f9-9de9-4181-869f-9fce597e6b5b-metrics-tls\") pod \"ingress-operator-5b745b69d9-gnh8z\" (UID: \"f1e8a3f9-9de9-4181-869f-9fce597e6b5b\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-gnh8z" Feb 26 15:46:20 crc kubenswrapper[4907]: I0226 15:46:20.959057 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/e9aeee88-40a0-4c8a-aebf-680cf878f42e-serving-cert\") pod \"console-operator-58897d9998-sjflz\" (UID: \"e9aeee88-40a0-4c8a-aebf-680cf878f42e\") " pod="openshift-console-operator/console-operator-58897d9998-sjflz" Feb 26 15:46:20 crc kubenswrapper[4907]: I0226 15:46:20.959146 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nkzrs\" (UniqueName: \"kubernetes.io/projected/1c8904fd-8dd8-418a-b32a-eb1ccf934fec-kube-api-access-nkzrs\") pod \"openshift-controller-manager-operator-756b6f6bc6-4z8ql\" (UID: \"1c8904fd-8dd8-418a-b32a-eb1ccf934fec\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-4z8ql" Feb 26 15:46:20 crc kubenswrapper[4907]: I0226 15:46:20.961598 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"control-plane-machine-set-operator-tls\" (UniqueName: \"kubernetes.io/secret/af8aa9df-432b-40bd-847c-c3539b32cb59-control-plane-machine-set-operator-tls\") pod \"control-plane-machine-set-operator-78cbb6b69f-w9nx4\" (UID: \"af8aa9df-432b-40bd-847c-c3539b32cb59\") " pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-w9nx4" Feb 26 15:46:20 crc kubenswrapper[4907]: I0226 15:46:20.979215 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/23df369e-238f-4fbc-99fa-b22c21011db0-marketplace-operator-metrics\") pod 
\"marketplace-operator-79b997595-dvcn5\" (UID: \"23df369e-238f-4fbc-99fa-b22c21011db0\") " pod="openshift-marketplace/marketplace-operator-79b997595-dvcn5" Feb 26 15:46:20 crc kubenswrapper[4907]: I0226 15:46:20.979713 4907 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-authentication-operator/authentication-operator-69f744f599-wtjfv"] Feb 26 15:46:20 crc kubenswrapper[4907]: I0226 15:46:20.988962 4907 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-4z8ql" Feb 26 15:46:20 crc kubenswrapper[4907]: I0226 15:46:20.994083 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 26 15:46:20 crc kubenswrapper[4907]: I0226 15:46:20.994264 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/2ec425f0-76a0-445f-8d38-a4f125da3312-srv-cert\") pod \"catalog-operator-68c6474976-z86sf\" (UID: \"2ec425f0-76a0-445f-8d38-a4f125da3312\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-z86sf" Feb 26 15:46:20 crc kubenswrapper[4907]: I0226 15:46:20.994287 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/87018111-567b-4b30-a141-f20b606728e9-kube-api-access\") pod \"kube-apiserver-operator-766d6c64bb-8fhkw\" (UID: \"87018111-567b-4b30-a141-f20b606728e9\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-8fhkw" Feb 26 15:46:20 crc kubenswrapper[4907]: I0226 15:46:20.994314 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"kube-api-access-jjbpf\" (UniqueName: \"kubernetes.io/projected/a766dd26-3d8c-464c-b873-f03d3895b9d1-kube-api-access-jjbpf\") pod \"olm-operator-6b444d44fb-sw4qw\" (UID: \"a766dd26-3d8c-464c-b873-f03d3895b9d1\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-sw4qw" Feb 26 15:46:20 crc kubenswrapper[4907]: I0226 15:46:20.994330 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pv5x9\" (UniqueName: \"kubernetes.io/projected/101ef487-124a-40ce-bf7d-8b7efcab6765-kube-api-access-pv5x9\") pod \"kube-storage-version-migrator-operator-b67b599dd-k2mmn\" (UID: \"101ef487-124a-40ce-bf7d-8b7efcab6765\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-k2mmn" Feb 26 15:46:20 crc kubenswrapper[4907]: I0226 15:46:20.994350 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bhbhg\" (UniqueName: \"kubernetes.io/projected/1b0532e1-9350-435d-bb1f-72bb0931a2e8-kube-api-access-bhbhg\") pod \"auto-csr-approver-29535344-fsndq\" (UID: \"1b0532e1-9350-435d-bb1f-72bb0931a2e8\") " pod="openshift-infra/auto-csr-approver-29535344-fsndq" Feb 26 15:46:20 crc kubenswrapper[4907]: I0226 15:46:20.994370 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/def12a12-3cf0-4694-a957-3e69aa18f880-stats-auth\") pod \"router-default-5444994796-hqs2t\" (UID: \"def12a12-3cf0-4694-a957-3e69aa18f880\") " pod="openshift-ingress/router-default-5444994796-hqs2t" Feb 26 15:46:20 crc kubenswrapper[4907]: I0226 15:46:20.994394 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/73dbdcc4-3b7d-4593-a8e1-b15f824b5670-webhook-cert\") pod \"packageserver-d55dfcdfc-z6m64\" (UID: \"73dbdcc4-3b7d-4593-a8e1-b15f824b5670\") " 
pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-z6m64" Feb 26 15:46:20 crc kubenswrapper[4907]: I0226 15:46:20.994410 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/b71208e6-41a8-44a3-a8fd-7380b1da6ffa-proxy-tls\") pod \"machine-config-controller-84d6567774-rsw5p\" (UID: \"b71208e6-41a8-44a3-a8fd-7380b1da6ffa\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-rsw5p" Feb 26 15:46:20 crc kubenswrapper[4907]: I0226 15:46:20.994424 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/6e6248c0-ae25-48db-9112-ffeb9f9ca6a2-config-volume\") pod \"dns-default-6g628\" (UID: \"6e6248c0-ae25-48db-9112-ffeb9f9ca6a2\") " pod="openshift-dns/dns-default-6g628" Feb 26 15:46:20 crc kubenswrapper[4907]: I0226 15:46:20.994440 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kqdkw\" (UniqueName: \"kubernetes.io/projected/80e13006-b114-4c3f-8669-62afc695914b-kube-api-access-kqdkw\") pod \"service-ca-operator-777779d784-gvx2f\" (UID: \"80e13006-b114-4c3f-8669-62afc695914b\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-gvx2f" Feb 26 15:46:20 crc kubenswrapper[4907]: I0226 15:46:20.994455 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"certs\" (UniqueName: \"kubernetes.io/secret/ae456f0d-bf77-4e93-9bf2-c47c27b8eadf-certs\") pod \"machine-config-server-k8lkj\" (UID: \"ae456f0d-bf77-4e93-9bf2-c47c27b8eadf\") " pod="openshift-machine-config-operator/machine-config-server-k8lkj" Feb 26 15:46:20 crc kubenswrapper[4907]: I0226 15:46:20.994473 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rnmft\" (UniqueName: \"kubernetes.io/projected/c6986b68-4a8d-4677-bed1-493eb1a231c3-kube-api-access-rnmft\") pod 
\"auto-csr-approver-29535346-hhrww\" (UID: \"c6986b68-4a8d-4677-bed1-493eb1a231c3\") " pod="openshift-infra/auto-csr-approver-29535346-hhrww" Feb 26 15:46:20 crc kubenswrapper[4907]: I0226 15:46:20.994489 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"mountpoint-dir\" (UniqueName: \"kubernetes.io/host-path/d01c15cd-3103-49df-afdd-e6f6d6f35716-mountpoint-dir\") pod \"csi-hostpathplugin-l5fqj\" (UID: \"d01c15cd-3103-49df-afdd-e6f6d6f35716\") " pod="hostpath-provisioner/csi-hostpathplugin-l5fqj" Feb 26 15:46:20 crc kubenswrapper[4907]: I0226 15:46:20.994504 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/0dd74211-40c2-437c-9295-b69e709f81fe-secret-volume\") pod \"collect-profiles-29535345-b7r88\" (UID: \"0dd74211-40c2-437c-9295-b69e709f81fe\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29535345-b7r88" Feb 26 15:46:20 crc kubenswrapper[4907]: I0226 15:46:20.994518 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/a766dd26-3d8c-464c-b873-f03d3895b9d1-srv-cert\") pod \"olm-operator-6b444d44fb-sw4qw\" (UID: \"a766dd26-3d8c-464c-b873-f03d3895b9d1\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-sw4qw" Feb 26 15:46:20 crc kubenswrapper[4907]: I0226 15:46:20.994533 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/a766dd26-3d8c-464c-b873-f03d3895b9d1-profile-collector-cert\") pod \"olm-operator-6b444d44fb-sw4qw\" (UID: \"a766dd26-3d8c-464c-b873-f03d3895b9d1\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-sw4qw" Feb 26 15:46:20 crc kubenswrapper[4907]: I0226 15:46:20.994548 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: 
\"kubernetes.io/secret/6e6248c0-ae25-48db-9112-ffeb9f9ca6a2-metrics-tls\") pod \"dns-default-6g628\" (UID: \"6e6248c0-ae25-48db-9112-ffeb9f9ca6a2\") " pod="openshift-dns/dns-default-6g628" Feb 26 15:46:20 crc kubenswrapper[4907]: I0226 15:46:20.994570 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6lgl6\" (UniqueName: \"kubernetes.io/projected/6e6248c0-ae25-48db-9112-ffeb9f9ca6a2-kube-api-access-6lgl6\") pod \"dns-default-6g628\" (UID: \"6e6248c0-ae25-48db-9112-ffeb9f9ca6a2\") " pod="openshift-dns/dns-default-6g628" Feb 26 15:46:20 crc kubenswrapper[4907]: I0226 15:46:20.994602 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/0dd74211-40c2-437c-9295-b69e709f81fe-config-volume\") pod \"collect-profiles-29535345-b7r88\" (UID: \"0dd74211-40c2-437c-9295-b69e709f81fe\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29535345-b7r88" Feb 26 15:46:20 crc kubenswrapper[4907]: I0226 15:46:20.994616 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/87018111-567b-4b30-a141-f20b606728e9-config\") pod \"kube-apiserver-operator-766d6c64bb-8fhkw\" (UID: \"87018111-567b-4b30-a141-f20b606728e9\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-8fhkw" Feb 26 15:46:20 crc kubenswrapper[4907]: I0226 15:46:20.994632 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qc9th\" (UniqueName: \"kubernetes.io/projected/2ec425f0-76a0-445f-8d38-a4f125da3312-kube-api-access-qc9th\") pod \"catalog-operator-68c6474976-z86sf\" (UID: \"2ec425f0-76a0-445f-8d38-a4f125da3312\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-z86sf" Feb 26 15:46:20 crc kubenswrapper[4907]: I0226 15:46:20.994650 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"kube-api-access-fb7xh\" (UniqueName: \"kubernetes.io/projected/05322669-16de-41ca-9ae9-3580b5cdda05-kube-api-access-fb7xh\") pod \"migrator-59844c95c7-gtnvm\" (UID: \"05322669-16de-41ca-9ae9-3580b5cdda05\") " pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-gtnvm" Feb 26 15:46:20 crc kubenswrapper[4907]: I0226 15:46:20.994665 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6zdhd\" (UniqueName: \"kubernetes.io/projected/b71208e6-41a8-44a3-a8fd-7380b1da6ffa-kube-api-access-6zdhd\") pod \"machine-config-controller-84d6567774-rsw5p\" (UID: \"b71208e6-41a8-44a3-a8fd-7380b1da6ffa\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-rsw5p" Feb 26 15:46:20 crc kubenswrapper[4907]: I0226 15:46:20.994680 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/d01c15cd-3103-49df-afdd-e6f6d6f35716-registration-dir\") pod \"csi-hostpathplugin-l5fqj\" (UID: \"d01c15cd-3103-49df-afdd-e6f6d6f35716\") " pod="hostpath-provisioner/csi-hostpathplugin-l5fqj" Feb 26 15:46:20 crc kubenswrapper[4907]: I0226 15:46:20.994695 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-667f5\" (UniqueName: \"kubernetes.io/projected/0dd74211-40c2-437c-9295-b69e709f81fe-kube-api-access-667f5\") pod \"collect-profiles-29535345-b7r88\" (UID: \"0dd74211-40c2-437c-9295-b69e709f81fe\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29535345-b7r88" Feb 26 15:46:20 crc kubenswrapper[4907]: I0226 15:46:20.994718 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7kvq8\" (UniqueName: \"kubernetes.io/projected/f6e20f97-f90b-41d7-905e-f627e07b0dfb-kube-api-access-7kvq8\") pod \"ingress-canary-hs7mv\" (UID: \"f6e20f97-f90b-41d7-905e-f627e07b0dfb\") " pod="openshift-ingress-canary/ingress-canary-hs7mv" Feb 26 15:46:20 crc 
kubenswrapper[4907]: I0226 15:46:20.994734 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6aa04e41-18ce-4928-b012-ae804b9cfafc-config\") pod \"kube-controller-manager-operator-78b949d7b-ks676\" (UID: \"6aa04e41-18ce-4928-b012-ae804b9cfafc\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-ks676" Feb 26 15:46:20 crc kubenswrapper[4907]: I0226 15:46:20.994751 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"package-server-manager-serving-cert\" (UniqueName: \"kubernetes.io/secret/70ad9c23-ce1d-4b1a-979d-08d20761353e-package-server-manager-serving-cert\") pod \"package-server-manager-789f6589d5-rlmpn\" (UID: \"70ad9c23-ce1d-4b1a-979d-08d20761353e\") " pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-rlmpn" Feb 26 15:46:20 crc kubenswrapper[4907]: I0226 15:46:20.994768 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/14efe72b-80f8-4748-bcc9-e4f20a7eb28e-signing-cabundle\") pod \"service-ca-9c57cc56f-z5rgk\" (UID: \"14efe72b-80f8-4748-bcc9-e4f20a7eb28e\") " pod="openshift-service-ca/service-ca-9c57cc56f-z5rgk" Feb 26 15:46:20 crc kubenswrapper[4907]: I0226 15:46:20.994790 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/80e13006-b114-4c3f-8669-62afc695914b-config\") pod \"service-ca-operator-777779d784-gvx2f\" (UID: \"80e13006-b114-4c3f-8669-62afc695914b\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-gvx2f" Feb 26 15:46:20 crc kubenswrapper[4907]: I0226 15:46:20.994799 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"mountpoint-dir\" (UniqueName: \"kubernetes.io/host-path/d01c15cd-3103-49df-afdd-e6f6d6f35716-mountpoint-dir\") pod \"csi-hostpathplugin-l5fqj\" (UID: 
\"d01c15cd-3103-49df-afdd-e6f6d6f35716\") " pod="hostpath-provisioner/csi-hostpathplugin-l5fqj" Feb 26 15:46:20 crc kubenswrapper[4907]: I0226 15:46:20.994807 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-v9n5h\" (UniqueName: \"kubernetes.io/projected/70ad9c23-ce1d-4b1a-979d-08d20761353e-kube-api-access-v9n5h\") pod \"package-server-manager-789f6589d5-rlmpn\" (UID: \"70ad9c23-ce1d-4b1a-979d-08d20761353e\") " pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-rlmpn" Feb 26 15:46:20 crc kubenswrapper[4907]: I0226 15:46:20.994834 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/f6e20f97-f90b-41d7-905e-f627e07b0dfb-cert\") pod \"ingress-canary-hs7mv\" (UID: \"f6e20f97-f90b-41d7-905e-f627e07b0dfb\") " pod="openshift-ingress-canary/ingress-canary-hs7mv" Feb 26 15:46:20 crc kubenswrapper[4907]: I0226 15:46:20.994851 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/2ec425f0-76a0-445f-8d38-a4f125da3312-profile-collector-cert\") pod \"catalog-operator-68c6474976-z86sf\" (UID: \"2ec425f0-76a0-445f-8d38-a4f125da3312\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-z86sf" Feb 26 15:46:20 crc kubenswrapper[4907]: I0226 15:46:20.994886 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-bootstrap-token\" (UniqueName: \"kubernetes.io/secret/ae456f0d-bf77-4e93-9bf2-c47c27b8eadf-node-bootstrap-token\") pod \"machine-config-server-k8lkj\" (UID: \"ae456f0d-bf77-4e93-9bf2-c47c27b8eadf\") " pod="openshift-machine-config-operator/machine-config-server-k8lkj" Feb 26 15:46:20 crc kubenswrapper[4907]: I0226 15:46:20.994905 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-md2jt\" (UniqueName: 
\"kubernetes.io/projected/def12a12-3cf0-4694-a957-3e69aa18f880-kube-api-access-md2jt\") pod \"router-default-5444994796-hqs2t\" (UID: \"def12a12-3cf0-4694-a957-3e69aa18f880\") " pod="openshift-ingress/router-default-5444994796-hqs2t" Feb 26 15:46:20 crc kubenswrapper[4907]: I0226 15:46:20.994928 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/8b678693-5390-4ce1-bf51-a2da37343241-serving-cert\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-hztww\" (UID: \"8b678693-5390-4ce1-bf51-a2da37343241\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-hztww" Feb 26 15:46:20 crc kubenswrapper[4907]: I0226 15:46:20.994944 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rnlnc\" (UniqueName: \"kubernetes.io/projected/d01c15cd-3103-49df-afdd-e6f6d6f35716-kube-api-access-rnlnc\") pod \"csi-hostpathplugin-l5fqj\" (UID: \"d01c15cd-3103-49df-afdd-e6f6d6f35716\") " pod="hostpath-provisioner/csi-hostpathplugin-l5fqj" Feb 26 15:46:20 crc kubenswrapper[4907]: I0226 15:46:20.994958 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/14efe72b-80f8-4748-bcc9-e4f20a7eb28e-signing-key\") pod \"service-ca-9c57cc56f-z5rgk\" (UID: \"14efe72b-80f8-4748-bcc9-e4f20a7eb28e\") " pod="openshift-service-ca/service-ca-9c57cc56f-z5rgk" Feb 26 15:46:20 crc kubenswrapper[4907]: I0226 15:46:20.994972 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/73dbdcc4-3b7d-4593-a8e1-b15f824b5670-apiservice-cert\") pod \"packageserver-d55dfcdfc-z6m64\" (UID: \"73dbdcc4-3b7d-4593-a8e1-b15f824b5670\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-z6m64" Feb 26 15:46:20 crc kubenswrapper[4907]: I0226 15:46:20.994987 4907 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/101ef487-124a-40ce-bf7d-8b7efcab6765-serving-cert\") pod \"kube-storage-version-migrator-operator-b67b599dd-k2mmn\" (UID: \"101ef487-124a-40ce-bf7d-8b7efcab6765\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-k2mmn" Feb 26 15:46:20 crc kubenswrapper[4907]: I0226 15:46:20.995002 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/def12a12-3cf0-4694-a957-3e69aa18f880-metrics-certs\") pod \"router-default-5444994796-hqs2t\" (UID: \"def12a12-3cf0-4694-a957-3e69aa18f880\") " pod="openshift-ingress/router-default-5444994796-hqs2t" Feb 26 15:46:20 crc kubenswrapper[4907]: I0226 15:46:20.995016 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/6aa04e41-18ce-4928-b012-ae804b9cfafc-kube-api-access\") pod \"kube-controller-manager-operator-78b949d7b-ks676\" (UID: \"6aa04e41-18ce-4928-b012-ae804b9cfafc\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-ks676" Feb 26 15:46:20 crc kubenswrapper[4907]: I0226 15:46:20.995031 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-l9mmk\" (UniqueName: \"kubernetes.io/projected/2a0a1c34-d485-449a-86c9-8c4631a023b5-kube-api-access-l9mmk\") pod \"multus-admission-controller-857f4d67dd-fd8f2\" (UID: \"2a0a1c34-d485-449a-86c9-8c4631a023b5\") " pod="openshift-multus/multus-admission-controller-857f4d67dd-fd8f2" Feb 26 15:46:20 crc kubenswrapper[4907]: I0226 15:46:20.995046 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9c5rq\" (UniqueName: \"kubernetes.io/projected/14efe72b-80f8-4748-bcc9-e4f20a7eb28e-kube-api-access-9c5rq\") pod \"service-ca-9c57cc56f-z5rgk\" (UID: 
\"14efe72b-80f8-4748-bcc9-e4f20a7eb28e\") " pod="openshift-service-ca/service-ca-9c57cc56f-z5rgk" Feb 26 15:46:20 crc kubenswrapper[4907]: I0226 15:46:20.995061 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/2a0a1c34-d485-449a-86c9-8c4631a023b5-webhook-certs\") pod \"multus-admission-controller-857f4d67dd-fd8f2\" (UID: \"2a0a1c34-d485-449a-86c9-8c4631a023b5\") " pod="openshift-multus/multus-admission-controller-857f4d67dd-fd8f2" Feb 26 15:46:20 crc kubenswrapper[4907]: I0226 15:46:20.995076 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/80e13006-b114-4c3f-8669-62afc695914b-serving-cert\") pod \"service-ca-operator-777779d784-gvx2f\" (UID: \"80e13006-b114-4c3f-8669-62afc695914b\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-gvx2f" Feb 26 15:46:21 crc kubenswrapper[4907]: I0226 15:46:20.995094 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"mcc-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/b71208e6-41a8-44a3-a8fd-7380b1da6ffa-mcc-auth-proxy-config\") pod \"machine-config-controller-84d6567774-rsw5p\" (UID: \"b71208e6-41a8-44a3-a8fd-7380b1da6ffa\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-rsw5p" Feb 26 15:46:21 crc kubenswrapper[4907]: I0226 15:46:20.995113 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"csi-data-dir\" (UniqueName: \"kubernetes.io/host-path/d01c15cd-3103-49df-afdd-e6f6d6f35716-csi-data-dir\") pod \"csi-hostpathplugin-l5fqj\" (UID: \"d01c15cd-3103-49df-afdd-e6f6d6f35716\") " pod="hostpath-provisioner/csi-hostpathplugin-l5fqj" Feb 26 15:46:21 crc kubenswrapper[4907]: I0226 15:46:20.995130 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hg4q6\" (UniqueName: 
\"kubernetes.io/projected/ae456f0d-bf77-4e93-9bf2-c47c27b8eadf-kube-api-access-hg4q6\") pod \"machine-config-server-k8lkj\" (UID: \"ae456f0d-bf77-4e93-9bf2-c47c27b8eadf\") " pod="openshift-machine-config-operator/machine-config-server-k8lkj" Feb 26 15:46:21 crc kubenswrapper[4907]: I0226 15:46:20.995148 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"plugins-dir\" (UniqueName: \"kubernetes.io/host-path/d01c15cd-3103-49df-afdd-e6f6d6f35716-plugins-dir\") pod \"csi-hostpathplugin-l5fqj\" (UID: \"d01c15cd-3103-49df-afdd-e6f6d6f35716\") " pod="hostpath-provisioner/csi-hostpathplugin-l5fqj" Feb 26 15:46:21 crc kubenswrapper[4907]: I0226 15:46:20.995162 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/def12a12-3cf0-4694-a957-3e69aa18f880-service-ca-bundle\") pod \"router-default-5444994796-hqs2t\" (UID: \"def12a12-3cf0-4694-a957-3e69aa18f880\") " pod="openshift-ingress/router-default-5444994796-hqs2t" Feb 26 15:46:21 crc kubenswrapper[4907]: I0226 15:46:20.995178 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8b678693-5390-4ce1-bf51-a2da37343241-config\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-hztww\" (UID: \"8b678693-5390-4ce1-bf51-a2da37343241\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-hztww" Feb 26 15:46:21 crc kubenswrapper[4907]: I0226 15:46:20.995194 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/def12a12-3cf0-4694-a957-3e69aa18f880-default-certificate\") pod \"router-default-5444994796-hqs2t\" (UID: \"def12a12-3cf0-4694-a957-3e69aa18f880\") " pod="openshift-ingress/router-default-5444994796-hqs2t" Feb 26 15:46:21 crc kubenswrapper[4907]: I0226 15:46:20.995209 4907 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/6aa04e41-18ce-4928-b012-ae804b9cfafc-serving-cert\") pod \"kube-controller-manager-operator-78b949d7b-ks676\" (UID: \"6aa04e41-18ce-4928-b012-ae804b9cfafc\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-ks676" Feb 26 15:46:21 crc kubenswrapper[4907]: I0226 15:46:20.995229 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/87018111-567b-4b30-a141-f20b606728e9-serving-cert\") pod \"kube-apiserver-operator-766d6c64bb-8fhkw\" (UID: \"87018111-567b-4b30-a141-f20b606728e9\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-8fhkw" Feb 26 15:46:21 crc kubenswrapper[4907]: I0226 15:46:20.995250 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/d01c15cd-3103-49df-afdd-e6f6d6f35716-socket-dir\") pod \"csi-hostpathplugin-l5fqj\" (UID: \"d01c15cd-3103-49df-afdd-e6f6d6f35716\") " pod="hostpath-provisioner/csi-hostpathplugin-l5fqj" Feb 26 15:46:21 crc kubenswrapper[4907]: I0226 15:46:20.995267 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/101ef487-124a-40ce-bf7d-8b7efcab6765-config\") pod \"kube-storage-version-migrator-operator-b67b599dd-k2mmn\" (UID: \"101ef487-124a-40ce-bf7d-8b7efcab6765\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-k2mmn" Feb 26 15:46:21 crc kubenswrapper[4907]: I0226 15:46:20.995282 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vb4qn\" (UniqueName: \"kubernetes.io/projected/73dbdcc4-3b7d-4593-a8e1-b15f824b5670-kube-api-access-vb4qn\") pod \"packageserver-d55dfcdfc-z6m64\" (UID: \"73dbdcc4-3b7d-4593-a8e1-b15f824b5670\") " 
pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-z6m64" Feb 26 15:46:21 crc kubenswrapper[4907]: I0226 15:46:20.995317 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tmpfs\" (UniqueName: \"kubernetes.io/empty-dir/73dbdcc4-3b7d-4593-a8e1-b15f824b5670-tmpfs\") pod \"packageserver-d55dfcdfc-z6m64\" (UID: \"73dbdcc4-3b7d-4593-a8e1-b15f824b5670\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-z6m64" Feb 26 15:46:21 crc kubenswrapper[4907]: I0226 15:46:20.995332 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/8b678693-5390-4ce1-bf51-a2da37343241-kube-api-access\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-hztww\" (UID: \"8b678693-5390-4ce1-bf51-a2da37343241\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-hztww" Feb 26 15:46:21 crc kubenswrapper[4907]: E0226 15:46:20.995463 4907 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-26 15:46:21.495450096 +0000 UTC m=+244.014011945 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 26 15:46:21 crc kubenswrapper[4907]: I0226 15:46:20.996028 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/d01c15cd-3103-49df-afdd-e6f6d6f35716-registration-dir\") pod \"csi-hostpathplugin-l5fqj\" (UID: \"d01c15cd-3103-49df-afdd-e6f6d6f35716\") " pod="hostpath-provisioner/csi-hostpathplugin-l5fqj" Feb 26 15:46:21 crc kubenswrapper[4907]: I0226 15:46:20.996527 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/87018111-567b-4b30-a141-f20b606728e9-config\") pod \"kube-apiserver-operator-766d6c64bb-8fhkw\" (UID: \"87018111-567b-4b30-a141-f20b606728e9\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-8fhkw" Feb 26 15:46:21 crc kubenswrapper[4907]: I0226 15:46:20.997107 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/0dd74211-40c2-437c-9295-b69e709f81fe-config-volume\") pod \"collect-profiles-29535345-b7r88\" (UID: \"0dd74211-40c2-437c-9295-b69e709f81fe\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29535345-b7r88" Feb 26 15:46:21 crc kubenswrapper[4907]: I0226 15:46:20.998472 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6aa04e41-18ce-4928-b012-ae804b9cfafc-config\") pod \"kube-controller-manager-operator-78b949d7b-ks676\" (UID: \"6aa04e41-18ce-4928-b012-ae804b9cfafc\") 
" pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-ks676" Feb 26 15:46:21 crc kubenswrapper[4907]: I0226 15:46:20.999457 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/a766dd26-3d8c-464c-b873-f03d3895b9d1-srv-cert\") pod \"olm-operator-6b444d44fb-sw4qw\" (UID: \"a766dd26-3d8c-464c-b873-f03d3895b9d1\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-sw4qw" Feb 26 15:46:21 crc kubenswrapper[4907]: I0226 15:46:21.004044 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/80e13006-b114-4c3f-8669-62afc695914b-config\") pod \"service-ca-operator-777779d784-gvx2f\" (UID: \"80e13006-b114-4c3f-8669-62afc695914b\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-gvx2f" Feb 26 15:46:21 crc kubenswrapper[4907]: I0226 15:46:21.004629 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/a766dd26-3d8c-464c-b873-f03d3895b9d1-profile-collector-cert\") pod \"olm-operator-6b444d44fb-sw4qw\" (UID: \"a766dd26-3d8c-464c-b873-f03d3895b9d1\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-sw4qw" Feb 26 15:46:21 crc kubenswrapper[4907]: I0226 15:46:21.004754 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/14efe72b-80f8-4748-bcc9-e4f20a7eb28e-signing-cabundle\") pod \"service-ca-9c57cc56f-z5rgk\" (UID: \"14efe72b-80f8-4748-bcc9-e4f20a7eb28e\") " pod="openshift-service-ca/service-ca-9c57cc56f-z5rgk" Feb 26 15:46:21 crc kubenswrapper[4907]: I0226 15:46:21.005564 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"plugins-dir\" (UniqueName: \"kubernetes.io/host-path/d01c15cd-3103-49df-afdd-e6f6d6f35716-plugins-dir\") pod \"csi-hostpathplugin-l5fqj\" (UID: 
\"d01c15cd-3103-49df-afdd-e6f6d6f35716\") " pod="hostpath-provisioner/csi-hostpathplugin-l5fqj" Feb 26 15:46:21 crc kubenswrapper[4907]: I0226 15:46:21.010350 4907 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns-operator/dns-operator-744455d44c-tvpcl" Feb 26 15:46:21 crc kubenswrapper[4907]: I0226 15:46:21.010896 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/0dd74211-40c2-437c-9295-b69e709f81fe-secret-volume\") pod \"collect-profiles-29535345-b7r88\" (UID: \"0dd74211-40c2-437c-9295-b69e709f81fe\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29535345-b7r88" Feb 26 15:46:21 crc kubenswrapper[4907]: I0226 15:46:21.011482 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/101ef487-124a-40ce-bf7d-8b7efcab6765-config\") pod \"kube-storage-version-migrator-operator-b67b599dd-k2mmn\" (UID: \"101ef487-124a-40ce-bf7d-8b7efcab6765\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-k2mmn" Feb 26 15:46:21 crc kubenswrapper[4907]: I0226 15:46:21.011549 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/d01c15cd-3103-49df-afdd-e6f6d6f35716-socket-dir\") pod \"csi-hostpathplugin-l5fqj\" (UID: \"d01c15cd-3103-49df-afdd-e6f6d6f35716\") " pod="hostpath-provisioner/csi-hostpathplugin-l5fqj" Feb 26 15:46:21 crc kubenswrapper[4907]: I0226 15:46:21.012142 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8b678693-5390-4ce1-bf51-a2da37343241-config\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-hztww\" (UID: \"8b678693-5390-4ce1-bf51-a2da37343241\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-hztww" Feb 26 15:46:21 crc 
kubenswrapper[4907]: I0226 15:46:21.012727 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/def12a12-3cf0-4694-a957-3e69aa18f880-service-ca-bundle\") pod \"router-default-5444994796-hqs2t\" (UID: \"def12a12-3cf0-4694-a957-3e69aa18f880\") " pod="openshift-ingress/router-default-5444994796-hqs2t" Feb 26 15:46:21 crc kubenswrapper[4907]: I0226 15:46:21.015497 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/87018111-567b-4b30-a141-f20b606728e9-serving-cert\") pod \"kube-apiserver-operator-766d6c64bb-8fhkw\" (UID: \"87018111-567b-4b30-a141-f20b606728e9\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-8fhkw" Feb 26 15:46:21 crc kubenswrapper[4907]: I0226 15:46:21.019490 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/6aa04e41-18ce-4928-b012-ae804b9cfafc-serving-cert\") pod \"kube-controller-manager-operator-78b949d7b-ks676\" (UID: \"6aa04e41-18ce-4928-b012-ae804b9cfafc\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-ks676" Feb 26 15:46:21 crc kubenswrapper[4907]: I0226 15:46:21.020539 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/80e13006-b114-4c3f-8669-62afc695914b-serving-cert\") pod \"service-ca-operator-777779d784-gvx2f\" (UID: \"80e13006-b114-4c3f-8669-62afc695914b\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-gvx2f" Feb 26 15:46:21 crc kubenswrapper[4907]: I0226 15:46:21.021436 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"mcc-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/b71208e6-41a8-44a3-a8fd-7380b1da6ffa-mcc-auth-proxy-config\") pod \"machine-config-controller-84d6567774-rsw5p\" (UID: 
\"b71208e6-41a8-44a3-a8fd-7380b1da6ffa\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-rsw5p" Feb 26 15:46:21 crc kubenswrapper[4907]: I0226 15:46:21.021531 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"csi-data-dir\" (UniqueName: \"kubernetes.io/host-path/d01c15cd-3103-49df-afdd-e6f6d6f35716-csi-data-dir\") pod \"csi-hostpathplugin-l5fqj\" (UID: \"d01c15cd-3103-49df-afdd-e6f6d6f35716\") " pod="hostpath-provisioner/csi-hostpathplugin-l5fqj" Feb 26 15:46:21 crc kubenswrapper[4907]: I0226 15:46:21.021828 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/f1e8a3f9-9de9-4181-869f-9fce597e6b5b-bound-sa-token\") pod \"ingress-operator-5b745b69d9-gnh8z\" (UID: \"f1e8a3f9-9de9-4181-869f-9fce597e6b5b\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-gnh8z" Feb 26 15:46:21 crc kubenswrapper[4907]: I0226 15:46:21.022236 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"tmpfs\" (UniqueName: \"kubernetes.io/empty-dir/73dbdcc4-3b7d-4593-a8e1-b15f824b5670-tmpfs\") pod \"packageserver-d55dfcdfc-z6m64\" (UID: \"73dbdcc4-3b7d-4593-a8e1-b15f824b5670\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-z6m64" Feb 26 15:46:21 crc kubenswrapper[4907]: I0226 15:46:21.022387 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/def12a12-3cf0-4694-a957-3e69aa18f880-metrics-certs\") pod \"router-default-5444994796-hqs2t\" (UID: \"def12a12-3cf0-4694-a957-3e69aa18f880\") " pod="openshift-ingress/router-default-5444994796-hqs2t" Feb 26 15:46:21 crc kubenswrapper[4907]: I0226 15:46:21.027183 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"package-server-manager-serving-cert\" (UniqueName: \"kubernetes.io/secret/70ad9c23-ce1d-4b1a-979d-08d20761353e-package-server-manager-serving-cert\") pod 
\"package-server-manager-789f6589d5-rlmpn\" (UID: \"70ad9c23-ce1d-4b1a-979d-08d20761353e\") " pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-rlmpn" Feb 26 15:46:21 crc kubenswrapper[4907]: I0226 15:46:21.027306 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/101ef487-124a-40ce-bf7d-8b7efcab6765-serving-cert\") pod \"kube-storage-version-migrator-operator-b67b599dd-k2mmn\" (UID: \"101ef487-124a-40ce-bf7d-8b7efcab6765\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-k2mmn" Feb 26 15:46:21 crc kubenswrapper[4907]: I0226 15:46:21.028270 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/2a0a1c34-d485-449a-86c9-8c4631a023b5-webhook-certs\") pod \"multus-admission-controller-857f4d67dd-fd8f2\" (UID: \"2a0a1c34-d485-449a-86c9-8c4631a023b5\") " pod="openshift-multus/multus-admission-controller-857f4d67dd-fd8f2" Feb 26 15:46:21 crc kubenswrapper[4907]: I0226 15:46:21.032837 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/73dbdcc4-3b7d-4593-a8e1-b15f824b5670-apiservice-cert\") pod \"packageserver-d55dfcdfc-z6m64\" (UID: \"73dbdcc4-3b7d-4593-a8e1-b15f824b5670\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-z6m64" Feb 26 15:46:21 crc kubenswrapper[4907]: I0226 15:46:21.038972 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/14efe72b-80f8-4748-bcc9-e4f20a7eb28e-signing-key\") pod \"service-ca-9c57cc56f-z5rgk\" (UID: \"14efe72b-80f8-4748-bcc9-e4f20a7eb28e\") " pod="openshift-service-ca/service-ca-9c57cc56f-z5rgk" Feb 26 15:46:21 crc kubenswrapper[4907]: I0226 15:46:21.041488 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"certs\" 
(UniqueName: \"kubernetes.io/secret/ae456f0d-bf77-4e93-9bf2-c47c27b8eadf-certs\") pod \"machine-config-server-k8lkj\" (UID: \"ae456f0d-bf77-4e93-9bf2-c47c27b8eadf\") " pod="openshift-machine-config-operator/machine-config-server-k8lkj" Feb 26 15:46:21 crc kubenswrapper[4907]: I0226 15:46:21.041488 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/6e6248c0-ae25-48db-9112-ffeb9f9ca6a2-config-volume\") pod \"dns-default-6g628\" (UID: \"6e6248c0-ae25-48db-9112-ffeb9f9ca6a2\") " pod="openshift-dns/dns-default-6g628" Feb 26 15:46:21 crc kubenswrapper[4907]: I0226 15:46:21.042049 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/73dbdcc4-3b7d-4593-a8e1-b15f824b5670-webhook-cert\") pod \"packageserver-d55dfcdfc-z6m64\" (UID: \"73dbdcc4-3b7d-4593-a8e1-b15f824b5670\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-z6m64" Feb 26 15:46:21 crc kubenswrapper[4907]: I0226 15:46:21.047442 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/f6e20f97-f90b-41d7-905e-f627e07b0dfb-cert\") pod \"ingress-canary-hs7mv\" (UID: \"f6e20f97-f90b-41d7-905e-f627e07b0dfb\") " pod="openshift-ingress-canary/ingress-canary-hs7mv" Feb 26 15:46:21 crc kubenswrapper[4907]: I0226 15:46:21.048966 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/def12a12-3cf0-4694-a957-3e69aa18f880-default-certificate\") pod \"router-default-5444994796-hqs2t\" (UID: \"def12a12-3cf0-4694-a957-3e69aa18f880\") " pod="openshift-ingress/router-default-5444994796-hqs2t" Feb 26 15:46:21 crc kubenswrapper[4907]: I0226 15:46:21.091358 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"profile-collector-cert\" (UniqueName: 
\"kubernetes.io/secret/2ec425f0-76a0-445f-8d38-a4f125da3312-profile-collector-cert\") pod \"catalog-operator-68c6474976-z86sf\" (UID: \"2ec425f0-76a0-445f-8d38-a4f125da3312\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-z86sf" Feb 26 15:46:21 crc kubenswrapper[4907]: I0226 15:46:21.091823 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/b71208e6-41a8-44a3-a8fd-7380b1da6ffa-proxy-tls\") pod \"machine-config-controller-84d6567774-rsw5p\" (UID: \"b71208e6-41a8-44a3-a8fd-7380b1da6ffa\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-rsw5p" Feb 26 15:46:21 crc kubenswrapper[4907]: I0226 15:46:21.092157 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/8b678693-5390-4ce1-bf51-a2da37343241-serving-cert\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-hztww\" (UID: \"8b678693-5390-4ce1-bf51-a2da37343241\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-hztww" Feb 26 15:46:21 crc kubenswrapper[4907]: I0226 15:46:21.092568 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/def12a12-3cf0-4694-a957-3e69aa18f880-stats-auth\") pod \"router-default-5444994796-hqs2t\" (UID: \"def12a12-3cf0-4694-a957-3e69aa18f880\") " pod="openshift-ingress/router-default-5444994796-hqs2t" Feb 26 15:46:21 crc kubenswrapper[4907]: I0226 15:46:21.093108 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"node-bootstrap-token\" (UniqueName: \"kubernetes.io/secret/ae456f0d-bf77-4e93-9bf2-c47c27b8eadf-node-bootstrap-token\") pod \"machine-config-server-k8lkj\" (UID: \"ae456f0d-bf77-4e93-9bf2-c47c27b8eadf\") " pod="openshift-machine-config-operator/machine-config-server-k8lkj" Feb 26 15:46:21 crc kubenswrapper[4907]: I0226 15:46:21.095195 4907 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-g6kj7\" (UniqueName: \"kubernetes.io/projected/23df369e-238f-4fbc-99fa-b22c21011db0-kube-api-access-g6kj7\") pod \"marketplace-operator-79b997595-dvcn5\" (UID: \"23df369e-238f-4fbc-99fa-b22c21011db0\") " pod="openshift-marketplace/marketplace-operator-79b997595-dvcn5" Feb 26 15:46:21 crc kubenswrapper[4907]: I0226 15:46:21.098556 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/6e6248c0-ae25-48db-9112-ffeb9f9ca6a2-metrics-tls\") pod \"dns-default-6g628\" (UID: \"6e6248c0-ae25-48db-9112-ffeb9f9ca6a2\") " pod="openshift-dns/dns-default-6g628" Feb 26 15:46:21 crc kubenswrapper[4907]: I0226 15:46:21.099016 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/0fefaf3e-d327-41f8-bbbe-94b051a63b19-bound-sa-token\") pod \"image-registry-697d97f7c8-kqtml\" (UID: \"0fefaf3e-d327-41f8-bbbe-94b051a63b19\") " pod="openshift-image-registry/image-registry-697d97f7c8-kqtml" Feb 26 15:46:21 crc kubenswrapper[4907]: I0226 15:46:21.100530 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-kqtml\" (UID: \"0fefaf3e-d327-41f8-bbbe-94b051a63b19\") " pod="openshift-image-registry/image-registry-697d97f7c8-kqtml" Feb 26 15:46:21 crc kubenswrapper[4907]: E0226 15:46:21.100947 4907 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-26 15:46:21.60093715 +0000 UTC m=+244.119498999 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-kqtml" (UID: "0fefaf3e-d327-41f8-bbbe-94b051a63b19") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 26 15:46:21 crc kubenswrapper[4907]: I0226 15:46:21.114557 4907 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress-operator/ingress-operator-5b745b69d9-gnh8z" Feb 26 15:46:21 crc kubenswrapper[4907]: I0226 15:46:21.116624 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/2ec425f0-76a0-445f-8d38-a4f125da3312-srv-cert\") pod \"catalog-operator-68c6474976-z86sf\" (UID: \"2ec425f0-76a0-445f-8d38-a4f125da3312\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-z86sf" Feb 26 15:46:21 crc kubenswrapper[4907]: I0226 15:46:21.117602 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-77796\" (UniqueName: \"kubernetes.io/projected/af8aa9df-432b-40bd-847c-c3539b32cb59-kube-api-access-77796\") pod \"control-plane-machine-set-operator-78cbb6b69f-w9nx4\" (UID: \"af8aa9df-432b-40bd-847c-c3539b32cb59\") " pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-w9nx4" Feb 26 15:46:21 crc kubenswrapper[4907]: I0226 15:46:21.124100 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2cth8\" (UniqueName: \"kubernetes.io/projected/e9aeee88-40a0-4c8a-aebf-680cf878f42e-kube-api-access-2cth8\") pod \"console-operator-58897d9998-sjflz\" (UID: \"e9aeee88-40a0-4c8a-aebf-680cf878f42e\") " pod="openshift-console-operator/console-operator-58897d9998-sjflz" Feb 26 15:46:21 crc kubenswrapper[4907]: I0226 
15:46:21.127457 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-h4l8g\" (UniqueName: \"kubernetes.io/projected/0fefaf3e-d327-41f8-bbbe-94b051a63b19-kube-api-access-h4l8g\") pod \"image-registry-697d97f7c8-kqtml\" (UID: \"0fefaf3e-d327-41f8-bbbe-94b051a63b19\") " pod="openshift-image-registry/image-registry-697d97f7c8-kqtml" Feb 26 15:46:21 crc kubenswrapper[4907]: I0226 15:46:21.128350 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-sbm8v\" (UniqueName: \"kubernetes.io/projected/ed605a31-991f-4fcc-a861-3bfe94c7b92c-kube-api-access-sbm8v\") pod \"etcd-operator-b45778765-8wmgt\" (UID: \"ed605a31-991f-4fcc-a861-3bfe94c7b92c\") " pod="openshift-etcd-operator/etcd-operator-b45778765-8wmgt" Feb 26 15:46:21 crc kubenswrapper[4907]: I0226 15:46:21.135329 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6s48n\" (UniqueName: \"kubernetes.io/projected/317291a5-1f7f-4d5a-8779-7c769dae2bc5-kube-api-access-6s48n\") pod \"machine-config-operator-74547568cd-4z9rn\" (UID: \"317291a5-1f7f-4d5a-8779-7c769dae2bc5\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-4z9rn" Feb 26 15:46:21 crc kubenswrapper[4907]: I0226 15:46:21.173512 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-v9n5h\" (UniqueName: \"kubernetes.io/projected/70ad9c23-ce1d-4b1a-979d-08d20761353e-kube-api-access-v9n5h\") pod \"package-server-manager-789f6589d5-rlmpn\" (UID: \"70ad9c23-ce1d-4b1a-979d-08d20761353e\") " pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-rlmpn" Feb 26 15:46:21 crc kubenswrapper[4907]: I0226 15:46:21.195124 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/8b678693-5390-4ce1-bf51-a2da37343241-kube-api-access\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-hztww\" (UID: 
\"8b678693-5390-4ce1-bf51-a2da37343241\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-hztww" Feb 26 15:46:21 crc kubenswrapper[4907]: I0226 15:46:21.210578 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 26 15:46:21 crc kubenswrapper[4907]: E0226 15:46:21.211380 4907 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-26 15:46:21.711359312 +0000 UTC m=+244.229921161 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 26 15:46:21 crc kubenswrapper[4907]: I0226 15:46:21.223203 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6lgl6\" (UniqueName: \"kubernetes.io/projected/6e6248c0-ae25-48db-9112-ffeb9f9ca6a2-kube-api-access-6lgl6\") pod \"dns-default-6g628\" (UID: \"6e6248c0-ae25-48db-9112-ffeb9f9ca6a2\") " pod="openshift-dns/dns-default-6g628" Feb 26 15:46:21 crc kubenswrapper[4907]: I0226 15:46:21.230918 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7kvq8\" (UniqueName: 
\"kubernetes.io/projected/f6e20f97-f90b-41d7-905e-f627e07b0dfb-kube-api-access-7kvq8\") pod \"ingress-canary-hs7mv\" (UID: \"f6e20f97-f90b-41d7-905e-f627e07b0dfb\") " pod="openshift-ingress-canary/ingress-canary-hs7mv" Feb 26 15:46:21 crc kubenswrapper[4907]: I0226 15:46:21.231407 4907 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/dns-default-6g628" Feb 26 15:46:21 crc kubenswrapper[4907]: I0226 15:46:21.241119 4907 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress-canary/ingress-canary-hs7mv" Feb 26 15:46:21 crc kubenswrapper[4907]: I0226 15:46:21.247152 4907 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/downloads-7954f5f757-wcgj6"] Feb 26 15:46:21 crc kubenswrapper[4907]: I0226 15:46:21.256265 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6zdhd\" (UniqueName: \"kubernetes.io/projected/b71208e6-41a8-44a3-a8fd-7380b1da6ffa-kube-api-access-6zdhd\") pod \"machine-config-controller-84d6567774-rsw5p\" (UID: \"b71208e6-41a8-44a3-a8fd-7380b1da6ffa\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-rsw5p" Feb 26 15:46:21 crc kubenswrapper[4907]: I0226 15:46:21.278198 4907 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-console-operator/console-operator-58897d9998-sjflz" Feb 26 15:46:21 crc kubenswrapper[4907]: I0226 15:46:21.286956 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fb7xh\" (UniqueName: \"kubernetes.io/projected/05322669-16de-41ca-9ae9-3580b5cdda05-kube-api-access-fb7xh\") pod \"migrator-59844c95c7-gtnvm\" (UID: \"05322669-16de-41ca-9ae9-3580b5cdda05\") " pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-gtnvm" Feb 26 15:46:21 crc kubenswrapper[4907]: I0226 15:46:21.287062 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-667f5\" (UniqueName: \"kubernetes.io/projected/0dd74211-40c2-437c-9295-b69e709f81fe-kube-api-access-667f5\") pod \"collect-profiles-29535345-b7r88\" (UID: \"0dd74211-40c2-437c-9295-b69e709f81fe\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29535345-b7r88" Feb 26 15:46:21 crc kubenswrapper[4907]: I0226 15:46:21.300941 4907 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-f9d7485db-9lx5z"] Feb 26 15:46:21 crc kubenswrapper[4907]: I0226 15:46:21.300983 4907 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-lknds"] Feb 26 15:46:21 crc kubenswrapper[4907]: I0226 15:46:21.313032 4907 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-etcd-operator/etcd-operator-b45778765-8wmgt" Feb 26 15:46:21 crc kubenswrapper[4907]: I0226 15:46:21.314087 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-kqtml\" (UID: \"0fefaf3e-d327-41f8-bbbe-94b051a63b19\") " pod="openshift-image-registry/image-registry-697d97f7c8-kqtml" Feb 26 15:46:21 crc kubenswrapper[4907]: E0226 15:46:21.314525 4907 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-26 15:46:21.814513452 +0000 UTC m=+244.333075301 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-kqtml" (UID: "0fefaf3e-d327-41f8-bbbe-94b051a63b19") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 26 15:46:21 crc kubenswrapper[4907]: W0226 15:46:21.323861 4907 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod2e969445_2d6b_4ea1_bd4b_3473a66e8c91.slice/crio-f5e979e0d732d246d701971319bc5ce590fb678c30052cb5ed95d9f9dc84c5b3 WatchSource:0}: Error finding container f5e979e0d732d246d701971319bc5ce590fb678c30052cb5ed95d9f9dc84c5b3: Status 404 returned error can't find the container with id f5e979e0d732d246d701971319bc5ce590fb678c30052cb5ed95d9f9dc84c5b3 Feb 26 15:46:21 crc kubenswrapper[4907]: I0226 15:46:21.336849 4907 util.go:30] "No sandbox for pod can be 
found. Need to start a new one" pod="openshift-marketplace/marketplace-operator-79b997595-dvcn5" Feb 26 15:46:21 crc kubenswrapper[4907]: I0226 15:46:21.337638 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qc9th\" (UniqueName: \"kubernetes.io/projected/2ec425f0-76a0-445f-8d38-a4f125da3312-kube-api-access-qc9th\") pod \"catalog-operator-68c6474976-z86sf\" (UID: \"2ec425f0-76a0-445f-8d38-a4f125da3312\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-z86sf" Feb 26 15:46:21 crc kubenswrapper[4907]: I0226 15:46:21.341764 4907 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/machine-config-operator-74547568cd-4z9rn" Feb 26 15:46:21 crc kubenswrapper[4907]: I0226 15:46:21.346677 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vb4qn\" (UniqueName: \"kubernetes.io/projected/73dbdcc4-3b7d-4593-a8e1-b15f824b5670-kube-api-access-vb4qn\") pod \"packageserver-d55dfcdfc-z6m64\" (UID: \"73dbdcc4-3b7d-4593-a8e1-b15f824b5670\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-z6m64" Feb 26 15:46:21 crc kubenswrapper[4907]: I0226 15:46:21.348735 4907 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-w9nx4" Feb 26 15:46:21 crc kubenswrapper[4907]: I0226 15:46:21.359849 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/87018111-567b-4b30-a141-f20b606728e9-kube-api-access\") pod \"kube-apiserver-operator-766d6c64bb-8fhkw\" (UID: \"87018111-567b-4b30-a141-f20b606728e9\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-8fhkw" Feb 26 15:46:21 crc kubenswrapper[4907]: I0226 15:46:21.369167 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-config-operator/openshift-config-operator-7777fb866f-2f4gk" event={"ID":"0383e657-c434-43b2-878b-314ce5a2339e","Type":"ContainerStarted","Data":"4326c0a2b28e2e5bcedc7ee341d86e29388eed353814648a831881e99c8d534c"} Feb 26 15:46:21 crc kubenswrapper[4907]: I0226 15:46:21.369443 4907 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-config-operator/openshift-config-operator-7777fb866f-2f4gk" Feb 26 15:46:21 crc kubenswrapper[4907]: I0226 15:46:21.369511 4907 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-gtnvm" Feb 26 15:46:21 crc kubenswrapper[4907]: I0226 15:46:21.369640 4907 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-hztww" Feb 26 15:46:21 crc kubenswrapper[4907]: I0226 15:46:21.381608 4907 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29535345-b7r88" Feb 26 15:46:21 crc kubenswrapper[4907]: I0226 15:46:21.387609 4907 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-machine-config-operator/machine-config-controller-84d6567774-rsw5p" Feb 26 15:46:21 crc kubenswrapper[4907]: I0226 15:46:21.395257 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-machine-approver/machine-approver-56656f9798-m7jwg" event={"ID":"05bd4fd2-624b-4b9c-b6a7-74cfce90e1d7","Type":"ContainerStarted","Data":"59d5f96d12255c6985c0b12e659a121ada266151b503e5d2e622c15e327e18dd"} Feb 26 15:46:21 crc kubenswrapper[4907]: I0226 15:46:21.395678 4907 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-8fhkw" Feb 26 15:46:21 crc kubenswrapper[4907]: I0226 15:46:21.401298 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jjbpf\" (UniqueName: \"kubernetes.io/projected/a766dd26-3d8c-464c-b873-f03d3895b9d1-kube-api-access-jjbpf\") pod \"olm-operator-6b444d44fb-sw4qw\" (UID: \"a766dd26-3d8c-464c-b873-f03d3895b9d1\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-sw4qw" Feb 26 15:46:21 crc kubenswrapper[4907]: I0226 15:46:21.407805 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bhbhg\" (UniqueName: \"kubernetes.io/projected/1b0532e1-9350-435d-bb1f-72bb0931a2e8-kube-api-access-bhbhg\") pod \"auto-csr-approver-29535344-fsndq\" (UID: \"1b0532e1-9350-435d-bb1f-72bb0931a2e8\") " pod="openshift-infra/auto-csr-approver-29535344-fsndq" Feb 26 15:46:21 crc kubenswrapper[4907]: I0226 15:46:21.410773 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pv5x9\" (UniqueName: \"kubernetes.io/projected/101ef487-124a-40ce-bf7d-8b7efcab6765-kube-api-access-pv5x9\") pod \"kube-storage-version-migrator-operator-b67b599dd-k2mmn\" (UID: \"101ef487-124a-40ce-bf7d-8b7efcab6765\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-k2mmn" Feb 26 
15:46:21 crc kubenswrapper[4907]: I0226 15:46:21.413250 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication-operator/authentication-operator-69f744f599-wtjfv" event={"ID":"36952148-e6b5-4c20-8016-3de7f571420e","Type":"ContainerStarted","Data":"8288237d50f59099df31d3da0026bd1d0e2a4790c6c93d7c62f5f48f88321083"} Feb 26 15:46:21 crc kubenswrapper[4907]: I0226 15:46:21.413297 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication-operator/authentication-operator-69f744f599-wtjfv" event={"ID":"36952148-e6b5-4c20-8016-3de7f571420e","Type":"ContainerStarted","Data":"64f84df75539ab87e53d1caa3711d3ceaf0edc468e92d6cc0a57e9d5861e44a6"} Feb 26 15:46:21 crc kubenswrapper[4907]: I0226 15:46:21.414785 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 26 15:46:21 crc kubenswrapper[4907]: E0226 15:46:21.416103 4907 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-26 15:46:21.916083893 +0000 UTC m=+244.434645742 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 26 15:46:21 crc kubenswrapper[4907]: I0226 15:46:21.418120 4907 generic.go:334] "Generic (PLEG): container finished" podID="1c26ef74-f7b8-4cc3-ae04-783bfa2b38b4" containerID="e670d88418a8be5f1780dd528fa20f15667e220c477eec3e2b68203156b87561" exitCode=0 Feb 26 15:46:21 crc kubenswrapper[4907]: I0226 15:46:21.418817 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-vnrdg" event={"ID":"1c26ef74-f7b8-4cc3-ae04-783bfa2b38b4","Type":"ContainerDied","Data":"e670d88418a8be5f1780dd528fa20f15667e220c477eec3e2b68203156b87561"} Feb 26 15:46:21 crc kubenswrapper[4907]: I0226 15:46:21.418942 4907 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-z6m64" Feb 26 15:46:21 crc kubenswrapper[4907]: I0226 15:46:21.435618 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/6aa04e41-18ce-4928-b012-ae804b9cfafc-kube-api-access\") pod \"kube-controller-manager-operator-78b949d7b-ks676\" (UID: \"6aa04e41-18ce-4928-b012-ae804b9cfafc\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-ks676" Feb 26 15:46:21 crc kubenswrapper[4907]: I0226 15:46:21.437801 4907 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-rlmpn" Feb 26 15:46:21 crc kubenswrapper[4907]: I0226 15:46:21.462327 4907 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-k2mmn" Feb 26 15:46:21 crc kubenswrapper[4907]: I0226 15:46:21.485162 4907 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-sw4qw" Feb 26 15:46:21 crc kubenswrapper[4907]: I0226 15:46:21.485288 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-l9mmk\" (UniqueName: \"kubernetes.io/projected/2a0a1c34-d485-449a-86c9-8c4631a023b5-kube-api-access-l9mmk\") pod \"multus-admission-controller-857f4d67dd-fd8f2\" (UID: \"2a0a1c34-d485-449a-86c9-8c4631a023b5\") " pod="openshift-multus/multus-admission-controller-857f4d67dd-fd8f2" Feb 26 15:46:21 crc kubenswrapper[4907]: I0226 15:46:21.488084 4907 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29535344-fsndq" Feb 26 15:46:21 crc kubenswrapper[4907]: I0226 15:46:21.518888 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9c5rq\" (UniqueName: \"kubernetes.io/projected/14efe72b-80f8-4748-bcc9-e4f20a7eb28e-kube-api-access-9c5rq\") pod \"service-ca-9c57cc56f-z5rgk\" (UID: \"14efe72b-80f8-4748-bcc9-e4f20a7eb28e\") " pod="openshift-service-ca/service-ca-9c57cc56f-z5rgk" Feb 26 15:46:21 crc kubenswrapper[4907]: I0226 15:46:21.519517 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-kqtml\" (UID: \"0fefaf3e-d327-41f8-bbbe-94b051a63b19\") " pod="openshift-image-registry/image-registry-697d97f7c8-kqtml" Feb 26 15:46:21 crc kubenswrapper[4907]: E0226 15:46:21.519847 4907 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-26 15:46:22.019836138 +0000 UTC m=+244.538397987 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-kqtml" (UID: "0fefaf3e-d327-41f8-bbbe-94b051a63b19") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 26 15:46:21 crc kubenswrapper[4907]: I0226 15:46:21.522168 4907 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-z86sf" Feb 26 15:46:21 crc kubenswrapper[4907]: I0226 15:46:21.524421 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver/apiserver-76f77b778f-5tc4m" event={"ID":"f1a111d0-85de-4328-90ac-9b9af3edbc49","Type":"ContainerStarted","Data":"60420c3e8f7c3d7142c1bc257e3e4ff91c38001ccb195c27db15a8d475f2ec8c"} Feb 26 15:46:21 crc kubenswrapper[4907]: I0226 15:46:21.524935 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver/apiserver-76f77b778f-5tc4m" event={"ID":"f1a111d0-85de-4328-90ac-9b9af3edbc49","Type":"ContainerStarted","Data":"d0203611303ef391d7aa69a4fd52efa16c172e6aa0d7f15834053f1849a1d127"} Feb 26 15:46:21 crc kubenswrapper[4907]: I0226 15:46:21.526287 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hg4q6\" (UniqueName: \"kubernetes.io/projected/ae456f0d-bf77-4e93-9bf2-c47c27b8eadf-kube-api-access-hg4q6\") pod \"machine-config-server-k8lkj\" (UID: \"ae456f0d-bf77-4e93-9bf2-c47c27b8eadf\") " pod="openshift-machine-config-operator/machine-config-server-k8lkj" Feb 26 15:46:21 crc kubenswrapper[4907]: I0226 15:46:21.557725 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kqdkw\" (UniqueName: \"kubernetes.io/projected/80e13006-b114-4c3f-8669-62afc695914b-kube-api-access-kqdkw\") pod \"service-ca-operator-777779d784-gvx2f\" (UID: \"80e13006-b114-4c3f-8669-62afc695914b\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-gvx2f" Feb 26 15:46:21 crc kubenswrapper[4907]: I0226 15:46:21.557874 4907 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-machine-config-operator/machine-config-server-k8lkj" Feb 26 15:46:21 crc kubenswrapper[4907]: I0226 15:46:21.561140 4907 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-controller-manager/controller-manager-879f6c89f-p9vbb" Feb 26 15:46:21 crc kubenswrapper[4907]: I0226 15:46:21.569551 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-md2jt\" (UniqueName: \"kubernetes.io/projected/def12a12-3cf0-4694-a957-3e69aa18f880-kube-api-access-md2jt\") pod \"router-default-5444994796-hqs2t\" (UID: \"def12a12-3cf0-4694-a957-3e69aa18f880\") " pod="openshift-ingress/router-default-5444994796-hqs2t" Feb 26 15:46:21 crc kubenswrapper[4907]: I0226 15:46:21.582748 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rnlnc\" (UniqueName: \"kubernetes.io/projected/d01c15cd-3103-49df-afdd-e6f6d6f35716-kube-api-access-rnlnc\") pod \"csi-hostpathplugin-l5fqj\" (UID: \"d01c15cd-3103-49df-afdd-e6f6d6f35716\") " pod="hostpath-provisioner/csi-hostpathplugin-l5fqj" Feb 26 15:46:21 crc kubenswrapper[4907]: I0226 15:46:21.583261 4907 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-96swm" Feb 26 15:46:21 crc kubenswrapper[4907]: I0226 15:46:21.615246 4907 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-authentication/oauth-openshift-558db77b4-lr7kc" Feb 26 15:46:21 crc kubenswrapper[4907]: I0226 15:46:21.620537 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 26 15:46:21 crc kubenswrapper[4907]: E0226 15:46:21.621502 4907 
nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-26 15:46:22.121488521 +0000 UTC m=+244.640050370 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 26 15:46:21 crc kubenswrapper[4907]: I0226 15:46:21.625406 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rnmft\" (UniqueName: \"kubernetes.io/projected/c6986b68-4a8d-4677-bed1-493eb1a231c3-kube-api-access-rnmft\") pod \"auto-csr-approver-29535346-hhrww\" (UID: \"c6986b68-4a8d-4677-bed1-493eb1a231c3\") " pod="openshift-infra/auto-csr-approver-29535346-hhrww" Feb 26 15:46:21 crc kubenswrapper[4907]: I0226 15:46:21.658690 4907 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-ks676" Feb 26 15:46:21 crc kubenswrapper[4907]: I0226 15:46:21.707683 4907 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ingress/router-default-5444994796-hqs2t" Feb 26 15:46:21 crc kubenswrapper[4907]: I0226 15:46:21.722684 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-kqtml\" (UID: \"0fefaf3e-d327-41f8-bbbe-94b051a63b19\") " pod="openshift-image-registry/image-registry-697d97f7c8-kqtml" Feb 26 15:46:21 crc kubenswrapper[4907]: E0226 15:46:21.724108 4907 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-26 15:46:22.224093607 +0000 UTC m=+244.742655516 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-kqtml" (UID: "0fefaf3e-d327-41f8-bbbe-94b051a63b19") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 26 15:46:21 crc kubenswrapper[4907]: I0226 15:46:21.727884 4907 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-admission-controller-857f4d67dd-fd8f2" Feb 26 15:46:21 crc kubenswrapper[4907]: I0226 15:46:21.748227 4907 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-dns-operator/dns-operator-744455d44c-tvpcl"] Feb 26 15:46:21 crc kubenswrapper[4907]: I0226 15:46:21.754738 4907 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29535346-hhrww" Feb 26 15:46:21 crc kubenswrapper[4907]: I0226 15:46:21.767125 4907 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-service-ca/service-ca-9c57cc56f-z5rgk" Feb 26 15:46:21 crc kubenswrapper[4907]: I0226 15:46:21.816268 4907 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-service-ca-operator/service-ca-operator-777779d784-gvx2f" Feb 26 15:46:21 crc kubenswrapper[4907]: I0226 15:46:21.824083 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 26 15:46:21 crc kubenswrapper[4907]: E0226 15:46:21.824812 4907 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-26 15:46:22.324778848 +0000 UTC m=+244.843340697 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 26 15:46:21 crc kubenswrapper[4907]: I0226 15:46:21.824924 4907 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-operator/ingress-operator-5b745b69d9-gnh8z"] Feb 26 15:46:21 crc kubenswrapper[4907]: I0226 15:46:21.869514 4907 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="hostpath-provisioner/csi-hostpathplugin-l5fqj" Feb 26 15:46:21 crc kubenswrapper[4907]: I0226 15:46:21.879772 4907 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-4z8ql"] Feb 26 15:46:21 crc kubenswrapper[4907]: I0226 15:46:21.935211 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-kqtml\" (UID: \"0fefaf3e-d327-41f8-bbbe-94b051a63b19\") " pod="openshift-image-registry/image-registry-697d97f7c8-kqtml" Feb 26 15:46:21 crc kubenswrapper[4907]: E0226 15:46:21.935767 4907 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-26 15:46:22.435752813 +0000 UTC m=+244.954314662 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-kqtml" (UID: "0fefaf3e-d327-41f8-bbbe-94b051a63b19") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 26 15:46:21 crc kubenswrapper[4907]: I0226 15:46:21.939879 4907 ???:1] "http: TLS handshake error from 192.168.126.11:44748: no serving certificate available for the kubelet" Feb 26 15:46:22 crc kubenswrapper[4907]: I0226 15:46:22.036798 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 26 15:46:22 crc kubenswrapper[4907]: E0226 15:46:22.036906 4907 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-26 15:46:22.536881444 +0000 UTC m=+245.055443294 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 26 15:46:22 crc kubenswrapper[4907]: I0226 15:46:22.037865 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-kqtml\" (UID: \"0fefaf3e-d327-41f8-bbbe-94b051a63b19\") " pod="openshift-image-registry/image-registry-697d97f7c8-kqtml" Feb 26 15:46:22 crc kubenswrapper[4907]: E0226 15:46:22.038206 4907 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-26 15:46:22.538194066 +0000 UTC m=+245.056755915 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-kqtml" (UID: "0fefaf3e-d327-41f8-bbbe-94b051a63b19") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 26 15:46:22 crc kubenswrapper[4907]: I0226 15:46:22.053454 4907 ???:1] "http: TLS handshake error from 192.168.126.11:44758: no serving certificate available for the kubelet" Feb 26 15:46:22 crc kubenswrapper[4907]: W0226 15:46:22.057350 4907 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf1e8a3f9_9de9_4181_869f_9fce597e6b5b.slice/crio-8827b2215c421ff5e15c1cd006fa84e99e835b98d562efd3b17bb927fe9f1f04 WatchSource:0}: Error finding container 8827b2215c421ff5e15c1cd006fa84e99e835b98d562efd3b17bb927fe9f1f04: Status 404 returned error can't find the container with id 8827b2215c421ff5e15c1cd006fa84e99e835b98d562efd3b17bb927fe9f1f04 Feb 26 15:46:22 crc kubenswrapper[4907]: I0226 15:46:22.077517 4907 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-canary/ingress-canary-hs7mv"] Feb 26 15:46:22 crc kubenswrapper[4907]: I0226 15:46:22.140262 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 26 15:46:22 crc kubenswrapper[4907]: E0226 15:46:22.140828 4907 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 
podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-26 15:46:22.640802712 +0000 UTC m=+245.159364641 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 26 15:46:22 crc kubenswrapper[4907]: I0226 15:46:22.148525 4907 ???:1] "http: TLS handshake error from 192.168.126.11:44766: no serving certificate available for the kubelet" Feb 26 15:46:22 crc kubenswrapper[4907]: I0226 15:46:22.177432 4907 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console-operator/console-operator-58897d9998-sjflz"] Feb 26 15:46:22 crc kubenswrapper[4907]: I0226 15:46:22.230125 4907 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-dvcn5"] Feb 26 15:46:22 crc kubenswrapper[4907]: I0226 15:46:22.243492 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-kqtml\" (UID: \"0fefaf3e-d327-41f8-bbbe-94b051a63b19\") " pod="openshift-image-registry/image-registry-697d97f7c8-kqtml" Feb 26 15:46:22 crc kubenswrapper[4907]: E0226 15:46:22.243826 4907 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-26 15:46:22.743810328 +0000 UTC m=+245.262372177 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-kqtml" (UID: "0fefaf3e-d327-41f8-bbbe-94b051a63b19") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 26 15:46:22 crc kubenswrapper[4907]: I0226 15:46:22.269999 4907 ???:1] "http: TLS handshake error from 192.168.126.11:44782: no serving certificate available for the kubelet" Feb 26 15:46:22 crc kubenswrapper[4907]: I0226 15:46:22.364194 4907 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-etcd-operator/etcd-operator-b45778765-8wmgt"] Feb 26 15:46:22 crc kubenswrapper[4907]: I0226 15:46:22.368703 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 26 15:46:22 crc kubenswrapper[4907]: E0226 15:46:22.369034 4907 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-26 15:46:22.869020111 +0000 UTC m=+245.387581960 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 26 15:46:22 crc kubenswrapper[4907]: I0226 15:46:22.392927 4907 ???:1] "http: TLS handshake error from 192.168.126.11:44790: no serving certificate available for the kubelet" Feb 26 15:46:22 crc kubenswrapper[4907]: I0226 15:46:22.435100 4907 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-dns/dns-default-6g628"] Feb 26 15:46:22 crc kubenswrapper[4907]: I0226 15:46:22.446368 4907 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-wgl2p" podStartSLOduration=188.446351197 podStartE2EDuration="3m8.446351197s" podCreationTimestamp="2026-02-26 15:43:14 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-26 15:46:22.425469391 +0000 UTC m=+244.944031240" watchObservedRunningTime="2026-02-26 15:46:22.446351197 +0000 UTC m=+244.964913046" Feb 26 15:46:22 crc kubenswrapper[4907]: I0226 15:46:22.475499 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-kqtml\" (UID: \"0fefaf3e-d327-41f8-bbbe-94b051a63b19\") " pod="openshift-image-registry/image-registry-697d97f7c8-kqtml" Feb 26 15:46:22 crc kubenswrapper[4907]: E0226 15:46:22.475811 4907 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-26 15:46:22.975801706 +0000 UTC m=+245.494363555 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-kqtml" (UID: "0fefaf3e-d327-41f8-bbbe-94b051a63b19") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 26 15:46:22 crc kubenswrapper[4907]: I0226 15:46:22.569086 4907 ???:1] "http: TLS handshake error from 192.168.126.11:44804: no serving certificate available for the kubelet" Feb 26 15:46:22 crc kubenswrapper[4907]: I0226 15:46:22.583040 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 26 15:46:22 crc kubenswrapper[4907]: E0226 15:46:22.583450 4907 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-26 15:46:23.083436982 +0000 UTC m=+245.601998831 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 26 15:46:22 crc kubenswrapper[4907]: I0226 15:46:22.636708 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-6g628" event={"ID":"6e6248c0-ae25-48db-9112-ffeb9f9ca6a2","Type":"ContainerStarted","Data":"b35256c1ad9c8ff61e6a3416052b0ed9c2341645e405c7e640270fc5d4e094f8"} Feb 26 15:46:22 crc kubenswrapper[4907]: I0226 15:46:22.687110 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-kqtml\" (UID: \"0fefaf3e-d327-41f8-bbbe-94b051a63b19\") " pod="openshift-image-registry/image-registry-697d97f7c8-kqtml" Feb 26 15:46:22 crc kubenswrapper[4907]: E0226 15:46:22.687690 4907 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-26 15:46:23.187679598 +0000 UTC m=+245.706241447 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-kqtml" (UID: "0fefaf3e-d327-41f8-bbbe-94b051a63b19") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 26 15:46:22 crc kubenswrapper[4907]: I0226 15:46:22.773147 4907 ???:1] "http: TLS handshake error from 192.168.126.11:44818: no serving certificate available for the kubelet" Feb 26 15:46:22 crc kubenswrapper[4907]: I0226 15:46:22.788850 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-4z8ql" event={"ID":"1c8904fd-8dd8-418a-b32a-eb1ccf934fec","Type":"ContainerStarted","Data":"f8f8d926c5ea85c01486a46d6d4dfd3f4b9eb73e2acec602b1606b32feab134a"} Feb 26 15:46:22 crc kubenswrapper[4907]: I0226 15:46:22.789464 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 26 15:46:22 crc kubenswrapper[4907]: E0226 15:46:22.789823 4907 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-26 15:46:23.289806263 +0000 UTC m=+245.808368112 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 26 15:46:22 crc kubenswrapper[4907]: I0226 15:46:22.808328 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/downloads-7954f5f757-wcgj6" event={"ID":"2e969445-2d6b-4ea1-bd4b-3473a66e8c91","Type":"ContainerStarted","Data":"2e28ba492c65a0ae026c9e5d907954ae3f09e135836fe39f9827eacacbd6009f"} Feb 26 15:46:22 crc kubenswrapper[4907]: I0226 15:46:22.808371 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/downloads-7954f5f757-wcgj6" event={"ID":"2e969445-2d6b-4ea1-bd4b-3473a66e8c91","Type":"ContainerStarted","Data":"f5e979e0d732d246d701971319bc5ce590fb678c30052cb5ed95d9f9dc84c5b3"} Feb 26 15:46:22 crc kubenswrapper[4907]: I0226 15:46:22.809218 4907 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-console/downloads-7954f5f757-wcgj6" Feb 26 15:46:22 crc kubenswrapper[4907]: I0226 15:46:22.814728 4907 patch_prober.go:28] interesting pod/downloads-7954f5f757-wcgj6 container/download-server namespace/openshift-console: Readiness probe status=failure output="Get \"http://10.217.0.15:8080/\": dial tcp 10.217.0.15:8080: connect: connection refused" start-of-body= Feb 26 15:46:22 crc kubenswrapper[4907]: I0226 15:46:22.814785 4907 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console/downloads-7954f5f757-wcgj6" podUID="2e969445-2d6b-4ea1-bd4b-3473a66e8c91" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.15:8080/\": dial tcp 10.217.0.15:8080: connect: connection refused" Feb 26 15:46:22 crc 
kubenswrapper[4907]: I0226 15:46:22.837825 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-lknds" event={"ID":"ef4c8a6a-c008-406e-8aed-2164e582f710","Type":"ContainerStarted","Data":"adbfe8189c34e9ee8a05f32c2b41c633841772343b9f24a4d17888e5fba908f4"} Feb 26 15:46:22 crc kubenswrapper[4907]: I0226 15:46:22.837863 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-lknds" event={"ID":"ef4c8a6a-c008-406e-8aed-2164e582f710","Type":"ContainerStarted","Data":"1a924e9cddc25fd490ebb5fb9c5cb95b17986820b0023453c0b28ad3ac569e95"} Feb 26 15:46:22 crc kubenswrapper[4907]: I0226 15:46:22.894976 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-kqtml\" (UID: \"0fefaf3e-d327-41f8-bbbe-94b051a63b19\") " pod="openshift-image-registry/image-registry-697d97f7c8-kqtml" Feb 26 15:46:22 crc kubenswrapper[4907]: E0226 15:46:22.899534 4907 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-26 15:46:23.399513077 +0000 UTC m=+245.918074926 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-kqtml" (UID: "0fefaf3e-d327-41f8-bbbe-94b051a63b19") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 26 15:46:22 crc kubenswrapper[4907]: I0226 15:46:22.918675 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress/router-default-5444994796-hqs2t" event={"ID":"def12a12-3cf0-4694-a957-3e69aa18f880","Type":"ContainerStarted","Data":"3e88f3f5736c503c1fda2517aa5d2841e908e78f60760c1b9ce2dcfd41651470"} Feb 26 15:46:22 crc kubenswrapper[4907]: I0226 15:46:22.954553 4907 ???:1] "http: TLS handshake error from 192.168.126.11:44830: no serving certificate available for the kubelet" Feb 26 15:46:22 crc kubenswrapper[4907]: I0226 15:46:22.961363 4907 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-96swm" podStartSLOduration=187.961348185 podStartE2EDuration="3m7.961348185s" podCreationTimestamp="2026-02-26 15:43:15 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-26 15:46:22.95902349 +0000 UTC m=+245.477585339" watchObservedRunningTime="2026-02-26 15:46:22.961348185 +0000 UTC m=+245.479910034" Feb 26 15:46:22 crc kubenswrapper[4907]: I0226 15:46:22.969872 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-dvcn5" event={"ID":"23df369e-238f-4fbc-99fa-b22c21011db0","Type":"ContainerStarted","Data":"463fce778766ab780cd80023770cbb1f0ce53f29756763e00f5d14a8a833939e"} Feb 26 15:46:22 crc kubenswrapper[4907]: I0226 15:46:22.985628 4907 kubelet.go:2453] "SyncLoop 
(PLEG): event for pod" pod="openshift-console-operator/console-operator-58897d9998-sjflz" event={"ID":"e9aeee88-40a0-4c8a-aebf-680cf878f42e","Type":"ContainerStarted","Data":"c9416c40e6c9eef962652fdc7efde896e30d8a87ce1386a88a4c6e985e0dbbe5"} Feb 26 15:46:22 crc kubenswrapper[4907]: I0226 15:46:22.997486 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 26 15:46:22 crc kubenswrapper[4907]: E0226 15:46:22.998016 4907 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-26 15:46:23.497986535 +0000 UTC m=+246.016548374 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 26 15:46:23 crc kubenswrapper[4907]: I0226 15:46:23.007953 4907 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-authentication/oauth-openshift-558db77b4-lr7kc" podStartSLOduration=189.007935071 podStartE2EDuration="3m9.007935071s" podCreationTimestamp="2026-02-26 15:43:14 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-26 15:46:23.00660169 +0000 UTC m=+245.525163539" watchObservedRunningTime="2026-02-26 15:46:23.007935071 +0000 UTC m=+245.526496920" Feb 26 15:46:23 crc kubenswrapper[4907]: I0226 15:46:23.013048 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-f9d7485db-9lx5z" event={"ID":"0eb5d98d-ab9b-47bd-b8ca-27340a4d6a4f","Type":"ContainerStarted","Data":"be31115ab65a29efb189d2a6c53c2b7546be6f8e45a326225af6e1c0cb24b3d4"} Feb 26 15:46:23 crc kubenswrapper[4907]: I0226 15:46:23.035250 4907 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-xvqkl" podStartSLOduration=188.03523363 podStartE2EDuration="3m8.03523363s" podCreationTimestamp="2026-02-26 15:43:15 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-26 15:46:23.034722148 +0000 UTC m=+245.553284007" watchObservedRunningTime="2026-02-26 15:46:23.03523363 +0000 UTC m=+245.553795479" Feb 26 15:46:23 crc 
kubenswrapper[4907]: I0226 15:46:23.041038 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-server-k8lkj" event={"ID":"ae456f0d-bf77-4e93-9bf2-c47c27b8eadf","Type":"ContainerStarted","Data":"a0abcedd01c76d59a8d8993200152b4bd3975e10a26b7b3b3e8721652066418a"} Feb 26 15:46:23 crc kubenswrapper[4907]: I0226 15:46:23.081223 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns-operator/dns-operator-744455d44c-tvpcl" event={"ID":"2305f4ed-b155-4e30-b83c-7dde9bec7b28","Type":"ContainerStarted","Data":"57bf70f0755b2181577ddde904ef421d744052683878920e1a7661b7028b35d8"} Feb 26 15:46:23 crc kubenswrapper[4907]: I0226 15:46:23.101342 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-kqtml\" (UID: \"0fefaf3e-d327-41f8-bbbe-94b051a63b19\") " pod="openshift-image-registry/image-registry-697d97f7c8-kqtml" Feb 26 15:46:23 crc kubenswrapper[4907]: E0226 15:46:23.101951 4907 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-26 15:46:23.601935284 +0000 UTC m=+246.120497133 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-kqtml" (UID: "0fefaf3e-d327-41f8-bbbe-94b051a63b19") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 26 15:46:23 crc kubenswrapper[4907]: I0226 15:46:23.116827 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-operator/ingress-operator-5b745b69d9-gnh8z" event={"ID":"f1e8a3f9-9de9-4181-869f-9fce597e6b5b","Type":"ContainerStarted","Data":"8827b2215c421ff5e15c1cd006fa84e99e835b98d562efd3b17bb927fe9f1f04"} Feb 26 15:46:23 crc kubenswrapper[4907]: I0226 15:46:23.130267 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-canary/ingress-canary-hs7mv" event={"ID":"f6e20f97-f90b-41d7-905e-f627e07b0dfb","Type":"ContainerStarted","Data":"0c929f89c25bfb82ea239a94c1d3ffb34e6be552acf456cf5050a50ef388a1c7"} Feb 26 15:46:23 crc kubenswrapper[4907]: I0226 15:46:23.134907 4907 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-config-operator/machine-config-operator-74547568cd-4z9rn"] Feb 26 15:46:23 crc kubenswrapper[4907]: I0226 15:46:23.181668 4907 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-apiserver/apiserver-76f77b778f-5tc4m" podStartSLOduration=189.181652456 podStartE2EDuration="3m9.181652456s" podCreationTimestamp="2026-02-26 15:43:14 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-26 15:46:23.181159885 +0000 UTC m=+245.699721734" watchObservedRunningTime="2026-02-26 15:46:23.181652456 +0000 UTC m=+245.700214295" Feb 26 15:46:23 crc kubenswrapper[4907]: I0226 15:46:23.204398 4907 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 26 15:46:23 crc kubenswrapper[4907]: E0226 15:46:23.210381 4907 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-26 15:46:23.710359878 +0000 UTC m=+246.228921727 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 26 15:46:23 crc kubenswrapper[4907]: I0226 15:46:23.214031 4907 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-hztww"] Feb 26 15:46:23 crc kubenswrapper[4907]: I0226 15:46:23.226523 4907 ???:1] "http: TLS handshake error from 192.168.126.11:44834: no serving certificate available for the kubelet" Feb 26 15:46:23 crc kubenswrapper[4907]: I0226 15:46:23.249781 4907 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-config-operator/openshift-config-operator-7777fb866f-2f4gk" podStartSLOduration=188.249759414 podStartE2EDuration="3m8.249759414s" podCreationTimestamp="2026-02-26 15:43:15 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" 
observedRunningTime="2026-02-26 15:46:23.245481422 +0000 UTC m=+245.764043271" watchObservedRunningTime="2026-02-26 15:46:23.249759414 +0000 UTC m=+245.768321263" Feb 26 15:46:23 crc kubenswrapper[4907]: I0226 15:46:23.266410 4907 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-config-operator/openshift-config-operator-7777fb866f-2f4gk" Feb 26 15:46:23 crc kubenswrapper[4907]: I0226 15:46:23.268455 4907 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-w9nx4"] Feb 26 15:46:23 crc kubenswrapper[4907]: I0226 15:46:23.318109 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-kqtml\" (UID: \"0fefaf3e-d327-41f8-bbbe-94b051a63b19\") " pod="openshift-image-registry/image-registry-697d97f7c8-kqtml" Feb 26 15:46:23 crc kubenswrapper[4907]: E0226 15:46:23.318405 4907 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-26 15:46:23.818394103 +0000 UTC m=+246.336955952 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-kqtml" (UID: "0fefaf3e-d327-41f8-bbbe-94b051a63b19") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 26 15:46:23 crc kubenswrapper[4907]: W0226 15:46:23.320545 4907 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod317291a5_1f7f_4d5a_8779_7c769dae2bc5.slice/crio-46cacb55d1bafcacdcc76f4771673058b361051b34ab80d5b5e18625d1bf6c0a WatchSource:0}: Error finding container 46cacb55d1bafcacdcc76f4771673058b361051b34ab80d5b5e18625d1bf6c0a: Status 404 returned error can't find the container with id 46cacb55d1bafcacdcc76f4771673058b361051b34ab80d5b5e18625d1bf6c0a Feb 26 15:46:23 crc kubenswrapper[4907]: I0226 15:46:23.381154 4907 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-storage-version-migrator/migrator-59844c95c7-gtnvm"] Feb 26 15:46:23 crc kubenswrapper[4907]: I0226 15:46:23.421575 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 26 15:46:23 crc kubenswrapper[4907]: E0226 15:46:23.421947 4907 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-26 15:46:23.921934162 +0000 UTC m=+246.440496001 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 26 15:46:23 crc kubenswrapper[4907]: I0226 15:46:23.520292 4907 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-controller-manager/controller-manager-879f6c89f-p9vbb" podStartSLOduration=188.520274777 podStartE2EDuration="3m8.520274777s" podCreationTimestamp="2026-02-26 15:43:15 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-26 15:46:23.494226899 +0000 UTC m=+246.012788748" watchObservedRunningTime="2026-02-26 15:46:23.520274777 +0000 UTC m=+246.038836616" Feb 26 15:46:23 crc kubenswrapper[4907]: I0226 15:46:23.523863 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-kqtml\" (UID: \"0fefaf3e-d327-41f8-bbbe-94b051a63b19\") " pod="openshift-image-registry/image-registry-697d97f7c8-kqtml" Feb 26 15:46:23 crc kubenswrapper[4907]: E0226 15:46:23.524200 4907 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-26 15:46:24.02418896 +0000 UTC m=+246.542750809 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-kqtml" (UID: "0fefaf3e-d327-41f8-bbbe-94b051a63b19") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 26 15:46:23 crc kubenswrapper[4907]: I0226 15:46:23.572739 4907 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-config-operator/machine-config-controller-84d6567774-rsw5p"] Feb 26 15:46:23 crc kubenswrapper[4907]: I0226 15:46:23.574309 4907 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-machine-approver/machine-approver-56656f9798-m7jwg" podStartSLOduration=188.57429369 podStartE2EDuration="3m8.57429369s" podCreationTimestamp="2026-02-26 15:43:15 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-26 15:46:23.568777369 +0000 UTC m=+246.087339208" watchObservedRunningTime="2026-02-26 15:46:23.57429369 +0000 UTC m=+246.092855529" Feb 26 15:46:23 crc kubenswrapper[4907]: I0226 15:46:23.624766 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 26 15:46:23 crc kubenswrapper[4907]: E0226 15:46:23.625313 4907 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. 
No retries permitted until 2026-02-26 15:46:24.12529181 +0000 UTC m=+246.643853649 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 26 15:46:23 crc kubenswrapper[4907]: I0226 15:46:23.671525 4907 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-api/machine-api-operator-5694c8668f-hdqkt" podStartSLOduration=188.671509308 podStartE2EDuration="3m8.671509308s" podCreationTimestamp="2026-02-26 15:43:15 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-26 15:46:23.671336624 +0000 UTC m=+246.189898473" watchObservedRunningTime="2026-02-26 15:46:23.671509308 +0000 UTC m=+246.190071157" Feb 26 15:46:23 crc kubenswrapper[4907]: I0226 15:46:23.729541 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-kqtml\" (UID: \"0fefaf3e-d327-41f8-bbbe-94b051a63b19\") " pod="openshift-image-registry/image-registry-697d97f7c8-kqtml" Feb 26 15:46:23 crc kubenswrapper[4907]: E0226 15:46:23.729940 4907 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-26 15:46:24.229924555 +0000 UTC m=+246.748486404 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-kqtml" (UID: "0fefaf3e-d327-41f8-bbbe-94b051a63b19") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 26 15:46:23 crc kubenswrapper[4907]: I0226 15:46:23.788973 4907 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29535345-b7r88"] Feb 26 15:46:23 crc kubenswrapper[4907]: I0226 15:46:23.796637 4907 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-ks676"] Feb 26 15:46:23 crc kubenswrapper[4907]: I0226 15:46:23.830224 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 26 15:46:23 crc kubenswrapper[4907]: E0226 15:46:23.830402 4907 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-26 15:46:24.33037655 +0000 UTC m=+246.848938399 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 26 15:46:23 crc kubenswrapper[4907]: W0226 15:46:23.830651 4907 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podb71208e6_41a8_44a3_a8fd_7380b1da6ffa.slice/crio-6bab9b6c0422ed74b5624fd032ceb9d67af2c78e42d399b26c095fdddd0c54e4 WatchSource:0}: Error finding container 6bab9b6c0422ed74b5624fd032ceb9d67af2c78e42d399b26c095fdddd0c54e4: Status 404 returned error can't find the container with id 6bab9b6c0422ed74b5624fd032ceb9d67af2c78e42d399b26c095fdddd0c54e4 Feb 26 15:46:23 crc kubenswrapper[4907]: I0226 15:46:23.830724 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-kqtml\" (UID: \"0fefaf3e-d327-41f8-bbbe-94b051a63b19\") " pod="openshift-image-registry/image-registry-697d97f7c8-kqtml" Feb 26 15:46:23 crc kubenswrapper[4907]: E0226 15:46:23.831003 4907 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-26 15:46:24.330996565 +0000 UTC m=+246.849558414 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-kqtml" (UID: "0fefaf3e-d327-41f8-bbbe-94b051a63b19") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 26 15:46:23 crc kubenswrapper[4907]: I0226 15:46:23.863855 4907 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-authentication-operator/authentication-operator-69f744f599-wtjfv" podStartSLOduration=189.863833905 podStartE2EDuration="3m9.863833905s" podCreationTimestamp="2026-02-26 15:43:14 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-26 15:46:23.863094717 +0000 UTC m=+246.381656566" watchObservedRunningTime="2026-02-26 15:46:23.863833905 +0000 UTC m=+246.382395744" Feb 26 15:46:23 crc kubenswrapper[4907]: I0226 15:46:23.936052 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 26 15:46:23 crc kubenswrapper[4907]: E0226 15:46:23.936533 4907 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-26 15:46:24.436519051 +0000 UTC m=+246.955080900 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 26 15:46:24 crc kubenswrapper[4907]: I0226 15:46:24.033552 4907 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-apiserver/apiserver-76f77b778f-5tc4m" Feb 26 15:46:24 crc kubenswrapper[4907]: I0226 15:46:24.033605 4907 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-apiserver/apiserver-76f77b778f-5tc4m" Feb 26 15:46:24 crc kubenswrapper[4907]: I0226 15:46:24.037285 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-kqtml\" (UID: \"0fefaf3e-d327-41f8-bbbe-94b051a63b19\") " pod="openshift-image-registry/image-registry-697d97f7c8-kqtml" Feb 26 15:46:24 crc kubenswrapper[4907]: E0226 15:46:24.037636 4907 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-26 15:46:24.53758013 +0000 UTC m=+247.056141969 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-kqtml" (UID: "0fefaf3e-d327-41f8-bbbe-94b051a63b19") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 26 15:46:24 crc kubenswrapper[4907]: I0226 15:46:24.060731 4907 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-lknds" podStartSLOduration=189.060715159 podStartE2EDuration="3m9.060715159s" podCreationTimestamp="2026-02-26 15:43:15 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-26 15:46:24.047646659 +0000 UTC m=+246.566208508" watchObservedRunningTime="2026-02-26 15:46:24.060715159 +0000 UTC m=+246.579277008" Feb 26 15:46:24 crc kubenswrapper[4907]: I0226 15:46:24.076929 4907 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-multus/multus-admission-controller-857f4d67dd-fd8f2"] Feb 26 15:46:24 crc kubenswrapper[4907]: I0226 15:46:24.122653 4907 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/downloads-7954f5f757-wcgj6" podStartSLOduration=189.12264054 podStartE2EDuration="3m9.12264054s" podCreationTimestamp="2026-02-26 15:43:15 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-26 15:46:24.121416631 +0000 UTC m=+246.639978490" watchObservedRunningTime="2026-02-26 15:46:24.12264054 +0000 UTC m=+246.641202389" Feb 26 15:46:24 crc kubenswrapper[4907]: I0226 15:46:24.138521 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume 
\"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 26 15:46:24 crc kubenswrapper[4907]: E0226 15:46:24.138994 4907 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-26 15:46:24.638973448 +0000 UTC m=+247.157535307 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 26 15:46:24 crc kubenswrapper[4907]: I0226 15:46:24.154676 4907 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-8fhkw"] Feb 26 15:46:24 crc kubenswrapper[4907]: I0226 15:46:24.154710 4907 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29535344-fsndq"] Feb 26 15:46:24 crc kubenswrapper[4907]: I0226 15:46:24.159365 4907 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-z6m64"] Feb 26 15:46:24 crc kubenswrapper[4907]: I0226 15:46:24.239946 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod 
\"image-registry-697d97f7c8-kqtml\" (UID: \"0fefaf3e-d327-41f8-bbbe-94b051a63b19\") " pod="openshift-image-registry/image-registry-697d97f7c8-kqtml" Feb 26 15:46:24 crc kubenswrapper[4907]: E0226 15:46:24.240771 4907 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-26 15:46:24.740754065 +0000 UTC m=+247.259315914 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-kqtml" (UID: "0fefaf3e-d327-41f8-bbbe-94b051a63b19") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 26 15:46:24 crc kubenswrapper[4907]: I0226 15:46:24.253970 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-f9d7485db-9lx5z" event={"ID":"0eb5d98d-ab9b-47bd-b8ca-27340a4d6a4f","Type":"ContainerStarted","Data":"1e8efb94e29b2f5eec4ee383a5c98c3e9316e1d77b271be0304c238e08f98c9f"} Feb 26 15:46:24 crc kubenswrapper[4907]: I0226 15:46:24.270148 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-server-k8lkj" event={"ID":"ae456f0d-bf77-4e93-9bf2-c47c27b8eadf","Type":"ContainerStarted","Data":"e5adcdf95eb9993856a579d62614d60a87663fddabce569cf40f22089db8edd6"} Feb 26 15:46:24 crc kubenswrapper[4907]: I0226 15:46:24.275369 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-hztww" event={"ID":"8b678693-5390-4ce1-bf51-a2da37343241","Type":"ContainerStarted","Data":"93ca5d5908f3c811ccc1d0c16e945d21cbf245f3a823065ecc534d43df47377c"} Feb 26 15:46:24 
crc kubenswrapper[4907]: I0226 15:46:24.276678 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-4z8ql" event={"ID":"1c8904fd-8dd8-418a-b32a-eb1ccf934fec","Type":"ContainerStarted","Data":"edc63303dd35ba29910660e93bd2b3adc72b3c4402b9080edc1dcf8322d438fe"} Feb 26 15:46:24 crc kubenswrapper[4907]: I0226 15:46:24.277477 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-ks676" event={"ID":"6aa04e41-18ce-4928-b012-ae804b9cfafc","Type":"ContainerStarted","Data":"889def7d1777c9ad26612036782d19e8c799619d62eea714fa6bac72cf678645"} Feb 26 15:46:24 crc kubenswrapper[4907]: I0226 15:46:24.309482 4907 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/machine-config-server-k8lkj" podStartSLOduration=6.309468436 podStartE2EDuration="6.309468436s" podCreationTimestamp="2026-02-26 15:46:18 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-26 15:46:24.184924659 +0000 UTC m=+246.703486508" watchObservedRunningTime="2026-02-26 15:46:24.309468436 +0000 UTC m=+246.828030285" Feb 26 15:46:24 crc kubenswrapper[4907]: I0226 15:46:24.309960 4907 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/console-f9d7485db-9lx5z" podStartSLOduration=189.309956507 podStartE2EDuration="3m9.309956507s" podCreationTimestamp="2026-02-26 15:43:15 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-26 15:46:24.298097196 +0000 UTC m=+246.816659045" watchObservedRunningTime="2026-02-26 15:46:24.309956507 +0000 UTC m=+246.828518356" Feb 26 15:46:24 crc kubenswrapper[4907]: I0226 15:46:24.326849 4907 kubelet.go:2453] "SyncLoop (PLEG): event for 
pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29535345-b7r88" event={"ID":"0dd74211-40c2-437c-9295-b69e709f81fe","Type":"ContainerStarted","Data":"9527d4a12e5d27011aa4fb6b2f87ba832fc3936ed0b9ac0122f3153e6afda18c"} Feb 26 15:46:24 crc kubenswrapper[4907]: I0226 15:46:24.342492 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 26 15:46:24 crc kubenswrapper[4907]: E0226 15:46:24.343688 4907 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-26 15:46:24.843669788 +0000 UTC m=+247.362231637 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 26 15:46:24 crc kubenswrapper[4907]: I0226 15:46:24.361069 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-canary/ingress-canary-hs7mv" event={"ID":"f6e20f97-f90b-41d7-905e-f627e07b0dfb","Type":"ContainerStarted","Data":"4c48faac38d3e291465fa7394112c397c7b5ec0d9498edd5b33dfe3234090a13"} Feb 26 15:46:24 crc kubenswrapper[4907]: I0226 15:46:24.364431 4907 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-rlmpn"] Feb 26 15:46:24 crc kubenswrapper[4907]: I0226 15:46:24.390993 4907 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-4z8ql" podStartSLOduration=189.390969701 podStartE2EDuration="3m9.390969701s" podCreationTimestamp="2026-02-26 15:43:15 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-26 15:46:24.364211776 +0000 UTC m=+246.882773625" watchObservedRunningTime="2026-02-26 15:46:24.390969701 +0000 UTC m=+246.909531550" Feb 26 15:46:24 crc kubenswrapper[4907]: I0226 15:46:24.412811 4907 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Feb 26 15:46:24 crc kubenswrapper[4907]: I0226 15:46:24.446056 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: 
\"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-kqtml\" (UID: \"0fefaf3e-d327-41f8-bbbe-94b051a63b19\") " pod="openshift-image-registry/image-registry-697d97f7c8-kqtml" Feb 26 15:46:24 crc kubenswrapper[4907]: E0226 15:46:24.447123 4907 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-26 15:46:24.947109104 +0000 UTC m=+247.465670953 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-kqtml" (UID: "0fefaf3e-d327-41f8-bbbe-94b051a63b19") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 26 15:46:24 crc kubenswrapper[4907]: I0226 15:46:24.465639 4907 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-k2mmn"] Feb 26 15:46:24 crc kubenswrapper[4907]: I0226 15:46:24.525417 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress/router-default-5444994796-hqs2t" event={"ID":"def12a12-3cf0-4694-a957-3e69aa18f880","Type":"ContainerStarted","Data":"7fa6080fa88fe160fa51de84eeb946878aae050db7460d8d85456815213469fe"} Feb 26 15:46:24 crc kubenswrapper[4907]: I0226 15:46:24.538479 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-gtnvm" event={"ID":"05322669-16de-41ca-9ae9-3580b5cdda05","Type":"ContainerStarted","Data":"5018aabde31d25095f6865d2140fed8a46781ca34ff57265ba582dd9997b5e20"} Feb 26 15:46:24 crc 
kubenswrapper[4907]: I0226 15:46:24.547339 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 26 15:46:24 crc kubenswrapper[4907]: E0226 15:46:24.548399 4907 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-26 15:46:25.048381359 +0000 UTC m=+247.566943208 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 26 15:46:24 crc kubenswrapper[4907]: I0226 15:46:24.556547 4907 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/catalog-operator-68c6474976-z86sf"] Feb 26 15:46:24 crc kubenswrapper[4907]: I0226 15:46:24.559799 4907 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ingress-canary/ingress-canary-hs7mv" podStartSLOduration=6.55978575 podStartE2EDuration="6.55978575s" podCreationTimestamp="2026-02-26 15:46:18 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-26 15:46:24.542865188 +0000 UTC m=+247.061427037" watchObservedRunningTime="2026-02-26 15:46:24.55978575 +0000 UTC m=+247.078347599" Feb 26 
15:46:24 crc kubenswrapper[4907]: I0226 15:46:24.562247 4907 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-sw4qw"] Feb 26 15:46:24 crc kubenswrapper[4907]: I0226 15:46:24.562292 4907 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-service-ca-operator/service-ca-operator-777779d784-gvx2f"] Feb 26 15:46:24 crc kubenswrapper[4907]: I0226 15:46:24.575110 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console-operator/console-operator-58897d9998-sjflz" event={"ID":"e9aeee88-40a0-4c8a-aebf-680cf878f42e","Type":"ContainerStarted","Data":"1d5d2d22f0d22d836ad665af97a8dd9985c2c0b7baecfc0e4b6c4b3046ec26d3"} Feb 26 15:46:24 crc kubenswrapper[4907]: I0226 15:46:24.575873 4907 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-console-operator/console-operator-58897d9998-sjflz" Feb 26 15:46:24 crc kubenswrapper[4907]: I0226 15:46:24.585925 4907 patch_prober.go:28] interesting pod/console-operator-58897d9998-sjflz container/console-operator namespace/openshift-console-operator: Readiness probe status=failure output="Get \"https://10.217.0.12:8443/readyz\": dial tcp 10.217.0.12:8443: connect: connection refused" start-of-body= Feb 26 15:46:24 crc kubenswrapper[4907]: I0226 15:46:24.585972 4907 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console-operator/console-operator-58897d9998-sjflz" podUID="e9aeee88-40a0-4c8a-aebf-680cf878f42e" containerName="console-operator" probeResult="failure" output="Get \"https://10.217.0.12:8443/readyz\": dial tcp 10.217.0.12:8443: connect: connection refused" Feb 26 15:46:24 crc kubenswrapper[4907]: I0226 15:46:24.613842 4907 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ingress/router-default-5444994796-hqs2t" podStartSLOduration=189.613826943 podStartE2EDuration="3m9.613826943s" podCreationTimestamp="2026-02-26 15:43:15 +0000 UTC" 
firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-26 15:46:24.612066851 +0000 UTC m=+247.130628710" watchObservedRunningTime="2026-02-26 15:46:24.613826943 +0000 UTC m=+247.132388792" Feb 26 15:46:24 crc kubenswrapper[4907]: I0226 15:46:24.614973 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-controller-84d6567774-rsw5p" event={"ID":"b71208e6-41a8-44a3-a8fd-7380b1da6ffa","Type":"ContainerStarted","Data":"6bab9b6c0422ed74b5624fd032ceb9d67af2c78e42d399b26c095fdddd0c54e4"} Feb 26 15:46:24 crc kubenswrapper[4907]: I0226 15:46:24.619572 4907 ???:1] "http: TLS handshake error from 192.168.126.11:44840: no serving certificate available for the kubelet" Feb 26 15:46:24 crc kubenswrapper[4907]: I0226 15:46:24.619834 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-operator-74547568cd-4z9rn" event={"ID":"317291a5-1f7f-4d5a-8779-7c769dae2bc5","Type":"ContainerStarted","Data":"46cacb55d1bafcacdcc76f4771673058b361051b34ab80d5b5e18625d1bf6c0a"} Feb 26 15:46:24 crc kubenswrapper[4907]: I0226 15:46:24.651073 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-kqtml\" (UID: \"0fefaf3e-d327-41f8-bbbe-94b051a63b19\") " pod="openshift-image-registry/image-registry-697d97f7c8-kqtml" Feb 26 15:46:24 crc kubenswrapper[4907]: E0226 15:46:24.659621 4907 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-26 15:46:25.15960726 +0000 UTC m=+247.678169109 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-kqtml" (UID: "0fefaf3e-d327-41f8-bbbe-94b051a63b19") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 26 15:46:24 crc kubenswrapper[4907]: I0226 15:46:24.676768 4907 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29535346-hhrww"] Feb 26 15:46:24 crc kubenswrapper[4907]: I0226 15:46:24.691732 4907 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-service-ca/service-ca-9c57cc56f-z5rgk"] Feb 26 15:46:24 crc kubenswrapper[4907]: I0226 15:46:24.692968 4907 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console-operator/console-operator-58897d9998-sjflz" podStartSLOduration=189.692952682 podStartE2EDuration="3m9.692952682s" podCreationTimestamp="2026-02-26 15:43:15 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-26 15:46:24.687150124 +0000 UTC m=+247.205711973" watchObservedRunningTime="2026-02-26 15:46:24.692952682 +0000 UTC m=+247.211514531" Feb 26 15:46:24 crc kubenswrapper[4907]: I0226 15:46:24.714452 4907 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-ingress/router-default-5444994796-hqs2t" Feb 26 15:46:24 crc kubenswrapper[4907]: I0226 15:46:24.715815 4907 patch_prober.go:28] interesting pod/router-default-5444994796-hqs2t container/router namespace/openshift-ingress: Startup probe status=failure output="Get \"http://localhost:1936/healthz/ready\": dial tcp [::1]:1936: connect: connection refused" start-of-body= Feb 26 15:46:24 crc kubenswrapper[4907]: I0226 15:46:24.715854 4907 
prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-hqs2t" podUID="def12a12-3cf0-4694-a957-3e69aa18f880" containerName="router" probeResult="failure" output="Get \"http://localhost:1936/healthz/ready\": dial tcp [::1]:1936: connect: connection refused" Feb 26 15:46:24 crc kubenswrapper[4907]: W0226 15:46:24.737979 4907 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod2ec425f0_76a0_445f_8d38_a4f125da3312.slice/crio-fcca8da5978c58e3b57dc4e232e4fecae52f8c81cd114bf7939895027b037086 WatchSource:0}: Error finding container fcca8da5978c58e3b57dc4e232e4fecae52f8c81cd114bf7939895027b037086: Status 404 returned error can't find the container with id fcca8da5978c58e3b57dc4e232e4fecae52f8c81cd114bf7939895027b037086 Feb 26 15:46:24 crc kubenswrapper[4907]: I0226 15:46:24.761877 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 26 15:46:24 crc kubenswrapper[4907]: E0226 15:46:24.762308 4907 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-26 15:46:25.262292798 +0000 UTC m=+247.780854647 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 26 15:46:24 crc kubenswrapper[4907]: I0226 15:46:24.771883 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-operator/ingress-operator-5b745b69d9-gnh8z" event={"ID":"f1e8a3f9-9de9-4181-869f-9fce597e6b5b","Type":"ContainerStarted","Data":"774f3b8a8f824ea873d3e2307b652ee7f5c63e5e0252c27caf9ab68f25f1f109"} Feb 26 15:46:24 crc kubenswrapper[4907]: I0226 15:46:24.803078 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-vnrdg" event={"ID":"1c26ef74-f7b8-4cc3-ae04-783bfa2b38b4","Type":"ContainerStarted","Data":"9cf8e67ed24e00c0bca0fb86264b315592f016462919b99fd95adf8e904ee95c"} Feb 26 15:46:24 crc kubenswrapper[4907]: I0226 15:46:24.831091 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-w9nx4" event={"ID":"af8aa9df-432b-40bd-847c-c3539b32cb59","Type":"ContainerStarted","Data":"70fc08e7327c151750a19cc75b907a37869b9b43fac6c83b5d20c2be8ced14a4"} Feb 26 15:46:24 crc kubenswrapper[4907]: I0226 15:46:24.859089 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd-operator/etcd-operator-b45778765-8wmgt" event={"ID":"ed605a31-991f-4fcc-a861-3bfe94c7b92c","Type":"ContainerStarted","Data":"e307e51e748f52976dd7773e0d6735de319a0a7418c8764a037bc3b64878e56b"} Feb 26 15:46:24 crc kubenswrapper[4907]: I0226 15:46:24.864638 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: 
\"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-kqtml\" (UID: \"0fefaf3e-d327-41f8-bbbe-94b051a63b19\") " pod="openshift-image-registry/image-registry-697d97f7c8-kqtml" Feb 26 15:46:24 crc kubenswrapper[4907]: E0226 15:46:24.866516 4907 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-26 15:46:25.366500823 +0000 UTC m=+247.885062672 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-kqtml" (UID: "0fefaf3e-d327-41f8-bbbe-94b051a63b19") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 26 15:46:24 crc kubenswrapper[4907]: I0226 15:46:24.904814 4907 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-vnrdg" podStartSLOduration=189.904796452 podStartE2EDuration="3m9.904796452s" podCreationTimestamp="2026-02-26 15:43:15 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-26 15:46:24.882538083 +0000 UTC m=+247.401099932" watchObservedRunningTime="2026-02-26 15:46:24.904796452 +0000 UTC m=+247.423358301" Feb 26 15:46:24 crc kubenswrapper[4907]: I0226 15:46:24.907365 4907 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["hostpath-provisioner/csi-hostpathplugin-l5fqj"] Feb 26 15:46:24 crc kubenswrapper[4907]: I0226 15:46:24.950690 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-marketplace/marketplace-operator-79b997595-dvcn5" event={"ID":"23df369e-238f-4fbc-99fa-b22c21011db0","Type":"ContainerStarted","Data":"696e6ee06370721d0f2fc0767f48826636ddd1416581a40026f5923f1f382ca8"} Feb 26 15:46:24 crc kubenswrapper[4907]: I0226 15:46:24.956152 4907 patch_prober.go:28] interesting pod/downloads-7954f5f757-wcgj6 container/download-server namespace/openshift-console: Readiness probe status=failure output="Get \"http://10.217.0.15:8080/\": dial tcp 10.217.0.15:8080: connect: connection refused" start-of-body= Feb 26 15:46:24 crc kubenswrapper[4907]: I0226 15:46:24.956206 4907 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console/downloads-7954f5f757-wcgj6" podUID="2e969445-2d6b-4ea1-bd4b-3473a66e8c91" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.15:8080/\": dial tcp 10.217.0.15:8080: connect: connection refused" Feb 26 15:46:24 crc kubenswrapper[4907]: I0226 15:46:24.965930 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 26 15:46:24 crc kubenswrapper[4907]: E0226 15:46:24.966215 4907 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-26 15:46:25.46620142 +0000 UTC m=+247.984763269 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 26 15:46:25 crc kubenswrapper[4907]: I0226 15:46:25.009690 4907 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-vnrdg" Feb 26 15:46:25 crc kubenswrapper[4907]: I0226 15:46:25.010027 4907 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-vnrdg" Feb 26 15:46:25 crc kubenswrapper[4907]: I0226 15:46:25.030246 4907 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-w9nx4" podStartSLOduration=190.030228181 podStartE2EDuration="3m10.030228181s" podCreationTimestamp="2026-02-26 15:43:15 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-26 15:46:25.028839038 +0000 UTC m=+247.547400887" watchObservedRunningTime="2026-02-26 15:46:25.030228181 +0000 UTC m=+247.548790030" Feb 26 15:46:25 crc kubenswrapper[4907]: I0226 15:46:25.031116 4907 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-etcd-operator/etcd-operator-b45778765-8wmgt" podStartSLOduration=190.031111511 podStartE2EDuration="3m10.031111511s" podCreationTimestamp="2026-02-26 15:43:15 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-26 15:46:24.945424227 +0000 UTC m=+247.463986076" watchObservedRunningTime="2026-02-26 
15:46:25.031111511 +0000 UTC m=+247.549673360" Feb 26 15:46:25 crc kubenswrapper[4907]: I0226 15:46:25.071748 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-kqtml\" (UID: \"0fefaf3e-d327-41f8-bbbe-94b051a63b19\") " pod="openshift-image-registry/image-registry-697d97f7c8-kqtml" Feb 26 15:46:25 crc kubenswrapper[4907]: I0226 15:46:25.075788 4907 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/marketplace-operator-79b997595-dvcn5" podStartSLOduration=190.075775262 podStartE2EDuration="3m10.075775262s" podCreationTimestamp="2026-02-26 15:43:15 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-26 15:46:25.073534519 +0000 UTC m=+247.592096358" watchObservedRunningTime="2026-02-26 15:46:25.075775262 +0000 UTC m=+247.594337111" Feb 26 15:46:25 crc kubenswrapper[4907]: E0226 15:46:25.079313 4907 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-26 15:46:25.579300016 +0000 UTC m=+248.097861865 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-kqtml" (UID: "0fefaf3e-d327-41f8-bbbe-94b051a63b19") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 26 15:46:25 crc kubenswrapper[4907]: I0226 15:46:25.119037 4907 patch_prober.go:28] interesting pod/apiserver-76f77b778f-5tc4m container/openshift-apiserver namespace/openshift-apiserver: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[+]ping ok Feb 26 15:46:25 crc kubenswrapper[4907]: [+]log ok Feb 26 15:46:25 crc kubenswrapper[4907]: [+]etcd ok Feb 26 15:46:25 crc kubenswrapper[4907]: [+]poststarthook/start-apiserver-admission-initializer ok Feb 26 15:46:25 crc kubenswrapper[4907]: [+]poststarthook/generic-apiserver-start-informers ok Feb 26 15:46:25 crc kubenswrapper[4907]: [+]poststarthook/max-in-flight-filter ok Feb 26 15:46:25 crc kubenswrapper[4907]: [+]poststarthook/storage-object-count-tracker-hook ok Feb 26 15:46:25 crc kubenswrapper[4907]: [+]poststarthook/image.openshift.io-apiserver-caches ok Feb 26 15:46:25 crc kubenswrapper[4907]: [-]poststarthook/authorization.openshift.io-bootstrapclusterroles failed: reason withheld Feb 26 15:46:25 crc kubenswrapper[4907]: [-]poststarthook/authorization.openshift.io-ensurenodebootstrap-sa failed: reason withheld Feb 26 15:46:25 crc kubenswrapper[4907]: [+]poststarthook/project.openshift.io-projectcache ok Feb 26 15:46:25 crc kubenswrapper[4907]: [+]poststarthook/project.openshift.io-projectauthorizationcache ok Feb 26 15:46:25 crc kubenswrapper[4907]: [+]poststarthook/openshift.io-startinformers ok Feb 26 15:46:25 crc kubenswrapper[4907]: [+]poststarthook/openshift.io-restmapperupdater ok Feb 26 15:46:25 crc 
kubenswrapper[4907]: [+]poststarthook/quota.openshift.io-clusterquotamapping ok Feb 26 15:46:25 crc kubenswrapper[4907]: livez check failed Feb 26 15:46:25 crc kubenswrapper[4907]: I0226 15:46:25.119111 4907 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-apiserver/apiserver-76f77b778f-5tc4m" podUID="f1a111d0-85de-4328-90ac-9b9af3edbc49" containerName="openshift-apiserver" probeResult="failure" output="HTTP probe failed with statuscode: 500" Feb 26 15:46:25 crc kubenswrapper[4907]: I0226 15:46:25.173931 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 26 15:46:25 crc kubenswrapper[4907]: E0226 15:46:25.174189 4907 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-26 15:46:25.674175659 +0000 UTC m=+248.192737508 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 26 15:46:25 crc kubenswrapper[4907]: I0226 15:46:25.275074 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-kqtml\" (UID: \"0fefaf3e-d327-41f8-bbbe-94b051a63b19\") " pod="openshift-image-registry/image-registry-697d97f7c8-kqtml" Feb 26 15:46:25 crc kubenswrapper[4907]: E0226 15:46:25.275373 4907 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-26 15:46:25.775362421 +0000 UTC m=+248.293924270 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-kqtml" (UID: "0fefaf3e-d327-41f8-bbbe-94b051a63b19") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 26 15:46:25 crc kubenswrapper[4907]: I0226 15:46:25.375852 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 26 15:46:25 crc kubenswrapper[4907]: E0226 15:46:25.376721 4907 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-26 15:46:25.876706827 +0000 UTC m=+248.395268676 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 26 15:46:25 crc kubenswrapper[4907]: I0226 15:46:25.481625 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-kqtml\" (UID: \"0fefaf3e-d327-41f8-bbbe-94b051a63b19\") " pod="openshift-image-registry/image-registry-697d97f7c8-kqtml" Feb 26 15:46:25 crc kubenswrapper[4907]: E0226 15:46:25.481969 4907 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-26 15:46:25.981954436 +0000 UTC m=+248.500516285 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-kqtml" (UID: "0fefaf3e-d327-41f8-bbbe-94b051a63b19") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 26 15:46:25 crc kubenswrapper[4907]: I0226 15:46:25.589087 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 26 15:46:25 crc kubenswrapper[4907]: E0226 15:46:25.589721 4907 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-26 15:46:26.089707195 +0000 UTC m=+248.608269044 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 26 15:46:25 crc kubenswrapper[4907]: I0226 15:46:25.692573 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-kqtml\" (UID: \"0fefaf3e-d327-41f8-bbbe-94b051a63b19\") " pod="openshift-image-registry/image-registry-697d97f7c8-kqtml" Feb 26 15:46:25 crc kubenswrapper[4907]: E0226 15:46:25.692927 4907 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-26 15:46:26.192916616 +0000 UTC m=+248.711478465 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-kqtml" (UID: "0fefaf3e-d327-41f8-bbbe-94b051a63b19") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 26 15:46:25 crc kubenswrapper[4907]: I0226 15:46:25.720809 4907 patch_prober.go:28] interesting pod/router-default-5444994796-hqs2t container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Feb 26 15:46:25 crc kubenswrapper[4907]: [-]has-synced failed: reason withheld Feb 26 15:46:25 crc kubenswrapper[4907]: [+]process-running ok Feb 26 15:46:25 crc kubenswrapper[4907]: healthz check failed Feb 26 15:46:25 crc kubenswrapper[4907]: I0226 15:46:25.720887 4907 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-hqs2t" podUID="def12a12-3cf0-4694-a957-3e69aa18f880" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Feb 26 15:46:25 crc kubenswrapper[4907]: I0226 15:46:25.793892 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 26 15:46:25 crc kubenswrapper[4907]: E0226 15:46:25.794170 4907 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. 
No retries permitted until 2026-02-26 15:46:26.294143669 +0000 UTC m=+248.812705518 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 26 15:46:25 crc kubenswrapper[4907]: I0226 15:46:25.794331 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-kqtml\" (UID: \"0fefaf3e-d327-41f8-bbbe-94b051a63b19\") " pod="openshift-image-registry/image-registry-697d97f7c8-kqtml" Feb 26 15:46:25 crc kubenswrapper[4907]: E0226 15:46:25.794644 4907 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-26 15:46:26.294637511 +0000 UTC m=+248.813199360 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-kqtml" (UID: "0fefaf3e-d327-41f8-bbbe-94b051a63b19") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 26 15:46:25 crc kubenswrapper[4907]: I0226 15:46:25.894863 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 26 15:46:25 crc kubenswrapper[4907]: E0226 15:46:25.895120 4907 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-26 15:46:26.395091986 +0000 UTC m=+248.913653825 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 26 15:46:25 crc kubenswrapper[4907]: I0226 15:46:25.895452 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-kqtml\" (UID: \"0fefaf3e-d327-41f8-bbbe-94b051a63b19\") " pod="openshift-image-registry/image-registry-697d97f7c8-kqtml" Feb 26 15:46:25 crc kubenswrapper[4907]: E0226 15:46:25.895742 4907 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-26 15:46:26.395731372 +0000 UTC m=+248.914293221 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-kqtml" (UID: "0fefaf3e-d327-41f8-bbbe-94b051a63b19") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 26 15:46:25 crc kubenswrapper[4907]: I0226 15:46:25.979007 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-z86sf" event={"ID":"2ec425f0-76a0-445f-8d38-a4f125da3312","Type":"ContainerStarted","Data":"8ac68e1f3a905fc32665741aaba3d688936f057701d01e4e219004fb8771b1ed"} Feb 26 15:46:25 crc kubenswrapper[4907]: I0226 15:46:25.979052 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-z86sf" event={"ID":"2ec425f0-76a0-445f-8d38-a4f125da3312","Type":"ContainerStarted","Data":"fcca8da5978c58e3b57dc4e232e4fecae52f8c81cd114bf7939895027b037086"} Feb 26 15:46:25 crc kubenswrapper[4907]: I0226 15:46:25.980133 4907 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-z86sf" Feb 26 15:46:25 crc kubenswrapper[4907]: I0226 15:46:25.985036 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-6g628" event={"ID":"6e6248c0-ae25-48db-9112-ffeb9f9ca6a2","Type":"ContainerStarted","Data":"469271f341a8ae2fd768a1b30769330da6b7df256c6d2c86bfa7b4f81dd81f2a"} Feb 26 15:46:25 crc kubenswrapper[4907]: I0226 15:46:25.985067 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-6g628" event={"ID":"6e6248c0-ae25-48db-9112-ffeb9f9ca6a2","Type":"ContainerStarted","Data":"db7142aeaf8bd0e493ce280ec1f9a80e46f3aad72ba8affb58b14a8d5f9e895d"} Feb 26 15:46:25 crc 
kubenswrapper[4907]: I0226 15:46:25.985519 4907 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-dns/dns-default-6g628" Feb 26 15:46:25 crc kubenswrapper[4907]: I0226 15:46:25.986843 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-controller-84d6567774-rsw5p" event={"ID":"b71208e6-41a8-44a3-a8fd-7380b1da6ffa","Type":"ContainerStarted","Data":"25f438e7a55620a8360234a6b75f322cc61369fe481d17075620d63a32c791eb"} Feb 26 15:46:25 crc kubenswrapper[4907]: I0226 15:46:25.986868 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-controller-84d6567774-rsw5p" event={"ID":"b71208e6-41a8-44a3-a8fd-7380b1da6ffa","Type":"ContainerStarted","Data":"66dbe54544e79a747aab4481d912a591d7195fa2c635108c81ec95d1b63dfa1b"} Feb 26 15:46:25 crc kubenswrapper[4907]: I0226 15:46:25.994086 4907 patch_prober.go:28] interesting pod/catalog-operator-68c6474976-z86sf container/catalog-operator namespace/openshift-operator-lifecycle-manager: Readiness probe status=failure output="Get \"https://10.217.0.36:8443/healthz\": dial tcp 10.217.0.36:8443: connect: connection refused" start-of-body= Feb 26 15:46:25 crc kubenswrapper[4907]: I0226 15:46:25.994155 4907 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-z86sf" podUID="2ec425f0-76a0-445f-8d38-a4f125da3312" containerName="catalog-operator" probeResult="failure" output="Get \"https://10.217.0.36:8443/healthz\": dial tcp 10.217.0.36:8443: connect: connection refused" Feb 26 15:46:25 crc kubenswrapper[4907]: I0226 15:46:25.995875 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: 
\"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 26 15:46:25 crc kubenswrapper[4907]: E0226 15:46:25.996525 4907 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-26 15:46:26.496497394 +0000 UTC m=+249.015059243 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 26 15:46:26 crc kubenswrapper[4907]: I0226 15:46:26.029806 4907 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-z86sf" podStartSLOduration=191.029792045 podStartE2EDuration="3m11.029792045s" podCreationTimestamp="2026-02-26 15:43:15 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-26 15:46:26.028808562 +0000 UTC m=+248.547370411" watchObservedRunningTime="2026-02-26 15:46:26.029792045 +0000 UTC m=+248.548353884" Feb 26 15:46:26 crc kubenswrapper[4907]: I0226 15:46:26.038951 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-operator-74547568cd-4z9rn" event={"ID":"317291a5-1f7f-4d5a-8779-7c769dae2bc5","Type":"ContainerStarted","Data":"eccc381205057d2bc23ad336901264b7aafc66203a1feb3a0f9afb40a17d4ab0"} Feb 26 15:46:26 crc kubenswrapper[4907]: I0226 15:46:26.038980 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-machine-config-operator/machine-config-operator-74547568cd-4z9rn" event={"ID":"317291a5-1f7f-4d5a-8779-7c769dae2bc5","Type":"ContainerStarted","Data":"a30892ec1152575e90b20559fab0fcbd4d2a359d1a9c98a93628c250f21b124a"} Feb 26 15:46:26 crc kubenswrapper[4907]: I0226 15:46:26.055225 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-hztww" event={"ID":"8b678693-5390-4ce1-bf51-a2da37343241","Type":"ContainerStarted","Data":"e94e73531f5fae3596dcf15a237d9e50e0b5fc7c1fee2d7291ec88ec8675436e"} Feb 26 15:46:26 crc kubenswrapper[4907]: I0226 15:46:26.064325 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-ks676" event={"ID":"6aa04e41-18ce-4928-b012-ae804b9cfafc","Type":"ContainerStarted","Data":"914774c1b697598a59d81d603883ab696bcf8b3d6ca19ebd3af664ea6c3ea67b"} Feb 26 15:46:26 crc kubenswrapper[4907]: I0226 15:46:26.097792 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-kqtml\" (UID: \"0fefaf3e-d327-41f8-bbbe-94b051a63b19\") " pod="openshift-image-registry/image-registry-697d97f7c8-kqtml" Feb 26 15:46:26 crc kubenswrapper[4907]: E0226 15:46:26.098235 4907 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-26 15:46:26.59821929 +0000 UTC m=+249.116781139 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-kqtml" (UID: "0fefaf3e-d327-41f8-bbbe-94b051a63b19") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 26 15:46:26 crc kubenswrapper[4907]: I0226 15:46:26.098790 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-k2mmn" event={"ID":"101ef487-124a-40ce-bf7d-8b7efcab6765","Type":"ContainerStarted","Data":"253238aecce478e81a8e6371ecc03a683f8391e7fca19c34694c784c0f3521ee"} Feb 26 15:46:26 crc kubenswrapper[4907]: I0226 15:46:26.098845 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-k2mmn" event={"ID":"101ef487-124a-40ce-bf7d-8b7efcab6765","Type":"ContainerStarted","Data":"6cdf611ddb79e728fd02e5b5bc656caf4c10620293dbfea01f9c59c590d6a918"} Feb 26 15:46:26 crc kubenswrapper[4907]: I0226 15:46:26.105008 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29535346-hhrww" event={"ID":"c6986b68-4a8d-4677-bed1-493eb1a231c3","Type":"ContainerStarted","Data":"f84d2a2512a848bb2ee926d0b7687b3477982a665dff70d90763444b7b73d1ea"} Feb 26 15:46:26 crc kubenswrapper[4907]: I0226 15:46:26.113708 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd-operator/etcd-operator-b45778765-8wmgt" event={"ID":"ed605a31-991f-4fcc-a861-3bfe94c7b92c","Type":"ContainerStarted","Data":"04c979b06635a2a26df4df88a00ebd13f779efbe116a8b852b50601e20af8e43"} Feb 26 15:46:26 crc kubenswrapper[4907]: I0226 15:46:26.127150 4907 pod_startup_latency_tracker.go:104] "Observed pod startup 
duration" pod="openshift-machine-config-operator/machine-config-controller-84d6567774-rsw5p" podStartSLOduration=191.127131836 podStartE2EDuration="3m11.127131836s" podCreationTimestamp="2026-02-26 15:43:15 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-26 15:46:26.069848956 +0000 UTC m=+248.588410805" watchObservedRunningTime="2026-02-26 15:46:26.127131836 +0000 UTC m=+248.645693675" Feb 26 15:46:26 crc kubenswrapper[4907]: I0226 15:46:26.150997 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-admission-controller-857f4d67dd-fd8f2" event={"ID":"2a0a1c34-d485-449a-86c9-8c4631a023b5","Type":"ContainerStarted","Data":"c91a0fe76e11b9ee4dbe5f246b69e5fa56dfa4aa2d85e5efd4f647cec82cb828"} Feb 26 15:46:26 crc kubenswrapper[4907]: I0226 15:46:26.151047 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-admission-controller-857f4d67dd-fd8f2" event={"ID":"2a0a1c34-d485-449a-86c9-8c4631a023b5","Type":"ContainerStarted","Data":"5623e824f33eb93a1db6f6a466c2e740d88d19667faa6b5e4f055a6ca08251a7"} Feb 26 15:46:26 crc kubenswrapper[4907]: I0226 15:46:26.151145 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca/service-ca-9c57cc56f-z5rgk" event={"ID":"14efe72b-80f8-4748-bcc9-e4f20a7eb28e","Type":"ContainerStarted","Data":"89187fa769e0c1726a735135338eefe2e1a0716f4d926873b59890d89584a44f"} Feb 26 15:46:26 crc kubenswrapper[4907]: I0226 15:46:26.151190 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca/service-ca-9c57cc56f-z5rgk" event={"ID":"14efe72b-80f8-4748-bcc9-e4f20a7eb28e","Type":"ContainerStarted","Data":"2f3505cf2ca81fb11e9718af2402f222c700d762cd6307de8002229b4f081d91"} Feb 26 15:46:26 crc kubenswrapper[4907]: I0226 15:46:26.155969 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-8fhkw" event={"ID":"87018111-567b-4b30-a141-f20b606728e9","Type":"ContainerStarted","Data":"e6cf2ee691d1f69a16d9d3238c07fc31827db55cbb89e10b5d61f73385956715"} Feb 26 15:46:26 crc kubenswrapper[4907]: I0226 15:46:26.156044 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-8fhkw" event={"ID":"87018111-567b-4b30-a141-f20b606728e9","Type":"ContainerStarted","Data":"1dee31c3329fc5ed64c11a8cd7ded20a9356773bb8e408f3250d4b9957be2503"} Feb 26 15:46:26 crc kubenswrapper[4907]: I0226 15:46:26.163855 4907 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-dns/dns-default-6g628" podStartSLOduration=8.163839807 podStartE2EDuration="8.163839807s" podCreationTimestamp="2026-02-26 15:46:18 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-26 15:46:26.129078573 +0000 UTC m=+248.647640422" watchObservedRunningTime="2026-02-26 15:46:26.163839807 +0000 UTC m=+248.682401656" Feb 26 15:46:26 crc kubenswrapper[4907]: I0226 15:46:26.165089 4907 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-hztww" podStartSLOduration=191.165082717 podStartE2EDuration="3m11.165082717s" podCreationTimestamp="2026-02-26 15:43:15 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-26 15:46:26.164921153 +0000 UTC m=+248.683483012" watchObservedRunningTime="2026-02-26 15:46:26.165082717 +0000 UTC m=+248.683644566" Feb 26 15:46:26 crc kubenswrapper[4907]: I0226 15:46:26.199208 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: 
\"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 26 15:46:26 crc kubenswrapper[4907]: I0226 15:46:26.199500 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 26 15:46:26 crc kubenswrapper[4907]: I0226 15:46:26.199550 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 26 15:46:26 crc kubenswrapper[4907]: I0226 15:46:26.199614 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 26 15:46:26 crc kubenswrapper[4907]: I0226 15:46:26.199649 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 26 15:46:26 crc kubenswrapper[4907]: E0226 
15:46:26.200476 4907 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-26 15:46:26.700462098 +0000 UTC m=+249.219023947 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 26 15:46:26 crc kubenswrapper[4907]: I0226 15:46:26.201781 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29535344-fsndq" event={"ID":"1b0532e1-9350-435d-bb1f-72bb0931a2e8","Type":"ContainerStarted","Data":"047527ef54d878e044c11075aca9d5dfdd97144eeef7800d6cc1101ee73d4379"} Feb 26 15:46:26 crc kubenswrapper[4907]: I0226 15:46:26.202953 4907 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-diagnostics"/"kube-root-ca.crt" Feb 26 15:46:26 crc kubenswrapper[4907]: I0226 15:46:26.203134 4907 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-network-console"/"networking-console-plugin-cert" Feb 26 15:46:26 crc kubenswrapper[4907]: I0226 15:46:26.203280 4907 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-console"/"networking-console-plugin" Feb 26 15:46:26 crc kubenswrapper[4907]: I0226 15:46:26.213028 4907 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-vnrdg" Feb 26 15:46:26 crc kubenswrapper[4907]: I0226 15:46:26.217680 4907 operation_generator.go:637] "MountVolume.SetUp 
succeeded for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 26 15:46:26 crc kubenswrapper[4907]: I0226 15:46:26.221344 4907 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-diagnostics"/"openshift-service-ca.crt" Feb 26 15:46:26 crc kubenswrapper[4907]: I0226 15:46:26.221547 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-sw4qw" event={"ID":"a766dd26-3d8c-464c-b873-f03d3895b9d1","Type":"ContainerStarted","Data":"48178b5a99284d766dc5af56f6c3a561707216bf07b28bdf05e20ca4c1dd166f"} Feb 26 15:46:26 crc kubenswrapper[4907]: I0226 15:46:26.221618 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-sw4qw" event={"ID":"a766dd26-3d8c-464c-b873-f03d3895b9d1","Type":"ContainerStarted","Data":"07451e7d52c5a8f1d13ab844eaac9632f5b790303a7c65bdf1b1e592c7f739fe"} Feb 26 15:46:26 crc kubenswrapper[4907]: I0226 15:46:26.222370 4907 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-sw4qw" Feb 26 15:46:26 crc kubenswrapper[4907]: I0226 15:46:26.225996 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 26 15:46:26 crc kubenswrapper[4907]: I0226 15:46:26.236054 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 26 15:46:26 crc kubenswrapper[4907]: I0226 15:46:26.239512 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-rlmpn" event={"ID":"70ad9c23-ce1d-4b1a-979d-08d20761353e","Type":"ContainerStarted","Data":"625b11e2373c767b0c48a254f491e90e43f3c71303a2dd4c7f609d0b91bc35ae"} Feb 26 15:46:26 crc kubenswrapper[4907]: I0226 15:46:26.239548 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-rlmpn" event={"ID":"70ad9c23-ce1d-4b1a-979d-08d20761353e","Type":"ContainerStarted","Data":"327d92a300b05342e72380777975961bf516630846aca17ef093e3bb59055147"} Feb 26 15:46:26 crc kubenswrapper[4907]: I0226 15:46:26.243096 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 26 15:46:26 crc kubenswrapper[4907]: I0226 15:46:26.264928 4907 patch_prober.go:28] interesting pod/olm-operator-6b444d44fb-sw4qw container/olm-operator namespace/openshift-operator-lifecycle-manager: Readiness probe status=failure output="Get \"https://10.217.0.42:8443/healthz\": dial tcp 10.217.0.42:8443: connect: connection refused" start-of-body= Feb 26 15:46:26 crc kubenswrapper[4907]: I0226 15:46:26.264988 4907 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-sw4qw" 
podUID="a766dd26-3d8c-464c-b873-f03d3895b9d1" containerName="olm-operator" probeResult="failure" output="Get \"https://10.217.0.42:8443/healthz\": dial tcp 10.217.0.42:8443: connect: connection refused" Feb 26 15:46:26 crc kubenswrapper[4907]: I0226 15:46:26.274287 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29535345-b7r88" event={"ID":"0dd74211-40c2-437c-9295-b69e709f81fe","Type":"ContainerStarted","Data":"dbb0a17c19b0ecd0029d1ab15137ff5e45d1ec47832ed90912f8f2b1f23fb7d1"} Feb 26 15:46:26 crc kubenswrapper[4907]: I0226 15:46:26.300606 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-kqtml\" (UID: \"0fefaf3e-d327-41f8-bbbe-94b051a63b19\") " pod="openshift-image-registry/image-registry-697d97f7c8-kqtml" Feb 26 15:46:26 crc kubenswrapper[4907]: E0226 15:46:26.301670 4907 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-26 15:46:26.80165614 +0000 UTC m=+249.320217989 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-kqtml" (UID: "0fefaf3e-d327-41f8-bbbe-94b051a63b19") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 26 15:46:26 crc kubenswrapper[4907]: I0226 15:46:26.311874 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns-operator/dns-operator-744455d44c-tvpcl" event={"ID":"2305f4ed-b155-4e30-b83c-7dde9bec7b28","Type":"ContainerStarted","Data":"6c1fc4e2a1bba2e63ce230d7fabebd9671228da90242d20e8aeaa5e9bf201936"} Feb 26 15:46:26 crc kubenswrapper[4907]: I0226 15:46:26.319250 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca-operator/service-ca-operator-777779d784-gvx2f" event={"ID":"80e13006-b114-4c3f-8669-62afc695914b","Type":"ContainerStarted","Data":"d780b7ee2fd53e0626777a3f4ec705bcb94917d0505a72df1a6dea28c2e9c40f"} Feb 26 15:46:26 crc kubenswrapper[4907]: I0226 15:46:26.319309 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca-operator/service-ca-operator-777779d784-gvx2f" event={"ID":"80e13006-b114-4c3f-8669-62afc695914b","Type":"ContainerStarted","Data":"6f0c64387c61dbba61fc875c0e910e7e5513c95a58e12ad6d3e49256a7d641b5"} Feb 26 15:46:26 crc kubenswrapper[4907]: I0226 15:46:26.319499 4907 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-k2mmn" podStartSLOduration=191.319485494 podStartE2EDuration="3m11.319485494s" podCreationTimestamp="2026-02-26 15:43:15 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-26 15:46:26.250966717 
+0000 UTC m=+248.769528566" watchObservedRunningTime="2026-02-26 15:46:26.319485494 +0000 UTC m=+248.838047343" Feb 26 15:46:26 crc kubenswrapper[4907]: I0226 15:46:26.339961 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-w9nx4" event={"ID":"af8aa9df-432b-40bd-847c-c3539b32cb59","Type":"ContainerStarted","Data":"c2abc27cd4f584d109f6842e09691001084f47750447793d53c6a68488651ce6"} Feb 26 15:46:26 crc kubenswrapper[4907]: I0226 15:46:26.373470 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-operator/ingress-operator-5b745b69d9-gnh8z" event={"ID":"f1e8a3f9-9de9-4181-869f-9fce597e6b5b","Type":"ContainerStarted","Data":"a7b5401101f0b2eae7765c9b82df664e3c75eeaad46bf75a47fb355e2b0cd0f8"} Feb 26 15:46:26 crc kubenswrapper[4907]: I0226 15:46:26.374318 4907 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/machine-config-operator-74547568cd-4z9rn" podStartSLOduration=191.374308976 podStartE2EDuration="3m11.374308976s" podCreationTimestamp="2026-02-26 15:43:15 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-26 15:46:26.372221215 +0000 UTC m=+248.890783064" watchObservedRunningTime="2026-02-26 15:46:26.374308976 +0000 UTC m=+248.892870815" Feb 26 15:46:26 crc kubenswrapper[4907]: I0226 15:46:26.374474 4907 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-ks676" podStartSLOduration=191.374467309 podStartE2EDuration="3m11.374467309s" podCreationTimestamp="2026-02-26 15:43:15 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-26 15:46:26.340118873 +0000 UTC m=+248.858680722" watchObservedRunningTime="2026-02-26 
15:46:26.374467309 +0000 UTC m=+248.893029158" Feb 26 15:46:26 crc kubenswrapper[4907]: I0226 15:46:26.389962 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="hostpath-provisioner/csi-hostpathplugin-l5fqj" event={"ID":"d01c15cd-3103-49df-afdd-e6f6d6f35716","Type":"ContainerStarted","Data":"b86c3e9396f72191e86f24aaf82ebd038730a1f1b09e307367174239811dcdb2"} Feb 26 15:46:26 crc kubenswrapper[4907]: I0226 15:46:26.398715 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-z6m64" event={"ID":"73dbdcc4-3b7d-4593-a8e1-b15f824b5670","Type":"ContainerStarted","Data":"e4b8764b501cdd3ee8e591a805852d2476ffa0bc383605aed6d17a8342d40b88"} Feb 26 15:46:26 crc kubenswrapper[4907]: I0226 15:46:26.398759 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-z6m64" event={"ID":"73dbdcc4-3b7d-4593-a8e1-b15f824b5670","Type":"ContainerStarted","Data":"7c03a862a5d17647f9ff0acc36de4a720c3317a30ec7c43210464de169caed33"} Feb 26 15:46:26 crc kubenswrapper[4907]: I0226 15:46:26.399930 4907 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-z6m64" Feb 26 15:46:26 crc kubenswrapper[4907]: I0226 15:46:26.402129 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 26 15:46:26 crc kubenswrapper[4907]: E0226 15:46:26.403223 4907 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. 
No retries permitted until 2026-02-26 15:46:26.903205101 +0000 UTC m=+249.421766950 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 26 15:46:26 crc kubenswrapper[4907]: I0226 15:46:26.414355 4907 patch_prober.go:28] interesting pod/packageserver-d55dfcdfc-z6m64 container/packageserver namespace/openshift-operator-lifecycle-manager: Readiness probe status=failure output="Get \"https://10.217.0.33:5443/healthz\": dial tcp 10.217.0.33:5443: connect: connection refused" start-of-body= Feb 26 15:46:26 crc kubenswrapper[4907]: I0226 15:46:26.414408 4907 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-z6m64" podUID="73dbdcc4-3b7d-4593-a8e1-b15f824b5670" containerName="packageserver" probeResult="failure" output="Get \"https://10.217.0.33:5443/healthz\": dial tcp 10.217.0.33:5443: connect: connection refused" Feb 26 15:46:26 crc kubenswrapper[4907]: I0226 15:46:26.432432 4907 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-8fhkw" podStartSLOduration=191.432414055 podStartE2EDuration="3m11.432414055s" podCreationTimestamp="2026-02-26 15:43:15 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-26 15:46:26.431103054 +0000 UTC m=+248.949664903" watchObservedRunningTime="2026-02-26 15:46:26.432414055 +0000 UTC m=+248.950975894" Feb 26 15:46:26 crc kubenswrapper[4907]: I0226 15:46:26.447830 4907 
util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 26 15:46:26 crc kubenswrapper[4907]: I0226 15:46:26.449346 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-gtnvm" event={"ID":"05322669-16de-41ca-9ae9-3580b5cdda05","Type":"ContainerStarted","Data":"4f4a07d88f7162cdbcea246a84ca501bcb7337f842fd3295df48270a4af5b320"} Feb 26 15:46:26 crc kubenswrapper[4907]: I0226 15:46:26.449379 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-gtnvm" event={"ID":"05322669-16de-41ca-9ae9-3580b5cdda05","Type":"ContainerStarted","Data":"fe760377e01aa7ccf6a3f1282f45976bf055ce9a85b30b0f342738fd1d37279e"} Feb 26 15:46:26 crc kubenswrapper[4907]: I0226 15:46:26.450978 4907 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/marketplace-operator-79b997595-dvcn5" Feb 26 15:46:26 crc kubenswrapper[4907]: I0226 15:46:26.451896 4907 patch_prober.go:28] interesting pod/downloads-7954f5f757-wcgj6 container/download-server namespace/openshift-console: Readiness probe status=failure output="Get \"http://10.217.0.15:8080/\": dial tcp 10.217.0.15:8080: connect: connection refused" start-of-body= Feb 26 15:46:26 crc kubenswrapper[4907]: I0226 15:46:26.451933 4907 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console/downloads-7954f5f757-wcgj6" podUID="2e969445-2d6b-4ea1-bd4b-3473a66e8c91" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.15:8080/\": dial tcp 10.217.0.15:8080: connect: connection refused" Feb 26 15:46:26 crc kubenswrapper[4907]: I0226 15:46:26.452072 4907 patch_prober.go:28] interesting pod/console-operator-58897d9998-sjflz container/console-operator namespace/openshift-console-operator: Readiness probe status=failure output="Get 
\"https://10.217.0.12:8443/readyz\": dial tcp 10.217.0.12:8443: connect: connection refused" start-of-body= Feb 26 15:46:26 crc kubenswrapper[4907]: I0226 15:46:26.452117 4907 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console-operator/console-operator-58897d9998-sjflz" podUID="e9aeee88-40a0-4c8a-aebf-680cf878f42e" containerName="console-operator" probeResult="failure" output="Get \"https://10.217.0.12:8443/readyz\": dial tcp 10.217.0.12:8443: connect: connection refused" Feb 26 15:46:26 crc kubenswrapper[4907]: I0226 15:46:26.469950 4907 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 26 15:46:26 crc kubenswrapper[4907]: I0226 15:46:26.470537 4907 patch_prober.go:28] interesting pod/marketplace-operator-79b997595-dvcn5 container/marketplace-operator namespace/openshift-marketplace: Readiness probe status=failure output="Get \"http://10.217.0.29:8080/healthz\": dial tcp 10.217.0.29:8080: connect: connection refused" start-of-body= Feb 26 15:46:26 crc kubenswrapper[4907]: I0226 15:46:26.470570 4907 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-marketplace/marketplace-operator-79b997595-dvcn5" podUID="23df369e-238f-4fbc-99fa-b22c21011db0" containerName="marketplace-operator" probeResult="failure" output="Get \"http://10.217.0.29:8080/healthz\": dial tcp 10.217.0.29:8080: connect: connection refused" Feb 26 15:46:26 crc kubenswrapper[4907]: I0226 15:46:26.471801 4907 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-sw4qw" podStartSLOduration=191.47178582 podStartE2EDuration="3m11.47178582s" podCreationTimestamp="2026-02-26 15:43:15 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-26 15:46:26.469448645 +0000 UTC m=+248.988010484" 
watchObservedRunningTime="2026-02-26 15:46:26.47178582 +0000 UTC m=+248.990347659" Feb 26 15:46:26 crc kubenswrapper[4907]: I0226 15:46:26.474800 4907 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 26 15:46:26 crc kubenswrapper[4907]: I0226 15:46:26.493940 4907 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-vnrdg" Feb 26 15:46:26 crc kubenswrapper[4907]: I0226 15:46:26.504081 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-kqtml\" (UID: \"0fefaf3e-d327-41f8-bbbe-94b051a63b19\") " pod="openshift-image-registry/image-registry-697d97f7c8-kqtml" Feb 26 15:46:26 crc kubenswrapper[4907]: E0226 15:46:26.507712 4907 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-26 15:46:27.007700513 +0000 UTC m=+249.526262362 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-kqtml" (UID: "0fefaf3e-d327-41f8-bbbe-94b051a63b19") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 26 15:46:26 crc kubenswrapper[4907]: I0226 15:46:26.607182 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 26 15:46:26 crc kubenswrapper[4907]: E0226 15:46:26.608639 4907 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-26 15:46:27.108622969 +0000 UTC m=+249.627184818 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 26 15:46:26 crc kubenswrapper[4907]: I0226 15:46:26.709575 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-kqtml\" (UID: \"0fefaf3e-d327-41f8-bbbe-94b051a63b19\") " pod="openshift-image-registry/image-registry-697d97f7c8-kqtml" Feb 26 15:46:26 crc kubenswrapper[4907]: E0226 15:46:26.709948 4907 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-26 15:46:27.209936374 +0000 UTC m=+249.728498223 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-kqtml" (UID: "0fefaf3e-d327-41f8-bbbe-94b051a63b19") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 26 15:46:26 crc kubenswrapper[4907]: I0226 15:46:26.716852 4907 patch_prober.go:28] interesting pod/router-default-5444994796-hqs2t container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Feb 26 15:46:26 crc kubenswrapper[4907]: [-]has-synced failed: reason withheld Feb 26 15:46:26 crc kubenswrapper[4907]: [+]process-running ok Feb 26 15:46:26 crc kubenswrapper[4907]: healthz check failed Feb 26 15:46:26 crc kubenswrapper[4907]: I0226 15:46:26.716907 4907 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-hqs2t" podUID="def12a12-3cf0-4694-a957-3e69aa18f880" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Feb 26 15:46:26 crc kubenswrapper[4907]: I0226 15:46:26.810273 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 26 15:46:26 crc kubenswrapper[4907]: E0226 15:46:26.810570 4907 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. 
No retries permitted until 2026-02-26 15:46:27.310555364 +0000 UTC m=+249.829117213 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 26 15:46:26 crc kubenswrapper[4907]: I0226 15:46:26.812306 4907 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-dns-operator/dns-operator-744455d44c-tvpcl" podStartSLOduration=191.812278345 podStartE2EDuration="3m11.812278345s" podCreationTimestamp="2026-02-26 15:43:15 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-26 15:46:26.811038836 +0000 UTC m=+249.329600685" watchObservedRunningTime="2026-02-26 15:46:26.812278345 +0000 UTC m=+249.330840194" Feb 26 15:46:26 crc kubenswrapper[4907]: I0226 15:46:26.812453 4907 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-service-ca-operator/service-ca-operator-777779d784-gvx2f" podStartSLOduration=191.812449669 podStartE2EDuration="3m11.812449669s" podCreationTimestamp="2026-02-26 15:43:15 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-26 15:46:26.71856498 +0000 UTC m=+249.237126829" watchObservedRunningTime="2026-02-26 15:46:26.812449669 +0000 UTC m=+249.331011518" Feb 26 15:46:26 crc kubenswrapper[4907]: I0226 15:46:26.880530 4907 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/collect-profiles-29535345-b7r88" podStartSLOduration=86.880511365 
podStartE2EDuration="1m26.880511365s" podCreationTimestamp="2026-02-26 15:45:00 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-26 15:46:26.876809387 +0000 UTC m=+249.395371236" watchObservedRunningTime="2026-02-26 15:46:26.880511365 +0000 UTC m=+249.399073214" Feb 26 15:46:26 crc kubenswrapper[4907]: I0226 15:46:26.917476 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-kqtml\" (UID: \"0fefaf3e-d327-41f8-bbbe-94b051a63b19\") " pod="openshift-image-registry/image-registry-697d97f7c8-kqtml" Feb 26 15:46:26 crc kubenswrapper[4907]: E0226 15:46:26.917829 4907 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-26 15:46:27.417817501 +0000 UTC m=+249.936379350 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-kqtml" (UID: "0fefaf3e-d327-41f8-bbbe-94b051a63b19") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 26 15:46:27 crc kubenswrapper[4907]: I0226 15:46:27.018918 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 26 15:46:27 crc kubenswrapper[4907]: E0226 15:46:27.019160 4907 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-26 15:46:27.519143546 +0000 UTC m=+250.037705395 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 26 15:46:27 crc kubenswrapper[4907]: I0226 15:46:27.019343 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-kqtml\" (UID: \"0fefaf3e-d327-41f8-bbbe-94b051a63b19\") " pod="openshift-image-registry/image-registry-697d97f7c8-kqtml" Feb 26 15:46:27 crc kubenswrapper[4907]: E0226 15:46:27.019661 4907 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-26 15:46:27.519653899 +0000 UTC m=+250.038215748 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-kqtml" (UID: "0fefaf3e-d327-41f8-bbbe-94b051a63b19") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 26 15:46:27 crc kubenswrapper[4907]: I0226 15:46:27.046733 4907 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-service-ca/service-ca-9c57cc56f-z5rgk" podStartSLOduration=192.046714831 podStartE2EDuration="3m12.046714831s" podCreationTimestamp="2026-02-26 15:43:15 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-26 15:46:26.953723104 +0000 UTC m=+249.472284953" watchObservedRunningTime="2026-02-26 15:46:27.046714831 +0000 UTC m=+249.565276680" Feb 26 15:46:27 crc kubenswrapper[4907]: I0226 15:46:27.046936 4907 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ingress-operator/ingress-operator-5b745b69d9-gnh8z" podStartSLOduration=192.046933136 podStartE2EDuration="3m12.046933136s" podCreationTimestamp="2026-02-26 15:43:15 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-26 15:46:27.045974613 +0000 UTC m=+249.564536462" watchObservedRunningTime="2026-02-26 15:46:27.046933136 +0000 UTC m=+249.565494985" Feb 26 15:46:27 crc kubenswrapper[4907]: I0226 15:46:27.103033 4907 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-z6m64" podStartSLOduration=192.103018518 podStartE2EDuration="3m12.103018518s" podCreationTimestamp="2026-02-26 15:43:15 +0000 UTC" firstStartedPulling="0001-01-01 
00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-26 15:46:27.10097372 +0000 UTC m=+249.619535569" watchObservedRunningTime="2026-02-26 15:46:27.103018518 +0000 UTC m=+249.621580367" Feb 26 15:46:27 crc kubenswrapper[4907]: I0226 15:46:27.120782 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 26 15:46:27 crc kubenswrapper[4907]: E0226 15:46:27.120926 4907 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-26 15:46:27.620905663 +0000 UTC m=+250.139467512 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 26 15:46:27 crc kubenswrapper[4907]: I0226 15:46:27.121030 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-kqtml\" (UID: \"0fefaf3e-d327-41f8-bbbe-94b051a63b19\") " pod="openshift-image-registry/image-registry-697d97f7c8-kqtml" Feb 26 15:46:27 crc kubenswrapper[4907]: E0226 15:46:27.121297 4907 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-26 15:46:27.621290072 +0000 UTC m=+250.139851921 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-kqtml" (UID: "0fefaf3e-d327-41f8-bbbe-94b051a63b19") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 26 15:46:27 crc kubenswrapper[4907]: I0226 15:46:27.189266 4907 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-gtnvm" podStartSLOduration=192.189248506 podStartE2EDuration="3m12.189248506s" podCreationTimestamp="2026-02-26 15:43:15 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-26 15:46:27.178695935 +0000 UTC m=+249.697257784" watchObservedRunningTime="2026-02-26 15:46:27.189248506 +0000 UTC m=+249.707810355" Feb 26 15:46:27 crc kubenswrapper[4907]: I0226 15:46:27.222319 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 26 15:46:27 crc kubenswrapper[4907]: E0226 15:46:27.222709 4907 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-26 15:46:27.72269432 +0000 UTC m=+250.241256169 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 26 15:46:27 crc kubenswrapper[4907]: I0226 15:46:27.278275 4907 ???:1] "http: TLS handshake error from 192.168.126.11:33958: no serving certificate available for the kubelet" Feb 26 15:46:27 crc kubenswrapper[4907]: I0226 15:46:27.323886 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-kqtml\" (UID: \"0fefaf3e-d327-41f8-bbbe-94b051a63b19\") " pod="openshift-image-registry/image-registry-697d97f7c8-kqtml" Feb 26 15:46:27 crc kubenswrapper[4907]: I0226 15:46:27.323961 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/fd06f422-2c09-4da9-843c-75525df52517-metrics-certs\") pod \"network-metrics-daemon-zsb5l\" (UID: \"fd06f422-2c09-4da9-843c-75525df52517\") " pod="openshift-multus/network-metrics-daemon-zsb5l" Feb 26 15:46:27 crc kubenswrapper[4907]: E0226 15:46:27.324295 4907 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-26 15:46:27.824284882 +0000 UTC m=+250.342846731 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-kqtml" (UID: "0fefaf3e-d327-41f8-bbbe-94b051a63b19") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 26 15:46:27 crc kubenswrapper[4907]: I0226 15:46:27.327503 4907 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"metrics-daemon-secret" Feb 26 15:46:27 crc kubenswrapper[4907]: I0226 15:46:27.341105 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/fd06f422-2c09-4da9-843c-75525df52517-metrics-certs\") pod \"network-metrics-daemon-zsb5l\" (UID: \"fd06f422-2c09-4da9-843c-75525df52517\") " pod="openshift-multus/network-metrics-daemon-zsb5l" Feb 26 15:46:27 crc kubenswrapper[4907]: I0226 15:46:27.357693 4907 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"metrics-daemon-sa-dockercfg-d427c" Feb 26 15:46:27 crc kubenswrapper[4907]: I0226 15:46:27.366633 4907 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-zsb5l" Feb 26 15:46:27 crc kubenswrapper[4907]: I0226 15:46:27.424735 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 26 15:46:27 crc kubenswrapper[4907]: E0226 15:46:27.424999 4907 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-26 15:46:27.924984323 +0000 UTC m=+250.443546162 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 26 15:46:27 crc kubenswrapper[4907]: I0226 15:46:27.530601 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="hostpath-provisioner/csi-hostpathplugin-l5fqj" event={"ID":"d01c15cd-3103-49df-afdd-e6f6d6f35716","Type":"ContainerStarted","Data":"05e67ba3afbf5b089a5840b45aa0e491809c0791021da53b9a08bf0a1648623a"} Feb 26 15:46:27 crc kubenswrapper[4907]: I0226 15:46:27.534170 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-kqtml\" (UID: 
\"0fefaf3e-d327-41f8-bbbe-94b051a63b19\") " pod="openshift-image-registry/image-registry-697d97f7c8-kqtml" Feb 26 15:46:27 crc kubenswrapper[4907]: E0226 15:46:27.534452 4907 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-26 15:46:28.034441682 +0000 UTC m=+250.553003521 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-kqtml" (UID: "0fefaf3e-d327-41f8-bbbe-94b051a63b19") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 26 15:46:27 crc kubenswrapper[4907]: I0226 15:46:27.539680 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-rlmpn" event={"ID":"70ad9c23-ce1d-4b1a-979d-08d20761353e","Type":"ContainerStarted","Data":"dfaeae2fdeae0feb3a6a1063b6580e00181fafbcacfaab5ac24d8fd99720a6e6"} Feb 26 15:46:27 crc kubenswrapper[4907]: I0226 15:46:27.540602 4907 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-rlmpn" Feb 26 15:46:27 crc kubenswrapper[4907]: I0226 15:46:27.559302 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-admission-controller-857f4d67dd-fd8f2" event={"ID":"2a0a1c34-d485-449a-86c9-8c4631a023b5","Type":"ContainerStarted","Data":"f7f42335d4224727aa30a67d8721189d0bfe6c713e740f756993a2c3e895cec5"} Feb 26 15:46:27 crc kubenswrapper[4907]: I0226 15:46:27.574115 4907 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-rlmpn" podStartSLOduration=192.574097644 podStartE2EDuration="3m12.574097644s" podCreationTimestamp="2026-02-26 15:43:15 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-26 15:46:27.571613415 +0000 UTC m=+250.090175264" watchObservedRunningTime="2026-02-26 15:46:27.574097644 +0000 UTC m=+250.092659493" Feb 26 15:46:27 crc kubenswrapper[4907]: I0226 15:46:27.588840 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns-operator/dns-operator-744455d44c-tvpcl" event={"ID":"2305f4ed-b155-4e30-b83c-7dde9bec7b28","Type":"ContainerStarted","Data":"8a2063a586d2154afa1722c0429ae6b564e5280d9a949ffafc7a27df8b0986ac"} Feb 26 15:46:27 crc kubenswrapper[4907]: I0226 15:46:27.592713 4907 patch_prober.go:28] interesting pod/marketplace-operator-79b997595-dvcn5 container/marketplace-operator namespace/openshift-marketplace: Readiness probe status=failure output="Get \"http://10.217.0.29:8080/healthz\": dial tcp 10.217.0.29:8080: connect: connection refused" start-of-body= Feb 26 15:46:27 crc kubenswrapper[4907]: I0226 15:46:27.592780 4907 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-marketplace/marketplace-operator-79b997595-dvcn5" podUID="23df369e-238f-4fbc-99fa-b22c21011db0" containerName="marketplace-operator" probeResult="failure" output="Get \"http://10.217.0.29:8080/healthz\": dial tcp 10.217.0.29:8080: connect: connection refused" Feb 26 15:46:27 crc kubenswrapper[4907]: I0226 15:46:27.610303 4907 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/multus-admission-controller-857f4d67dd-fd8f2" podStartSLOduration=192.610287583 podStartE2EDuration="3m12.610287583s" podCreationTimestamp="2026-02-26 15:43:15 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" 
observedRunningTime="2026-02-26 15:46:27.608995253 +0000 UTC m=+250.127557102" watchObservedRunningTime="2026-02-26 15:46:27.610287583 +0000 UTC m=+250.128849432" Feb 26 15:46:27 crc kubenswrapper[4907]: I0226 15:46:27.616580 4907 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-z86sf" Feb 26 15:46:27 crc kubenswrapper[4907]: I0226 15:46:27.616772 4907 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-sw4qw" Feb 26 15:46:27 crc kubenswrapper[4907]: I0226 15:46:27.641025 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 26 15:46:27 crc kubenswrapper[4907]: E0226 15:46:27.642567 4907 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-26 15:46:28.142552579 +0000 UTC m=+250.661114428 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 26 15:46:27 crc kubenswrapper[4907]: I0226 15:46:27.721467 4907 patch_prober.go:28] interesting pod/router-default-5444994796-hqs2t container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Feb 26 15:46:27 crc kubenswrapper[4907]: [-]has-synced failed: reason withheld Feb 26 15:46:27 crc kubenswrapper[4907]: [+]process-running ok Feb 26 15:46:27 crc kubenswrapper[4907]: healthz check failed Feb 26 15:46:27 crc kubenswrapper[4907]: I0226 15:46:27.721532 4907 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-hqs2t" podUID="def12a12-3cf0-4694-a957-3e69aa18f880" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Feb 26 15:46:27 crc kubenswrapper[4907]: I0226 15:46:27.746879 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-kqtml\" (UID: \"0fefaf3e-d327-41f8-bbbe-94b051a63b19\") " pod="openshift-image-registry/image-registry-697d97f7c8-kqtml" Feb 26 15:46:27 crc kubenswrapper[4907]: E0226 15:46:27.748307 4907 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. 
No retries permitted until 2026-02-26 15:46:28.24829526 +0000 UTC m=+250.766857109 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-kqtml" (UID: "0fefaf3e-d327-41f8-bbbe-94b051a63b19") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 26 15:46:27 crc kubenswrapper[4907]: I0226 15:46:27.848324 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 26 15:46:27 crc kubenswrapper[4907]: E0226 15:46:27.848692 4907 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-26 15:46:28.348678814 +0000 UTC m=+250.867240663 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 26 15:46:27 crc kubenswrapper[4907]: I0226 15:46:27.949604 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-kqtml\" (UID: \"0fefaf3e-d327-41f8-bbbe-94b051a63b19\") " pod="openshift-image-registry/image-registry-697d97f7c8-kqtml" Feb 26 15:46:27 crc kubenswrapper[4907]: E0226 15:46:27.949921 4907 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-26 15:46:28.449908947 +0000 UTC m=+250.968470796 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-kqtml" (UID: "0fefaf3e-d327-41f8-bbbe-94b051a63b19") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 26 15:46:28 crc kubenswrapper[4907]: I0226 15:46:28.050344 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 26 15:46:28 crc kubenswrapper[4907]: E0226 15:46:28.050650 4907 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-26 15:46:28.550635909 +0000 UTC m=+251.069197758 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 26 15:46:28 crc kubenswrapper[4907]: I0226 15:46:28.156450 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-kqtml\" (UID: \"0fefaf3e-d327-41f8-bbbe-94b051a63b19\") " pod="openshift-image-registry/image-registry-697d97f7c8-kqtml" Feb 26 15:46:28 crc kubenswrapper[4907]: E0226 15:46:28.156801 4907 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-26 15:46:28.65678886 +0000 UTC m=+251.175350709 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-kqtml" (UID: "0fefaf3e-d327-41f8-bbbe-94b051a63b19") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 26 15:46:28 crc kubenswrapper[4907]: I0226 15:46:28.257544 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 26 15:46:28 crc kubenswrapper[4907]: E0226 15:46:28.257965 4907 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-26 15:46:28.757949342 +0000 UTC m=+251.276511191 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 26 15:46:28 crc kubenswrapper[4907]: I0226 15:46:28.358537 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-kqtml\" (UID: \"0fefaf3e-d327-41f8-bbbe-94b051a63b19\") " pod="openshift-image-registry/image-registry-697d97f7c8-kqtml" Feb 26 15:46:28 crc kubenswrapper[4907]: E0226 15:46:28.358829 4907 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-26 15:46:28.858818597 +0000 UTC m=+251.377380446 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-kqtml" (UID: "0fefaf3e-d327-41f8-bbbe-94b051a63b19") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Feb 26 15:46:28 crc kubenswrapper[4907]: I0226 15:46:28.460146 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Feb 26 15:46:28 crc kubenswrapper[4907]: E0226 15:46:28.460666 4907 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-26 15:46:28.960638835 +0000 UTC m=+251.479200684 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Feb 26 15:46:28 crc kubenswrapper[4907]: I0226 15:46:28.561517 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-kqtml\" (UID: \"0fefaf3e-d327-41f8-bbbe-94b051a63b19\") " pod="openshift-image-registry/image-registry-697d97f7c8-kqtml"
Feb 26 15:46:28 crc kubenswrapper[4907]: E0226 15:46:28.561934 4907 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-26 15:46:29.06192295 +0000 UTC m=+251.580484789 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-kqtml" (UID: "0fefaf3e-d327-41f8-bbbe-94b051a63b19") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Feb 26 15:46:28 crc kubenswrapper[4907]: I0226 15:46:28.590697 4907 patch_prober.go:28] interesting pod/packageserver-d55dfcdfc-z6m64 container/packageserver namespace/openshift-operator-lifecycle-manager: Readiness probe status=failure output="Get \"https://10.217.0.33:5443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body=
Feb 26 15:46:28 crc kubenswrapper[4907]: I0226 15:46:28.590761 4907 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-z6m64" podUID="73dbdcc4-3b7d-4593-a8e1-b15f824b5670" containerName="packageserver" probeResult="failure" output="Get \"https://10.217.0.33:5443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)"
Feb 26 15:46:28 crc kubenswrapper[4907]: I0226 15:46:28.590937 4907 patch_prober.go:28] interesting pod/console-operator-58897d9998-sjflz container/console-operator namespace/openshift-console-operator: Readiness probe status=failure output="Get \"https://10.217.0.12:8443/readyz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body=
Feb 26 15:46:28 crc kubenswrapper[4907]: I0226 15:46:28.591001 4907 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console-operator/console-operator-58897d9998-sjflz" podUID="e9aeee88-40a0-4c8a-aebf-680cf878f42e" containerName="console-operator" probeResult="failure" output="Get \"https://10.217.0.12:8443/readyz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)"
Feb 26 15:46:28 crc kubenswrapper[4907]: I0226 15:46:28.620892 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" event={"ID":"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8","Type":"ContainerStarted","Data":"39eae1fa953d3a3e039c840ce19589d14e7841f807bde47dec82917183777b49"}
Feb 26 15:46:28 crc kubenswrapper[4907]: I0226 15:46:28.628749 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" event={"ID":"3b6479f0-333b-4a96-9adf-2099afdc2447","Type":"ContainerStarted","Data":"fbfc95121a354d35ff3447c37af1ba563829649ce0c99216fc35ec39050101d2"}
Feb 26 15:46:28 crc kubenswrapper[4907]: I0226 15:46:28.628794 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" event={"ID":"3b6479f0-333b-4a96-9adf-2099afdc2447","Type":"ContainerStarted","Data":"10c73a5965cbc67cc128611b59d5bc6ad52512ab323ebd8f67ebbf173f6700e3"}
Feb 26 15:46:28 crc kubenswrapper[4907]: I0226 15:46:28.630859 4907 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-network-diagnostics/network-check-target-xd92c"
Feb 26 15:46:28 crc kubenswrapper[4907]: I0226 15:46:28.638892 4907 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-multus/network-metrics-daemon-zsb5l"]
Feb 26 15:46:28 crc kubenswrapper[4907]: I0226 15:46:28.663779 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Feb 26 15:46:28 crc kubenswrapper[4907]: E0226 15:46:28.664118 4907 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-26 15:46:29.164103856 +0000 UTC m=+251.682665705 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Feb 26 15:46:28 crc kubenswrapper[4907]: W0226 15:46:28.666232 4907 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podfd06f422_2c09_4da9_843c_75525df52517.slice/crio-b1e62c936c68f0f27b9e0c8f4064c914f6c2c00b3481a116c2502fc92987736f WatchSource:0}: Error finding container b1e62c936c68f0f27b9e0c8f4064c914f6c2c00b3481a116c2502fc92987736f: Status 404 returned error can't find the container with id b1e62c936c68f0f27b9e0c8f4064c914f6c2c00b3481a116c2502fc92987736f
Feb 26 15:46:28 crc kubenswrapper[4907]: W0226 15:46:28.669119 4907 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod9d751cbb_f2e2_430d_9754_c882a5e924a5.slice/crio-c30c152b8b1168f144708cfbcb1d744bd0432ded21f471cd694e437b09edcc2b WatchSource:0}: Error finding container c30c152b8b1168f144708cfbcb1d744bd0432ded21f471cd694e437b09edcc2b: Status 404 returned error can't find the container with id c30c152b8b1168f144708cfbcb1d744bd0432ded21f471cd694e437b09edcc2b
Feb 26 15:46:28 crc kubenswrapper[4907]: I0226 15:46:28.712838 4907 patch_prober.go:28] interesting pod/router-default-5444994796-hqs2t container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld
Feb 26 15:46:28 crc kubenswrapper[4907]: [-]has-synced failed: reason withheld
Feb 26 15:46:28 crc kubenswrapper[4907]: [+]process-running ok
Feb 26 15:46:28 crc kubenswrapper[4907]: healthz check failed
Feb 26 15:46:28 crc kubenswrapper[4907]: I0226 15:46:28.712907 4907 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-hqs2t" podUID="def12a12-3cf0-4694-a957-3e69aa18f880" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500"
Feb 26 15:46:28 crc kubenswrapper[4907]: I0226 15:46:28.767714 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-kqtml\" (UID: \"0fefaf3e-d327-41f8-bbbe-94b051a63b19\") " pod="openshift-image-registry/image-registry-697d97f7c8-kqtml"
Feb 26 15:46:28 crc kubenswrapper[4907]: E0226 15:46:28.768083 4907 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-26 15:46:29.268072894 +0000 UTC m=+251.786634743 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-kqtml" (UID: "0fefaf3e-d327-41f8-bbbe-94b051a63b19") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Feb 26 15:46:28 crc kubenswrapper[4907]: I0226 15:46:28.854803 4907 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-879f6c89f-p9vbb"]
Feb 26 15:46:28 crc kubenswrapper[4907]: I0226 15:46:28.855047 4907 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-controller-manager/controller-manager-879f6c89f-p9vbb" podUID="bbef2e1f-1be1-4624-804c-45892231df1e" containerName="controller-manager" containerID="cri-o://1b0eb3c56ccd014ace15a0c56f6e7ca89dca71b83fe0c0c42006cc66ad1f972c" gracePeriod=30
Feb 26 15:46:28 crc kubenswrapper[4907]: I0226 15:46:28.868733 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Feb 26 15:46:28 crc kubenswrapper[4907]: E0226 15:46:28.869322 4907 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-26 15:46:29.369307519 +0000 UTC m=+251.887869368 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Feb 26 15:46:28 crc kubenswrapper[4907]: I0226 15:46:28.970198 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-kqtml\" (UID: \"0fefaf3e-d327-41f8-bbbe-94b051a63b19\") " pod="openshift-image-registry/image-registry-697d97f7c8-kqtml"
Feb 26 15:46:28 crc kubenswrapper[4907]: E0226 15:46:28.970952 4907 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-26 15:46:29.470940401 +0000 UTC m=+251.989502250 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-kqtml" (UID: "0fefaf3e-d327-41f8-bbbe-94b051a63b19") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Feb 26 15:46:29 crc kubenswrapper[4907]: I0226 15:46:29.028692 4907 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6576b87f9c-96swm"]
Feb 26 15:46:29 crc kubenswrapper[4907]: I0226 15:46:29.028909 4907 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-96swm" podUID="54942a44-6e66-4757-8106-bbe836a2d8ca" containerName="route-controller-manager" containerID="cri-o://d2644c3d16f2880f068ed3473b9f1e9b0826ed05a9392b6c9676ae3feabfd916" gracePeriod=30
Feb 26 15:46:29 crc kubenswrapper[4907]: I0226 15:46:29.055102 4907 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-apiserver/apiserver-76f77b778f-5tc4m"
Feb 26 15:46:29 crc kubenswrapper[4907]: I0226 15:46:29.070860 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Feb 26 15:46:29 crc kubenswrapper[4907]: E0226 15:46:29.071197 4907 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-26 15:46:29.571183622 +0000 UTC m=+252.089745461 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Feb 26 15:46:29 crc kubenswrapper[4907]: I0226 15:46:29.090114 4907 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-apiserver/apiserver-76f77b778f-5tc4m"
Feb 26 15:46:29 crc kubenswrapper[4907]: I0226 15:46:29.172076 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-kqtml\" (UID: \"0fefaf3e-d327-41f8-bbbe-94b051a63b19\") " pod="openshift-image-registry/image-registry-697d97f7c8-kqtml"
Feb 26 15:46:29 crc kubenswrapper[4907]: E0226 15:46:29.173385 4907 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-26 15:46:29.673373448 +0000 UTC m=+252.191935297 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-kqtml" (UID: "0fefaf3e-d327-41f8-bbbe-94b051a63b19") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Feb 26 15:46:29 crc kubenswrapper[4907]: I0226 15:46:29.204556 4907 patch_prober.go:28] interesting pod/route-controller-manager-6576b87f9c-96swm container/route-controller-manager namespace/openshift-route-controller-manager: Readiness probe status=failure output="Get \"https://10.217.0.5:8443/healthz\": dial tcp 10.217.0.5:8443: connect: connection refused" start-of-body=
Feb 26 15:46:29 crc kubenswrapper[4907]: I0226 15:46:29.204634 4907 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-96swm" podUID="54942a44-6e66-4757-8106-bbe836a2d8ca" containerName="route-controller-manager" probeResult="failure" output="Get \"https://10.217.0.5:8443/healthz\": dial tcp 10.217.0.5:8443: connect: connection refused"
Feb 26 15:46:29 crc kubenswrapper[4907]: I0226 15:46:29.245002 4907 patch_prober.go:28] interesting pod/controller-manager-879f6c89f-p9vbb container/controller-manager namespace/openshift-controller-manager: Readiness probe status=failure output="Get \"https://10.217.0.6:8443/healthz\": dial tcp 10.217.0.6:8443: connect: connection refused" start-of-body=
Feb 26 15:46:29 crc kubenswrapper[4907]: I0226 15:46:29.245072 4907 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-controller-manager/controller-manager-879f6c89f-p9vbb" podUID="bbef2e1f-1be1-4624-804c-45892231df1e" containerName="controller-manager" probeResult="failure" output="Get \"https://10.217.0.6:8443/healthz\": dial tcp 10.217.0.6:8443: connect: connection refused"
Feb 26 15:46:29 crc kubenswrapper[4907]: I0226 15:46:29.273410 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Feb 26 15:46:29 crc kubenswrapper[4907]: E0226 15:46:29.273786 4907 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-26 15:46:29.773760402 +0000 UTC m=+252.292322251 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Feb 26 15:46:29 crc kubenswrapper[4907]: I0226 15:46:29.374634 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-kqtml\" (UID: \"0fefaf3e-d327-41f8-bbbe-94b051a63b19\") " pod="openshift-image-registry/image-registry-697d97f7c8-kqtml"
Feb 26 15:46:29 crc kubenswrapper[4907]: E0226 15:46:29.375000 4907 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-26 15:46:29.874986306 +0000 UTC m=+252.393548155 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-kqtml" (UID: "0fefaf3e-d327-41f8-bbbe-94b051a63b19") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Feb 26 15:46:29 crc kubenswrapper[4907]: I0226 15:46:29.475841 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Feb 26 15:46:29 crc kubenswrapper[4907]: E0226 15:46:29.476245 4907 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-26 15:46:29.976224419 +0000 UTC m=+252.494786268 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Feb 26 15:46:29 crc kubenswrapper[4907]: I0226 15:46:29.577238 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-kqtml\" (UID: \"0fefaf3e-d327-41f8-bbbe-94b051a63b19\") " pod="openshift-image-registry/image-registry-697d97f7c8-kqtml"
Feb 26 15:46:29 crc kubenswrapper[4907]: E0226 15:46:29.577511 4907 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-26 15:46:30.077500274 +0000 UTC m=+252.596062113 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-kqtml" (UID: "0fefaf3e-d327-41f8-bbbe-94b051a63b19") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Feb 26 15:46:29 crc kubenswrapper[4907]: I0226 15:46:29.630217 4907 patch_prober.go:28] interesting pod/packageserver-d55dfcdfc-z6m64 container/packageserver namespace/openshift-operator-lifecycle-manager: Readiness probe status=failure output="Get \"https://10.217.0.33:5443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body=
Feb 26 15:46:29 crc kubenswrapper[4907]: I0226 15:46:29.630494 4907 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-z6m64" podUID="73dbdcc4-3b7d-4593-a8e1-b15f824b5670" containerName="packageserver" probeResult="failure" output="Get \"https://10.217.0.33:5443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)"
Feb 26 15:46:29 crc kubenswrapper[4907]: I0226 15:46:29.660876 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" event={"ID":"9d751cbb-f2e2-430d-9754-c882a5e924a5","Type":"ContainerStarted","Data":"470eb23c5b4229123ef10595c11150d922e82137b76e011626fdb1ebb3abacfd"}
Feb 26 15:46:29 crc kubenswrapper[4907]: I0226 15:46:29.660922 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" event={"ID":"9d751cbb-f2e2-430d-9754-c882a5e924a5","Type":"ContainerStarted","Data":"c30c152b8b1168f144708cfbcb1d744bd0432ded21f471cd694e437b09edcc2b"}
Feb 26 15:46:29 crc kubenswrapper[4907]: I0226 15:46:29.674729 4907 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-tqxjz"]
Feb 26 15:46:29 crc kubenswrapper[4907]: I0226 15:46:29.675760 4907 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-tqxjz"
Feb 26 15:46:29 crc kubenswrapper[4907]: I0226 15:46:29.679856 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Feb 26 15:46:29 crc kubenswrapper[4907]: I0226 15:46:29.679923 4907 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"certified-operators-dockercfg-4rs5g"
Feb 26 15:46:29 crc kubenswrapper[4907]: E0226 15:46:29.680259 4907 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-26 15:46:30.180246654 +0000 UTC m=+252.698808503 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Feb 26 15:46:29 crc kubenswrapper[4907]: I0226 15:46:29.723148 4907 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-tqxjz"]
Feb 26 15:46:29 crc kubenswrapper[4907]: I0226 15:46:29.732461 4907 patch_prober.go:28] interesting pod/router-default-5444994796-hqs2t container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld
Feb 26 15:46:29 crc kubenswrapper[4907]: [-]has-synced failed: reason withheld
Feb 26 15:46:29 crc kubenswrapper[4907]: [+]process-running ok
Feb 26 15:46:29 crc kubenswrapper[4907]: healthz check failed
Feb 26 15:46:29 crc kubenswrapper[4907]: I0226 15:46:29.732519 4907 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-hqs2t" podUID="def12a12-3cf0-4694-a957-3e69aa18f880" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500"
Feb 26 15:46:29 crc kubenswrapper[4907]: I0226 15:46:29.748539 4907 generic.go:334] "Generic (PLEG): container finished" podID="54942a44-6e66-4757-8106-bbe836a2d8ca" containerID="d2644c3d16f2880f068ed3473b9f1e9b0826ed05a9392b6c9676ae3feabfd916" exitCode=0
Feb 26 15:46:29 crc kubenswrapper[4907]: I0226 15:46:29.748683 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-96swm" event={"ID":"54942a44-6e66-4757-8106-bbe836a2d8ca","Type":"ContainerDied","Data":"d2644c3d16f2880f068ed3473b9f1e9b0826ed05a9392b6c9676ae3feabfd916"}
Feb 26 15:46:29 crc kubenswrapper[4907]: I0226 15:46:29.779716 4907 generic.go:334] "Generic (PLEG): container finished" podID="bbef2e1f-1be1-4624-804c-45892231df1e" containerID="1b0eb3c56ccd014ace15a0c56f6e7ca89dca71b83fe0c0c42006cc66ad1f972c" exitCode=0
Feb 26 15:46:29 crc kubenswrapper[4907]: I0226 15:46:29.779787 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-879f6c89f-p9vbb" event={"ID":"bbef2e1f-1be1-4624-804c-45892231df1e","Type":"ContainerDied","Data":"1b0eb3c56ccd014ace15a0c56f6e7ca89dca71b83fe0c0c42006cc66ad1f972c"}
Feb 26 15:46:29 crc kubenswrapper[4907]: I0226 15:46:29.781289 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-kqtml\" (UID: \"0fefaf3e-d327-41f8-bbbe-94b051a63b19\") " pod="openshift-image-registry/image-registry-697d97f7c8-kqtml"
Feb 26 15:46:29 crc kubenswrapper[4907]: I0226 15:46:29.781392 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xmcgc\" (UniqueName: \"kubernetes.io/projected/e0e96b15-45f7-47f1-878e-57914ef18916-kube-api-access-xmcgc\") pod \"certified-operators-tqxjz\" (UID: \"e0e96b15-45f7-47f1-878e-57914ef18916\") " pod="openshift-marketplace/certified-operators-tqxjz"
Feb 26 15:46:29 crc kubenswrapper[4907]: I0226 15:46:29.781465 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e0e96b15-45f7-47f1-878e-57914ef18916-utilities\") pod \"certified-operators-tqxjz\" (UID: \"e0e96b15-45f7-47f1-878e-57914ef18916\") " pod="openshift-marketplace/certified-operators-tqxjz"
Feb 26 15:46:29 crc kubenswrapper[4907]: I0226 15:46:29.781530 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e0e96b15-45f7-47f1-878e-57914ef18916-catalog-content\") pod \"certified-operators-tqxjz\" (UID: \"e0e96b15-45f7-47f1-878e-57914ef18916\") " pod="openshift-marketplace/certified-operators-tqxjz"
Feb 26 15:46:29 crc kubenswrapper[4907]: E0226 15:46:29.781776 4907 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-26 15:46:30.281757744 +0000 UTC m=+252.800319633 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-kqtml" (UID: "0fefaf3e-d327-41f8-bbbe-94b051a63b19") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Feb 26 15:46:29 crc kubenswrapper[4907]: I0226 15:46:29.793278 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" event={"ID":"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8","Type":"ContainerStarted","Data":"f3a100767b065be27f7dbe023107bf2c39b86d182e8fa93e2f2c80d146f06db3"}
Feb 26 15:46:29 crc kubenswrapper[4907]: I0226 15:46:29.794978 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-zsb5l" event={"ID":"fd06f422-2c09-4da9-843c-75525df52517","Type":"ContainerStarted","Data":"b1e62c936c68f0f27b9e0c8f4064c914f6c2c00b3481a116c2502fc92987736f"}
Feb 26 15:46:29 crc kubenswrapper[4907]: I0226 15:46:29.834562 4907 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-22zr8"]
Feb 26 15:46:29 crc kubenswrapper[4907]: I0226 15:46:29.835507 4907 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-22zr8"
Feb 26 15:46:29 crc kubenswrapper[4907]: I0226 15:46:29.839198 4907 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"community-operators-dockercfg-dmngl"
Feb 26 15:46:29 crc kubenswrapper[4907]: I0226 15:46:29.869002 4907 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-22zr8"]
Feb 26 15:46:29 crc kubenswrapper[4907]: I0226 15:46:29.882125 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Feb 26 15:46:29 crc kubenswrapper[4907]: I0226 15:46:29.882304 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xmcgc\" (UniqueName: \"kubernetes.io/projected/e0e96b15-45f7-47f1-878e-57914ef18916-kube-api-access-xmcgc\") pod \"certified-operators-tqxjz\" (UID: \"e0e96b15-45f7-47f1-878e-57914ef18916\") " pod="openshift-marketplace/certified-operators-tqxjz"
Feb 26 15:46:29 crc kubenswrapper[4907]: I0226 15:46:29.882392 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e0e96b15-45f7-47f1-878e-57914ef18916-utilities\") pod \"certified-operators-tqxjz\" (UID: \"e0e96b15-45f7-47f1-878e-57914ef18916\") " pod="openshift-marketplace/certified-operators-tqxjz"
Feb 26 15:46:29 crc kubenswrapper[4907]: I0226 15:46:29.882510 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e0e96b15-45f7-47f1-878e-57914ef18916-catalog-content\") pod \"certified-operators-tqxjz\" (UID: \"e0e96b15-45f7-47f1-878e-57914ef18916\") " pod="openshift-marketplace/certified-operators-tqxjz"
Feb 26 15:46:29 crc kubenswrapper[4907]: E0226 15:46:29.883922 4907 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-26 15:46:30.383900059 +0000 UTC m=+252.902461908 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Feb 26 15:46:29 crc kubenswrapper[4907]: I0226 15:46:29.885469 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e0e96b15-45f7-47f1-878e-57914ef18916-catalog-content\") pod \"certified-operators-tqxjz\" (UID: \"e0e96b15-45f7-47f1-878e-57914ef18916\") " pod="openshift-marketplace/certified-operators-tqxjz"
Feb 26 15:46:29 crc kubenswrapper[4907]: I0226 15:46:29.885556 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e0e96b15-45f7-47f1-878e-57914ef18916-utilities\") pod \"certified-operators-tqxjz\" (UID: \"e0e96b15-45f7-47f1-878e-57914ef18916\") " pod="openshift-marketplace/certified-operators-tqxjz"
Feb 26 15:46:29 crc kubenswrapper[4907]: I0226 15:46:29.938311 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xmcgc\" (UniqueName: \"kubernetes.io/projected/e0e96b15-45f7-47f1-878e-57914ef18916-kube-api-access-xmcgc\") pod \"certified-operators-tqxjz\" (UID: \"e0e96b15-45f7-47f1-878e-57914ef18916\") " pod="openshift-marketplace/certified-operators-tqxjz"
Feb 26 15:46:29 crc kubenswrapper[4907]: I0226 15:46:29.983476 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/4d3f9fc7-85b9-4095-af0d-7993e681ab2a-catalog-content\") pod \"community-operators-22zr8\" (UID: \"4d3f9fc7-85b9-4095-af0d-7993e681ab2a\") " pod="openshift-marketplace/community-operators-22zr8"
Feb 26 15:46:29 crc kubenswrapper[4907]: I0226 15:46:29.983530 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8pwjf\" (UniqueName: \"kubernetes.io/projected/4d3f9fc7-85b9-4095-af0d-7993e681ab2a-kube-api-access-8pwjf\") pod \"community-operators-22zr8\" (UID: \"4d3f9fc7-85b9-4095-af0d-7993e681ab2a\") " pod="openshift-marketplace/community-operators-22zr8"
Feb 26 15:46:29 crc kubenswrapper[4907]: I0226 15:46:29.983558 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-kqtml\" (UID: \"0fefaf3e-d327-41f8-bbbe-94b051a63b19\") " pod="openshift-image-registry/image-registry-697d97f7c8-kqtml"
Feb 26 15:46:29 crc kubenswrapper[4907]: I0226 15:46:29.983602 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/4d3f9fc7-85b9-4095-af0d-7993e681ab2a-utilities\") pod \"community-operators-22zr8\" (UID: \"4d3f9fc7-85b9-4095-af0d-7993e681ab2a\") " pod="openshift-marketplace/community-operators-22zr8"
Feb 26 15:46:29 crc kubenswrapper[4907]: E0226 15:46:29.983891 4907 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-26 15:46:30.483880084 +0000 UTC m=+253.002441933 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-kqtml" (UID: "0fefaf3e-d327-41f8-bbbe-94b051a63b19") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Feb 26 15:46:29 crc kubenswrapper[4907]: I0226 15:46:29.992730 4907 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-qhfr7"]
Feb 26 15:46:29 crc kubenswrapper[4907]: I0226 15:46:29.993624 4907 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-qhfr7"
Feb 26 15:46:30 crc kubenswrapper[4907]: I0226 15:46:30.003956 4907 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-tqxjz" Feb 26 15:46:30 crc kubenswrapper[4907]: I0226 15:46:30.043387 4907 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-qhfr7"] Feb 26 15:46:30 crc kubenswrapper[4907]: I0226 15:46:30.084748 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 26 15:46:30 crc kubenswrapper[4907]: I0226 15:46:30.084936 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/4d3f9fc7-85b9-4095-af0d-7993e681ab2a-catalog-content\") pod \"community-operators-22zr8\" (UID: \"4d3f9fc7-85b9-4095-af0d-7993e681ab2a\") " pod="openshift-marketplace/community-operators-22zr8" Feb 26 15:46:30 crc kubenswrapper[4907]: I0226 15:46:30.084988 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/34138ff4-16e6-4f79-bd8f-0c8cb132ebde-catalog-content\") pod \"certified-operators-qhfr7\" (UID: \"34138ff4-16e6-4f79-bd8f-0c8cb132ebde\") " pod="openshift-marketplace/certified-operators-qhfr7" Feb 26 15:46:30 crc kubenswrapper[4907]: I0226 15:46:30.085019 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8pwjf\" (UniqueName: \"kubernetes.io/projected/4d3f9fc7-85b9-4095-af0d-7993e681ab2a-kube-api-access-8pwjf\") pod \"community-operators-22zr8\" (UID: \"4d3f9fc7-85b9-4095-af0d-7993e681ab2a\") " pod="openshift-marketplace/community-operators-22zr8" Feb 26 15:46:30 crc kubenswrapper[4907]: I0226 15:46:30.085077 4907 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-j4g5s\" (UniqueName: \"kubernetes.io/projected/34138ff4-16e6-4f79-bd8f-0c8cb132ebde-kube-api-access-j4g5s\") pod \"certified-operators-qhfr7\" (UID: \"34138ff4-16e6-4f79-bd8f-0c8cb132ebde\") " pod="openshift-marketplace/certified-operators-qhfr7" Feb 26 15:46:30 crc kubenswrapper[4907]: I0226 15:46:30.085111 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/4d3f9fc7-85b9-4095-af0d-7993e681ab2a-utilities\") pod \"community-operators-22zr8\" (UID: \"4d3f9fc7-85b9-4095-af0d-7993e681ab2a\") " pod="openshift-marketplace/community-operators-22zr8" Feb 26 15:46:30 crc kubenswrapper[4907]: I0226 15:46:30.085141 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/34138ff4-16e6-4f79-bd8f-0c8cb132ebde-utilities\") pod \"certified-operators-qhfr7\" (UID: \"34138ff4-16e6-4f79-bd8f-0c8cb132ebde\") " pod="openshift-marketplace/certified-operators-qhfr7" Feb 26 15:46:30 crc kubenswrapper[4907]: E0226 15:46:30.085259 4907 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-26 15:46:30.58524107 +0000 UTC m=+253.103802919 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 26 15:46:30 crc kubenswrapper[4907]: I0226 15:46:30.085727 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/4d3f9fc7-85b9-4095-af0d-7993e681ab2a-catalog-content\") pod \"community-operators-22zr8\" (UID: \"4d3f9fc7-85b9-4095-af0d-7993e681ab2a\") " pod="openshift-marketplace/community-operators-22zr8" Feb 26 15:46:30 crc kubenswrapper[4907]: I0226 15:46:30.088514 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/4d3f9fc7-85b9-4095-af0d-7993e681ab2a-utilities\") pod \"community-operators-22zr8\" (UID: \"4d3f9fc7-85b9-4095-af0d-7993e681ab2a\") " pod="openshift-marketplace/community-operators-22zr8" Feb 26 15:46:30 crc kubenswrapper[4907]: I0226 15:46:30.104557 4907 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-controller-manager/controller-manager-879f6c89f-p9vbb" Feb 26 15:46:30 crc kubenswrapper[4907]: I0226 15:46:30.165075 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8pwjf\" (UniqueName: \"kubernetes.io/projected/4d3f9fc7-85b9-4095-af0d-7993e681ab2a-kube-api-access-8pwjf\") pod \"community-operators-22zr8\" (UID: \"4d3f9fc7-85b9-4095-af0d-7993e681ab2a\") " pod="openshift-marketplace/community-operators-22zr8" Feb 26 15:46:30 crc kubenswrapper[4907]: I0226 15:46:30.186111 4907 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-2v8kx"] Feb 26 15:46:30 crc kubenswrapper[4907]: I0226 15:46:30.186224 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/bbef2e1f-1be1-4624-804c-45892231df1e-config\") pod \"bbef2e1f-1be1-4624-804c-45892231df1e\" (UID: \"bbef2e1f-1be1-4624-804c-45892231df1e\") " Feb 26 15:46:30 crc kubenswrapper[4907]: I0226 15:46:30.186258 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/bbef2e1f-1be1-4624-804c-45892231df1e-proxy-ca-bundles\") pod \"bbef2e1f-1be1-4624-804c-45892231df1e\" (UID: \"bbef2e1f-1be1-4624-804c-45892231df1e\") " Feb 26 15:46:30 crc kubenswrapper[4907]: I0226 15:46:30.186291 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vdtmp\" (UniqueName: \"kubernetes.io/projected/bbef2e1f-1be1-4624-804c-45892231df1e-kube-api-access-vdtmp\") pod \"bbef2e1f-1be1-4624-804c-45892231df1e\" (UID: \"bbef2e1f-1be1-4624-804c-45892231df1e\") " Feb 26 15:46:30 crc kubenswrapper[4907]: I0226 15:46:30.186309 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/bbef2e1f-1be1-4624-804c-45892231df1e-serving-cert\") pod 
\"bbef2e1f-1be1-4624-804c-45892231df1e\" (UID: \"bbef2e1f-1be1-4624-804c-45892231df1e\") " Feb 26 15:46:30 crc kubenswrapper[4907]: I0226 15:46:30.186516 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/bbef2e1f-1be1-4624-804c-45892231df1e-client-ca\") pod \"bbef2e1f-1be1-4624-804c-45892231df1e\" (UID: \"bbef2e1f-1be1-4624-804c-45892231df1e\") " Feb 26 15:46:30 crc kubenswrapper[4907]: I0226 15:46:30.187296 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/bbef2e1f-1be1-4624-804c-45892231df1e-config" (OuterVolumeSpecName: "config") pod "bbef2e1f-1be1-4624-804c-45892231df1e" (UID: "bbef2e1f-1be1-4624-804c-45892231df1e"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 26 15:46:30 crc kubenswrapper[4907]: E0226 15:46:30.189228 4907 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="bbef2e1f-1be1-4624-804c-45892231df1e" containerName="controller-manager" Feb 26 15:46:30 crc kubenswrapper[4907]: I0226 15:46:30.189247 4907 state_mem.go:107] "Deleted CPUSet assignment" podUID="bbef2e1f-1be1-4624-804c-45892231df1e" containerName="controller-manager" Feb 26 15:46:30 crc kubenswrapper[4907]: I0226 15:46:30.189336 4907 memory_manager.go:354] "RemoveStaleState removing state" podUID="bbef2e1f-1be1-4624-804c-45892231df1e" containerName="controller-manager" Feb 26 15:46:30 crc kubenswrapper[4907]: I0226 15:46:30.190083 4907 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-2v8kx" Feb 26 15:46:30 crc kubenswrapper[4907]: I0226 15:46:30.190911 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/bbef2e1f-1be1-4624-804c-45892231df1e-client-ca" (OuterVolumeSpecName: "client-ca") pod "bbef2e1f-1be1-4624-804c-45892231df1e" (UID: "bbef2e1f-1be1-4624-804c-45892231df1e"). 
InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 26 15:46:30 crc kubenswrapper[4907]: I0226 15:46:30.190937 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/bbef2e1f-1be1-4624-804c-45892231df1e-proxy-ca-bundles" (OuterVolumeSpecName: "proxy-ca-bundles") pod "bbef2e1f-1be1-4624-804c-45892231df1e" (UID: "bbef2e1f-1be1-4624-804c-45892231df1e"). InnerVolumeSpecName "proxy-ca-bundles". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 26 15:46:30 crc kubenswrapper[4907]: I0226 15:46:30.194940 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/34138ff4-16e6-4f79-bd8f-0c8cb132ebde-utilities\") pod \"certified-operators-qhfr7\" (UID: \"34138ff4-16e6-4f79-bd8f-0c8cb132ebde\") " pod="openshift-marketplace/certified-operators-qhfr7" Feb 26 15:46:30 crc kubenswrapper[4907]: I0226 15:46:30.195071 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/34138ff4-16e6-4f79-bd8f-0c8cb132ebde-catalog-content\") pod \"certified-operators-qhfr7\" (UID: \"34138ff4-16e6-4f79-bd8f-0c8cb132ebde\") " pod="openshift-marketplace/certified-operators-qhfr7" Feb 26 15:46:30 crc kubenswrapper[4907]: I0226 15:46:30.195141 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-kqtml\" (UID: \"0fefaf3e-d327-41f8-bbbe-94b051a63b19\") " pod="openshift-image-registry/image-registry-697d97f7c8-kqtml" Feb 26 15:46:30 crc kubenswrapper[4907]: I0226 15:46:30.195182 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-j4g5s\" (UniqueName: 
\"kubernetes.io/projected/34138ff4-16e6-4f79-bd8f-0c8cb132ebde-kube-api-access-j4g5s\") pod \"certified-operators-qhfr7\" (UID: \"34138ff4-16e6-4f79-bd8f-0c8cb132ebde\") " pod="openshift-marketplace/certified-operators-qhfr7" Feb 26 15:46:30 crc kubenswrapper[4907]: I0226 15:46:30.195239 4907 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/bbef2e1f-1be1-4624-804c-45892231df1e-config\") on node \"crc\" DevicePath \"\"" Feb 26 15:46:30 crc kubenswrapper[4907]: I0226 15:46:30.195249 4907 reconciler_common.go:293] "Volume detached for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/bbef2e1f-1be1-4624-804c-45892231df1e-proxy-ca-bundles\") on node \"crc\" DevicePath \"\"" Feb 26 15:46:30 crc kubenswrapper[4907]: I0226 15:46:30.195258 4907 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/bbef2e1f-1be1-4624-804c-45892231df1e-client-ca\") on node \"crc\" DevicePath \"\"" Feb 26 15:46:30 crc kubenswrapper[4907]: I0226 15:46:30.195662 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/34138ff4-16e6-4f79-bd8f-0c8cb132ebde-catalog-content\") pod \"certified-operators-qhfr7\" (UID: \"34138ff4-16e6-4f79-bd8f-0c8cb132ebde\") " pod="openshift-marketplace/certified-operators-qhfr7" Feb 26 15:46:30 crc kubenswrapper[4907]: E0226 15:46:30.195746 4907 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-26 15:46:30.695735924 +0000 UTC m=+253.214297773 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-kqtml" (UID: "0fefaf3e-d327-41f8-bbbe-94b051a63b19") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 26 15:46:30 crc kubenswrapper[4907]: I0226 15:46:30.195870 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/34138ff4-16e6-4f79-bd8f-0c8cb132ebde-utilities\") pod \"certified-operators-qhfr7\" (UID: \"34138ff4-16e6-4f79-bd8f-0c8cb132ebde\") " pod="openshift-marketplace/certified-operators-qhfr7" Feb 26 15:46:30 crc kubenswrapper[4907]: I0226 15:46:30.211166 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bbef2e1f-1be1-4624-804c-45892231df1e-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "bbef2e1f-1be1-4624-804c-45892231df1e" (UID: "bbef2e1f-1be1-4624-804c-45892231df1e"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 26 15:46:30 crc kubenswrapper[4907]: I0226 15:46:30.213792 4907 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-2v8kx"] Feb 26 15:46:30 crc kubenswrapper[4907]: I0226 15:46:30.215184 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bbef2e1f-1be1-4624-804c-45892231df1e-kube-api-access-vdtmp" (OuterVolumeSpecName: "kube-api-access-vdtmp") pod "bbef2e1f-1be1-4624-804c-45892231df1e" (UID: "bbef2e1f-1be1-4624-804c-45892231df1e"). InnerVolumeSpecName "kube-api-access-vdtmp". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 26 15:46:30 crc kubenswrapper[4907]: I0226 15:46:30.243344 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-j4g5s\" (UniqueName: \"kubernetes.io/projected/34138ff4-16e6-4f79-bd8f-0c8cb132ebde-kube-api-access-j4g5s\") pod \"certified-operators-qhfr7\" (UID: \"34138ff4-16e6-4f79-bd8f-0c8cb132ebde\") " pod="openshift-marketplace/certified-operators-qhfr7" Feb 26 15:46:30 crc kubenswrapper[4907]: I0226 15:46:30.291807 4907 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-96swm" Feb 26 15:46:30 crc kubenswrapper[4907]: I0226 15:46:30.295966 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 26 15:46:30 crc kubenswrapper[4907]: I0226 15:46:30.296120 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/763dfaad-6b70-4ea8-a5ba-b4729dd1dcf2-utilities\") pod \"community-operators-2v8kx\" (UID: \"763dfaad-6b70-4ea8-a5ba-b4729dd1dcf2\") " pod="openshift-marketplace/community-operators-2v8kx" Feb 26 15:46:30 crc kubenswrapper[4907]: E0226 15:46:30.296190 4907 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-26 15:46:30.796170229 +0000 UTC m=+253.314732078 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 26 15:46:30 crc kubenswrapper[4907]: I0226 15:46:30.296230 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/763dfaad-6b70-4ea8-a5ba-b4729dd1dcf2-catalog-content\") pod \"community-operators-2v8kx\" (UID: \"763dfaad-6b70-4ea8-a5ba-b4729dd1dcf2\") " pod="openshift-marketplace/community-operators-2v8kx" Feb 26 15:46:30 crc kubenswrapper[4907]: I0226 15:46:30.296322 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rqtzd\" (UniqueName: \"kubernetes.io/projected/763dfaad-6b70-4ea8-a5ba-b4729dd1dcf2-kube-api-access-rqtzd\") pod \"community-operators-2v8kx\" (UID: \"763dfaad-6b70-4ea8-a5ba-b4729dd1dcf2\") " pod="openshift-marketplace/community-operators-2v8kx" Feb 26 15:46:30 crc kubenswrapper[4907]: I0226 15:46:30.296381 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-kqtml\" (UID: \"0fefaf3e-d327-41f8-bbbe-94b051a63b19\") " pod="openshift-image-registry/image-registry-697d97f7c8-kqtml" Feb 26 15:46:30 crc kubenswrapper[4907]: I0226 15:46:30.296453 4907 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vdtmp\" (UniqueName: \"kubernetes.io/projected/bbef2e1f-1be1-4624-804c-45892231df1e-kube-api-access-vdtmp\") on 
node \"crc\" DevicePath \"\"" Feb 26 15:46:30 crc kubenswrapper[4907]: I0226 15:46:30.296469 4907 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/bbef2e1f-1be1-4624-804c-45892231df1e-serving-cert\") on node \"crc\" DevicePath \"\"" Feb 26 15:46:30 crc kubenswrapper[4907]: E0226 15:46:30.296631 4907 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-26 15:46:30.79662441 +0000 UTC m=+253.315186259 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-kqtml" (UID: "0fefaf3e-d327-41f8-bbbe-94b051a63b19") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 26 15:46:30 crc kubenswrapper[4907]: I0226 15:46:30.360819 4907 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-qhfr7" Feb 26 15:46:30 crc kubenswrapper[4907]: I0226 15:46:30.397204 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/54942a44-6e66-4757-8106-bbe836a2d8ca-client-ca\") pod \"54942a44-6e66-4757-8106-bbe836a2d8ca\" (UID: \"54942a44-6e66-4757-8106-bbe836a2d8ca\") " Feb 26 15:46:30 crc kubenswrapper[4907]: I0226 15:46:30.397302 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/54942a44-6e66-4757-8106-bbe836a2d8ca-serving-cert\") pod \"54942a44-6e66-4757-8106-bbe836a2d8ca\" (UID: \"54942a44-6e66-4757-8106-bbe836a2d8ca\") " Feb 26 15:46:30 crc kubenswrapper[4907]: I0226 15:46:30.397425 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 26 15:46:30 crc kubenswrapper[4907]: I0226 15:46:30.397448 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/54942a44-6e66-4757-8106-bbe836a2d8ca-config\") pod \"54942a44-6e66-4757-8106-bbe836a2d8ca\" (UID: \"54942a44-6e66-4757-8106-bbe836a2d8ca\") " Feb 26 15:46:30 crc kubenswrapper[4907]: I0226 15:46:30.397548 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-46vwj\" (UniqueName: \"kubernetes.io/projected/54942a44-6e66-4757-8106-bbe836a2d8ca-kube-api-access-46vwj\") pod \"54942a44-6e66-4757-8106-bbe836a2d8ca\" (UID: \"54942a44-6e66-4757-8106-bbe836a2d8ca\") " Feb 26 15:46:30 crc kubenswrapper[4907]: I0226 15:46:30.397713 4907 reconciler_common.go:218] "operationExecutor.MountVolume 
started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/763dfaad-6b70-4ea8-a5ba-b4729dd1dcf2-utilities\") pod \"community-operators-2v8kx\" (UID: \"763dfaad-6b70-4ea8-a5ba-b4729dd1dcf2\") " pod="openshift-marketplace/community-operators-2v8kx" Feb 26 15:46:30 crc kubenswrapper[4907]: I0226 15:46:30.397769 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/763dfaad-6b70-4ea8-a5ba-b4729dd1dcf2-catalog-content\") pod \"community-operators-2v8kx\" (UID: \"763dfaad-6b70-4ea8-a5ba-b4729dd1dcf2\") " pod="openshift-marketplace/community-operators-2v8kx" Feb 26 15:46:30 crc kubenswrapper[4907]: I0226 15:46:30.397808 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rqtzd\" (UniqueName: \"kubernetes.io/projected/763dfaad-6b70-4ea8-a5ba-b4729dd1dcf2-kube-api-access-rqtzd\") pod \"community-operators-2v8kx\" (UID: \"763dfaad-6b70-4ea8-a5ba-b4729dd1dcf2\") " pod="openshift-marketplace/community-operators-2v8kx" Feb 26 15:46:30 crc kubenswrapper[4907]: I0226 15:46:30.406520 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/54942a44-6e66-4757-8106-bbe836a2d8ca-client-ca" (OuterVolumeSpecName: "client-ca") pod "54942a44-6e66-4757-8106-bbe836a2d8ca" (UID: "54942a44-6e66-4757-8106-bbe836a2d8ca"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 26 15:46:30 crc kubenswrapper[4907]: I0226 15:46:30.406958 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/54942a44-6e66-4757-8106-bbe836a2d8ca-config" (OuterVolumeSpecName: "config") pod "54942a44-6e66-4757-8106-bbe836a2d8ca" (UID: "54942a44-6e66-4757-8106-bbe836a2d8ca"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 26 15:46:30 crc kubenswrapper[4907]: E0226 15:46:30.407056 4907 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-26 15:46:30.907041392 +0000 UTC m=+253.425603241 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 26 15:46:30 crc kubenswrapper[4907]: I0226 15:46:30.407396 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/763dfaad-6b70-4ea8-a5ba-b4729dd1dcf2-utilities\") pod \"community-operators-2v8kx\" (UID: \"763dfaad-6b70-4ea8-a5ba-b4729dd1dcf2\") " pod="openshift-marketplace/community-operators-2v8kx" Feb 26 15:46:30 crc kubenswrapper[4907]: I0226 15:46:30.407634 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/763dfaad-6b70-4ea8-a5ba-b4729dd1dcf2-catalog-content\") pod \"community-operators-2v8kx\" (UID: \"763dfaad-6b70-4ea8-a5ba-b4729dd1dcf2\") " pod="openshift-marketplace/community-operators-2v8kx" Feb 26 15:46:30 crc kubenswrapper[4907]: I0226 15:46:30.419596 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/54942a44-6e66-4757-8106-bbe836a2d8ca-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "54942a44-6e66-4757-8106-bbe836a2d8ca" (UID: 
"54942a44-6e66-4757-8106-bbe836a2d8ca"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 26 15:46:30 crc kubenswrapper[4907]: I0226 15:46:30.419756 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/54942a44-6e66-4757-8106-bbe836a2d8ca-kube-api-access-46vwj" (OuterVolumeSpecName: "kube-api-access-46vwj") pod "54942a44-6e66-4757-8106-bbe836a2d8ca" (UID: "54942a44-6e66-4757-8106-bbe836a2d8ca"). InnerVolumeSpecName "kube-api-access-46vwj". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 26 15:46:30 crc kubenswrapper[4907]: I0226 15:46:30.463159 4907 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-22zr8" Feb 26 15:46:30 crc kubenswrapper[4907]: I0226 15:46:30.472068 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rqtzd\" (UniqueName: \"kubernetes.io/projected/763dfaad-6b70-4ea8-a5ba-b4729dd1dcf2-kube-api-access-rqtzd\") pod \"community-operators-2v8kx\" (UID: \"763dfaad-6b70-4ea8-a5ba-b4729dd1dcf2\") " pod="openshift-marketplace/community-operators-2v8kx" Feb 26 15:46:30 crc kubenswrapper[4907]: I0226 15:46:30.505557 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-kqtml\" (UID: \"0fefaf3e-d327-41f8-bbbe-94b051a63b19\") " pod="openshift-image-registry/image-registry-697d97f7c8-kqtml" Feb 26 15:46:30 crc kubenswrapper[4907]: I0226 15:46:30.505653 4907 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-46vwj\" (UniqueName: \"kubernetes.io/projected/54942a44-6e66-4757-8106-bbe836a2d8ca-kube-api-access-46vwj\") on node \"crc\" DevicePath \"\"" Feb 26 15:46:30 crc kubenswrapper[4907]: I0226 15:46:30.505668 4907 
reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/54942a44-6e66-4757-8106-bbe836a2d8ca-client-ca\") on node \"crc\" DevicePath \"\"" Feb 26 15:46:30 crc kubenswrapper[4907]: I0226 15:46:30.505687 4907 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/54942a44-6e66-4757-8106-bbe836a2d8ca-serving-cert\") on node \"crc\" DevicePath \"\"" Feb 26 15:46:30 crc kubenswrapper[4907]: I0226 15:46:30.505697 4907 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/54942a44-6e66-4757-8106-bbe836a2d8ca-config\") on node \"crc\" DevicePath \"\"" Feb 26 15:46:30 crc kubenswrapper[4907]: E0226 15:46:30.506010 4907 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-26 15:46:31.005994532 +0000 UTC m=+253.524556381 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-kqtml" (UID: "0fefaf3e-d327-41f8-bbbe-94b051a63b19") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 26 15:46:30 crc kubenswrapper[4907]: I0226 15:46:30.556967 4907 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-2v8kx" Feb 26 15:46:30 crc kubenswrapper[4907]: I0226 15:46:30.607049 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 26 15:46:30 crc kubenswrapper[4907]: E0226 15:46:30.607362 4907 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-26 15:46:31.107348278 +0000 UTC m=+253.625910127 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 26 15:46:30 crc kubenswrapper[4907]: I0226 15:46:30.659029 4907 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-route-controller-manager/route-controller-manager-598f5c59b5-8dxj5"] Feb 26 15:46:30 crc kubenswrapper[4907]: E0226 15:46:30.659219 4907 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="54942a44-6e66-4757-8106-bbe836a2d8ca" containerName="route-controller-manager" Feb 26 15:46:30 crc kubenswrapper[4907]: I0226 15:46:30.659230 4907 state_mem.go:107] "Deleted CPUSet assignment" podUID="54942a44-6e66-4757-8106-bbe836a2d8ca" containerName="route-controller-manager" Feb 26 15:46:30 crc kubenswrapper[4907]: I0226 15:46:30.659314 
4907 memory_manager.go:354] "RemoveStaleState removing state" podUID="54942a44-6e66-4757-8106-bbe836a2d8ca" containerName="route-controller-manager" Feb 26 15:46:30 crc kubenswrapper[4907]: I0226 15:46:30.661490 4907 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-598f5c59b5-8dxj5" Feb 26 15:46:30 crc kubenswrapper[4907]: I0226 15:46:30.670877 4907 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-console/console-f9d7485db-9lx5z" Feb 26 15:46:30 crc kubenswrapper[4907]: I0226 15:46:30.670924 4907 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-console/console-f9d7485db-9lx5z" Feb 26 15:46:30 crc kubenswrapper[4907]: I0226 15:46:30.688986 4907 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-598f5c59b5-8dxj5"] Feb 26 15:46:30 crc kubenswrapper[4907]: I0226 15:46:30.693976 4907 patch_prober.go:28] interesting pod/console-f9d7485db-9lx5z container/console namespace/openshift-console: Startup probe status=failure output="Get \"https://10.217.0.25:8443/health\": dial tcp 10.217.0.25:8443: connect: connection refused" start-of-body= Feb 26 15:46:30 crc kubenswrapper[4907]: I0226 15:46:30.694022 4907 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-console/console-f9d7485db-9lx5z" podUID="0eb5d98d-ab9b-47bd-b8ca-27340a4d6a4f" containerName="console" probeResult="failure" output="Get \"https://10.217.0.25:8443/health\": dial tcp 10.217.0.25:8443: connect: connection refused" Feb 26 15:46:30 crc kubenswrapper[4907]: I0226 15:46:30.700282 4907 patch_prober.go:28] interesting pod/downloads-7954f5f757-wcgj6 container/download-server namespace/openshift-console: Liveness probe status=failure output="Get \"http://10.217.0.15:8080/\": dial tcp 10.217.0.15:8080: connect: connection refused" start-of-body= Feb 26 15:46:30 crc kubenswrapper[4907]: 
I0226 15:46:30.700331 4907 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-console/downloads-7954f5f757-wcgj6" podUID="2e969445-2d6b-4ea1-bd4b-3473a66e8c91" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.15:8080/\": dial tcp 10.217.0.15:8080: connect: connection refused" Feb 26 15:46:30 crc kubenswrapper[4907]: I0226 15:46:30.700337 4907 patch_prober.go:28] interesting pod/downloads-7954f5f757-wcgj6 container/download-server namespace/openshift-console: Readiness probe status=failure output="Get \"http://10.217.0.15:8080/\": dial tcp 10.217.0.15:8080: connect: connection refused" start-of-body= Feb 26 15:46:30 crc kubenswrapper[4907]: I0226 15:46:30.700385 4907 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console/downloads-7954f5f757-wcgj6" podUID="2e969445-2d6b-4ea1-bd4b-3473a66e8c91" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.15:8080/\": dial tcp 10.217.0.15:8080: connect: connection refused" Feb 26 15:46:30 crc kubenswrapper[4907]: I0226 15:46:30.708948 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-kqtml\" (UID: \"0fefaf3e-d327-41f8-bbbe-94b051a63b19\") " pod="openshift-image-registry/image-registry-697d97f7c8-kqtml" Feb 26 15:46:30 crc kubenswrapper[4907]: E0226 15:46:30.709361 4907 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-26 15:46:31.20933912 +0000 UTC m=+253.727900969 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-kqtml" (UID: "0fefaf3e-d327-41f8-bbbe-94b051a63b19") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 26 15:46:30 crc kubenswrapper[4907]: I0226 15:46:30.721026 4907 patch_prober.go:28] interesting pod/router-default-5444994796-hqs2t container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Feb 26 15:46:30 crc kubenswrapper[4907]: [-]has-synced failed: reason withheld Feb 26 15:46:30 crc kubenswrapper[4907]: [+]process-running ok Feb 26 15:46:30 crc kubenswrapper[4907]: healthz check failed Feb 26 15:46:30 crc kubenswrapper[4907]: I0226 15:46:30.721073 4907 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-hqs2t" podUID="def12a12-3cf0-4694-a957-3e69aa18f880" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Feb 26 15:46:30 crc kubenswrapper[4907]: I0226 15:46:30.766814 4907 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-598f5c59b5-8dxj5"] Feb 26 15:46:30 crc kubenswrapper[4907]: E0226 15:46:30.767122 4907 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[client-ca config kube-api-access-52m6m serving-cert], unattached volumes=[], failed to process volumes=[]: context canceled" pod="openshift-route-controller-manager/route-controller-manager-598f5c59b5-8dxj5" podUID="afead97a-f5be-4685-913f-e36c4d4c4c62" Feb 26 15:46:30 crc kubenswrapper[4907]: I0226 15:46:30.810841 4907 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 26 15:46:30 crc kubenswrapper[4907]: I0226 15:46:30.811056 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/afead97a-f5be-4685-913f-e36c4d4c4c62-config\") pod \"route-controller-manager-598f5c59b5-8dxj5\" (UID: \"afead97a-f5be-4685-913f-e36c4d4c4c62\") " pod="openshift-route-controller-manager/route-controller-manager-598f5c59b5-8dxj5" Feb 26 15:46:30 crc kubenswrapper[4907]: I0226 15:46:30.811091 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/afead97a-f5be-4685-913f-e36c4d4c4c62-client-ca\") pod \"route-controller-manager-598f5c59b5-8dxj5\" (UID: \"afead97a-f5be-4685-913f-e36c4d4c4c62\") " pod="openshift-route-controller-manager/route-controller-manager-598f5c59b5-8dxj5" Feb 26 15:46:30 crc kubenswrapper[4907]: I0226 15:46:30.811129 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/afead97a-f5be-4685-913f-e36c4d4c4c62-serving-cert\") pod \"route-controller-manager-598f5c59b5-8dxj5\" (UID: \"afead97a-f5be-4685-913f-e36c4d4c4c62\") " pod="openshift-route-controller-manager/route-controller-manager-598f5c59b5-8dxj5" Feb 26 15:46:30 crc kubenswrapper[4907]: I0226 15:46:30.811177 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-52m6m\" (UniqueName: \"kubernetes.io/projected/afead97a-f5be-4685-913f-e36c4d4c4c62-kube-api-access-52m6m\") pod \"route-controller-manager-598f5c59b5-8dxj5\" (UID: 
\"afead97a-f5be-4685-913f-e36c4d4c4c62\") " pod="openshift-route-controller-manager/route-controller-manager-598f5c59b5-8dxj5" Feb 26 15:46:30 crc kubenswrapper[4907]: E0226 15:46:30.811871 4907 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-26 15:46:31.311857004 +0000 UTC m=+253.830418853 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 26 15:46:30 crc kubenswrapper[4907]: I0226 15:46:30.900611 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-zsb5l" event={"ID":"fd06f422-2c09-4da9-843c-75525df52517","Type":"ContainerStarted","Data":"38aafb041446be4a7e3305caf90e0c77c87756facd550e9338c1c8a9a39127fd"} Feb 26 15:46:30 crc kubenswrapper[4907]: I0226 15:46:30.900651 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-zsb5l" event={"ID":"fd06f422-2c09-4da9-843c-75525df52517","Type":"ContainerStarted","Data":"276848f12f137120c5ea987d3d07d1065b928a720fef58ec37afc4ea308cfa15"} Feb 26 15:46:30 crc kubenswrapper[4907]: I0226 15:46:30.918471 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/afead97a-f5be-4685-913f-e36c4d4c4c62-config\") pod \"route-controller-manager-598f5c59b5-8dxj5\" (UID: \"afead97a-f5be-4685-913f-e36c4d4c4c62\") " 
pod="openshift-route-controller-manager/route-controller-manager-598f5c59b5-8dxj5" Feb 26 15:46:30 crc kubenswrapper[4907]: I0226 15:46:30.918505 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/afead97a-f5be-4685-913f-e36c4d4c4c62-client-ca\") pod \"route-controller-manager-598f5c59b5-8dxj5\" (UID: \"afead97a-f5be-4685-913f-e36c4d4c4c62\") " pod="openshift-route-controller-manager/route-controller-manager-598f5c59b5-8dxj5" Feb 26 15:46:30 crc kubenswrapper[4907]: I0226 15:46:30.918540 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/afead97a-f5be-4685-913f-e36c4d4c4c62-serving-cert\") pod \"route-controller-manager-598f5c59b5-8dxj5\" (UID: \"afead97a-f5be-4685-913f-e36c4d4c4c62\") " pod="openshift-route-controller-manager/route-controller-manager-598f5c59b5-8dxj5" Feb 26 15:46:30 crc kubenswrapper[4907]: I0226 15:46:30.918560 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-52m6m\" (UniqueName: \"kubernetes.io/projected/afead97a-f5be-4685-913f-e36c4d4c4c62-kube-api-access-52m6m\") pod \"route-controller-manager-598f5c59b5-8dxj5\" (UID: \"afead97a-f5be-4685-913f-e36c4d4c4c62\") " pod="openshift-route-controller-manager/route-controller-manager-598f5c59b5-8dxj5" Feb 26 15:46:30 crc kubenswrapper[4907]: I0226 15:46:30.918603 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-kqtml\" (UID: \"0fefaf3e-d327-41f8-bbbe-94b051a63b19\") " pod="openshift-image-registry/image-registry-697d97f7c8-kqtml" Feb 26 15:46:30 crc kubenswrapper[4907]: E0226 15:46:30.918890 4907 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-26 15:46:31.418878024 +0000 UTC m=+253.937439873 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-kqtml" (UID: "0fefaf3e-d327-41f8-bbbe-94b051a63b19") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 26 15:46:30 crc kubenswrapper[4907]: I0226 15:46:30.920280 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/afead97a-f5be-4685-913f-e36c4d4c4c62-config\") pod \"route-controller-manager-598f5c59b5-8dxj5\" (UID: \"afead97a-f5be-4685-913f-e36c4d4c4c62\") " pod="openshift-route-controller-manager/route-controller-manager-598f5c59b5-8dxj5" Feb 26 15:46:30 crc kubenswrapper[4907]: I0226 15:46:30.920398 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/afead97a-f5be-4685-913f-e36c4d4c4c62-client-ca\") pod \"route-controller-manager-598f5c59b5-8dxj5\" (UID: \"afead97a-f5be-4685-913f-e36c4d4c4c62\") " pod="openshift-route-controller-manager/route-controller-manager-598f5c59b5-8dxj5" Feb 26 15:46:30 crc kubenswrapper[4907]: I0226 15:46:30.930148 4907 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/network-metrics-daemon-zsb5l" podStartSLOduration=195.930133973 podStartE2EDuration="3m15.930133973s" podCreationTimestamp="2026-02-26 15:43:15 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-26 
15:46:30.927841708 +0000 UTC m=+253.446403557" watchObservedRunningTime="2026-02-26 15:46:30.930133973 +0000 UTC m=+253.448695822" Feb 26 15:46:30 crc kubenswrapper[4907]: I0226 15:46:30.931743 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/afead97a-f5be-4685-913f-e36c4d4c4c62-serving-cert\") pod \"route-controller-manager-598f5c59b5-8dxj5\" (UID: \"afead97a-f5be-4685-913f-e36c4d4c4c62\") " pod="openshift-route-controller-manager/route-controller-manager-598f5c59b5-8dxj5" Feb 26 15:46:30 crc kubenswrapper[4907]: I0226 15:46:30.962659 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-52m6m\" (UniqueName: \"kubernetes.io/projected/afead97a-f5be-4685-913f-e36c4d4c4c62-kube-api-access-52m6m\") pod \"route-controller-manager-598f5c59b5-8dxj5\" (UID: \"afead97a-f5be-4685-913f-e36c4d4c4c62\") " pod="openshift-route-controller-manager/route-controller-manager-598f5c59b5-8dxj5" Feb 26 15:46:30 crc kubenswrapper[4907]: I0226 15:46:30.973386 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="hostpath-provisioner/csi-hostpathplugin-l5fqj" event={"ID":"d01c15cd-3103-49df-afdd-e6f6d6f35716","Type":"ContainerStarted","Data":"2324da5d2846a53250e9d46b308e5caf698d07b7bdf56bd3b53cc844fe4145ff"} Feb 26 15:46:31 crc kubenswrapper[4907]: I0226 15:46:30.998182 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-96swm" event={"ID":"54942a44-6e66-4757-8106-bbe836a2d8ca","Type":"ContainerDied","Data":"82fead9ff16b326232a8cf7cec57cfb267f152ad0e74601d2af4a4c7cacd110d"} Feb 26 15:46:31 crc kubenswrapper[4907]: I0226 15:46:30.998230 4907 scope.go:117] "RemoveContainer" containerID="d2644c3d16f2880f068ed3473b9f1e9b0826ed05a9392b6c9676ae3feabfd916" Feb 26 15:46:31 crc kubenswrapper[4907]: I0226 15:46:30.998372 4907 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-96swm" Feb 26 15:46:31 crc kubenswrapper[4907]: I0226 15:46:31.021705 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 26 15:46:31 crc kubenswrapper[4907]: E0226 15:46:31.021870 4907 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-26 15:46:31.52185012 +0000 UTC m=+254.040411969 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 26 15:46:31 crc kubenswrapper[4907]: I0226 15:46:31.022295 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-kqtml\" (UID: \"0fefaf3e-d327-41f8-bbbe-94b051a63b19\") " pod="openshift-image-registry/image-registry-697d97f7c8-kqtml" Feb 26 15:46:31 crc kubenswrapper[4907]: E0226 15:46:31.022718 4907 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-26 15:46:31.5227067 +0000 UTC m=+254.041268549 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-kqtml" (UID: "0fefaf3e-d327-41f8-bbbe-94b051a63b19") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 26 15:46:31 crc kubenswrapper[4907]: I0226 15:46:31.045774 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-879f6c89f-p9vbb" event={"ID":"bbef2e1f-1be1-4624-804c-45892231df1e","Type":"ContainerDied","Data":"cd42c579be0f111294d33e7a5d28454d1b6907a75ca9a4d06696d89d68848920"} Feb 26 15:46:31 crc kubenswrapper[4907]: I0226 15:46:31.045876 4907 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-879f6c89f-p9vbb" Feb 26 15:46:31 crc kubenswrapper[4907]: I0226 15:46:31.060111 4907 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-tqxjz"] Feb 26 15:46:31 crc kubenswrapper[4907]: I0226 15:46:31.061183 4907 generic.go:334] "Generic (PLEG): container finished" podID="0dd74211-40c2-437c-9295-b69e709f81fe" containerID="dbb0a17c19b0ecd0029d1ab15137ff5e45d1ec47832ed90912f8f2b1f23fb7d1" exitCode=0 Feb 26 15:46:31 crc kubenswrapper[4907]: I0226 15:46:31.061267 4907 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-598f5c59b5-8dxj5" Feb 26 15:46:31 crc kubenswrapper[4907]: I0226 15:46:31.073647 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29535345-b7r88" event={"ID":"0dd74211-40c2-437c-9295-b69e709f81fe","Type":"ContainerDied","Data":"dbb0a17c19b0ecd0029d1ab15137ff5e45d1ec47832ed90912f8f2b1f23fb7d1"} Feb 26 15:46:31 crc kubenswrapper[4907]: I0226 15:46:31.124120 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 26 15:46:31 crc kubenswrapper[4907]: E0226 15:46:31.124473 4907 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-26 15:46:31.624446675 +0000 UTC m=+254.143008524 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 26 15:46:31 crc kubenswrapper[4907]: I0226 15:46:31.132013 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-kqtml\" (UID: \"0fefaf3e-d327-41f8-bbbe-94b051a63b19\") " pod="openshift-image-registry/image-registry-697d97f7c8-kqtml" Feb 26 15:46:31 crc kubenswrapper[4907]: E0226 15:46:31.132503 4907 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-26 15:46:31.632489707 +0000 UTC m=+254.151051556 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-kqtml" (UID: "0fefaf3e-d327-41f8-bbbe-94b051a63b19") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 26 15:46:31 crc kubenswrapper[4907]: I0226 15:46:31.139668 4907 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-2v8kx"] Feb 26 15:46:31 crc kubenswrapper[4907]: I0226 15:46:31.147469 4907 plugin_watcher.go:194] "Adding socket path or updating timestamp to desired state cache" path="/var/lib/kubelet/plugins_registry/kubevirt.io.hostpath-provisioner-reg.sock" Feb 26 15:46:31 crc kubenswrapper[4907]: I0226 15:46:31.155309 4907 scope.go:117] "RemoveContainer" containerID="1b0eb3c56ccd014ace15a0c56f6e7ca89dca71b83fe0c0c42006cc66ad1f972c" Feb 26 15:46:31 crc kubenswrapper[4907]: I0226 15:46:31.169118 4907 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-598f5c59b5-8dxj5" Feb 26 15:46:31 crc kubenswrapper[4907]: I0226 15:46:31.171710 4907 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6576b87f9c-96swm"] Feb 26 15:46:31 crc kubenswrapper[4907]: I0226 15:46:31.184755 4907 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6576b87f9c-96swm"] Feb 26 15:46:31 crc kubenswrapper[4907]: I0226 15:46:31.188380 4907 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-879f6c89f-p9vbb"] Feb 26 15:46:31 crc kubenswrapper[4907]: I0226 15:46:31.193484 4907 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-controller-manager/controller-manager-879f6c89f-p9vbb"] Feb 26 15:46:31 crc kubenswrapper[4907]: I0226 15:46:31.232933 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-52m6m\" (UniqueName: \"kubernetes.io/projected/afead97a-f5be-4685-913f-e36c4d4c4c62-kube-api-access-52m6m\") pod \"afead97a-f5be-4685-913f-e36c4d4c4c62\" (UID: \"afead97a-f5be-4685-913f-e36c4d4c4c62\") " Feb 26 15:46:31 crc kubenswrapper[4907]: I0226 15:46:31.233004 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/afead97a-f5be-4685-913f-e36c4d4c4c62-config\") pod \"afead97a-f5be-4685-913f-e36c4d4c4c62\" (UID: \"afead97a-f5be-4685-913f-e36c4d4c4c62\") " Feb 26 15:46:31 crc kubenswrapper[4907]: I0226 15:46:31.233032 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/afead97a-f5be-4685-913f-e36c4d4c4c62-serving-cert\") pod \"afead97a-f5be-4685-913f-e36c4d4c4c62\" (UID: \"afead97a-f5be-4685-913f-e36c4d4c4c62\") " Feb 26 15:46:31 crc kubenswrapper[4907]: I0226 
15:46:31.233061 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/afead97a-f5be-4685-913f-e36c4d4c4c62-client-ca\") pod \"afead97a-f5be-4685-913f-e36c4d4c4c62\" (UID: \"afead97a-f5be-4685-913f-e36c4d4c4c62\") " Feb 26 15:46:31 crc kubenswrapper[4907]: I0226 15:46:31.233182 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 26 15:46:31 crc kubenswrapper[4907]: I0226 15:46:31.233679 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/afead97a-f5be-4685-913f-e36c4d4c4c62-config" (OuterVolumeSpecName: "config") pod "afead97a-f5be-4685-913f-e36c4d4c4c62" (UID: "afead97a-f5be-4685-913f-e36c4d4c4c62"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 26 15:46:31 crc kubenswrapper[4907]: I0226 15:46:31.234174 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/afead97a-f5be-4685-913f-e36c4d4c4c62-client-ca" (OuterVolumeSpecName: "client-ca") pod "afead97a-f5be-4685-913f-e36c4d4c4c62" (UID: "afead97a-f5be-4685-913f-e36c4d4c4c62"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 26 15:46:31 crc kubenswrapper[4907]: E0226 15:46:31.234242 4907 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-26 15:46:31.734228083 +0000 UTC m=+254.252789932 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 26 15:46:31 crc kubenswrapper[4907]: I0226 15:46:31.236609 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/afead97a-f5be-4685-913f-e36c4d4c4c62-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "afead97a-f5be-4685-913f-e36c4d4c4c62" (UID: "afead97a-f5be-4685-913f-e36c4d4c4c62"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 26 15:46:31 crc kubenswrapper[4907]: I0226 15:46:31.239389 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/afead97a-f5be-4685-913f-e36c4d4c4c62-kube-api-access-52m6m" (OuterVolumeSpecName: "kube-api-access-52m6m") pod "afead97a-f5be-4685-913f-e36c4d4c4c62" (UID: "afead97a-f5be-4685-913f-e36c4d4c4c62"). InnerVolumeSpecName "kube-api-access-52m6m". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 26 15:46:31 crc kubenswrapper[4907]: I0226 15:46:31.288994 4907 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console-operator/console-operator-58897d9998-sjflz" Feb 26 15:46:31 crc kubenswrapper[4907]: I0226 15:46:31.336939 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-kqtml\" (UID: \"0fefaf3e-d327-41f8-bbbe-94b051a63b19\") " pod="openshift-image-registry/image-registry-697d97f7c8-kqtml" Feb 26 15:46:31 crc kubenswrapper[4907]: I0226 15:46:31.337045 4907 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-52m6m\" (UniqueName: \"kubernetes.io/projected/afead97a-f5be-4685-913f-e36c4d4c4c62-kube-api-access-52m6m\") on node \"crc\" DevicePath \"\"" Feb 26 15:46:31 crc kubenswrapper[4907]: I0226 15:46:31.337059 4907 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/afead97a-f5be-4685-913f-e36c4d4c4c62-config\") on node \"crc\" DevicePath \"\"" Feb 26 15:46:31 crc kubenswrapper[4907]: I0226 15:46:31.337069 4907 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/afead97a-f5be-4685-913f-e36c4d4c4c62-serving-cert\") on node \"crc\" DevicePath \"\"" Feb 26 15:46:31 crc kubenswrapper[4907]: I0226 15:46:31.337077 4907 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/afead97a-f5be-4685-913f-e36c4d4c4c62-client-ca\") on node \"crc\" DevicePath \"\"" Feb 26 15:46:31 crc kubenswrapper[4907]: E0226 15:46:31.337301 4907 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: 
nodeName:}" failed. No retries permitted until 2026-02-26 15:46:31.837290839 +0000 UTC m=+254.355852688 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-kqtml" (UID: "0fefaf3e-d327-41f8-bbbe-94b051a63b19") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 26 15:46:31 crc kubenswrapper[4907]: I0226 15:46:31.348918 4907 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/marketplace-operator-79b997595-dvcn5" Feb 26 15:46:31 crc kubenswrapper[4907]: I0226 15:46:31.370405 4907 reconciler.go:161] "OperationExecutor.RegisterPlugin started" plugin={"SocketPath":"/var/lib/kubelet/plugins_registry/kubevirt.io.hostpath-provisioner-reg.sock","Timestamp":"2026-02-26T15:46:31.147494993Z","Handler":null,"Name":""} Feb 26 15:46:31 crc kubenswrapper[4907]: I0226 15:46:31.393981 4907 csi_plugin.go:100] kubernetes.io/csi: Trying to validate a new CSI Driver with name: kubevirt.io.hostpath-provisioner endpoint: /var/lib/kubelet/plugins/csi-hostpath/csi.sock versions: 1.0.0 Feb 26 15:46:31 crc kubenswrapper[4907]: I0226 15:46:31.394013 4907 csi_plugin.go:113] kubernetes.io/csi: Register new plugin with name: kubevirt.io.hostpath-provisioner at endpoint: /var/lib/kubelet/plugins/csi-hostpath/csi.sock Feb 26 15:46:31 crc kubenswrapper[4907]: I0226 15:46:31.438293 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 26 15:46:31 crc 
kubenswrapper[4907]: I0226 15:46:31.439338 4907 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-z6m64" Feb 26 15:46:31 crc kubenswrapper[4907]: I0226 15:46:31.448071 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (OuterVolumeSpecName: "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8". PluginName "kubernetes.io/csi", VolumeGidValue "" Feb 26 15:46:31 crc kubenswrapper[4907]: I0226 15:46:31.493576 4907 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-22zr8"] Feb 26 15:46:31 crc kubenswrapper[4907]: I0226 15:46:31.503154 4907 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-qhfr7"] Feb 26 15:46:31 crc kubenswrapper[4907]: I0226 15:46:31.541312 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-kqtml\" (UID: \"0fefaf3e-d327-41f8-bbbe-94b051a63b19\") " pod="openshift-image-registry/image-registry-697d97f7c8-kqtml" Feb 26 15:46:31 crc kubenswrapper[4907]: I0226 15:46:31.567067 4907 csi_attacher.go:380] kubernetes.io/csi: attacher.MountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping MountDevice... 
Feb 26 15:46:31 crc kubenswrapper[4907]: I0226 15:46:31.567105 4907 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-kqtml\" (UID: \"0fefaf3e-d327-41f8-bbbe-94b051a63b19\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/1f4776af88835e41c12b831b4c9fed40233456d14189815a54dbe7f892fc1983/globalmount\"" pod="openshift-image-registry/image-registry-697d97f7c8-kqtml" Feb 26 15:46:31 crc kubenswrapper[4907]: I0226 15:46:31.629791 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-kqtml\" (UID: \"0fefaf3e-d327-41f8-bbbe-94b051a63b19\") " pod="openshift-image-registry/image-registry-697d97f7c8-kqtml" Feb 26 15:46:31 crc kubenswrapper[4907]: I0226 15:46:31.657132 4907 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-controller-manager/controller-manager-774c684776-b9h2m"] Feb 26 15:46:31 crc kubenswrapper[4907]: I0226 15:46:31.658123 4907 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-controller-manager/controller-manager-774c684776-b9h2m" Feb 26 15:46:31 crc kubenswrapper[4907]: I0226 15:46:31.665188 4907 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-service-ca.crt" Feb 26 15:46:31 crc kubenswrapper[4907]: I0226 15:46:31.665966 4907 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"openshift-controller-manager-sa-dockercfg-msq4c" Feb 26 15:46:31 crc kubenswrapper[4907]: I0226 15:46:31.666093 4907 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"config" Feb 26 15:46:31 crc kubenswrapper[4907]: I0226 15:46:31.666286 4907 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"client-ca" Feb 26 15:46:31 crc kubenswrapper[4907]: I0226 15:46:31.666434 4907 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"kube-root-ca.crt" Feb 26 15:46:31 crc kubenswrapper[4907]: I0226 15:46:31.670108 4907 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"serving-cert" Feb 26 15:46:31 crc kubenswrapper[4907]: I0226 15:46:31.670539 4907 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-global-ca" Feb 26 15:46:31 crc kubenswrapper[4907]: I0226 15:46:31.671153 4907 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-774c684776-b9h2m"] Feb 26 15:46:31 crc kubenswrapper[4907]: I0226 15:46:31.708247 4907 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ingress/router-default-5444994796-hqs2t" Feb 26 15:46:31 crc kubenswrapper[4907]: I0226 15:46:31.714121 4907 patch_prober.go:28] interesting pod/router-default-5444994796-hqs2t container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe 
failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Feb 26 15:46:31 crc kubenswrapper[4907]: [-]has-synced failed: reason withheld Feb 26 15:46:31 crc kubenswrapper[4907]: [+]process-running ok Feb 26 15:46:31 crc kubenswrapper[4907]: healthz check failed Feb 26 15:46:31 crc kubenswrapper[4907]: I0226 15:46:31.714186 4907 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-hqs2t" podUID="def12a12-3cf0-4694-a957-3e69aa18f880" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Feb 26 15:46:31 crc kubenswrapper[4907]: I0226 15:46:31.744711 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/d36a8adb-6b03-4d38-9e3d-28215243982d-proxy-ca-bundles\") pod \"controller-manager-774c684776-b9h2m\" (UID: \"d36a8adb-6b03-4d38-9e3d-28215243982d\") " pod="openshift-controller-manager/controller-manager-774c684776-b9h2m" Feb 26 15:46:31 crc kubenswrapper[4907]: I0226 15:46:31.744781 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d36a8adb-6b03-4d38-9e3d-28215243982d-config\") pod \"controller-manager-774c684776-b9h2m\" (UID: \"d36a8adb-6b03-4d38-9e3d-28215243982d\") " pod="openshift-controller-manager/controller-manager-774c684776-b9h2m" Feb 26 15:46:31 crc kubenswrapper[4907]: I0226 15:46:31.744803 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-c8bx6\" (UniqueName: \"kubernetes.io/projected/d36a8adb-6b03-4d38-9e3d-28215243982d-kube-api-access-c8bx6\") pod \"controller-manager-774c684776-b9h2m\" (UID: \"d36a8adb-6b03-4d38-9e3d-28215243982d\") " pod="openshift-controller-manager/controller-manager-774c684776-b9h2m" Feb 26 15:46:31 crc kubenswrapper[4907]: I0226 15:46:31.744826 4907 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/d36a8adb-6b03-4d38-9e3d-28215243982d-client-ca\") pod \"controller-manager-774c684776-b9h2m\" (UID: \"d36a8adb-6b03-4d38-9e3d-28215243982d\") " pod="openshift-controller-manager/controller-manager-774c684776-b9h2m" Feb 26 15:46:31 crc kubenswrapper[4907]: I0226 15:46:31.744852 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/d36a8adb-6b03-4d38-9e3d-28215243982d-serving-cert\") pod \"controller-manager-774c684776-b9h2m\" (UID: \"d36a8adb-6b03-4d38-9e3d-28215243982d\") " pod="openshift-controller-manager/controller-manager-774c684776-b9h2m" Feb 26 15:46:31 crc kubenswrapper[4907]: I0226 15:46:31.776921 4907 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-fcwbm"] Feb 26 15:46:31 crc kubenswrapper[4907]: I0226 15:46:31.782550 4907 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-fcwbm" Feb 26 15:46:31 crc kubenswrapper[4907]: I0226 15:46:31.785966 4907 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"redhat-marketplace-dockercfg-x2ctb" Feb 26 15:46:31 crc kubenswrapper[4907]: I0226 15:46:31.790275 4907 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-fcwbm"] Feb 26 15:46:31 crc kubenswrapper[4907]: I0226 15:46:31.825932 4907 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/image-registry-697d97f7c8-kqtml" Feb 26 15:46:31 crc kubenswrapper[4907]: I0226 15:46:31.846216 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d36a8adb-6b03-4d38-9e3d-28215243982d-config\") pod \"controller-manager-774c684776-b9h2m\" (UID: \"d36a8adb-6b03-4d38-9e3d-28215243982d\") " pod="openshift-controller-manager/controller-manager-774c684776-b9h2m" Feb 26 15:46:31 crc kubenswrapper[4907]: I0226 15:46:31.846257 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-c8bx6\" (UniqueName: \"kubernetes.io/projected/d36a8adb-6b03-4d38-9e3d-28215243982d-kube-api-access-c8bx6\") pod \"controller-manager-774c684776-b9h2m\" (UID: \"d36a8adb-6b03-4d38-9e3d-28215243982d\") " pod="openshift-controller-manager/controller-manager-774c684776-b9h2m" Feb 26 15:46:31 crc kubenswrapper[4907]: I0226 15:46:31.846281 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/6c70b66e-978a-4c7e-9892-5579869aa740-catalog-content\") pod \"redhat-marketplace-fcwbm\" (UID: \"6c70b66e-978a-4c7e-9892-5579869aa740\") " pod="openshift-marketplace/redhat-marketplace-fcwbm" Feb 26 15:46:31 crc kubenswrapper[4907]: I0226 15:46:31.846318 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/d36a8adb-6b03-4d38-9e3d-28215243982d-client-ca\") pod \"controller-manager-774c684776-b9h2m\" (UID: \"d36a8adb-6b03-4d38-9e3d-28215243982d\") " pod="openshift-controller-manager/controller-manager-774c684776-b9h2m" Feb 26 15:46:31 crc kubenswrapper[4907]: I0226 15:46:31.846347 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: 
\"kubernetes.io/empty-dir/6c70b66e-978a-4c7e-9892-5579869aa740-utilities\") pod \"redhat-marketplace-fcwbm\" (UID: \"6c70b66e-978a-4c7e-9892-5579869aa740\") " pod="openshift-marketplace/redhat-marketplace-fcwbm" Feb 26 15:46:31 crc kubenswrapper[4907]: I0226 15:46:31.846392 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/d36a8adb-6b03-4d38-9e3d-28215243982d-serving-cert\") pod \"controller-manager-774c684776-b9h2m\" (UID: \"d36a8adb-6b03-4d38-9e3d-28215243982d\") " pod="openshift-controller-manager/controller-manager-774c684776-b9h2m" Feb 26 15:46:31 crc kubenswrapper[4907]: I0226 15:46:31.846416 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/d36a8adb-6b03-4d38-9e3d-28215243982d-proxy-ca-bundles\") pod \"controller-manager-774c684776-b9h2m\" (UID: \"d36a8adb-6b03-4d38-9e3d-28215243982d\") " pod="openshift-controller-manager/controller-manager-774c684776-b9h2m" Feb 26 15:46:31 crc kubenswrapper[4907]: I0226 15:46:31.846466 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bkv2t\" (UniqueName: \"kubernetes.io/projected/6c70b66e-978a-4c7e-9892-5579869aa740-kube-api-access-bkv2t\") pod \"redhat-marketplace-fcwbm\" (UID: \"6c70b66e-978a-4c7e-9892-5579869aa740\") " pod="openshift-marketplace/redhat-marketplace-fcwbm" Feb 26 15:46:31 crc kubenswrapper[4907]: I0226 15:46:31.847343 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/d36a8adb-6b03-4d38-9e3d-28215243982d-client-ca\") pod \"controller-manager-774c684776-b9h2m\" (UID: \"d36a8adb-6b03-4d38-9e3d-28215243982d\") " pod="openshift-controller-manager/controller-manager-774c684776-b9h2m" Feb 26 15:46:31 crc kubenswrapper[4907]: I0226 15:46:31.848142 4907 operation_generator.go:637] "MountVolume.SetUp 
succeeded for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/d36a8adb-6b03-4d38-9e3d-28215243982d-proxy-ca-bundles\") pod \"controller-manager-774c684776-b9h2m\" (UID: \"d36a8adb-6b03-4d38-9e3d-28215243982d\") " pod="openshift-controller-manager/controller-manager-774c684776-b9h2m" Feb 26 15:46:31 crc kubenswrapper[4907]: I0226 15:46:31.848723 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d36a8adb-6b03-4d38-9e3d-28215243982d-config\") pod \"controller-manager-774c684776-b9h2m\" (UID: \"d36a8adb-6b03-4d38-9e3d-28215243982d\") " pod="openshift-controller-manager/controller-manager-774c684776-b9h2m" Feb 26 15:46:31 crc kubenswrapper[4907]: I0226 15:46:31.860861 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/d36a8adb-6b03-4d38-9e3d-28215243982d-serving-cert\") pod \"controller-manager-774c684776-b9h2m\" (UID: \"d36a8adb-6b03-4d38-9e3d-28215243982d\") " pod="openshift-controller-manager/controller-manager-774c684776-b9h2m" Feb 26 15:46:31 crc kubenswrapper[4907]: I0226 15:46:31.864113 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-c8bx6\" (UniqueName: \"kubernetes.io/projected/d36a8adb-6b03-4d38-9e3d-28215243982d-kube-api-access-c8bx6\") pod \"controller-manager-774c684776-b9h2m\" (UID: \"d36a8adb-6b03-4d38-9e3d-28215243982d\") " pod="openshift-controller-manager/controller-manager-774c684776-b9h2m" Feb 26 15:46:31 crc kubenswrapper[4907]: I0226 15:46:31.947826 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bkv2t\" (UniqueName: \"kubernetes.io/projected/6c70b66e-978a-4c7e-9892-5579869aa740-kube-api-access-bkv2t\") pod \"redhat-marketplace-fcwbm\" (UID: \"6c70b66e-978a-4c7e-9892-5579869aa740\") " pod="openshift-marketplace/redhat-marketplace-fcwbm" Feb 26 15:46:31 crc kubenswrapper[4907]: I0226 
15:46:31.948146 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/6c70b66e-978a-4c7e-9892-5579869aa740-catalog-content\") pod \"redhat-marketplace-fcwbm\" (UID: \"6c70b66e-978a-4c7e-9892-5579869aa740\") " pod="openshift-marketplace/redhat-marketplace-fcwbm" Feb 26 15:46:31 crc kubenswrapper[4907]: I0226 15:46:31.948185 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/6c70b66e-978a-4c7e-9892-5579869aa740-utilities\") pod \"redhat-marketplace-fcwbm\" (UID: \"6c70b66e-978a-4c7e-9892-5579869aa740\") " pod="openshift-marketplace/redhat-marketplace-fcwbm" Feb 26 15:46:31 crc kubenswrapper[4907]: I0226 15:46:31.949286 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/6c70b66e-978a-4c7e-9892-5579869aa740-utilities\") pod \"redhat-marketplace-fcwbm\" (UID: \"6c70b66e-978a-4c7e-9892-5579869aa740\") " pod="openshift-marketplace/redhat-marketplace-fcwbm" Feb 26 15:46:31 crc kubenswrapper[4907]: I0226 15:46:31.951303 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/6c70b66e-978a-4c7e-9892-5579869aa740-catalog-content\") pod \"redhat-marketplace-fcwbm\" (UID: \"6c70b66e-978a-4c7e-9892-5579869aa740\") " pod="openshift-marketplace/redhat-marketplace-fcwbm" Feb 26 15:46:31 crc kubenswrapper[4907]: I0226 15:46:31.973053 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bkv2t\" (UniqueName: \"kubernetes.io/projected/6c70b66e-978a-4c7e-9892-5579869aa740-kube-api-access-bkv2t\") pod \"redhat-marketplace-fcwbm\" (UID: \"6c70b66e-978a-4c7e-9892-5579869aa740\") " pod="openshift-marketplace/redhat-marketplace-fcwbm" Feb 26 15:46:31 crc kubenswrapper[4907]: I0226 15:46:31.991757 4907 util.go:30] "No sandbox for pod can be 
found. Need to start a new one" pod="openshift-controller-manager/controller-manager-774c684776-b9h2m" Feb 26 15:46:32 crc kubenswrapper[4907]: I0226 15:46:32.065032 4907 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-697d97f7c8-kqtml"] Feb 26 15:46:32 crc kubenswrapper[4907]: I0226 15:46:32.068867 4907 generic.go:334] "Generic (PLEG): container finished" podID="763dfaad-6b70-4ea8-a5ba-b4729dd1dcf2" containerID="819c58d2ad5fa6ba3642ee404f839db9d4e3eb5e6adb6439fa9d1995cac97649" exitCode=0 Feb 26 15:46:32 crc kubenswrapper[4907]: I0226 15:46:32.068944 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-2v8kx" event={"ID":"763dfaad-6b70-4ea8-a5ba-b4729dd1dcf2","Type":"ContainerDied","Data":"819c58d2ad5fa6ba3642ee404f839db9d4e3eb5e6adb6439fa9d1995cac97649"} Feb 26 15:46:32 crc kubenswrapper[4907]: I0226 15:46:32.068978 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-2v8kx" event={"ID":"763dfaad-6b70-4ea8-a5ba-b4729dd1dcf2","Type":"ContainerStarted","Data":"9a14fdd18c8fdf23b4449d75fda4a6de3abba541668a78d14925046e7c3a39a7"} Feb 26 15:46:32 crc kubenswrapper[4907]: I0226 15:46:32.085350 4907 generic.go:334] "Generic (PLEG): container finished" podID="e0e96b15-45f7-47f1-878e-57914ef18916" containerID="37508190b8d35d7607acf4d938f773e568560ddcd0367749779bcc9bc0dd24b1" exitCode=0 Feb 26 15:46:32 crc kubenswrapper[4907]: I0226 15:46:32.085414 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-tqxjz" event={"ID":"e0e96b15-45f7-47f1-878e-57914ef18916","Type":"ContainerDied","Data":"37508190b8d35d7607acf4d938f773e568560ddcd0367749779bcc9bc0dd24b1"} Feb 26 15:46:32 crc kubenswrapper[4907]: I0226 15:46:32.086349 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-tqxjz" 
event={"ID":"e0e96b15-45f7-47f1-878e-57914ef18916","Type":"ContainerStarted","Data":"f2b308fc94ead912b6e64ba7c506bcc5ba9109de65514b1841ba893f7ccf2ca5"} Feb 26 15:46:32 crc kubenswrapper[4907]: I0226 15:46:32.090829 4907 generic.go:334] "Generic (PLEG): container finished" podID="34138ff4-16e6-4f79-bd8f-0c8cb132ebde" containerID="d1e260cd6583d39c97d36772d017b1b00c00e0a1e1e45aa09bf0edbe96a62d09" exitCode=0 Feb 26 15:46:32 crc kubenswrapper[4907]: I0226 15:46:32.090886 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-qhfr7" event={"ID":"34138ff4-16e6-4f79-bd8f-0c8cb132ebde","Type":"ContainerDied","Data":"d1e260cd6583d39c97d36772d017b1b00c00e0a1e1e45aa09bf0edbe96a62d09"} Feb 26 15:46:32 crc kubenswrapper[4907]: I0226 15:46:32.090910 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-qhfr7" event={"ID":"34138ff4-16e6-4f79-bd8f-0c8cb132ebde","Type":"ContainerStarted","Data":"33f5e5572eeaa2f340c830cfb7b0b9c827655147da7e0a111dea508f91d22b9a"} Feb 26 15:46:32 crc kubenswrapper[4907]: W0226 15:46:32.096709 4907 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod0fefaf3e_d327_41f8_bbbe_94b051a63b19.slice/crio-5e4bedb35215aa589170c696338aa1213956872fc1adf190eeb23b94a8c5bc35 WatchSource:0}: Error finding container 5e4bedb35215aa589170c696338aa1213956872fc1adf190eeb23b94a8c5bc35: Status 404 returned error can't find the container with id 5e4bedb35215aa589170c696338aa1213956872fc1adf190eeb23b94a8c5bc35 Feb 26 15:46:32 crc kubenswrapper[4907]: I0226 15:46:32.098740 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="hostpath-provisioner/csi-hostpathplugin-l5fqj" event={"ID":"d01c15cd-3103-49df-afdd-e6f6d6f35716","Type":"ContainerStarted","Data":"4425f2378aac2545c05b96026e8d960bc291a4cdc567fc58ac8214b315594166"} Feb 26 15:46:32 crc kubenswrapper[4907]: I0226 15:46:32.098776 4907 kubelet.go:2453] 
"SyncLoop (PLEG): event for pod" pod="hostpath-provisioner/csi-hostpathplugin-l5fqj" event={"ID":"d01c15cd-3103-49df-afdd-e6f6d6f35716","Type":"ContainerStarted","Data":"4a5dd7416e1289adc023928da9c85d3cfa2af1c3aa4cca431658cc060d343657"} Feb 26 15:46:32 crc kubenswrapper[4907]: I0226 15:46:32.113836 4907 generic.go:334] "Generic (PLEG): container finished" podID="4d3f9fc7-85b9-4095-af0d-7993e681ab2a" containerID="c2c08520c50b5de1170decee9bdf0e675f941d01781f08e93d921a3eca83bc15" exitCode=0 Feb 26 15:46:32 crc kubenswrapper[4907]: I0226 15:46:32.113908 4907 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-598f5c59b5-8dxj5" Feb 26 15:46:32 crc kubenswrapper[4907]: I0226 15:46:32.115633 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-22zr8" event={"ID":"4d3f9fc7-85b9-4095-af0d-7993e681ab2a","Type":"ContainerDied","Data":"c2c08520c50b5de1170decee9bdf0e675f941d01781f08e93d921a3eca83bc15"} Feb 26 15:46:32 crc kubenswrapper[4907]: I0226 15:46:32.115671 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-22zr8" event={"ID":"4d3f9fc7-85b9-4095-af0d-7993e681ab2a","Type":"ContainerStarted","Data":"fcfcc54b656f7ed5e451008138c11019c142d336aeba7be471de266a08620106"} Feb 26 15:46:32 crc kubenswrapper[4907]: I0226 15:46:32.129954 4907 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-fcwbm" Feb 26 15:46:32 crc kubenswrapper[4907]: I0226 15:46:32.138190 4907 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="54942a44-6e66-4757-8106-bbe836a2d8ca" path="/var/lib/kubelet/pods/54942a44-6e66-4757-8106-bbe836a2d8ca/volumes" Feb 26 15:46:32 crc kubenswrapper[4907]: I0226 15:46:32.138920 4907 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8f668bae-612b-4b75-9490-919e737c6a3b" path="/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes" Feb 26 15:46:32 crc kubenswrapper[4907]: I0226 15:46:32.139605 4907 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="bbef2e1f-1be1-4624-804c-45892231df1e" path="/var/lib/kubelet/pods/bbef2e1f-1be1-4624-804c-45892231df1e/volumes" Feb 26 15:46:32 crc kubenswrapper[4907]: I0226 15:46:32.152339 4907 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="hostpath-provisioner/csi-hostpathplugin-l5fqj" podStartSLOduration=14.152321052 podStartE2EDuration="14.152321052s" podCreationTimestamp="2026-02-26 15:46:18 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-26 15:46:32.151996654 +0000 UTC m=+254.670558503" watchObservedRunningTime="2026-02-26 15:46:32.152321052 +0000 UTC m=+254.670882901" Feb 26 15:46:32 crc kubenswrapper[4907]: I0226 15:46:32.174984 4907 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-mnd2r"] Feb 26 15:46:32 crc kubenswrapper[4907]: I0226 15:46:32.182941 4907 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-mnd2r" Feb 26 15:46:32 crc kubenswrapper[4907]: I0226 15:46:32.193417 4907 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-mnd2r"] Feb 26 15:46:32 crc kubenswrapper[4907]: I0226 15:46:32.219835 4907 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-598f5c59b5-8dxj5"] Feb 26 15:46:32 crc kubenswrapper[4907]: I0226 15:46:32.223710 4907 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-route-controller-manager/route-controller-manager-55fc755786-qqt2m"] Feb 26 15:46:32 crc kubenswrapper[4907]: I0226 15:46:32.224254 4907 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-55fc755786-qqt2m" Feb 26 15:46:32 crc kubenswrapper[4907]: I0226 15:46:32.227814 4907 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"openshift-service-ca.crt" Feb 26 15:46:32 crc kubenswrapper[4907]: I0226 15:46:32.228020 4907 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"serving-cert" Feb 26 15:46:32 crc kubenswrapper[4907]: I0226 15:46:32.238482 4907 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"route-controller-manager-sa-dockercfg-h2zr2" Feb 26 15:46:32 crc kubenswrapper[4907]: I0226 15:46:32.238781 4907 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"config" Feb 26 15:46:32 crc kubenswrapper[4907]: I0226 15:46:32.238920 4907 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"kube-root-ca.crt" Feb 26 15:46:32 crc kubenswrapper[4907]: I0226 15:46:32.239753 4907 reflector.go:368] Caches populated for *v1.ConfigMap from 
object-"openshift-route-controller-manager"/"client-ca" Feb 26 15:46:32 crc kubenswrapper[4907]: I0226 15:46:32.255179 4907 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-598f5c59b5-8dxj5"] Feb 26 15:46:32 crc kubenswrapper[4907]: I0226 15:46:32.266341 4907 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-55fc755786-qqt2m"] Feb 26 15:46:32 crc kubenswrapper[4907]: I0226 15:46:32.266421 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/2dc40859-37ff-41ea-88d7-6131b35ceebf-catalog-content\") pod \"redhat-marketplace-mnd2r\" (UID: \"2dc40859-37ff-41ea-88d7-6131b35ceebf\") " pod="openshift-marketplace/redhat-marketplace-mnd2r" Feb 26 15:46:32 crc kubenswrapper[4907]: I0226 15:46:32.266679 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-x2p9w\" (UniqueName: \"kubernetes.io/projected/2dc40859-37ff-41ea-88d7-6131b35ceebf-kube-api-access-x2p9w\") pod \"redhat-marketplace-mnd2r\" (UID: \"2dc40859-37ff-41ea-88d7-6131b35ceebf\") " pod="openshift-marketplace/redhat-marketplace-mnd2r" Feb 26 15:46:32 crc kubenswrapper[4907]: I0226 15:46:32.266770 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/2dc40859-37ff-41ea-88d7-6131b35ceebf-utilities\") pod \"redhat-marketplace-mnd2r\" (UID: \"2dc40859-37ff-41ea-88d7-6131b35ceebf\") " pod="openshift-marketplace/redhat-marketplace-mnd2r" Feb 26 15:46:32 crc kubenswrapper[4907]: I0226 15:46:32.368768 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/2dc40859-37ff-41ea-88d7-6131b35ceebf-utilities\") pod \"redhat-marketplace-mnd2r\" (UID: 
\"2dc40859-37ff-41ea-88d7-6131b35ceebf\") " pod="openshift-marketplace/redhat-marketplace-mnd2r" Feb 26 15:46:32 crc kubenswrapper[4907]: I0226 15:46:32.368825 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/d2396c40-c93b-4d72-afe1-33e0ae6475c4-client-ca\") pod \"route-controller-manager-55fc755786-qqt2m\" (UID: \"d2396c40-c93b-4d72-afe1-33e0ae6475c4\") " pod="openshift-route-controller-manager/route-controller-manager-55fc755786-qqt2m" Feb 26 15:46:32 crc kubenswrapper[4907]: I0226 15:46:32.368856 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d2396c40-c93b-4d72-afe1-33e0ae6475c4-config\") pod \"route-controller-manager-55fc755786-qqt2m\" (UID: \"d2396c40-c93b-4d72-afe1-33e0ae6475c4\") " pod="openshift-route-controller-manager/route-controller-manager-55fc755786-qqt2m" Feb 26 15:46:32 crc kubenswrapper[4907]: I0226 15:46:32.368878 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5446r\" (UniqueName: \"kubernetes.io/projected/d2396c40-c93b-4d72-afe1-33e0ae6475c4-kube-api-access-5446r\") pod \"route-controller-manager-55fc755786-qqt2m\" (UID: \"d2396c40-c93b-4d72-afe1-33e0ae6475c4\") " pod="openshift-route-controller-manager/route-controller-manager-55fc755786-qqt2m" Feb 26 15:46:32 crc kubenswrapper[4907]: I0226 15:46:32.368901 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/2dc40859-37ff-41ea-88d7-6131b35ceebf-catalog-content\") pod \"redhat-marketplace-mnd2r\" (UID: \"2dc40859-37ff-41ea-88d7-6131b35ceebf\") " pod="openshift-marketplace/redhat-marketplace-mnd2r" Feb 26 15:46:32 crc kubenswrapper[4907]: I0226 15:46:32.368932 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for 
volume \"kube-api-access-x2p9w\" (UniqueName: \"kubernetes.io/projected/2dc40859-37ff-41ea-88d7-6131b35ceebf-kube-api-access-x2p9w\") pod \"redhat-marketplace-mnd2r\" (UID: \"2dc40859-37ff-41ea-88d7-6131b35ceebf\") " pod="openshift-marketplace/redhat-marketplace-mnd2r" Feb 26 15:46:32 crc kubenswrapper[4907]: I0226 15:46:32.368949 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/d2396c40-c93b-4d72-afe1-33e0ae6475c4-serving-cert\") pod \"route-controller-manager-55fc755786-qqt2m\" (UID: \"d2396c40-c93b-4d72-afe1-33e0ae6475c4\") " pod="openshift-route-controller-manager/route-controller-manager-55fc755786-qqt2m" Feb 26 15:46:32 crc kubenswrapper[4907]: I0226 15:46:32.369222 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/2dc40859-37ff-41ea-88d7-6131b35ceebf-utilities\") pod \"redhat-marketplace-mnd2r\" (UID: \"2dc40859-37ff-41ea-88d7-6131b35ceebf\") " pod="openshift-marketplace/redhat-marketplace-mnd2r" Feb 26 15:46:32 crc kubenswrapper[4907]: I0226 15:46:32.369245 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/2dc40859-37ff-41ea-88d7-6131b35ceebf-catalog-content\") pod \"redhat-marketplace-mnd2r\" (UID: \"2dc40859-37ff-41ea-88d7-6131b35ceebf\") " pod="openshift-marketplace/redhat-marketplace-mnd2r" Feb 26 15:46:32 crc kubenswrapper[4907]: I0226 15:46:32.401575 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-x2p9w\" (UniqueName: \"kubernetes.io/projected/2dc40859-37ff-41ea-88d7-6131b35ceebf-kube-api-access-x2p9w\") pod \"redhat-marketplace-mnd2r\" (UID: \"2dc40859-37ff-41ea-88d7-6131b35ceebf\") " pod="openshift-marketplace/redhat-marketplace-mnd2r" Feb 26 15:46:32 crc kubenswrapper[4907]: I0226 15:46:32.468969 4907 ???:1] "http: TLS handshake error from 
192.168.126.11:33960: no serving certificate available for the kubelet" Feb 26 15:46:32 crc kubenswrapper[4907]: I0226 15:46:32.474178 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d2396c40-c93b-4d72-afe1-33e0ae6475c4-config\") pod \"route-controller-manager-55fc755786-qqt2m\" (UID: \"d2396c40-c93b-4d72-afe1-33e0ae6475c4\") " pod="openshift-route-controller-manager/route-controller-manager-55fc755786-qqt2m" Feb 26 15:46:32 crc kubenswrapper[4907]: I0226 15:46:32.474224 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5446r\" (UniqueName: \"kubernetes.io/projected/d2396c40-c93b-4d72-afe1-33e0ae6475c4-kube-api-access-5446r\") pod \"route-controller-manager-55fc755786-qqt2m\" (UID: \"d2396c40-c93b-4d72-afe1-33e0ae6475c4\") " pod="openshift-route-controller-manager/route-controller-manager-55fc755786-qqt2m" Feb 26 15:46:32 crc kubenswrapper[4907]: I0226 15:46:32.474273 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/d2396c40-c93b-4d72-afe1-33e0ae6475c4-serving-cert\") pod \"route-controller-manager-55fc755786-qqt2m\" (UID: \"d2396c40-c93b-4d72-afe1-33e0ae6475c4\") " pod="openshift-route-controller-manager/route-controller-manager-55fc755786-qqt2m" Feb 26 15:46:32 crc kubenswrapper[4907]: I0226 15:46:32.474310 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/d2396c40-c93b-4d72-afe1-33e0ae6475c4-client-ca\") pod \"route-controller-manager-55fc755786-qqt2m\" (UID: \"d2396c40-c93b-4d72-afe1-33e0ae6475c4\") " pod="openshift-route-controller-manager/route-controller-manager-55fc755786-qqt2m" Feb 26 15:46:32 crc kubenswrapper[4907]: I0226 15:46:32.475083 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: 
\"kubernetes.io/configmap/d2396c40-c93b-4d72-afe1-33e0ae6475c4-client-ca\") pod \"route-controller-manager-55fc755786-qqt2m\" (UID: \"d2396c40-c93b-4d72-afe1-33e0ae6475c4\") " pod="openshift-route-controller-manager/route-controller-manager-55fc755786-qqt2m" Feb 26 15:46:32 crc kubenswrapper[4907]: I0226 15:46:32.475810 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d2396c40-c93b-4d72-afe1-33e0ae6475c4-config\") pod \"route-controller-manager-55fc755786-qqt2m\" (UID: \"d2396c40-c93b-4d72-afe1-33e0ae6475c4\") " pod="openshift-route-controller-manager/route-controller-manager-55fc755786-qqt2m" Feb 26 15:46:32 crc kubenswrapper[4907]: I0226 15:46:32.481193 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/d2396c40-c93b-4d72-afe1-33e0ae6475c4-serving-cert\") pod \"route-controller-manager-55fc755786-qqt2m\" (UID: \"d2396c40-c93b-4d72-afe1-33e0ae6475c4\") " pod="openshift-route-controller-manager/route-controller-manager-55fc755786-qqt2m" Feb 26 15:46:32 crc kubenswrapper[4907]: I0226 15:46:32.507434 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5446r\" (UniqueName: \"kubernetes.io/projected/d2396c40-c93b-4d72-afe1-33e0ae6475c4-kube-api-access-5446r\") pod \"route-controller-manager-55fc755786-qqt2m\" (UID: \"d2396c40-c93b-4d72-afe1-33e0ae6475c4\") " pod="openshift-route-controller-manager/route-controller-manager-55fc755786-qqt2m" Feb 26 15:46:32 crc kubenswrapper[4907]: I0226 15:46:32.526780 4907 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-mnd2r" Feb 26 15:46:32 crc kubenswrapper[4907]: I0226 15:46:32.589052 4907 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-55fc755786-qqt2m" Feb 26 15:46:32 crc kubenswrapper[4907]: I0226 15:46:32.632132 4907 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-774c684776-b9h2m"] Feb 26 15:46:32 crc kubenswrapper[4907]: I0226 15:46:32.644116 4907 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-controller-manager/revision-pruner-9-crc"] Feb 26 15:46:32 crc kubenswrapper[4907]: I0226 15:46:32.644833 4907 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-controller-manager/revision-pruner-9-crc" Feb 26 15:46:32 crc kubenswrapper[4907]: I0226 15:46:32.658890 4907 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-controller-manager"/"kube-root-ca.crt" Feb 26 15:46:32 crc kubenswrapper[4907]: I0226 15:46:32.659004 4907 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-controller-manager"/"installer-sa-dockercfg-kjl2n" Feb 26 15:46:32 crc kubenswrapper[4907]: W0226 15:46:32.677357 4907 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd36a8adb_6b03_4d38_9e3d_28215243982d.slice/crio-759778165c9b2293c2623dcd0dd50552aa80d5f831092dca7c45589a2f8949f4 WatchSource:0}: Error finding container 759778165c9b2293c2623dcd0dd50552aa80d5f831092dca7c45589a2f8949f4: Status 404 returned error can't find the container with id 759778165c9b2293c2623dcd0dd50552aa80d5f831092dca7c45589a2f8949f4 Feb 26 15:46:32 crc kubenswrapper[4907]: I0226 15:46:32.705818 4907 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29535345-b7r88" Feb 26 15:46:32 crc kubenswrapper[4907]: I0226 15:46:32.714306 4907 patch_prober.go:28] interesting pod/router-default-5444994796-hqs2t container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Feb 26 15:46:32 crc kubenswrapper[4907]: [-]has-synced failed: reason withheld Feb 26 15:46:32 crc kubenswrapper[4907]: [+]process-running ok Feb 26 15:46:32 crc kubenswrapper[4907]: healthz check failed Feb 26 15:46:32 crc kubenswrapper[4907]: I0226 15:46:32.714343 4907 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-hqs2t" podUID="def12a12-3cf0-4694-a957-3e69aa18f880" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Feb 26 15:46:32 crc kubenswrapper[4907]: I0226 15:46:32.775232 4907 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-controller-manager/revision-pruner-9-crc"] Feb 26 15:46:32 crc kubenswrapper[4907]: I0226 15:46:32.781964 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/0dd74211-40c2-437c-9295-b69e709f81fe-config-volume\") pod \"0dd74211-40c2-437c-9295-b69e709f81fe\" (UID: \"0dd74211-40c2-437c-9295-b69e709f81fe\") " Feb 26 15:46:32 crc kubenswrapper[4907]: I0226 15:46:32.782073 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/0dd74211-40c2-437c-9295-b69e709f81fe-secret-volume\") pod \"0dd74211-40c2-437c-9295-b69e709f81fe\" (UID: \"0dd74211-40c2-437c-9295-b69e709f81fe\") " Feb 26 15:46:32 crc kubenswrapper[4907]: I0226 15:46:32.782112 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-667f5\" (UniqueName: 
\"kubernetes.io/projected/0dd74211-40c2-437c-9295-b69e709f81fe-kube-api-access-667f5\") pod \"0dd74211-40c2-437c-9295-b69e709f81fe\" (UID: \"0dd74211-40c2-437c-9295-b69e709f81fe\") " Feb 26 15:46:32 crc kubenswrapper[4907]: I0226 15:46:32.782257 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/13490029-c503-4f0c-883a-ced5525774d2-kube-api-access\") pod \"revision-pruner-9-crc\" (UID: \"13490029-c503-4f0c-883a-ced5525774d2\") " pod="openshift-kube-controller-manager/revision-pruner-9-crc" Feb 26 15:46:32 crc kubenswrapper[4907]: I0226 15:46:32.782306 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/13490029-c503-4f0c-883a-ced5525774d2-kubelet-dir\") pod \"revision-pruner-9-crc\" (UID: \"13490029-c503-4f0c-883a-ced5525774d2\") " pod="openshift-kube-controller-manager/revision-pruner-9-crc" Feb 26 15:46:32 crc kubenswrapper[4907]: I0226 15:46:32.783135 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0dd74211-40c2-437c-9295-b69e709f81fe-config-volume" (OuterVolumeSpecName: "config-volume") pod "0dd74211-40c2-437c-9295-b69e709f81fe" (UID: "0dd74211-40c2-437c-9295-b69e709f81fe"). InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 26 15:46:32 crc kubenswrapper[4907]: I0226 15:46:32.791824 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0dd74211-40c2-437c-9295-b69e709f81fe-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "0dd74211-40c2-437c-9295-b69e709f81fe" (UID: "0dd74211-40c2-437c-9295-b69e709f81fe"). InnerVolumeSpecName "secret-volume". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 26 15:46:32 crc kubenswrapper[4907]: I0226 15:46:32.791867 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0dd74211-40c2-437c-9295-b69e709f81fe-kube-api-access-667f5" (OuterVolumeSpecName: "kube-api-access-667f5") pod "0dd74211-40c2-437c-9295-b69e709f81fe" (UID: "0dd74211-40c2-437c-9295-b69e709f81fe"). InnerVolumeSpecName "kube-api-access-667f5". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 26 15:46:32 crc kubenswrapper[4907]: I0226 15:46:32.795922 4907 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-68qpc"] Feb 26 15:46:32 crc kubenswrapper[4907]: E0226 15:46:32.796127 4907 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0dd74211-40c2-437c-9295-b69e709f81fe" containerName="collect-profiles" Feb 26 15:46:32 crc kubenswrapper[4907]: I0226 15:46:32.796142 4907 state_mem.go:107] "Deleted CPUSet assignment" podUID="0dd74211-40c2-437c-9295-b69e709f81fe" containerName="collect-profiles" Feb 26 15:46:32 crc kubenswrapper[4907]: I0226 15:46:32.796242 4907 memory_manager.go:354] "RemoveStaleState removing state" podUID="0dd74211-40c2-437c-9295-b69e709f81fe" containerName="collect-profiles" Feb 26 15:46:32 crc kubenswrapper[4907]: I0226 15:46:32.797012 4907 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-68qpc" Feb 26 15:46:32 crc kubenswrapper[4907]: I0226 15:46:32.800970 4907 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"redhat-operators-dockercfg-ct8rh" Feb 26 15:46:32 crc kubenswrapper[4907]: I0226 15:46:32.821104 4907 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-68qpc"] Feb 26 15:46:32 crc kubenswrapper[4907]: I0226 15:46:32.897941 4907 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-fcwbm"] Feb 26 15:46:32 crc kubenswrapper[4907]: I0226 15:46:32.898611 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/13490029-c503-4f0c-883a-ced5525774d2-kubelet-dir\") pod \"revision-pruner-9-crc\" (UID: \"13490029-c503-4f0c-883a-ced5525774d2\") " pod="openshift-kube-controller-manager/revision-pruner-9-crc" Feb 26 15:46:32 crc kubenswrapper[4907]: I0226 15:46:32.898802 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d6b454c4-bdcd-4904-8564-84c414871c6d-catalog-content\") pod \"redhat-operators-68qpc\" (UID: \"d6b454c4-bdcd-4904-8564-84c414871c6d\") " pod="openshift-marketplace/redhat-operators-68qpc" Feb 26 15:46:32 crc kubenswrapper[4907]: I0226 15:46:32.898877 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7nzhg\" (UniqueName: \"kubernetes.io/projected/d6b454c4-bdcd-4904-8564-84c414871c6d-kube-api-access-7nzhg\") pod \"redhat-operators-68qpc\" (UID: \"d6b454c4-bdcd-4904-8564-84c414871c6d\") " pod="openshift-marketplace/redhat-operators-68qpc" Feb 26 15:46:32 crc kubenswrapper[4907]: I0226 15:46:32.898959 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d6b454c4-bdcd-4904-8564-84c414871c6d-utilities\") pod \"redhat-operators-68qpc\" (UID: \"d6b454c4-bdcd-4904-8564-84c414871c6d\") " pod="openshift-marketplace/redhat-operators-68qpc" Feb 26 15:46:32 crc kubenswrapper[4907]: I0226 15:46:32.899069 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/13490029-c503-4f0c-883a-ced5525774d2-kube-api-access\") pod \"revision-pruner-9-crc\" (UID: \"13490029-c503-4f0c-883a-ced5525774d2\") " pod="openshift-kube-controller-manager/revision-pruner-9-crc" Feb 26 15:46:32 crc kubenswrapper[4907]: I0226 15:46:32.898653 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/13490029-c503-4f0c-883a-ced5525774d2-kubelet-dir\") pod \"revision-pruner-9-crc\" (UID: \"13490029-c503-4f0c-883a-ced5525774d2\") " pod="openshift-kube-controller-manager/revision-pruner-9-crc" Feb 26 15:46:32 crc kubenswrapper[4907]: I0226 15:46:32.899192 4907 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/0dd74211-40c2-437c-9295-b69e709f81fe-config-volume\") on node \"crc\" DevicePath \"\"" Feb 26 15:46:32 crc kubenswrapper[4907]: I0226 15:46:32.899322 4907 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/0dd74211-40c2-437c-9295-b69e709f81fe-secret-volume\") on node \"crc\" DevicePath \"\"" Feb 26 15:46:32 crc kubenswrapper[4907]: I0226 15:46:32.899375 4907 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-667f5\" (UniqueName: \"kubernetes.io/projected/0dd74211-40c2-437c-9295-b69e709f81fe-kube-api-access-667f5\") on node \"crc\" DevicePath \"\"" Feb 26 15:46:32 crc kubenswrapper[4907]: I0226 15:46:32.927466 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" 
(UniqueName: \"kubernetes.io/projected/13490029-c503-4f0c-883a-ced5525774d2-kube-api-access\") pod \"revision-pruner-9-crc\" (UID: \"13490029-c503-4f0c-883a-ced5525774d2\") " pod="openshift-kube-controller-manager/revision-pruner-9-crc" Feb 26 15:46:32 crc kubenswrapper[4907]: I0226 15:46:32.997222 4907 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-controller-manager/revision-pruner-9-crc" Feb 26 15:46:33 crc kubenswrapper[4907]: I0226 15:46:33.000020 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d6b454c4-bdcd-4904-8564-84c414871c6d-utilities\") pod \"redhat-operators-68qpc\" (UID: \"d6b454c4-bdcd-4904-8564-84c414871c6d\") " pod="openshift-marketplace/redhat-operators-68qpc" Feb 26 15:46:33 crc kubenswrapper[4907]: I0226 15:46:33.000130 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d6b454c4-bdcd-4904-8564-84c414871c6d-catalog-content\") pod \"redhat-operators-68qpc\" (UID: \"d6b454c4-bdcd-4904-8564-84c414871c6d\") " pod="openshift-marketplace/redhat-operators-68qpc" Feb 26 15:46:33 crc kubenswrapper[4907]: I0226 15:46:33.000151 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7nzhg\" (UniqueName: \"kubernetes.io/projected/d6b454c4-bdcd-4904-8564-84c414871c6d-kube-api-access-7nzhg\") pod \"redhat-operators-68qpc\" (UID: \"d6b454c4-bdcd-4904-8564-84c414871c6d\") " pod="openshift-marketplace/redhat-operators-68qpc" Feb 26 15:46:33 crc kubenswrapper[4907]: I0226 15:46:33.000973 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d6b454c4-bdcd-4904-8564-84c414871c6d-utilities\") pod \"redhat-operators-68qpc\" (UID: \"d6b454c4-bdcd-4904-8564-84c414871c6d\") " pod="openshift-marketplace/redhat-operators-68qpc" Feb 26 15:46:33 
crc kubenswrapper[4907]: I0226 15:46:33.001038 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d6b454c4-bdcd-4904-8564-84c414871c6d-catalog-content\") pod \"redhat-operators-68qpc\" (UID: \"d6b454c4-bdcd-4904-8564-84c414871c6d\") " pod="openshift-marketplace/redhat-operators-68qpc" Feb 26 15:46:33 crc kubenswrapper[4907]: I0226 15:46:33.033518 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7nzhg\" (UniqueName: \"kubernetes.io/projected/d6b454c4-bdcd-4904-8564-84c414871c6d-kube-api-access-7nzhg\") pod \"redhat-operators-68qpc\" (UID: \"d6b454c4-bdcd-4904-8564-84c414871c6d\") " pod="openshift-marketplace/redhat-operators-68qpc" Feb 26 15:46:33 crc kubenswrapper[4907]: I0226 15:46:33.149915 4907 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-68qpc" Feb 26 15:46:33 crc kubenswrapper[4907]: I0226 15:46:33.164494 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-fcwbm" event={"ID":"6c70b66e-978a-4c7e-9892-5579869aa740","Type":"ContainerStarted","Data":"266f70a4b4e3e3430a1da51f67bd8fb99828c7ab0716557a0762d399b723bf7d"} Feb 26 15:46:33 crc kubenswrapper[4907]: I0226 15:46:33.175720 4907 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-jtqzb"] Feb 26 15:46:33 crc kubenswrapper[4907]: I0226 15:46:33.176920 4907 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-jtqzb" Feb 26 15:46:33 crc kubenswrapper[4907]: I0226 15:46:33.182447 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29535345-b7r88" event={"ID":"0dd74211-40c2-437c-9295-b69e709f81fe","Type":"ContainerDied","Data":"9527d4a12e5d27011aa4fb6b2f87ba832fc3936ed0b9ac0122f3153e6afda18c"} Feb 26 15:46:33 crc kubenswrapper[4907]: I0226 15:46:33.185574 4907 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29535345-b7r88" Feb 26 15:46:33 crc kubenswrapper[4907]: I0226 15:46:33.198675 4907 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="9527d4a12e5d27011aa4fb6b2f87ba832fc3936ed0b9ac0122f3153e6afda18c" Feb 26 15:46:33 crc kubenswrapper[4907]: I0226 15:46:33.198734 4907 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-image-registry/image-registry-697d97f7c8-kqtml" Feb 26 15:46:33 crc kubenswrapper[4907]: I0226 15:46:33.198752 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-697d97f7c8-kqtml" event={"ID":"0fefaf3e-d327-41f8-bbbe-94b051a63b19","Type":"ContainerStarted","Data":"fbe7bc3480d104a83cb2bef15d1509d69caca10cb0485cfd980ffd68be5102bd"} Feb 26 15:46:33 crc kubenswrapper[4907]: I0226 15:46:33.198772 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-697d97f7c8-kqtml" event={"ID":"0fefaf3e-d327-41f8-bbbe-94b051a63b19","Type":"ContainerStarted","Data":"5e4bedb35215aa589170c696338aa1213956872fc1adf190eeb23b94a8c5bc35"} Feb 26 15:46:33 crc kubenswrapper[4907]: I0226 15:46:33.203044 4907 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-jtqzb"] Feb 26 15:46:33 crc kubenswrapper[4907]: I0226 15:46:33.209948 4907 kubelet.go:2453] "SyncLoop (PLEG): event for 
pod" pod="openshift-controller-manager/controller-manager-774c684776-b9h2m" event={"ID":"d36a8adb-6b03-4d38-9e3d-28215243982d","Type":"ContainerStarted","Data":"cdffbc5453b263f0b82ac1c73974202aca16652666a41bfbb42d12ecdfc8dd51"} Feb 26 15:46:33 crc kubenswrapper[4907]: I0226 15:46:33.209982 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-774c684776-b9h2m" event={"ID":"d36a8adb-6b03-4d38-9e3d-28215243982d","Type":"ContainerStarted","Data":"759778165c9b2293c2623dcd0dd50552aa80d5f831092dca7c45589a2f8949f4"} Feb 26 15:46:33 crc kubenswrapper[4907]: I0226 15:46:33.209994 4907 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-controller-manager/controller-manager-774c684776-b9h2m" Feb 26 15:46:33 crc kubenswrapper[4907]: I0226 15:46:33.220006 4907 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/image-registry-697d97f7c8-kqtml" podStartSLOduration=198.219989324 podStartE2EDuration="3m18.219989324s" podCreationTimestamp="2026-02-26 15:43:15 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-26 15:46:33.217403832 +0000 UTC m=+255.735965681" watchObservedRunningTime="2026-02-26 15:46:33.219989324 +0000 UTC m=+255.738551163" Feb 26 15:46:33 crc kubenswrapper[4907]: I0226 15:46:33.220489 4907 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-controller-manager/controller-manager-774c684776-b9h2m" Feb 26 15:46:33 crc kubenswrapper[4907]: I0226 15:46:33.244945 4907 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-controller-manager/controller-manager-774c684776-b9h2m" podStartSLOduration=4.244927616 podStartE2EDuration="4.244927616s" podCreationTimestamp="2026-02-26 15:46:29 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 
UTC" observedRunningTime="2026-02-26 15:46:33.240483451 +0000 UTC m=+255.759045300" watchObservedRunningTime="2026-02-26 15:46:33.244927616 +0000 UTC m=+255.763489465" Feb 26 15:46:33 crc kubenswrapper[4907]: I0226 15:46:33.306383 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/8eefa350-bfa6-48dc-9577-692787482b0d-catalog-content\") pod \"redhat-operators-jtqzb\" (UID: \"8eefa350-bfa6-48dc-9577-692787482b0d\") " pod="openshift-marketplace/redhat-operators-jtqzb" Feb 26 15:46:33 crc kubenswrapper[4907]: I0226 15:46:33.306426 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9nq7k\" (UniqueName: \"kubernetes.io/projected/8eefa350-bfa6-48dc-9577-692787482b0d-kube-api-access-9nq7k\") pod \"redhat-operators-jtqzb\" (UID: \"8eefa350-bfa6-48dc-9577-692787482b0d\") " pod="openshift-marketplace/redhat-operators-jtqzb" Feb 26 15:46:33 crc kubenswrapper[4907]: I0226 15:46:33.306468 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/8eefa350-bfa6-48dc-9577-692787482b0d-utilities\") pod \"redhat-operators-jtqzb\" (UID: \"8eefa350-bfa6-48dc-9577-692787482b0d\") " pod="openshift-marketplace/redhat-operators-jtqzb" Feb 26 15:46:33 crc kubenswrapper[4907]: I0226 15:46:33.365681 4907 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-mnd2r"] Feb 26 15:46:33 crc kubenswrapper[4907]: I0226 15:46:33.373834 4907 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-apiserver/revision-pruner-8-crc"] Feb 26 15:46:33 crc kubenswrapper[4907]: I0226 15:46:33.374470 4907 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-8-crc" Feb 26 15:46:33 crc kubenswrapper[4907]: I0226 15:46:33.386286 4907 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-apiserver"/"installer-sa-dockercfg-5pr6n" Feb 26 15:46:33 crc kubenswrapper[4907]: I0226 15:46:33.386461 4907 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-apiserver"/"kube-root-ca.crt" Feb 26 15:46:33 crc kubenswrapper[4907]: I0226 15:46:33.408875 4907 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver/revision-pruner-8-crc"] Feb 26 15:46:33 crc kubenswrapper[4907]: I0226 15:46:33.416586 4907 ???:1] "http: TLS handshake error from 192.168.126.11:33962: no serving certificate available for the kubelet" Feb 26 15:46:33 crc kubenswrapper[4907]: I0226 15:46:33.421976 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/8eefa350-bfa6-48dc-9577-692787482b0d-catalog-content\") pod \"redhat-operators-jtqzb\" (UID: \"8eefa350-bfa6-48dc-9577-692787482b0d\") " pod="openshift-marketplace/redhat-operators-jtqzb" Feb 26 15:46:33 crc kubenswrapper[4907]: I0226 15:46:33.422047 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9nq7k\" (UniqueName: \"kubernetes.io/projected/8eefa350-bfa6-48dc-9577-692787482b0d-kube-api-access-9nq7k\") pod \"redhat-operators-jtqzb\" (UID: \"8eefa350-bfa6-48dc-9577-692787482b0d\") " pod="openshift-marketplace/redhat-operators-jtqzb" Feb 26 15:46:33 crc kubenswrapper[4907]: I0226 15:46:33.422104 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/8eefa350-bfa6-48dc-9577-692787482b0d-utilities\") pod \"redhat-operators-jtqzb\" (UID: \"8eefa350-bfa6-48dc-9577-692787482b0d\") " pod="openshift-marketplace/redhat-operators-jtqzb" Feb 26 15:46:33 crc 
kubenswrapper[4907]: I0226 15:46:33.422623 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/8eefa350-bfa6-48dc-9577-692787482b0d-utilities\") pod \"redhat-operators-jtqzb\" (UID: \"8eefa350-bfa6-48dc-9577-692787482b0d\") " pod="openshift-marketplace/redhat-operators-jtqzb" Feb 26 15:46:33 crc kubenswrapper[4907]: I0226 15:46:33.422830 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/8eefa350-bfa6-48dc-9577-692787482b0d-catalog-content\") pod \"redhat-operators-jtqzb\" (UID: \"8eefa350-bfa6-48dc-9577-692787482b0d\") " pod="openshift-marketplace/redhat-operators-jtqzb" Feb 26 15:46:33 crc kubenswrapper[4907]: I0226 15:46:33.432557 4907 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-55fc755786-qqt2m"] Feb 26 15:46:33 crc kubenswrapper[4907]: I0226 15:46:33.455079 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9nq7k\" (UniqueName: \"kubernetes.io/projected/8eefa350-bfa6-48dc-9577-692787482b0d-kube-api-access-9nq7k\") pod \"redhat-operators-jtqzb\" (UID: \"8eefa350-bfa6-48dc-9577-692787482b0d\") " pod="openshift-marketplace/redhat-operators-jtqzb" Feb 26 15:46:33 crc kubenswrapper[4907]: I0226 15:46:33.503063 4907 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-jtqzb" Feb 26 15:46:33 crc kubenswrapper[4907]: I0226 15:46:33.525740 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/9831f77f-a393-45dd-a71f-0287596223e8-kube-api-access\") pod \"revision-pruner-8-crc\" (UID: \"9831f77f-a393-45dd-a71f-0287596223e8\") " pod="openshift-kube-apiserver/revision-pruner-8-crc" Feb 26 15:46:33 crc kubenswrapper[4907]: I0226 15:46:33.526316 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/9831f77f-a393-45dd-a71f-0287596223e8-kubelet-dir\") pod \"revision-pruner-8-crc\" (UID: \"9831f77f-a393-45dd-a71f-0287596223e8\") " pod="openshift-kube-apiserver/revision-pruner-8-crc" Feb 26 15:46:33 crc kubenswrapper[4907]: I0226 15:46:33.628419 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/9831f77f-a393-45dd-a71f-0287596223e8-kubelet-dir\") pod \"revision-pruner-8-crc\" (UID: \"9831f77f-a393-45dd-a71f-0287596223e8\") " pod="openshift-kube-apiserver/revision-pruner-8-crc" Feb 26 15:46:33 crc kubenswrapper[4907]: I0226 15:46:33.628799 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/9831f77f-a393-45dd-a71f-0287596223e8-kubelet-dir\") pod \"revision-pruner-8-crc\" (UID: \"9831f77f-a393-45dd-a71f-0287596223e8\") " pod="openshift-kube-apiserver/revision-pruner-8-crc" Feb 26 15:46:33 crc kubenswrapper[4907]: I0226 15:46:33.628869 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/9831f77f-a393-45dd-a71f-0287596223e8-kube-api-access\") pod \"revision-pruner-8-crc\" (UID: \"9831f77f-a393-45dd-a71f-0287596223e8\") " 
pod="openshift-kube-apiserver/revision-pruner-8-crc" Feb 26 15:46:33 crc kubenswrapper[4907]: I0226 15:46:33.669470 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/9831f77f-a393-45dd-a71f-0287596223e8-kube-api-access\") pod \"revision-pruner-8-crc\" (UID: \"9831f77f-a393-45dd-a71f-0287596223e8\") " pod="openshift-kube-apiserver/revision-pruner-8-crc" Feb 26 15:46:33 crc kubenswrapper[4907]: I0226 15:46:33.716095 4907 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-8-crc" Feb 26 15:46:33 crc kubenswrapper[4907]: I0226 15:46:33.717762 4907 patch_prober.go:28] interesting pod/router-default-5444994796-hqs2t container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Feb 26 15:46:33 crc kubenswrapper[4907]: [-]has-synced failed: reason withheld Feb 26 15:46:33 crc kubenswrapper[4907]: [+]process-running ok Feb 26 15:46:33 crc kubenswrapper[4907]: healthz check failed Feb 26 15:46:33 crc kubenswrapper[4907]: I0226 15:46:33.717826 4907 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-hqs2t" podUID="def12a12-3cf0-4694-a957-3e69aa18f880" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Feb 26 15:46:33 crc kubenswrapper[4907]: I0226 15:46:33.846174 4907 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-controller-manager/revision-pruner-9-crc"] Feb 26 15:46:34 crc kubenswrapper[4907]: I0226 15:46:34.013293 4907 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-68qpc"] Feb 26 15:46:34 crc kubenswrapper[4907]: W0226 15:46:34.032879 4907 manager.go:1169] Failed to process watch event {EventType:0 
Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd6b454c4_bdcd_4904_8564_84c414871c6d.slice/crio-5cbe46269cbd05823103c8a0dc8d00b43048842aa45a9e8580a1f5c4a8c568dc WatchSource:0}: Error finding container 5cbe46269cbd05823103c8a0dc8d00b43048842aa45a9e8580a1f5c4a8c568dc: Status 404 returned error can't find the container with id 5cbe46269cbd05823103c8a0dc8d00b43048842aa45a9e8580a1f5c4a8c568dc Feb 26 15:46:34 crc kubenswrapper[4907]: I0226 15:46:34.140906 4907 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="afead97a-f5be-4685-913f-e36c4d4c4c62" path="/var/lib/kubelet/pods/afead97a-f5be-4685-913f-e36c4d4c4c62/volumes" Feb 26 15:46:34 crc kubenswrapper[4907]: I0226 15:46:34.191122 4907 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-jtqzb"] Feb 26 15:46:34 crc kubenswrapper[4907]: W0226 15:46:34.200421 4907 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod8eefa350_bfa6_48dc_9577_692787482b0d.slice/crio-5bf8fbdce5b911b359573c2625cc1312b81286ca021c7699751bb6b103ed776d WatchSource:0}: Error finding container 5bf8fbdce5b911b359573c2625cc1312b81286ca021c7699751bb6b103ed776d: Status 404 returned error can't find the container with id 5bf8fbdce5b911b359573c2625cc1312b81286ca021c7699751bb6b103ed776d Feb 26 15:46:34 crc kubenswrapper[4907]: I0226 15:46:34.229109 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/revision-pruner-9-crc" event={"ID":"13490029-c503-4f0c-883a-ced5525774d2","Type":"ContainerStarted","Data":"6d7b05b6f56b794a4eb1deeab0db600ee275853ad1a4d93800cc09175aebb894"} Feb 26 15:46:34 crc kubenswrapper[4907]: I0226 15:46:34.231442 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-68qpc" 
event={"ID":"d6b454c4-bdcd-4904-8564-84c414871c6d","Type":"ContainerStarted","Data":"5cbe46269cbd05823103c8a0dc8d00b43048842aa45a9e8580a1f5c4a8c568dc"} Feb 26 15:46:34 crc kubenswrapper[4907]: I0226 15:46:34.237404 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-55fc755786-qqt2m" event={"ID":"d2396c40-c93b-4d72-afe1-33e0ae6475c4","Type":"ContainerStarted","Data":"05dedd185fecd7592a0a07b8010826b95be0b904d90beacbb168cd6764eb690b"} Feb 26 15:46:34 crc kubenswrapper[4907]: I0226 15:46:34.241361 4907 generic.go:334] "Generic (PLEG): container finished" podID="6c70b66e-978a-4c7e-9892-5579869aa740" containerID="a497956b577958b8fef18a1420d2d621f7ae083a86cdbc6716f46357b1608777" exitCode=0 Feb 26 15:46:34 crc kubenswrapper[4907]: I0226 15:46:34.241409 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-fcwbm" event={"ID":"6c70b66e-978a-4c7e-9892-5579869aa740","Type":"ContainerDied","Data":"a497956b577958b8fef18a1420d2d621f7ae083a86cdbc6716f46357b1608777"} Feb 26 15:46:34 crc kubenswrapper[4907]: I0226 15:46:34.245633 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-jtqzb" event={"ID":"8eefa350-bfa6-48dc-9577-692787482b0d","Type":"ContainerStarted","Data":"5bf8fbdce5b911b359573c2625cc1312b81286ca021c7699751bb6b103ed776d"} Feb 26 15:46:34 crc kubenswrapper[4907]: I0226 15:46:34.254787 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-mnd2r" event={"ID":"2dc40859-37ff-41ea-88d7-6131b35ceebf","Type":"ContainerStarted","Data":"d213f4457a2579dec5c7b3920056759cd35ed19b2e4b7c425c5b36500b1d2818"} Feb 26 15:46:34 crc kubenswrapper[4907]: I0226 15:46:34.419976 4907 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver/revision-pruner-8-crc"] Feb 26 15:46:34 crc kubenswrapper[4907]: W0226 15:46:34.476390 4907 manager.go:1169] Failed to process 
watch event {EventType:0 Name:/kubepods.slice/kubepods-pod9831f77f_a393_45dd_a71f_0287596223e8.slice/crio-ef86174fea7cc8746923b59083e7467e4dfa85d6916d43d9300726a9df542ba5 WatchSource:0}: Error finding container ef86174fea7cc8746923b59083e7467e4dfa85d6916d43d9300726a9df542ba5: Status 404 returned error can't find the container with id ef86174fea7cc8746923b59083e7467e4dfa85d6916d43d9300726a9df542ba5 Feb 26 15:46:34 crc kubenswrapper[4907]: I0226 15:46:34.713685 4907 patch_prober.go:28] interesting pod/router-default-5444994796-hqs2t container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Feb 26 15:46:34 crc kubenswrapper[4907]: [-]has-synced failed: reason withheld Feb 26 15:46:34 crc kubenswrapper[4907]: [+]process-running ok Feb 26 15:46:34 crc kubenswrapper[4907]: healthz check failed Feb 26 15:46:34 crc kubenswrapper[4907]: I0226 15:46:34.714010 4907 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-hqs2t" podUID="def12a12-3cf0-4694-a957-3e69aa18f880" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Feb 26 15:46:35 crc kubenswrapper[4907]: I0226 15:46:35.281341 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/revision-pruner-9-crc" event={"ID":"13490029-c503-4f0c-883a-ced5525774d2","Type":"ContainerStarted","Data":"af106e27a7a393f66b1c604e384bb2d6f58b1cb8ec62abb521eac13869dd9148"} Feb 26 15:46:35 crc kubenswrapper[4907]: I0226 15:46:35.304792 4907 generic.go:334] "Generic (PLEG): container finished" podID="d6b454c4-bdcd-4904-8564-84c414871c6d" containerID="ea695100592dfd1eadf4890e219236bb7912653b40207b5e0dff1b4377913f3c" exitCode=0 Feb 26 15:46:35 crc kubenswrapper[4907]: I0226 15:46:35.304885 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-68qpc" 
event={"ID":"d6b454c4-bdcd-4904-8564-84c414871c6d","Type":"ContainerDied","Data":"ea695100592dfd1eadf4890e219236bb7912653b40207b5e0dff1b4377913f3c"} Feb 26 15:46:35 crc kubenswrapper[4907]: I0226 15:46:35.310048 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-55fc755786-qqt2m" event={"ID":"d2396c40-c93b-4d72-afe1-33e0ae6475c4","Type":"ContainerStarted","Data":"561afe4976a7f68785233a8d2f613299e0fb54e2d9e01ccc28fe2dfe4cc66cd1"} Feb 26 15:46:35 crc kubenswrapper[4907]: I0226 15:46:35.310687 4907 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-route-controller-manager/route-controller-manager-55fc755786-qqt2m" Feb 26 15:46:35 crc kubenswrapper[4907]: I0226 15:46:35.312144 4907 generic.go:334] "Generic (PLEG): container finished" podID="8eefa350-bfa6-48dc-9577-692787482b0d" containerID="7cc4bb476a32190ea78b129c131915eacb611b4a303edb9ebc391b79b07fe2a9" exitCode=0 Feb 26 15:46:35 crc kubenswrapper[4907]: I0226 15:46:35.312190 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-jtqzb" event={"ID":"8eefa350-bfa6-48dc-9577-692787482b0d","Type":"ContainerDied","Data":"7cc4bb476a32190ea78b129c131915eacb611b4a303edb9ebc391b79b07fe2a9"} Feb 26 15:46:35 crc kubenswrapper[4907]: I0226 15:46:35.316920 4907 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-route-controller-manager/route-controller-manager-55fc755786-qqt2m" Feb 26 15:46:35 crc kubenswrapper[4907]: I0226 15:46:35.317835 4907 generic.go:334] "Generic (PLEG): container finished" podID="2dc40859-37ff-41ea-88d7-6131b35ceebf" containerID="b8c1899a4564f83f7588b8b70cb4035a9fd21a37c516350a3053e71511e8e3dd" exitCode=0 Feb 26 15:46:35 crc kubenswrapper[4907]: I0226 15:46:35.317881 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-mnd2r" 
event={"ID":"2dc40859-37ff-41ea-88d7-6131b35ceebf","Type":"ContainerDied","Data":"b8c1899a4564f83f7588b8b70cb4035a9fd21a37c516350a3053e71511e8e3dd"} Feb 26 15:46:35 crc kubenswrapper[4907]: I0226 15:46:35.323962 4907 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-controller-manager/revision-pruner-9-crc" podStartSLOduration=3.323946682 podStartE2EDuration="3.323946682s" podCreationTimestamp="2026-02-26 15:46:32 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-26 15:46:35.293526389 +0000 UTC m=+257.812088228" watchObservedRunningTime="2026-02-26 15:46:35.323946682 +0000 UTC m=+257.842508531" Feb 26 15:46:35 crc kubenswrapper[4907]: I0226 15:46:35.328255 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-8-crc" event={"ID":"9831f77f-a393-45dd-a71f-0287596223e8","Type":"ContainerStarted","Data":"ef86174fea7cc8746923b59083e7467e4dfa85d6916d43d9300726a9df542ba5"} Feb 26 15:46:35 crc kubenswrapper[4907]: I0226 15:46:35.409019 4907 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-route-controller-manager/route-controller-manager-55fc755786-qqt2m" podStartSLOduration=5.40898389 podStartE2EDuration="5.40898389s" podCreationTimestamp="2026-02-26 15:46:30 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-26 15:46:35.402306982 +0000 UTC m=+257.920868831" watchObservedRunningTime="2026-02-26 15:46:35.40898389 +0000 UTC m=+257.927545739" Feb 26 15:46:35 crc kubenswrapper[4907]: I0226 15:46:35.712562 4907 patch_prober.go:28] interesting pod/router-default-5444994796-hqs2t container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Feb 26 15:46:35 crc 
kubenswrapper[4907]: [-]has-synced failed: reason withheld Feb 26 15:46:35 crc kubenswrapper[4907]: [+]process-running ok Feb 26 15:46:35 crc kubenswrapper[4907]: healthz check failed Feb 26 15:46:35 crc kubenswrapper[4907]: I0226 15:46:35.712628 4907 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-hqs2t" podUID="def12a12-3cf0-4694-a957-3e69aa18f880" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Feb 26 15:46:36 crc kubenswrapper[4907]: I0226 15:46:36.234361 4907 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-dns/dns-default-6g628" Feb 26 15:46:36 crc kubenswrapper[4907]: I0226 15:46:36.254544 4907 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-apiserver/revision-pruner-8-crc" podStartSLOduration=3.254524897 podStartE2EDuration="3.254524897s" podCreationTimestamp="2026-02-26 15:46:33 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-26 15:46:35.465959513 +0000 UTC m=+257.984521362" watchObservedRunningTime="2026-02-26 15:46:36.254524897 +0000 UTC m=+258.773086746" Feb 26 15:46:36 crc kubenswrapper[4907]: I0226 15:46:36.358738 4907 generic.go:334] "Generic (PLEG): container finished" podID="9831f77f-a393-45dd-a71f-0287596223e8" containerID="20bd97a872b48896df89d3c1435829271a4277db39221c251ab4bd308ff9671f" exitCode=0 Feb 26 15:46:36 crc kubenswrapper[4907]: I0226 15:46:36.358850 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-8-crc" event={"ID":"9831f77f-a393-45dd-a71f-0287596223e8","Type":"ContainerDied","Data":"20bd97a872b48896df89d3c1435829271a4277db39221c251ab4bd308ff9671f"} Feb 26 15:46:36 crc kubenswrapper[4907]: I0226 15:46:36.376615 4907 generic.go:334] "Generic (PLEG): container finished" podID="13490029-c503-4f0c-883a-ced5525774d2" 
containerID="af106e27a7a393f66b1c604e384bb2d6f58b1cb8ec62abb521eac13869dd9148" exitCode=0 Feb 26 15:46:36 crc kubenswrapper[4907]: I0226 15:46:36.376666 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/revision-pruner-9-crc" event={"ID":"13490029-c503-4f0c-883a-ced5525774d2","Type":"ContainerDied","Data":"af106e27a7a393f66b1c604e384bb2d6f58b1cb8ec62abb521eac13869dd9148"} Feb 26 15:46:36 crc kubenswrapper[4907]: I0226 15:46:36.714111 4907 patch_prober.go:28] interesting pod/router-default-5444994796-hqs2t container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Feb 26 15:46:36 crc kubenswrapper[4907]: [-]has-synced failed: reason withheld Feb 26 15:46:36 crc kubenswrapper[4907]: [+]process-running ok Feb 26 15:46:36 crc kubenswrapper[4907]: healthz check failed Feb 26 15:46:36 crc kubenswrapper[4907]: I0226 15:46:36.714170 4907 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-hqs2t" podUID="def12a12-3cf0-4694-a957-3e69aa18f880" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Feb 26 15:46:37 crc kubenswrapper[4907]: I0226 15:46:37.713339 4907 patch_prober.go:28] interesting pod/router-default-5444994796-hqs2t container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Feb 26 15:46:37 crc kubenswrapper[4907]: [-]has-synced failed: reason withheld Feb 26 15:46:37 crc kubenswrapper[4907]: [+]process-running ok Feb 26 15:46:37 crc kubenswrapper[4907]: healthz check failed Feb 26 15:46:37 crc kubenswrapper[4907]: I0226 15:46:37.713388 4907 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-hqs2t" podUID="def12a12-3cf0-4694-a957-3e69aa18f880" 
containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Feb 26 15:46:37 crc kubenswrapper[4907]: I0226 15:46:37.994932 4907 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-controller-manager/revision-pruner-9-crc" Feb 26 15:46:38 crc kubenswrapper[4907]: I0226 15:46:38.047919 4907 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-8-crc" Feb 26 15:46:38 crc kubenswrapper[4907]: I0226 15:46:38.121351 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/9831f77f-a393-45dd-a71f-0287596223e8-kube-api-access\") pod \"9831f77f-a393-45dd-a71f-0287596223e8\" (UID: \"9831f77f-a393-45dd-a71f-0287596223e8\") " Feb 26 15:46:38 crc kubenswrapper[4907]: I0226 15:46:38.121477 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/13490029-c503-4f0c-883a-ced5525774d2-kubelet-dir\") pod \"13490029-c503-4f0c-883a-ced5525774d2\" (UID: \"13490029-c503-4f0c-883a-ced5525774d2\") " Feb 26 15:46:38 crc kubenswrapper[4907]: I0226 15:46:38.121612 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/13490029-c503-4f0c-883a-ced5525774d2-kube-api-access\") pod \"13490029-c503-4f0c-883a-ced5525774d2\" (UID: \"13490029-c503-4f0c-883a-ced5525774d2\") " Feb 26 15:46:38 crc kubenswrapper[4907]: I0226 15:46:38.121645 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/13490029-c503-4f0c-883a-ced5525774d2-kubelet-dir" (OuterVolumeSpecName: "kubelet-dir") pod "13490029-c503-4f0c-883a-ced5525774d2" (UID: "13490029-c503-4f0c-883a-ced5525774d2"). InnerVolumeSpecName "kubelet-dir". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 26 15:46:38 crc kubenswrapper[4907]: I0226 15:46:38.121741 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/9831f77f-a393-45dd-a71f-0287596223e8-kubelet-dir" (OuterVolumeSpecName: "kubelet-dir") pod "9831f77f-a393-45dd-a71f-0287596223e8" (UID: "9831f77f-a393-45dd-a71f-0287596223e8"). InnerVolumeSpecName "kubelet-dir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 26 15:46:38 crc kubenswrapper[4907]: I0226 15:46:38.121700 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/9831f77f-a393-45dd-a71f-0287596223e8-kubelet-dir\") pod \"9831f77f-a393-45dd-a71f-0287596223e8\" (UID: \"9831f77f-a393-45dd-a71f-0287596223e8\") " Feb 26 15:46:38 crc kubenswrapper[4907]: I0226 15:46:38.122892 4907 reconciler_common.go:293] "Volume detached for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/9831f77f-a393-45dd-a71f-0287596223e8-kubelet-dir\") on node \"crc\" DevicePath \"\"" Feb 26 15:46:38 crc kubenswrapper[4907]: I0226 15:46:38.122913 4907 reconciler_common.go:293] "Volume detached for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/13490029-c503-4f0c-883a-ced5525774d2-kubelet-dir\") on node \"crc\" DevicePath \"\"" Feb 26 15:46:38 crc kubenswrapper[4907]: I0226 15:46:38.127125 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/13490029-c503-4f0c-883a-ced5525774d2-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "13490029-c503-4f0c-883a-ced5525774d2" (UID: "13490029-c503-4f0c-883a-ced5525774d2"). InnerVolumeSpecName "kube-api-access". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 26 15:46:38 crc kubenswrapper[4907]: I0226 15:46:38.127602 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9831f77f-a393-45dd-a71f-0287596223e8-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "9831f77f-a393-45dd-a71f-0287596223e8" (UID: "9831f77f-a393-45dd-a71f-0287596223e8"). InnerVolumeSpecName "kube-api-access". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 26 15:46:38 crc kubenswrapper[4907]: I0226 15:46:38.223829 4907 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/13490029-c503-4f0c-883a-ced5525774d2-kube-api-access\") on node \"crc\" DevicePath \"\"" Feb 26 15:46:38 crc kubenswrapper[4907]: I0226 15:46:38.223861 4907 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/9831f77f-a393-45dd-a71f-0287596223e8-kube-api-access\") on node \"crc\" DevicePath \"\"" Feb 26 15:46:38 crc kubenswrapper[4907]: I0226 15:46:38.449789 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/revision-pruner-9-crc" event={"ID":"13490029-c503-4f0c-883a-ced5525774d2","Type":"ContainerDied","Data":"6d7b05b6f56b794a4eb1deeab0db600ee275853ad1a4d93800cc09175aebb894"} Feb 26 15:46:38 crc kubenswrapper[4907]: I0226 15:46:38.449865 4907 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="6d7b05b6f56b794a4eb1deeab0db600ee275853ad1a4d93800cc09175aebb894" Feb 26 15:46:38 crc kubenswrapper[4907]: I0226 15:46:38.450012 4907 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-controller-manager/revision-pruner-9-crc" Feb 26 15:46:38 crc kubenswrapper[4907]: I0226 15:46:38.469006 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-8-crc" event={"ID":"9831f77f-a393-45dd-a71f-0287596223e8","Type":"ContainerDied","Data":"ef86174fea7cc8746923b59083e7467e4dfa85d6916d43d9300726a9df542ba5"} Feb 26 15:46:38 crc kubenswrapper[4907]: I0226 15:46:38.469041 4907 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="ef86174fea7cc8746923b59083e7467e4dfa85d6916d43d9300726a9df542ba5" Feb 26 15:46:38 crc kubenswrapper[4907]: I0226 15:46:38.469103 4907 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-8-crc" Feb 26 15:46:38 crc kubenswrapper[4907]: I0226 15:46:38.710982 4907 patch_prober.go:28] interesting pod/router-default-5444994796-hqs2t container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Feb 26 15:46:38 crc kubenswrapper[4907]: [-]has-synced failed: reason withheld Feb 26 15:46:38 crc kubenswrapper[4907]: [+]process-running ok Feb 26 15:46:38 crc kubenswrapper[4907]: healthz check failed Feb 26 15:46:38 crc kubenswrapper[4907]: I0226 15:46:38.711034 4907 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-hqs2t" podUID="def12a12-3cf0-4694-a957-3e69aa18f880" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Feb 26 15:46:39 crc kubenswrapper[4907]: I0226 15:46:39.711520 4907 patch_prober.go:28] interesting pod/router-default-5444994796-hqs2t container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Feb 26 15:46:39 crc 
kubenswrapper[4907]: [-]has-synced failed: reason withheld Feb 26 15:46:39 crc kubenswrapper[4907]: [+]process-running ok Feb 26 15:46:39 crc kubenswrapper[4907]: healthz check failed Feb 26 15:46:39 crc kubenswrapper[4907]: I0226 15:46:39.711908 4907 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-hqs2t" podUID="def12a12-3cf0-4694-a957-3e69aa18f880" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Feb 26 15:46:40 crc kubenswrapper[4907]: I0226 15:46:40.669882 4907 patch_prober.go:28] interesting pod/console-f9d7485db-9lx5z container/console namespace/openshift-console: Startup probe status=failure output="Get \"https://10.217.0.25:8443/health\": dial tcp 10.217.0.25:8443: connect: connection refused" start-of-body= Feb 26 15:46:40 crc kubenswrapper[4907]: I0226 15:46:40.669960 4907 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-console/console-f9d7485db-9lx5z" podUID="0eb5d98d-ab9b-47bd-b8ca-27340a4d6a4f" containerName="console" probeResult="failure" output="Get \"https://10.217.0.25:8443/health\": dial tcp 10.217.0.25:8443: connect: connection refused" Feb 26 15:46:40 crc kubenswrapper[4907]: I0226 15:46:40.696785 4907 patch_prober.go:28] interesting pod/downloads-7954f5f757-wcgj6 container/download-server namespace/openshift-console: Readiness probe status=failure output="Get \"http://10.217.0.15:8080/\": dial tcp 10.217.0.15:8080: connect: connection refused" start-of-body= Feb 26 15:46:40 crc kubenswrapper[4907]: I0226 15:46:40.696818 4907 patch_prober.go:28] interesting pod/downloads-7954f5f757-wcgj6 container/download-server namespace/openshift-console: Liveness probe status=failure output="Get \"http://10.217.0.15:8080/\": dial tcp 10.217.0.15:8080: connect: connection refused" start-of-body= Feb 26 15:46:40 crc kubenswrapper[4907]: I0226 15:46:40.696860 4907 prober.go:107] "Probe failed" probeType="Readiness" 
pod="openshift-console/downloads-7954f5f757-wcgj6" podUID="2e969445-2d6b-4ea1-bd4b-3473a66e8c91" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.15:8080/\": dial tcp 10.217.0.15:8080: connect: connection refused" Feb 26 15:46:40 crc kubenswrapper[4907]: I0226 15:46:40.696873 4907 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-console/downloads-7954f5f757-wcgj6" podUID="2e969445-2d6b-4ea1-bd4b-3473a66e8c91" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.15:8080/\": dial tcp 10.217.0.15:8080: connect: connection refused" Feb 26 15:46:40 crc kubenswrapper[4907]: I0226 15:46:40.714044 4907 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-ingress/router-default-5444994796-hqs2t" Feb 26 15:46:40 crc kubenswrapper[4907]: I0226 15:46:40.717538 4907 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ingress/router-default-5444994796-hqs2t" Feb 26 15:46:42 crc kubenswrapper[4907]: I0226 15:46:42.991370 4907 ???:1] "http: TLS handshake error from 192.168.126.11:34054: no serving certificate available for the kubelet" Feb 26 15:46:48 crc kubenswrapper[4907]: I0226 15:46:48.359322 4907 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-774c684776-b9h2m"] Feb 26 15:46:48 crc kubenswrapper[4907]: I0226 15:46:48.364470 4907 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-controller-manager/controller-manager-774c684776-b9h2m" podUID="d36a8adb-6b03-4d38-9e3d-28215243982d" containerName="controller-manager" containerID="cri-o://cdffbc5453b263f0b82ac1c73974202aca16652666a41bfbb42d12ecdfc8dd51" gracePeriod=30 Feb 26 15:46:48 crc kubenswrapper[4907]: I0226 15:46:48.377083 4907 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-55fc755786-qqt2m"] Feb 26 15:46:48 crc 
kubenswrapper[4907]: I0226 15:46:48.377349 4907 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-route-controller-manager/route-controller-manager-55fc755786-qqt2m" podUID="d2396c40-c93b-4d72-afe1-33e0ae6475c4" containerName="route-controller-manager" containerID="cri-o://561afe4976a7f68785233a8d2f613299e0fb54e2d9e01ccc28fe2dfe4cc66cd1" gracePeriod=30 Feb 26 15:46:48 crc kubenswrapper[4907]: I0226 15:46:48.536145 4907 patch_prober.go:28] interesting pod/machine-config-daemon-v5ng6 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 26 15:46:48 crc kubenswrapper[4907]: I0226 15:46:48.536211 4907 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-v5ng6" podUID="917eebf3-db36-47b8-af0a-b80d042fddab" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 26 15:46:48 crc kubenswrapper[4907]: I0226 15:46:48.974204 4907 generic.go:334] "Generic (PLEG): container finished" podID="d36a8adb-6b03-4d38-9e3d-28215243982d" containerID="cdffbc5453b263f0b82ac1c73974202aca16652666a41bfbb42d12ecdfc8dd51" exitCode=0 Feb 26 15:46:48 crc kubenswrapper[4907]: I0226 15:46:48.974281 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-774c684776-b9h2m" event={"ID":"d36a8adb-6b03-4d38-9e3d-28215243982d","Type":"ContainerDied","Data":"cdffbc5453b263f0b82ac1c73974202aca16652666a41bfbb42d12ecdfc8dd51"} Feb 26 15:46:48 crc kubenswrapper[4907]: I0226 15:46:48.976070 4907 generic.go:334] "Generic (PLEG): container finished" podID="d2396c40-c93b-4d72-afe1-33e0ae6475c4" containerID="561afe4976a7f68785233a8d2f613299e0fb54e2d9e01ccc28fe2dfe4cc66cd1" exitCode=0 Feb 26 
15:46:48 crc kubenswrapper[4907]: I0226 15:46:48.976095 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-55fc755786-qqt2m" event={"ID":"d2396c40-c93b-4d72-afe1-33e0ae6475c4","Type":"ContainerDied","Data":"561afe4976a7f68785233a8d2f613299e0fb54e2d9e01ccc28fe2dfe4cc66cd1"} Feb 26 15:46:50 crc kubenswrapper[4907]: I0226 15:46:50.673277 4907 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-console/console-f9d7485db-9lx5z" Feb 26 15:46:50 crc kubenswrapper[4907]: I0226 15:46:50.678542 4907 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console/console-f9d7485db-9lx5z" Feb 26 15:46:50 crc kubenswrapper[4907]: I0226 15:46:50.703469 4907 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console/downloads-7954f5f757-wcgj6" Feb 26 15:46:51 crc kubenswrapper[4907]: I0226 15:46:51.832198 4907 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-image-registry/image-registry-697d97f7c8-kqtml" Feb 26 15:46:51 crc kubenswrapper[4907]: I0226 15:46:51.993196 4907 patch_prober.go:28] interesting pod/controller-manager-774c684776-b9h2m container/controller-manager namespace/openshift-controller-manager: Readiness probe status=failure output="Get \"https://10.217.0.51:8443/healthz\": dial tcp 10.217.0.51:8443: connect: connection refused" start-of-body= Feb 26 15:46:51 crc kubenswrapper[4907]: I0226 15:46:51.993466 4907 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-controller-manager/controller-manager-774c684776-b9h2m" podUID="d36a8adb-6b03-4d38-9e3d-28215243982d" containerName="controller-manager" probeResult="failure" output="Get \"https://10.217.0.51:8443/healthz\": dial tcp 10.217.0.51:8443: connect: connection refused" Feb 26 15:46:52 crc kubenswrapper[4907]: I0226 15:46:52.590553 4907 patch_prober.go:28] interesting 
pod/route-controller-manager-55fc755786-qqt2m container/route-controller-manager namespace/openshift-route-controller-manager: Readiness probe status=failure output="Get \"https://10.217.0.54:8443/healthz\": dial tcp 10.217.0.54:8443: connect: connection refused" start-of-body= Feb 26 15:46:52 crc kubenswrapper[4907]: I0226 15:46:52.590625 4907 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-route-controller-manager/route-controller-manager-55fc755786-qqt2m" podUID="d2396c40-c93b-4d72-afe1-33e0ae6475c4" containerName="route-controller-manager" probeResult="failure" output="Get \"https://10.217.0.54:8443/healthz\": dial tcp 10.217.0.54:8443: connect: connection refused" Feb 26 15:47:01 crc kubenswrapper[4907]: I0226 15:47:01.444318 4907 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-rlmpn" Feb 26 15:47:01 crc kubenswrapper[4907]: I0226 15:47:01.992951 4907 patch_prober.go:28] interesting pod/controller-manager-774c684776-b9h2m container/controller-manager namespace/openshift-controller-manager: Readiness probe status=failure output="Get \"https://10.217.0.51:8443/healthz\": dial tcp 10.217.0.51:8443: connect: connection refused" start-of-body= Feb 26 15:47:01 crc kubenswrapper[4907]: I0226 15:47:01.993038 4907 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-controller-manager/controller-manager-774c684776-b9h2m" podUID="d36a8adb-6b03-4d38-9e3d-28215243982d" containerName="controller-manager" probeResult="failure" output="Get \"https://10.217.0.51:8443/healthz\": dial tcp 10.217.0.51:8443: connect: connection refused" Feb 26 15:47:02 crc kubenswrapper[4907]: I0226 15:47:02.590008 4907 patch_prober.go:28] interesting pod/route-controller-manager-55fc755786-qqt2m container/route-controller-manager namespace/openshift-route-controller-manager: Readiness probe status=failure output="Get \"https://10.217.0.54:8443/healthz\": dial tcp 
10.217.0.54:8443: connect: connection refused" start-of-body= Feb 26 15:47:02 crc kubenswrapper[4907]: I0226 15:47:02.590080 4907 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-route-controller-manager/route-controller-manager-55fc755786-qqt2m" podUID="d2396c40-c93b-4d72-afe1-33e0ae6475c4" containerName="route-controller-manager" probeResult="failure" output="Get \"https://10.217.0.54:8443/healthz\": dial tcp 10.217.0.54:8443: connect: connection refused" Feb 26 15:47:04 crc kubenswrapper[4907]: I0226 15:47:04.351650 4907 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-apiserver/revision-pruner-9-crc"] Feb 26 15:47:04 crc kubenswrapper[4907]: E0226 15:47:04.352108 4907 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9831f77f-a393-45dd-a71f-0287596223e8" containerName="pruner" Feb 26 15:47:04 crc kubenswrapper[4907]: I0226 15:47:04.352138 4907 state_mem.go:107] "Deleted CPUSet assignment" podUID="9831f77f-a393-45dd-a71f-0287596223e8" containerName="pruner" Feb 26 15:47:04 crc kubenswrapper[4907]: E0226 15:47:04.352171 4907 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="13490029-c503-4f0c-883a-ced5525774d2" containerName="pruner" Feb 26 15:47:04 crc kubenswrapper[4907]: I0226 15:47:04.352186 4907 state_mem.go:107] "Deleted CPUSet assignment" podUID="13490029-c503-4f0c-883a-ced5525774d2" containerName="pruner" Feb 26 15:47:04 crc kubenswrapper[4907]: I0226 15:47:04.352429 4907 memory_manager.go:354] "RemoveStaleState removing state" podUID="13490029-c503-4f0c-883a-ced5525774d2" containerName="pruner" Feb 26 15:47:04 crc kubenswrapper[4907]: I0226 15:47:04.352454 4907 memory_manager.go:354] "RemoveStaleState removing state" podUID="9831f77f-a393-45dd-a71f-0287596223e8" containerName="pruner" Feb 26 15:47:04 crc kubenswrapper[4907]: I0226 15:47:04.353220 4907 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-9-crc" Feb 26 15:47:04 crc kubenswrapper[4907]: I0226 15:47:04.363437 4907 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-apiserver"/"installer-sa-dockercfg-5pr6n" Feb 26 15:47:04 crc kubenswrapper[4907]: I0226 15:47:04.363690 4907 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-apiserver"/"kube-root-ca.crt" Feb 26 15:47:04 crc kubenswrapper[4907]: I0226 15:47:04.365407 4907 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver/revision-pruner-9-crc"] Feb 26 15:47:04 crc kubenswrapper[4907]: I0226 15:47:04.466947 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/60f1053f-25e8-4308-b15e-ca530e8118ab-kube-api-access\") pod \"revision-pruner-9-crc\" (UID: \"60f1053f-25e8-4308-b15e-ca530e8118ab\") " pod="openshift-kube-apiserver/revision-pruner-9-crc" Feb 26 15:47:04 crc kubenswrapper[4907]: I0226 15:47:04.467056 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/60f1053f-25e8-4308-b15e-ca530e8118ab-kubelet-dir\") pod \"revision-pruner-9-crc\" (UID: \"60f1053f-25e8-4308-b15e-ca530e8118ab\") " pod="openshift-kube-apiserver/revision-pruner-9-crc" Feb 26 15:47:04 crc kubenswrapper[4907]: I0226 15:47:04.568781 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/60f1053f-25e8-4308-b15e-ca530e8118ab-kubelet-dir\") pod \"revision-pruner-9-crc\" (UID: \"60f1053f-25e8-4308-b15e-ca530e8118ab\") " pod="openshift-kube-apiserver/revision-pruner-9-crc" Feb 26 15:47:04 crc kubenswrapper[4907]: I0226 15:47:04.568925 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: 
\"kubernetes.io/projected/60f1053f-25e8-4308-b15e-ca530e8118ab-kube-api-access\") pod \"revision-pruner-9-crc\" (UID: \"60f1053f-25e8-4308-b15e-ca530e8118ab\") " pod="openshift-kube-apiserver/revision-pruner-9-crc" Feb 26 15:47:04 crc kubenswrapper[4907]: I0226 15:47:04.568971 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/60f1053f-25e8-4308-b15e-ca530e8118ab-kubelet-dir\") pod \"revision-pruner-9-crc\" (UID: \"60f1053f-25e8-4308-b15e-ca530e8118ab\") " pod="openshift-kube-apiserver/revision-pruner-9-crc" Feb 26 15:47:04 crc kubenswrapper[4907]: I0226 15:47:04.602045 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/60f1053f-25e8-4308-b15e-ca530e8118ab-kube-api-access\") pod \"revision-pruner-9-crc\" (UID: \"60f1053f-25e8-4308-b15e-ca530e8118ab\") " pod="openshift-kube-apiserver/revision-pruner-9-crc" Feb 26 15:47:07 crc kubenswrapper[4907]: I0226 15:47:04.684046 4907 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-9-crc" Feb 26 15:47:07 crc kubenswrapper[4907]: E0226 15:47:05.091390 4907 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.redhat.io/openshift4/ose-cli:latest" Feb 26 15:47:07 crc kubenswrapper[4907]: E0226 15:47:05.091612 4907 kuberuntime_manager.go:1274] "Unhandled Error" err=< Feb 26 15:47:07 crc kubenswrapper[4907]: container &Container{Name:oc,Image:registry.redhat.io/openshift4/ose-cli:latest,Command:[/bin/bash -c oc get csr -o go-template='{{range .items}}{{if not .status}}{{.metadata.name}}{{"\n"}}{{end}}{{end}}' | xargs --no-run-if-empty oc adm certificate approve Feb 26 15:47:07 crc kubenswrapper[4907]: ],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-rnmft,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:nil,Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod auto-csr-approver-29535346-hhrww_openshift-infra(c6986b68-4a8d-4677-bed1-493eb1a231c3): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled Feb 26 15:47:07 crc kubenswrapper[4907]: > logger="UnhandledError" Feb 26 15:47:07 crc kubenswrapper[4907]: E0226 15:47:05.092957 4907 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"oc\" with 
ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openshift-infra/auto-csr-approver-29535346-hhrww" podUID="c6986b68-4a8d-4677-bed1-493eb1a231c3" Feb 26 15:47:07 crc kubenswrapper[4907]: E0226 15:47:06.069645 4907 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"oc\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/openshift4/ose-cli:latest\\\"\"" pod="openshift-infra/auto-csr-approver-29535346-hhrww" podUID="c6986b68-4a8d-4677-bed1-493eb1a231c3" Feb 26 15:47:07 crc kubenswrapper[4907]: I0226 15:47:06.481131 4907 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 26 15:47:09 crc kubenswrapper[4907]: E0226 15:47:09.035502 4907 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.redhat.io/openshift4/ose-cli:latest" Feb 26 15:47:09 crc kubenswrapper[4907]: E0226 15:47:09.036075 4907 kuberuntime_manager.go:1274] "Unhandled Error" err=< Feb 26 15:47:09 crc kubenswrapper[4907]: container &Container{Name:oc,Image:registry.redhat.io/openshift4/ose-cli:latest,Command:[/bin/bash -c oc get csr -o go-template='{{range .items}}{{if not .status}}{{.metadata.name}}{{"\n"}}{{end}}{{end}}' | xargs --no-run-if-empty oc adm certificate approve Feb 26 15:47:09 crc kubenswrapper[4907]: 
],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-bhbhg,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:nil,Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod auto-csr-approver-29535344-fsndq_openshift-infra(1b0532e1-9350-435d-bb1f-72bb0931a2e8): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled Feb 26 15:47:09 crc kubenswrapper[4907]: > logger="UnhandledError" Feb 26 15:47:09 crc kubenswrapper[4907]: E0226 15:47:09.037372 4907 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"oc\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openshift-infra/auto-csr-approver-29535344-fsndq" podUID="1b0532e1-9350-435d-bb1f-72bb0931a2e8" Feb 26 15:47:09 crc kubenswrapper[4907]: E0226 15:47:09.092104 4907 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"oc\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/openshift4/ose-cli:latest\\\"\"" pod="openshift-infra/auto-csr-approver-29535344-fsndq" podUID="1b0532e1-9350-435d-bb1f-72bb0931a2e8" Feb 26 15:47:09 crc kubenswrapper[4907]: I0226 15:47:09.134455 4907 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-apiserver/installer-9-crc"] Feb 26 15:47:09 crc kubenswrapper[4907]: I0226 15:47:09.138044 
4907 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/installer-9-crc" Feb 26 15:47:09 crc kubenswrapper[4907]: I0226 15:47:09.163399 4907 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver/installer-9-crc"] Feb 26 15:47:09 crc kubenswrapper[4907]: I0226 15:47:09.237019 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/abc73ba8-89c5-4844-a81e-742468c4366c-kube-api-access\") pod \"installer-9-crc\" (UID: \"abc73ba8-89c5-4844-a81e-742468c4366c\") " pod="openshift-kube-apiserver/installer-9-crc" Feb 26 15:47:09 crc kubenswrapper[4907]: I0226 15:47:09.237126 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/abc73ba8-89c5-4844-a81e-742468c4366c-var-lock\") pod \"installer-9-crc\" (UID: \"abc73ba8-89c5-4844-a81e-742468c4366c\") " pod="openshift-kube-apiserver/installer-9-crc" Feb 26 15:47:09 crc kubenswrapper[4907]: I0226 15:47:09.237157 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/abc73ba8-89c5-4844-a81e-742468c4366c-kubelet-dir\") pod \"installer-9-crc\" (UID: \"abc73ba8-89c5-4844-a81e-742468c4366c\") " pod="openshift-kube-apiserver/installer-9-crc" Feb 26 15:47:09 crc kubenswrapper[4907]: I0226 15:47:09.314759 4907 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-55fc755786-qqt2m" Feb 26 15:47:09 crc kubenswrapper[4907]: I0226 15:47:09.321499 4907 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-controller-manager/controller-manager-774c684776-b9h2m" Feb 26 15:47:09 crc kubenswrapper[4907]: I0226 15:47:09.338376 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/abc73ba8-89c5-4844-a81e-742468c4366c-kube-api-access\") pod \"installer-9-crc\" (UID: \"abc73ba8-89c5-4844-a81e-742468c4366c\") " pod="openshift-kube-apiserver/installer-9-crc" Feb 26 15:47:09 crc kubenswrapper[4907]: I0226 15:47:09.338445 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/abc73ba8-89c5-4844-a81e-742468c4366c-var-lock\") pod \"installer-9-crc\" (UID: \"abc73ba8-89c5-4844-a81e-742468c4366c\") " pod="openshift-kube-apiserver/installer-9-crc" Feb 26 15:47:09 crc kubenswrapper[4907]: I0226 15:47:09.338483 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/abc73ba8-89c5-4844-a81e-742468c4366c-kubelet-dir\") pod \"installer-9-crc\" (UID: \"abc73ba8-89c5-4844-a81e-742468c4366c\") " pod="openshift-kube-apiserver/installer-9-crc" Feb 26 15:47:09 crc kubenswrapper[4907]: I0226 15:47:09.338648 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/abc73ba8-89c5-4844-a81e-742468c4366c-kubelet-dir\") pod \"installer-9-crc\" (UID: \"abc73ba8-89c5-4844-a81e-742468c4366c\") " pod="openshift-kube-apiserver/installer-9-crc" Feb 26 15:47:09 crc kubenswrapper[4907]: I0226 15:47:09.338829 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/abc73ba8-89c5-4844-a81e-742468c4366c-var-lock\") pod \"installer-9-crc\" (UID: \"abc73ba8-89c5-4844-a81e-742468c4366c\") " pod="openshift-kube-apiserver/installer-9-crc" Feb 26 15:47:09 crc kubenswrapper[4907]: I0226 15:47:09.387745 
4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/abc73ba8-89c5-4844-a81e-742468c4366c-kube-api-access\") pod \"installer-9-crc\" (UID: \"abc73ba8-89c5-4844-a81e-742468c4366c\") " pod="openshift-kube-apiserver/installer-9-crc" Feb 26 15:47:09 crc kubenswrapper[4907]: I0226 15:47:09.440016 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/d2396c40-c93b-4d72-afe1-33e0ae6475c4-client-ca\") pod \"d2396c40-c93b-4d72-afe1-33e0ae6475c4\" (UID: \"d2396c40-c93b-4d72-afe1-33e0ae6475c4\") " Feb 26 15:47:09 crc kubenswrapper[4907]: I0226 15:47:09.440076 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/d36a8adb-6b03-4d38-9e3d-28215243982d-proxy-ca-bundles\") pod \"d36a8adb-6b03-4d38-9e3d-28215243982d\" (UID: \"d36a8adb-6b03-4d38-9e3d-28215243982d\") " Feb 26 15:47:09 crc kubenswrapper[4907]: I0226 15:47:09.440144 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/d36a8adb-6b03-4d38-9e3d-28215243982d-serving-cert\") pod \"d36a8adb-6b03-4d38-9e3d-28215243982d\" (UID: \"d36a8adb-6b03-4d38-9e3d-28215243982d\") " Feb 26 15:47:09 crc kubenswrapper[4907]: I0226 15:47:09.440725 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/d36a8adb-6b03-4d38-9e3d-28215243982d-client-ca\") pod \"d36a8adb-6b03-4d38-9e3d-28215243982d\" (UID: \"d36a8adb-6b03-4d38-9e3d-28215243982d\") " Feb 26 15:47:09 crc kubenswrapper[4907]: I0226 15:47:09.440758 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d2396c40-c93b-4d72-afe1-33e0ae6475c4-config\") pod \"d2396c40-c93b-4d72-afe1-33e0ae6475c4\" (UID: 
\"d2396c40-c93b-4d72-afe1-33e0ae6475c4\") " Feb 26 15:47:09 crc kubenswrapper[4907]: I0226 15:47:09.440800 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-5446r\" (UniqueName: \"kubernetes.io/projected/d2396c40-c93b-4d72-afe1-33e0ae6475c4-kube-api-access-5446r\") pod \"d2396c40-c93b-4d72-afe1-33e0ae6475c4\" (UID: \"d2396c40-c93b-4d72-afe1-33e0ae6475c4\") " Feb 26 15:47:09 crc kubenswrapper[4907]: I0226 15:47:09.440835 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d36a8adb-6b03-4d38-9e3d-28215243982d-config\") pod \"d36a8adb-6b03-4d38-9e3d-28215243982d\" (UID: \"d36a8adb-6b03-4d38-9e3d-28215243982d\") " Feb 26 15:47:09 crc kubenswrapper[4907]: I0226 15:47:09.440867 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/d2396c40-c93b-4d72-afe1-33e0ae6475c4-serving-cert\") pod \"d2396c40-c93b-4d72-afe1-33e0ae6475c4\" (UID: \"d2396c40-c93b-4d72-afe1-33e0ae6475c4\") " Feb 26 15:47:09 crc kubenswrapper[4907]: I0226 15:47:09.440937 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-c8bx6\" (UniqueName: \"kubernetes.io/projected/d36a8adb-6b03-4d38-9e3d-28215243982d-kube-api-access-c8bx6\") pod \"d36a8adb-6b03-4d38-9e3d-28215243982d\" (UID: \"d36a8adb-6b03-4d38-9e3d-28215243982d\") " Feb 26 15:47:09 crc kubenswrapper[4907]: I0226 15:47:09.441053 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d2396c40-c93b-4d72-afe1-33e0ae6475c4-client-ca" (OuterVolumeSpecName: "client-ca") pod "d2396c40-c93b-4d72-afe1-33e0ae6475c4" (UID: "d2396c40-c93b-4d72-afe1-33e0ae6475c4"). InnerVolumeSpecName "client-ca". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 26 15:47:09 crc kubenswrapper[4907]: I0226 15:47:09.441864 4907 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/d2396c40-c93b-4d72-afe1-33e0ae6475c4-client-ca\") on node \"crc\" DevicePath \"\"" Feb 26 15:47:09 crc kubenswrapper[4907]: I0226 15:47:09.444889 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d2396c40-c93b-4d72-afe1-33e0ae6475c4-config" (OuterVolumeSpecName: "config") pod "d2396c40-c93b-4d72-afe1-33e0ae6475c4" (UID: "d2396c40-c93b-4d72-afe1-33e0ae6475c4"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 26 15:47:09 crc kubenswrapper[4907]: I0226 15:47:09.446024 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d36a8adb-6b03-4d38-9e3d-28215243982d-config" (OuterVolumeSpecName: "config") pod "d36a8adb-6b03-4d38-9e3d-28215243982d" (UID: "d36a8adb-6b03-4d38-9e3d-28215243982d"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 26 15:47:09 crc kubenswrapper[4907]: I0226 15:47:09.446500 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d2396c40-c93b-4d72-afe1-33e0ae6475c4-kube-api-access-5446r" (OuterVolumeSpecName: "kube-api-access-5446r") pod "d2396c40-c93b-4d72-afe1-33e0ae6475c4" (UID: "d2396c40-c93b-4d72-afe1-33e0ae6475c4"). InnerVolumeSpecName "kube-api-access-5446r". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 26 15:47:09 crc kubenswrapper[4907]: I0226 15:47:09.448819 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d2396c40-c93b-4d72-afe1-33e0ae6475c4-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "d2396c40-c93b-4d72-afe1-33e0ae6475c4" (UID: "d2396c40-c93b-4d72-afe1-33e0ae6475c4"). InnerVolumeSpecName "serving-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 26 15:47:09 crc kubenswrapper[4907]: I0226 15:47:09.452210 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d36a8adb-6b03-4d38-9e3d-28215243982d-client-ca" (OuterVolumeSpecName: "client-ca") pod "d36a8adb-6b03-4d38-9e3d-28215243982d" (UID: "d36a8adb-6b03-4d38-9e3d-28215243982d"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 26 15:47:09 crc kubenswrapper[4907]: I0226 15:47:09.452460 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d36a8adb-6b03-4d38-9e3d-28215243982d-proxy-ca-bundles" (OuterVolumeSpecName: "proxy-ca-bundles") pod "d36a8adb-6b03-4d38-9e3d-28215243982d" (UID: "d36a8adb-6b03-4d38-9e3d-28215243982d"). InnerVolumeSpecName "proxy-ca-bundles". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 26 15:47:09 crc kubenswrapper[4907]: I0226 15:47:09.452672 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d36a8adb-6b03-4d38-9e3d-28215243982d-kube-api-access-c8bx6" (OuterVolumeSpecName: "kube-api-access-c8bx6") pod "d36a8adb-6b03-4d38-9e3d-28215243982d" (UID: "d36a8adb-6b03-4d38-9e3d-28215243982d"). InnerVolumeSpecName "kube-api-access-c8bx6". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 26 15:47:09 crc kubenswrapper[4907]: I0226 15:47:09.454698 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d36a8adb-6b03-4d38-9e3d-28215243982d-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "d36a8adb-6b03-4d38-9e3d-28215243982d" (UID: "d36a8adb-6b03-4d38-9e3d-28215243982d"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 26 15:47:09 crc kubenswrapper[4907]: I0226 15:47:09.471115 4907 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/installer-9-crc" Feb 26 15:47:09 crc kubenswrapper[4907]: I0226 15:47:09.543356 4907 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/d36a8adb-6b03-4d38-9e3d-28215243982d-client-ca\") on node \"crc\" DevicePath \"\"" Feb 26 15:47:09 crc kubenswrapper[4907]: I0226 15:47:09.543797 4907 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d2396c40-c93b-4d72-afe1-33e0ae6475c4-config\") on node \"crc\" DevicePath \"\"" Feb 26 15:47:09 crc kubenswrapper[4907]: I0226 15:47:09.543877 4907 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-5446r\" (UniqueName: \"kubernetes.io/projected/d2396c40-c93b-4d72-afe1-33e0ae6475c4-kube-api-access-5446r\") on node \"crc\" DevicePath \"\"" Feb 26 15:47:09 crc kubenswrapper[4907]: I0226 15:47:09.543942 4907 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d36a8adb-6b03-4d38-9e3d-28215243982d-config\") on node \"crc\" DevicePath \"\"" Feb 26 15:47:09 crc kubenswrapper[4907]: I0226 15:47:09.543998 4907 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/d2396c40-c93b-4d72-afe1-33e0ae6475c4-serving-cert\") on node \"crc\" DevicePath \"\"" Feb 26 15:47:09 crc kubenswrapper[4907]: I0226 15:47:09.544077 4907 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-c8bx6\" (UniqueName: \"kubernetes.io/projected/d36a8adb-6b03-4d38-9e3d-28215243982d-kube-api-access-c8bx6\") on node \"crc\" DevicePath \"\"" Feb 26 15:47:09 crc kubenswrapper[4907]: I0226 15:47:09.544160 4907 reconciler_common.go:293] "Volume detached for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/d36a8adb-6b03-4d38-9e3d-28215243982d-proxy-ca-bundles\") on node \"crc\" DevicePath \"\"" Feb 26 15:47:09 crc kubenswrapper[4907]: I0226 15:47:09.544243 4907 
reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/d36a8adb-6b03-4d38-9e3d-28215243982d-serving-cert\") on node \"crc\" DevicePath \"\"" Feb 26 15:47:09 crc kubenswrapper[4907]: I0226 15:47:09.715931 4907 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-route-controller-manager/route-controller-manager-f5d65b5cf-tmtfr"] Feb 26 15:47:09 crc kubenswrapper[4907]: E0226 15:47:09.716891 4907 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d36a8adb-6b03-4d38-9e3d-28215243982d" containerName="controller-manager" Feb 26 15:47:09 crc kubenswrapper[4907]: I0226 15:47:09.716980 4907 state_mem.go:107] "Deleted CPUSet assignment" podUID="d36a8adb-6b03-4d38-9e3d-28215243982d" containerName="controller-manager" Feb 26 15:47:09 crc kubenswrapper[4907]: E0226 15:47:09.717056 4907 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d2396c40-c93b-4d72-afe1-33e0ae6475c4" containerName="route-controller-manager" Feb 26 15:47:09 crc kubenswrapper[4907]: I0226 15:47:09.717132 4907 state_mem.go:107] "Deleted CPUSet assignment" podUID="d2396c40-c93b-4d72-afe1-33e0ae6475c4" containerName="route-controller-manager" Feb 26 15:47:09 crc kubenswrapper[4907]: I0226 15:47:09.717299 4907 memory_manager.go:354] "RemoveStaleState removing state" podUID="d2396c40-c93b-4d72-afe1-33e0ae6475c4" containerName="route-controller-manager" Feb 26 15:47:09 crc kubenswrapper[4907]: I0226 15:47:09.717366 4907 memory_manager.go:354] "RemoveStaleState removing state" podUID="d36a8adb-6b03-4d38-9e3d-28215243982d" containerName="controller-manager" Feb 26 15:47:09 crc kubenswrapper[4907]: I0226 15:47:09.717931 4907 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-f5d65b5cf-tmtfr" Feb 26 15:47:09 crc kubenswrapper[4907]: I0226 15:47:09.728868 4907 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-controller-manager/controller-manager-66ff5595c7-g629l"] Feb 26 15:47:09 crc kubenswrapper[4907]: I0226 15:47:09.730079 4907 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-66ff5595c7-g629l" Feb 26 15:47:09 crc kubenswrapper[4907]: I0226 15:47:09.732104 4907 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-66ff5595c7-g629l"] Feb 26 15:47:09 crc kubenswrapper[4907]: I0226 15:47:09.736399 4907 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-f5d65b5cf-tmtfr"] Feb 26 15:47:09 crc kubenswrapper[4907]: I0226 15:47:09.850201 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1dcc24b7-fc12-45ae-ae4c-3f5f9579c2b5-config\") pod \"route-controller-manager-f5d65b5cf-tmtfr\" (UID: \"1dcc24b7-fc12-45ae-ae4c-3f5f9579c2b5\") " pod="openshift-route-controller-manager/route-controller-manager-f5d65b5cf-tmtfr" Feb 26 15:47:09 crc kubenswrapper[4907]: I0226 15:47:09.850352 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/48b4caaa-95bc-41de-9716-baf47a347bfa-config\") pod \"controller-manager-66ff5595c7-g629l\" (UID: \"48b4caaa-95bc-41de-9716-baf47a347bfa\") " pod="openshift-controller-manager/controller-manager-66ff5595c7-g629l" Feb 26 15:47:09 crc kubenswrapper[4907]: I0226 15:47:09.850377 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: 
\"kubernetes.io/configmap/48b4caaa-95bc-41de-9716-baf47a347bfa-client-ca\") pod \"controller-manager-66ff5595c7-g629l\" (UID: \"48b4caaa-95bc-41de-9716-baf47a347bfa\") " pod="openshift-controller-manager/controller-manager-66ff5595c7-g629l" Feb 26 15:47:09 crc kubenswrapper[4907]: I0226 15:47:09.850417 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/48b4caaa-95bc-41de-9716-baf47a347bfa-proxy-ca-bundles\") pod \"controller-manager-66ff5595c7-g629l\" (UID: \"48b4caaa-95bc-41de-9716-baf47a347bfa\") " pod="openshift-controller-manager/controller-manager-66ff5595c7-g629l" Feb 26 15:47:09 crc kubenswrapper[4907]: I0226 15:47:09.850435 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1dcc24b7-fc12-45ae-ae4c-3f5f9579c2b5-serving-cert\") pod \"route-controller-manager-f5d65b5cf-tmtfr\" (UID: \"1dcc24b7-fc12-45ae-ae4c-3f5f9579c2b5\") " pod="openshift-route-controller-manager/route-controller-manager-f5d65b5cf-tmtfr" Feb 26 15:47:09 crc kubenswrapper[4907]: I0226 15:47:09.850556 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dztdz\" (UniqueName: \"kubernetes.io/projected/1dcc24b7-fc12-45ae-ae4c-3f5f9579c2b5-kube-api-access-dztdz\") pod \"route-controller-manager-f5d65b5cf-tmtfr\" (UID: \"1dcc24b7-fc12-45ae-ae4c-3f5f9579c2b5\") " pod="openshift-route-controller-manager/route-controller-manager-f5d65b5cf-tmtfr" Feb 26 15:47:09 crc kubenswrapper[4907]: I0226 15:47:09.850651 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-47b6d\" (UniqueName: \"kubernetes.io/projected/48b4caaa-95bc-41de-9716-baf47a347bfa-kube-api-access-47b6d\") pod \"controller-manager-66ff5595c7-g629l\" (UID: \"48b4caaa-95bc-41de-9716-baf47a347bfa\") " 
pod="openshift-controller-manager/controller-manager-66ff5595c7-g629l" Feb 26 15:47:09 crc kubenswrapper[4907]: I0226 15:47:09.850715 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/48b4caaa-95bc-41de-9716-baf47a347bfa-serving-cert\") pod \"controller-manager-66ff5595c7-g629l\" (UID: \"48b4caaa-95bc-41de-9716-baf47a347bfa\") " pod="openshift-controller-manager/controller-manager-66ff5595c7-g629l" Feb 26 15:47:09 crc kubenswrapper[4907]: I0226 15:47:09.850737 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/1dcc24b7-fc12-45ae-ae4c-3f5f9579c2b5-client-ca\") pod \"route-controller-manager-f5d65b5cf-tmtfr\" (UID: \"1dcc24b7-fc12-45ae-ae4c-3f5f9579c2b5\") " pod="openshift-route-controller-manager/route-controller-manager-f5d65b5cf-tmtfr" Feb 26 15:47:09 crc kubenswrapper[4907]: I0226 15:47:09.952408 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/48b4caaa-95bc-41de-9716-baf47a347bfa-proxy-ca-bundles\") pod \"controller-manager-66ff5595c7-g629l\" (UID: \"48b4caaa-95bc-41de-9716-baf47a347bfa\") " pod="openshift-controller-manager/controller-manager-66ff5595c7-g629l" Feb 26 15:47:09 crc kubenswrapper[4907]: I0226 15:47:09.952451 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1dcc24b7-fc12-45ae-ae4c-3f5f9579c2b5-serving-cert\") pod \"route-controller-manager-f5d65b5cf-tmtfr\" (UID: \"1dcc24b7-fc12-45ae-ae4c-3f5f9579c2b5\") " pod="openshift-route-controller-manager/route-controller-manager-f5d65b5cf-tmtfr" Feb 26 15:47:09 crc kubenswrapper[4907]: I0226 15:47:09.952473 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dztdz\" (UniqueName: 
\"kubernetes.io/projected/1dcc24b7-fc12-45ae-ae4c-3f5f9579c2b5-kube-api-access-dztdz\") pod \"route-controller-manager-f5d65b5cf-tmtfr\" (UID: \"1dcc24b7-fc12-45ae-ae4c-3f5f9579c2b5\") " pod="openshift-route-controller-manager/route-controller-manager-f5d65b5cf-tmtfr" Feb 26 15:47:09 crc kubenswrapper[4907]: I0226 15:47:09.952505 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-47b6d\" (UniqueName: \"kubernetes.io/projected/48b4caaa-95bc-41de-9716-baf47a347bfa-kube-api-access-47b6d\") pod \"controller-manager-66ff5595c7-g629l\" (UID: \"48b4caaa-95bc-41de-9716-baf47a347bfa\") " pod="openshift-controller-manager/controller-manager-66ff5595c7-g629l" Feb 26 15:47:09 crc kubenswrapper[4907]: I0226 15:47:09.952527 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/1dcc24b7-fc12-45ae-ae4c-3f5f9579c2b5-client-ca\") pod \"route-controller-manager-f5d65b5cf-tmtfr\" (UID: \"1dcc24b7-fc12-45ae-ae4c-3f5f9579c2b5\") " pod="openshift-route-controller-manager/route-controller-manager-f5d65b5cf-tmtfr" Feb 26 15:47:09 crc kubenswrapper[4907]: I0226 15:47:09.952545 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/48b4caaa-95bc-41de-9716-baf47a347bfa-serving-cert\") pod \"controller-manager-66ff5595c7-g629l\" (UID: \"48b4caaa-95bc-41de-9716-baf47a347bfa\") " pod="openshift-controller-manager/controller-manager-66ff5595c7-g629l" Feb 26 15:47:09 crc kubenswrapper[4907]: I0226 15:47:09.952575 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1dcc24b7-fc12-45ae-ae4c-3f5f9579c2b5-config\") pod \"route-controller-manager-f5d65b5cf-tmtfr\" (UID: \"1dcc24b7-fc12-45ae-ae4c-3f5f9579c2b5\") " pod="openshift-route-controller-manager/route-controller-manager-f5d65b5cf-tmtfr" Feb 26 15:47:09 crc 
kubenswrapper[4907]: I0226 15:47:09.952643 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/48b4caaa-95bc-41de-9716-baf47a347bfa-config\") pod \"controller-manager-66ff5595c7-g629l\" (UID: \"48b4caaa-95bc-41de-9716-baf47a347bfa\") " pod="openshift-controller-manager/controller-manager-66ff5595c7-g629l" Feb 26 15:47:09 crc kubenswrapper[4907]: I0226 15:47:09.952670 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/48b4caaa-95bc-41de-9716-baf47a347bfa-client-ca\") pod \"controller-manager-66ff5595c7-g629l\" (UID: \"48b4caaa-95bc-41de-9716-baf47a347bfa\") " pod="openshift-controller-manager/controller-manager-66ff5595c7-g629l" Feb 26 15:47:09 crc kubenswrapper[4907]: I0226 15:47:09.954135 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1dcc24b7-fc12-45ae-ae4c-3f5f9579c2b5-config\") pod \"route-controller-manager-f5d65b5cf-tmtfr\" (UID: \"1dcc24b7-fc12-45ae-ae4c-3f5f9579c2b5\") " pod="openshift-route-controller-manager/route-controller-manager-f5d65b5cf-tmtfr" Feb 26 15:47:09 crc kubenswrapper[4907]: I0226 15:47:09.954129 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/48b4caaa-95bc-41de-9716-baf47a347bfa-proxy-ca-bundles\") pod \"controller-manager-66ff5595c7-g629l\" (UID: \"48b4caaa-95bc-41de-9716-baf47a347bfa\") " pod="openshift-controller-manager/controller-manager-66ff5595c7-g629l" Feb 26 15:47:09 crc kubenswrapper[4907]: I0226 15:47:09.955087 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/1dcc24b7-fc12-45ae-ae4c-3f5f9579c2b5-client-ca\") pod \"route-controller-manager-f5d65b5cf-tmtfr\" (UID: \"1dcc24b7-fc12-45ae-ae4c-3f5f9579c2b5\") " 
pod="openshift-route-controller-manager/route-controller-manager-f5d65b5cf-tmtfr" Feb 26 15:47:09 crc kubenswrapper[4907]: I0226 15:47:09.955786 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/48b4caaa-95bc-41de-9716-baf47a347bfa-client-ca\") pod \"controller-manager-66ff5595c7-g629l\" (UID: \"48b4caaa-95bc-41de-9716-baf47a347bfa\") " pod="openshift-controller-manager/controller-manager-66ff5595c7-g629l" Feb 26 15:47:09 crc kubenswrapper[4907]: I0226 15:47:09.957499 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/48b4caaa-95bc-41de-9716-baf47a347bfa-serving-cert\") pod \"controller-manager-66ff5595c7-g629l\" (UID: \"48b4caaa-95bc-41de-9716-baf47a347bfa\") " pod="openshift-controller-manager/controller-manager-66ff5595c7-g629l" Feb 26 15:47:09 crc kubenswrapper[4907]: I0226 15:47:09.957604 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1dcc24b7-fc12-45ae-ae4c-3f5f9579c2b5-serving-cert\") pod \"route-controller-manager-f5d65b5cf-tmtfr\" (UID: \"1dcc24b7-fc12-45ae-ae4c-3f5f9579c2b5\") " pod="openshift-route-controller-manager/route-controller-manager-f5d65b5cf-tmtfr" Feb 26 15:47:09 crc kubenswrapper[4907]: I0226 15:47:09.958206 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/48b4caaa-95bc-41de-9716-baf47a347bfa-config\") pod \"controller-manager-66ff5595c7-g629l\" (UID: \"48b4caaa-95bc-41de-9716-baf47a347bfa\") " pod="openshift-controller-manager/controller-manager-66ff5595c7-g629l" Feb 26 15:47:09 crc kubenswrapper[4907]: I0226 15:47:09.969982 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dztdz\" (UniqueName: \"kubernetes.io/projected/1dcc24b7-fc12-45ae-ae4c-3f5f9579c2b5-kube-api-access-dztdz\") pod 
\"route-controller-manager-f5d65b5cf-tmtfr\" (UID: \"1dcc24b7-fc12-45ae-ae4c-3f5f9579c2b5\") " pod="openshift-route-controller-manager/route-controller-manager-f5d65b5cf-tmtfr" Feb 26 15:47:09 crc kubenswrapper[4907]: I0226 15:47:09.970923 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-47b6d\" (UniqueName: \"kubernetes.io/projected/48b4caaa-95bc-41de-9716-baf47a347bfa-kube-api-access-47b6d\") pod \"controller-manager-66ff5595c7-g629l\" (UID: \"48b4caaa-95bc-41de-9716-baf47a347bfa\") " pod="openshift-controller-manager/controller-manager-66ff5595c7-g629l" Feb 26 15:47:10 crc kubenswrapper[4907]: I0226 15:47:10.041312 4907 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-f5d65b5cf-tmtfr" Feb 26 15:47:10 crc kubenswrapper[4907]: I0226 15:47:10.054548 4907 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-66ff5595c7-g629l" Feb 26 15:47:10 crc kubenswrapper[4907]: I0226 15:47:10.097908 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-55fc755786-qqt2m" event={"ID":"d2396c40-c93b-4d72-afe1-33e0ae6475c4","Type":"ContainerDied","Data":"05dedd185fecd7592a0a07b8010826b95be0b904d90beacbb168cd6764eb690b"} Feb 26 15:47:10 crc kubenswrapper[4907]: I0226 15:47:10.097962 4907 scope.go:117] "RemoveContainer" containerID="561afe4976a7f68785233a8d2f613299e0fb54e2d9e01ccc28fe2dfe4cc66cd1" Feb 26 15:47:10 crc kubenswrapper[4907]: I0226 15:47:10.098099 4907 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-55fc755786-qqt2m" Feb 26 15:47:10 crc kubenswrapper[4907]: I0226 15:47:10.104532 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-774c684776-b9h2m" event={"ID":"d36a8adb-6b03-4d38-9e3d-28215243982d","Type":"ContainerDied","Data":"759778165c9b2293c2623dcd0dd50552aa80d5f831092dca7c45589a2f8949f4"} Feb 26 15:47:10 crc kubenswrapper[4907]: I0226 15:47:10.104570 4907 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-774c684776-b9h2m" Feb 26 15:47:10 crc kubenswrapper[4907]: I0226 15:47:10.154430 4907 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-55fc755786-qqt2m"] Feb 26 15:47:10 crc kubenswrapper[4907]: I0226 15:47:10.154498 4907 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-55fc755786-qqt2m"] Feb 26 15:47:10 crc kubenswrapper[4907]: I0226 15:47:10.154517 4907 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-774c684776-b9h2m"] Feb 26 15:47:10 crc kubenswrapper[4907]: I0226 15:47:10.154531 4907 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-controller-manager/controller-manager-774c684776-b9h2m"] Feb 26 15:47:12 crc kubenswrapper[4907]: I0226 15:47:12.144761 4907 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d2396c40-c93b-4d72-afe1-33e0ae6475c4" path="/var/lib/kubelet/pods/d2396c40-c93b-4d72-afe1-33e0ae6475c4/volumes" Feb 26 15:47:12 crc kubenswrapper[4907]: I0226 15:47:12.148798 4907 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d36a8adb-6b03-4d38-9e3d-28215243982d" path="/var/lib/kubelet/pods/d36a8adb-6b03-4d38-9e3d-28215243982d/volumes" Feb 26 15:47:16 crc kubenswrapper[4907]: E0226 
15:47:16.566529 4907 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.redhat.io/redhat/redhat-operator-index:v4.18" Feb 26 15:47:16 crc kubenswrapper[4907]: E0226 15:47:16.566982 4907 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:extract-content,Image:registry.redhat.io/redhat/redhat-operator-index:v4.18,Command:[/utilities/copy-content],Args:[--catalog.from=/configs --catalog.to=/extracted-catalog/catalog --cache.from=/tmp/cache --cache.to=/extracted-catalog/cache],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:utilities,ReadOnly:false,MountPath:/utilities,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:catalog-content,ReadOnly:false,MountPath:/extracted-catalog,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-9nq7k,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000170000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod 
redhat-operators-jtqzb_openshift-marketplace(8eefa350-bfa6-48dc-9577-692787482b0d): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" logger="UnhandledError" Feb 26 15:47:16 crc kubenswrapper[4907]: E0226 15:47:16.568323 4907 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openshift-marketplace/redhat-operators-jtqzb" podUID="8eefa350-bfa6-48dc-9577-692787482b0d" Feb 26 15:47:17 crc kubenswrapper[4907]: E0226 15:47:17.882362 4907 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"\"" pod="openshift-marketplace/redhat-operators-jtqzb" podUID="8eefa350-bfa6-48dc-9577-692787482b0d" Feb 26 15:47:17 crc kubenswrapper[4907]: E0226 15:47:17.964695 4907 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.redhat.io/redhat/redhat-marketplace-index:v4.18" Feb 26 15:47:17 crc kubenswrapper[4907]: E0226 15:47:17.964857 4907 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:extract-content,Image:registry.redhat.io/redhat/redhat-marketplace-index:v4.18,Command:[/utilities/copy-content],Args:[--catalog.from=/configs --catalog.to=/extracted-catalog/catalog --cache.from=/tmp/cache 
--cache.to=/extracted-catalog/cache],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:utilities,ReadOnly:false,MountPath:/utilities,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:catalog-content,ReadOnly:false,MountPath:/extracted-catalog,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-x2p9w,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000170000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod redhat-marketplace-mnd2r_openshift-marketplace(2dc40859-37ff-41ea-88d7-6131b35ceebf): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" logger="UnhandledError" Feb 26 15:47:17 crc kubenswrapper[4907]: E0226 15:47:17.966009 4907 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openshift-marketplace/redhat-marketplace-mnd2r" podUID="2dc40859-37ff-41ea-88d7-6131b35ceebf" Feb 26 15:47:18 crc 
kubenswrapper[4907]: I0226 15:47:18.530083 4907 patch_prober.go:28] interesting pod/machine-config-daemon-v5ng6 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 26 15:47:18 crc kubenswrapper[4907]: I0226 15:47:18.530297 4907 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-v5ng6" podUID="917eebf3-db36-47b8-af0a-b80d042fddab" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 26 15:47:18 crc kubenswrapper[4907]: I0226 15:47:18.530336 4907 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-v5ng6" Feb 26 15:47:18 crc kubenswrapper[4907]: I0226 15:47:18.530712 4907 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"178aa71969c1efffd1f234213afe3cf84ffc1f8300112efb368309603695c3ee"} pod="openshift-machine-config-operator/machine-config-daemon-v5ng6" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Feb 26 15:47:18 crc kubenswrapper[4907]: I0226 15:47:18.530759 4907 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-v5ng6" podUID="917eebf3-db36-47b8-af0a-b80d042fddab" containerName="machine-config-daemon" containerID="cri-o://178aa71969c1efffd1f234213afe3cf84ffc1f8300112efb368309603695c3ee" gracePeriod=600 Feb 26 15:47:19 crc kubenswrapper[4907]: I0226 15:47:19.154378 4907 generic.go:334] "Generic (PLEG): container finished" podID="917eebf3-db36-47b8-af0a-b80d042fddab" 
containerID="178aa71969c1efffd1f234213afe3cf84ffc1f8300112efb368309603695c3ee" exitCode=0 Feb 26 15:47:19 crc kubenswrapper[4907]: I0226 15:47:19.154418 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-v5ng6" event={"ID":"917eebf3-db36-47b8-af0a-b80d042fddab","Type":"ContainerDied","Data":"178aa71969c1efffd1f234213afe3cf84ffc1f8300112efb368309603695c3ee"} Feb 26 15:47:19 crc kubenswrapper[4907]: E0226 15:47:19.878997 4907 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"\"" pod="openshift-marketplace/redhat-marketplace-mnd2r" podUID="2dc40859-37ff-41ea-88d7-6131b35ceebf" Feb 26 15:47:19 crc kubenswrapper[4907]: E0226 15:47:19.941653 4907 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.redhat.io/redhat/certified-operator-index:v4.18" Feb 26 15:47:19 crc kubenswrapper[4907]: E0226 15:47:19.941777 4907 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:extract-content,Image:registry.redhat.io/redhat/certified-operator-index:v4.18,Command:[/utilities/copy-content],Args:[--catalog.from=/configs --catalog.to=/extracted-catalog/catalog --cache.from=/tmp/cache 
--cache.to=/extracted-catalog/cache],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:utilities,ReadOnly:false,MountPath:/utilities,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:catalog-content,ReadOnly:false,MountPath:/extracted-catalog,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-xmcgc,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000170000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod certified-operators-tqxjz_openshift-marketplace(e0e96b15-45f7-47f1-878e-57914ef18916): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" logger="UnhandledError" Feb 26 15:47:19 crc kubenswrapper[4907]: E0226 15:47:19.943059 4907 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openshift-marketplace/certified-operators-tqxjz" podUID="e0e96b15-45f7-47f1-878e-57914ef18916" Feb 26 15:47:19 crc 
kubenswrapper[4907]: E0226 15:47:19.960354 4907 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.redhat.io/redhat/redhat-marketplace-index:v4.18" Feb 26 15:47:19 crc kubenswrapper[4907]: E0226 15:47:19.960488 4907 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:extract-content,Image:registry.redhat.io/redhat/redhat-marketplace-index:v4.18,Command:[/utilities/copy-content],Args:[--catalog.from=/configs --catalog.to=/extracted-catalog/catalog --cache.from=/tmp/cache --cache.to=/extracted-catalog/cache],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:utilities,ReadOnly:false,MountPath:/utilities,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:catalog-content,ReadOnly:false,MountPath:/extracted-catalog,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-bkv2t,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000170000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod 
redhat-marketplace-fcwbm_openshift-marketplace(6c70b66e-978a-4c7e-9892-5579869aa740): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" logger="UnhandledError" Feb 26 15:47:19 crc kubenswrapper[4907]: E0226 15:47:19.963416 4907 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openshift-marketplace/redhat-marketplace-fcwbm" podUID="6c70b66e-978a-4c7e-9892-5579869aa740" Feb 26 15:47:20 crc kubenswrapper[4907]: E0226 15:47:20.007723 4907 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.redhat.io/redhat/certified-operator-index:v4.18" Feb 26 15:47:20 crc kubenswrapper[4907]: E0226 15:47:20.007869 4907 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:extract-content,Image:registry.redhat.io/redhat/certified-operator-index:v4.18,Command:[/utilities/copy-content],Args:[--catalog.from=/configs --catalog.to=/extracted-catalog/catalog --cache.from=/tmp/cache 
--cache.to=/extracted-catalog/cache],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:utilities,ReadOnly:false,MountPath:/utilities,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:catalog-content,ReadOnly:false,MountPath:/extracted-catalog,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-j4g5s,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000170000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod certified-operators-qhfr7_openshift-marketplace(34138ff4-16e6-4f79-bd8f-0c8cb132ebde): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" logger="UnhandledError" Feb 26 15:47:20 crc kubenswrapper[4907]: E0226 15:47:20.009394 4907 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openshift-marketplace/certified-operators-qhfr7" podUID="34138ff4-16e6-4f79-bd8f-0c8cb132ebde" Feb 26 15:47:22 crc 
kubenswrapper[4907]: E0226 15:47:22.229741 4907 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"\"" pod="openshift-marketplace/certified-operators-qhfr7" podUID="34138ff4-16e6-4f79-bd8f-0c8cb132ebde" Feb 26 15:47:22 crc kubenswrapper[4907]: E0226 15:47:22.229764 4907 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"\"" pod="openshift-marketplace/certified-operators-tqxjz" podUID="e0e96b15-45f7-47f1-878e-57914ef18916" Feb 26 15:47:22 crc kubenswrapper[4907]: E0226 15:47:22.230366 4907 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"\"" pod="openshift-marketplace/redhat-marketplace-fcwbm" podUID="6c70b66e-978a-4c7e-9892-5579869aa740" Feb 26 15:47:22 crc kubenswrapper[4907]: I0226 15:47:22.236718 4907 scope.go:117] "RemoveContainer" containerID="cdffbc5453b263f0b82ac1c73974202aca16652666a41bfbb42d12ecdfc8dd51" Feb 26 15:47:22 crc kubenswrapper[4907]: I0226 15:47:22.599445 4907 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver/installer-9-crc"] Feb 26 15:47:22 crc kubenswrapper[4907]: I0226 15:47:22.701210 4907 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver/revision-pruner-9-crc"] Feb 26 15:47:22 crc kubenswrapper[4907]: I0226 15:47:22.713505 4907 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-f5d65b5cf-tmtfr"] Feb 26 15:47:22 crc kubenswrapper[4907]: W0226 15:47:22.724770 4907 manager.go:1169] Failed to process watch event 
{EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod1dcc24b7_fc12_45ae_ae4c_3f5f9579c2b5.slice/crio-4a86787ff3194ba3268101e9f267edcf222a996d42471fcd0eb0361dcb7b1e5f WatchSource:0}: Error finding container 4a86787ff3194ba3268101e9f267edcf222a996d42471fcd0eb0361dcb7b1e5f: Status 404 returned error can't find the container with id 4a86787ff3194ba3268101e9f267edcf222a996d42471fcd0eb0361dcb7b1e5f Feb 26 15:47:22 crc kubenswrapper[4907]: I0226 15:47:22.734455 4907 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-66ff5595c7-g629l"] Feb 26 15:47:22 crc kubenswrapper[4907]: E0226 15:47:22.927203 4907 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.redhat.io/redhat/community-operator-index:v4.18" Feb 26 15:47:22 crc kubenswrapper[4907]: E0226 15:47:22.927364 4907 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:extract-content,Image:registry.redhat.io/redhat/community-operator-index:v4.18,Command:[/utilities/copy-content],Args:[--catalog.from=/configs --catalog.to=/extracted-catalog/catalog --cache.from=/tmp/cache 
--cache.to=/extracted-catalog/cache],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:utilities,ReadOnly:false,MountPath:/utilities,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:catalog-content,ReadOnly:false,MountPath:/extracted-catalog,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-rqtzd,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000170000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod community-operators-2v8kx_openshift-marketplace(763dfaad-6b70-4ea8-a5ba-b4729dd1dcf2): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" logger="UnhandledError" Feb 26 15:47:22 crc kubenswrapper[4907]: E0226 15:47:22.928576 4907 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openshift-marketplace/community-operators-2v8kx" podUID="763dfaad-6b70-4ea8-a5ba-b4729dd1dcf2" Feb 26 15:47:23 crc 
kubenswrapper[4907]: I0226 15:47:23.176807 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-f5d65b5cf-tmtfr" event={"ID":"1dcc24b7-fc12-45ae-ae4c-3f5f9579c2b5","Type":"ContainerStarted","Data":"8d96943391f455bcf54423a6b5dcb5e6f40042b26c6f185fcc13fef35b5f5ad0"} Feb 26 15:47:23 crc kubenswrapper[4907]: I0226 15:47:23.177183 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-f5d65b5cf-tmtfr" event={"ID":"1dcc24b7-fc12-45ae-ae4c-3f5f9579c2b5","Type":"ContainerStarted","Data":"4a86787ff3194ba3268101e9f267edcf222a996d42471fcd0eb0361dcb7b1e5f"} Feb 26 15:47:23 crc kubenswrapper[4907]: I0226 15:47:23.179053 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/installer-9-crc" event={"ID":"abc73ba8-89c5-4844-a81e-742468c4366c","Type":"ContainerStarted","Data":"7cdc24871a7d9475407aaff2ed547ec8ab5b2de563852e2e7155029c78a8df0c"} Feb 26 15:47:23 crc kubenswrapper[4907]: I0226 15:47:23.181170 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-66ff5595c7-g629l" event={"ID":"48b4caaa-95bc-41de-9716-baf47a347bfa","Type":"ContainerStarted","Data":"1fc44381ced63d986d8e42ef89d07c4bf7d86842151dcccb8283d4c25f5f5cf2"} Feb 26 15:47:23 crc kubenswrapper[4907]: I0226 15:47:23.182133 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-9-crc" event={"ID":"60f1053f-25e8-4308-b15e-ca530e8118ab","Type":"ContainerStarted","Data":"218128a76cfbc451706f85b637efef6fe0e137b9e21f2864067a2bba0c9ef96e"} Feb 26 15:47:23 crc kubenswrapper[4907]: I0226 15:47:23.185086 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-v5ng6" 
event={"ID":"917eebf3-db36-47b8-af0a-b80d042fddab","Type":"ContainerStarted","Data":"5b5fce09e6f67f86221daea08fdd5259aaa4024d9dbe5e7a76056c4c092f3ec2"} Feb 26 15:47:23 crc kubenswrapper[4907]: E0226 15:47:23.186333 4907 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"\"" pod="openshift-marketplace/community-operators-2v8kx" podUID="763dfaad-6b70-4ea8-a5ba-b4729dd1dcf2" Feb 26 15:47:23 crc kubenswrapper[4907]: E0226 15:47:23.770909 4907 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.redhat.io/redhat/redhat-operator-index:v4.18" Feb 26 15:47:23 crc kubenswrapper[4907]: E0226 15:47:23.771053 4907 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:extract-content,Image:registry.redhat.io/redhat/redhat-operator-index:v4.18,Command:[/utilities/copy-content],Args:[--catalog.from=/configs --catalog.to=/extracted-catalog/catalog --cache.from=/tmp/cache 
--cache.to=/extracted-catalog/cache],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:utilities,ReadOnly:false,MountPath:/utilities,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:catalog-content,ReadOnly:false,MountPath:/extracted-catalog,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-7nzhg,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000170000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod redhat-operators-68qpc_openshift-marketplace(d6b454c4-bdcd-4904-8564-84c414871c6d): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" logger="UnhandledError" Feb 26 15:47:23 crc kubenswrapper[4907]: E0226 15:47:23.772352 4907 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openshift-marketplace/redhat-operators-68qpc" podUID="d6b454c4-bdcd-4904-8564-84c414871c6d" Feb 26 15:47:23 crc 
kubenswrapper[4907]: I0226 15:47:23.974549 4907 ???:1] "http: TLS handshake error from 192.168.126.11:46078: no serving certificate available for the kubelet" Feb 26 15:47:24 crc kubenswrapper[4907]: I0226 15:47:24.216044 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-9-crc" event={"ID":"60f1053f-25e8-4308-b15e-ca530e8118ab","Type":"ContainerStarted","Data":"b84906803c15eb222fee2e691c4374fda1e507f4589bf41903da5255818c2765"} Feb 26 15:47:24 crc kubenswrapper[4907]: I0226 15:47:24.217125 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/installer-9-crc" event={"ID":"abc73ba8-89c5-4844-a81e-742468c4366c","Type":"ContainerStarted","Data":"61366141be31e3250da68ac97435813984e0e5f56c778448271c955a8e8ad5b1"} Feb 26 15:47:24 crc kubenswrapper[4907]: I0226 15:47:24.219011 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-66ff5595c7-g629l" event={"ID":"48b4caaa-95bc-41de-9716-baf47a347bfa","Type":"ContainerStarted","Data":"b51cfbdad0fb09157d6f232abf6c434d15cefcaa6c7a93e930c0f2b41e5897d0"} Feb 26 15:47:24 crc kubenswrapper[4907]: I0226 15:47:24.219337 4907 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-controller-manager/controller-manager-66ff5595c7-g629l" Feb 26 15:47:24 crc kubenswrapper[4907]: E0226 15:47:24.227806 4907 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"\"" pod="openshift-marketplace/redhat-operators-68qpc" podUID="d6b454c4-bdcd-4904-8564-84c414871c6d" Feb 26 15:47:24 crc kubenswrapper[4907]: I0226 15:47:24.228833 4907 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-controller-manager/controller-manager-66ff5595c7-g629l" Feb 26 15:47:24 crc kubenswrapper[4907]: I0226 
15:47:24.255468 4907 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-apiserver/revision-pruner-9-crc" podStartSLOduration=20.25545077 podStartE2EDuration="20.25545077s" podCreationTimestamp="2026-02-26 15:47:04 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-26 15:47:24.236573123 +0000 UTC m=+306.755134972" watchObservedRunningTime="2026-02-26 15:47:24.25545077 +0000 UTC m=+306.774012619" Feb 26 15:47:24 crc kubenswrapper[4907]: I0226 15:47:24.340637 4907 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-apiserver/installer-9-crc" podStartSLOduration=15.340618803 podStartE2EDuration="15.340618803s" podCreationTimestamp="2026-02-26 15:47:09 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-26 15:47:24.310226321 +0000 UTC m=+306.828788170" watchObservedRunningTime="2026-02-26 15:47:24.340618803 +0000 UTC m=+306.859180652" Feb 26 15:47:24 crc kubenswrapper[4907]: I0226 15:47:24.340845 4907 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-route-controller-manager/route-controller-manager-f5d65b5cf-tmtfr" podStartSLOduration=16.340842048 podStartE2EDuration="16.340842048s" podCreationTimestamp="2026-02-26 15:47:08 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-26 15:47:24.339744862 +0000 UTC m=+306.858306711" watchObservedRunningTime="2026-02-26 15:47:24.340842048 +0000 UTC m=+306.859403897" Feb 26 15:47:24 crc kubenswrapper[4907]: I0226 15:47:24.383506 4907 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-controller-manager/controller-manager-66ff5595c7-g629l" podStartSLOduration=16.383487541 podStartE2EDuration="16.383487541s" 
podCreationTimestamp="2026-02-26 15:47:08 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-26 15:47:24.358786454 +0000 UTC m=+306.877348303" watchObservedRunningTime="2026-02-26 15:47:24.383487541 +0000 UTC m=+306.902049390" Feb 26 15:47:24 crc kubenswrapper[4907]: E0226 15:47:24.707782 4907 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.redhat.io/redhat/community-operator-index:v4.18" Feb 26 15:47:24 crc kubenswrapper[4907]: E0226 15:47:24.707952 4907 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:extract-content,Image:registry.redhat.io/redhat/community-operator-index:v4.18,Command:[/utilities/copy-content],Args:[--catalog.from=/configs --catalog.to=/extracted-catalog/catalog --cache.from=/tmp/cache --cache.to=/extracted-catalog/cache],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:utilities,ReadOnly:false,MountPath:/utilities,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:catalog-content,ReadOnly:false,MountPath:/extracted-catalog,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-8pwjf,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000170000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMoun
t:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod community-operators-22zr8_openshift-marketplace(4d3f9fc7-85b9-4095-af0d-7993e681ab2a): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" logger="UnhandledError" Feb 26 15:47:24 crc kubenswrapper[4907]: E0226 15:47:24.710085 4907 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openshift-marketplace/community-operators-22zr8" podUID="4d3f9fc7-85b9-4095-af0d-7993e681ab2a" Feb 26 15:47:25 crc kubenswrapper[4907]: I0226 15:47:25.194659 4907 csr.go:261] certificate signing request csr-wbrnr is approved, waiting to be issued Feb 26 15:47:25 crc kubenswrapper[4907]: I0226 15:47:25.201453 4907 csr.go:257] certificate signing request csr-wbrnr is issued Feb 26 15:47:25 crc kubenswrapper[4907]: I0226 15:47:25.226188 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29535344-fsndq" event={"ID":"1b0532e1-9350-435d-bb1f-72bb0931a2e8","Type":"ContainerStarted","Data":"5d03c4417c6bd60e984baf42dd9736b039ec92570e55f16ae134bc81394e032d"} Feb 26 15:47:25 crc kubenswrapper[4907]: I0226 15:47:25.229507 4907 generic.go:334] "Generic (PLEG): container finished" podID="60f1053f-25e8-4308-b15e-ca530e8118ab" containerID="b84906803c15eb222fee2e691c4374fda1e507f4589bf41903da5255818c2765" exitCode=0 Feb 26 15:47:25 crc kubenswrapper[4907]: I0226 15:47:25.229628 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-9-crc" 
event={"ID":"60f1053f-25e8-4308-b15e-ca530e8118ab","Type":"ContainerDied","Data":"b84906803c15eb222fee2e691c4374fda1e507f4589bf41903da5255818c2765"} Feb 26 15:47:25 crc kubenswrapper[4907]: I0226 15:47:25.233280 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29535346-hhrww" event={"ID":"c6986b68-4a8d-4677-bed1-493eb1a231c3","Type":"ContainerStarted","Data":"85ab84dfe988254bcbc6f434e907e2b18a8672cf99d2da69c404eba2f5afbaf8"} Feb 26 15:47:25 crc kubenswrapper[4907]: E0226 15:47:25.235116 4907 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"\"" pod="openshift-marketplace/community-operators-22zr8" podUID="4d3f9fc7-85b9-4095-af0d-7993e681ab2a" Feb 26 15:47:25 crc kubenswrapper[4907]: I0226 15:47:25.244438 4907 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-infra/auto-csr-approver-29535344-fsndq" podStartSLOduration=145.078708563 podStartE2EDuration="3m25.244416064s" podCreationTimestamp="2026-02-26 15:44:00 +0000 UTC" firstStartedPulling="2026-02-26 15:46:24.412743269 +0000 UTC m=+246.931305118" lastFinishedPulling="2026-02-26 15:47:24.57845077 +0000 UTC m=+307.097012619" observedRunningTime="2026-02-26 15:47:25.242090008 +0000 UTC m=+307.760651857" watchObservedRunningTime="2026-02-26 15:47:25.244416064 +0000 UTC m=+307.762977913" Feb 26 15:47:26 crc kubenswrapper[4907]: I0226 15:47:26.203396 4907 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2027-02-24 05:54:36 +0000 UTC, rotation deadline is 2026-12-02 21:50:21.113140052 +0000 UTC Feb 26 15:47:26 crc kubenswrapper[4907]: I0226 15:47:26.203451 4907 certificate_manager.go:356] kubernetes.io/kubelet-serving: Waiting 6702h2m54.909694257s for next certificate rotation Feb 26 15:47:26 crc kubenswrapper[4907]: I0226 15:47:26.239296 
4907 generic.go:334] "Generic (PLEG): container finished" podID="1b0532e1-9350-435d-bb1f-72bb0931a2e8" containerID="5d03c4417c6bd60e984baf42dd9736b039ec92570e55f16ae134bc81394e032d" exitCode=0 Feb 26 15:47:26 crc kubenswrapper[4907]: I0226 15:47:26.239357 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29535344-fsndq" event={"ID":"1b0532e1-9350-435d-bb1f-72bb0931a2e8","Type":"ContainerDied","Data":"5d03c4417c6bd60e984baf42dd9736b039ec92570e55f16ae134bc81394e032d"} Feb 26 15:47:26 crc kubenswrapper[4907]: I0226 15:47:26.241135 4907 generic.go:334] "Generic (PLEG): container finished" podID="c6986b68-4a8d-4677-bed1-493eb1a231c3" containerID="85ab84dfe988254bcbc6f434e907e2b18a8672cf99d2da69c404eba2f5afbaf8" exitCode=0 Feb 26 15:47:26 crc kubenswrapper[4907]: I0226 15:47:26.241232 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29535346-hhrww" event={"ID":"c6986b68-4a8d-4677-bed1-493eb1a231c3","Type":"ContainerDied","Data":"85ab84dfe988254bcbc6f434e907e2b18a8672cf99d2da69c404eba2f5afbaf8"} Feb 26 15:47:26 crc kubenswrapper[4907]: I0226 15:47:26.496851 4907 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29535346-hhrww" Feb 26 15:47:26 crc kubenswrapper[4907]: I0226 15:47:26.561465 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rnmft\" (UniqueName: \"kubernetes.io/projected/c6986b68-4a8d-4677-bed1-493eb1a231c3-kube-api-access-rnmft\") pod \"c6986b68-4a8d-4677-bed1-493eb1a231c3\" (UID: \"c6986b68-4a8d-4677-bed1-493eb1a231c3\") " Feb 26 15:47:26 crc kubenswrapper[4907]: I0226 15:47:26.566301 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c6986b68-4a8d-4677-bed1-493eb1a231c3-kube-api-access-rnmft" (OuterVolumeSpecName: "kube-api-access-rnmft") pod "c6986b68-4a8d-4677-bed1-493eb1a231c3" (UID: "c6986b68-4a8d-4677-bed1-493eb1a231c3"). InnerVolumeSpecName "kube-api-access-rnmft". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 26 15:47:26 crc kubenswrapper[4907]: I0226 15:47:26.598845 4907 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-9-crc" Feb 26 15:47:26 crc kubenswrapper[4907]: I0226 15:47:26.662286 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/60f1053f-25e8-4308-b15e-ca530e8118ab-kube-api-access\") pod \"60f1053f-25e8-4308-b15e-ca530e8118ab\" (UID: \"60f1053f-25e8-4308-b15e-ca530e8118ab\") " Feb 26 15:47:26 crc kubenswrapper[4907]: I0226 15:47:26.662394 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/60f1053f-25e8-4308-b15e-ca530e8118ab-kubelet-dir\") pod \"60f1053f-25e8-4308-b15e-ca530e8118ab\" (UID: \"60f1053f-25e8-4308-b15e-ca530e8118ab\") " Feb 26 15:47:26 crc kubenswrapper[4907]: I0226 15:47:26.662573 4907 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rnmft\" (UniqueName: \"kubernetes.io/projected/c6986b68-4a8d-4677-bed1-493eb1a231c3-kube-api-access-rnmft\") on node \"crc\" DevicePath \"\"" Feb 26 15:47:26 crc kubenswrapper[4907]: I0226 15:47:26.662645 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/60f1053f-25e8-4308-b15e-ca530e8118ab-kubelet-dir" (OuterVolumeSpecName: "kubelet-dir") pod "60f1053f-25e8-4308-b15e-ca530e8118ab" (UID: "60f1053f-25e8-4308-b15e-ca530e8118ab"). InnerVolumeSpecName "kubelet-dir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 26 15:47:26 crc kubenswrapper[4907]: I0226 15:47:26.665643 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/60f1053f-25e8-4308-b15e-ca530e8118ab-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "60f1053f-25e8-4308-b15e-ca530e8118ab" (UID: "60f1053f-25e8-4308-b15e-ca530e8118ab"). InnerVolumeSpecName "kube-api-access". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 26 15:47:26 crc kubenswrapper[4907]: I0226 15:47:26.764120 4907 reconciler_common.go:293] "Volume detached for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/60f1053f-25e8-4308-b15e-ca530e8118ab-kubelet-dir\") on node \"crc\" DevicePath \"\"" Feb 26 15:47:26 crc kubenswrapper[4907]: I0226 15:47:26.764146 4907 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/60f1053f-25e8-4308-b15e-ca530e8118ab-kube-api-access\") on node \"crc\" DevicePath \"\"" Feb 26 15:47:27 crc kubenswrapper[4907]: I0226 15:47:27.248220 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-9-crc" event={"ID":"60f1053f-25e8-4308-b15e-ca530e8118ab","Type":"ContainerDied","Data":"218128a76cfbc451706f85b637efef6fe0e137b9e21f2864067a2bba0c9ef96e"} Feb 26 15:47:27 crc kubenswrapper[4907]: I0226 15:47:27.248446 4907 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="218128a76cfbc451706f85b637efef6fe0e137b9e21f2864067a2bba0c9ef96e" Feb 26 15:47:27 crc kubenswrapper[4907]: I0226 15:47:27.248259 4907 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-9-crc" Feb 26 15:47:27 crc kubenswrapper[4907]: I0226 15:47:27.257329 4907 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29535346-hhrww" Feb 26 15:47:27 crc kubenswrapper[4907]: I0226 15:47:27.257709 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29535346-hhrww" event={"ID":"c6986b68-4a8d-4677-bed1-493eb1a231c3","Type":"ContainerDied","Data":"f84d2a2512a848bb2ee926d0b7687b3477982a665dff70d90763444b7b73d1ea"} Feb 26 15:47:27 crc kubenswrapper[4907]: I0226 15:47:27.257843 4907 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="f84d2a2512a848bb2ee926d0b7687b3477982a665dff70d90763444b7b73d1ea" Feb 26 15:47:27 crc kubenswrapper[4907]: I0226 15:47:27.566674 4907 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29535344-fsndq" Feb 26 15:47:27 crc kubenswrapper[4907]: I0226 15:47:27.674372 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-bhbhg\" (UniqueName: \"kubernetes.io/projected/1b0532e1-9350-435d-bb1f-72bb0931a2e8-kube-api-access-bhbhg\") pod \"1b0532e1-9350-435d-bb1f-72bb0931a2e8\" (UID: \"1b0532e1-9350-435d-bb1f-72bb0931a2e8\") " Feb 26 15:47:27 crc kubenswrapper[4907]: I0226 15:47:27.684824 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1b0532e1-9350-435d-bb1f-72bb0931a2e8-kube-api-access-bhbhg" (OuterVolumeSpecName: "kube-api-access-bhbhg") pod "1b0532e1-9350-435d-bb1f-72bb0931a2e8" (UID: "1b0532e1-9350-435d-bb1f-72bb0931a2e8"). InnerVolumeSpecName "kube-api-access-bhbhg". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 26 15:47:27 crc kubenswrapper[4907]: I0226 15:47:27.776088 4907 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-bhbhg\" (UniqueName: \"kubernetes.io/projected/1b0532e1-9350-435d-bb1f-72bb0931a2e8-kube-api-access-bhbhg\") on node \"crc\" DevicePath \"\"" Feb 26 15:47:28 crc kubenswrapper[4907]: I0226 15:47:28.263854 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29535344-fsndq" event={"ID":"1b0532e1-9350-435d-bb1f-72bb0931a2e8","Type":"ContainerDied","Data":"047527ef54d878e044c11075aca9d5dfdd97144eeef7800d6cc1101ee73d4379"} Feb 26 15:47:28 crc kubenswrapper[4907]: I0226 15:47:28.263895 4907 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="047527ef54d878e044c11075aca9d5dfdd97144eeef7800d6cc1101ee73d4379" Feb 26 15:47:28 crc kubenswrapper[4907]: I0226 15:47:28.263916 4907 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29535344-fsndq" Feb 26 15:47:30 crc kubenswrapper[4907]: I0226 15:47:30.042817 4907 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-route-controller-manager/route-controller-manager-f5d65b5cf-tmtfr" Feb 26 15:47:30 crc kubenswrapper[4907]: I0226 15:47:30.047511 4907 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-route-controller-manager/route-controller-manager-f5d65b5cf-tmtfr" Feb 26 15:47:34 crc kubenswrapper[4907]: I0226 15:47:34.295868 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-jtqzb" event={"ID":"8eefa350-bfa6-48dc-9577-692787482b0d","Type":"ContainerStarted","Data":"11c032adc51a3bf30ce404d20de56bd3c614dde407a0177627e4b9ede529c291"} Feb 26 15:47:35 crc kubenswrapper[4907]: I0226 15:47:35.302505 4907 generic.go:334] "Generic (PLEG): container finished" podID="34138ff4-16e6-4f79-bd8f-0c8cb132ebde" 
containerID="33edb84189c94eb903164e1a7747c2c77ba27f9767060389f1218b24ead39322" exitCode=0 Feb 26 15:47:35 crc kubenswrapper[4907]: I0226 15:47:35.302607 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-qhfr7" event={"ID":"34138ff4-16e6-4f79-bd8f-0c8cb132ebde","Type":"ContainerDied","Data":"33edb84189c94eb903164e1a7747c2c77ba27f9767060389f1218b24ead39322"} Feb 26 15:47:35 crc kubenswrapper[4907]: I0226 15:47:35.305033 4907 generic.go:334] "Generic (PLEG): container finished" podID="8eefa350-bfa6-48dc-9577-692787482b0d" containerID="11c032adc51a3bf30ce404d20de56bd3c614dde407a0177627e4b9ede529c291" exitCode=0 Feb 26 15:47:35 crc kubenswrapper[4907]: I0226 15:47:35.305061 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-jtqzb" event={"ID":"8eefa350-bfa6-48dc-9577-692787482b0d","Type":"ContainerDied","Data":"11c032adc51a3bf30ce404d20de56bd3c614dde407a0177627e4b9ede529c291"} Feb 26 15:47:36 crc kubenswrapper[4907]: I0226 15:47:36.312692 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-mnd2r" event={"ID":"2dc40859-37ff-41ea-88d7-6131b35ceebf","Type":"ContainerStarted","Data":"9698a1b1483790195ac54a02e8c7c6bfd639dbf1de7a3de5ca74dbf607b58c9d"} Feb 26 15:47:36 crc kubenswrapper[4907]: I0226 15:47:36.315703 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-fcwbm" event={"ID":"6c70b66e-978a-4c7e-9892-5579869aa740","Type":"ContainerStarted","Data":"0f48aaaabc782b274056eec753def33f5ba9dbd594bd7b6f158793c163222e37"} Feb 26 15:47:37 crc kubenswrapper[4907]: I0226 15:47:37.322803 4907 generic.go:334] "Generic (PLEG): container finished" podID="6c70b66e-978a-4c7e-9892-5579869aa740" containerID="0f48aaaabc782b274056eec753def33f5ba9dbd594bd7b6f158793c163222e37" exitCode=0 Feb 26 15:47:37 crc kubenswrapper[4907]: I0226 15:47:37.322848 4907 kubelet.go:2453] "SyncLoop (PLEG): event for 
pod" pod="openshift-marketplace/redhat-marketplace-fcwbm" event={"ID":"6c70b66e-978a-4c7e-9892-5579869aa740","Type":"ContainerDied","Data":"0f48aaaabc782b274056eec753def33f5ba9dbd594bd7b6f158793c163222e37"} Feb 26 15:47:37 crc kubenswrapper[4907]: I0226 15:47:37.325691 4907 generic.go:334] "Generic (PLEG): container finished" podID="2dc40859-37ff-41ea-88d7-6131b35ceebf" containerID="9698a1b1483790195ac54a02e8c7c6bfd639dbf1de7a3de5ca74dbf607b58c9d" exitCode=0 Feb 26 15:47:37 crc kubenswrapper[4907]: I0226 15:47:37.325748 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-mnd2r" event={"ID":"2dc40859-37ff-41ea-88d7-6131b35ceebf","Type":"ContainerDied","Data":"9698a1b1483790195ac54a02e8c7c6bfd639dbf1de7a3de5ca74dbf607b58c9d"} Feb 26 15:47:40 crc kubenswrapper[4907]: I0226 15:47:40.353271 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-jtqzb" event={"ID":"8eefa350-bfa6-48dc-9577-692787482b0d","Type":"ContainerStarted","Data":"039fc180708e826a32d5204f4264759ec492eb362dbd9933487c69228ef5f58a"} Feb 26 15:47:40 crc kubenswrapper[4907]: I0226 15:47:40.376973 4907 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-jtqzb" podStartSLOduration=2.726091221 podStartE2EDuration="1m7.376952006s" podCreationTimestamp="2026-02-26 15:46:33 +0000 UTC" firstStartedPulling="2026-02-26 15:46:35.334550453 +0000 UTC m=+257.853112292" lastFinishedPulling="2026-02-26 15:47:39.985411238 +0000 UTC m=+322.503973077" observedRunningTime="2026-02-26 15:47:40.373026026 +0000 UTC m=+322.891587945" watchObservedRunningTime="2026-02-26 15:47:40.376952006 +0000 UTC m=+322.895513865" Feb 26 15:47:43 crc kubenswrapper[4907]: I0226 15:47:43.372565 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-qhfr7" 
event={"ID":"34138ff4-16e6-4f79-bd8f-0c8cb132ebde","Type":"ContainerStarted","Data":"3cadbc6051b9d1b0b5f20f3f0447fbaa03257753484f286df360003c20bd0643"} Feb 26 15:47:43 crc kubenswrapper[4907]: I0226 15:47:43.375331 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-mnd2r" event={"ID":"2dc40859-37ff-41ea-88d7-6131b35ceebf","Type":"ContainerStarted","Data":"aa76c720fd0f4be368435500c1b661ff90dff2f494a4ae3c46c6ce68cb78e545"} Feb 26 15:47:43 crc kubenswrapper[4907]: I0226 15:47:43.377249 4907 generic.go:334] "Generic (PLEG): container finished" podID="4d3f9fc7-85b9-4095-af0d-7993e681ab2a" containerID="337eab21d91536771f9db3b8bc9e6c75eb59aa9d86381d97d7e4d96004617014" exitCode=0 Feb 26 15:47:43 crc kubenswrapper[4907]: I0226 15:47:43.377326 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-22zr8" event={"ID":"4d3f9fc7-85b9-4095-af0d-7993e681ab2a","Type":"ContainerDied","Data":"337eab21d91536771f9db3b8bc9e6c75eb59aa9d86381d97d7e4d96004617014"} Feb 26 15:47:43 crc kubenswrapper[4907]: I0226 15:47:43.379683 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-2v8kx" event={"ID":"763dfaad-6b70-4ea8-a5ba-b4729dd1dcf2","Type":"ContainerStarted","Data":"820d19843e9f8b3dde2255ec1f62105710e7b3e79ad020bae31022182cfd8324"} Feb 26 15:47:43 crc kubenswrapper[4907]: I0226 15:47:43.382384 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-68qpc" event={"ID":"d6b454c4-bdcd-4904-8564-84c414871c6d","Type":"ContainerStarted","Data":"5b6bba62015d7f1e8bca64181979b1590f0fdcc51bc221dc9e17782f8f30c36e"} Feb 26 15:47:43 crc kubenswrapper[4907]: I0226 15:47:43.384789 4907 generic.go:334] "Generic (PLEG): container finished" podID="e0e96b15-45f7-47f1-878e-57914ef18916" containerID="b0eccf1b45b5e24d81664d8f91f70b8dbe57b62bf009ff26d5fce1594fe459fc" exitCode=0 Feb 26 15:47:43 crc kubenswrapper[4907]: I0226 
15:47:43.384830 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-tqxjz" event={"ID":"e0e96b15-45f7-47f1-878e-57914ef18916","Type":"ContainerDied","Data":"b0eccf1b45b5e24d81664d8f91f70b8dbe57b62bf009ff26d5fce1594fe459fc"} Feb 26 15:47:43 crc kubenswrapper[4907]: I0226 15:47:43.387056 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-fcwbm" event={"ID":"6c70b66e-978a-4c7e-9892-5579869aa740","Type":"ContainerStarted","Data":"27fc85274312d14655440bdd5823fceaeff047f1528a4370fcb212cab5f45070"} Feb 26 15:47:43 crc kubenswrapper[4907]: I0226 15:47:43.401704 4907 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-qhfr7" podStartSLOduration=3.8491418619999997 podStartE2EDuration="1m14.401690558s" podCreationTimestamp="2026-02-26 15:46:29 +0000 UTC" firstStartedPulling="2026-02-26 15:46:32.097208284 +0000 UTC m=+254.615770133" lastFinishedPulling="2026-02-26 15:47:42.64975696 +0000 UTC m=+325.168318829" observedRunningTime="2026-02-26 15:47:43.40061365 +0000 UTC m=+325.919175499" watchObservedRunningTime="2026-02-26 15:47:43.401690558 +0000 UTC m=+325.920252407" Feb 26 15:47:43 crc kubenswrapper[4907]: I0226 15:47:43.436532 4907 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-mnd2r" podStartSLOduration=3.929850309 podStartE2EDuration="1m11.436514813s" podCreationTimestamp="2026-02-26 15:46:32 +0000 UTC" firstStartedPulling="2026-02-26 15:46:35.333336384 +0000 UTC m=+257.851898233" lastFinishedPulling="2026-02-26 15:47:42.840000888 +0000 UTC m=+325.358562737" observedRunningTime="2026-02-26 15:47:43.436121602 +0000 UTC m=+325.954683451" watchObservedRunningTime="2026-02-26 15:47:43.436514813 +0000 UTC m=+325.955076662" Feb 26 15:47:43 crc kubenswrapper[4907]: I0226 15:47:43.506393 4907 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" 
status="" pod="openshift-marketplace/redhat-operators-jtqzb" Feb 26 15:47:43 crc kubenswrapper[4907]: I0226 15:47:43.506697 4907 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-jtqzb" Feb 26 15:47:43 crc kubenswrapper[4907]: I0226 15:47:43.518161 4907 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-fcwbm" podStartSLOduration=4.116991633 podStartE2EDuration="1m12.51814498s" podCreationTimestamp="2026-02-26 15:46:31 +0000 UTC" firstStartedPulling="2026-02-26 15:46:34.246245392 +0000 UTC m=+256.764807241" lastFinishedPulling="2026-02-26 15:47:42.647398729 +0000 UTC m=+325.165960588" observedRunningTime="2026-02-26 15:47:43.517496183 +0000 UTC m=+326.036058032" watchObservedRunningTime="2026-02-26 15:47:43.51814498 +0000 UTC m=+326.036706829" Feb 26 15:47:44 crc kubenswrapper[4907]: I0226 15:47:44.392786 4907 generic.go:334] "Generic (PLEG): container finished" podID="763dfaad-6b70-4ea8-a5ba-b4729dd1dcf2" containerID="820d19843e9f8b3dde2255ec1f62105710e7b3e79ad020bae31022182cfd8324" exitCode=0 Feb 26 15:47:44 crc kubenswrapper[4907]: I0226 15:47:44.392844 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-2v8kx" event={"ID":"763dfaad-6b70-4ea8-a5ba-b4729dd1dcf2","Type":"ContainerDied","Data":"820d19843e9f8b3dde2255ec1f62105710e7b3e79ad020bae31022182cfd8324"} Feb 26 15:47:44 crc kubenswrapper[4907]: I0226 15:47:44.395854 4907 generic.go:334] "Generic (PLEG): container finished" podID="d6b454c4-bdcd-4904-8564-84c414871c6d" containerID="5b6bba62015d7f1e8bca64181979b1590f0fdcc51bc221dc9e17782f8f30c36e" exitCode=0 Feb 26 15:47:44 crc kubenswrapper[4907]: I0226 15:47:44.395905 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-68qpc" 
event={"ID":"d6b454c4-bdcd-4904-8564-84c414871c6d","Type":"ContainerDied","Data":"5b6bba62015d7f1e8bca64181979b1590f0fdcc51bc221dc9e17782f8f30c36e"} Feb 26 15:47:45 crc kubenswrapper[4907]: I0226 15:47:45.116067 4907 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-jtqzb" podUID="8eefa350-bfa6-48dc-9577-692787482b0d" containerName="registry-server" probeResult="failure" output=< Feb 26 15:47:45 crc kubenswrapper[4907]: timeout: failed to connect service ":50051" within 1s Feb 26 15:47:45 crc kubenswrapper[4907]: > Feb 26 15:47:46 crc kubenswrapper[4907]: I0226 15:47:46.408292 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-tqxjz" event={"ID":"e0e96b15-45f7-47f1-878e-57914ef18916","Type":"ContainerStarted","Data":"f014a26fb915e7edcbb3f7cb78c727a2b21301773c46b1bd95b32e8ed2744a66"} Feb 26 15:47:46 crc kubenswrapper[4907]: I0226 15:47:46.410565 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-22zr8" event={"ID":"4d3f9fc7-85b9-4095-af0d-7993e681ab2a","Type":"ContainerStarted","Data":"c72c74a6fe179f86c2339265699491607cd58e186d909b6af9e06f9ddcbd3100"} Feb 26 15:47:46 crc kubenswrapper[4907]: I0226 15:47:46.427559 4907 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-tqxjz" podStartSLOduration=3.614545571 podStartE2EDuration="1m17.427542754s" podCreationTimestamp="2026-02-26 15:46:29 +0000 UTC" firstStartedPulling="2026-02-26 15:46:32.091285673 +0000 UTC m=+254.609847522" lastFinishedPulling="2026-02-26 15:47:45.904282856 +0000 UTC m=+328.422844705" observedRunningTime="2026-02-26 15:47:46.426796485 +0000 UTC m=+328.945358334" watchObservedRunningTime="2026-02-26 15:47:46.427542754 +0000 UTC m=+328.946104603" Feb 26 15:47:47 crc kubenswrapper[4907]: I0226 15:47:47.418810 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-marketplace/redhat-operators-68qpc" event={"ID":"d6b454c4-bdcd-4904-8564-84c414871c6d","Type":"ContainerStarted","Data":"ff495918e96a3698db9a9a8dd4dd7887a76c7f0afe3392521131c07da299b110"} Feb 26 15:47:47 crc kubenswrapper[4907]: I0226 15:47:47.422262 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-2v8kx" event={"ID":"763dfaad-6b70-4ea8-a5ba-b4729dd1dcf2","Type":"ContainerStarted","Data":"98b5dc93a44069ead28dd6acf75ccfb89db095f25683cbe5525ec8594c89d9f7"} Feb 26 15:47:47 crc kubenswrapper[4907]: I0226 15:47:47.439750 4907 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-68qpc" podStartSLOduration=4.474481086 podStartE2EDuration="1m15.439729455s" podCreationTimestamp="2026-02-26 15:46:32 +0000 UTC" firstStartedPulling="2026-02-26 15:46:35.333062928 +0000 UTC m=+257.851624777" lastFinishedPulling="2026-02-26 15:47:46.298311297 +0000 UTC m=+328.816873146" observedRunningTime="2026-02-26 15:47:47.438384261 +0000 UTC m=+329.956946110" watchObservedRunningTime="2026-02-26 15:47:47.439729455 +0000 UTC m=+329.958291304" Feb 26 15:47:47 crc kubenswrapper[4907]: I0226 15:47:47.440655 4907 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-22zr8" podStartSLOduration=4.872506547 podStartE2EDuration="1m18.440647688s" podCreationTimestamp="2026-02-26 15:46:29 +0000 UTC" firstStartedPulling="2026-02-26 15:46:32.146233048 +0000 UTC m=+254.664794907" lastFinishedPulling="2026-02-26 15:47:45.714374199 +0000 UTC m=+328.232936048" observedRunningTime="2026-02-26 15:47:46.445695232 +0000 UTC m=+328.964257081" watchObservedRunningTime="2026-02-26 15:47:47.440647688 +0000 UTC m=+329.959209537" Feb 26 15:47:47 crc kubenswrapper[4907]: I0226 15:47:47.465327 4907 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-2v8kx" 
podStartSLOduration=2.977723674 podStartE2EDuration="1m17.46531212s" podCreationTimestamp="2026-02-26 15:46:30 +0000 UTC" firstStartedPulling="2026-02-26 15:46:32.071293659 +0000 UTC m=+254.589855508" lastFinishedPulling="2026-02-26 15:47:46.558882105 +0000 UTC m=+329.077443954" observedRunningTime="2026-02-26 15:47:47.463363731 +0000 UTC m=+329.981925600" watchObservedRunningTime="2026-02-26 15:47:47.46531212 +0000 UTC m=+329.983873969" Feb 26 15:47:48 crc kubenswrapper[4907]: I0226 15:47:48.389845 4907 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-66ff5595c7-g629l"] Feb 26 15:47:48 crc kubenswrapper[4907]: I0226 15:47:48.390047 4907 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-controller-manager/controller-manager-66ff5595c7-g629l" podUID="48b4caaa-95bc-41de-9716-baf47a347bfa" containerName="controller-manager" containerID="cri-o://b51cfbdad0fb09157d6f232abf6c434d15cefcaa6c7a93e930c0f2b41e5897d0" gracePeriod=30 Feb 26 15:47:48 crc kubenswrapper[4907]: I0226 15:47:48.488543 4907 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-f5d65b5cf-tmtfr"] Feb 26 15:47:48 crc kubenswrapper[4907]: I0226 15:47:48.489348 4907 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-route-controller-manager/route-controller-manager-f5d65b5cf-tmtfr" podUID="1dcc24b7-fc12-45ae-ae4c-3f5f9579c2b5" containerName="route-controller-manager" containerID="cri-o://8d96943391f455bcf54423a6b5dcb5e6f40042b26c6f185fcc13fef35b5f5ad0" gracePeriod=30 Feb 26 15:47:49 crc kubenswrapper[4907]: I0226 15:47:49.437527 4907 generic.go:334] "Generic (PLEG): container finished" podID="1dcc24b7-fc12-45ae-ae4c-3f5f9579c2b5" containerID="8d96943391f455bcf54423a6b5dcb5e6f40042b26c6f185fcc13fef35b5f5ad0" exitCode=0 Feb 26 15:47:49 crc kubenswrapper[4907]: I0226 15:47:49.437893 4907 kubelet.go:2453] "SyncLoop (PLEG): event 
for pod" pod="openshift-route-controller-manager/route-controller-manager-f5d65b5cf-tmtfr" event={"ID":"1dcc24b7-fc12-45ae-ae4c-3f5f9579c2b5","Type":"ContainerDied","Data":"8d96943391f455bcf54423a6b5dcb5e6f40042b26c6f185fcc13fef35b5f5ad0"} Feb 26 15:47:49 crc kubenswrapper[4907]: I0226 15:47:49.439291 4907 generic.go:334] "Generic (PLEG): container finished" podID="48b4caaa-95bc-41de-9716-baf47a347bfa" containerID="b51cfbdad0fb09157d6f232abf6c434d15cefcaa6c7a93e930c0f2b41e5897d0" exitCode=0 Feb 26 15:47:49 crc kubenswrapper[4907]: I0226 15:47:49.439318 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-66ff5595c7-g629l" event={"ID":"48b4caaa-95bc-41de-9716-baf47a347bfa","Type":"ContainerDied","Data":"b51cfbdad0fb09157d6f232abf6c434d15cefcaa6c7a93e930c0f2b41e5897d0"} Feb 26 15:47:49 crc kubenswrapper[4907]: I0226 15:47:49.597766 4907 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-66ff5595c7-g629l" Feb 26 15:47:49 crc kubenswrapper[4907]: I0226 15:47:49.666367 4907 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-f5d65b5cf-tmtfr" Feb 26 15:47:49 crc kubenswrapper[4907]: I0226 15:47:49.688280 4907 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-controller-manager/controller-manager-585d7c4c78-7ksjg"] Feb 26 15:47:49 crc kubenswrapper[4907]: E0226 15:47:49.688491 4907 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="48b4caaa-95bc-41de-9716-baf47a347bfa" containerName="controller-manager" Feb 26 15:47:49 crc kubenswrapper[4907]: I0226 15:47:49.688501 4907 state_mem.go:107] "Deleted CPUSet assignment" podUID="48b4caaa-95bc-41de-9716-baf47a347bfa" containerName="controller-manager" Feb 26 15:47:49 crc kubenswrapper[4907]: E0226 15:47:49.688515 4907 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="60f1053f-25e8-4308-b15e-ca530e8118ab" containerName="pruner" Feb 26 15:47:49 crc kubenswrapper[4907]: I0226 15:47:49.688521 4907 state_mem.go:107] "Deleted CPUSet assignment" podUID="60f1053f-25e8-4308-b15e-ca530e8118ab" containerName="pruner" Feb 26 15:47:49 crc kubenswrapper[4907]: E0226 15:47:49.688535 4907 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1dcc24b7-fc12-45ae-ae4c-3f5f9579c2b5" containerName="route-controller-manager" Feb 26 15:47:49 crc kubenswrapper[4907]: I0226 15:47:49.688543 4907 state_mem.go:107] "Deleted CPUSet assignment" podUID="1dcc24b7-fc12-45ae-ae4c-3f5f9579c2b5" containerName="route-controller-manager" Feb 26 15:47:49 crc kubenswrapper[4907]: E0226 15:47:49.688552 4907 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c6986b68-4a8d-4677-bed1-493eb1a231c3" containerName="oc" Feb 26 15:47:49 crc kubenswrapper[4907]: I0226 15:47:49.688557 4907 state_mem.go:107] "Deleted CPUSet assignment" podUID="c6986b68-4a8d-4677-bed1-493eb1a231c3" containerName="oc" Feb 26 15:47:49 crc kubenswrapper[4907]: E0226 15:47:49.688567 4907 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="1b0532e1-9350-435d-bb1f-72bb0931a2e8" containerName="oc" Feb 26 15:47:49 crc kubenswrapper[4907]: I0226 15:47:49.688574 4907 state_mem.go:107] "Deleted CPUSet assignment" podUID="1b0532e1-9350-435d-bb1f-72bb0931a2e8" containerName="oc" Feb 26 15:47:49 crc kubenswrapper[4907]: I0226 15:47:49.688684 4907 memory_manager.go:354] "RemoveStaleState removing state" podUID="48b4caaa-95bc-41de-9716-baf47a347bfa" containerName="controller-manager" Feb 26 15:47:49 crc kubenswrapper[4907]: I0226 15:47:49.688695 4907 memory_manager.go:354] "RemoveStaleState removing state" podUID="1b0532e1-9350-435d-bb1f-72bb0931a2e8" containerName="oc" Feb 26 15:47:49 crc kubenswrapper[4907]: I0226 15:47:49.688705 4907 memory_manager.go:354] "RemoveStaleState removing state" podUID="60f1053f-25e8-4308-b15e-ca530e8118ab" containerName="pruner" Feb 26 15:47:49 crc kubenswrapper[4907]: I0226 15:47:49.688718 4907 memory_manager.go:354] "RemoveStaleState removing state" podUID="1dcc24b7-fc12-45ae-ae4c-3f5f9579c2b5" containerName="route-controller-manager" Feb 26 15:47:49 crc kubenswrapper[4907]: I0226 15:47:49.688725 4907 memory_manager.go:354] "RemoveStaleState removing state" podUID="c6986b68-4a8d-4677-bed1-493eb1a231c3" containerName="oc" Feb 26 15:47:49 crc kubenswrapper[4907]: I0226 15:47:49.689067 4907 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-controller-manager/controller-manager-585d7c4c78-7ksjg" Feb 26 15:47:49 crc kubenswrapper[4907]: I0226 15:47:49.717153 4907 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-585d7c4c78-7ksjg"] Feb 26 15:47:49 crc kubenswrapper[4907]: I0226 15:47:49.717796 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/48b4caaa-95bc-41de-9716-baf47a347bfa-serving-cert\") pod \"48b4caaa-95bc-41de-9716-baf47a347bfa\" (UID: \"48b4caaa-95bc-41de-9716-baf47a347bfa\") " Feb 26 15:47:49 crc kubenswrapper[4907]: I0226 15:47:49.717928 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/48b4caaa-95bc-41de-9716-baf47a347bfa-client-ca\") pod \"48b4caaa-95bc-41de-9716-baf47a347bfa\" (UID: \"48b4caaa-95bc-41de-9716-baf47a347bfa\") " Feb 26 15:47:49 crc kubenswrapper[4907]: I0226 15:47:49.718005 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/48b4caaa-95bc-41de-9716-baf47a347bfa-config\") pod \"48b4caaa-95bc-41de-9716-baf47a347bfa\" (UID: \"48b4caaa-95bc-41de-9716-baf47a347bfa\") " Feb 26 15:47:49 crc kubenswrapper[4907]: I0226 15:47:49.718082 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-47b6d\" (UniqueName: \"kubernetes.io/projected/48b4caaa-95bc-41de-9716-baf47a347bfa-kube-api-access-47b6d\") pod \"48b4caaa-95bc-41de-9716-baf47a347bfa\" (UID: \"48b4caaa-95bc-41de-9716-baf47a347bfa\") " Feb 26 15:47:49 crc kubenswrapper[4907]: I0226 15:47:49.718156 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/48b4caaa-95bc-41de-9716-baf47a347bfa-proxy-ca-bundles\") pod \"48b4caaa-95bc-41de-9716-baf47a347bfa\" (UID: 
\"48b4caaa-95bc-41de-9716-baf47a347bfa\") " Feb 26 15:47:49 crc kubenswrapper[4907]: I0226 15:47:49.718739 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/48b4caaa-95bc-41de-9716-baf47a347bfa-proxy-ca-bundles" (OuterVolumeSpecName: "proxy-ca-bundles") pod "48b4caaa-95bc-41de-9716-baf47a347bfa" (UID: "48b4caaa-95bc-41de-9716-baf47a347bfa"). InnerVolumeSpecName "proxy-ca-bundles". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 26 15:47:49 crc kubenswrapper[4907]: I0226 15:47:49.718773 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/48b4caaa-95bc-41de-9716-baf47a347bfa-client-ca" (OuterVolumeSpecName: "client-ca") pod "48b4caaa-95bc-41de-9716-baf47a347bfa" (UID: "48b4caaa-95bc-41de-9716-baf47a347bfa"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 26 15:47:49 crc kubenswrapper[4907]: I0226 15:47:49.719253 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/48b4caaa-95bc-41de-9716-baf47a347bfa-config" (OuterVolumeSpecName: "config") pod "48b4caaa-95bc-41de-9716-baf47a347bfa" (UID: "48b4caaa-95bc-41de-9716-baf47a347bfa"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 26 15:47:49 crc kubenswrapper[4907]: I0226 15:47:49.730826 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/48b4caaa-95bc-41de-9716-baf47a347bfa-kube-api-access-47b6d" (OuterVolumeSpecName: "kube-api-access-47b6d") pod "48b4caaa-95bc-41de-9716-baf47a347bfa" (UID: "48b4caaa-95bc-41de-9716-baf47a347bfa"). InnerVolumeSpecName "kube-api-access-47b6d". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 26 15:47:49 crc kubenswrapper[4907]: I0226 15:47:49.731202 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/48b4caaa-95bc-41de-9716-baf47a347bfa-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "48b4caaa-95bc-41de-9716-baf47a347bfa" (UID: "48b4caaa-95bc-41de-9716-baf47a347bfa"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 26 15:47:49 crc kubenswrapper[4907]: I0226 15:47:49.819857 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1dcc24b7-fc12-45ae-ae4c-3f5f9579c2b5-serving-cert\") pod \"1dcc24b7-fc12-45ae-ae4c-3f5f9579c2b5\" (UID: \"1dcc24b7-fc12-45ae-ae4c-3f5f9579c2b5\") " Feb 26 15:47:49 crc kubenswrapper[4907]: I0226 15:47:49.819968 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1dcc24b7-fc12-45ae-ae4c-3f5f9579c2b5-config\") pod \"1dcc24b7-fc12-45ae-ae4c-3f5f9579c2b5\" (UID: \"1dcc24b7-fc12-45ae-ae4c-3f5f9579c2b5\") " Feb 26 15:47:49 crc kubenswrapper[4907]: I0226 15:47:49.820861 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/1dcc24b7-fc12-45ae-ae4c-3f5f9579c2b5-client-ca\") pod \"1dcc24b7-fc12-45ae-ae4c-3f5f9579c2b5\" (UID: \"1dcc24b7-fc12-45ae-ae4c-3f5f9579c2b5\") " Feb 26 15:47:49 crc kubenswrapper[4907]: I0226 15:47:49.820791 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1dcc24b7-fc12-45ae-ae4c-3f5f9579c2b5-config" (OuterVolumeSpecName: "config") pod "1dcc24b7-fc12-45ae-ae4c-3f5f9579c2b5" (UID: "1dcc24b7-fc12-45ae-ae4c-3f5f9579c2b5"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 26 15:47:49 crc kubenswrapper[4907]: I0226 15:47:49.821371 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1dcc24b7-fc12-45ae-ae4c-3f5f9579c2b5-client-ca" (OuterVolumeSpecName: "client-ca") pod "1dcc24b7-fc12-45ae-ae4c-3f5f9579c2b5" (UID: "1dcc24b7-fc12-45ae-ae4c-3f5f9579c2b5"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 26 15:47:49 crc kubenswrapper[4907]: I0226 15:47:49.821440 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-dztdz\" (UniqueName: \"kubernetes.io/projected/1dcc24b7-fc12-45ae-ae4c-3f5f9579c2b5-kube-api-access-dztdz\") pod \"1dcc24b7-fc12-45ae-ae4c-3f5f9579c2b5\" (UID: \"1dcc24b7-fc12-45ae-ae4c-3f5f9579c2b5\") " Feb 26 15:47:49 crc kubenswrapper[4907]: I0226 15:47:49.822001 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/40555409-ee5f-45d8-9112-e3f5864d93aa-proxy-ca-bundles\") pod \"controller-manager-585d7c4c78-7ksjg\" (UID: \"40555409-ee5f-45d8-9112-e3f5864d93aa\") " pod="openshift-controller-manager/controller-manager-585d7c4c78-7ksjg" Feb 26 15:47:49 crc kubenswrapper[4907]: I0226 15:47:49.822068 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/40555409-ee5f-45d8-9112-e3f5864d93aa-config\") pod \"controller-manager-585d7c4c78-7ksjg\" (UID: \"40555409-ee5f-45d8-9112-e3f5864d93aa\") " pod="openshift-controller-manager/controller-manager-585d7c4c78-7ksjg" Feb 26 15:47:49 crc kubenswrapper[4907]: I0226 15:47:49.822134 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gzhxw\" (UniqueName: 
\"kubernetes.io/projected/40555409-ee5f-45d8-9112-e3f5864d93aa-kube-api-access-gzhxw\") pod \"controller-manager-585d7c4c78-7ksjg\" (UID: \"40555409-ee5f-45d8-9112-e3f5864d93aa\") " pod="openshift-controller-manager/controller-manager-585d7c4c78-7ksjg" Feb 26 15:47:49 crc kubenswrapper[4907]: I0226 15:47:49.822160 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/40555409-ee5f-45d8-9112-e3f5864d93aa-client-ca\") pod \"controller-manager-585d7c4c78-7ksjg\" (UID: \"40555409-ee5f-45d8-9112-e3f5864d93aa\") " pod="openshift-controller-manager/controller-manager-585d7c4c78-7ksjg" Feb 26 15:47:49 crc kubenswrapper[4907]: I0226 15:47:49.822179 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/40555409-ee5f-45d8-9112-e3f5864d93aa-serving-cert\") pod \"controller-manager-585d7c4c78-7ksjg\" (UID: \"40555409-ee5f-45d8-9112-e3f5864d93aa\") " pod="openshift-controller-manager/controller-manager-585d7c4c78-7ksjg" Feb 26 15:47:49 crc kubenswrapper[4907]: I0226 15:47:49.822236 4907 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/48b4caaa-95bc-41de-9716-baf47a347bfa-serving-cert\") on node \"crc\" DevicePath \"\"" Feb 26 15:47:49 crc kubenswrapper[4907]: I0226 15:47:49.822246 4907 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/48b4caaa-95bc-41de-9716-baf47a347bfa-client-ca\") on node \"crc\" DevicePath \"\"" Feb 26 15:47:49 crc kubenswrapper[4907]: I0226 15:47:49.822254 4907 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/48b4caaa-95bc-41de-9716-baf47a347bfa-config\") on node \"crc\" DevicePath \"\"" Feb 26 15:47:49 crc kubenswrapper[4907]: I0226 15:47:49.822262 4907 reconciler_common.go:293] "Volume detached 
for volume \"kube-api-access-47b6d\" (UniqueName: \"kubernetes.io/projected/48b4caaa-95bc-41de-9716-baf47a347bfa-kube-api-access-47b6d\") on node \"crc\" DevicePath \"\"" Feb 26 15:47:49 crc kubenswrapper[4907]: I0226 15:47:49.822288 4907 reconciler_common.go:293] "Volume detached for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/48b4caaa-95bc-41de-9716-baf47a347bfa-proxy-ca-bundles\") on node \"crc\" DevicePath \"\"" Feb 26 15:47:49 crc kubenswrapper[4907]: I0226 15:47:49.822297 4907 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1dcc24b7-fc12-45ae-ae4c-3f5f9579c2b5-config\") on node \"crc\" DevicePath \"\"" Feb 26 15:47:49 crc kubenswrapper[4907]: I0226 15:47:49.822305 4907 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/1dcc24b7-fc12-45ae-ae4c-3f5f9579c2b5-client-ca\") on node \"crc\" DevicePath \"\"" Feb 26 15:47:49 crc kubenswrapper[4907]: I0226 15:47:49.824763 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1dcc24b7-fc12-45ae-ae4c-3f5f9579c2b5-kube-api-access-dztdz" (OuterVolumeSpecName: "kube-api-access-dztdz") pod "1dcc24b7-fc12-45ae-ae4c-3f5f9579c2b5" (UID: "1dcc24b7-fc12-45ae-ae4c-3f5f9579c2b5"). InnerVolumeSpecName "kube-api-access-dztdz". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 26 15:47:49 crc kubenswrapper[4907]: I0226 15:47:49.824771 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1dcc24b7-fc12-45ae-ae4c-3f5f9579c2b5-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "1dcc24b7-fc12-45ae-ae4c-3f5f9579c2b5" (UID: "1dcc24b7-fc12-45ae-ae4c-3f5f9579c2b5"). InnerVolumeSpecName "serving-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 26 15:47:49 crc kubenswrapper[4907]: I0226 15:47:49.926118 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/40555409-ee5f-45d8-9112-e3f5864d93aa-config\") pod \"controller-manager-585d7c4c78-7ksjg\" (UID: \"40555409-ee5f-45d8-9112-e3f5864d93aa\") " pod="openshift-controller-manager/controller-manager-585d7c4c78-7ksjg" Feb 26 15:47:49 crc kubenswrapper[4907]: I0226 15:47:49.923851 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/40555409-ee5f-45d8-9112-e3f5864d93aa-config\") pod \"controller-manager-585d7c4c78-7ksjg\" (UID: \"40555409-ee5f-45d8-9112-e3f5864d93aa\") " pod="openshift-controller-manager/controller-manager-585d7c4c78-7ksjg" Feb 26 15:47:49 crc kubenswrapper[4907]: I0226 15:47:49.926259 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gzhxw\" (UniqueName: \"kubernetes.io/projected/40555409-ee5f-45d8-9112-e3f5864d93aa-kube-api-access-gzhxw\") pod \"controller-manager-585d7c4c78-7ksjg\" (UID: \"40555409-ee5f-45d8-9112-e3f5864d93aa\") " pod="openshift-controller-manager/controller-manager-585d7c4c78-7ksjg" Feb 26 15:47:49 crc kubenswrapper[4907]: I0226 15:47:49.926729 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/40555409-ee5f-45d8-9112-e3f5864d93aa-client-ca\") pod \"controller-manager-585d7c4c78-7ksjg\" (UID: \"40555409-ee5f-45d8-9112-e3f5864d93aa\") " pod="openshift-controller-manager/controller-manager-585d7c4c78-7ksjg" Feb 26 15:47:49 crc kubenswrapper[4907]: I0226 15:47:49.927849 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/40555409-ee5f-45d8-9112-e3f5864d93aa-client-ca\") pod \"controller-manager-585d7c4c78-7ksjg\" (UID: 
\"40555409-ee5f-45d8-9112-e3f5864d93aa\") " pod="openshift-controller-manager/controller-manager-585d7c4c78-7ksjg" Feb 26 15:47:49 crc kubenswrapper[4907]: I0226 15:47:49.927933 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/40555409-ee5f-45d8-9112-e3f5864d93aa-serving-cert\") pod \"controller-manager-585d7c4c78-7ksjg\" (UID: \"40555409-ee5f-45d8-9112-e3f5864d93aa\") " pod="openshift-controller-manager/controller-manager-585d7c4c78-7ksjg" Feb 26 15:47:49 crc kubenswrapper[4907]: I0226 15:47:49.928056 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/40555409-ee5f-45d8-9112-e3f5864d93aa-proxy-ca-bundles\") pod \"controller-manager-585d7c4c78-7ksjg\" (UID: \"40555409-ee5f-45d8-9112-e3f5864d93aa\") " pod="openshift-controller-manager/controller-manager-585d7c4c78-7ksjg" Feb 26 15:47:49 crc kubenswrapper[4907]: I0226 15:47:49.928119 4907 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-dztdz\" (UniqueName: \"kubernetes.io/projected/1dcc24b7-fc12-45ae-ae4c-3f5f9579c2b5-kube-api-access-dztdz\") on node \"crc\" DevicePath \"\"" Feb 26 15:47:49 crc kubenswrapper[4907]: I0226 15:47:49.928140 4907 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1dcc24b7-fc12-45ae-ae4c-3f5f9579c2b5-serving-cert\") on node \"crc\" DevicePath \"\"" Feb 26 15:47:49 crc kubenswrapper[4907]: I0226 15:47:49.931655 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/40555409-ee5f-45d8-9112-e3f5864d93aa-proxy-ca-bundles\") pod \"controller-manager-585d7c4c78-7ksjg\" (UID: \"40555409-ee5f-45d8-9112-e3f5864d93aa\") " pod="openshift-controller-manager/controller-manager-585d7c4c78-7ksjg" Feb 26 15:47:49 crc kubenswrapper[4907]: I0226 15:47:49.933083 4907 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/40555409-ee5f-45d8-9112-e3f5864d93aa-serving-cert\") pod \"controller-manager-585d7c4c78-7ksjg\" (UID: \"40555409-ee5f-45d8-9112-e3f5864d93aa\") " pod="openshift-controller-manager/controller-manager-585d7c4c78-7ksjg" Feb 26 15:47:49 crc kubenswrapper[4907]: I0226 15:47:49.952420 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gzhxw\" (UniqueName: \"kubernetes.io/projected/40555409-ee5f-45d8-9112-e3f5864d93aa-kube-api-access-gzhxw\") pod \"controller-manager-585d7c4c78-7ksjg\" (UID: \"40555409-ee5f-45d8-9112-e3f5864d93aa\") " pod="openshift-controller-manager/controller-manager-585d7c4c78-7ksjg" Feb 26 15:47:50 crc kubenswrapper[4907]: I0226 15:47:50.004259 4907 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-tqxjz" Feb 26 15:47:50 crc kubenswrapper[4907]: I0226 15:47:50.004559 4907 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-tqxjz" Feb 26 15:47:50 crc kubenswrapper[4907]: I0226 15:47:50.005211 4907 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-controller-manager/controller-manager-585d7c4c78-7ksjg" Feb 26 15:47:50 crc kubenswrapper[4907]: I0226 15:47:50.098733 4907 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-tqxjz" Feb 26 15:47:50 crc kubenswrapper[4907]: I0226 15:47:50.362135 4907 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-qhfr7" Feb 26 15:47:50 crc kubenswrapper[4907]: I0226 15:47:50.362223 4907 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-qhfr7" Feb 26 15:47:50 crc kubenswrapper[4907]: I0226 15:47:50.421540 4907 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-qhfr7" Feb 26 15:47:50 crc kubenswrapper[4907]: I0226 15:47:50.432481 4907 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-585d7c4c78-7ksjg"] Feb 26 15:47:50 crc kubenswrapper[4907]: W0226 15:47:50.439276 4907 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod40555409_ee5f_45d8_9112_e3f5864d93aa.slice/crio-4f4072c1845ec548e405023dc051c60b186e8263bee05480d7db43ad305020d1 WatchSource:0}: Error finding container 4f4072c1845ec548e405023dc051c60b186e8263bee05480d7db43ad305020d1: Status 404 returned error can't find the container with id 4f4072c1845ec548e405023dc051c60b186e8263bee05480d7db43ad305020d1 Feb 26 15:47:50 crc kubenswrapper[4907]: I0226 15:47:50.446886 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-585d7c4c78-7ksjg" event={"ID":"40555409-ee5f-45d8-9112-e3f5864d93aa","Type":"ContainerStarted","Data":"4f4072c1845ec548e405023dc051c60b186e8263bee05480d7db43ad305020d1"} Feb 26 15:47:50 crc kubenswrapper[4907]: I0226 15:47:50.450241 4907 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-66ff5595c7-g629l" event={"ID":"48b4caaa-95bc-41de-9716-baf47a347bfa","Type":"ContainerDied","Data":"1fc44381ced63d986d8e42ef89d07c4bf7d86842151dcccb8283d4c25f5f5cf2"} Feb 26 15:47:50 crc kubenswrapper[4907]: I0226 15:47:50.450331 4907 scope.go:117] "RemoveContainer" containerID="b51cfbdad0fb09157d6f232abf6c434d15cefcaa6c7a93e930c0f2b41e5897d0" Feb 26 15:47:50 crc kubenswrapper[4907]: I0226 15:47:50.450653 4907 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-66ff5595c7-g629l" Feb 26 15:47:50 crc kubenswrapper[4907]: I0226 15:47:50.459837 4907 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-f5d65b5cf-tmtfr" Feb 26 15:47:50 crc kubenswrapper[4907]: I0226 15:47:50.460382 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-f5d65b5cf-tmtfr" event={"ID":"1dcc24b7-fc12-45ae-ae4c-3f5f9579c2b5","Type":"ContainerDied","Data":"4a86787ff3194ba3268101e9f267edcf222a996d42471fcd0eb0361dcb7b1e5f"} Feb 26 15:47:50 crc kubenswrapper[4907]: I0226 15:47:50.463801 4907 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-22zr8" Feb 26 15:47:50 crc kubenswrapper[4907]: I0226 15:47:50.463833 4907 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-22zr8" Feb 26 15:47:50 crc kubenswrapper[4907]: I0226 15:47:50.475335 4907 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-66ff5595c7-g629l"] Feb 26 15:47:50 crc kubenswrapper[4907]: I0226 15:47:50.479773 4907 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-controller-manager/controller-manager-66ff5595c7-g629l"] 
Feb 26 15:47:50 crc kubenswrapper[4907]: I0226 15:47:50.507870 4907 scope.go:117] "RemoveContainer" containerID="8d96943391f455bcf54423a6b5dcb5e6f40042b26c6f185fcc13fef35b5f5ad0" Feb 26 15:47:50 crc kubenswrapper[4907]: I0226 15:47:50.513813 4907 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-f5d65b5cf-tmtfr"] Feb 26 15:47:50 crc kubenswrapper[4907]: I0226 15:47:50.517065 4907 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-f5d65b5cf-tmtfr"] Feb 26 15:47:50 crc kubenswrapper[4907]: I0226 15:47:50.526987 4907 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-qhfr7" Feb 26 15:47:50 crc kubenswrapper[4907]: I0226 15:47:50.541365 4907 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-22zr8" Feb 26 15:47:50 crc kubenswrapper[4907]: I0226 15:47:50.559190 4907 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-2v8kx" Feb 26 15:47:50 crc kubenswrapper[4907]: I0226 15:47:50.559226 4907 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-2v8kx" Feb 26 15:47:50 crc kubenswrapper[4907]: I0226 15:47:50.614059 4907 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-2v8kx" Feb 26 15:47:51 crc kubenswrapper[4907]: I0226 15:47:51.465639 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-585d7c4c78-7ksjg" event={"ID":"40555409-ee5f-45d8-9112-e3f5864d93aa","Type":"ContainerStarted","Data":"6ef294d0b96ea5c7f7e05588c22a4d436257500922a75835f988b36629ff3061"} Feb 26 15:47:51 crc kubenswrapper[4907]: I0226 15:47:51.465860 4907 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" 
status="" pod="openshift-controller-manager/controller-manager-585d7c4c78-7ksjg" Feb 26 15:47:51 crc kubenswrapper[4907]: I0226 15:47:51.478326 4907 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-controller-manager/controller-manager-585d7c4c78-7ksjg" Feb 26 15:47:51 crc kubenswrapper[4907]: I0226 15:47:51.658013 4907 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-controller-manager/controller-manager-585d7c4c78-7ksjg" podStartSLOduration=3.657997131 podStartE2EDuration="3.657997131s" podCreationTimestamp="2026-02-26 15:47:48 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-26 15:47:51.494629484 +0000 UTC m=+334.013191333" watchObservedRunningTime="2026-02-26 15:47:51.657997131 +0000 UTC m=+334.176558980" Feb 26 15:47:51 crc kubenswrapper[4907]: I0226 15:47:51.696622 4907 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-tqxjz" Feb 26 15:47:51 crc kubenswrapper[4907]: I0226 15:47:51.696905 4907 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-22zr8" Feb 26 15:47:51 crc kubenswrapper[4907]: I0226 15:47:51.875998 4907 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-route-controller-manager/route-controller-manager-7f749d5666-wlmx4"] Feb 26 15:47:51 crc kubenswrapper[4907]: I0226 15:47:51.876665 4907 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-7f749d5666-wlmx4" Feb 26 15:47:51 crc kubenswrapper[4907]: I0226 15:47:51.879027 4907 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"config" Feb 26 15:47:51 crc kubenswrapper[4907]: I0226 15:47:51.882044 4907 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"serving-cert" Feb 26 15:47:51 crc kubenswrapper[4907]: I0226 15:47:51.882250 4907 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"route-controller-manager-sa-dockercfg-h2zr2" Feb 26 15:47:51 crc kubenswrapper[4907]: I0226 15:47:51.882401 4907 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"kube-root-ca.crt" Feb 26 15:47:51 crc kubenswrapper[4907]: I0226 15:47:51.882533 4907 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"client-ca" Feb 26 15:47:51 crc kubenswrapper[4907]: I0226 15:47:51.882653 4907 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"openshift-service-ca.crt" Feb 26 15:47:51 crc kubenswrapper[4907]: I0226 15:47:51.894665 4907 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-7f749d5666-wlmx4"] Feb 26 15:47:51 crc kubenswrapper[4907]: I0226 15:47:51.955787 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/4c0a8c3f-a6d1-48fc-bd2f-5f1e771f708f-client-ca\") pod \"route-controller-manager-7f749d5666-wlmx4\" (UID: \"4c0a8c3f-a6d1-48fc-bd2f-5f1e771f708f\") " pod="openshift-route-controller-manager/route-controller-manager-7f749d5666-wlmx4" Feb 26 15:47:51 crc kubenswrapper[4907]: I0226 15:47:51.955827 4907 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/4c0a8c3f-a6d1-48fc-bd2f-5f1e771f708f-config\") pod \"route-controller-manager-7f749d5666-wlmx4\" (UID: \"4c0a8c3f-a6d1-48fc-bd2f-5f1e771f708f\") " pod="openshift-route-controller-manager/route-controller-manager-7f749d5666-wlmx4" Feb 26 15:47:51 crc kubenswrapper[4907]: I0226 15:47:51.955864 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wxngl\" (UniqueName: \"kubernetes.io/projected/4c0a8c3f-a6d1-48fc-bd2f-5f1e771f708f-kube-api-access-wxngl\") pod \"route-controller-manager-7f749d5666-wlmx4\" (UID: \"4c0a8c3f-a6d1-48fc-bd2f-5f1e771f708f\") " pod="openshift-route-controller-manager/route-controller-manager-7f749d5666-wlmx4" Feb 26 15:47:51 crc kubenswrapper[4907]: I0226 15:47:51.955956 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/4c0a8c3f-a6d1-48fc-bd2f-5f1e771f708f-serving-cert\") pod \"route-controller-manager-7f749d5666-wlmx4\" (UID: \"4c0a8c3f-a6d1-48fc-bd2f-5f1e771f708f\") " pod="openshift-route-controller-manager/route-controller-manager-7f749d5666-wlmx4" Feb 26 15:47:52 crc kubenswrapper[4907]: I0226 15:47:52.057009 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/4c0a8c3f-a6d1-48fc-bd2f-5f1e771f708f-client-ca\") pod \"route-controller-manager-7f749d5666-wlmx4\" (UID: \"4c0a8c3f-a6d1-48fc-bd2f-5f1e771f708f\") " pod="openshift-route-controller-manager/route-controller-manager-7f749d5666-wlmx4" Feb 26 15:47:52 crc kubenswrapper[4907]: I0226 15:47:52.057104 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/4c0a8c3f-a6d1-48fc-bd2f-5f1e771f708f-config\") pod \"route-controller-manager-7f749d5666-wlmx4\" 
(UID: \"4c0a8c3f-a6d1-48fc-bd2f-5f1e771f708f\") " pod="openshift-route-controller-manager/route-controller-manager-7f749d5666-wlmx4" Feb 26 15:47:52 crc kubenswrapper[4907]: I0226 15:47:52.057155 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wxngl\" (UniqueName: \"kubernetes.io/projected/4c0a8c3f-a6d1-48fc-bd2f-5f1e771f708f-kube-api-access-wxngl\") pod \"route-controller-manager-7f749d5666-wlmx4\" (UID: \"4c0a8c3f-a6d1-48fc-bd2f-5f1e771f708f\") " pod="openshift-route-controller-manager/route-controller-manager-7f749d5666-wlmx4" Feb 26 15:47:52 crc kubenswrapper[4907]: I0226 15:47:52.057218 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/4c0a8c3f-a6d1-48fc-bd2f-5f1e771f708f-serving-cert\") pod \"route-controller-manager-7f749d5666-wlmx4\" (UID: \"4c0a8c3f-a6d1-48fc-bd2f-5f1e771f708f\") " pod="openshift-route-controller-manager/route-controller-manager-7f749d5666-wlmx4" Feb 26 15:47:52 crc kubenswrapper[4907]: I0226 15:47:52.057996 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/4c0a8c3f-a6d1-48fc-bd2f-5f1e771f708f-client-ca\") pod \"route-controller-manager-7f749d5666-wlmx4\" (UID: \"4c0a8c3f-a6d1-48fc-bd2f-5f1e771f708f\") " pod="openshift-route-controller-manager/route-controller-manager-7f749d5666-wlmx4" Feb 26 15:47:52 crc kubenswrapper[4907]: I0226 15:47:52.058290 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/4c0a8c3f-a6d1-48fc-bd2f-5f1e771f708f-config\") pod \"route-controller-manager-7f749d5666-wlmx4\" (UID: \"4c0a8c3f-a6d1-48fc-bd2f-5f1e771f708f\") " pod="openshift-route-controller-manager/route-controller-manager-7f749d5666-wlmx4" Feb 26 15:47:52 crc kubenswrapper[4907]: I0226 15:47:52.069861 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"serving-cert\" (UniqueName: \"kubernetes.io/secret/4c0a8c3f-a6d1-48fc-bd2f-5f1e771f708f-serving-cert\") pod \"route-controller-manager-7f749d5666-wlmx4\" (UID: \"4c0a8c3f-a6d1-48fc-bd2f-5f1e771f708f\") " pod="openshift-route-controller-manager/route-controller-manager-7f749d5666-wlmx4" Feb 26 15:47:52 crc kubenswrapper[4907]: I0226 15:47:52.080738 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wxngl\" (UniqueName: \"kubernetes.io/projected/4c0a8c3f-a6d1-48fc-bd2f-5f1e771f708f-kube-api-access-wxngl\") pod \"route-controller-manager-7f749d5666-wlmx4\" (UID: \"4c0a8c3f-a6d1-48fc-bd2f-5f1e771f708f\") " pod="openshift-route-controller-manager/route-controller-manager-7f749d5666-wlmx4" Feb 26 15:47:52 crc kubenswrapper[4907]: I0226 15:47:52.135496 4907 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1dcc24b7-fc12-45ae-ae4c-3f5f9579c2b5" path="/var/lib/kubelet/pods/1dcc24b7-fc12-45ae-ae4c-3f5f9579c2b5/volumes" Feb 26 15:47:52 crc kubenswrapper[4907]: I0226 15:47:52.136677 4907 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="48b4caaa-95bc-41de-9716-baf47a347bfa" path="/var/lib/kubelet/pods/48b4caaa-95bc-41de-9716-baf47a347bfa/volumes" Feb 26 15:47:52 crc kubenswrapper[4907]: I0226 15:47:52.137161 4907 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-fcwbm" Feb 26 15:47:52 crc kubenswrapper[4907]: I0226 15:47:52.137193 4907 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-fcwbm" Feb 26 15:47:52 crc kubenswrapper[4907]: I0226 15:47:52.175231 4907 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-fcwbm" Feb 26 15:47:52 crc kubenswrapper[4907]: I0226 15:47:52.192323 4907 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-7f749d5666-wlmx4" Feb 26 15:47:52 crc kubenswrapper[4907]: I0226 15:47:52.514199 4907 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-fcwbm" Feb 26 15:47:52 crc kubenswrapper[4907]: I0226 15:47:52.528286 4907 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-mnd2r" Feb 26 15:47:52 crc kubenswrapper[4907]: I0226 15:47:52.528369 4907 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-mnd2r" Feb 26 15:47:52 crc kubenswrapper[4907]: I0226 15:47:52.588249 4907 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-mnd2r" Feb 26 15:47:52 crc kubenswrapper[4907]: I0226 15:47:52.645642 4907 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-7f749d5666-wlmx4"] Feb 26 15:47:52 crc kubenswrapper[4907]: W0226 15:47:52.653244 4907 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod4c0a8c3f_a6d1_48fc_bd2f_5f1e771f708f.slice/crio-cfb9e9eb1007e7789a8decda98ff2dcc3dd2409595ca544782f20645f071893b WatchSource:0}: Error finding container cfb9e9eb1007e7789a8decda98ff2dcc3dd2409595ca544782f20645f071893b: Status 404 returned error can't find the container with id cfb9e9eb1007e7789a8decda98ff2dcc3dd2409595ca544782f20645f071893b Feb 26 15:47:53 crc kubenswrapper[4907]: I0226 15:47:53.150823 4907 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-68qpc" Feb 26 15:47:53 crc kubenswrapper[4907]: I0226 15:47:53.150992 4907 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-68qpc" Feb 26 15:47:53 crc 
kubenswrapper[4907]: I0226 15:47:53.163203 4907 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-qhfr7"] Feb 26 15:47:53 crc kubenswrapper[4907]: I0226 15:47:53.163532 4907 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-qhfr7" podUID="34138ff4-16e6-4f79-bd8f-0c8cb132ebde" containerName="registry-server" containerID="cri-o://3cadbc6051b9d1b0b5f20f3f0447fbaa03257753484f286df360003c20bd0643" gracePeriod=2 Feb 26 15:47:53 crc kubenswrapper[4907]: I0226 15:47:53.204940 4907 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-68qpc" Feb 26 15:47:53 crc kubenswrapper[4907]: I0226 15:47:53.481987 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-7f749d5666-wlmx4" event={"ID":"4c0a8c3f-a6d1-48fc-bd2f-5f1e771f708f","Type":"ContainerStarted","Data":"cfb9e9eb1007e7789a8decda98ff2dcc3dd2409595ca544782f20645f071893b"} Feb 26 15:47:53 crc kubenswrapper[4907]: I0226 15:47:53.543608 4907 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-mnd2r" Feb 26 15:47:53 crc kubenswrapper[4907]: I0226 15:47:53.543820 4907 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-68qpc" Feb 26 15:47:53 crc kubenswrapper[4907]: I0226 15:47:53.584444 4907 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-jtqzb" Feb 26 15:47:53 crc kubenswrapper[4907]: I0226 15:47:53.650093 4907 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-jtqzb" Feb 26 15:47:54 crc kubenswrapper[4907]: I0226 15:47:54.490119 4907 generic.go:334] "Generic (PLEG): container finished" podID="34138ff4-16e6-4f79-bd8f-0c8cb132ebde" 
containerID="3cadbc6051b9d1b0b5f20f3f0447fbaa03257753484f286df360003c20bd0643" exitCode=0 Feb 26 15:47:54 crc kubenswrapper[4907]: I0226 15:47:54.490186 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-qhfr7" event={"ID":"34138ff4-16e6-4f79-bd8f-0c8cb132ebde","Type":"ContainerDied","Data":"3cadbc6051b9d1b0b5f20f3f0447fbaa03257753484f286df360003c20bd0643"} Feb 26 15:47:54 crc kubenswrapper[4907]: I0226 15:47:54.494141 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-7f749d5666-wlmx4" event={"ID":"4c0a8c3f-a6d1-48fc-bd2f-5f1e771f708f","Type":"ContainerStarted","Data":"1404f09ec94b2f2378b1ead94a7d10200122f94d59736243001a35dffa2a07f2"} Feb 26 15:47:54 crc kubenswrapper[4907]: I0226 15:47:54.494910 4907 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-route-controller-manager/route-controller-manager-7f749d5666-wlmx4" Feb 26 15:47:54 crc kubenswrapper[4907]: I0226 15:47:54.500018 4907 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-route-controller-manager/route-controller-manager-7f749d5666-wlmx4" Feb 26 15:47:54 crc kubenswrapper[4907]: I0226 15:47:54.536534 4907 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-route-controller-manager/route-controller-manager-7f749d5666-wlmx4" podStartSLOduration=6.53650564 podStartE2EDuration="6.53650564s" podCreationTimestamp="2026-02-26 15:47:48 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-26 15:47:54.513664215 +0000 UTC m=+337.032226074" watchObservedRunningTime="2026-02-26 15:47:54.53650564 +0000 UTC m=+337.055067489" Feb 26 15:47:55 crc kubenswrapper[4907]: I0226 15:47:55.023538 4907 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-qhfr7" Feb 26 15:47:55 crc kubenswrapper[4907]: I0226 15:47:55.200111 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/34138ff4-16e6-4f79-bd8f-0c8cb132ebde-catalog-content\") pod \"34138ff4-16e6-4f79-bd8f-0c8cb132ebde\" (UID: \"34138ff4-16e6-4f79-bd8f-0c8cb132ebde\") " Feb 26 15:47:55 crc kubenswrapper[4907]: I0226 15:47:55.200160 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-j4g5s\" (UniqueName: \"kubernetes.io/projected/34138ff4-16e6-4f79-bd8f-0c8cb132ebde-kube-api-access-j4g5s\") pod \"34138ff4-16e6-4f79-bd8f-0c8cb132ebde\" (UID: \"34138ff4-16e6-4f79-bd8f-0c8cb132ebde\") " Feb 26 15:47:55 crc kubenswrapper[4907]: I0226 15:47:55.200212 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/34138ff4-16e6-4f79-bd8f-0c8cb132ebde-utilities\") pod \"34138ff4-16e6-4f79-bd8f-0c8cb132ebde\" (UID: \"34138ff4-16e6-4f79-bd8f-0c8cb132ebde\") " Feb 26 15:47:55 crc kubenswrapper[4907]: I0226 15:47:55.201232 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/34138ff4-16e6-4f79-bd8f-0c8cb132ebde-utilities" (OuterVolumeSpecName: "utilities") pod "34138ff4-16e6-4f79-bd8f-0c8cb132ebde" (UID: "34138ff4-16e6-4f79-bd8f-0c8cb132ebde"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 26 15:47:55 crc kubenswrapper[4907]: I0226 15:47:55.211434 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/34138ff4-16e6-4f79-bd8f-0c8cb132ebde-kube-api-access-j4g5s" (OuterVolumeSpecName: "kube-api-access-j4g5s") pod "34138ff4-16e6-4f79-bd8f-0c8cb132ebde" (UID: "34138ff4-16e6-4f79-bd8f-0c8cb132ebde"). InnerVolumeSpecName "kube-api-access-j4g5s". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 26 15:47:55 crc kubenswrapper[4907]: I0226 15:47:55.252944 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/34138ff4-16e6-4f79-bd8f-0c8cb132ebde-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "34138ff4-16e6-4f79-bd8f-0c8cb132ebde" (UID: "34138ff4-16e6-4f79-bd8f-0c8cb132ebde"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 26 15:47:55 crc kubenswrapper[4907]: I0226 15:47:55.301705 4907 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-j4g5s\" (UniqueName: \"kubernetes.io/projected/34138ff4-16e6-4f79-bd8f-0c8cb132ebde-kube-api-access-j4g5s\") on node \"crc\" DevicePath \"\"" Feb 26 15:47:55 crc kubenswrapper[4907]: I0226 15:47:55.301758 4907 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/34138ff4-16e6-4f79-bd8f-0c8cb132ebde-catalog-content\") on node \"crc\" DevicePath \"\"" Feb 26 15:47:55 crc kubenswrapper[4907]: I0226 15:47:55.301776 4907 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/34138ff4-16e6-4f79-bd8f-0c8cb132ebde-utilities\") on node \"crc\" DevicePath \"\"" Feb 26 15:47:55 crc kubenswrapper[4907]: I0226 15:47:55.502099 4907 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-qhfr7" Feb 26 15:47:55 crc kubenswrapper[4907]: I0226 15:47:55.510374 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-qhfr7" event={"ID":"34138ff4-16e6-4f79-bd8f-0c8cb132ebde","Type":"ContainerDied","Data":"33f5e5572eeaa2f340c830cfb7b0b9c827655147da7e0a111dea508f91d22b9a"} Feb 26 15:47:55 crc kubenswrapper[4907]: I0226 15:47:55.510416 4907 scope.go:117] "RemoveContainer" containerID="3cadbc6051b9d1b0b5f20f3f0447fbaa03257753484f286df360003c20bd0643" Feb 26 15:47:55 crc kubenswrapper[4907]: I0226 15:47:55.534392 4907 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-qhfr7"] Feb 26 15:47:55 crc kubenswrapper[4907]: I0226 15:47:55.541399 4907 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-qhfr7"] Feb 26 15:47:55 crc kubenswrapper[4907]: I0226 15:47:55.543438 4907 scope.go:117] "RemoveContainer" containerID="33edb84189c94eb903164e1a7747c2c77ba27f9767060389f1218b24ead39322" Feb 26 15:47:55 crc kubenswrapper[4907]: I0226 15:47:55.568198 4907 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-mnd2r"] Feb 26 15:47:55 crc kubenswrapper[4907]: I0226 15:47:55.568402 4907 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-mnd2r" podUID="2dc40859-37ff-41ea-88d7-6131b35ceebf" containerName="registry-server" containerID="cri-o://aa76c720fd0f4be368435500c1b661ff90dff2f494a4ae3c46c6ce68cb78e545" gracePeriod=2 Feb 26 15:47:55 crc kubenswrapper[4907]: I0226 15:47:55.568910 4907 scope.go:117] "RemoveContainer" containerID="d1e260cd6583d39c97d36772d017b1b00c00e0a1e1e45aa09bf0edbe96a62d09" Feb 26 15:47:55 crc kubenswrapper[4907]: I0226 15:47:55.953451 4907 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-mnd2r" Feb 26 15:47:56 crc kubenswrapper[4907]: I0226 15:47:56.020885 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/2dc40859-37ff-41ea-88d7-6131b35ceebf-catalog-content\") pod \"2dc40859-37ff-41ea-88d7-6131b35ceebf\" (UID: \"2dc40859-37ff-41ea-88d7-6131b35ceebf\") " Feb 26 15:47:56 crc kubenswrapper[4907]: I0226 15:47:56.020984 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/2dc40859-37ff-41ea-88d7-6131b35ceebf-utilities\") pod \"2dc40859-37ff-41ea-88d7-6131b35ceebf\" (UID: \"2dc40859-37ff-41ea-88d7-6131b35ceebf\") " Feb 26 15:47:56 crc kubenswrapper[4907]: I0226 15:47:56.021014 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-x2p9w\" (UniqueName: \"kubernetes.io/projected/2dc40859-37ff-41ea-88d7-6131b35ceebf-kube-api-access-x2p9w\") pod \"2dc40859-37ff-41ea-88d7-6131b35ceebf\" (UID: \"2dc40859-37ff-41ea-88d7-6131b35ceebf\") " Feb 26 15:47:56 crc kubenswrapper[4907]: I0226 15:47:56.024292 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/2dc40859-37ff-41ea-88d7-6131b35ceebf-utilities" (OuterVolumeSpecName: "utilities") pod "2dc40859-37ff-41ea-88d7-6131b35ceebf" (UID: "2dc40859-37ff-41ea-88d7-6131b35ceebf"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 26 15:47:56 crc kubenswrapper[4907]: I0226 15:47:56.025993 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2dc40859-37ff-41ea-88d7-6131b35ceebf-kube-api-access-x2p9w" (OuterVolumeSpecName: "kube-api-access-x2p9w") pod "2dc40859-37ff-41ea-88d7-6131b35ceebf" (UID: "2dc40859-37ff-41ea-88d7-6131b35ceebf"). InnerVolumeSpecName "kube-api-access-x2p9w". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 26 15:47:56 crc kubenswrapper[4907]: I0226 15:47:56.048415 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/2dc40859-37ff-41ea-88d7-6131b35ceebf-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "2dc40859-37ff-41ea-88d7-6131b35ceebf" (UID: "2dc40859-37ff-41ea-88d7-6131b35ceebf"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 26 15:47:56 crc kubenswrapper[4907]: I0226 15:47:56.122702 4907 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/2dc40859-37ff-41ea-88d7-6131b35ceebf-catalog-content\") on node \"crc\" DevicePath \"\"" Feb 26 15:47:56 crc kubenswrapper[4907]: I0226 15:47:56.122742 4907 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/2dc40859-37ff-41ea-88d7-6131b35ceebf-utilities\") on node \"crc\" DevicePath \"\"" Feb 26 15:47:56 crc kubenswrapper[4907]: I0226 15:47:56.122753 4907 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-x2p9w\" (UniqueName: \"kubernetes.io/projected/2dc40859-37ff-41ea-88d7-6131b35ceebf-kube-api-access-x2p9w\") on node \"crc\" DevicePath \"\"" Feb 26 15:47:56 crc kubenswrapper[4907]: I0226 15:47:56.136302 4907 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="34138ff4-16e6-4f79-bd8f-0c8cb132ebde" path="/var/lib/kubelet/pods/34138ff4-16e6-4f79-bd8f-0c8cb132ebde/volumes" Feb 26 15:47:56 crc kubenswrapper[4907]: I0226 15:47:56.514200 4907 generic.go:334] "Generic (PLEG): container finished" podID="2dc40859-37ff-41ea-88d7-6131b35ceebf" containerID="aa76c720fd0f4be368435500c1b661ff90dff2f494a4ae3c46c6ce68cb78e545" exitCode=0 Feb 26 15:47:56 crc kubenswrapper[4907]: I0226 15:47:56.514260 4907 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-mnd2r" Feb 26 15:47:56 crc kubenswrapper[4907]: I0226 15:47:56.514378 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-mnd2r" event={"ID":"2dc40859-37ff-41ea-88d7-6131b35ceebf","Type":"ContainerDied","Data":"aa76c720fd0f4be368435500c1b661ff90dff2f494a4ae3c46c6ce68cb78e545"} Feb 26 15:47:56 crc kubenswrapper[4907]: I0226 15:47:56.514429 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-mnd2r" event={"ID":"2dc40859-37ff-41ea-88d7-6131b35ceebf","Type":"ContainerDied","Data":"d213f4457a2579dec5c7b3920056759cd35ed19b2e4b7c425c5b36500b1d2818"} Feb 26 15:47:56 crc kubenswrapper[4907]: I0226 15:47:56.514460 4907 scope.go:117] "RemoveContainer" containerID="aa76c720fd0f4be368435500c1b661ff90dff2f494a4ae3c46c6ce68cb78e545" Feb 26 15:47:56 crc kubenswrapper[4907]: I0226 15:47:56.534537 4907 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-mnd2r"] Feb 26 15:47:56 crc kubenswrapper[4907]: I0226 15:47:56.539173 4907 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-mnd2r"] Feb 26 15:47:56 crc kubenswrapper[4907]: I0226 15:47:56.543744 4907 scope.go:117] "RemoveContainer" containerID="9698a1b1483790195ac54a02e8c7c6bfd639dbf1de7a3de5ca74dbf607b58c9d" Feb 26 15:47:56 crc kubenswrapper[4907]: I0226 15:47:56.561780 4907 scope.go:117] "RemoveContainer" containerID="b8c1899a4564f83f7588b8b70cb4035a9fd21a37c516350a3053e71511e8e3dd" Feb 26 15:47:56 crc kubenswrapper[4907]: I0226 15:47:56.585223 4907 scope.go:117] "RemoveContainer" containerID="aa76c720fd0f4be368435500c1b661ff90dff2f494a4ae3c46c6ce68cb78e545" Feb 26 15:47:56 crc kubenswrapper[4907]: E0226 15:47:56.585976 4907 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container 
\"aa76c720fd0f4be368435500c1b661ff90dff2f494a4ae3c46c6ce68cb78e545\": container with ID starting with aa76c720fd0f4be368435500c1b661ff90dff2f494a4ae3c46c6ce68cb78e545 not found: ID does not exist" containerID="aa76c720fd0f4be368435500c1b661ff90dff2f494a4ae3c46c6ce68cb78e545" Feb 26 15:47:56 crc kubenswrapper[4907]: I0226 15:47:56.586009 4907 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"aa76c720fd0f4be368435500c1b661ff90dff2f494a4ae3c46c6ce68cb78e545"} err="failed to get container status \"aa76c720fd0f4be368435500c1b661ff90dff2f494a4ae3c46c6ce68cb78e545\": rpc error: code = NotFound desc = could not find container \"aa76c720fd0f4be368435500c1b661ff90dff2f494a4ae3c46c6ce68cb78e545\": container with ID starting with aa76c720fd0f4be368435500c1b661ff90dff2f494a4ae3c46c6ce68cb78e545 not found: ID does not exist" Feb 26 15:47:56 crc kubenswrapper[4907]: I0226 15:47:56.586032 4907 scope.go:117] "RemoveContainer" containerID="9698a1b1483790195ac54a02e8c7c6bfd639dbf1de7a3de5ca74dbf607b58c9d" Feb 26 15:47:56 crc kubenswrapper[4907]: E0226 15:47:56.586409 4907 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"9698a1b1483790195ac54a02e8c7c6bfd639dbf1de7a3de5ca74dbf607b58c9d\": container with ID starting with 9698a1b1483790195ac54a02e8c7c6bfd639dbf1de7a3de5ca74dbf607b58c9d not found: ID does not exist" containerID="9698a1b1483790195ac54a02e8c7c6bfd639dbf1de7a3de5ca74dbf607b58c9d" Feb 26 15:47:56 crc kubenswrapper[4907]: I0226 15:47:56.586467 4907 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9698a1b1483790195ac54a02e8c7c6bfd639dbf1de7a3de5ca74dbf607b58c9d"} err="failed to get container status \"9698a1b1483790195ac54a02e8c7c6bfd639dbf1de7a3de5ca74dbf607b58c9d\": rpc error: code = NotFound desc = could not find container \"9698a1b1483790195ac54a02e8c7c6bfd639dbf1de7a3de5ca74dbf607b58c9d\": container with ID 
starting with 9698a1b1483790195ac54a02e8c7c6bfd639dbf1de7a3de5ca74dbf607b58c9d not found: ID does not exist" Feb 26 15:47:56 crc kubenswrapper[4907]: I0226 15:47:56.586494 4907 scope.go:117] "RemoveContainer" containerID="b8c1899a4564f83f7588b8b70cb4035a9fd21a37c516350a3053e71511e8e3dd" Feb 26 15:47:56 crc kubenswrapper[4907]: E0226 15:47:56.586906 4907 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b8c1899a4564f83f7588b8b70cb4035a9fd21a37c516350a3053e71511e8e3dd\": container with ID starting with b8c1899a4564f83f7588b8b70cb4035a9fd21a37c516350a3053e71511e8e3dd not found: ID does not exist" containerID="b8c1899a4564f83f7588b8b70cb4035a9fd21a37c516350a3053e71511e8e3dd" Feb 26 15:47:56 crc kubenswrapper[4907]: I0226 15:47:56.586939 4907 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b8c1899a4564f83f7588b8b70cb4035a9fd21a37c516350a3053e71511e8e3dd"} err="failed to get container status \"b8c1899a4564f83f7588b8b70cb4035a9fd21a37c516350a3053e71511e8e3dd\": rpc error: code = NotFound desc = could not find container \"b8c1899a4564f83f7588b8b70cb4035a9fd21a37c516350a3053e71511e8e3dd\": container with ID starting with b8c1899a4564f83f7588b8b70cb4035a9fd21a37c516350a3053e71511e8e3dd not found: ID does not exist" Feb 26 15:47:57 crc kubenswrapper[4907]: I0226 15:47:57.760794 4907 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-jtqzb"] Feb 26 15:47:57 crc kubenswrapper[4907]: I0226 15:47:57.761939 4907 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-jtqzb" podUID="8eefa350-bfa6-48dc-9577-692787482b0d" containerName="registry-server" containerID="cri-o://039fc180708e826a32d5204f4264759ec492eb362dbd9933487c69228ef5f58a" gracePeriod=2 Feb 26 15:47:58 crc kubenswrapper[4907]: I0226 15:47:58.135230 4907 kubelet_volumes.go:163] "Cleaned up orphaned 
pod volumes dir" podUID="2dc40859-37ff-41ea-88d7-6131b35ceebf" path="/var/lib/kubelet/pods/2dc40859-37ff-41ea-88d7-6131b35ceebf/volumes" Feb 26 15:47:58 crc kubenswrapper[4907]: I0226 15:47:58.545368 4907 generic.go:334] "Generic (PLEG): container finished" podID="8eefa350-bfa6-48dc-9577-692787482b0d" containerID="039fc180708e826a32d5204f4264759ec492eb362dbd9933487c69228ef5f58a" exitCode=0 Feb 26 15:47:58 crc kubenswrapper[4907]: I0226 15:47:58.545409 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-jtqzb" event={"ID":"8eefa350-bfa6-48dc-9577-692787482b0d","Type":"ContainerDied","Data":"039fc180708e826a32d5204f4264759ec492eb362dbd9933487c69228ef5f58a"} Feb 26 15:47:58 crc kubenswrapper[4907]: I0226 15:47:58.846075 4907 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-jtqzb" Feb 26 15:47:58 crc kubenswrapper[4907]: I0226 15:47:58.860711 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/8eefa350-bfa6-48dc-9577-692787482b0d-catalog-content\") pod \"8eefa350-bfa6-48dc-9577-692787482b0d\" (UID: \"8eefa350-bfa6-48dc-9577-692787482b0d\") " Feb 26 15:47:58 crc kubenswrapper[4907]: I0226 15:47:58.860765 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/8eefa350-bfa6-48dc-9577-692787482b0d-utilities\") pod \"8eefa350-bfa6-48dc-9577-692787482b0d\" (UID: \"8eefa350-bfa6-48dc-9577-692787482b0d\") " Feb 26 15:47:58 crc kubenswrapper[4907]: I0226 15:47:58.860798 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9nq7k\" (UniqueName: \"kubernetes.io/projected/8eefa350-bfa6-48dc-9577-692787482b0d-kube-api-access-9nq7k\") pod \"8eefa350-bfa6-48dc-9577-692787482b0d\" (UID: \"8eefa350-bfa6-48dc-9577-692787482b0d\") " Feb 26 15:47:58 crc 
kubenswrapper[4907]: I0226 15:47:58.863547 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/8eefa350-bfa6-48dc-9577-692787482b0d-utilities" (OuterVolumeSpecName: "utilities") pod "8eefa350-bfa6-48dc-9577-692787482b0d" (UID: "8eefa350-bfa6-48dc-9577-692787482b0d"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 26 15:47:58 crc kubenswrapper[4907]: I0226 15:47:58.889306 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8eefa350-bfa6-48dc-9577-692787482b0d-kube-api-access-9nq7k" (OuterVolumeSpecName: "kube-api-access-9nq7k") pod "8eefa350-bfa6-48dc-9577-692787482b0d" (UID: "8eefa350-bfa6-48dc-9577-692787482b0d"). InnerVolumeSpecName "kube-api-access-9nq7k". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 26 15:47:58 crc kubenswrapper[4907]: I0226 15:47:58.966458 4907 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/8eefa350-bfa6-48dc-9577-692787482b0d-utilities\") on node \"crc\" DevicePath \"\"" Feb 26 15:47:58 crc kubenswrapper[4907]: I0226 15:47:58.966497 4907 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9nq7k\" (UniqueName: \"kubernetes.io/projected/8eefa350-bfa6-48dc-9577-692787482b0d-kube-api-access-9nq7k\") on node \"crc\" DevicePath \"\"" Feb 26 15:47:59 crc kubenswrapper[4907]: I0226 15:47:59.041344 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/8eefa350-bfa6-48dc-9577-692787482b0d-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "8eefa350-bfa6-48dc-9577-692787482b0d" (UID: "8eefa350-bfa6-48dc-9577-692787482b0d"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 26 15:47:59 crc kubenswrapper[4907]: I0226 15:47:59.067468 4907 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/8eefa350-bfa6-48dc-9577-692787482b0d-catalog-content\") on node \"crc\" DevicePath \"\"" Feb 26 15:47:59 crc kubenswrapper[4907]: I0226 15:47:59.559884 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-jtqzb" event={"ID":"8eefa350-bfa6-48dc-9577-692787482b0d","Type":"ContainerDied","Data":"5bf8fbdce5b911b359573c2625cc1312b81286ca021c7699751bb6b103ed776d"} Feb 26 15:47:59 crc kubenswrapper[4907]: I0226 15:47:59.559961 4907 scope.go:117] "RemoveContainer" containerID="039fc180708e826a32d5204f4264759ec492eb362dbd9933487c69228ef5f58a" Feb 26 15:47:59 crc kubenswrapper[4907]: I0226 15:47:59.559979 4907 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-jtqzb" Feb 26 15:47:59 crc kubenswrapper[4907]: I0226 15:47:59.586990 4907 scope.go:117] "RemoveContainer" containerID="11c032adc51a3bf30ce404d20de56bd3c614dde407a0177627e4b9ede529c291" Feb 26 15:47:59 crc kubenswrapper[4907]: I0226 15:47:59.625986 4907 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-jtqzb"] Feb 26 15:47:59 crc kubenswrapper[4907]: I0226 15:47:59.627553 4907 scope.go:117] "RemoveContainer" containerID="7cc4bb476a32190ea78b129c131915eacb611b4a303edb9ebc391b79b07fe2a9" Feb 26 15:47:59 crc kubenswrapper[4907]: I0226 15:47:59.631603 4907 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-jtqzb"] Feb 26 15:48:00 crc kubenswrapper[4907]: I0226 15:48:00.139467 4907 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8eefa350-bfa6-48dc-9577-692787482b0d" path="/var/lib/kubelet/pods/8eefa350-bfa6-48dc-9577-692787482b0d/volumes" Feb 26 15:48:00 crc 
kubenswrapper[4907]: I0226 15:48:00.141121 4907 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29535348-8k2tp"] Feb 26 15:48:00 crc kubenswrapper[4907]: E0226 15:48:00.141427 4907 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="34138ff4-16e6-4f79-bd8f-0c8cb132ebde" containerName="extract-content" Feb 26 15:48:00 crc kubenswrapper[4907]: I0226 15:48:00.141459 4907 state_mem.go:107] "Deleted CPUSet assignment" podUID="34138ff4-16e6-4f79-bd8f-0c8cb132ebde" containerName="extract-content" Feb 26 15:48:00 crc kubenswrapper[4907]: E0226 15:48:00.141484 4907 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="34138ff4-16e6-4f79-bd8f-0c8cb132ebde" containerName="extract-utilities" Feb 26 15:48:00 crc kubenswrapper[4907]: I0226 15:48:00.141500 4907 state_mem.go:107] "Deleted CPUSet assignment" podUID="34138ff4-16e6-4f79-bd8f-0c8cb132ebde" containerName="extract-utilities" Feb 26 15:48:00 crc kubenswrapper[4907]: E0226 15:48:00.141543 4907 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8eefa350-bfa6-48dc-9577-692787482b0d" containerName="extract-content" Feb 26 15:48:00 crc kubenswrapper[4907]: I0226 15:48:00.141556 4907 state_mem.go:107] "Deleted CPUSet assignment" podUID="8eefa350-bfa6-48dc-9577-692787482b0d" containerName="extract-content" Feb 26 15:48:00 crc kubenswrapper[4907]: E0226 15:48:00.141615 4907 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8eefa350-bfa6-48dc-9577-692787482b0d" containerName="extract-utilities" Feb 26 15:48:00 crc kubenswrapper[4907]: I0226 15:48:00.141627 4907 state_mem.go:107] "Deleted CPUSet assignment" podUID="8eefa350-bfa6-48dc-9577-692787482b0d" containerName="extract-utilities" Feb 26 15:48:00 crc kubenswrapper[4907]: E0226 15:48:00.141658 4907 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8eefa350-bfa6-48dc-9577-692787482b0d" containerName="registry-server" Feb 26 15:48:00 crc kubenswrapper[4907]: I0226 
15:48:00.141670 4907 state_mem.go:107] "Deleted CPUSet assignment" podUID="8eefa350-bfa6-48dc-9577-692787482b0d" containerName="registry-server" Feb 26 15:48:00 crc kubenswrapper[4907]: E0226 15:48:00.141689 4907 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2dc40859-37ff-41ea-88d7-6131b35ceebf" containerName="extract-utilities" Feb 26 15:48:00 crc kubenswrapper[4907]: I0226 15:48:00.141701 4907 state_mem.go:107] "Deleted CPUSet assignment" podUID="2dc40859-37ff-41ea-88d7-6131b35ceebf" containerName="extract-utilities" Feb 26 15:48:00 crc kubenswrapper[4907]: E0226 15:48:00.141716 4907 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2dc40859-37ff-41ea-88d7-6131b35ceebf" containerName="registry-server" Feb 26 15:48:00 crc kubenswrapper[4907]: I0226 15:48:00.141728 4907 state_mem.go:107] "Deleted CPUSet assignment" podUID="2dc40859-37ff-41ea-88d7-6131b35ceebf" containerName="registry-server" Feb 26 15:48:00 crc kubenswrapper[4907]: E0226 15:48:00.141748 4907 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2dc40859-37ff-41ea-88d7-6131b35ceebf" containerName="extract-content" Feb 26 15:48:00 crc kubenswrapper[4907]: I0226 15:48:00.141760 4907 state_mem.go:107] "Deleted CPUSet assignment" podUID="2dc40859-37ff-41ea-88d7-6131b35ceebf" containerName="extract-content" Feb 26 15:48:00 crc kubenswrapper[4907]: E0226 15:48:00.141778 4907 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="34138ff4-16e6-4f79-bd8f-0c8cb132ebde" containerName="registry-server" Feb 26 15:48:00 crc kubenswrapper[4907]: I0226 15:48:00.141789 4907 state_mem.go:107] "Deleted CPUSet assignment" podUID="34138ff4-16e6-4f79-bd8f-0c8cb132ebde" containerName="registry-server" Feb 26 15:48:00 crc kubenswrapper[4907]: I0226 15:48:00.144540 4907 memory_manager.go:354] "RemoveStaleState removing state" podUID="8eefa350-bfa6-48dc-9577-692787482b0d" containerName="registry-server" Feb 26 15:48:00 crc kubenswrapper[4907]: I0226 
15:48:00.144658 4907 memory_manager.go:354] "RemoveStaleState removing state" podUID="34138ff4-16e6-4f79-bd8f-0c8cb132ebde" containerName="registry-server" Feb 26 15:48:00 crc kubenswrapper[4907]: I0226 15:48:00.144695 4907 memory_manager.go:354] "RemoveStaleState removing state" podUID="2dc40859-37ff-41ea-88d7-6131b35ceebf" containerName="registry-server" Feb 26 15:48:00 crc kubenswrapper[4907]: I0226 15:48:00.145498 4907 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29535348-8k2tp" Feb 26 15:48:00 crc kubenswrapper[4907]: I0226 15:48:00.148142 4907 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-n2mrp" Feb 26 15:48:00 crc kubenswrapper[4907]: I0226 15:48:00.148461 4907 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Feb 26 15:48:00 crc kubenswrapper[4907]: I0226 15:48:00.148502 4907 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Feb 26 15:48:00 crc kubenswrapper[4907]: I0226 15:48:00.149295 4907 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29535348-8k2tp"] Feb 26 15:48:00 crc kubenswrapper[4907]: I0226 15:48:00.283476 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-46jsm\" (UniqueName: \"kubernetes.io/projected/6e761f1c-0a31-49e0-aee3-2ecd184291dc-kube-api-access-46jsm\") pod \"auto-csr-approver-29535348-8k2tp\" (UID: \"6e761f1c-0a31-49e0-aee3-2ecd184291dc\") " pod="openshift-infra/auto-csr-approver-29535348-8k2tp" Feb 26 15:48:00 crc kubenswrapper[4907]: I0226 15:48:00.304158 4907 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-authentication/oauth-openshift-558db77b4-lr7kc"] Feb 26 15:48:00 crc kubenswrapper[4907]: I0226 15:48:00.385318 4907 reconciler_common.go:218] "operationExecutor.MountVolume 
started for volume \"kube-api-access-46jsm\" (UniqueName: \"kubernetes.io/projected/6e761f1c-0a31-49e0-aee3-2ecd184291dc-kube-api-access-46jsm\") pod \"auto-csr-approver-29535348-8k2tp\" (UID: \"6e761f1c-0a31-49e0-aee3-2ecd184291dc\") " pod="openshift-infra/auto-csr-approver-29535348-8k2tp" Feb 26 15:48:00 crc kubenswrapper[4907]: I0226 15:48:00.417183 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-46jsm\" (UniqueName: \"kubernetes.io/projected/6e761f1c-0a31-49e0-aee3-2ecd184291dc-kube-api-access-46jsm\") pod \"auto-csr-approver-29535348-8k2tp\" (UID: \"6e761f1c-0a31-49e0-aee3-2ecd184291dc\") " pod="openshift-infra/auto-csr-approver-29535348-8k2tp" Feb 26 15:48:00 crc kubenswrapper[4907]: I0226 15:48:00.472974 4907 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29535348-8k2tp" Feb 26 15:48:00 crc kubenswrapper[4907]: I0226 15:48:00.610727 4907 kubelet.go:2421] "SyncLoop ADD" source="file" pods=["openshift-kube-apiserver/kube-apiserver-startup-monitor-crc"] Feb 26 15:48:00 crc kubenswrapper[4907]: I0226 15:48:00.611735 4907 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Feb 26 15:48:00 crc kubenswrapper[4907]: E0226 15:48:00.624248 4907 file.go:109] "Unable to process watch event" err="can't process config file \"/etc/kubernetes/manifests/kube-apiserver-pod.yaml\": /etc/kubernetes/manifests/kube-apiserver-pod.yaml: couldn't parse as pod(Object 'Kind' is missing in 'null'), please check config file" Feb 26 15:48:00 crc kubenswrapper[4907]: I0226 15:48:00.624251 4907 kubelet.go:2431] "SyncLoop REMOVE" source="file" pods=["openshift-kube-apiserver/kube-apiserver-crc"] Feb 26 15:48:00 crc kubenswrapper[4907]: I0226 15:48:00.624598 4907 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver" containerID="cri-o://bbc5e8c015ccc6b1a4740c955375e4f995f69ff1f1f698d8e2660ef451da6b8c" gracePeriod=15 Feb 26 15:48:00 crc kubenswrapper[4907]: I0226 15:48:00.624611 4907 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" containerID="cri-o://3cfea1638c0926e3aba947161db48db309efe614e7b082a3896c2c6cfc93ffb7" gracePeriod=15 Feb 26 15:48:00 crc kubenswrapper[4907]: I0226 15:48:00.624632 4907 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-insecure-readyz" containerID="cri-o://8cf7bf0e49be4282c641d1e48be50a327bb418475701bfde61f4249724709e11" gracePeriod=15 Feb 26 15:48:00 crc kubenswrapper[4907]: I0226 15:48:00.624615 4907 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" 
containerName="kube-apiserver-cert-regeneration-controller" containerID="cri-o://64e8ac34f3cae799ba04d2bba51c22e4d99cf03261778fe3ba7a2320e661e727" gracePeriod=15 Feb 26 15:48:00 crc kubenswrapper[4907]: I0226 15:48:00.624713 4907 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-cert-syncer" containerID="cri-o://42e24dea757f775f836c5c1fdb77c920db85f523bc0a35d2f2fb22e766274556" gracePeriod=15 Feb 26 15:48:00 crc kubenswrapper[4907]: I0226 15:48:00.626550 4907 kubelet.go:2421] "SyncLoop ADD" source="file" pods=["openshift-kube-apiserver/kube-apiserver-crc"] Feb 26 15:48:00 crc kubenswrapper[4907]: E0226 15:48:00.626820 4907 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-cert-syncer" Feb 26 15:48:00 crc kubenswrapper[4907]: I0226 15:48:00.626834 4907 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-cert-syncer" Feb 26 15:48:00 crc kubenswrapper[4907]: E0226 15:48:00.626846 4907 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Feb 26 15:48:00 crc kubenswrapper[4907]: I0226 15:48:00.626855 4907 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Feb 26 15:48:00 crc kubenswrapper[4907]: E0226 15:48:00.626865 4907 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-insecure-readyz" Feb 26 15:48:00 crc kubenswrapper[4907]: I0226 15:48:00.626873 4907 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-insecure-readyz" Feb 26 15:48:00 crc kubenswrapper[4907]: E0226 
15:48:00.626881 4907 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Feb 26 15:48:00 crc kubenswrapper[4907]: I0226 15:48:00.626890 4907 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Feb 26 15:48:00 crc kubenswrapper[4907]: E0226 15:48:00.626899 4907 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Feb 26 15:48:00 crc kubenswrapper[4907]: I0226 15:48:00.626906 4907 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Feb 26 15:48:00 crc kubenswrapper[4907]: E0226 15:48:00.626916 4907 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="setup" Feb 26 15:48:00 crc kubenswrapper[4907]: I0226 15:48:00.626923 4907 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="setup" Feb 26 15:48:00 crc kubenswrapper[4907]: E0226 15:48:00.626934 4907 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Feb 26 15:48:00 crc kubenswrapper[4907]: I0226 15:48:00.626942 4907 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Feb 26 15:48:00 crc kubenswrapper[4907]: E0226 15:48:00.626958 4907 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-cert-regeneration-controller" Feb 26 15:48:00 crc kubenswrapper[4907]: I0226 15:48:00.626966 4907 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" 
containerName="kube-apiserver-cert-regeneration-controller" Feb 26 15:48:00 crc kubenswrapper[4907]: E0226 15:48:00.626976 4907 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver" Feb 26 15:48:00 crc kubenswrapper[4907]: I0226 15:48:00.626983 4907 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver" Feb 26 15:48:00 crc kubenswrapper[4907]: E0226 15:48:00.626992 4907 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Feb 26 15:48:00 crc kubenswrapper[4907]: I0226 15:48:00.626999 4907 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Feb 26 15:48:00 crc kubenswrapper[4907]: I0226 15:48:00.627108 4907 memory_manager.go:354] "RemoveStaleState removing state" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver" Feb 26 15:48:00 crc kubenswrapper[4907]: I0226 15:48:00.627123 4907 memory_manager.go:354] "RemoveStaleState removing state" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Feb 26 15:48:00 crc kubenswrapper[4907]: I0226 15:48:00.627132 4907 memory_manager.go:354] "RemoveStaleState removing state" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Feb 26 15:48:00 crc kubenswrapper[4907]: I0226 15:48:00.627140 4907 memory_manager.go:354] "RemoveStaleState removing state" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-cert-regeneration-controller" Feb 26 15:48:00 crc kubenswrapper[4907]: I0226 15:48:00.627149 4907 memory_manager.go:354] "RemoveStaleState removing state" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Feb 26 15:48:00 crc kubenswrapper[4907]: I0226 15:48:00.627159 
4907 memory_manager.go:354] "RemoveStaleState removing state" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Feb 26 15:48:00 crc kubenswrapper[4907]: I0226 15:48:00.627169 4907 memory_manager.go:354] "RemoveStaleState removing state" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-cert-syncer" Feb 26 15:48:00 crc kubenswrapper[4907]: I0226 15:48:00.627176 4907 memory_manager.go:354] "RemoveStaleState removing state" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Feb 26 15:48:00 crc kubenswrapper[4907]: I0226 15:48:00.627187 4907 memory_manager.go:354] "RemoveStaleState removing state" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-insecure-readyz" Feb 26 15:48:00 crc kubenswrapper[4907]: E0226 15:48:00.627294 4907 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Feb 26 15:48:00 crc kubenswrapper[4907]: I0226 15:48:00.627303 4907 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Feb 26 15:48:00 crc kubenswrapper[4907]: I0226 15:48:00.627439 4907 memory_manager.go:354] "RemoveStaleState removing state" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Feb 26 15:48:00 crc kubenswrapper[4907]: I0226 15:48:00.630084 4907 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-2v8kx" Feb 26 15:48:00 crc kubenswrapper[4907]: I0226 15:48:00.683978 4907 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-apiserver/kube-apiserver-startup-monitor-crc"] Feb 26 15:48:00 crc kubenswrapper[4907]: I0226 15:48:00.803993 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"resource-dir\" (UniqueName: 
\"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-resource-dir\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Feb 26 15:48:00 crc kubenswrapper[4907]: I0226 15:48:00.804070 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-log\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Feb 26 15:48:00 crc kubenswrapper[4907]: I0226 15:48:00.804091 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"manifests\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-manifests\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Feb 26 15:48:00 crc kubenswrapper[4907]: I0226 15:48:00.804110 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-resource-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Feb 26 15:48:00 crc kubenswrapper[4907]: I0226 15:48:00.804126 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pod-resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-pod-resource-dir\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Feb 26 15:48:00 crc kubenswrapper[4907]: I0226 15:48:00.804150 4907 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-lock\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Feb 26 15:48:00 crc kubenswrapper[4907]: I0226 15:48:00.804214 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-audit-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Feb 26 15:48:00 crc kubenswrapper[4907]: I0226 15:48:00.804235 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-cert-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Feb 26 15:48:00 crc kubenswrapper[4907]: I0226 15:48:00.905843 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"manifests\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-manifests\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Feb 26 15:48:00 crc kubenswrapper[4907]: I0226 15:48:00.906058 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"manifests\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-manifests\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Feb 26 15:48:00 crc kubenswrapper[4907]: I0226 15:48:00.906538 4907 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-resource-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Feb 26 15:48:00 crc kubenswrapper[4907]: I0226 15:48:00.906859 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pod-resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-pod-resource-dir\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Feb 26 15:48:00 crc kubenswrapper[4907]: I0226 15:48:00.906679 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-resource-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Feb 26 15:48:00 crc kubenswrapper[4907]: I0226 15:48:00.906931 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-lock\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Feb 26 15:48:00 crc kubenswrapper[4907]: I0226 15:48:00.907004 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-audit-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Feb 26 15:48:00 crc kubenswrapper[4907]: I0226 15:48:00.907013 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pod-resource-dir\" (UniqueName: 
\"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-pod-resource-dir\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Feb 26 15:48:00 crc kubenswrapper[4907]: I0226 15:48:00.907045 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-lock\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Feb 26 15:48:00 crc kubenswrapper[4907]: I0226 15:48:00.907054 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-cert-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Feb 26 15:48:00 crc kubenswrapper[4907]: I0226 15:48:00.907083 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-cert-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Feb 26 15:48:00 crc kubenswrapper[4907]: I0226 15:48:00.907099 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-audit-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Feb 26 15:48:00 crc kubenswrapper[4907]: I0226 15:48:00.907218 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-resource-dir\") pod \"kube-apiserver-startup-monitor-crc\" (UID: 
\"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Feb 26 15:48:00 crc kubenswrapper[4907]: I0226 15:48:00.907270 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-resource-dir\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Feb 26 15:48:00 crc kubenswrapper[4907]: I0226 15:48:00.907314 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-log\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Feb 26 15:48:00 crc kubenswrapper[4907]: I0226 15:48:00.907376 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-log\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Feb 26 15:48:00 crc kubenswrapper[4907]: I0226 15:48:00.974808 4907 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Feb 26 15:48:01 crc kubenswrapper[4907]: W0226 15:48:01.030574 4907 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf85e55b1a89d02b0cb034b1ea31ed45a.slice/crio-2e1d23ed512934c7f1b2a61abf70a987b2f6d092d482c52a0eadb2b8b80fd6e7 WatchSource:0}: Error finding container 2e1d23ed512934c7f1b2a61abf70a987b2f6d092d482c52a0eadb2b8b80fd6e7: Status 404 returned error can't find the container with id 2e1d23ed512934c7f1b2a61abf70a987b2f6d092d482c52a0eadb2b8b80fd6e7 Feb 26 15:48:01 crc kubenswrapper[4907]: E0226 15:48:01.035762 4907 event.go:368] "Unable to write event (may retry after sleeping)" err="Post \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/events\": dial tcp 38.102.83.210:6443: connect: connection refused" event="&Event{ObjectMeta:{kube-apiserver-startup-monitor-crc.1897d68d5d21c5eb openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-startup-monitor-crc,UID:f85e55b1a89d02b0cb034b1ea31ed45a,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{startup-monitor},},Reason:Pulled,Message:Container image \"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-02-26 15:48:01.034806763 +0000 UTC m=+343.553368652,LastTimestamp:2026-02-26 15:48:01.034806763 +0000 UTC m=+343.553368652,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Feb 26 15:48:01 crc kubenswrapper[4907]: E0226 15:48:01.409028 4907 log.go:32] "RunPodSandbox from runtime service failed" err=< Feb 26 15:48:01 crc 
kubenswrapper[4907]: rpc error: code = Unknown desc = failed to create pod network sandbox k8s_auto-csr-approver-29535348-8k2tp_openshift-infra_6e761f1c-0a31-49e0-aee3-2ecd184291dc_0(1d326dc83ade5967bbfea250745032d9384e03fdd023d515f87615a1eacd73d7): error adding pod openshift-infra_auto-csr-approver-29535348-8k2tp to CNI network "multus-cni-network": plugin type="multus-shim" name="multus-cni-network" failed (add): CmdAdd (shim): CNI request failed with status 400: 'ContainerID:"1d326dc83ade5967bbfea250745032d9384e03fdd023d515f87615a1eacd73d7" Netns:"/var/run/netns/c61e9967-016b-44d2-8b8d-f2ec3ed64bce" IfName:"eth0" Args:"IgnoreUnknown=1;K8S_POD_NAMESPACE=openshift-infra;K8S_POD_NAME=auto-csr-approver-29535348-8k2tp;K8S_POD_INFRA_CONTAINER_ID=1d326dc83ade5967bbfea250745032d9384e03fdd023d515f87615a1eacd73d7;K8S_POD_UID=6e761f1c-0a31-49e0-aee3-2ecd184291dc" Path:"" ERRORED: error configuring pod [openshift-infra/auto-csr-approver-29535348-8k2tp] networking: Multus: [openshift-infra/auto-csr-approver-29535348-8k2tp/6e761f1c-0a31-49e0-aee3-2ecd184291dc]: error setting the networks status: SetPodNetworkStatusAnnotation: failed to update the pod auto-csr-approver-29535348-8k2tp in out of cluster comm: SetNetworkStatus: failed to update the pod auto-csr-approver-29535348-8k2tp in out of cluster comm: status update failed for pod /: Get "https://api-int.crc.testing:6443/api/v1/namespaces/openshift-infra/pods/auto-csr-approver-29535348-8k2tp?timeout=1m0s": dial tcp 38.102.83.210:6443: connect: connection refused Feb 26 15:48:01 crc kubenswrapper[4907]: ': StdinData: {"binDir":"/var/lib/cni/bin","clusterNetwork":"/host/run/multus/cni/net.d/10-ovn-kubernetes.conf","cniVersion":"0.3.1","daemonSocketDir":"/run/multus/socket","globalNamespaces":"default,openshift-multus,openshift-sriov-network-operator,openshift-cnv","logLevel":"verbose","logToStderr":true,"name":"multus-cni-network","namespaceIsolation":true,"type":"multus-shim"} Feb 26 15:48:01 crc kubenswrapper[4907]: > Feb 
26 15:48:01 crc kubenswrapper[4907]: E0226 15:48:01.409115 4907 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err=< Feb 26 15:48:01 crc kubenswrapper[4907]: rpc error: code = Unknown desc = failed to create pod network sandbox k8s_auto-csr-approver-29535348-8k2tp_openshift-infra_6e761f1c-0a31-49e0-aee3-2ecd184291dc_0(1d326dc83ade5967bbfea250745032d9384e03fdd023d515f87615a1eacd73d7): error adding pod openshift-infra_auto-csr-approver-29535348-8k2tp to CNI network "multus-cni-network": plugin type="multus-shim" name="multus-cni-network" failed (add): CmdAdd (shim): CNI request failed with status 400: 'ContainerID:"1d326dc83ade5967bbfea250745032d9384e03fdd023d515f87615a1eacd73d7" Netns:"/var/run/netns/c61e9967-016b-44d2-8b8d-f2ec3ed64bce" IfName:"eth0" Args:"IgnoreUnknown=1;K8S_POD_NAMESPACE=openshift-infra;K8S_POD_NAME=auto-csr-approver-29535348-8k2tp;K8S_POD_INFRA_CONTAINER_ID=1d326dc83ade5967bbfea250745032d9384e03fdd023d515f87615a1eacd73d7;K8S_POD_UID=6e761f1c-0a31-49e0-aee3-2ecd184291dc" Path:"" ERRORED: error configuring pod [openshift-infra/auto-csr-approver-29535348-8k2tp] networking: Multus: [openshift-infra/auto-csr-approver-29535348-8k2tp/6e761f1c-0a31-49e0-aee3-2ecd184291dc]: error setting the networks status: SetPodNetworkStatusAnnotation: failed to update the pod auto-csr-approver-29535348-8k2tp in out of cluster comm: SetNetworkStatus: failed to update the pod auto-csr-approver-29535348-8k2tp in out of cluster comm: status update failed for pod /: Get "https://api-int.crc.testing:6443/api/v1/namespaces/openshift-infra/pods/auto-csr-approver-29535348-8k2tp?timeout=1m0s": dial tcp 38.102.83.210:6443: connect: connection refused Feb 26 15:48:01 crc kubenswrapper[4907]: ': StdinData: 
{"binDir":"/var/lib/cni/bin","clusterNetwork":"/host/run/multus/cni/net.d/10-ovn-kubernetes.conf","cniVersion":"0.3.1","daemonSocketDir":"/run/multus/socket","globalNamespaces":"default,openshift-multus,openshift-sriov-network-operator,openshift-cnv","logLevel":"verbose","logToStderr":true,"name":"multus-cni-network","namespaceIsolation":true,"type":"multus-shim"} Feb 26 15:48:01 crc kubenswrapper[4907]: > pod="openshift-infra/auto-csr-approver-29535348-8k2tp" Feb 26 15:48:01 crc kubenswrapper[4907]: E0226 15:48:01.409146 4907 kuberuntime_manager.go:1170] "CreatePodSandbox for pod failed" err=< Feb 26 15:48:01 crc kubenswrapper[4907]: rpc error: code = Unknown desc = failed to create pod network sandbox k8s_auto-csr-approver-29535348-8k2tp_openshift-infra_6e761f1c-0a31-49e0-aee3-2ecd184291dc_0(1d326dc83ade5967bbfea250745032d9384e03fdd023d515f87615a1eacd73d7): error adding pod openshift-infra_auto-csr-approver-29535348-8k2tp to CNI network "multus-cni-network": plugin type="multus-shim" name="multus-cni-network" failed (add): CmdAdd (shim): CNI request failed with status 400: 'ContainerID:"1d326dc83ade5967bbfea250745032d9384e03fdd023d515f87615a1eacd73d7" Netns:"/var/run/netns/c61e9967-016b-44d2-8b8d-f2ec3ed64bce" IfName:"eth0" Args:"IgnoreUnknown=1;K8S_POD_NAMESPACE=openshift-infra;K8S_POD_NAME=auto-csr-approver-29535348-8k2tp;K8S_POD_INFRA_CONTAINER_ID=1d326dc83ade5967bbfea250745032d9384e03fdd023d515f87615a1eacd73d7;K8S_POD_UID=6e761f1c-0a31-49e0-aee3-2ecd184291dc" Path:"" ERRORED: error configuring pod [openshift-infra/auto-csr-approver-29535348-8k2tp] networking: Multus: [openshift-infra/auto-csr-approver-29535348-8k2tp/6e761f1c-0a31-49e0-aee3-2ecd184291dc]: error setting the networks status: SetPodNetworkStatusAnnotation: failed to update the pod auto-csr-approver-29535348-8k2tp in out of cluster comm: SetNetworkStatus: failed to update the pod auto-csr-approver-29535348-8k2tp in out of cluster comm: status update failed for pod /: Get 
"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-infra/pods/auto-csr-approver-29535348-8k2tp?timeout=1m0s": dial tcp 38.102.83.210:6443: connect: connection refused Feb 26 15:48:01 crc kubenswrapper[4907]: ': StdinData: {"binDir":"/var/lib/cni/bin","clusterNetwork":"/host/run/multus/cni/net.d/10-ovn-kubernetes.conf","cniVersion":"0.3.1","daemonSocketDir":"/run/multus/socket","globalNamespaces":"default,openshift-multus,openshift-sriov-network-operator,openshift-cnv","logLevel":"verbose","logToStderr":true,"name":"multus-cni-network","namespaceIsolation":true,"type":"multus-shim"} Feb 26 15:48:01 crc kubenswrapper[4907]: > pod="openshift-infra/auto-csr-approver-29535348-8k2tp" Feb 26 15:48:01 crc kubenswrapper[4907]: E0226 15:48:01.409274 4907 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"auto-csr-approver-29535348-8k2tp_openshift-infra(6e761f1c-0a31-49e0-aee3-2ecd184291dc)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"auto-csr-approver-29535348-8k2tp_openshift-infra(6e761f1c-0a31-49e0-aee3-2ecd184291dc)\\\": rpc error: code = Unknown desc = failed to create pod network sandbox k8s_auto-csr-approver-29535348-8k2tp_openshift-infra_6e761f1c-0a31-49e0-aee3-2ecd184291dc_0(1d326dc83ade5967bbfea250745032d9384e03fdd023d515f87615a1eacd73d7): error adding pod openshift-infra_auto-csr-approver-29535348-8k2tp to CNI network \\\"multus-cni-network\\\": plugin type=\\\"multus-shim\\\" name=\\\"multus-cni-network\\\" failed (add): CmdAdd (shim): CNI request failed with status 400: 'ContainerID:\\\"1d326dc83ade5967bbfea250745032d9384e03fdd023d515f87615a1eacd73d7\\\" Netns:\\\"/var/run/netns/c61e9967-016b-44d2-8b8d-f2ec3ed64bce\\\" IfName:\\\"eth0\\\" Args:\\\"IgnoreUnknown=1;K8S_POD_NAMESPACE=openshift-infra;K8S_POD_NAME=auto-csr-approver-29535348-8k2tp;K8S_POD_INFRA_CONTAINER_ID=1d326dc83ade5967bbfea250745032d9384e03fdd023d515f87615a1eacd73d7;K8S_POD_UID=6e761f1c-0a31-49e0-aee3-2ecd184291dc\\\" 
Path:\\\"\\\" ERRORED: error configuring pod [openshift-infra/auto-csr-approver-29535348-8k2tp] networking: Multus: [openshift-infra/auto-csr-approver-29535348-8k2tp/6e761f1c-0a31-49e0-aee3-2ecd184291dc]: error setting the networks status: SetPodNetworkStatusAnnotation: failed to update the pod auto-csr-approver-29535348-8k2tp in out of cluster comm: SetNetworkStatus: failed to update the pod auto-csr-approver-29535348-8k2tp in out of cluster comm: status update failed for pod /: Get \\\"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-infra/pods/auto-csr-approver-29535348-8k2tp?timeout=1m0s\\\": dial tcp 38.102.83.210:6443: connect: connection refused\\n': StdinData: {\\\"binDir\\\":\\\"/var/lib/cni/bin\\\",\\\"clusterNetwork\\\":\\\"/host/run/multus/cni/net.d/10-ovn-kubernetes.conf\\\",\\\"cniVersion\\\":\\\"0.3.1\\\",\\\"daemonSocketDir\\\":\\\"/run/multus/socket\\\",\\\"globalNamespaces\\\":\\\"default,openshift-multus,openshift-sriov-network-operator,openshift-cnv\\\",\\\"logLevel\\\":\\\"verbose\\\",\\\"logToStderr\\\":true,\\\"name\\\":\\\"multus-cni-network\\\",\\\"namespaceIsolation\\\":true,\\\"type\\\":\\\"multus-shim\\\"}\"" pod="openshift-infra/auto-csr-approver-29535348-8k2tp" podUID="6e761f1c-0a31-49e0-aee3-2ecd184291dc" Feb 26 15:48:01 crc kubenswrapper[4907]: I0226 15:48:01.580549 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" event={"ID":"f85e55b1a89d02b0cb034b1ea31ed45a","Type":"ContainerStarted","Data":"b4f9fde9bfda905320bf8cf8897c11cf190969f123ff5644b5c3c62ce53613c4"} Feb 26 15:48:01 crc kubenswrapper[4907]: I0226 15:48:01.580966 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" event={"ID":"f85e55b1a89d02b0cb034b1ea31ed45a","Type":"ContainerStarted","Data":"2e1d23ed512934c7f1b2a61abf70a987b2f6d092d482c52a0eadb2b8b80fd6e7"} Feb 26 15:48:01 crc kubenswrapper[4907]: I0226 15:48:01.582864 4907 
log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/4.log" Feb 26 15:48:01 crc kubenswrapper[4907]: I0226 15:48:01.584797 4907 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-cert-syncer/0.log" Feb 26 15:48:01 crc kubenswrapper[4907]: I0226 15:48:01.585559 4907 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="3cfea1638c0926e3aba947161db48db309efe614e7b082a3896c2c6cfc93ffb7" exitCode=0 Feb 26 15:48:01 crc kubenswrapper[4907]: I0226 15:48:01.585619 4907 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="8cf7bf0e49be4282c641d1e48be50a327bb418475701bfde61f4249724709e11" exitCode=0 Feb 26 15:48:01 crc kubenswrapper[4907]: I0226 15:48:01.585636 4907 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="64e8ac34f3cae799ba04d2bba51c22e4d99cf03261778fe3ba7a2320e661e727" exitCode=0 Feb 26 15:48:01 crc kubenswrapper[4907]: I0226 15:48:01.585648 4907 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="42e24dea757f775f836c5c1fdb77c920db85f523bc0a35d2f2fb22e766274556" exitCode=2 Feb 26 15:48:01 crc kubenswrapper[4907]: I0226 15:48:01.585688 4907 scope.go:117] "RemoveContainer" containerID="a3c61b08bda7c918a3fa7b01e6f80515ee05a5746e189e829d2872c181b80c85" Feb 26 15:48:01 crc kubenswrapper[4907]: I0226 15:48:01.588312 4907 generic.go:334] "Generic (PLEG): container finished" podID="abc73ba8-89c5-4844-a81e-742468c4366c" containerID="61366141be31e3250da68ac97435813984e0e5f56c778448271c955a8e8ad5b1" exitCode=0 Feb 26 15:48:01 crc kubenswrapper[4907]: I0226 15:48:01.588382 4907 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29535348-8k2tp" Feb 26 15:48:01 crc kubenswrapper[4907]: I0226 15:48:01.588788 4907 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29535348-8k2tp" Feb 26 15:48:01 crc kubenswrapper[4907]: I0226 15:48:01.588956 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/installer-9-crc" event={"ID":"abc73ba8-89c5-4844-a81e-742468c4366c","Type":"ContainerDied","Data":"61366141be31e3250da68ac97435813984e0e5f56c778448271c955a8e8ad5b1"} Feb 26 15:48:02 crc kubenswrapper[4907]: E0226 15:48:02.377042 4907 log.go:32] "RunPodSandbox from runtime service failed" err=< Feb 26 15:48:02 crc kubenswrapper[4907]: rpc error: code = Unknown desc = failed to create pod network sandbox k8s_auto-csr-approver-29535348-8k2tp_openshift-infra_6e761f1c-0a31-49e0-aee3-2ecd184291dc_0(425d5d7ade4ccd31eb406a18e9959a0805c17786f306c432f92afc2830471784): error adding pod openshift-infra_auto-csr-approver-29535348-8k2tp to CNI network "multus-cni-network": plugin type="multus-shim" name="multus-cni-network" failed (add): CmdAdd (shim): CNI request failed with status 400: 'ContainerID:"425d5d7ade4ccd31eb406a18e9959a0805c17786f306c432f92afc2830471784" Netns:"/var/run/netns/5a7ec994-fe09-46be-8ca7-48ee80521c3c" IfName:"eth0" Args:"IgnoreUnknown=1;K8S_POD_NAMESPACE=openshift-infra;K8S_POD_NAME=auto-csr-approver-29535348-8k2tp;K8S_POD_INFRA_CONTAINER_ID=425d5d7ade4ccd31eb406a18e9959a0805c17786f306c432f92afc2830471784;K8S_POD_UID=6e761f1c-0a31-49e0-aee3-2ecd184291dc" Path:"" ERRORED: error configuring pod [openshift-infra/auto-csr-approver-29535348-8k2tp] networking: Multus: [openshift-infra/auto-csr-approver-29535348-8k2tp/6e761f1c-0a31-49e0-aee3-2ecd184291dc]: error setting the networks status: SetPodNetworkStatusAnnotation: failed to update the pod auto-csr-approver-29535348-8k2tp in out of cluster comm: SetNetworkStatus: failed to update the pod 
auto-csr-approver-29535348-8k2tp in out of cluster comm: status update failed for pod /: Get "https://api-int.crc.testing:6443/api/v1/namespaces/openshift-infra/pods/auto-csr-approver-29535348-8k2tp?timeout=1m0s": dial tcp 38.102.83.210:6443: connect: connection refused Feb 26 15:48:02 crc kubenswrapper[4907]: ': StdinData: {"binDir":"/var/lib/cni/bin","clusterNetwork":"/host/run/multus/cni/net.d/10-ovn-kubernetes.conf","cniVersion":"0.3.1","daemonSocketDir":"/run/multus/socket","globalNamespaces":"default,openshift-multus,openshift-sriov-network-operator,openshift-cnv","logLevel":"verbose","logToStderr":true,"name":"multus-cni-network","namespaceIsolation":true,"type":"multus-shim"} Feb 26 15:48:02 crc kubenswrapper[4907]: > Feb 26 15:48:02 crc kubenswrapper[4907]: E0226 15:48:02.377347 4907 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err=< Feb 26 15:48:02 crc kubenswrapper[4907]: rpc error: code = Unknown desc = failed to create pod network sandbox k8s_auto-csr-approver-29535348-8k2tp_openshift-infra_6e761f1c-0a31-49e0-aee3-2ecd184291dc_0(425d5d7ade4ccd31eb406a18e9959a0805c17786f306c432f92afc2830471784): error adding pod openshift-infra_auto-csr-approver-29535348-8k2tp to CNI network "multus-cni-network": plugin type="multus-shim" name="multus-cni-network" failed (add): CmdAdd (shim): CNI request failed with status 400: 'ContainerID:"425d5d7ade4ccd31eb406a18e9959a0805c17786f306c432f92afc2830471784" Netns:"/var/run/netns/5a7ec994-fe09-46be-8ca7-48ee80521c3c" IfName:"eth0" Args:"IgnoreUnknown=1;K8S_POD_NAMESPACE=openshift-infra;K8S_POD_NAME=auto-csr-approver-29535348-8k2tp;K8S_POD_INFRA_CONTAINER_ID=425d5d7ade4ccd31eb406a18e9959a0805c17786f306c432f92afc2830471784;K8S_POD_UID=6e761f1c-0a31-49e0-aee3-2ecd184291dc" Path:"" ERRORED: error configuring pod [openshift-infra/auto-csr-approver-29535348-8k2tp] networking: Multus: [openshift-infra/auto-csr-approver-29535348-8k2tp/6e761f1c-0a31-49e0-aee3-2ecd184291dc]: error setting the networks status: 
SetPodNetworkStatusAnnotation: failed to update the pod auto-csr-approver-29535348-8k2tp in out of cluster comm: SetNetworkStatus: failed to update the pod auto-csr-approver-29535348-8k2tp in out of cluster comm: status update failed for pod /: Get "https://api-int.crc.testing:6443/api/v1/namespaces/openshift-infra/pods/auto-csr-approver-29535348-8k2tp?timeout=1m0s": dial tcp 38.102.83.210:6443: connect: connection refused Feb 26 15:48:02 crc kubenswrapper[4907]: ': StdinData: {"binDir":"/var/lib/cni/bin","clusterNetwork":"/host/run/multus/cni/net.d/10-ovn-kubernetes.conf","cniVersion":"0.3.1","daemonSocketDir":"/run/multus/socket","globalNamespaces":"default,openshift-multus,openshift-sriov-network-operator,openshift-cnv","logLevel":"verbose","logToStderr":true,"name":"multus-cni-network","namespaceIsolation":true,"type":"multus-shim"} Feb 26 15:48:02 crc kubenswrapper[4907]: > pod="openshift-infra/auto-csr-approver-29535348-8k2tp" Feb 26 15:48:02 crc kubenswrapper[4907]: E0226 15:48:02.377368 4907 kuberuntime_manager.go:1170] "CreatePodSandbox for pod failed" err=< Feb 26 15:48:02 crc kubenswrapper[4907]: rpc error: code = Unknown desc = failed to create pod network sandbox k8s_auto-csr-approver-29535348-8k2tp_openshift-infra_6e761f1c-0a31-49e0-aee3-2ecd184291dc_0(425d5d7ade4ccd31eb406a18e9959a0805c17786f306c432f92afc2830471784): error adding pod openshift-infra_auto-csr-approver-29535348-8k2tp to CNI network "multus-cni-network": plugin type="multus-shim" name="multus-cni-network" failed (add): CmdAdd (shim): CNI request failed with status 400: 'ContainerID:"425d5d7ade4ccd31eb406a18e9959a0805c17786f306c432f92afc2830471784" Netns:"/var/run/netns/5a7ec994-fe09-46be-8ca7-48ee80521c3c" IfName:"eth0" Args:"IgnoreUnknown=1;K8S_POD_NAMESPACE=openshift-infra;K8S_POD_NAME=auto-csr-approver-29535348-8k2tp;K8S_POD_INFRA_CONTAINER_ID=425d5d7ade4ccd31eb406a18e9959a0805c17786f306c432f92afc2830471784;K8S_POD_UID=6e761f1c-0a31-49e0-aee3-2ecd184291dc" Path:"" ERRORED: error 
configuring pod [openshift-infra/auto-csr-approver-29535348-8k2tp] networking: Multus: [openshift-infra/auto-csr-approver-29535348-8k2tp/6e761f1c-0a31-49e0-aee3-2ecd184291dc]: error setting the networks status: SetPodNetworkStatusAnnotation: failed to update the pod auto-csr-approver-29535348-8k2tp in out of cluster comm: SetNetworkStatus: failed to update the pod auto-csr-approver-29535348-8k2tp in out of cluster comm: status update failed for pod /: Get "https://api-int.crc.testing:6443/api/v1/namespaces/openshift-infra/pods/auto-csr-approver-29535348-8k2tp?timeout=1m0s": dial tcp 38.102.83.210:6443: connect: connection refused Feb 26 15:48:02 crc kubenswrapper[4907]: ': StdinData: {"binDir":"/var/lib/cni/bin","clusterNetwork":"/host/run/multus/cni/net.d/10-ovn-kubernetes.conf","cniVersion":"0.3.1","daemonSocketDir":"/run/multus/socket","globalNamespaces":"default,openshift-multus,openshift-sriov-network-operator,openshift-cnv","logLevel":"verbose","logToStderr":true,"name":"multus-cni-network","namespaceIsolation":true,"type":"multus-shim"} Feb 26 15:48:02 crc kubenswrapper[4907]: > pod="openshift-infra/auto-csr-approver-29535348-8k2tp" Feb 26 15:48:02 crc kubenswrapper[4907]: E0226 15:48:02.377417 4907 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"auto-csr-approver-29535348-8k2tp_openshift-infra(6e761f1c-0a31-49e0-aee3-2ecd184291dc)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"auto-csr-approver-29535348-8k2tp_openshift-infra(6e761f1c-0a31-49e0-aee3-2ecd184291dc)\\\": rpc error: code = Unknown desc = failed to create pod network sandbox k8s_auto-csr-approver-29535348-8k2tp_openshift-infra_6e761f1c-0a31-49e0-aee3-2ecd184291dc_0(425d5d7ade4ccd31eb406a18e9959a0805c17786f306c432f92afc2830471784): error adding pod openshift-infra_auto-csr-approver-29535348-8k2tp to CNI network \\\"multus-cni-network\\\": plugin type=\\\"multus-shim\\\" name=\\\"multus-cni-network\\\" failed (add): CmdAdd (shim): 
CNI request failed with status 400: 'ContainerID:\\\"425d5d7ade4ccd31eb406a18e9959a0805c17786f306c432f92afc2830471784\\\" Netns:\\\"/var/run/netns/5a7ec994-fe09-46be-8ca7-48ee80521c3c\\\" IfName:\\\"eth0\\\" Args:\\\"IgnoreUnknown=1;K8S_POD_NAMESPACE=openshift-infra;K8S_POD_NAME=auto-csr-approver-29535348-8k2tp;K8S_POD_INFRA_CONTAINER_ID=425d5d7ade4ccd31eb406a18e9959a0805c17786f306c432f92afc2830471784;K8S_POD_UID=6e761f1c-0a31-49e0-aee3-2ecd184291dc\\\" Path:\\\"\\\" ERRORED: error configuring pod [openshift-infra/auto-csr-approver-29535348-8k2tp] networking: Multus: [openshift-infra/auto-csr-approver-29535348-8k2tp/6e761f1c-0a31-49e0-aee3-2ecd184291dc]: error setting the networks status: SetPodNetworkStatusAnnotation: failed to update the pod auto-csr-approver-29535348-8k2tp in out of cluster comm: SetNetworkStatus: failed to update the pod auto-csr-approver-29535348-8k2tp in out of cluster comm: status update failed for pod /: Get \\\"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-infra/pods/auto-csr-approver-29535348-8k2tp?timeout=1m0s\\\": dial tcp 38.102.83.210:6443: connect: connection refused\\n': StdinData: {\\\"binDir\\\":\\\"/var/lib/cni/bin\\\",\\\"clusterNetwork\\\":\\\"/host/run/multus/cni/net.d/10-ovn-kubernetes.conf\\\",\\\"cniVersion\\\":\\\"0.3.1\\\",\\\"daemonSocketDir\\\":\\\"/run/multus/socket\\\",\\\"globalNamespaces\\\":\\\"default,openshift-multus,openshift-sriov-network-operator,openshift-cnv\\\",\\\"logLevel\\\":\\\"verbose\\\",\\\"logToStderr\\\":true,\\\"name\\\":\\\"multus-cni-network\\\",\\\"namespaceIsolation\\\":true,\\\"type\\\":\\\"multus-shim\\\"}\"" pod="openshift-infra/auto-csr-approver-29535348-8k2tp" podUID="6e761f1c-0a31-49e0-aee3-2ecd184291dc" Feb 26 15:48:02 crc kubenswrapper[4907]: I0226 15:48:02.605632 4907 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-cert-syncer/0.log" Feb 26 15:48:03 crc 
kubenswrapper[4907]: I0226 15:48:03.087837 4907 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/installer-9-crc" Feb 26 15:48:03 crc kubenswrapper[4907]: I0226 15:48:03.092793 4907 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-cert-syncer/0.log" Feb 26 15:48:03 crc kubenswrapper[4907]: I0226 15:48:03.093510 4907 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-crc" Feb 26 15:48:03 crc kubenswrapper[4907]: I0226 15:48:03.239981 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-cert-dir\") pod \"f4b27818a5e8e43d0dc095d08835c792\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " Feb 26 15:48:03 crc kubenswrapper[4907]: I0226 15:48:03.240024 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-resource-dir\") pod \"f4b27818a5e8e43d0dc095d08835c792\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " Feb 26 15:48:03 crc kubenswrapper[4907]: I0226 15:48:03.240088 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/abc73ba8-89c5-4844-a81e-742468c4366c-var-lock\") pod \"abc73ba8-89c5-4844-a81e-742468c4366c\" (UID: \"abc73ba8-89c5-4844-a81e-742468c4366c\") " Feb 26 15:48:03 crc kubenswrapper[4907]: I0226 15:48:03.240115 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-audit-dir\") pod \"f4b27818a5e8e43d0dc095d08835c792\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " Feb 26 15:48:03 crc kubenswrapper[4907]: I0226 
15:48:03.240112 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-cert-dir" (OuterVolumeSpecName: "cert-dir") pod "f4b27818a5e8e43d0dc095d08835c792" (UID: "f4b27818a5e8e43d0dc095d08835c792"). InnerVolumeSpecName "cert-dir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 26 15:48:03 crc kubenswrapper[4907]: I0226 15:48:03.240139 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/abc73ba8-89c5-4844-a81e-742468c4366c-kube-api-access\") pod \"abc73ba8-89c5-4844-a81e-742468c4366c\" (UID: \"abc73ba8-89c5-4844-a81e-742468c4366c\") " Feb 26 15:48:03 crc kubenswrapper[4907]: I0226 15:48:03.240167 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/abc73ba8-89c5-4844-a81e-742468c4366c-kubelet-dir\") pod \"abc73ba8-89c5-4844-a81e-742468c4366c\" (UID: \"abc73ba8-89c5-4844-a81e-742468c4366c\") " Feb 26 15:48:03 crc kubenswrapper[4907]: I0226 15:48:03.240171 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/abc73ba8-89c5-4844-a81e-742468c4366c-var-lock" (OuterVolumeSpecName: "var-lock") pod "abc73ba8-89c5-4844-a81e-742468c4366c" (UID: "abc73ba8-89c5-4844-a81e-742468c4366c"). InnerVolumeSpecName "var-lock". PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 26 15:48:03 crc kubenswrapper[4907]: I0226 15:48:03.240201 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-resource-dir" (OuterVolumeSpecName: "resource-dir") pod "f4b27818a5e8e43d0dc095d08835c792" (UID: "f4b27818a5e8e43d0dc095d08835c792"). InnerVolumeSpecName "resource-dir". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 26 15:48:03 crc kubenswrapper[4907]: I0226 15:48:03.240310 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/abc73ba8-89c5-4844-a81e-742468c4366c-kubelet-dir" (OuterVolumeSpecName: "kubelet-dir") pod "abc73ba8-89c5-4844-a81e-742468c4366c" (UID: "abc73ba8-89c5-4844-a81e-742468c4366c"). InnerVolumeSpecName "kubelet-dir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 26 15:48:03 crc kubenswrapper[4907]: I0226 15:48:03.240357 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-audit-dir" (OuterVolumeSpecName: "audit-dir") pod "f4b27818a5e8e43d0dc095d08835c792" (UID: "f4b27818a5e8e43d0dc095d08835c792"). InnerVolumeSpecName "audit-dir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 26 15:48:03 crc kubenswrapper[4907]: I0226 15:48:03.240775 4907 reconciler_common.go:293] "Volume detached for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-cert-dir\") on node \"crc\" DevicePath \"\"" Feb 26 15:48:03 crc kubenswrapper[4907]: I0226 15:48:03.240802 4907 reconciler_common.go:293] "Volume detached for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-resource-dir\") on node \"crc\" DevicePath \"\"" Feb 26 15:48:03 crc kubenswrapper[4907]: I0226 15:48:03.240824 4907 reconciler_common.go:293] "Volume detached for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/abc73ba8-89c5-4844-a81e-742468c4366c-var-lock\") on node \"crc\" DevicePath \"\"" Feb 26 15:48:03 crc kubenswrapper[4907]: I0226 15:48:03.240842 4907 reconciler_common.go:293] "Volume detached for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-audit-dir\") on node \"crc\" DevicePath \"\"" Feb 26 15:48:03 crc kubenswrapper[4907]: I0226 15:48:03.240865 4907 
reconciler_common.go:293] "Volume detached for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/abc73ba8-89c5-4844-a81e-742468c4366c-kubelet-dir\") on node \"crc\" DevicePath \"\"" Feb 26 15:48:03 crc kubenswrapper[4907]: I0226 15:48:03.248819 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/abc73ba8-89c5-4844-a81e-742468c4366c-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "abc73ba8-89c5-4844-a81e-742468c4366c" (UID: "abc73ba8-89c5-4844-a81e-742468c4366c"). InnerVolumeSpecName "kube-api-access". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 26 15:48:03 crc kubenswrapper[4907]: I0226 15:48:03.341986 4907 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/abc73ba8-89c5-4844-a81e-742468c4366c-kube-api-access\") on node \"crc\" DevicePath \"\"" Feb 26 15:48:03 crc kubenswrapper[4907]: I0226 15:48:03.619626 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/installer-9-crc" event={"ID":"abc73ba8-89c5-4844-a81e-742468c4366c","Type":"ContainerDied","Data":"7cdc24871a7d9475407aaff2ed547ec8ab5b2de563852e2e7155029c78a8df0c"} Feb 26 15:48:03 crc kubenswrapper[4907]: I0226 15:48:03.619667 4907 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/installer-9-crc" Feb 26 15:48:03 crc kubenswrapper[4907]: I0226 15:48:03.619685 4907 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="7cdc24871a7d9475407aaff2ed547ec8ab5b2de563852e2e7155029c78a8df0c" Feb 26 15:48:03 crc kubenswrapper[4907]: I0226 15:48:03.623543 4907 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-cert-syncer/0.log" Feb 26 15:48:03 crc kubenswrapper[4907]: I0226 15:48:03.624749 4907 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="bbc5e8c015ccc6b1a4740c955375e4f995f69ff1f1f698d8e2660ef451da6b8c" exitCode=0 Feb 26 15:48:03 crc kubenswrapper[4907]: I0226 15:48:03.624807 4907 scope.go:117] "RemoveContainer" containerID="3cfea1638c0926e3aba947161db48db309efe614e7b082a3896c2c6cfc93ffb7" Feb 26 15:48:03 crc kubenswrapper[4907]: I0226 15:48:03.624936 4907 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-crc" Feb 26 15:48:03 crc kubenswrapper[4907]: I0226 15:48:03.649848 4907 scope.go:117] "RemoveContainer" containerID="8cf7bf0e49be4282c641d1e48be50a327bb418475701bfde61f4249724709e11" Feb 26 15:48:03 crc kubenswrapper[4907]: I0226 15:48:03.669281 4907 scope.go:117] "RemoveContainer" containerID="64e8ac34f3cae799ba04d2bba51c22e4d99cf03261778fe3ba7a2320e661e727" Feb 26 15:48:03 crc kubenswrapper[4907]: I0226 15:48:03.684921 4907 scope.go:117] "RemoveContainer" containerID="42e24dea757f775f836c5c1fdb77c920db85f523bc0a35d2f2fb22e766274556" Feb 26 15:48:03 crc kubenswrapper[4907]: I0226 15:48:03.697790 4907 scope.go:117] "RemoveContainer" containerID="bbc5e8c015ccc6b1a4740c955375e4f995f69ff1f1f698d8e2660ef451da6b8c" Feb 26 15:48:03 crc kubenswrapper[4907]: I0226 15:48:03.713394 4907 scope.go:117] "RemoveContainer" containerID="7ff4ef3cac1d6f77bf9c90ee9a0f1d8fca15084e93afdb4e4e0048cbfe904f19" Feb 26 15:48:03 crc kubenswrapper[4907]: I0226 15:48:03.743060 4907 scope.go:117] "RemoveContainer" containerID="3cfea1638c0926e3aba947161db48db309efe614e7b082a3896c2c6cfc93ffb7" Feb 26 15:48:03 crc kubenswrapper[4907]: E0226 15:48:03.744669 4907 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"3cfea1638c0926e3aba947161db48db309efe614e7b082a3896c2c6cfc93ffb7\": container with ID starting with 3cfea1638c0926e3aba947161db48db309efe614e7b082a3896c2c6cfc93ffb7 not found: ID does not exist" containerID="3cfea1638c0926e3aba947161db48db309efe614e7b082a3896c2c6cfc93ffb7" Feb 26 15:48:03 crc kubenswrapper[4907]: I0226 15:48:03.744706 4907 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3cfea1638c0926e3aba947161db48db309efe614e7b082a3896c2c6cfc93ffb7"} err="failed to get container status \"3cfea1638c0926e3aba947161db48db309efe614e7b082a3896c2c6cfc93ffb7\": rpc error: code = NotFound desc = could not find 
container \"3cfea1638c0926e3aba947161db48db309efe614e7b082a3896c2c6cfc93ffb7\": container with ID starting with 3cfea1638c0926e3aba947161db48db309efe614e7b082a3896c2c6cfc93ffb7 not found: ID does not exist" Feb 26 15:48:03 crc kubenswrapper[4907]: I0226 15:48:03.744729 4907 scope.go:117] "RemoveContainer" containerID="8cf7bf0e49be4282c641d1e48be50a327bb418475701bfde61f4249724709e11" Feb 26 15:48:03 crc kubenswrapper[4907]: E0226 15:48:03.746241 4907 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"8cf7bf0e49be4282c641d1e48be50a327bb418475701bfde61f4249724709e11\": container with ID starting with 8cf7bf0e49be4282c641d1e48be50a327bb418475701bfde61f4249724709e11 not found: ID does not exist" containerID="8cf7bf0e49be4282c641d1e48be50a327bb418475701bfde61f4249724709e11" Feb 26 15:48:03 crc kubenswrapper[4907]: I0226 15:48:03.746345 4907 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8cf7bf0e49be4282c641d1e48be50a327bb418475701bfde61f4249724709e11"} err="failed to get container status \"8cf7bf0e49be4282c641d1e48be50a327bb418475701bfde61f4249724709e11\": rpc error: code = NotFound desc = could not find container \"8cf7bf0e49be4282c641d1e48be50a327bb418475701bfde61f4249724709e11\": container with ID starting with 8cf7bf0e49be4282c641d1e48be50a327bb418475701bfde61f4249724709e11 not found: ID does not exist" Feb 26 15:48:03 crc kubenswrapper[4907]: I0226 15:48:03.746393 4907 scope.go:117] "RemoveContainer" containerID="64e8ac34f3cae799ba04d2bba51c22e4d99cf03261778fe3ba7a2320e661e727" Feb 26 15:48:03 crc kubenswrapper[4907]: E0226 15:48:03.746819 4907 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"64e8ac34f3cae799ba04d2bba51c22e4d99cf03261778fe3ba7a2320e661e727\": container with ID starting with 64e8ac34f3cae799ba04d2bba51c22e4d99cf03261778fe3ba7a2320e661e727 not found: ID does 
not exist" containerID="64e8ac34f3cae799ba04d2bba51c22e4d99cf03261778fe3ba7a2320e661e727" Feb 26 15:48:03 crc kubenswrapper[4907]: I0226 15:48:03.746868 4907 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"64e8ac34f3cae799ba04d2bba51c22e4d99cf03261778fe3ba7a2320e661e727"} err="failed to get container status \"64e8ac34f3cae799ba04d2bba51c22e4d99cf03261778fe3ba7a2320e661e727\": rpc error: code = NotFound desc = could not find container \"64e8ac34f3cae799ba04d2bba51c22e4d99cf03261778fe3ba7a2320e661e727\": container with ID starting with 64e8ac34f3cae799ba04d2bba51c22e4d99cf03261778fe3ba7a2320e661e727 not found: ID does not exist" Feb 26 15:48:03 crc kubenswrapper[4907]: I0226 15:48:03.746884 4907 scope.go:117] "RemoveContainer" containerID="42e24dea757f775f836c5c1fdb77c920db85f523bc0a35d2f2fb22e766274556" Feb 26 15:48:03 crc kubenswrapper[4907]: E0226 15:48:03.747333 4907 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"42e24dea757f775f836c5c1fdb77c920db85f523bc0a35d2f2fb22e766274556\": container with ID starting with 42e24dea757f775f836c5c1fdb77c920db85f523bc0a35d2f2fb22e766274556 not found: ID does not exist" containerID="42e24dea757f775f836c5c1fdb77c920db85f523bc0a35d2f2fb22e766274556" Feb 26 15:48:03 crc kubenswrapper[4907]: I0226 15:48:03.747546 4907 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"42e24dea757f775f836c5c1fdb77c920db85f523bc0a35d2f2fb22e766274556"} err="failed to get container status \"42e24dea757f775f836c5c1fdb77c920db85f523bc0a35d2f2fb22e766274556\": rpc error: code = NotFound desc = could not find container \"42e24dea757f775f836c5c1fdb77c920db85f523bc0a35d2f2fb22e766274556\": container with ID starting with 42e24dea757f775f836c5c1fdb77c920db85f523bc0a35d2f2fb22e766274556 not found: ID does not exist" Feb 26 15:48:03 crc kubenswrapper[4907]: I0226 15:48:03.747840 4907 
scope.go:117] "RemoveContainer" containerID="bbc5e8c015ccc6b1a4740c955375e4f995f69ff1f1f698d8e2660ef451da6b8c" Feb 26 15:48:03 crc kubenswrapper[4907]: E0226 15:48:03.748326 4907 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"bbc5e8c015ccc6b1a4740c955375e4f995f69ff1f1f698d8e2660ef451da6b8c\": container with ID starting with bbc5e8c015ccc6b1a4740c955375e4f995f69ff1f1f698d8e2660ef451da6b8c not found: ID does not exist" containerID="bbc5e8c015ccc6b1a4740c955375e4f995f69ff1f1f698d8e2660ef451da6b8c" Feb 26 15:48:03 crc kubenswrapper[4907]: I0226 15:48:03.748352 4907 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"bbc5e8c015ccc6b1a4740c955375e4f995f69ff1f1f698d8e2660ef451da6b8c"} err="failed to get container status \"bbc5e8c015ccc6b1a4740c955375e4f995f69ff1f1f698d8e2660ef451da6b8c\": rpc error: code = NotFound desc = could not find container \"bbc5e8c015ccc6b1a4740c955375e4f995f69ff1f1f698d8e2660ef451da6b8c\": container with ID starting with bbc5e8c015ccc6b1a4740c955375e4f995f69ff1f1f698d8e2660ef451da6b8c not found: ID does not exist" Feb 26 15:48:03 crc kubenswrapper[4907]: I0226 15:48:03.748717 4907 scope.go:117] "RemoveContainer" containerID="7ff4ef3cac1d6f77bf9c90ee9a0f1d8fca15084e93afdb4e4e0048cbfe904f19" Feb 26 15:48:03 crc kubenswrapper[4907]: E0226 15:48:03.749485 4907 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"7ff4ef3cac1d6f77bf9c90ee9a0f1d8fca15084e93afdb4e4e0048cbfe904f19\": container with ID starting with 7ff4ef3cac1d6f77bf9c90ee9a0f1d8fca15084e93afdb4e4e0048cbfe904f19 not found: ID does not exist" containerID="7ff4ef3cac1d6f77bf9c90ee9a0f1d8fca15084e93afdb4e4e0048cbfe904f19" Feb 26 15:48:03 crc kubenswrapper[4907]: I0226 15:48:03.749808 4907 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"7ff4ef3cac1d6f77bf9c90ee9a0f1d8fca15084e93afdb4e4e0048cbfe904f19"} err="failed to get container status \"7ff4ef3cac1d6f77bf9c90ee9a0f1d8fca15084e93afdb4e4e0048cbfe904f19\": rpc error: code = NotFound desc = could not find container \"7ff4ef3cac1d6f77bf9c90ee9a0f1d8fca15084e93afdb4e4e0048cbfe904f19\": container with ID starting with 7ff4ef3cac1d6f77bf9c90ee9a0f1d8fca15084e93afdb4e4e0048cbfe904f19 not found: ID does not exist" Feb 26 15:48:04 crc kubenswrapper[4907]: I0226 15:48:04.135947 4907 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f4b27818a5e8e43d0dc095d08835c792" path="/var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/volumes" Feb 26 15:48:05 crc kubenswrapper[4907]: I0226 15:48:05.674483 4907 status_manager.go:851] "Failed to get status for pod" podUID="f4b27818a5e8e43d0dc095d08835c792" pod="openshift-kube-apiserver/kube-apiserver-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\": dial tcp 38.102.83.210:6443: connect: connection refused" Feb 26 15:48:05 crc kubenswrapper[4907]: I0226 15:48:05.675320 4907 status_manager.go:851] "Failed to get status for pod" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-startup-monitor-crc\": dial tcp 38.102.83.210:6443: connect: connection refused" Feb 26 15:48:05 crc kubenswrapper[4907]: I0226 15:48:05.676070 4907 status_manager.go:851] "Failed to get status for pod" podUID="abc73ba8-89c5-4844-a81e-742468c4366c" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.210:6443: connect: connection refused" Feb 26 15:48:06 crc kubenswrapper[4907]: E0226 15:48:06.397058 4907 event.go:368] "Unable to write event 
(may retry after sleeping)" err="Post \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/events\": dial tcp 38.102.83.210:6443: connect: connection refused" event="&Event{ObjectMeta:{kube-apiserver-startup-monitor-crc.1897d68d5d21c5eb openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-startup-monitor-crc,UID:f85e55b1a89d02b0cb034b1ea31ed45a,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{startup-monitor},},Reason:Pulled,Message:Container image \"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-02-26 15:48:01.034806763 +0000 UTC m=+343.553368652,LastTimestamp:2026-02-26 15:48:01.034806763 +0000 UTC m=+343.553368652,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Feb 26 15:48:08 crc kubenswrapper[4907]: I0226 15:48:08.128945 4907 status_manager.go:851] "Failed to get status for pod" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-startup-monitor-crc\": dial tcp 38.102.83.210:6443: connect: connection refused" Feb 26 15:48:08 crc kubenswrapper[4907]: I0226 15:48:08.129251 4907 status_manager.go:851] "Failed to get status for pod" podUID="abc73ba8-89c5-4844-a81e-742468c4366c" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.210:6443: connect: connection refused" Feb 26 15:48:10 crc kubenswrapper[4907]: E0226 15:48:10.231156 4907 controller.go:195] 
"Failed to update lease" err="Put \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.210:6443: connect: connection refused" Feb 26 15:48:10 crc kubenswrapper[4907]: E0226 15:48:10.231636 4907 controller.go:195] "Failed to update lease" err="Put \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.210:6443: connect: connection refused" Feb 26 15:48:10 crc kubenswrapper[4907]: E0226 15:48:10.231792 4907 controller.go:195] "Failed to update lease" err="Put \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.210:6443: connect: connection refused" Feb 26 15:48:10 crc kubenswrapper[4907]: E0226 15:48:10.231939 4907 controller.go:195] "Failed to update lease" err="Put \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.210:6443: connect: connection refused" Feb 26 15:48:10 crc kubenswrapper[4907]: E0226 15:48:10.232093 4907 controller.go:195] "Failed to update lease" err="Put \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.210:6443: connect: connection refused" Feb 26 15:48:10 crc kubenswrapper[4907]: I0226 15:48:10.232111 4907 controller.go:115] "failed to update lease using latest lease, fallback to ensure lease" err="failed 5 attempts to update lease" Feb 26 15:48:10 crc kubenswrapper[4907]: E0226 15:48:10.232243 4907 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.210:6443: connect: connection refused" interval="200ms" Feb 26 15:48:10 crc kubenswrapper[4907]: E0226 15:48:10.433353 4907 
controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.210:6443: connect: connection refused" interval="400ms" Feb 26 15:48:10 crc kubenswrapper[4907]: E0226 15:48:10.834409 4907 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.210:6443: connect: connection refused" interval="800ms" Feb 26 15:48:11 crc kubenswrapper[4907]: E0226 15:48:11.636379 4907 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.210:6443: connect: connection refused" interval="1.6s" Feb 26 15:48:12 crc kubenswrapper[4907]: I0226 15:48:12.125850 4907 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-crc" Feb 26 15:48:12 crc kubenswrapper[4907]: I0226 15:48:12.126518 4907 status_manager.go:851] "Failed to get status for pod" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-startup-monitor-crc\": dial tcp 38.102.83.210:6443: connect: connection refused" Feb 26 15:48:12 crc kubenswrapper[4907]: I0226 15:48:12.126849 4907 status_manager.go:851] "Failed to get status for pod" podUID="abc73ba8-89c5-4844-a81e-742468c4366c" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.210:6443: connect: connection refused" Feb 26 15:48:12 crc kubenswrapper[4907]: I0226 15:48:12.148324 4907 kubelet.go:1909] "Trying to delete pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="27c9ab80-fcc8-4c5a-9d89-c0504e0e6396" Feb 26 15:48:12 crc kubenswrapper[4907]: I0226 15:48:12.148365 4907 mirror_client.go:130] "Deleting a mirror pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="27c9ab80-fcc8-4c5a-9d89-c0504e0e6396" Feb 26 15:48:12 crc kubenswrapper[4907]: E0226 15:48:12.148739 4907 mirror_client.go:138] "Failed deleting a mirror pod" err="Delete \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\": dial tcp 38.102.83.210:6443: connect: connection refused" pod="openshift-kube-apiserver/kube-apiserver-crc" Feb 26 15:48:12 crc kubenswrapper[4907]: I0226 15:48:12.149232 4907 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-crc" Feb 26 15:48:12 crc kubenswrapper[4907]: W0226 15:48:12.204709 4907 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod71bb4a3aecc4ba5b26c4b7318770ce13.slice/crio-11fea96a4b1be987add648f34d1e3d873b9cd685b29d72bdcd82328b7c35edb8 WatchSource:0}: Error finding container 11fea96a4b1be987add648f34d1e3d873b9cd685b29d72bdcd82328b7c35edb8: Status 404 returned error can't find the container with id 11fea96a4b1be987add648f34d1e3d873b9cd685b29d72bdcd82328b7c35edb8 Feb 26 15:48:12 crc kubenswrapper[4907]: I0226 15:48:12.685381 4907 generic.go:334] "Generic (PLEG): container finished" podID="71bb4a3aecc4ba5b26c4b7318770ce13" containerID="3448c1168bf8b6bf0e0bfd915c9425fcda5483a51bff801e9ddfaf1a60717f0e" exitCode=0 Feb 26 15:48:12 crc kubenswrapper[4907]: I0226 15:48:12.685480 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"71bb4a3aecc4ba5b26c4b7318770ce13","Type":"ContainerDied","Data":"3448c1168bf8b6bf0e0bfd915c9425fcda5483a51bff801e9ddfaf1a60717f0e"} Feb 26 15:48:12 crc kubenswrapper[4907]: I0226 15:48:12.685737 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"71bb4a3aecc4ba5b26c4b7318770ce13","Type":"ContainerStarted","Data":"11fea96a4b1be987add648f34d1e3d873b9cd685b29d72bdcd82328b7c35edb8"} Feb 26 15:48:12 crc kubenswrapper[4907]: I0226 15:48:12.686014 4907 kubelet.go:1909] "Trying to delete pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="27c9ab80-fcc8-4c5a-9d89-c0504e0e6396" Feb 26 15:48:12 crc kubenswrapper[4907]: I0226 15:48:12.686031 4907 mirror_client.go:130] "Deleting a mirror pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="27c9ab80-fcc8-4c5a-9d89-c0504e0e6396" Feb 26 15:48:12 crc kubenswrapper[4907]: E0226 15:48:12.686349 4907 mirror_client.go:138] 
"Failed deleting a mirror pod" err="Delete \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\": dial tcp 38.102.83.210:6443: connect: connection refused" pod="openshift-kube-apiserver/kube-apiserver-crc" Feb 26 15:48:12 crc kubenswrapper[4907]: I0226 15:48:12.686694 4907 status_manager.go:851] "Failed to get status for pod" podUID="abc73ba8-89c5-4844-a81e-742468c4366c" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.210:6443: connect: connection refused" Feb 26 15:48:12 crc kubenswrapper[4907]: I0226 15:48:12.687012 4907 status_manager.go:851] "Failed to get status for pod" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-startup-monitor-crc\": dial tcp 38.102.83.210:6443: connect: connection refused" Feb 26 15:48:13 crc kubenswrapper[4907]: I0226 15:48:13.126209 4907 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29535348-8k2tp" Feb 26 15:48:13 crc kubenswrapper[4907]: I0226 15:48:13.126878 4907 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29535348-8k2tp" Feb 26 15:48:13 crc kubenswrapper[4907]: I0226 15:48:13.700202 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"71bb4a3aecc4ba5b26c4b7318770ce13","Type":"ContainerStarted","Data":"ee4342e9eb8cf2fa3272cf5289a067b21d21f86753e4eda0cae797e9fa15e733"} Feb 26 15:48:13 crc kubenswrapper[4907]: I0226 15:48:13.700455 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"71bb4a3aecc4ba5b26c4b7318770ce13","Type":"ContainerStarted","Data":"3cd0f44ddad40b6121ccbdadf3a14125160f634e7ad70bae2e3531385ad479de"} Feb 26 15:48:13 crc kubenswrapper[4907]: I0226 15:48:13.700467 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"71bb4a3aecc4ba5b26c4b7318770ce13","Type":"ContainerStarted","Data":"20d7cf9c0480fcb2aeba07769c030625a21bba3b058714ca8ca6767f285df779"} Feb 26 15:48:13 crc kubenswrapper[4907]: I0226 15:48:13.700475 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"71bb4a3aecc4ba5b26c4b7318770ce13","Type":"ContainerStarted","Data":"063f82c788301f7fef74ba282f2f34f6102c6d59ec2a1b66d1af940a9e69f2a3"} Feb 26 15:48:13 crc kubenswrapper[4907]: I0226 15:48:13.702334 4907 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-controller-manager_kube-controller-manager-crc_f614b9022728cf315e60c057852e563e/cluster-policy-controller/0.log" Feb 26 15:48:13 crc kubenswrapper[4907]: I0226 15:48:13.702777 4907 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-controller-manager_kube-controller-manager-crc_f614b9022728cf315e60c057852e563e/kube-controller-manager/0.log" Feb 26 15:48:13 crc kubenswrapper[4907]: I0226 15:48:13.702811 4907 generic.go:334] "Generic (PLEG): container finished" 
podID="f614b9022728cf315e60c057852e563e" containerID="b4592db3d17945a9ed96383e96902333033b03f395da93754ffbca7d15b1e633" exitCode=1 Feb 26 15:48:13 crc kubenswrapper[4907]: I0226 15:48:13.702832 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerDied","Data":"b4592db3d17945a9ed96383e96902333033b03f395da93754ffbca7d15b1e633"} Feb 26 15:48:13 crc kubenswrapper[4907]: I0226 15:48:13.703233 4907 scope.go:117] "RemoveContainer" containerID="b4592db3d17945a9ed96383e96902333033b03f395da93754ffbca7d15b1e633" Feb 26 15:48:14 crc kubenswrapper[4907]: I0226 15:48:14.712896 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"71bb4a3aecc4ba5b26c4b7318770ce13","Type":"ContainerStarted","Data":"59d06921bc55167f4b3598d65d3cba5570bfead5b2f34633f515c6edf3a88f10"} Feb 26 15:48:14 crc kubenswrapper[4907]: I0226 15:48:14.713002 4907 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-apiserver/kube-apiserver-crc" Feb 26 15:48:14 crc kubenswrapper[4907]: I0226 15:48:14.713107 4907 kubelet.go:1909] "Trying to delete pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="27c9ab80-fcc8-4c5a-9d89-c0504e0e6396" Feb 26 15:48:14 crc kubenswrapper[4907]: I0226 15:48:14.713131 4907 mirror_client.go:130] "Deleting a mirror pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="27c9ab80-fcc8-4c5a-9d89-c0504e0e6396" Feb 26 15:48:14 crc kubenswrapper[4907]: I0226 15:48:14.717233 4907 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-controller-manager_kube-controller-manager-crc_f614b9022728cf315e60c057852e563e/cluster-policy-controller/0.log" Feb 26 15:48:14 crc kubenswrapper[4907]: I0226 15:48:14.717916 4907 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-kube-controller-manager_kube-controller-manager-crc_f614b9022728cf315e60c057852e563e/kube-controller-manager/0.log" Feb 26 15:48:14 crc kubenswrapper[4907]: I0226 15:48:14.718001 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerStarted","Data":"f0ab34750f76c3e465762c5f08c98c3c165ad0175249dc6d9be1bd4678b31af3"} Feb 26 15:48:16 crc kubenswrapper[4907]: I0226 15:48:16.478513 4907 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Feb 26 15:48:16 crc kubenswrapper[4907]: I0226 15:48:16.572301 4907 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Feb 26 15:48:16 crc kubenswrapper[4907]: I0226 15:48:16.731010 4907 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Feb 26 15:48:17 crc kubenswrapper[4907]: I0226 15:48:17.149524 4907 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-apiserver/kube-apiserver-crc" Feb 26 15:48:17 crc kubenswrapper[4907]: I0226 15:48:17.149571 4907 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-kube-apiserver/kube-apiserver-crc" Feb 26 15:48:17 crc kubenswrapper[4907]: I0226 15:48:17.161678 4907 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-kube-apiserver/kube-apiserver-crc" Feb 26 15:48:19 crc kubenswrapper[4907]: I0226 15:48:19.723248 4907 kubelet.go:1914] "Deleted mirror pod because it is outdated" pod="openshift-kube-apiserver/kube-apiserver-crc" Feb 26 15:48:19 crc kubenswrapper[4907]: I0226 15:48:19.750574 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-infra/auto-csr-approver-29535348-8k2tp" event={"ID":"6e761f1c-0a31-49e0-aee3-2ecd184291dc","Type":"ContainerStarted","Data":"1749a4f5ed2933bb40580a1ece2037a35e532bb2bdd8eb460409414e73a0cf56"} Feb 26 15:48:19 crc kubenswrapper[4907]: I0226 15:48:19.750921 4907 kubelet.go:1909] "Trying to delete pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="27c9ab80-fcc8-4c5a-9d89-c0504e0e6396" Feb 26 15:48:19 crc kubenswrapper[4907]: I0226 15:48:19.750952 4907 mirror_client.go:130] "Deleting a mirror pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="27c9ab80-fcc8-4c5a-9d89-c0504e0e6396" Feb 26 15:48:19 crc kubenswrapper[4907]: I0226 15:48:19.758688 4907 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-apiserver/kube-apiserver-crc" Feb 26 15:48:19 crc kubenswrapper[4907]: I0226 15:48:19.763834 4907 status_manager.go:861] "Pod was deleted and then recreated, skipping status update" pod="openshift-kube-apiserver/kube-apiserver-crc" oldPodUID="71bb4a3aecc4ba5b26c4b7318770ce13" podUID="c14bb07e-46ac-41bf-b43a-9e65cf52602e" Feb 26 15:48:20 crc kubenswrapper[4907]: I0226 15:48:20.757560 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29535348-8k2tp" event={"ID":"6e761f1c-0a31-49e0-aee3-2ecd184291dc","Type":"ContainerStarted","Data":"78f0f5bcde8332a66f2bd4defbe69a2ebd04385376ea98cbf6d4028de8d7dd06"} Feb 26 15:48:20 crc kubenswrapper[4907]: I0226 15:48:20.757844 4907 kubelet.go:1909] "Trying to delete pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="27c9ab80-fcc8-4c5a-9d89-c0504e0e6396" Feb 26 15:48:20 crc kubenswrapper[4907]: I0226 15:48:20.757997 4907 mirror_client.go:130] "Deleting a mirror pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="27c9ab80-fcc8-4c5a-9d89-c0504e0e6396" Feb 26 15:48:21 crc kubenswrapper[4907]: I0226 15:48:21.769072 4907 generic.go:334] "Generic (PLEG): container finished" 
podID="6e761f1c-0a31-49e0-aee3-2ecd184291dc" containerID="78f0f5bcde8332a66f2bd4defbe69a2ebd04385376ea98cbf6d4028de8d7dd06" exitCode=0 Feb 26 15:48:21 crc kubenswrapper[4907]: I0226 15:48:21.769442 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29535348-8k2tp" event={"ID":"6e761f1c-0a31-49e0-aee3-2ecd184291dc","Type":"ContainerDied","Data":"78f0f5bcde8332a66f2bd4defbe69a2ebd04385376ea98cbf6d4028de8d7dd06"} Feb 26 15:48:23 crc kubenswrapper[4907]: I0226 15:48:23.186729 4907 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29535348-8k2tp" Feb 26 15:48:23 crc kubenswrapper[4907]: I0226 15:48:23.252296 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-46jsm\" (UniqueName: \"kubernetes.io/projected/6e761f1c-0a31-49e0-aee3-2ecd184291dc-kube-api-access-46jsm\") pod \"6e761f1c-0a31-49e0-aee3-2ecd184291dc\" (UID: \"6e761f1c-0a31-49e0-aee3-2ecd184291dc\") " Feb 26 15:48:23 crc kubenswrapper[4907]: I0226 15:48:23.257802 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6e761f1c-0a31-49e0-aee3-2ecd184291dc-kube-api-access-46jsm" (OuterVolumeSpecName: "kube-api-access-46jsm") pod "6e761f1c-0a31-49e0-aee3-2ecd184291dc" (UID: "6e761f1c-0a31-49e0-aee3-2ecd184291dc"). InnerVolumeSpecName "kube-api-access-46jsm". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 26 15:48:23 crc kubenswrapper[4907]: I0226 15:48:23.354350 4907 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-46jsm\" (UniqueName: \"kubernetes.io/projected/6e761f1c-0a31-49e0-aee3-2ecd184291dc-kube-api-access-46jsm\") on node \"crc\" DevicePath \"\"" Feb 26 15:48:23 crc kubenswrapper[4907]: I0226 15:48:23.782679 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29535348-8k2tp" event={"ID":"6e761f1c-0a31-49e0-aee3-2ecd184291dc","Type":"ContainerDied","Data":"1749a4f5ed2933bb40580a1ece2037a35e532bb2bdd8eb460409414e73a0cf56"} Feb 26 15:48:23 crc kubenswrapper[4907]: I0226 15:48:23.782945 4907 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="1749a4f5ed2933bb40580a1ece2037a35e532bb2bdd8eb460409414e73a0cf56" Feb 26 15:48:23 crc kubenswrapper[4907]: I0226 15:48:23.782761 4907 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29535348-8k2tp" Feb 26 15:48:25 crc kubenswrapper[4907]: I0226 15:48:25.332393 4907 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-authentication/oauth-openshift-558db77b4-lr7kc" podUID="ca3ab95b-79df-45b9-9ada-c7c713e2e3e6" containerName="oauth-openshift" containerID="cri-o://e7a064d46f10da05acc9a52ec9b08660db6497072fc921c6fcb4b4f75a91b427" gracePeriod=15 Feb 26 15:48:25 crc kubenswrapper[4907]: I0226 15:48:25.796947 4907 generic.go:334] "Generic (PLEG): container finished" podID="ca3ab95b-79df-45b9-9ada-c7c713e2e3e6" containerID="e7a064d46f10da05acc9a52ec9b08660db6497072fc921c6fcb4b4f75a91b427" exitCode=0 Feb 26 15:48:25 crc kubenswrapper[4907]: I0226 15:48:25.797011 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-558db77b4-lr7kc" 
event={"ID":"ca3ab95b-79df-45b9-9ada-c7c713e2e3e6","Type":"ContainerDied","Data":"e7a064d46f10da05acc9a52ec9b08660db6497072fc921c6fcb4b4f75a91b427"} Feb 26 15:48:25 crc kubenswrapper[4907]: I0226 15:48:25.961211 4907 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-authentication/oauth-openshift-558db77b4-lr7kc" Feb 26 15:48:26 crc kubenswrapper[4907]: I0226 15:48:26.092962 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/ca3ab95b-79df-45b9-9ada-c7c713e2e3e6-v4-0-config-user-idp-0-file-data\") pod \"ca3ab95b-79df-45b9-9ada-c7c713e2e3e6\" (UID: \"ca3ab95b-79df-45b9-9ada-c7c713e2e3e6\") " Feb 26 15:48:26 crc kubenswrapper[4907]: I0226 15:48:26.092999 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/ca3ab95b-79df-45b9-9ada-c7c713e2e3e6-v4-0-config-system-router-certs\") pod \"ca3ab95b-79df-45b9-9ada-c7c713e2e3e6\" (UID: \"ca3ab95b-79df-45b9-9ada-c7c713e2e3e6\") " Feb 26 15:48:26 crc kubenswrapper[4907]: I0226 15:48:26.093031 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/ca3ab95b-79df-45b9-9ada-c7c713e2e3e6-v4-0-config-system-serving-cert\") pod \"ca3ab95b-79df-45b9-9ada-c7c713e2e3e6\" (UID: \"ca3ab95b-79df-45b9-9ada-c7c713e2e3e6\") " Feb 26 15:48:26 crc kubenswrapper[4907]: I0226 15:48:26.093054 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/ca3ab95b-79df-45b9-9ada-c7c713e2e3e6-v4-0-config-user-template-provider-selection\") pod \"ca3ab95b-79df-45b9-9ada-c7c713e2e3e6\" (UID: \"ca3ab95b-79df-45b9-9ada-c7c713e2e3e6\") " Feb 26 15:48:26 crc kubenswrapper[4907]: I0226 15:48:26.093073 4907 
reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/ca3ab95b-79df-45b9-9ada-c7c713e2e3e6-v4-0-config-system-ocp-branding-template\") pod \"ca3ab95b-79df-45b9-9ada-c7c713e2e3e6\" (UID: \"ca3ab95b-79df-45b9-9ada-c7c713e2e3e6\") " Feb 26 15:48:26 crc kubenswrapper[4907]: I0226 15:48:26.093094 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/ca3ab95b-79df-45b9-9ada-c7c713e2e3e6-v4-0-config-system-trusted-ca-bundle\") pod \"ca3ab95b-79df-45b9-9ada-c7c713e2e3e6\" (UID: \"ca3ab95b-79df-45b9-9ada-c7c713e2e3e6\") " Feb 26 15:48:26 crc kubenswrapper[4907]: I0226 15:48:26.093112 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/ca3ab95b-79df-45b9-9ada-c7c713e2e3e6-audit-policies\") pod \"ca3ab95b-79df-45b9-9ada-c7c713e2e3e6\" (UID: \"ca3ab95b-79df-45b9-9ada-c7c713e2e3e6\") " Feb 26 15:48:26 crc kubenswrapper[4907]: I0226 15:48:26.093143 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/ca3ab95b-79df-45b9-9ada-c7c713e2e3e6-v4-0-config-user-template-error\") pod \"ca3ab95b-79df-45b9-9ada-c7c713e2e3e6\" (UID: \"ca3ab95b-79df-45b9-9ada-c7c713e2e3e6\") " Feb 26 15:48:26 crc kubenswrapper[4907]: I0226 15:48:26.093159 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lzv9t\" (UniqueName: \"kubernetes.io/projected/ca3ab95b-79df-45b9-9ada-c7c713e2e3e6-kube-api-access-lzv9t\") pod \"ca3ab95b-79df-45b9-9ada-c7c713e2e3e6\" (UID: \"ca3ab95b-79df-45b9-9ada-c7c713e2e3e6\") " Feb 26 15:48:26 crc kubenswrapper[4907]: I0226 15:48:26.093207 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume 
\"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/ca3ab95b-79df-45b9-9ada-c7c713e2e3e6-v4-0-config-user-template-login\") pod \"ca3ab95b-79df-45b9-9ada-c7c713e2e3e6\" (UID: \"ca3ab95b-79df-45b9-9ada-c7c713e2e3e6\") " Feb 26 15:48:26 crc kubenswrapper[4907]: I0226 15:48:26.093233 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/ca3ab95b-79df-45b9-9ada-c7c713e2e3e6-v4-0-config-system-cliconfig\") pod \"ca3ab95b-79df-45b9-9ada-c7c713e2e3e6\" (UID: \"ca3ab95b-79df-45b9-9ada-c7c713e2e3e6\") " Feb 26 15:48:26 crc kubenswrapper[4907]: I0226 15:48:26.093273 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/ca3ab95b-79df-45b9-9ada-c7c713e2e3e6-audit-dir\") pod \"ca3ab95b-79df-45b9-9ada-c7c713e2e3e6\" (UID: \"ca3ab95b-79df-45b9-9ada-c7c713e2e3e6\") " Feb 26 15:48:26 crc kubenswrapper[4907]: I0226 15:48:26.093292 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/ca3ab95b-79df-45b9-9ada-c7c713e2e3e6-v4-0-config-system-service-ca\") pod \"ca3ab95b-79df-45b9-9ada-c7c713e2e3e6\" (UID: \"ca3ab95b-79df-45b9-9ada-c7c713e2e3e6\") " Feb 26 15:48:26 crc kubenswrapper[4907]: I0226 15:48:26.093311 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/ca3ab95b-79df-45b9-9ada-c7c713e2e3e6-v4-0-config-system-session\") pod \"ca3ab95b-79df-45b9-9ada-c7c713e2e3e6\" (UID: \"ca3ab95b-79df-45b9-9ada-c7c713e2e3e6\") " Feb 26 15:48:26 crc kubenswrapper[4907]: I0226 15:48:26.095085 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ca3ab95b-79df-45b9-9ada-c7c713e2e3e6-v4-0-config-system-trusted-ca-bundle" (OuterVolumeSpecName: 
"v4-0-config-system-trusted-ca-bundle") pod "ca3ab95b-79df-45b9-9ada-c7c713e2e3e6" (UID: "ca3ab95b-79df-45b9-9ada-c7c713e2e3e6"). InnerVolumeSpecName "v4-0-config-system-trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Feb 26 15:48:26 crc kubenswrapper[4907]: I0226 15:48:26.101224 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ca3ab95b-79df-45b9-9ada-c7c713e2e3e6-v4-0-config-user-template-provider-selection" (OuterVolumeSpecName: "v4-0-config-user-template-provider-selection") pod "ca3ab95b-79df-45b9-9ada-c7c713e2e3e6" (UID: "ca3ab95b-79df-45b9-9ada-c7c713e2e3e6"). InnerVolumeSpecName "v4-0-config-user-template-provider-selection". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 26 15:48:26 crc kubenswrapper[4907]: I0226 15:48:26.101430 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ca3ab95b-79df-45b9-9ada-c7c713e2e3e6-v4-0-config-system-session" (OuterVolumeSpecName: "v4-0-config-system-session") pod "ca3ab95b-79df-45b9-9ada-c7c713e2e3e6" (UID: "ca3ab95b-79df-45b9-9ada-c7c713e2e3e6"). InnerVolumeSpecName "v4-0-config-system-session". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 26 15:48:26 crc kubenswrapper[4907]: I0226 15:48:26.101483 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/ca3ab95b-79df-45b9-9ada-c7c713e2e3e6-audit-dir" (OuterVolumeSpecName: "audit-dir") pod "ca3ab95b-79df-45b9-9ada-c7c713e2e3e6" (UID: "ca3ab95b-79df-45b9-9ada-c7c713e2e3e6"). InnerVolumeSpecName "audit-dir". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Feb 26 15:48:26 crc kubenswrapper[4907]: I0226 15:48:26.101786 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ca3ab95b-79df-45b9-9ada-c7c713e2e3e6-v4-0-config-system-serving-cert" (OuterVolumeSpecName: "v4-0-config-system-serving-cert") pod "ca3ab95b-79df-45b9-9ada-c7c713e2e3e6" (UID: "ca3ab95b-79df-45b9-9ada-c7c713e2e3e6"). InnerVolumeSpecName "v4-0-config-system-serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 26 15:48:26 crc kubenswrapper[4907]: I0226 15:48:26.102177 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ca3ab95b-79df-45b9-9ada-c7c713e2e3e6-v4-0-config-system-service-ca" (OuterVolumeSpecName: "v4-0-config-system-service-ca") pod "ca3ab95b-79df-45b9-9ada-c7c713e2e3e6" (UID: "ca3ab95b-79df-45b9-9ada-c7c713e2e3e6"). InnerVolumeSpecName "v4-0-config-system-service-ca". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Feb 26 15:48:26 crc kubenswrapper[4907]: I0226 15:48:26.102288 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ca3ab95b-79df-45b9-9ada-c7c713e2e3e6-v4-0-config-system-router-certs" (OuterVolumeSpecName: "v4-0-config-system-router-certs") pod "ca3ab95b-79df-45b9-9ada-c7c713e2e3e6" (UID: "ca3ab95b-79df-45b9-9ada-c7c713e2e3e6"). InnerVolumeSpecName "v4-0-config-system-router-certs". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 26 15:48:26 crc kubenswrapper[4907]: I0226 15:48:26.102371 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ca3ab95b-79df-45b9-9ada-c7c713e2e3e6-audit-policies" (OuterVolumeSpecName: "audit-policies") pod "ca3ab95b-79df-45b9-9ada-c7c713e2e3e6" (UID: "ca3ab95b-79df-45b9-9ada-c7c713e2e3e6"). InnerVolumeSpecName "audit-policies". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Feb 26 15:48:26 crc kubenswrapper[4907]: I0226 15:48:26.106289 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ca3ab95b-79df-45b9-9ada-c7c713e2e3e6-kube-api-access-lzv9t" (OuterVolumeSpecName: "kube-api-access-lzv9t") pod "ca3ab95b-79df-45b9-9ada-c7c713e2e3e6" (UID: "ca3ab95b-79df-45b9-9ada-c7c713e2e3e6"). InnerVolumeSpecName "kube-api-access-lzv9t". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 26 15:48:26 crc kubenswrapper[4907]: I0226 15:48:26.109143 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ca3ab95b-79df-45b9-9ada-c7c713e2e3e6-v4-0-config-system-ocp-branding-template" (OuterVolumeSpecName: "v4-0-config-system-ocp-branding-template") pod "ca3ab95b-79df-45b9-9ada-c7c713e2e3e6" (UID: "ca3ab95b-79df-45b9-9ada-c7c713e2e3e6"). InnerVolumeSpecName "v4-0-config-system-ocp-branding-template". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 26 15:48:26 crc kubenswrapper[4907]: I0226 15:48:26.109223 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ca3ab95b-79df-45b9-9ada-c7c713e2e3e6-v4-0-config-system-cliconfig" (OuterVolumeSpecName: "v4-0-config-system-cliconfig") pod "ca3ab95b-79df-45b9-9ada-c7c713e2e3e6" (UID: "ca3ab95b-79df-45b9-9ada-c7c713e2e3e6"). InnerVolumeSpecName "v4-0-config-system-cliconfig". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Feb 26 15:48:26 crc kubenswrapper[4907]: I0226 15:48:26.109434 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ca3ab95b-79df-45b9-9ada-c7c713e2e3e6-v4-0-config-user-template-error" (OuterVolumeSpecName: "v4-0-config-user-template-error") pod "ca3ab95b-79df-45b9-9ada-c7c713e2e3e6" (UID: "ca3ab95b-79df-45b9-9ada-c7c713e2e3e6"). InnerVolumeSpecName "v4-0-config-user-template-error". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 26 15:48:26 crc kubenswrapper[4907]: I0226 15:48:26.117748 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ca3ab95b-79df-45b9-9ada-c7c713e2e3e6-v4-0-config-user-idp-0-file-data" (OuterVolumeSpecName: "v4-0-config-user-idp-0-file-data") pod "ca3ab95b-79df-45b9-9ada-c7c713e2e3e6" (UID: "ca3ab95b-79df-45b9-9ada-c7c713e2e3e6"). InnerVolumeSpecName "v4-0-config-user-idp-0-file-data". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 26 15:48:26 crc kubenswrapper[4907]: I0226 15:48:26.120627 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ca3ab95b-79df-45b9-9ada-c7c713e2e3e6-v4-0-config-user-template-login" (OuterVolumeSpecName: "v4-0-config-user-template-login") pod "ca3ab95b-79df-45b9-9ada-c7c713e2e3e6" (UID: "ca3ab95b-79df-45b9-9ada-c7c713e2e3e6"). InnerVolumeSpecName "v4-0-config-user-template-login". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 26 15:48:26 crc kubenswrapper[4907]: I0226 15:48:26.195327 4907 reconciler_common.go:293] "Volume detached for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/ca3ab95b-79df-45b9-9ada-c7c713e2e3e6-audit-dir\") on node \"crc\" DevicePath \"\""
Feb 26 15:48:26 crc kubenswrapper[4907]: I0226 15:48:26.195374 4907 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/ca3ab95b-79df-45b9-9ada-c7c713e2e3e6-v4-0-config-system-service-ca\") on node \"crc\" DevicePath \"\""
Feb 26 15:48:26 crc kubenswrapper[4907]: I0226 15:48:26.195399 4907 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/ca3ab95b-79df-45b9-9ada-c7c713e2e3e6-v4-0-config-system-session\") on node \"crc\" DevicePath \"\""
Feb 26 15:48:26 crc kubenswrapper[4907]: I0226 15:48:26.195420 4907 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/ca3ab95b-79df-45b9-9ada-c7c713e2e3e6-v4-0-config-user-idp-0-file-data\") on node \"crc\" DevicePath \"\""
Feb 26 15:48:26 crc kubenswrapper[4907]: I0226 15:48:26.195439 4907 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/ca3ab95b-79df-45b9-9ada-c7c713e2e3e6-v4-0-config-system-router-certs\") on node \"crc\" DevicePath \"\""
Feb 26 15:48:26 crc kubenswrapper[4907]: I0226 15:48:26.195458 4907 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/ca3ab95b-79df-45b9-9ada-c7c713e2e3e6-v4-0-config-system-serving-cert\") on node \"crc\" DevicePath \"\""
Feb 26 15:48:26 crc kubenswrapper[4907]: I0226 15:48:26.195478 4907 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/ca3ab95b-79df-45b9-9ada-c7c713e2e3e6-v4-0-config-user-template-provider-selection\") on node \"crc\" DevicePath \"\""
Feb 26 15:48:26 crc kubenswrapper[4907]: I0226 15:48:26.195497 4907 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/ca3ab95b-79df-45b9-9ada-c7c713e2e3e6-v4-0-config-system-ocp-branding-template\") on node \"crc\" DevicePath \"\""
Feb 26 15:48:26 crc kubenswrapper[4907]: I0226 15:48:26.195516 4907 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/ca3ab95b-79df-45b9-9ada-c7c713e2e3e6-v4-0-config-system-trusted-ca-bundle\") on node \"crc\" DevicePath \"\""
Feb 26 15:48:26 crc kubenswrapper[4907]: I0226 15:48:26.195535 4907 reconciler_common.go:293] "Volume detached for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/ca3ab95b-79df-45b9-9ada-c7c713e2e3e6-audit-policies\") on node \"crc\" DevicePath \"\""
Feb 26 15:48:26 crc kubenswrapper[4907]: I0226 15:48:26.195552 4907 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/ca3ab95b-79df-45b9-9ada-c7c713e2e3e6-v4-0-config-user-template-error\") on node \"crc\" DevicePath \"\""
Feb 26 15:48:26 crc kubenswrapper[4907]: I0226 15:48:26.195571 4907 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-lzv9t\" (UniqueName: \"kubernetes.io/projected/ca3ab95b-79df-45b9-9ada-c7c713e2e3e6-kube-api-access-lzv9t\") on node \"crc\" DevicePath \"\""
Feb 26 15:48:26 crc kubenswrapper[4907]: I0226 15:48:26.196736 4907 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/ca3ab95b-79df-45b9-9ada-c7c713e2e3e6-v4-0-config-user-template-login\") on node \"crc\" DevicePath \"\""
Feb 26 15:48:26 crc kubenswrapper[4907]: I0226 15:48:26.196824 4907 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/ca3ab95b-79df-45b9-9ada-c7c713e2e3e6-v4-0-config-system-cliconfig\") on node \"crc\" DevicePath \"\""
Feb 26 15:48:26 crc kubenswrapper[4907]: I0226 15:48:26.811775 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-558db77b4-lr7kc" event={"ID":"ca3ab95b-79df-45b9-9ada-c7c713e2e3e6","Type":"ContainerDied","Data":"76f1c57e1232c564d9ac0fc7831515e08af8edd2563f06b9c8cfa68af813c51e"}
Feb 26 15:48:26 crc kubenswrapper[4907]: I0226 15:48:26.811924 4907 scope.go:117] "RemoveContainer" containerID="e7a064d46f10da05acc9a52ec9b08660db6497072fc921c6fcb4b4f75a91b427"
Feb 26 15:48:26 crc kubenswrapper[4907]: I0226 15:48:26.811937 4907 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-authentication/oauth-openshift-558db77b4-lr7kc"
Feb 26 15:48:27 crc kubenswrapper[4907]: I0226 15:48:27.866891 4907 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator-operator"/"openshift-service-ca.crt"
Feb 26 15:48:28 crc kubenswrapper[4907]: I0226 15:48:28.153223 4907 status_manager.go:861] "Pod was deleted and then recreated, skipping status update" pod="openshift-kube-apiserver/kube-apiserver-crc" oldPodUID="71bb4a3aecc4ba5b26c4b7318770ce13" podUID="c14bb07e-46ac-41bf-b43a-9e65cf52602e"
Feb 26 15:48:28 crc kubenswrapper[4907]: I0226 15:48:28.255215 4907 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-config-operator"/"config-operator-serving-cert"
Feb 26 15:48:28 crc kubenswrapper[4907]: I0226 15:48:28.376671 4907 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-controller-dockercfg-c2lfx"
Feb 26 15:48:29 crc kubenswrapper[4907]: I0226 15:48:29.590001 4907 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca"/"openshift-service-ca.crt"
Feb 26 15:48:30 crc kubenswrapper[4907]: I0226 15:48:30.027112 4907 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-samples-operator"/"kube-root-ca.crt"
Feb 26 15:48:30 crc kubenswrapper[4907]: I0226 15:48:30.178237 4907 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca-operator"/"service-ca-operator-config"
Feb 26 15:48:30 crc kubenswrapper[4907]: I0226 15:48:30.296619 4907 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"config"
Feb 26 15:48:30 crc kubenswrapper[4907]: I0226 15:48:30.416456 4907 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca"/"signing-cabundle"
Feb 26 15:48:30 crc kubenswrapper[4907]: I0226 15:48:30.651188 4907 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"proxy-tls"
Feb 26 15:48:30 crc kubenswrapper[4907]: I0226 15:48:30.782720 4907 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-network-node-identity"/"network-node-identity-cert"
Feb 26 15:48:30 crc kubenswrapper[4907]: I0226 15:48:30.846708 4907 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-apiserver-operator"/"kube-apiserver-operator-config"
Feb 26 15:48:31 crc kubenswrapper[4907]: I0226 15:48:31.068218 4907 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager-operator"/"openshift-controller-manager-operator-serving-cert"
Feb 26 15:48:31 crc kubenswrapper[4907]: I0226 15:48:31.240748 4907 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"pprof-cert"
Feb 26 15:48:31 crc kubenswrapper[4907]: I0226 15:48:31.385334 4907 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns"/"openshift-service-ca.crt"
Feb 26 15:48:31 crc kubenswrapper[4907]: I0226 15:48:31.526132 4907 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"etcd-ca-bundle"
Feb 26 15:48:31 crc kubenswrapper[4907]: I0226 15:48:31.574906 4907 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"etcd-operator-config"
Feb 26 15:48:31 crc kubenswrapper[4907]: I0226 15:48:31.630554 4907 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-samples-operator"/"openshift-service-ca.crt"
Feb 26 15:48:31 crc kubenswrapper[4907]: I0226 15:48:31.804780 4907 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-controller-manager/kube-controller-manager-crc"
Feb 26 15:48:31 crc kubenswrapper[4907]: I0226 15:48:31.911916 4907 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-version"/"cluster-version-operator-serving-cert"
Feb 26 15:48:31 crc kubenswrapper[4907]: I0226 15:48:31.958977 4907 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"env-overrides"
Feb 26 15:48:32 crc kubenswrapper[4907]: I0226 15:48:32.110145 4907 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager-operator"/"openshift-controller-manager-operator-dockercfg-vw8fw"
Feb 26 15:48:32 crc kubenswrapper[4907]: I0226 15:48:32.182368 4907 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"openshift-service-ca.crt"
Feb 26 15:48:32 crc kubenswrapper[4907]: I0226 15:48:32.299911 4907 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console-operator"/"console-operator-dockercfg-4xjcr"
Feb 26 15:48:32 crc kubenswrapper[4907]: I0226 15:48:32.577831 4907 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"packageserver-service-cert"
Feb 26 15:48:32 crc kubenswrapper[4907]: I0226 15:48:32.799896 4907 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"kube-root-ca.crt"
Feb 26 15:48:32 crc kubenswrapper[4907]: I0226 15:48:32.957937 4907 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-service-ca.crt"
Feb 26 15:48:32 crc kubenswrapper[4907]: I0226 15:48:32.961644 4907 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"olm-operator-serving-cert"
Feb 26 15:48:33 crc kubenswrapper[4907]: I0226 15:48:33.147130 4907 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-operator"/"kube-root-ca.crt"
Feb 26 15:48:33 crc kubenswrapper[4907]: I0226 15:48:33.180729 4907 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"trusted-ca"
Feb 26 15:48:33 crc kubenswrapper[4907]: I0226 15:48:33.243735 4907 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"multus-admission-controller-secret"
Feb 26 15:48:33 crc kubenswrapper[4907]: I0226 15:48:33.257161 4907 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"kube-root-ca.crt"
Feb 26 15:48:33 crc kubenswrapper[4907]: I0226 15:48:33.480772 4907 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-diagnostics"/"kube-root-ca.crt"
Feb 26 15:48:33 crc kubenswrapper[4907]: I0226 15:48:33.837640 4907 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"metrics-daemon-secret"
Feb 26 15:48:33 crc kubenswrapper[4907]: I0226 15:48:33.955239 4907 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"route-controller-manager-sa-dockercfg-h2zr2"
Feb 26 15:48:33 crc kubenswrapper[4907]: I0226 15:48:33.972478 4907 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"cluster-image-registry-operator-dockercfg-m4qtx"
Feb 26 15:48:33 crc kubenswrapper[4907]: I0226 15:48:33.986485 4907 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-marketplace"/"openshift-service-ca.crt"
Feb 26 15:48:34 crc kubenswrapper[4907]: I0226 15:48:34.009112 4907 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"openshift-service-ca.crt"
Feb 26 15:48:34 crc kubenswrapper[4907]: I0226 15:48:34.409324 4907 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"oauth-serving-cert"
Feb 26 15:48:34 crc kubenswrapper[4907]: I0226 15:48:34.523810 4907 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"image-registry-tls"
Feb 26 15:48:34 crc kubenswrapper[4907]: I0226 15:48:34.773219 4907 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"mcc-proxy-tls"
Feb 26 15:48:34 crc kubenswrapper[4907]: I0226 15:48:34.873947 4907 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager-operator"/"openshift-service-ca.crt"
Feb 26 15:48:34 crc kubenswrapper[4907]: I0226 15:48:34.879559 4907 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"openshift-apiserver-sa-dockercfg-djjff"
Feb 26 15:48:34 crc kubenswrapper[4907]: I0226 15:48:34.884908 4907 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"machine-approver-config"
Feb 26 15:48:34 crc kubenswrapper[4907]: I0226 15:48:34.899117 4907 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"openshift-controller-manager-sa-dockercfg-msq4c"
Feb 26 15:48:34 crc kubenswrapper[4907]: I0226 15:48:34.927664 4907 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"kube-root-ca.crt"
Feb 26 15:48:35 crc kubenswrapper[4907]: I0226 15:48:35.025698 4907 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"package-server-manager-serving-cert"
Feb 26 15:48:35 crc kubenswrapper[4907]: I0226 15:48:35.182284 4907 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca-operator"/"openshift-service-ca.crt"
Feb 26 15:48:35 crc kubenswrapper[4907]: I0226 15:48:35.243173 4907 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"kube-root-ca.crt"
Feb 26 15:48:35 crc kubenswrapper[4907]: I0226 15:48:35.268518 4907 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"openshift-service-ca.crt"
Feb 26 15:48:35 crc kubenswrapper[4907]: I0226 15:48:35.337709 4907 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-daemon-dockercfg-r5tcq"
Feb 26 15:48:35 crc kubenswrapper[4907]: I0226 15:48:35.532225 4907 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager-operator"/"kube-root-ca.crt"
Feb 26 15:48:35 crc kubenswrapper[4907]: I0226 15:48:35.582514 4907 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"etcd-service-ca-bundle"
Feb 26 15:48:35 crc kubenswrapper[4907]: I0226 15:48:35.613720 4907 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-network-operator"/"metrics-tls"
Feb 26 15:48:35 crc kubenswrapper[4907]: I0226 15:48:35.759222 4907 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-stats-default"
Feb 26 15:48:36 crc kubenswrapper[4907]: I0226 15:48:36.018712 4907 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication-operator"/"serving-cert"
Feb 26 15:48:36 crc kubenswrapper[4907]: I0226 15:48:36.098408 4907 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"kube-root-ca.crt"
Feb 26 15:48:36 crc kubenswrapper[4907]: I0226 15:48:36.220935 4907 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-storage-version-migrator-operator"/"kube-storage-version-migrator-operator-dockercfg-2bh8d"
Feb 26 15:48:36 crc kubenswrapper[4907]: I0226 15:48:36.243403 4907 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca-operator"/"kube-root-ca.crt"
Feb 26 15:48:36 crc kubenswrapper[4907]: I0226 15:48:36.282478 4907 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-kubernetes-node-dockercfg-pwtwl"
Feb 26 15:48:36 crc kubenswrapper[4907]: I0226 15:48:36.364066 4907 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"console-serving-cert"
Feb 26 15:48:36 crc kubenswrapper[4907]: I0226 15:48:36.375698 4907 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-diagnostics"/"openshift-service-ca.crt"
Feb 26 15:48:36 crc kubenswrapper[4907]: I0226 15:48:36.477705 4907 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca-operator"/"serving-cert"
Feb 26 15:48:36 crc kubenswrapper[4907]: I0226 15:48:36.490278 4907 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"multus-ac-dockercfg-9lkdf"
Feb 26 15:48:36 crc kubenswrapper[4907]: I0226 15:48:36.652625 4907 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-server-dockercfg-qx5rd"
Feb 26 15:48:36 crc kubenswrapper[4907]: I0226 15:48:36.708643 4907 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"openshift-service-ca.crt"
Feb 26 15:48:36 crc kubenswrapper[4907]: I0226 15:48:36.775491 4907 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager-operator"/"openshift-controller-manager-operator-config"
Feb 26 15:48:36 crc kubenswrapper[4907]: I0226 15:48:36.841461 4907 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"redhat-marketplace-dockercfg-x2ctb"
Feb 26 15:48:36 crc kubenswrapper[4907]: I0226 15:48:36.884993 4907 reflector.go:368] Caches populated for *v1.Service from k8s.io/client-go/informers/factory.go:160
Feb 26 15:48:36 crc kubenswrapper[4907]: I0226 15:48:36.904686 4907 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"openshift-service-ca.crt"
Feb 26 15:48:37 crc kubenswrapper[4907]: I0226 15:48:37.069025 4907 reflector.go:368] Caches populated for *v1.Secret from object-"hostpath-provisioner"/"csi-hostpath-provisioner-sa-dockercfg-qd74k"
Feb 26 15:48:37 crc kubenswrapper[4907]: I0226 15:48:37.092208 4907 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress"/"kube-root-ca.crt"
Feb 26 15:48:37 crc kubenswrapper[4907]: I0226 15:48:37.152333 4907 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"serving-cert"
Feb 26 15:48:37 crc kubenswrapper[4907]: I0226 15:48:37.201524 4907 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-server-tls"
Feb 26 15:48:37 crc kubenswrapper[4907]: I0226 15:48:37.218569 4907 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator"/"kube-root-ca.crt"
Feb 26 15:48:37 crc kubenswrapper[4907]: I0226 15:48:37.488574 4907 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"kube-root-ca.crt"
Feb 26 15:48:37 crc kubenswrapper[4907]: I0226 15:48:37.530066 4907 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns"/"dns-default-metrics-tls"
Feb 26 15:48:37 crc kubenswrapper[4907]: I0226 15:48:37.687313 4907 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver-operator"/"openshift-apiserver-operator-serving-cert"
Feb 26 15:48:37 crc kubenswrapper[4907]: I0226 15:48:37.712316 4907 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-operator"/"iptables-alerter-script"
Feb 26 15:48:37 crc kubenswrapper[4907]: I0226 15:48:37.716966 4907 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-version"/"openshift-service-ca.crt"
Feb 26 15:48:37 crc kubenswrapper[4907]: I0226 15:48:37.769916 4907 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"trusted-ca-bundle"
Feb 26 15:48:37 crc kubenswrapper[4907]: I0226 15:48:37.891231 4907 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-config-operator"/"kube-root-ca.crt"
Feb 26 15:48:37 crc kubenswrapper[4907]: I0226 15:48:37.962236 4907 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"kube-root-ca.crt"
Feb 26 15:48:38 crc kubenswrapper[4907]: I0226 15:48:38.112214 4907 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"etcd-serving-ca"
Feb 26 15:48:38 crc kubenswrapper[4907]: I0226 15:48:38.159400 4907 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"serving-cert"
Feb 26 15:48:38 crc kubenswrapper[4907]: I0226 15:48:38.229908 4907 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"kube-root-ca.crt"
Feb 26 15:48:38 crc kubenswrapper[4907]: I0226 15:48:38.366563 4907 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"openshift-service-ca.crt"
Feb 26 15:48:38 crc kubenswrapper[4907]: I0226 15:48:38.376689 4907 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"registry-dockercfg-kzzsd"
Feb 26 15:48:38 crc kubenswrapper[4907]: I0226 15:48:38.438935 4907 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"cni-copy-resources"
Feb 26 15:48:38 crc kubenswrapper[4907]: I0226 15:48:38.463302 4907 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-certs-default"
Feb 26 15:48:38 crc kubenswrapper[4907]: I0226 15:48:38.470718 4907 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication-operator"/"authentication-operator-dockercfg-mz9bj"
Feb 26 15:48:38 crc kubenswrapper[4907]: I0226 15:48:38.574460 4907 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"ovnkube-config"
Feb 26 15:48:38 crc kubenswrapper[4907]: I0226 15:48:38.695457 4907 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator-operator"/"kube-root-ca.crt"
Feb 26 15:48:38 crc kubenswrapper[4907]: I0226 15:48:38.722051 4907 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"kube-rbac-proxy"
Feb 26 15:48:38 crc kubenswrapper[4907]: I0226 15:48:38.756552 4907 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-storage-version-migrator-operator"/"serving-cert"
Feb 26 15:48:38 crc kubenswrapper[4907]: I0226 15:48:38.760403 4907 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-config-operator"/"openshift-service-ca.crt"
Feb 26 15:48:38 crc kubenswrapper[4907]: I0226 15:48:38.779237 4907 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"env-overrides"
Feb 26 15:48:38 crc kubenswrapper[4907]: I0226 15:48:38.855919 4907 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"openshift-service-ca.crt"
Feb 26 15:48:38 crc kubenswrapper[4907]: I0226 15:48:38.870620 4907 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"etcd-client"
Feb 26 15:48:38 crc kubenswrapper[4907]: I0226 15:48:38.883310 4907 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-storage-version-migrator"/"kube-storage-version-migrator-sa-dockercfg-5xfcg"
Feb 26 15:48:38 crc kubenswrapper[4907]: I0226 15:48:38.997091 4907 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-samples-operator"/"cluster-samples-operator-dockercfg-xpp9w"
Feb 26 15:48:38 crc kubenswrapper[4907]: I0226 15:48:38.999883 4907 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-canary"/"openshift-service-ca.crt"
Feb 26 15:48:39 crc kubenswrapper[4907]: I0226 15:48:39.193893 4907 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"encryption-config-1"
Feb 26 15:48:39 crc kubenswrapper[4907]: I0226 15:48:39.240992 4907 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-kubernetes-control-plane-dockercfg-gs7dd"
Feb 26 15:48:39 crc kubenswrapper[4907]: I0226 15:48:39.433253 4907 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"console-dockercfg-f62pw"
Feb 26 15:48:39 crc kubenswrapper[4907]: I0226 15:48:39.466503 4907 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-scheduler-operator"/"openshift-kube-scheduler-operator-dockercfg-qt55r"
Feb 26 15:48:39 crc kubenswrapper[4907]: I0226 15:48:39.504546 4907 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"multus-daemon-config"
Feb 26 15:48:39 crc kubenswrapper[4907]: I0226 15:48:39.533928 4907 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-scheduler-operator"/"kube-scheduler-operator-serving-cert"
Feb 26 15:48:39 crc kubenswrapper[4907]: I0226 15:48:39.555953 4907 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"trusted-ca"
Feb 26 15:48:39 crc kubenswrapper[4907]: I0226 15:48:39.558932 4907 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-controller-manager-operator"/"kube-controller-manager-operator-serving-cert"
Feb 26 15:48:39 crc kubenswrapper[4907]: I0226 15:48:39.565623 4907 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-operator"/"openshift-service-ca.crt"
Feb 26 15:48:39 crc kubenswrapper[4907]: I0226 15:48:39.620285 4907 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"openshift-service-ca.crt"
Feb 26 15:48:39 crc kubenswrapper[4907]: I0226 15:48:39.625010 4907 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"trusted-ca-bundle"
Feb 26 15:48:39 crc kubenswrapper[4907]: I0226 15:48:39.641197 4907 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"openshift-service-ca.crt"
Feb 26 15:48:39 crc kubenswrapper[4907]: I0226 15:48:39.682669 4907 reflector.go:368] Caches populated for *v1.Pod from pkg/kubelet/config/apiserver.go:66
Feb 26 15:48:39 crc kubenswrapper[4907]: I0226 15:48:39.684583 4907 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" podStartSLOduration=39.684567884 podStartE2EDuration="39.684567884s" podCreationTimestamp="2026-02-26 15:48:00 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-26 15:48:19.665701451 +0000 UTC m=+362.184263310" watchObservedRunningTime="2026-02-26 15:48:39.684567884 +0000 UTC m=+382.203129743"
Feb 26 15:48:39 crc kubenswrapper[4907]: I0226 15:48:39.687742 4907 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-kube-apiserver/kube-apiserver-crc","openshift-authentication/oauth-openshift-558db77b4-lr7kc"]
Feb 26 15:48:39 crc kubenswrapper[4907]: I0226 15:48:39.687799 4907 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-apiserver/kube-apiserver-crc","openshift-authentication/oauth-openshift-59b95f96cf-255fr"]
Feb 26 15:48:39 crc kubenswrapper[4907]: E0226 15:48:39.687999 4907 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="abc73ba8-89c5-4844-a81e-742468c4366c" containerName="installer"
Feb 26 15:48:39 crc kubenswrapper[4907]: I0226 15:48:39.688019 4907 state_mem.go:107] "Deleted CPUSet assignment" podUID="abc73ba8-89c5-4844-a81e-742468c4366c" containerName="installer"
Feb 26 15:48:39 crc kubenswrapper[4907]: E0226 15:48:39.688035 4907 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6e761f1c-0a31-49e0-aee3-2ecd184291dc" containerName="oc"
Feb 26 15:48:39 crc kubenswrapper[4907]: I0226 15:48:39.688045 4907 state_mem.go:107] "Deleted CPUSet assignment" podUID="6e761f1c-0a31-49e0-aee3-2ecd184291dc" containerName="oc"
Feb 26 15:48:39 crc kubenswrapper[4907]: E0226 15:48:39.688058 4907 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ca3ab95b-79df-45b9-9ada-c7c713e2e3e6" containerName="oauth-openshift"
Feb 26 15:48:39 crc kubenswrapper[4907]: I0226 15:48:39.688067 4907 state_mem.go:107] "Deleted CPUSet assignment" podUID="ca3ab95b-79df-45b9-9ada-c7c713e2e3e6" containerName="oauth-openshift"
Feb 26 15:48:39 crc kubenswrapper[4907]: I0226 15:48:39.688209 4907 memory_manager.go:354] "RemoveStaleState removing state" podUID="6e761f1c-0a31-49e0-aee3-2ecd184291dc" containerName="oc"
Feb 26 15:48:39 crc kubenswrapper[4907]: I0226 15:48:39.688229 4907 memory_manager.go:354] "RemoveStaleState removing state" podUID="ca3ab95b-79df-45b9-9ada-c7c713e2e3e6" containerName="oauth-openshift"
Feb 26 15:48:39 crc kubenswrapper[4907]: I0226 15:48:39.688241 4907 memory_manager.go:354] "RemoveStaleState removing state" podUID="abc73ba8-89c5-4844-a81e-742468c4366c" containerName="installer"
Feb 26 15:48:39 crc kubenswrapper[4907]: I0226 15:48:39.688252 4907 kubelet.go:1909] "Trying to delete pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="27c9ab80-fcc8-4c5a-9d89-c0504e0e6396"
Feb 26 15:48:39 crc kubenswrapper[4907]: I0226 15:48:39.688272 4907 mirror_client.go:130] "Deleting a mirror pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="27c9ab80-fcc8-4c5a-9d89-c0504e0e6396"
Feb 26 15:48:39 crc kubenswrapper[4907]: I0226 15:48:39.688625 4907 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29535348-8k2tp"]
Feb 26 15:48:39 crc kubenswrapper[4907]: I0226 15:48:39.688696 4907 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-authentication/oauth-openshift-59b95f96cf-255fr"
Feb 26 15:48:39 crc kubenswrapper[4907]: I0226 15:48:39.696478 4907 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-error"
Feb 26 15:48:39 crc kubenswrapper[4907]: I0226 15:48:39.696741 4907 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"v4-0-config-system-service-ca"
Feb 26 15:48:39 crc kubenswrapper[4907]: I0226 15:48:39.697432 4907 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"audit"
Feb 26 15:48:39 crc kubenswrapper[4907]: I0226 15:48:39.698424 4907 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"kube-root-ca.crt"
Feb 26 15:48:39 crc kubenswrapper[4907]: I0226 15:48:39.698436 4907 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-provider-selection"
Feb 26 15:48:39 crc kubenswrapper[4907]: I0226 15:48:39.698616 4907 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"v4-0-config-system-cliconfig"
Feb 26 15:48:39 crc kubenswrapper[4907]: I0226 15:48:39.698712 4907 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-session"
Feb 26 15:48:39 crc kubenswrapper[4907]: I0226 15:48:39.698792 4907 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-idp-0-file-data"
Feb 26 15:48:39 crc kubenswrapper[4907]: I0226 15:48:39.698898 4907 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-serving-cert"
Feb 26 15:48:39 crc kubenswrapper[4907]: I0226 15:48:39.698911 4907 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"openshift-service-ca.crt"
Feb 26 15:48:39 crc kubenswrapper[4907]: I0226 15:48:39.699076 4907 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-router-certs"
Feb 26 15:48:39 crc kubenswrapper[4907]: I0226 15:48:39.698957 4907 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"oauth-openshift-dockercfg-znhcc"
Feb 26 15:48:39 crc kubenswrapper[4907]: I0226 15:48:39.702310 4907 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-apiserver/kube-apiserver-crc"
Feb 26 15:48:39 crc kubenswrapper[4907]: I0226 15:48:39.714912 4907 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"v4-0-config-system-trusted-ca-bundle"
Feb 26 15:48:39 crc kubenswrapper[4907]: I0226 15:48:39.715816 4907 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-login"
Feb 26 15:48:39 crc kubenswrapper[4907]: I0226 15:48:39.723760 4907 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-apiserver/kube-apiserver-crc" podStartSLOduration=20.723740751 podStartE2EDuration="20.723740751s" podCreationTimestamp="2026-02-26 15:48:19 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-26 15:48:39.721001662 +0000 UTC m=+382.239563521" watchObservedRunningTime="2026-02-26 15:48:39.723740751 +0000 UTC m=+382.242302620"
Feb 26 15:48:39 crc kubenswrapper[4907]: I0226 15:48:39.724294 4907 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-ocp-branding-template"
Feb 26 15:48:39 crc kubenswrapper[4907]: I0226 15:48:39.738968 4907 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver-operator"/"kube-root-ca.crt"
Feb 26 15:48:39 crc kubenswrapper[4907]: I0226 15:48:39.753041 4907 reflector.go:368] Caches populated
for *v1.Secret from object-"openshift-ingress-canary"/"canary-serving-cert" Feb 26 15:48:39 crc kubenswrapper[4907]: I0226 15:48:39.772770 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/54457a79-b0ee-40f0-bf8e-b01268b8b391-v4-0-config-user-template-error\") pod \"oauth-openshift-59b95f96cf-255fr\" (UID: \"54457a79-b0ee-40f0-bf8e-b01268b8b391\") " pod="openshift-authentication/oauth-openshift-59b95f96cf-255fr" Feb 26 15:48:39 crc kubenswrapper[4907]: I0226 15:48:39.772839 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/54457a79-b0ee-40f0-bf8e-b01268b8b391-audit-dir\") pod \"oauth-openshift-59b95f96cf-255fr\" (UID: \"54457a79-b0ee-40f0-bf8e-b01268b8b391\") " pod="openshift-authentication/oauth-openshift-59b95f96cf-255fr" Feb 26 15:48:39 crc kubenswrapper[4907]: I0226 15:48:39.772871 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/54457a79-b0ee-40f0-bf8e-b01268b8b391-v4-0-config-user-template-login\") pod \"oauth-openshift-59b95f96cf-255fr\" (UID: \"54457a79-b0ee-40f0-bf8e-b01268b8b391\") " pod="openshift-authentication/oauth-openshift-59b95f96cf-255fr" Feb 26 15:48:39 crc kubenswrapper[4907]: I0226 15:48:39.772924 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/54457a79-b0ee-40f0-bf8e-b01268b8b391-v4-0-config-system-service-ca\") pod \"oauth-openshift-59b95f96cf-255fr\" (UID: \"54457a79-b0ee-40f0-bf8e-b01268b8b391\") " pod="openshift-authentication/oauth-openshift-59b95f96cf-255fr" Feb 26 15:48:39 crc kubenswrapper[4907]: I0226 15:48:39.772993 4907 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ct65v\" (UniqueName: \"kubernetes.io/projected/54457a79-b0ee-40f0-bf8e-b01268b8b391-kube-api-access-ct65v\") pod \"oauth-openshift-59b95f96cf-255fr\" (UID: \"54457a79-b0ee-40f0-bf8e-b01268b8b391\") " pod="openshift-authentication/oauth-openshift-59b95f96cf-255fr" Feb 26 15:48:39 crc kubenswrapper[4907]: I0226 15:48:39.773071 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/54457a79-b0ee-40f0-bf8e-b01268b8b391-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-59b95f96cf-255fr\" (UID: \"54457a79-b0ee-40f0-bf8e-b01268b8b391\") " pod="openshift-authentication/oauth-openshift-59b95f96cf-255fr" Feb 26 15:48:39 crc kubenswrapper[4907]: I0226 15:48:39.773124 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/54457a79-b0ee-40f0-bf8e-b01268b8b391-v4-0-config-system-serving-cert\") pod \"oauth-openshift-59b95f96cf-255fr\" (UID: \"54457a79-b0ee-40f0-bf8e-b01268b8b391\") " pod="openshift-authentication/oauth-openshift-59b95f96cf-255fr" Feb 26 15:48:39 crc kubenswrapper[4907]: I0226 15:48:39.773144 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/54457a79-b0ee-40f0-bf8e-b01268b8b391-v4-0-config-system-ocp-branding-template\") pod \"oauth-openshift-59b95f96cf-255fr\" (UID: \"54457a79-b0ee-40f0-bf8e-b01268b8b391\") " pod="openshift-authentication/oauth-openshift-59b95f96cf-255fr" Feb 26 15:48:39 crc kubenswrapper[4907]: I0226 15:48:39.773163 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: 
\"kubernetes.io/secret/54457a79-b0ee-40f0-bf8e-b01268b8b391-v4-0-config-user-idp-0-file-data\") pod \"oauth-openshift-59b95f96cf-255fr\" (UID: \"54457a79-b0ee-40f0-bf8e-b01268b8b391\") " pod="openshift-authentication/oauth-openshift-59b95f96cf-255fr" Feb 26 15:48:39 crc kubenswrapper[4907]: I0226 15:48:39.773184 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/54457a79-b0ee-40f0-bf8e-b01268b8b391-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-59b95f96cf-255fr\" (UID: \"54457a79-b0ee-40f0-bf8e-b01268b8b391\") " pod="openshift-authentication/oauth-openshift-59b95f96cf-255fr" Feb 26 15:48:39 crc kubenswrapper[4907]: I0226 15:48:39.773207 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/54457a79-b0ee-40f0-bf8e-b01268b8b391-v4-0-config-system-router-certs\") pod \"oauth-openshift-59b95f96cf-255fr\" (UID: \"54457a79-b0ee-40f0-bf8e-b01268b8b391\") " pod="openshift-authentication/oauth-openshift-59b95f96cf-255fr" Feb 26 15:48:39 crc kubenswrapper[4907]: I0226 15:48:39.773225 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/54457a79-b0ee-40f0-bf8e-b01268b8b391-audit-policies\") pod \"oauth-openshift-59b95f96cf-255fr\" (UID: \"54457a79-b0ee-40f0-bf8e-b01268b8b391\") " pod="openshift-authentication/oauth-openshift-59b95f96cf-255fr" Feb 26 15:48:39 crc kubenswrapper[4907]: I0226 15:48:39.773248 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/54457a79-b0ee-40f0-bf8e-b01268b8b391-v4-0-config-system-session\") pod \"oauth-openshift-59b95f96cf-255fr\" (UID: 
\"54457a79-b0ee-40f0-bf8e-b01268b8b391\") " pod="openshift-authentication/oauth-openshift-59b95f96cf-255fr" Feb 26 15:48:39 crc kubenswrapper[4907]: I0226 15:48:39.773293 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/54457a79-b0ee-40f0-bf8e-b01268b8b391-v4-0-config-system-cliconfig\") pod \"oauth-openshift-59b95f96cf-255fr\" (UID: \"54457a79-b0ee-40f0-bf8e-b01268b8b391\") " pod="openshift-authentication/oauth-openshift-59b95f96cf-255fr" Feb 26 15:48:39 crc kubenswrapper[4907]: I0226 15:48:39.789408 4907 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"kube-root-ca.crt" Feb 26 15:48:39 crc kubenswrapper[4907]: I0226 15:48:39.850420 4907 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"etcd-serving-ca" Feb 26 15:48:39 crc kubenswrapper[4907]: I0226 15:48:39.851948 4907 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns"/"dns-dockercfg-jwfmh" Feb 26 15:48:39 crc kubenswrapper[4907]: I0226 15:48:39.874990 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ct65v\" (UniqueName: \"kubernetes.io/projected/54457a79-b0ee-40f0-bf8e-b01268b8b391-kube-api-access-ct65v\") pod \"oauth-openshift-59b95f96cf-255fr\" (UID: \"54457a79-b0ee-40f0-bf8e-b01268b8b391\") " pod="openshift-authentication/oauth-openshift-59b95f96cf-255fr" Feb 26 15:48:39 crc kubenswrapper[4907]: I0226 15:48:39.875068 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/54457a79-b0ee-40f0-bf8e-b01268b8b391-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-59b95f96cf-255fr\" (UID: \"54457a79-b0ee-40f0-bf8e-b01268b8b391\") " pod="openshift-authentication/oauth-openshift-59b95f96cf-255fr" Feb 26 
15:48:39 crc kubenswrapper[4907]: I0226 15:48:39.875117 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/54457a79-b0ee-40f0-bf8e-b01268b8b391-v4-0-config-system-serving-cert\") pod \"oauth-openshift-59b95f96cf-255fr\" (UID: \"54457a79-b0ee-40f0-bf8e-b01268b8b391\") " pod="openshift-authentication/oauth-openshift-59b95f96cf-255fr" Feb 26 15:48:39 crc kubenswrapper[4907]: I0226 15:48:39.875146 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/54457a79-b0ee-40f0-bf8e-b01268b8b391-v4-0-config-system-ocp-branding-template\") pod \"oauth-openshift-59b95f96cf-255fr\" (UID: \"54457a79-b0ee-40f0-bf8e-b01268b8b391\") " pod="openshift-authentication/oauth-openshift-59b95f96cf-255fr" Feb 26 15:48:39 crc kubenswrapper[4907]: I0226 15:48:39.875180 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/54457a79-b0ee-40f0-bf8e-b01268b8b391-v4-0-config-user-idp-0-file-data\") pod \"oauth-openshift-59b95f96cf-255fr\" (UID: \"54457a79-b0ee-40f0-bf8e-b01268b8b391\") " pod="openshift-authentication/oauth-openshift-59b95f96cf-255fr" Feb 26 15:48:39 crc kubenswrapper[4907]: I0226 15:48:39.875218 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/54457a79-b0ee-40f0-bf8e-b01268b8b391-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-59b95f96cf-255fr\" (UID: \"54457a79-b0ee-40f0-bf8e-b01268b8b391\") " pod="openshift-authentication/oauth-openshift-59b95f96cf-255fr" Feb 26 15:48:39 crc kubenswrapper[4907]: I0226 15:48:39.875255 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-router-certs\" 
(UniqueName: \"kubernetes.io/secret/54457a79-b0ee-40f0-bf8e-b01268b8b391-v4-0-config-system-router-certs\") pod \"oauth-openshift-59b95f96cf-255fr\" (UID: \"54457a79-b0ee-40f0-bf8e-b01268b8b391\") " pod="openshift-authentication/oauth-openshift-59b95f96cf-255fr" Feb 26 15:48:39 crc kubenswrapper[4907]: I0226 15:48:39.875286 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/54457a79-b0ee-40f0-bf8e-b01268b8b391-audit-policies\") pod \"oauth-openshift-59b95f96cf-255fr\" (UID: \"54457a79-b0ee-40f0-bf8e-b01268b8b391\") " pod="openshift-authentication/oauth-openshift-59b95f96cf-255fr" Feb 26 15:48:39 crc kubenswrapper[4907]: I0226 15:48:39.875318 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/54457a79-b0ee-40f0-bf8e-b01268b8b391-v4-0-config-system-session\") pod \"oauth-openshift-59b95f96cf-255fr\" (UID: \"54457a79-b0ee-40f0-bf8e-b01268b8b391\") " pod="openshift-authentication/oauth-openshift-59b95f96cf-255fr" Feb 26 15:48:39 crc kubenswrapper[4907]: I0226 15:48:39.875359 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/54457a79-b0ee-40f0-bf8e-b01268b8b391-v4-0-config-system-cliconfig\") pod \"oauth-openshift-59b95f96cf-255fr\" (UID: \"54457a79-b0ee-40f0-bf8e-b01268b8b391\") " pod="openshift-authentication/oauth-openshift-59b95f96cf-255fr" Feb 26 15:48:39 crc kubenswrapper[4907]: I0226 15:48:39.875410 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/54457a79-b0ee-40f0-bf8e-b01268b8b391-v4-0-config-user-template-error\") pod \"oauth-openshift-59b95f96cf-255fr\" (UID: \"54457a79-b0ee-40f0-bf8e-b01268b8b391\") " pod="openshift-authentication/oauth-openshift-59b95f96cf-255fr" Feb 26 
15:48:39 crc kubenswrapper[4907]: I0226 15:48:39.875461 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/54457a79-b0ee-40f0-bf8e-b01268b8b391-audit-dir\") pod \"oauth-openshift-59b95f96cf-255fr\" (UID: \"54457a79-b0ee-40f0-bf8e-b01268b8b391\") " pod="openshift-authentication/oauth-openshift-59b95f96cf-255fr" Feb 26 15:48:39 crc kubenswrapper[4907]: I0226 15:48:39.875494 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/54457a79-b0ee-40f0-bf8e-b01268b8b391-v4-0-config-user-template-login\") pod \"oauth-openshift-59b95f96cf-255fr\" (UID: \"54457a79-b0ee-40f0-bf8e-b01268b8b391\") " pod="openshift-authentication/oauth-openshift-59b95f96cf-255fr" Feb 26 15:48:39 crc kubenswrapper[4907]: I0226 15:48:39.875523 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/54457a79-b0ee-40f0-bf8e-b01268b8b391-v4-0-config-system-service-ca\") pod \"oauth-openshift-59b95f96cf-255fr\" (UID: \"54457a79-b0ee-40f0-bf8e-b01268b8b391\") " pod="openshift-authentication/oauth-openshift-59b95f96cf-255fr" Feb 26 15:48:39 crc kubenswrapper[4907]: I0226 15:48:39.876700 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/54457a79-b0ee-40f0-bf8e-b01268b8b391-v4-0-config-system-cliconfig\") pod \"oauth-openshift-59b95f96cf-255fr\" (UID: \"54457a79-b0ee-40f0-bf8e-b01268b8b391\") " pod="openshift-authentication/oauth-openshift-59b95f96cf-255fr" Feb 26 15:48:39 crc kubenswrapper[4907]: I0226 15:48:39.876815 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/54457a79-b0ee-40f0-bf8e-b01268b8b391-v4-0-config-system-service-ca\") pod 
\"oauth-openshift-59b95f96cf-255fr\" (UID: \"54457a79-b0ee-40f0-bf8e-b01268b8b391\") " pod="openshift-authentication/oauth-openshift-59b95f96cf-255fr" Feb 26 15:48:39 crc kubenswrapper[4907]: I0226 15:48:39.877189 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/54457a79-b0ee-40f0-bf8e-b01268b8b391-audit-policies\") pod \"oauth-openshift-59b95f96cf-255fr\" (UID: \"54457a79-b0ee-40f0-bf8e-b01268b8b391\") " pod="openshift-authentication/oauth-openshift-59b95f96cf-255fr" Feb 26 15:48:39 crc kubenswrapper[4907]: I0226 15:48:39.878660 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/54457a79-b0ee-40f0-bf8e-b01268b8b391-audit-dir\") pod \"oauth-openshift-59b95f96cf-255fr\" (UID: \"54457a79-b0ee-40f0-bf8e-b01268b8b391\") " pod="openshift-authentication/oauth-openshift-59b95f96cf-255fr" Feb 26 15:48:39 crc kubenswrapper[4907]: I0226 15:48:39.882586 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/54457a79-b0ee-40f0-bf8e-b01268b8b391-v4-0-config-user-idp-0-file-data\") pod \"oauth-openshift-59b95f96cf-255fr\" (UID: \"54457a79-b0ee-40f0-bf8e-b01268b8b391\") " pod="openshift-authentication/oauth-openshift-59b95f96cf-255fr" Feb 26 15:48:39 crc kubenswrapper[4907]: I0226 15:48:39.882724 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/54457a79-b0ee-40f0-bf8e-b01268b8b391-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-59b95f96cf-255fr\" (UID: \"54457a79-b0ee-40f0-bf8e-b01268b8b391\") " pod="openshift-authentication/oauth-openshift-59b95f96cf-255fr" Feb 26 15:48:39 crc kubenswrapper[4907]: I0226 15:48:39.884879 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/54457a79-b0ee-40f0-bf8e-b01268b8b391-v4-0-config-system-session\") pod \"oauth-openshift-59b95f96cf-255fr\" (UID: \"54457a79-b0ee-40f0-bf8e-b01268b8b391\") " pod="openshift-authentication/oauth-openshift-59b95f96cf-255fr" Feb 26 15:48:39 crc kubenswrapper[4907]: I0226 15:48:39.885454 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/54457a79-b0ee-40f0-bf8e-b01268b8b391-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-59b95f96cf-255fr\" (UID: \"54457a79-b0ee-40f0-bf8e-b01268b8b391\") " pod="openshift-authentication/oauth-openshift-59b95f96cf-255fr" Feb 26 15:48:39 crc kubenswrapper[4907]: I0226 15:48:39.887403 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/54457a79-b0ee-40f0-bf8e-b01268b8b391-v4-0-config-system-router-certs\") pod \"oauth-openshift-59b95f96cf-255fr\" (UID: \"54457a79-b0ee-40f0-bf8e-b01268b8b391\") " pod="openshift-authentication/oauth-openshift-59b95f96cf-255fr" Feb 26 15:48:39 crc kubenswrapper[4907]: I0226 15:48:39.887795 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/54457a79-b0ee-40f0-bf8e-b01268b8b391-v4-0-config-user-template-error\") pod \"oauth-openshift-59b95f96cf-255fr\" (UID: \"54457a79-b0ee-40f0-bf8e-b01268b8b391\") " pod="openshift-authentication/oauth-openshift-59b95f96cf-255fr" Feb 26 15:48:39 crc kubenswrapper[4907]: I0226 15:48:39.888702 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/54457a79-b0ee-40f0-bf8e-b01268b8b391-v4-0-config-user-template-login\") pod \"oauth-openshift-59b95f96cf-255fr\" (UID: \"54457a79-b0ee-40f0-bf8e-b01268b8b391\") " 
pod="openshift-authentication/oauth-openshift-59b95f96cf-255fr" Feb 26 15:48:39 crc kubenswrapper[4907]: I0226 15:48:39.895380 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/54457a79-b0ee-40f0-bf8e-b01268b8b391-v4-0-config-system-ocp-branding-template\") pod \"oauth-openshift-59b95f96cf-255fr\" (UID: \"54457a79-b0ee-40f0-bf8e-b01268b8b391\") " pod="openshift-authentication/oauth-openshift-59b95f96cf-255fr" Feb 26 15:48:39 crc kubenswrapper[4907]: I0226 15:48:39.899401 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/54457a79-b0ee-40f0-bf8e-b01268b8b391-v4-0-config-system-serving-cert\") pod \"oauth-openshift-59b95f96cf-255fr\" (UID: \"54457a79-b0ee-40f0-bf8e-b01268b8b391\") " pod="openshift-authentication/oauth-openshift-59b95f96cf-255fr" Feb 26 15:48:39 crc kubenswrapper[4907]: I0226 15:48:39.906650 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ct65v\" (UniqueName: \"kubernetes.io/projected/54457a79-b0ee-40f0-bf8e-b01268b8b391-kube-api-access-ct65v\") pod \"oauth-openshift-59b95f96cf-255fr\" (UID: \"54457a79-b0ee-40f0-bf8e-b01268b8b391\") " pod="openshift-authentication/oauth-openshift-59b95f96cf-255fr" Feb 26 15:48:39 crc kubenswrapper[4907]: I0226 15:48:39.990241 4907 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-controller-manager-operator"/"kube-controller-manager-operator-config" Feb 26 15:48:40 crc kubenswrapper[4907]: I0226 15:48:40.012889 4907 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-authentication/oauth-openshift-59b95f96cf-255fr" Feb 26 15:48:40 crc kubenswrapper[4907]: I0226 15:48:40.067215 4907 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-node-metrics-cert" Feb 26 15:48:40 crc kubenswrapper[4907]: I0226 15:48:40.085035 4907 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-version"/"kube-root-ca.crt" Feb 26 15:48:40 crc kubenswrapper[4907]: I0226 15:48:40.098960 4907 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-machine-approver"/"machine-approver-tls" Feb 26 15:48:40 crc kubenswrapper[4907]: I0226 15:48:40.120089 4907 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-dockercfg-zdk86" Feb 26 15:48:40 crc kubenswrapper[4907]: I0226 15:48:40.134399 4907 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ca3ab95b-79df-45b9-9ada-c7c713e2e3e6" path="/var/lib/kubelet/pods/ca3ab95b-79df-45b9-9ada-c7c713e2e3e6/volumes" Feb 26 15:48:40 crc kubenswrapper[4907]: I0226 15:48:40.181579 4907 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator"/"openshift-service-ca.crt" Feb 26 15:48:40 crc kubenswrapper[4907]: I0226 15:48:40.282479 4907 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-operator"/"trusted-ca" Feb 26 15:48:40 crc kubenswrapper[4907]: I0226 15:48:40.592977 4907 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"marketplace-operator-dockercfg-5nsgg" Feb 26 15:48:40 crc kubenswrapper[4907]: I0226 15:48:40.640222 4907 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"console-operator-config" Feb 26 15:48:40 crc kubenswrapper[4907]: I0226 15:48:40.782624 4907 reflector.go:368] Caches populated for *v1.ConfigMap from 
object-"openshift-route-controller-manager"/"config" Feb 26 15:48:40 crc kubenswrapper[4907]: I0226 15:48:40.795748 4907 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-apiserver-operator"/"kube-apiserver-operator-serving-cert" Feb 26 15:48:40 crc kubenswrapper[4907]: I0226 15:48:40.866573 4907 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"trusted-ca-bundle" Feb 26 15:48:40 crc kubenswrapper[4907]: I0226 15:48:40.905175 4907 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"config" Feb 26 15:48:40 crc kubenswrapper[4907]: I0226 15:48:40.963687 4907 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"client-ca" Feb 26 15:48:40 crc kubenswrapper[4907]: I0226 15:48:40.963862 4907 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-apiserver-operator"/"kube-apiserver-operator-dockercfg-x57mr" Feb 26 15:48:41 crc kubenswrapper[4907]: I0226 15:48:41.073719 4907 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"kube-root-ca.crt" Feb 26 15:48:41 crc kubenswrapper[4907]: I0226 15:48:41.097265 4907 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"machine-config-operator-images" Feb 26 15:48:41 crc kubenswrapper[4907]: I0226 15:48:41.271749 4907 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-operator"/"kube-root-ca.crt" Feb 26 15:48:41 crc kubenswrapper[4907]: I0226 15:48:41.301365 4907 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"kube-root-ca.crt" Feb 26 15:48:41 crc kubenswrapper[4907]: I0226 15:48:41.305161 4907 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"control-plane-machine-set-operator-dockercfg-k9rxt" Feb 26 15:48:41 crc kubenswrapper[4907]: I0226 
15:48:41.319856 4907 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-etcd-operator"/"etcd-client" Feb 26 15:48:41 crc kubenswrapper[4907]: I0226 15:48:41.324486 4907 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"openshift-service-ca.crt" Feb 26 15:48:41 crc kubenswrapper[4907]: I0226 15:48:41.326382 4907 reflector.go:368] Caches populated for *v1.ConfigMap from object-"hostpath-provisioner"/"openshift-service-ca.crt" Feb 26 15:48:41 crc kubenswrapper[4907]: I0226 15:48:41.334147 4907 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns"/"dns-default" Feb 26 15:48:41 crc kubenswrapper[4907]: I0226 15:48:41.339447 4907 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-operator"/"metrics-tls" Feb 26 15:48:41 crc kubenswrapper[4907]: I0226 15:48:41.428547 4907 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca-operator"/"service-ca-operator-dockercfg-rg9jl" Feb 26 15:48:41 crc kubenswrapper[4907]: I0226 15:48:41.434759 4907 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"image-registry-certificates" Feb 26 15:48:41 crc kubenswrapper[4907]: I0226 15:48:41.477095 4907 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"audit-1" Feb 26 15:48:41 crc kubenswrapper[4907]: I0226 15:48:41.558846 4907 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns"/"kube-root-ca.crt" Feb 26 15:48:41 crc kubenswrapper[4907]: I0226 15:48:41.621915 4907 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator-operator"/"config" Feb 26 15:48:41 crc kubenswrapper[4907]: I0226 15:48:41.629377 4907 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-scheduler-operator"/"kube-root-ca.crt" Feb 26 15:48:41 crc kubenswrapper[4907]: I0226 15:48:41.665733 
4907 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"machine-api-operator-dockercfg-mfbb7"
Feb 26 15:48:41 crc kubenswrapper[4907]: I0226 15:48:41.672867 4907 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-canary"/"default-dockercfg-2llfx"
Feb 26 15:48:41 crc kubenswrapper[4907]: I0226 15:48:41.747454 4907 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"console-config"
Feb 26 15:48:42 crc kubenswrapper[4907]: I0226 15:48:42.012340 4907 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-etcd-operator"/"etcd-operator-serving-cert"
Feb 26 15:48:42 crc kubenswrapper[4907]: I0226 15:48:42.047140 4907 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"machine-api-operator-tls"
Feb 26 15:48:42 crc kubenswrapper[4907]: I0226 15:48:42.064218 4907 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-apiserver-operator"/"kube-root-ca.crt"
Feb 26 15:48:42 crc kubenswrapper[4907]: I0226 15:48:42.194569 4907 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress"/"service-ca-bundle"
Feb 26 15:48:42 crc kubenswrapper[4907]: I0226 15:48:42.350923 4907 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-marketplace"/"kube-root-ca.crt"
Feb 26 15:48:42 crc kubenswrapper[4907]: I0226 15:48:42.400681 4907 kubelet.go:2431] "SyncLoop REMOVE" source="file" pods=["openshift-kube-apiserver/kube-apiserver-startup-monitor-crc"]
Feb 26 15:48:42 crc kubenswrapper[4907]: I0226 15:48:42.401197 4907 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" containerName="startup-monitor" containerID="cri-o://b4f9fde9bfda905320bf8cf8897c11cf190969f123ff5644b5c3c62ce53613c4" gracePeriod=5
Feb 26 15:48:42 crc kubenswrapper[4907]: I0226 15:48:42.451387 4907 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca"/"service-ca-dockercfg-pn86c"
Feb 26 15:48:42 crc kubenswrapper[4907]: I0226 15:48:42.498833 4907 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns-operator"/"kube-root-ca.crt"
Feb 26 15:48:42 crc kubenswrapper[4907]: I0226 15:48:42.500292 4907 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"image-registry-operator-tls"
Feb 26 15:48:42 crc kubenswrapper[4907]: I0226 15:48:42.531223 4907 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"authentication-operator-config"
Feb 26 15:48:42 crc kubenswrapper[4907]: I0226 15:48:42.536742 4907 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"trusted-ca-bundle"
Feb 26 15:48:42 crc kubenswrapper[4907]: I0226 15:48:42.804206 4907 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"oauth-apiserver-sa-dockercfg-6r2bq"
Feb 26 15:48:42 crc kubenswrapper[4907]: I0226 15:48:42.805189 4907 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"serving-cert"
Feb 26 15:48:42 crc kubenswrapper[4907]: I0226 15:48:42.824759 4907 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"kube-rbac-proxy"
Feb 26 15:48:42 crc kubenswrapper[4907]: I0226 15:48:42.855523 4907 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"metrics-daemon-sa-dockercfg-d427c"
Feb 26 15:48:42 crc kubenswrapper[4907]: I0226 15:48:42.961024 4907 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"openshift-service-ca.crt"
Feb 26 15:48:43 crc kubenswrapper[4907]: I0226 15:48:42.999384 4907 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-authentication/oauth-openshift-59b95f96cf-255fr"]
Feb 26 15:48:43 crc kubenswrapper[4907]: I0226 15:48:43.012307 4907 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver-operator"/"openshift-apiserver-operator-dockercfg-xtcjv"
Feb 26 15:48:43 crc kubenswrapper[4907]: I0226 15:48:43.159929 4907 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"machine-api-operator-images"
Feb 26 15:48:43 crc kubenswrapper[4907]: I0226 15:48:43.406171 4907 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"kube-root-ca.crt"
Feb 26 15:48:43 crc kubenswrapper[4907]: I0226 15:48:43.418127 4907 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"control-plane-machine-set-operator-tls"
Feb 26 15:48:43 crc kubenswrapper[4907]: I0226 15:48:43.439784 4907 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"kube-rbac-proxy"
Feb 26 15:48:43 crc kubenswrapper[4907]: I0226 15:48:43.467583 4907 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"multus-ancillary-tools-dockercfg-vnmsz"
Feb 26 15:48:43 crc kubenswrapper[4907]: I0226 15:48:43.488925 4907 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-global-ca"
Feb 26 15:48:43 crc kubenswrapper[4907]: I0226 15:48:43.523901 4907 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-operator"/"openshift-service-ca.crt"
Feb 26 15:48:43 crc kubenswrapper[4907]: I0226 15:48:43.631802 4907 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns-operator"/"openshift-service-ca.crt"
Feb 26 15:48:43 crc kubenswrapper[4907]: I0226 15:48:43.666532 4907 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns"/"node-resolver-dockercfg-kz9s7"
Feb 26 15:48:43 crc kubenswrapper[4907]: I0226 15:48:43.686194 4907 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-metrics-certs-default"
Feb 26 15:48:43 crc kubenswrapper[4907]: I0226 15:48:43.713724 4907 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"console-oauth-config"
Feb 26 15:48:43 crc kubenswrapper[4907]: I0226 15:48:43.714573 4907 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"mco-proxy-tls"
Feb 26 15:48:43 crc kubenswrapper[4907]: I0226 15:48:43.915108 4907 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-authentication_oauth-openshift-59b95f96cf-255fr_54457a79-b0ee-40f0-bf8e-b01268b8b391/oauth-openshift/0.log"
Feb 26 15:48:43 crc kubenswrapper[4907]: I0226 15:48:43.915154 4907 generic.go:334] "Generic (PLEG): container finished" podID="54457a79-b0ee-40f0-bf8e-b01268b8b391" containerID="5c15392eca6621cbc0601b1d865ecd95422d71a331d5e9d78e3f5eb3c5898692" exitCode=255
Feb 26 15:48:43 crc kubenswrapper[4907]: I0226 15:48:43.915184 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-59b95f96cf-255fr" event={"ID":"54457a79-b0ee-40f0-bf8e-b01268b8b391","Type":"ContainerDied","Data":"5c15392eca6621cbc0601b1d865ecd95422d71a331d5e9d78e3f5eb3c5898692"}
Feb 26 15:48:43 crc kubenswrapper[4907]: I0226 15:48:43.915205 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-59b95f96cf-255fr" event={"ID":"54457a79-b0ee-40f0-bf8e-b01268b8b391","Type":"ContainerStarted","Data":"91daf97ccc8be99a1df4d14b0572eb31764225ffae4886e484e247cdeeb3b606"}
Feb 26 15:48:43 crc kubenswrapper[4907]: I0226 15:48:43.915581 4907 scope.go:117] "RemoveContainer" containerID="5c15392eca6621cbc0601b1d865ecd95422d71a331d5e9d78e3f5eb3c5898692"
Feb 26 15:48:43 crc kubenswrapper[4907]: I0226 15:48:43.937972 4907 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-network-console"/"networking-console-plugin-cert"
Feb 26 15:48:43 crc kubenswrapper[4907]: I0226 15:48:43.942887 4907 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress"/"openshift-service-ca.crt"
Feb 26 15:48:43 crc kubenswrapper[4907]: I0226 15:48:43.968246 4907 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-etcd-operator"/"etcd-operator-dockercfg-r9srn"
Feb 26 15:48:44 crc kubenswrapper[4907]: I0226 15:48:44.049136 4907 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"etcd-client"
Feb 26 15:48:44 crc kubenswrapper[4907]: I0226 15:48:44.092503 4907 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"kube-root-ca.crt"
Feb 26 15:48:44 crc kubenswrapper[4907]: I0226 15:48:44.096198 4907 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"kube-root-ca.crt"
Feb 26 15:48:44 crc kubenswrapper[4907]: I0226 15:48:44.273007 4907 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"service-ca-bundle"
Feb 26 15:48:44 crc kubenswrapper[4907]: I0226 15:48:44.292955 4907 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-machine-approver"/"machine-approver-sa-dockercfg-nl2j4"
Feb 26 15:48:44 crc kubenswrapper[4907]: I0226 15:48:44.307556 4907 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver-operator"/"openshift-apiserver-operator-config"
Feb 26 15:48:44 crc kubenswrapper[4907]: I0226 15:48:44.308930 4907 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-config-operator"/"openshift-config-operator-dockercfg-7pc5z"
Feb 26 15:48:44 crc kubenswrapper[4907]: I0226 15:48:44.312123 4907 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"client-ca"
Feb 26 15:48:44 crc kubenswrapper[4907]: I0226 15:48:44.345951 4907 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns-operator"/"metrics-tls"
Feb 26 15:48:44 crc kubenswrapper[4907]: I0226 15:48:44.468743 4907 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"ovnkube-identity-cm"
Feb 26 15:48:44 crc kubenswrapper[4907]: I0226 15:48:44.492532 4907 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"marketplace-operator-metrics"
Feb 26 15:48:44 crc kubenswrapper[4907]: I0226 15:48:44.512193 4907 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"openshift-service-ca.crt"
Feb 26 15:48:44 crc kubenswrapper[4907]: I0226 15:48:44.518522 4907 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver-operator"/"openshift-service-ca.crt"
Feb 26 15:48:44 crc kubenswrapper[4907]: I0226 15:48:44.597598 4907 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-version"/"default-dockercfg-gxtc4"
Feb 26 15:48:44 crc kubenswrapper[4907]: I0226 15:48:44.689320 4907 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"serving-cert"
Feb 26 15:48:44 crc kubenswrapper[4907]: I0226 15:48:44.699513 4907 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console-operator"/"serving-cert"
Feb 26 15:48:44 crc kubenswrapper[4907]: I0226 15:48:44.746344 4907 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"openshift-service-ca.crt"
Feb 26 15:48:44 crc kubenswrapper[4907]: I0226 15:48:44.753676 4907 reflector.go:368] Caches populated for *v1.CSIDriver from k8s.io/client-go/informers/factory.go:160
Feb 26 15:48:44 crc kubenswrapper[4907]: I0226 15:48:44.790772 4907 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-operator-dockercfg-98p87"
Feb 26 15:48:44 crc kubenswrapper[4907]: I0226 15:48:44.922836 4907 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-authentication_oauth-openshift-59b95f96cf-255fr_54457a79-b0ee-40f0-bf8e-b01268b8b391/oauth-openshift/0.log"
Feb 26 15:48:44 crc kubenswrapper[4907]: I0226 15:48:44.922894 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-59b95f96cf-255fr" event={"ID":"54457a79-b0ee-40f0-bf8e-b01268b8b391","Type":"ContainerStarted","Data":"74bb38a0f3ab068190b35028ccf090110f052cc457932bf179ccaffee7d6a3c7"}
Feb 26 15:48:44 crc kubenswrapper[4907]: I0226 15:48:44.924611 4907 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-authentication/oauth-openshift-59b95f96cf-255fr"
Feb 26 15:48:44 crc kubenswrapper[4907]: I0226 15:48:44.931338 4907 patch_prober.go:28] interesting pod/oauth-openshift-59b95f96cf-255fr container/oauth-openshift namespace/openshift-authentication: Readiness probe status=failure output="Get \"https://10.217.0.67:6443/healthz\": read tcp 10.217.0.2:41338->10.217.0.67:6443: read: connection reset by peer" start-of-body=
Feb 26 15:48:44 crc kubenswrapper[4907]: I0226 15:48:44.931394 4907 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-authentication/oauth-openshift-59b95f96cf-255fr" podUID="54457a79-b0ee-40f0-bf8e-b01268b8b391" containerName="oauth-openshift" probeResult="failure" output="Get \"https://10.217.0.67:6443/healthz\": read tcp 10.217.0.2:41338->10.217.0.67:6443: read: connection reset by peer"
Feb 26 15:48:44 crc kubenswrapper[4907]: I0226 15:48:44.945215 4907 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"olm-operator-serviceaccount-dockercfg-rq7zk"
Feb 26 15:48:45 crc kubenswrapper[4907]: I0226 15:48:45.006777 4907 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"service-ca"
Feb 26 15:48:45 crc kubenswrapper[4907]: I0226 15:48:45.099780 4907 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-controller-manager-operator"/"kube-root-ca.crt"
Feb 26 15:48:45 crc kubenswrapper[4907]: I0226 15:48:45.201583 4907 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"default-cni-sysctl-allowlist"
Feb 26 15:48:45 crc kubenswrapper[4907]: I0226 15:48:45.273051 4907 reflector.go:368] Caches populated for *v1.ConfigMap from object-"hostpath-provisioner"/"kube-root-ca.crt"
Feb 26 15:48:45 crc kubenswrapper[4907]: I0226 15:48:45.336435 4907 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-operator"/"ingress-operator-dockercfg-7lnqk"
Feb 26 15:48:45 crc kubenswrapper[4907]: I0226 15:48:45.368809 4907 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"default-dockercfg-chnjx"
Feb 26 15:48:45 crc kubenswrapper[4907]: I0226 15:48:45.575531 4907 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"audit-1"
Feb 26 15:48:45 crc kubenswrapper[4907]: I0226 15:48:45.683901 4907 reflector.go:368] Caches populated for *v1.Node from k8s.io/client-go/informers/factory.go:160
Feb 26 15:48:45 crc kubenswrapper[4907]: I0226 15:48:45.704177 4907 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"community-operators-dockercfg-dmngl"
Feb 26 15:48:45 crc kubenswrapper[4907]: I0226 15:48:45.712398 4907 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"catalog-operator-serving-cert"
Feb 26 15:48:45 crc kubenswrapper[4907]: I0226 15:48:45.820985 4907 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"openshift-service-ca.crt"
Feb 26 15:48:45 crc kubenswrapper[4907]: I0226 15:48:45.879272 4907 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-samples-operator"/"samples-operator-tls"
Feb 26 15:48:45 crc kubenswrapper[4907]: I0226 15:48:45.932788 4907 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-authentication_oauth-openshift-59b95f96cf-255fr_54457a79-b0ee-40f0-bf8e-b01268b8b391/oauth-openshift/1.log"
Feb 26 15:48:45 crc kubenswrapper[4907]: I0226 15:48:45.933416 4907 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-authentication_oauth-openshift-59b95f96cf-255fr_54457a79-b0ee-40f0-bf8e-b01268b8b391/oauth-openshift/0.log"
Feb 26 15:48:45 crc kubenswrapper[4907]: I0226 15:48:45.933482 4907 generic.go:334] "Generic (PLEG): container finished" podID="54457a79-b0ee-40f0-bf8e-b01268b8b391" containerID="74bb38a0f3ab068190b35028ccf090110f052cc457932bf179ccaffee7d6a3c7" exitCode=255
Feb 26 15:48:45 crc kubenswrapper[4907]: I0226 15:48:45.933516 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-59b95f96cf-255fr" event={"ID":"54457a79-b0ee-40f0-bf8e-b01268b8b391","Type":"ContainerDied","Data":"74bb38a0f3ab068190b35028ccf090110f052cc457932bf179ccaffee7d6a3c7"}
Feb 26 15:48:45 crc kubenswrapper[4907]: I0226 15:48:45.933553 4907 scope.go:117] "RemoveContainer" containerID="5c15392eca6621cbc0601b1d865ecd95422d71a331d5e9d78e3f5eb3c5898692"
Feb 26 15:48:45 crc kubenswrapper[4907]: I0226 15:48:45.935071 4907 scope.go:117] "RemoveContainer" containerID="74bb38a0f3ab068190b35028ccf090110f052cc457932bf179ccaffee7d6a3c7"
Feb 26 15:48:45 crc kubenswrapper[4907]: E0226 15:48:45.935298 4907 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"oauth-openshift\" with CrashLoopBackOff: \"back-off 10s restarting failed container=oauth-openshift pod=oauth-openshift-59b95f96cf-255fr_openshift-authentication(54457a79-b0ee-40f0-bf8e-b01268b8b391)\"" pod="openshift-authentication/oauth-openshift-59b95f96cf-255fr" podUID="54457a79-b0ee-40f0-bf8e-b01268b8b391"
Feb 26 15:48:46 crc kubenswrapper[4907]: I0226 15:48:46.096063 4907 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-controller-manager-operator"/"kube-controller-manager-operator-dockercfg-gkqpw"
Feb 26 15:48:46 crc kubenswrapper[4907]: I0226 15:48:46.114584 4907 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"default-dockercfg-2q5b6"
Feb 26 15:48:46 crc kubenswrapper[4907]: I0226 15:48:46.285785 4907 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-scheduler-operator"/"openshift-kube-scheduler-operator-config"
Feb 26 15:48:46 crc kubenswrapper[4907]: I0226 15:48:46.289620 4907 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"installation-pull-secrets"
Feb 26 15:48:46 crc kubenswrapper[4907]: I0226 15:48:46.592748 4907 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca"/"kube-root-ca.crt"
Feb 26 15:48:46 crc kubenswrapper[4907]: I0226 15:48:46.683371 4907 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"image-import-ca"
Feb 26 15:48:46 crc kubenswrapper[4907]: I0226 15:48:46.753853 4907 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"kube-root-ca.crt"
Feb 26 15:48:46 crc kubenswrapper[4907]: I0226 15:48:46.770118 4907 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"node-ca-dockercfg-4777p"
Feb 26 15:48:46 crc kubenswrapper[4907]: I0226 15:48:46.831084 4907 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"node-bootstrapper-token"
Feb 26 15:48:46 crc kubenswrapper[4907]: I0226 15:48:46.886745 4907 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-canary"/"kube-root-ca.crt"
Feb 26 15:48:46 crc kubenswrapper[4907]: I0226 15:48:46.940857 4907 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-authentication_oauth-openshift-59b95f96cf-255fr_54457a79-b0ee-40f0-bf8e-b01268b8b391/oauth-openshift/1.log"
Feb 26 15:48:46 crc kubenswrapper[4907]: I0226 15:48:46.941287 4907 scope.go:117] "RemoveContainer" containerID="74bb38a0f3ab068190b35028ccf090110f052cc457932bf179ccaffee7d6a3c7"
Feb 26 15:48:46 crc kubenswrapper[4907]: E0226 15:48:46.941554 4907 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"oauth-openshift\" with CrashLoopBackOff: \"back-off 10s restarting failed container=oauth-openshift pod=oauth-openshift-59b95f96cf-255fr_openshift-authentication(54457a79-b0ee-40f0-bf8e-b01268b8b391)\"" pod="openshift-authentication/oauth-openshift-59b95f96cf-255fr" podUID="54457a79-b0ee-40f0-bf8e-b01268b8b391"
Feb 26 15:48:46 crc kubenswrapper[4907]: I0226 15:48:46.985428 4907 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"openshift-service-ca.crt"
Feb 26 15:48:47 crc kubenswrapper[4907]: I0226 15:48:47.013578 4907 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"redhat-operators-dockercfg-ct8rh"
Feb 26 15:48:47 crc kubenswrapper[4907]: I0226 15:48:47.198652 4907 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-control-plane-metrics-cert"
Feb 26 15:48:47 crc kubenswrapper[4907]: I0226 15:48:47.556234 4907 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"kube-root-ca.crt"
Feb 26 15:48:47 crc kubenswrapper[4907]: I0226 15:48:47.736121 4907 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"encryption-config-1"
Feb 26 15:48:47 crc kubenswrapper[4907]: I0226 15:48:47.825214 4907 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"ovnkube-script-lib"
Feb 26 15:48:47 crc kubenswrapper[4907]: I0226 15:48:47.869373 4907 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"certified-operators-dockercfg-4rs5g"
Feb 26 15:48:47 crc kubenswrapper[4907]: I0226 15:48:47.892030 4907 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-console"/"networking-console-plugin"
Feb 26 15:48:47 crc kubenswrapper[4907]: I0226 15:48:47.915255 4907 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca"/"signing-key"
Feb 26 15:48:47 crc kubenswrapper[4907]: I0226 15:48:47.953777 4907 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-startup-monitor-crc_f85e55b1a89d02b0cb034b1ea31ed45a/startup-monitor/0.log"
Feb 26 15:48:47 crc kubenswrapper[4907]: I0226 15:48:47.954259 4907 generic.go:334] "Generic (PLEG): container finished" podID="f85e55b1a89d02b0cb034b1ea31ed45a" containerID="b4f9fde9bfda905320bf8cf8897c11cf190969f123ff5644b5c3c62ce53613c4" exitCode=137
Feb 26 15:48:47 crc kubenswrapper[4907]: I0226 15:48:47.954330 4907 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="2e1d23ed512934c7f1b2a61abf70a987b2f6d092d482c52a0eadb2b8b80fd6e7"
Feb 26 15:48:47 crc kubenswrapper[4907]: I0226 15:48:47.964280 4907 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-marketplace"/"marketplace-trusted-ca"
Feb 26 15:48:47 crc kubenswrapper[4907]: I0226 15:48:47.988341 4907 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-startup-monitor-crc_f85e55b1a89d02b0cb034b1ea31ed45a/startup-monitor/0.log"
Feb 26 15:48:47 crc kubenswrapper[4907]: I0226 15:48:47.988451 4907 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc"
Feb 26 15:48:48 crc kubenswrapper[4907]: I0226 15:48:48.100686 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pod-resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-pod-resource-dir\") pod \"f85e55b1a89d02b0cb034b1ea31ed45a\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") "
Feb 26 15:48:48 crc kubenswrapper[4907]: I0226 15:48:48.100803 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"manifests\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-manifests\") pod \"f85e55b1a89d02b0cb034b1ea31ed45a\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") "
Feb 26 15:48:48 crc kubenswrapper[4907]: I0226 15:48:48.100848 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-lock\") pod \"f85e55b1a89d02b0cb034b1ea31ed45a\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") "
Feb 26 15:48:48 crc kubenswrapper[4907]: I0226 15:48:48.100915 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-manifests" (OuterVolumeSpecName: "manifests") pod "f85e55b1a89d02b0cb034b1ea31ed45a" (UID: "f85e55b1a89d02b0cb034b1ea31ed45a"). InnerVolumeSpecName "manifests". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Feb 26 15:48:48 crc kubenswrapper[4907]: I0226 15:48:48.100935 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-resource-dir\") pod \"f85e55b1a89d02b0cb034b1ea31ed45a\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") "
Feb 26 15:48:48 crc kubenswrapper[4907]: I0226 15:48:48.100985 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-resource-dir" (OuterVolumeSpecName: "resource-dir") pod "f85e55b1a89d02b0cb034b1ea31ed45a" (UID: "f85e55b1a89d02b0cb034b1ea31ed45a"). InnerVolumeSpecName "resource-dir". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Feb 26 15:48:48 crc kubenswrapper[4907]: I0226 15:48:48.101049 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-log\") pod \"f85e55b1a89d02b0cb034b1ea31ed45a\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") "
Feb 26 15:48:48 crc kubenswrapper[4907]: I0226 15:48:48.101116 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-lock" (OuterVolumeSpecName: "var-lock") pod "f85e55b1a89d02b0cb034b1ea31ed45a" (UID: "f85e55b1a89d02b0cb034b1ea31ed45a"). InnerVolumeSpecName "var-lock". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Feb 26 15:48:48 crc kubenswrapper[4907]: I0226 15:48:48.101321 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-log" (OuterVolumeSpecName: "var-log") pod "f85e55b1a89d02b0cb034b1ea31ed45a" (UID: "f85e55b1a89d02b0cb034b1ea31ed45a"). InnerVolumeSpecName "var-log". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Feb 26 15:48:48 crc kubenswrapper[4907]: I0226 15:48:48.101732 4907 reconciler_common.go:293] "Volume detached for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-lock\") on node \"crc\" DevicePath \"\""
Feb 26 15:48:48 crc kubenswrapper[4907]: I0226 15:48:48.101770 4907 reconciler_common.go:293] "Volume detached for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-resource-dir\") on node \"crc\" DevicePath \"\""
Feb 26 15:48:48 crc kubenswrapper[4907]: I0226 15:48:48.101800 4907 reconciler_common.go:293] "Volume detached for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-log\") on node \"crc\" DevicePath \"\""
Feb 26 15:48:48 crc kubenswrapper[4907]: I0226 15:48:48.101822 4907 reconciler_common.go:293] "Volume detached for volume \"manifests\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-manifests\") on node \"crc\" DevicePath \"\""
Feb 26 15:48:48 crc kubenswrapper[4907]: I0226 15:48:48.108436 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-pod-resource-dir" (OuterVolumeSpecName: "pod-resource-dir") pod "f85e55b1a89d02b0cb034b1ea31ed45a" (UID: "f85e55b1a89d02b0cb034b1ea31ed45a"). InnerVolumeSpecName "pod-resource-dir". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Feb 26 15:48:48 crc kubenswrapper[4907]: I0226 15:48:48.139043 4907 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" path="/var/lib/kubelet/pods/f85e55b1a89d02b0cb034b1ea31ed45a/volumes"
Feb 26 15:48:48 crc kubenswrapper[4907]: I0226 15:48:48.139587 4907 mirror_client.go:130] "Deleting a mirror pod" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" podUID=""
Feb 26 15:48:48 crc kubenswrapper[4907]: I0226 15:48:48.161710 4907 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-kube-apiserver/kube-apiserver-startup-monitor-crc"]
Feb 26 15:48:48 crc kubenswrapper[4907]: I0226 15:48:48.162064 4907 kubelet.go:2649] "Unable to find pod for mirror pod, skipping" mirrorPod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" mirrorPodUID="e002109a-adf1-47ba-96cd-07643f01705e"
Feb 26 15:48:48 crc kubenswrapper[4907]: I0226 15:48:48.163966 4907 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-kube-apiserver/kube-apiserver-startup-monitor-crc"]
Feb 26 15:48:48 crc kubenswrapper[4907]: I0226 15:48:48.164135 4907 kubelet.go:2673] "Unable to find pod for mirror pod, skipping" mirrorPod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" mirrorPodUID="e002109a-adf1-47ba-96cd-07643f01705e"
Feb 26 15:48:48 crc kubenswrapper[4907]: I0226 15:48:48.203206 4907 reconciler_common.go:293] "Volume detached for volume \"pod-resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-pod-resource-dir\") on node \"crc\" DevicePath \"\""
Feb 26 15:48:48 crc kubenswrapper[4907]: I0226 15:48:48.276605 4907 reflector.go:368] Caches populated for *v1.RuntimeClass from k8s.io/client-go/informers/factory.go:160
Feb 26 15:48:48 crc kubenswrapper[4907]: I0226 15:48:48.432271 4907 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-7f749d5666-wlmx4"]
Feb 26 15:48:48 crc kubenswrapper[4907]: I0226 15:48:48.432627 4907 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-route-controller-manager/route-controller-manager-7f749d5666-wlmx4" podUID="4c0a8c3f-a6d1-48fc-bd2f-5f1e771f708f" containerName="route-controller-manager" containerID="cri-o://1404f09ec94b2f2378b1ead94a7d10200122f94d59736243001a35dffa2a07f2" gracePeriod=30
Feb 26 15:48:48 crc kubenswrapper[4907]: I0226 15:48:48.444268 4907 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-585d7c4c78-7ksjg"]
Feb 26 15:48:48 crc kubenswrapper[4907]: I0226 15:48:48.444517 4907 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-controller-manager/controller-manager-585d7c4c78-7ksjg" podUID="40555409-ee5f-45d8-9112-e3f5864d93aa" containerName="controller-manager" containerID="cri-o://6ef294d0b96ea5c7f7e05588c22a4d436257500922a75835f988b36629ff3061" gracePeriod=30
Feb 26 15:48:48 crc kubenswrapper[4907]: I0226 15:48:48.818022 4907 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-7f749d5666-wlmx4"
Feb 26 15:48:48 crc kubenswrapper[4907]: I0226 15:48:48.861938 4907 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-585d7c4c78-7ksjg"
Feb 26 15:48:48 crc kubenswrapper[4907]: I0226 15:48:48.912036 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/40555409-ee5f-45d8-9112-e3f5864d93aa-proxy-ca-bundles\") pod \"40555409-ee5f-45d8-9112-e3f5864d93aa\" (UID: \"40555409-ee5f-45d8-9112-e3f5864d93aa\") "
Feb 26 15:48:48 crc kubenswrapper[4907]: I0226 15:48:48.912093 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-gzhxw\" (UniqueName: \"kubernetes.io/projected/40555409-ee5f-45d8-9112-e3f5864d93aa-kube-api-access-gzhxw\") pod \"40555409-ee5f-45d8-9112-e3f5864d93aa\" (UID: \"40555409-ee5f-45d8-9112-e3f5864d93aa\") "
Feb 26 15:48:48 crc kubenswrapper[4907]: I0226 15:48:48.912130 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/4c0a8c3f-a6d1-48fc-bd2f-5f1e771f708f-serving-cert\") pod \"4c0a8c3f-a6d1-48fc-bd2f-5f1e771f708f\" (UID: \"4c0a8c3f-a6d1-48fc-bd2f-5f1e771f708f\") "
Feb 26 15:48:48 crc kubenswrapper[4907]: I0226 15:48:48.912159 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/4c0a8c3f-a6d1-48fc-bd2f-5f1e771f708f-client-ca\") pod \"4c0a8c3f-a6d1-48fc-bd2f-5f1e771f708f\" (UID: \"4c0a8c3f-a6d1-48fc-bd2f-5f1e771f708f\") "
Feb 26 15:48:48 crc kubenswrapper[4907]: I0226 15:48:48.912186 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/4c0a8c3f-a6d1-48fc-bd2f-5f1e771f708f-config\") pod \"4c0a8c3f-a6d1-48fc-bd2f-5f1e771f708f\" (UID: \"4c0a8c3f-a6d1-48fc-bd2f-5f1e771f708f\") "
Feb 26 15:48:48 crc kubenswrapper[4907]: I0226 15:48:48.912239 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/40555409-ee5f-45d8-9112-e3f5864d93aa-client-ca\") pod \"40555409-ee5f-45d8-9112-e3f5864d93aa\" (UID: \"40555409-ee5f-45d8-9112-e3f5864d93aa\") "
Feb 26 15:48:48 crc kubenswrapper[4907]: I0226 15:48:48.912265 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/40555409-ee5f-45d8-9112-e3f5864d93aa-config\") pod \"40555409-ee5f-45d8-9112-e3f5864d93aa\" (UID: \"40555409-ee5f-45d8-9112-e3f5864d93aa\") "
Feb 26 15:48:48 crc kubenswrapper[4907]: I0226 15:48:48.912351 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wxngl\" (UniqueName: \"kubernetes.io/projected/4c0a8c3f-a6d1-48fc-bd2f-5f1e771f708f-kube-api-access-wxngl\") pod \"4c0a8c3f-a6d1-48fc-bd2f-5f1e771f708f\" (UID: \"4c0a8c3f-a6d1-48fc-bd2f-5f1e771f708f\") "
Feb 26 15:48:48 crc kubenswrapper[4907]: I0226 15:48:48.912376 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/40555409-ee5f-45d8-9112-e3f5864d93aa-serving-cert\") pod \"40555409-ee5f-45d8-9112-e3f5864d93aa\" (UID: \"40555409-ee5f-45d8-9112-e3f5864d93aa\") "
Feb 26 15:48:48 crc kubenswrapper[4907]: I0226 15:48:48.913780 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/40555409-ee5f-45d8-9112-e3f5864d93aa-client-ca" (OuterVolumeSpecName: "client-ca") pod "40555409-ee5f-45d8-9112-e3f5864d93aa" (UID: "40555409-ee5f-45d8-9112-e3f5864d93aa"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Feb 26 15:48:48 crc kubenswrapper[4907]: I0226 15:48:48.913798 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/40555409-ee5f-45d8-9112-e3f5864d93aa-proxy-ca-bundles" (OuterVolumeSpecName: "proxy-ca-bundles") pod "40555409-ee5f-45d8-9112-e3f5864d93aa" (UID: "40555409-ee5f-45d8-9112-e3f5864d93aa"). InnerVolumeSpecName "proxy-ca-bundles". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Feb 26 15:48:48 crc kubenswrapper[4907]: I0226 15:48:48.914011 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/40555409-ee5f-45d8-9112-e3f5864d93aa-config" (OuterVolumeSpecName: "config") pod "40555409-ee5f-45d8-9112-e3f5864d93aa" (UID: "40555409-ee5f-45d8-9112-e3f5864d93aa"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Feb 26 15:48:48 crc kubenswrapper[4907]: I0226 15:48:48.914354 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4c0a8c3f-a6d1-48fc-bd2f-5f1e771f708f-client-ca" (OuterVolumeSpecName: "client-ca") pod "4c0a8c3f-a6d1-48fc-bd2f-5f1e771f708f" (UID: "4c0a8c3f-a6d1-48fc-bd2f-5f1e771f708f"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Feb 26 15:48:48 crc kubenswrapper[4907]: I0226 15:48:48.914361 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4c0a8c3f-a6d1-48fc-bd2f-5f1e771f708f-config" (OuterVolumeSpecName: "config") pod "4c0a8c3f-a6d1-48fc-bd2f-5f1e771f708f" (UID: "4c0a8c3f-a6d1-48fc-bd2f-5f1e771f708f"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Feb 26 15:48:48 crc kubenswrapper[4907]: I0226 15:48:48.921529 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4c0a8c3f-a6d1-48fc-bd2f-5f1e771f708f-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "4c0a8c3f-a6d1-48fc-bd2f-5f1e771f708f" (UID: "4c0a8c3f-a6d1-48fc-bd2f-5f1e771f708f"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 26 15:48:48 crc kubenswrapper[4907]: I0226 15:48:48.921629 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/40555409-ee5f-45d8-9112-e3f5864d93aa-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "40555409-ee5f-45d8-9112-e3f5864d93aa" (UID: "40555409-ee5f-45d8-9112-e3f5864d93aa"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 26 15:48:48 crc kubenswrapper[4907]: I0226 15:48:48.921709 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4c0a8c3f-a6d1-48fc-bd2f-5f1e771f708f-kube-api-access-wxngl" (OuterVolumeSpecName: "kube-api-access-wxngl") pod "4c0a8c3f-a6d1-48fc-bd2f-5f1e771f708f" (UID: "4c0a8c3f-a6d1-48fc-bd2f-5f1e771f708f"). InnerVolumeSpecName "kube-api-access-wxngl". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 26 15:48:48 crc kubenswrapper[4907]: I0226 15:48:48.921845 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/40555409-ee5f-45d8-9112-e3f5864d93aa-kube-api-access-gzhxw" (OuterVolumeSpecName: "kube-api-access-gzhxw") pod "40555409-ee5f-45d8-9112-e3f5864d93aa" (UID: "40555409-ee5f-45d8-9112-e3f5864d93aa"). InnerVolumeSpecName "kube-api-access-gzhxw". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 26 15:48:48 crc kubenswrapper[4907]: I0226 15:48:48.961441 4907 generic.go:334] "Generic (PLEG): container finished" podID="40555409-ee5f-45d8-9112-e3f5864d93aa" containerID="6ef294d0b96ea5c7f7e05588c22a4d436257500922a75835f988b36629ff3061" exitCode=0
Feb 26 15:48:48 crc kubenswrapper[4907]: I0226 15:48:48.961537 4907 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-585d7c4c78-7ksjg"
Feb 26 15:48:48 crc kubenswrapper[4907]: I0226 15:48:48.963784 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-585d7c4c78-7ksjg" event={"ID":"40555409-ee5f-45d8-9112-e3f5864d93aa","Type":"ContainerDied","Data":"6ef294d0b96ea5c7f7e05588c22a4d436257500922a75835f988b36629ff3061"}
Feb 26 15:48:48 crc kubenswrapper[4907]: I0226 15:48:48.963852 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-585d7c4c78-7ksjg" event={"ID":"40555409-ee5f-45d8-9112-e3f5864d93aa","Type":"ContainerDied","Data":"4f4072c1845ec548e405023dc051c60b186e8263bee05480d7db43ad305020d1"}
Feb 26 15:48:48 crc kubenswrapper[4907]: I0226 15:48:48.963909 4907 scope.go:117] "RemoveContainer" containerID="6ef294d0b96ea5c7f7e05588c22a4d436257500922a75835f988b36629ff3061"
Feb 26 15:48:48 crc kubenswrapper[4907]: I0226 15:48:48.965951 4907 generic.go:334] "Generic (PLEG): container finished" podID="4c0a8c3f-a6d1-48fc-bd2f-5f1e771f708f" containerID="1404f09ec94b2f2378b1ead94a7d10200122f94d59736243001a35dffa2a07f2" exitCode=0
Feb 26 15:48:48 crc kubenswrapper[4907]: I0226 15:48:48.966029 4907 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc"
Feb 26 15:48:48 crc kubenswrapper[4907]: I0226 15:48:48.966449 4907 util.go:48] "No ready sandbox for pod can be found.
Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-7f749d5666-wlmx4" Feb 26 15:48:48 crc kubenswrapper[4907]: I0226 15:48:48.968010 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-7f749d5666-wlmx4" event={"ID":"4c0a8c3f-a6d1-48fc-bd2f-5f1e771f708f","Type":"ContainerDied","Data":"1404f09ec94b2f2378b1ead94a7d10200122f94d59736243001a35dffa2a07f2"} Feb 26 15:48:48 crc kubenswrapper[4907]: I0226 15:48:48.968072 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-7f749d5666-wlmx4" event={"ID":"4c0a8c3f-a6d1-48fc-bd2f-5f1e771f708f","Type":"ContainerDied","Data":"cfb9e9eb1007e7789a8decda98ff2dcc3dd2409595ca544782f20645f071893b"} Feb 26 15:48:48 crc kubenswrapper[4907]: I0226 15:48:48.981295 4907 scope.go:117] "RemoveContainer" containerID="6ef294d0b96ea5c7f7e05588c22a4d436257500922a75835f988b36629ff3061" Feb 26 15:48:48 crc kubenswrapper[4907]: E0226 15:48:48.981569 4907 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"6ef294d0b96ea5c7f7e05588c22a4d436257500922a75835f988b36629ff3061\": container with ID starting with 6ef294d0b96ea5c7f7e05588c22a4d436257500922a75835f988b36629ff3061 not found: ID does not exist" containerID="6ef294d0b96ea5c7f7e05588c22a4d436257500922a75835f988b36629ff3061" Feb 26 15:48:48 crc kubenswrapper[4907]: I0226 15:48:48.981645 4907 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6ef294d0b96ea5c7f7e05588c22a4d436257500922a75835f988b36629ff3061"} err="failed to get container status \"6ef294d0b96ea5c7f7e05588c22a4d436257500922a75835f988b36629ff3061\": rpc error: code = NotFound desc = could not find container \"6ef294d0b96ea5c7f7e05588c22a4d436257500922a75835f988b36629ff3061\": container with ID starting with 
6ef294d0b96ea5c7f7e05588c22a4d436257500922a75835f988b36629ff3061 not found: ID does not exist" Feb 26 15:48:48 crc kubenswrapper[4907]: I0226 15:48:48.981679 4907 scope.go:117] "RemoveContainer" containerID="1404f09ec94b2f2378b1ead94a7d10200122f94d59736243001a35dffa2a07f2" Feb 26 15:48:48 crc kubenswrapper[4907]: I0226 15:48:48.992967 4907 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-585d7c4c78-7ksjg"] Feb 26 15:48:48 crc kubenswrapper[4907]: I0226 15:48:48.995138 4907 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-controller-manager/controller-manager-585d7c4c78-7ksjg"] Feb 26 15:48:49 crc kubenswrapper[4907]: I0226 15:48:48.999363 4907 scope.go:117] "RemoveContainer" containerID="1404f09ec94b2f2378b1ead94a7d10200122f94d59736243001a35dffa2a07f2" Feb 26 15:48:49 crc kubenswrapper[4907]: E0226 15:48:48.999914 4907 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"1404f09ec94b2f2378b1ead94a7d10200122f94d59736243001a35dffa2a07f2\": container with ID starting with 1404f09ec94b2f2378b1ead94a7d10200122f94d59736243001a35dffa2a07f2 not found: ID does not exist" containerID="1404f09ec94b2f2378b1ead94a7d10200122f94d59736243001a35dffa2a07f2" Feb 26 15:48:49 crc kubenswrapper[4907]: I0226 15:48:48.999950 4907 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1404f09ec94b2f2378b1ead94a7d10200122f94d59736243001a35dffa2a07f2"} err="failed to get container status \"1404f09ec94b2f2378b1ead94a7d10200122f94d59736243001a35dffa2a07f2\": rpc error: code = NotFound desc = could not find container \"1404f09ec94b2f2378b1ead94a7d10200122f94d59736243001a35dffa2a07f2\": container with ID starting with 1404f09ec94b2f2378b1ead94a7d10200122f94d59736243001a35dffa2a07f2 not found: ID does not exist" Feb 26 15:48:49 crc kubenswrapper[4907]: I0226 15:48:49.001020 4907 kubelet.go:2437] "SyncLoop DELETE" 
source="api" pods=["openshift-route-controller-manager/route-controller-manager-7f749d5666-wlmx4"] Feb 26 15:48:49 crc kubenswrapper[4907]: I0226 15:48:49.006913 4907 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-7f749d5666-wlmx4"] Feb 26 15:48:49 crc kubenswrapper[4907]: I0226 15:48:49.013938 4907 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-wxngl\" (UniqueName: \"kubernetes.io/projected/4c0a8c3f-a6d1-48fc-bd2f-5f1e771f708f-kube-api-access-wxngl\") on node \"crc\" DevicePath \"\"" Feb 26 15:48:49 crc kubenswrapper[4907]: I0226 15:48:49.013957 4907 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/40555409-ee5f-45d8-9112-e3f5864d93aa-serving-cert\") on node \"crc\" DevicePath \"\"" Feb 26 15:48:49 crc kubenswrapper[4907]: I0226 15:48:49.013966 4907 reconciler_common.go:293] "Volume detached for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/40555409-ee5f-45d8-9112-e3f5864d93aa-proxy-ca-bundles\") on node \"crc\" DevicePath \"\"" Feb 26 15:48:49 crc kubenswrapper[4907]: I0226 15:48:49.013975 4907 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-gzhxw\" (UniqueName: \"kubernetes.io/projected/40555409-ee5f-45d8-9112-e3f5864d93aa-kube-api-access-gzhxw\") on node \"crc\" DevicePath \"\"" Feb 26 15:48:49 crc kubenswrapper[4907]: I0226 15:48:49.013985 4907 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/4c0a8c3f-a6d1-48fc-bd2f-5f1e771f708f-serving-cert\") on node \"crc\" DevicePath \"\"" Feb 26 15:48:49 crc kubenswrapper[4907]: I0226 15:48:49.013993 4907 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/4c0a8c3f-a6d1-48fc-bd2f-5f1e771f708f-client-ca\") on node \"crc\" DevicePath \"\"" Feb 26 15:48:49 crc kubenswrapper[4907]: I0226 15:48:49.014003 4907 
reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/4c0a8c3f-a6d1-48fc-bd2f-5f1e771f708f-config\") on node \"crc\" DevicePath \"\"" Feb 26 15:48:49 crc kubenswrapper[4907]: I0226 15:48:49.014021 4907 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/40555409-ee5f-45d8-9112-e3f5864d93aa-client-ca\") on node \"crc\" DevicePath \"\"" Feb 26 15:48:49 crc kubenswrapper[4907]: I0226 15:48:49.014029 4907 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/40555409-ee5f-45d8-9112-e3f5864d93aa-config\") on node \"crc\" DevicePath \"\"" Feb 26 15:48:49 crc kubenswrapper[4907]: I0226 15:48:49.918642 4907 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-controller-manager/controller-manager-f8f88d7c7-ph9cf"] Feb 26 15:48:49 crc kubenswrapper[4907]: E0226 15:48:49.919055 4907 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="40555409-ee5f-45d8-9112-e3f5864d93aa" containerName="controller-manager" Feb 26 15:48:49 crc kubenswrapper[4907]: I0226 15:48:49.919079 4907 state_mem.go:107] "Deleted CPUSet assignment" podUID="40555409-ee5f-45d8-9112-e3f5864d93aa" containerName="controller-manager" Feb 26 15:48:49 crc kubenswrapper[4907]: E0226 15:48:49.919104 4907 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" containerName="startup-monitor" Feb 26 15:48:49 crc kubenswrapper[4907]: I0226 15:48:49.919117 4907 state_mem.go:107] "Deleted CPUSet assignment" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" containerName="startup-monitor" Feb 26 15:48:49 crc kubenswrapper[4907]: E0226 15:48:49.919149 4907 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4c0a8c3f-a6d1-48fc-bd2f-5f1e771f708f" containerName="route-controller-manager" Feb 26 15:48:49 crc kubenswrapper[4907]: I0226 15:48:49.919167 4907 state_mem.go:107] "Deleted CPUSet assignment" 
podUID="4c0a8c3f-a6d1-48fc-bd2f-5f1e771f708f" containerName="route-controller-manager" Feb 26 15:48:49 crc kubenswrapper[4907]: I0226 15:48:49.919407 4907 memory_manager.go:354] "RemoveStaleState removing state" podUID="4c0a8c3f-a6d1-48fc-bd2f-5f1e771f708f" containerName="route-controller-manager" Feb 26 15:48:49 crc kubenswrapper[4907]: I0226 15:48:49.919442 4907 memory_manager.go:354] "RemoveStaleState removing state" podUID="40555409-ee5f-45d8-9112-e3f5864d93aa" containerName="controller-manager" Feb 26 15:48:49 crc kubenswrapper[4907]: I0226 15:48:49.919463 4907 memory_manager.go:354] "RemoveStaleState removing state" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" containerName="startup-monitor" Feb 26 15:48:49 crc kubenswrapper[4907]: I0226 15:48:49.920112 4907 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-f8f88d7c7-ph9cf" Feb 26 15:48:49 crc kubenswrapper[4907]: I0226 15:48:49.924717 4907 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"config" Feb 26 15:48:49 crc kubenswrapper[4907]: I0226 15:48:49.926328 4907 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"openshift-controller-manager-sa-dockercfg-msq4c" Feb 26 15:48:49 crc kubenswrapper[4907]: I0226 15:48:49.926469 4907 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-route-controller-manager/route-controller-manager-64dd5fdbcf-5lh25"] Feb 26 15:48:49 crc kubenswrapper[4907]: I0226 15:48:49.927563 4907 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"client-ca" Feb 26 15:48:49 crc kubenswrapper[4907]: I0226 15:48:49.928179 4907 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-64dd5fdbcf-5lh25" Feb 26 15:48:49 crc kubenswrapper[4907]: I0226 15:48:49.929066 4907 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"kube-root-ca.crt" Feb 26 15:48:49 crc kubenswrapper[4907]: I0226 15:48:49.931913 4907 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"route-controller-manager-sa-dockercfg-h2zr2" Feb 26 15:48:49 crc kubenswrapper[4907]: I0226 15:48:49.931932 4907 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"openshift-service-ca.crt" Feb 26 15:48:49 crc kubenswrapper[4907]: I0226 15:48:49.932679 4907 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"config" Feb 26 15:48:49 crc kubenswrapper[4907]: I0226 15:48:49.932784 4907 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"serving-cert" Feb 26 15:48:49 crc kubenswrapper[4907]: I0226 15:48:49.932938 4907 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"kube-root-ca.crt" Feb 26 15:48:49 crc kubenswrapper[4907]: I0226 15:48:49.932954 4907 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"client-ca" Feb 26 15:48:49 crc kubenswrapper[4907]: I0226 15:48:49.933268 4907 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"serving-cert" Feb 26 15:48:49 crc kubenswrapper[4907]: I0226 15:48:49.937495 4907 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-service-ca.crt" Feb 26 15:48:49 crc kubenswrapper[4907]: I0226 15:48:49.941864 4907 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-global-ca" Feb 26 
15:48:49 crc kubenswrapper[4907]: I0226 15:48:49.947855 4907 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-64dd5fdbcf-5lh25"] Feb 26 15:48:49 crc kubenswrapper[4907]: I0226 15:48:49.950708 4907 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-f8f88d7c7-ph9cf"] Feb 26 15:48:50 crc kubenswrapper[4907]: I0226 15:48:50.014839 4907 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-authentication/oauth-openshift-59b95f96cf-255fr" Feb 26 15:48:50 crc kubenswrapper[4907]: I0226 15:48:50.015497 4907 scope.go:117] "RemoveContainer" containerID="74bb38a0f3ab068190b35028ccf090110f052cc457932bf179ccaffee7d6a3c7" Feb 26 15:48:50 crc kubenswrapper[4907]: E0226 15:48:50.015735 4907 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"oauth-openshift\" with CrashLoopBackOff: \"back-off 10s restarting failed container=oauth-openshift pod=oauth-openshift-59b95f96cf-255fr_openshift-authentication(54457a79-b0ee-40f0-bf8e-b01268b8b391)\"" pod="openshift-authentication/oauth-openshift-59b95f96cf-255fr" podUID="54457a79-b0ee-40f0-bf8e-b01268b8b391" Feb 26 15:48:50 crc kubenswrapper[4907]: I0226 15:48:50.028803 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a02ac037-379e-41f4-8a41-8844838e3ed1-config\") pod \"route-controller-manager-64dd5fdbcf-5lh25\" (UID: \"a02ac037-379e-41f4-8a41-8844838e3ed1\") " pod="openshift-route-controller-manager/route-controller-manager-64dd5fdbcf-5lh25" Feb 26 15:48:50 crc kubenswrapper[4907]: I0226 15:48:50.029163 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4m69r\" (UniqueName: \"kubernetes.io/projected/a02ac037-379e-41f4-8a41-8844838e3ed1-kube-api-access-4m69r\") pod 
\"route-controller-manager-64dd5fdbcf-5lh25\" (UID: \"a02ac037-379e-41f4-8a41-8844838e3ed1\") " pod="openshift-route-controller-manager/route-controller-manager-64dd5fdbcf-5lh25" Feb 26 15:48:50 crc kubenswrapper[4907]: I0226 15:48:50.029331 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8psvp\" (UniqueName: \"kubernetes.io/projected/cd4df426-ba25-4767-8dc0-33a8a8fb2fcc-kube-api-access-8psvp\") pod \"controller-manager-f8f88d7c7-ph9cf\" (UID: \"cd4df426-ba25-4767-8dc0-33a8a8fb2fcc\") " pod="openshift-controller-manager/controller-manager-f8f88d7c7-ph9cf" Feb 26 15:48:50 crc kubenswrapper[4907]: I0226 15:48:50.029926 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/cd4df426-ba25-4767-8dc0-33a8a8fb2fcc-config\") pod \"controller-manager-f8f88d7c7-ph9cf\" (UID: \"cd4df426-ba25-4767-8dc0-33a8a8fb2fcc\") " pod="openshift-controller-manager/controller-manager-f8f88d7c7-ph9cf" Feb 26 15:48:50 crc kubenswrapper[4907]: I0226 15:48:50.030056 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/cd4df426-ba25-4767-8dc0-33a8a8fb2fcc-serving-cert\") pod \"controller-manager-f8f88d7c7-ph9cf\" (UID: \"cd4df426-ba25-4767-8dc0-33a8a8fb2fcc\") " pod="openshift-controller-manager/controller-manager-f8f88d7c7-ph9cf" Feb 26 15:48:50 crc kubenswrapper[4907]: I0226 15:48:50.030163 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/a02ac037-379e-41f4-8a41-8844838e3ed1-serving-cert\") pod \"route-controller-manager-64dd5fdbcf-5lh25\" (UID: \"a02ac037-379e-41f4-8a41-8844838e3ed1\") " pod="openshift-route-controller-manager/route-controller-manager-64dd5fdbcf-5lh25" Feb 26 15:48:50 crc kubenswrapper[4907]: I0226 
15:48:50.030279 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/a02ac037-379e-41f4-8a41-8844838e3ed1-client-ca\") pod \"route-controller-manager-64dd5fdbcf-5lh25\" (UID: \"a02ac037-379e-41f4-8a41-8844838e3ed1\") " pod="openshift-route-controller-manager/route-controller-manager-64dd5fdbcf-5lh25" Feb 26 15:48:50 crc kubenswrapper[4907]: I0226 15:48:50.030401 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/cd4df426-ba25-4767-8dc0-33a8a8fb2fcc-proxy-ca-bundles\") pod \"controller-manager-f8f88d7c7-ph9cf\" (UID: \"cd4df426-ba25-4767-8dc0-33a8a8fb2fcc\") " pod="openshift-controller-manager/controller-manager-f8f88d7c7-ph9cf" Feb 26 15:48:50 crc kubenswrapper[4907]: I0226 15:48:50.030525 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/cd4df426-ba25-4767-8dc0-33a8a8fb2fcc-client-ca\") pod \"controller-manager-f8f88d7c7-ph9cf\" (UID: \"cd4df426-ba25-4767-8dc0-33a8a8fb2fcc\") " pod="openshift-controller-manager/controller-manager-f8f88d7c7-ph9cf" Feb 26 15:48:50 crc kubenswrapper[4907]: I0226 15:48:50.131475 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a02ac037-379e-41f4-8a41-8844838e3ed1-config\") pod \"route-controller-manager-64dd5fdbcf-5lh25\" (UID: \"a02ac037-379e-41f4-8a41-8844838e3ed1\") " pod="openshift-route-controller-manager/route-controller-manager-64dd5fdbcf-5lh25" Feb 26 15:48:50 crc kubenswrapper[4907]: I0226 15:48:50.131554 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4m69r\" (UniqueName: \"kubernetes.io/projected/a02ac037-379e-41f4-8a41-8844838e3ed1-kube-api-access-4m69r\") pod 
\"route-controller-manager-64dd5fdbcf-5lh25\" (UID: \"a02ac037-379e-41f4-8a41-8844838e3ed1\") " pod="openshift-route-controller-manager/route-controller-manager-64dd5fdbcf-5lh25" Feb 26 15:48:50 crc kubenswrapper[4907]: I0226 15:48:50.131581 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8psvp\" (UniqueName: \"kubernetes.io/projected/cd4df426-ba25-4767-8dc0-33a8a8fb2fcc-kube-api-access-8psvp\") pod \"controller-manager-f8f88d7c7-ph9cf\" (UID: \"cd4df426-ba25-4767-8dc0-33a8a8fb2fcc\") " pod="openshift-controller-manager/controller-manager-f8f88d7c7-ph9cf" Feb 26 15:48:50 crc kubenswrapper[4907]: I0226 15:48:50.131624 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/cd4df426-ba25-4767-8dc0-33a8a8fb2fcc-config\") pod \"controller-manager-f8f88d7c7-ph9cf\" (UID: \"cd4df426-ba25-4767-8dc0-33a8a8fb2fcc\") " pod="openshift-controller-manager/controller-manager-f8f88d7c7-ph9cf" Feb 26 15:48:50 crc kubenswrapper[4907]: I0226 15:48:50.131655 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/cd4df426-ba25-4767-8dc0-33a8a8fb2fcc-serving-cert\") pod \"controller-manager-f8f88d7c7-ph9cf\" (UID: \"cd4df426-ba25-4767-8dc0-33a8a8fb2fcc\") " pod="openshift-controller-manager/controller-manager-f8f88d7c7-ph9cf" Feb 26 15:48:50 crc kubenswrapper[4907]: I0226 15:48:50.131692 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/a02ac037-379e-41f4-8a41-8844838e3ed1-serving-cert\") pod \"route-controller-manager-64dd5fdbcf-5lh25\" (UID: \"a02ac037-379e-41f4-8a41-8844838e3ed1\") " pod="openshift-route-controller-manager/route-controller-manager-64dd5fdbcf-5lh25" Feb 26 15:48:50 crc kubenswrapper[4907]: I0226 15:48:50.131729 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for 
volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/a02ac037-379e-41f4-8a41-8844838e3ed1-client-ca\") pod \"route-controller-manager-64dd5fdbcf-5lh25\" (UID: \"a02ac037-379e-41f4-8a41-8844838e3ed1\") " pod="openshift-route-controller-manager/route-controller-manager-64dd5fdbcf-5lh25" Feb 26 15:48:50 crc kubenswrapper[4907]: I0226 15:48:50.131796 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/cd4df426-ba25-4767-8dc0-33a8a8fb2fcc-proxy-ca-bundles\") pod \"controller-manager-f8f88d7c7-ph9cf\" (UID: \"cd4df426-ba25-4767-8dc0-33a8a8fb2fcc\") " pod="openshift-controller-manager/controller-manager-f8f88d7c7-ph9cf" Feb 26 15:48:50 crc kubenswrapper[4907]: I0226 15:48:50.131842 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/cd4df426-ba25-4767-8dc0-33a8a8fb2fcc-client-ca\") pod \"controller-manager-f8f88d7c7-ph9cf\" (UID: \"cd4df426-ba25-4767-8dc0-33a8a8fb2fcc\") " pod="openshift-controller-manager/controller-manager-f8f88d7c7-ph9cf" Feb 26 15:48:50 crc kubenswrapper[4907]: I0226 15:48:50.132904 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a02ac037-379e-41f4-8a41-8844838e3ed1-config\") pod \"route-controller-manager-64dd5fdbcf-5lh25\" (UID: \"a02ac037-379e-41f4-8a41-8844838e3ed1\") " pod="openshift-route-controller-manager/route-controller-manager-64dd5fdbcf-5lh25" Feb 26 15:48:50 crc kubenswrapper[4907]: I0226 15:48:50.134027 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/a02ac037-379e-41f4-8a41-8844838e3ed1-client-ca\") pod \"route-controller-manager-64dd5fdbcf-5lh25\" (UID: \"a02ac037-379e-41f4-8a41-8844838e3ed1\") " pod="openshift-route-controller-manager/route-controller-manager-64dd5fdbcf-5lh25" Feb 26 15:48:50 crc 
kubenswrapper[4907]: I0226 15:48:50.134125 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/cd4df426-ba25-4767-8dc0-33a8a8fb2fcc-client-ca\") pod \"controller-manager-f8f88d7c7-ph9cf\" (UID: \"cd4df426-ba25-4767-8dc0-33a8a8fb2fcc\") " pod="openshift-controller-manager/controller-manager-f8f88d7c7-ph9cf" Feb 26 15:48:50 crc kubenswrapper[4907]: I0226 15:48:50.134414 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/cd4df426-ba25-4767-8dc0-33a8a8fb2fcc-proxy-ca-bundles\") pod \"controller-manager-f8f88d7c7-ph9cf\" (UID: \"cd4df426-ba25-4767-8dc0-33a8a8fb2fcc\") " pod="openshift-controller-manager/controller-manager-f8f88d7c7-ph9cf" Feb 26 15:48:50 crc kubenswrapper[4907]: I0226 15:48:50.134874 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/cd4df426-ba25-4767-8dc0-33a8a8fb2fcc-config\") pod \"controller-manager-f8f88d7c7-ph9cf\" (UID: \"cd4df426-ba25-4767-8dc0-33a8a8fb2fcc\") " pod="openshift-controller-manager/controller-manager-f8f88d7c7-ph9cf" Feb 26 15:48:50 crc kubenswrapper[4907]: I0226 15:48:50.135958 4907 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="40555409-ee5f-45d8-9112-e3f5864d93aa" path="/var/lib/kubelet/pods/40555409-ee5f-45d8-9112-e3f5864d93aa/volumes" Feb 26 15:48:50 crc kubenswrapper[4907]: I0226 15:48:50.136610 4907 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4c0a8c3f-a6d1-48fc-bd2f-5f1e771f708f" path="/var/lib/kubelet/pods/4c0a8c3f-a6d1-48fc-bd2f-5f1e771f708f/volumes" Feb 26 15:48:50 crc kubenswrapper[4907]: I0226 15:48:50.140630 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/cd4df426-ba25-4767-8dc0-33a8a8fb2fcc-serving-cert\") pod \"controller-manager-f8f88d7c7-ph9cf\" (UID: 
\"cd4df426-ba25-4767-8dc0-33a8a8fb2fcc\") " pod="openshift-controller-manager/controller-manager-f8f88d7c7-ph9cf" Feb 26 15:48:50 crc kubenswrapper[4907]: I0226 15:48:50.151608 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/a02ac037-379e-41f4-8a41-8844838e3ed1-serving-cert\") pod \"route-controller-manager-64dd5fdbcf-5lh25\" (UID: \"a02ac037-379e-41f4-8a41-8844838e3ed1\") " pod="openshift-route-controller-manager/route-controller-manager-64dd5fdbcf-5lh25" Feb 26 15:48:50 crc kubenswrapper[4907]: I0226 15:48:50.160395 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8psvp\" (UniqueName: \"kubernetes.io/projected/cd4df426-ba25-4767-8dc0-33a8a8fb2fcc-kube-api-access-8psvp\") pod \"controller-manager-f8f88d7c7-ph9cf\" (UID: \"cd4df426-ba25-4767-8dc0-33a8a8fb2fcc\") " pod="openshift-controller-manager/controller-manager-f8f88d7c7-ph9cf" Feb 26 15:48:50 crc kubenswrapper[4907]: I0226 15:48:50.165415 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4m69r\" (UniqueName: \"kubernetes.io/projected/a02ac037-379e-41f4-8a41-8844838e3ed1-kube-api-access-4m69r\") pod \"route-controller-manager-64dd5fdbcf-5lh25\" (UID: \"a02ac037-379e-41f4-8a41-8844838e3ed1\") " pod="openshift-route-controller-manager/route-controller-manager-64dd5fdbcf-5lh25" Feb 26 15:48:50 crc kubenswrapper[4907]: I0226 15:48:50.258052 4907 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-f8f88d7c7-ph9cf" Feb 26 15:48:50 crc kubenswrapper[4907]: I0226 15:48:50.268170 4907 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-64dd5fdbcf-5lh25" Feb 26 15:48:50 crc kubenswrapper[4907]: I0226 15:48:50.344161 4907 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns-operator"/"dns-operator-dockercfg-9mqw5" Feb 26 15:48:50 crc kubenswrapper[4907]: I0226 15:48:50.458518 4907 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-f8f88d7c7-ph9cf"] Feb 26 15:48:50 crc kubenswrapper[4907]: I0226 15:48:50.520697 4907 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-64dd5fdbcf-5lh25"] Feb 26 15:48:50 crc kubenswrapper[4907]: W0226 15:48:50.528053 4907 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poda02ac037_379e_41f4_8a41_8844838e3ed1.slice/crio-b897d87ffd834dd87fa3e4044267647603d96df32b21e10545d986274983fa1e WatchSource:0}: Error finding container b897d87ffd834dd87fa3e4044267647603d96df32b21e10545d986274983fa1e: Status 404 returned error can't find the container with id b897d87ffd834dd87fa3e4044267647603d96df32b21e10545d986274983fa1e Feb 26 15:48:50 crc kubenswrapper[4907]: I0226 15:48:50.986038 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-64dd5fdbcf-5lh25" event={"ID":"a02ac037-379e-41f4-8a41-8844838e3ed1","Type":"ContainerStarted","Data":"13435db368c64d96cf18b3e5359954f09a70ec3e9f189340b2a6dcdfb875d4e3"} Feb 26 15:48:50 crc kubenswrapper[4907]: I0226 15:48:50.986097 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-64dd5fdbcf-5lh25" event={"ID":"a02ac037-379e-41f4-8a41-8844838e3ed1","Type":"ContainerStarted","Data":"b897d87ffd834dd87fa3e4044267647603d96df32b21e10545d986274983fa1e"} Feb 26 15:48:50 crc kubenswrapper[4907]: I0226 15:48:50.986114 4907 
kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-route-controller-manager/route-controller-manager-64dd5fdbcf-5lh25" Feb 26 15:48:50 crc kubenswrapper[4907]: I0226 15:48:50.988339 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-f8f88d7c7-ph9cf" event={"ID":"cd4df426-ba25-4767-8dc0-33a8a8fb2fcc","Type":"ContainerStarted","Data":"4ebed317d27199734564e1793f61ca41451666f285685435f615f4f2c755d683"} Feb 26 15:48:50 crc kubenswrapper[4907]: I0226 15:48:50.988367 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-f8f88d7c7-ph9cf" event={"ID":"cd4df426-ba25-4767-8dc0-33a8a8fb2fcc","Type":"ContainerStarted","Data":"cd15693d48ea20dd9e85e317bd42009e2362e2ceeeeb8947e6bac39060de60eb"} Feb 26 15:48:50 crc kubenswrapper[4907]: I0226 15:48:50.988644 4907 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-controller-manager/controller-manager-f8f88d7c7-ph9cf" Feb 26 15:48:50 crc kubenswrapper[4907]: I0226 15:48:50.993692 4907 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-controller-manager/controller-manager-f8f88d7c7-ph9cf" Feb 26 15:48:51 crc kubenswrapper[4907]: I0226 15:48:51.020253 4907 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-route-controller-manager/route-controller-manager-64dd5fdbcf-5lh25" Feb 26 15:48:51 crc kubenswrapper[4907]: I0226 15:48:51.031149 4907 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-route-controller-manager/route-controller-manager-64dd5fdbcf-5lh25" podStartSLOduration=3.03112845 podStartE2EDuration="3.03112845s" podCreationTimestamp="2026-02-26 15:48:48 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-26 15:48:51.012828029 +0000 UTC m=+393.531389908" 
watchObservedRunningTime="2026-02-26 15:48:51.03112845 +0000 UTC m=+393.549690299" Feb 26 15:48:51 crc kubenswrapper[4907]: I0226 15:48:51.034401 4907 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-controller-manager/controller-manager-f8f88d7c7-ph9cf" podStartSLOduration=3.034388182 podStartE2EDuration="3.034388182s" podCreationTimestamp="2026-02-26 15:48:48 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-26 15:48:51.030709089 +0000 UTC m=+393.549270958" watchObservedRunningTime="2026-02-26 15:48:51.034388182 +0000 UTC m=+393.552950031" Feb 26 15:49:01 crc kubenswrapper[4907]: I0226 15:49:01.126989 4907 scope.go:117] "RemoveContainer" containerID="74bb38a0f3ab068190b35028ccf090110f052cc457932bf179ccaffee7d6a3c7" Feb 26 15:49:02 crc kubenswrapper[4907]: I0226 15:49:02.061018 4907 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-authentication_oauth-openshift-59b95f96cf-255fr_54457a79-b0ee-40f0-bf8e-b01268b8b391/oauth-openshift/1.log" Feb 26 15:49:02 crc kubenswrapper[4907]: I0226 15:49:02.061447 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-59b95f96cf-255fr" event={"ID":"54457a79-b0ee-40f0-bf8e-b01268b8b391","Type":"ContainerStarted","Data":"094f372c2570dde0e7ab35dd30cf6d60d05323fc6aedae56fd7bf24988941162"} Feb 26 15:49:02 crc kubenswrapper[4907]: I0226 15:49:02.062036 4907 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-authentication/oauth-openshift-59b95f96cf-255fr" Feb 26 15:49:02 crc kubenswrapper[4907]: I0226 15:49:02.070801 4907 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-authentication/oauth-openshift-59b95f96cf-255fr" Feb 26 15:49:02 crc kubenswrapper[4907]: I0226 15:49:02.089917 4907 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="openshift-authentication/oauth-openshift-59b95f96cf-255fr" podStartSLOduration=62.08984262 podStartE2EDuration="1m2.08984262s" podCreationTimestamp="2026-02-26 15:48:00 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-26 15:48:44.959788179 +0000 UTC m=+387.478350068" watchObservedRunningTime="2026-02-26 15:49:02.08984262 +0000 UTC m=+404.608404519" Feb 26 15:49:08 crc kubenswrapper[4907]: I0226 15:49:08.384701 4907 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-f8f88d7c7-ph9cf"] Feb 26 15:49:08 crc kubenswrapper[4907]: I0226 15:49:08.386114 4907 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-controller-manager/controller-manager-f8f88d7c7-ph9cf" podUID="cd4df426-ba25-4767-8dc0-33a8a8fb2fcc" containerName="controller-manager" containerID="cri-o://4ebed317d27199734564e1793f61ca41451666f285685435f615f4f2c755d683" gracePeriod=30 Feb 26 15:49:08 crc kubenswrapper[4907]: I0226 15:49:08.423696 4907 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-64dd5fdbcf-5lh25"] Feb 26 15:49:08 crc kubenswrapper[4907]: I0226 15:49:08.423905 4907 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-route-controller-manager/route-controller-manager-64dd5fdbcf-5lh25" podUID="a02ac037-379e-41f4-8a41-8844838e3ed1" containerName="route-controller-manager" containerID="cri-o://13435db368c64d96cf18b3e5359954f09a70ec3e9f189340b2a6dcdfb875d4e3" gracePeriod=30 Feb 26 15:49:09 crc kubenswrapper[4907]: I0226 15:49:09.117260 4907 generic.go:334] "Generic (PLEG): container finished" podID="a02ac037-379e-41f4-8a41-8844838e3ed1" containerID="13435db368c64d96cf18b3e5359954f09a70ec3e9f189340b2a6dcdfb875d4e3" exitCode=0 Feb 26 15:49:09 crc kubenswrapper[4907]: I0226 15:49:09.117349 4907 kubelet.go:2453] "SyncLoop 
(PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-64dd5fdbcf-5lh25" event={"ID":"a02ac037-379e-41f4-8a41-8844838e3ed1","Type":"ContainerDied","Data":"13435db368c64d96cf18b3e5359954f09a70ec3e9f189340b2a6dcdfb875d4e3"} Feb 26 15:49:09 crc kubenswrapper[4907]: I0226 15:49:09.119976 4907 generic.go:334] "Generic (PLEG): container finished" podID="cd4df426-ba25-4767-8dc0-33a8a8fb2fcc" containerID="4ebed317d27199734564e1793f61ca41451666f285685435f615f4f2c755d683" exitCode=0 Feb 26 15:49:09 crc kubenswrapper[4907]: I0226 15:49:09.120013 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-f8f88d7c7-ph9cf" event={"ID":"cd4df426-ba25-4767-8dc0-33a8a8fb2fcc","Type":"ContainerDied","Data":"4ebed317d27199734564e1793f61ca41451666f285685435f615f4f2c755d683"} Feb 26 15:49:09 crc kubenswrapper[4907]: I0226 15:49:09.541312 4907 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-64dd5fdbcf-5lh25" Feb 26 15:49:09 crc kubenswrapper[4907]: I0226 15:49:09.567921 4907 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-route-controller-manager/route-controller-manager-cd59f947b-pjl5q"] Feb 26 15:49:09 crc kubenswrapper[4907]: E0226 15:49:09.568189 4907 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a02ac037-379e-41f4-8a41-8844838e3ed1" containerName="route-controller-manager" Feb 26 15:49:09 crc kubenswrapper[4907]: I0226 15:49:09.568204 4907 state_mem.go:107] "Deleted CPUSet assignment" podUID="a02ac037-379e-41f4-8a41-8844838e3ed1" containerName="route-controller-manager" Feb 26 15:49:09 crc kubenswrapper[4907]: I0226 15:49:09.568310 4907 memory_manager.go:354] "RemoveStaleState removing state" podUID="a02ac037-379e-41f4-8a41-8844838e3ed1" containerName="route-controller-manager" Feb 26 15:49:09 crc kubenswrapper[4907]: I0226 15:49:09.568754 4907 util.go:30] "No sandbox for 
pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-cd59f947b-pjl5q" Feb 26 15:49:09 crc kubenswrapper[4907]: I0226 15:49:09.580111 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/a02ac037-379e-41f4-8a41-8844838e3ed1-client-ca\") pod \"a02ac037-379e-41f4-8a41-8844838e3ed1\" (UID: \"a02ac037-379e-41f4-8a41-8844838e3ed1\") " Feb 26 15:49:09 crc kubenswrapper[4907]: I0226 15:49:09.580221 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4m69r\" (UniqueName: \"kubernetes.io/projected/a02ac037-379e-41f4-8a41-8844838e3ed1-kube-api-access-4m69r\") pod \"a02ac037-379e-41f4-8a41-8844838e3ed1\" (UID: \"a02ac037-379e-41f4-8a41-8844838e3ed1\") " Feb 26 15:49:09 crc kubenswrapper[4907]: I0226 15:49:09.580292 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/a02ac037-379e-41f4-8a41-8844838e3ed1-serving-cert\") pod \"a02ac037-379e-41f4-8a41-8844838e3ed1\" (UID: \"a02ac037-379e-41f4-8a41-8844838e3ed1\") " Feb 26 15:49:09 crc kubenswrapper[4907]: I0226 15:49:09.580325 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a02ac037-379e-41f4-8a41-8844838e3ed1-config\") pod \"a02ac037-379e-41f4-8a41-8844838e3ed1\" (UID: \"a02ac037-379e-41f4-8a41-8844838e3ed1\") " Feb 26 15:49:09 crc kubenswrapper[4907]: I0226 15:49:09.580457 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/aa37ec9d-a846-4315-91cb-58af7a2f9cbe-serving-cert\") pod \"route-controller-manager-cd59f947b-pjl5q\" (UID: \"aa37ec9d-a846-4315-91cb-58af7a2f9cbe\") " pod="openshift-route-controller-manager/route-controller-manager-cd59f947b-pjl5q" Feb 26 15:49:09 crc 
kubenswrapper[4907]: I0226 15:49:09.580494 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/aa37ec9d-a846-4315-91cb-58af7a2f9cbe-client-ca\") pod \"route-controller-manager-cd59f947b-pjl5q\" (UID: \"aa37ec9d-a846-4315-91cb-58af7a2f9cbe\") " pod="openshift-route-controller-manager/route-controller-manager-cd59f947b-pjl5q" Feb 26 15:49:09 crc kubenswrapper[4907]: I0226 15:49:09.580568 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/aa37ec9d-a846-4315-91cb-58af7a2f9cbe-config\") pod \"route-controller-manager-cd59f947b-pjl5q\" (UID: \"aa37ec9d-a846-4315-91cb-58af7a2f9cbe\") " pod="openshift-route-controller-manager/route-controller-manager-cd59f947b-pjl5q" Feb 26 15:49:09 crc kubenswrapper[4907]: I0226 15:49:09.580895 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a02ac037-379e-41f4-8a41-8844838e3ed1-client-ca" (OuterVolumeSpecName: "client-ca") pod "a02ac037-379e-41f4-8a41-8844838e3ed1" (UID: "a02ac037-379e-41f4-8a41-8844838e3ed1"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 26 15:49:09 crc kubenswrapper[4907]: I0226 15:49:09.581032 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a02ac037-379e-41f4-8a41-8844838e3ed1-config" (OuterVolumeSpecName: "config") pod "a02ac037-379e-41f4-8a41-8844838e3ed1" (UID: "a02ac037-379e-41f4-8a41-8844838e3ed1"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 26 15:49:09 crc kubenswrapper[4907]: I0226 15:49:09.582173 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6nc72\" (UniqueName: \"kubernetes.io/projected/aa37ec9d-a846-4315-91cb-58af7a2f9cbe-kube-api-access-6nc72\") pod \"route-controller-manager-cd59f947b-pjl5q\" (UID: \"aa37ec9d-a846-4315-91cb-58af7a2f9cbe\") " pod="openshift-route-controller-manager/route-controller-manager-cd59f947b-pjl5q" Feb 26 15:49:09 crc kubenswrapper[4907]: I0226 15:49:09.582248 4907 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a02ac037-379e-41f4-8a41-8844838e3ed1-config\") on node \"crc\" DevicePath \"\"" Feb 26 15:49:09 crc kubenswrapper[4907]: I0226 15:49:09.582264 4907 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/a02ac037-379e-41f4-8a41-8844838e3ed1-client-ca\") on node \"crc\" DevicePath \"\"" Feb 26 15:49:09 crc kubenswrapper[4907]: I0226 15:49:09.586811 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a02ac037-379e-41f4-8a41-8844838e3ed1-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "a02ac037-379e-41f4-8a41-8844838e3ed1" (UID: "a02ac037-379e-41f4-8a41-8844838e3ed1"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 26 15:49:09 crc kubenswrapper[4907]: I0226 15:49:09.591196 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a02ac037-379e-41f4-8a41-8844838e3ed1-kube-api-access-4m69r" (OuterVolumeSpecName: "kube-api-access-4m69r") pod "a02ac037-379e-41f4-8a41-8844838e3ed1" (UID: "a02ac037-379e-41f4-8a41-8844838e3ed1"). InnerVolumeSpecName "kube-api-access-4m69r". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 26 15:49:09 crc kubenswrapper[4907]: I0226 15:49:09.595123 4907 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-cd59f947b-pjl5q"] Feb 26 15:49:09 crc kubenswrapper[4907]: I0226 15:49:09.642652 4907 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-f8f88d7c7-ph9cf" Feb 26 15:49:09 crc kubenswrapper[4907]: I0226 15:49:09.683071 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/cd4df426-ba25-4767-8dc0-33a8a8fb2fcc-serving-cert\") pod \"cd4df426-ba25-4767-8dc0-33a8a8fb2fcc\" (UID: \"cd4df426-ba25-4767-8dc0-33a8a8fb2fcc\") " Feb 26 15:49:09 crc kubenswrapper[4907]: I0226 15:49:09.683143 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/cd4df426-ba25-4767-8dc0-33a8a8fb2fcc-config\") pod \"cd4df426-ba25-4767-8dc0-33a8a8fb2fcc\" (UID: \"cd4df426-ba25-4767-8dc0-33a8a8fb2fcc\") " Feb 26 15:49:09 crc kubenswrapper[4907]: I0226 15:49:09.683204 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8psvp\" (UniqueName: \"kubernetes.io/projected/cd4df426-ba25-4767-8dc0-33a8a8fb2fcc-kube-api-access-8psvp\") pod \"cd4df426-ba25-4767-8dc0-33a8a8fb2fcc\" (UID: \"cd4df426-ba25-4767-8dc0-33a8a8fb2fcc\") " Feb 26 15:49:09 crc kubenswrapper[4907]: I0226 15:49:09.683221 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/cd4df426-ba25-4767-8dc0-33a8a8fb2fcc-proxy-ca-bundles\") pod \"cd4df426-ba25-4767-8dc0-33a8a8fb2fcc\" (UID: \"cd4df426-ba25-4767-8dc0-33a8a8fb2fcc\") " Feb 26 15:49:09 crc kubenswrapper[4907]: I0226 15:49:09.683237 4907 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/cd4df426-ba25-4767-8dc0-33a8a8fb2fcc-client-ca\") pod \"cd4df426-ba25-4767-8dc0-33a8a8fb2fcc\" (UID: \"cd4df426-ba25-4767-8dc0-33a8a8fb2fcc\") " Feb 26 15:49:09 crc kubenswrapper[4907]: I0226 15:49:09.683689 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/aa37ec9d-a846-4315-91cb-58af7a2f9cbe-config\") pod \"route-controller-manager-cd59f947b-pjl5q\" (UID: \"aa37ec9d-a846-4315-91cb-58af7a2f9cbe\") " pod="openshift-route-controller-manager/route-controller-manager-cd59f947b-pjl5q" Feb 26 15:49:09 crc kubenswrapper[4907]: I0226 15:49:09.683734 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6nc72\" (UniqueName: \"kubernetes.io/projected/aa37ec9d-a846-4315-91cb-58af7a2f9cbe-kube-api-access-6nc72\") pod \"route-controller-manager-cd59f947b-pjl5q\" (UID: \"aa37ec9d-a846-4315-91cb-58af7a2f9cbe\") " pod="openshift-route-controller-manager/route-controller-manager-cd59f947b-pjl5q" Feb 26 15:49:09 crc kubenswrapper[4907]: I0226 15:49:09.683763 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/aa37ec9d-a846-4315-91cb-58af7a2f9cbe-serving-cert\") pod \"route-controller-manager-cd59f947b-pjl5q\" (UID: \"aa37ec9d-a846-4315-91cb-58af7a2f9cbe\") " pod="openshift-route-controller-manager/route-controller-manager-cd59f947b-pjl5q" Feb 26 15:49:09 crc kubenswrapper[4907]: I0226 15:49:09.683837 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/aa37ec9d-a846-4315-91cb-58af7a2f9cbe-client-ca\") pod \"route-controller-manager-cd59f947b-pjl5q\" (UID: \"aa37ec9d-a846-4315-91cb-58af7a2f9cbe\") " pod="openshift-route-controller-manager/route-controller-manager-cd59f947b-pjl5q" Feb 26 15:49:09 
crc kubenswrapper[4907]: I0226 15:49:09.683894 4907 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4m69r\" (UniqueName: \"kubernetes.io/projected/a02ac037-379e-41f4-8a41-8844838e3ed1-kube-api-access-4m69r\") on node \"crc\" DevicePath \"\"" Feb 26 15:49:09 crc kubenswrapper[4907]: I0226 15:49:09.683923 4907 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/a02ac037-379e-41f4-8a41-8844838e3ed1-serving-cert\") on node \"crc\" DevicePath \"\"" Feb 26 15:49:09 crc kubenswrapper[4907]: I0226 15:49:09.684209 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/cd4df426-ba25-4767-8dc0-33a8a8fb2fcc-config" (OuterVolumeSpecName: "config") pod "cd4df426-ba25-4767-8dc0-33a8a8fb2fcc" (UID: "cd4df426-ba25-4767-8dc0-33a8a8fb2fcc"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 26 15:49:09 crc kubenswrapper[4907]: I0226 15:49:09.684822 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/aa37ec9d-a846-4315-91cb-58af7a2f9cbe-client-ca\") pod \"route-controller-manager-cd59f947b-pjl5q\" (UID: \"aa37ec9d-a846-4315-91cb-58af7a2f9cbe\") " pod="openshift-route-controller-manager/route-controller-manager-cd59f947b-pjl5q" Feb 26 15:49:09 crc kubenswrapper[4907]: I0226 15:49:09.685141 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/aa37ec9d-a846-4315-91cb-58af7a2f9cbe-config\") pod \"route-controller-manager-cd59f947b-pjl5q\" (UID: \"aa37ec9d-a846-4315-91cb-58af7a2f9cbe\") " pod="openshift-route-controller-manager/route-controller-manager-cd59f947b-pjl5q" Feb 26 15:49:09 crc kubenswrapper[4907]: I0226 15:49:09.686286 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/cd4df426-ba25-4767-8dc0-33a8a8fb2fcc-serving-cert" 
(OuterVolumeSpecName: "serving-cert") pod "cd4df426-ba25-4767-8dc0-33a8a8fb2fcc" (UID: "cd4df426-ba25-4767-8dc0-33a8a8fb2fcc"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 26 15:49:09 crc kubenswrapper[4907]: I0226 15:49:09.686866 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/cd4df426-ba25-4767-8dc0-33a8a8fb2fcc-kube-api-access-8psvp" (OuterVolumeSpecName: "kube-api-access-8psvp") pod "cd4df426-ba25-4767-8dc0-33a8a8fb2fcc" (UID: "cd4df426-ba25-4767-8dc0-33a8a8fb2fcc"). InnerVolumeSpecName "kube-api-access-8psvp". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 26 15:49:09 crc kubenswrapper[4907]: I0226 15:49:09.686939 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/cd4df426-ba25-4767-8dc0-33a8a8fb2fcc-proxy-ca-bundles" (OuterVolumeSpecName: "proxy-ca-bundles") pod "cd4df426-ba25-4767-8dc0-33a8a8fb2fcc" (UID: "cd4df426-ba25-4767-8dc0-33a8a8fb2fcc"). InnerVolumeSpecName "proxy-ca-bundles". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 26 15:49:09 crc kubenswrapper[4907]: I0226 15:49:09.687260 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/cd4df426-ba25-4767-8dc0-33a8a8fb2fcc-client-ca" (OuterVolumeSpecName: "client-ca") pod "cd4df426-ba25-4767-8dc0-33a8a8fb2fcc" (UID: "cd4df426-ba25-4767-8dc0-33a8a8fb2fcc"). InnerVolumeSpecName "client-ca". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 26 15:49:09 crc kubenswrapper[4907]: I0226 15:49:09.689260 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/aa37ec9d-a846-4315-91cb-58af7a2f9cbe-serving-cert\") pod \"route-controller-manager-cd59f947b-pjl5q\" (UID: \"aa37ec9d-a846-4315-91cb-58af7a2f9cbe\") " pod="openshift-route-controller-manager/route-controller-manager-cd59f947b-pjl5q" Feb 26 15:49:09 crc kubenswrapper[4907]: I0226 15:49:09.700245 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6nc72\" (UniqueName: \"kubernetes.io/projected/aa37ec9d-a846-4315-91cb-58af7a2f9cbe-kube-api-access-6nc72\") pod \"route-controller-manager-cd59f947b-pjl5q\" (UID: \"aa37ec9d-a846-4315-91cb-58af7a2f9cbe\") " pod="openshift-route-controller-manager/route-controller-manager-cd59f947b-pjl5q" Feb 26 15:49:09 crc kubenswrapper[4907]: I0226 15:49:09.785145 4907 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/cd4df426-ba25-4767-8dc0-33a8a8fb2fcc-config\") on node \"crc\" DevicePath \"\"" Feb 26 15:49:09 crc kubenswrapper[4907]: I0226 15:49:09.785184 4907 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8psvp\" (UniqueName: \"kubernetes.io/projected/cd4df426-ba25-4767-8dc0-33a8a8fb2fcc-kube-api-access-8psvp\") on node \"crc\" DevicePath \"\"" Feb 26 15:49:09 crc kubenswrapper[4907]: I0226 15:49:09.785198 4907 reconciler_common.go:293] "Volume detached for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/cd4df426-ba25-4767-8dc0-33a8a8fb2fcc-proxy-ca-bundles\") on node \"crc\" DevicePath \"\"" Feb 26 15:49:09 crc kubenswrapper[4907]: I0226 15:49:09.785208 4907 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/cd4df426-ba25-4767-8dc0-33a8a8fb2fcc-client-ca\") on node \"crc\" DevicePath \"\"" Feb 26 
15:49:09 crc kubenswrapper[4907]: I0226 15:49:09.785220 4907 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/cd4df426-ba25-4767-8dc0-33a8a8fb2fcc-serving-cert\") on node \"crc\" DevicePath \"\"" Feb 26 15:49:09 crc kubenswrapper[4907]: I0226 15:49:09.940149 4907 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-cd59f947b-pjl5q" Feb 26 15:49:10 crc kubenswrapper[4907]: I0226 15:49:10.135480 4907 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-64dd5fdbcf-5lh25" Feb 26 15:49:10 crc kubenswrapper[4907]: I0226 15:49:10.137876 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-64dd5fdbcf-5lh25" event={"ID":"a02ac037-379e-41f4-8a41-8844838e3ed1","Type":"ContainerDied","Data":"b897d87ffd834dd87fa3e4044267647603d96df32b21e10545d986274983fa1e"} Feb 26 15:49:10 crc kubenswrapper[4907]: I0226 15:49:10.137928 4907 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-cd59f947b-pjl5q"] Feb 26 15:49:10 crc kubenswrapper[4907]: I0226 15:49:10.137957 4907 scope.go:117] "RemoveContainer" containerID="13435db368c64d96cf18b3e5359954f09a70ec3e9f189340b2a6dcdfb875d4e3" Feb 26 15:49:10 crc kubenswrapper[4907]: I0226 15:49:10.139037 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-f8f88d7c7-ph9cf" event={"ID":"cd4df426-ba25-4767-8dc0-33a8a8fb2fcc","Type":"ContainerDied","Data":"cd15693d48ea20dd9e85e317bd42009e2362e2ceeeeb8947e6bac39060de60eb"} Feb 26 15:49:10 crc kubenswrapper[4907]: I0226 15:49:10.139107 4907 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-controller-manager/controller-manager-f8f88d7c7-ph9cf" Feb 26 15:49:10 crc kubenswrapper[4907]: W0226 15:49:10.152066 4907 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podaa37ec9d_a846_4315_91cb_58af7a2f9cbe.slice/crio-aab7d4ab04262aae1c283c95f4b37162d324ca2c41aeef1f6f149c57eac02ed1 WatchSource:0}: Error finding container aab7d4ab04262aae1c283c95f4b37162d324ca2c41aeef1f6f149c57eac02ed1: Status 404 returned error can't find the container with id aab7d4ab04262aae1c283c95f4b37162d324ca2c41aeef1f6f149c57eac02ed1 Feb 26 15:49:10 crc kubenswrapper[4907]: I0226 15:49:10.162787 4907 scope.go:117] "RemoveContainer" containerID="4ebed317d27199734564e1793f61ca41451666f285685435f615f4f2c755d683" Feb 26 15:49:10 crc kubenswrapper[4907]: I0226 15:49:10.185492 4907 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-f8f88d7c7-ph9cf"] Feb 26 15:49:10 crc kubenswrapper[4907]: I0226 15:49:10.198925 4907 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-controller-manager/controller-manager-f8f88d7c7-ph9cf"] Feb 26 15:49:10 crc kubenswrapper[4907]: I0226 15:49:10.210313 4907 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-64dd5fdbcf-5lh25"] Feb 26 15:49:10 crc kubenswrapper[4907]: I0226 15:49:10.214514 4907 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-64dd5fdbcf-5lh25"] Feb 26 15:49:11 crc kubenswrapper[4907]: I0226 15:49:11.147251 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-cd59f947b-pjl5q" event={"ID":"aa37ec9d-a846-4315-91cb-58af7a2f9cbe","Type":"ContainerStarted","Data":"4ec1cba8db3899b60cac09d675cd653e23eb7a64c507794a354f97c76f844b7c"} Feb 26 15:49:11 crc kubenswrapper[4907]: I0226 
15:49:11.147525 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-cd59f947b-pjl5q" event={"ID":"aa37ec9d-a846-4315-91cb-58af7a2f9cbe","Type":"ContainerStarted","Data":"aab7d4ab04262aae1c283c95f4b37162d324ca2c41aeef1f6f149c57eac02ed1"} Feb 26 15:49:11 crc kubenswrapper[4907]: I0226 15:49:11.148684 4907 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-route-controller-manager/route-controller-manager-cd59f947b-pjl5q" Feb 26 15:49:11 crc kubenswrapper[4907]: I0226 15:49:11.154380 4907 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-route-controller-manager/route-controller-manager-cd59f947b-pjl5q" Feb 26 15:49:11 crc kubenswrapper[4907]: I0226 15:49:11.162020 4907 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-route-controller-manager/route-controller-manager-cd59f947b-pjl5q" podStartSLOduration=3.162006071 podStartE2EDuration="3.162006071s" podCreationTimestamp="2026-02-26 15:49:08 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-26 15:49:11.160218327 +0000 UTC m=+413.678780186" watchObservedRunningTime="2026-02-26 15:49:11.162006071 +0000 UTC m=+413.680567930" Feb 26 15:49:11 crc kubenswrapper[4907]: I0226 15:49:11.927967 4907 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-controller-manager/controller-manager-5f55469458-8cx2k"] Feb 26 15:49:11 crc kubenswrapper[4907]: E0226 15:49:11.928224 4907 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="cd4df426-ba25-4767-8dc0-33a8a8fb2fcc" containerName="controller-manager" Feb 26 15:49:11 crc kubenswrapper[4907]: I0226 15:49:11.928240 4907 state_mem.go:107] "Deleted CPUSet assignment" podUID="cd4df426-ba25-4767-8dc0-33a8a8fb2fcc" containerName="controller-manager" Feb 26 15:49:11 crc kubenswrapper[4907]: I0226 15:49:11.928359 
4907 memory_manager.go:354] "RemoveStaleState removing state" podUID="cd4df426-ba25-4767-8dc0-33a8a8fb2fcc" containerName="controller-manager" Feb 26 15:49:11 crc kubenswrapper[4907]: I0226 15:49:11.928852 4907 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-5f55469458-8cx2k" Feb 26 15:49:11 crc kubenswrapper[4907]: I0226 15:49:11.932466 4907 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-service-ca.crt" Feb 26 15:49:11 crc kubenswrapper[4907]: I0226 15:49:11.932746 4907 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"openshift-controller-manager-sa-dockercfg-msq4c" Feb 26 15:49:11 crc kubenswrapper[4907]: I0226 15:49:11.936844 4907 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"config" Feb 26 15:49:11 crc kubenswrapper[4907]: I0226 15:49:11.937107 4907 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"kube-root-ca.crt" Feb 26 15:49:11 crc kubenswrapper[4907]: I0226 15:49:11.938666 4907 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"client-ca" Feb 26 15:49:11 crc kubenswrapper[4907]: I0226 15:49:11.941784 4907 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"serving-cert" Feb 26 15:49:11 crc kubenswrapper[4907]: I0226 15:49:11.945040 4907 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-global-ca" Feb 26 15:49:11 crc kubenswrapper[4907]: I0226 15:49:11.951549 4907 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-5f55469458-8cx2k"] Feb 26 15:49:12 crc kubenswrapper[4907]: I0226 15:49:12.118070 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for 
volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/f34ac8f5-7487-4011-83d7-975b26c7a363-client-ca\") pod \"controller-manager-5f55469458-8cx2k\" (UID: \"f34ac8f5-7487-4011-83d7-975b26c7a363\") " pod="openshift-controller-manager/controller-manager-5f55469458-8cx2k" Feb 26 15:49:12 crc kubenswrapper[4907]: I0226 15:49:12.118157 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/f34ac8f5-7487-4011-83d7-975b26c7a363-proxy-ca-bundles\") pod \"controller-manager-5f55469458-8cx2k\" (UID: \"f34ac8f5-7487-4011-83d7-975b26c7a363\") " pod="openshift-controller-manager/controller-manager-5f55469458-8cx2k" Feb 26 15:49:12 crc kubenswrapper[4907]: I0226 15:49:12.118217 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fwqwn\" (UniqueName: \"kubernetes.io/projected/f34ac8f5-7487-4011-83d7-975b26c7a363-kube-api-access-fwqwn\") pod \"controller-manager-5f55469458-8cx2k\" (UID: \"f34ac8f5-7487-4011-83d7-975b26c7a363\") " pod="openshift-controller-manager/controller-manager-5f55469458-8cx2k" Feb 26 15:49:12 crc kubenswrapper[4907]: I0226 15:49:12.118281 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/f34ac8f5-7487-4011-83d7-975b26c7a363-serving-cert\") pod \"controller-manager-5f55469458-8cx2k\" (UID: \"f34ac8f5-7487-4011-83d7-975b26c7a363\") " pod="openshift-controller-manager/controller-manager-5f55469458-8cx2k" Feb 26 15:49:12 crc kubenswrapper[4907]: I0226 15:49:12.118320 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f34ac8f5-7487-4011-83d7-975b26c7a363-config\") pod \"controller-manager-5f55469458-8cx2k\" (UID: \"f34ac8f5-7487-4011-83d7-975b26c7a363\") " 
pod="openshift-controller-manager/controller-manager-5f55469458-8cx2k" Feb 26 15:49:12 crc kubenswrapper[4907]: I0226 15:49:12.134106 4907 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a02ac037-379e-41f4-8a41-8844838e3ed1" path="/var/lib/kubelet/pods/a02ac037-379e-41f4-8a41-8844838e3ed1/volumes" Feb 26 15:49:12 crc kubenswrapper[4907]: I0226 15:49:12.134614 4907 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="cd4df426-ba25-4767-8dc0-33a8a8fb2fcc" path="/var/lib/kubelet/pods/cd4df426-ba25-4767-8dc0-33a8a8fb2fcc/volumes" Feb 26 15:49:12 crc kubenswrapper[4907]: I0226 15:49:12.220145 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fwqwn\" (UniqueName: \"kubernetes.io/projected/f34ac8f5-7487-4011-83d7-975b26c7a363-kube-api-access-fwqwn\") pod \"controller-manager-5f55469458-8cx2k\" (UID: \"f34ac8f5-7487-4011-83d7-975b26c7a363\") " pod="openshift-controller-manager/controller-manager-5f55469458-8cx2k" Feb 26 15:49:12 crc kubenswrapper[4907]: I0226 15:49:12.220217 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/f34ac8f5-7487-4011-83d7-975b26c7a363-serving-cert\") pod \"controller-manager-5f55469458-8cx2k\" (UID: \"f34ac8f5-7487-4011-83d7-975b26c7a363\") " pod="openshift-controller-manager/controller-manager-5f55469458-8cx2k" Feb 26 15:49:12 crc kubenswrapper[4907]: I0226 15:49:12.220241 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f34ac8f5-7487-4011-83d7-975b26c7a363-config\") pod \"controller-manager-5f55469458-8cx2k\" (UID: \"f34ac8f5-7487-4011-83d7-975b26c7a363\") " pod="openshift-controller-manager/controller-manager-5f55469458-8cx2k" Feb 26 15:49:12 crc kubenswrapper[4907]: I0226 15:49:12.220344 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: 
\"kubernetes.io/configmap/f34ac8f5-7487-4011-83d7-975b26c7a363-client-ca\") pod \"controller-manager-5f55469458-8cx2k\" (UID: \"f34ac8f5-7487-4011-83d7-975b26c7a363\") " pod="openshift-controller-manager/controller-manager-5f55469458-8cx2k" Feb 26 15:49:12 crc kubenswrapper[4907]: I0226 15:49:12.220367 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/f34ac8f5-7487-4011-83d7-975b26c7a363-proxy-ca-bundles\") pod \"controller-manager-5f55469458-8cx2k\" (UID: \"f34ac8f5-7487-4011-83d7-975b26c7a363\") " pod="openshift-controller-manager/controller-manager-5f55469458-8cx2k" Feb 26 15:49:12 crc kubenswrapper[4907]: I0226 15:49:12.221944 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/f34ac8f5-7487-4011-83d7-975b26c7a363-proxy-ca-bundles\") pod \"controller-manager-5f55469458-8cx2k\" (UID: \"f34ac8f5-7487-4011-83d7-975b26c7a363\") " pod="openshift-controller-manager/controller-manager-5f55469458-8cx2k" Feb 26 15:49:12 crc kubenswrapper[4907]: I0226 15:49:12.222011 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/f34ac8f5-7487-4011-83d7-975b26c7a363-client-ca\") pod \"controller-manager-5f55469458-8cx2k\" (UID: \"f34ac8f5-7487-4011-83d7-975b26c7a363\") " pod="openshift-controller-manager/controller-manager-5f55469458-8cx2k" Feb 26 15:49:12 crc kubenswrapper[4907]: I0226 15:49:12.223145 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f34ac8f5-7487-4011-83d7-975b26c7a363-config\") pod \"controller-manager-5f55469458-8cx2k\" (UID: \"f34ac8f5-7487-4011-83d7-975b26c7a363\") " pod="openshift-controller-manager/controller-manager-5f55469458-8cx2k" Feb 26 15:49:12 crc kubenswrapper[4907]: I0226 15:49:12.229683 4907 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/f34ac8f5-7487-4011-83d7-975b26c7a363-serving-cert\") pod \"controller-manager-5f55469458-8cx2k\" (UID: \"f34ac8f5-7487-4011-83d7-975b26c7a363\") " pod="openshift-controller-manager/controller-manager-5f55469458-8cx2k" Feb 26 15:49:12 crc kubenswrapper[4907]: I0226 15:49:12.238574 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fwqwn\" (UniqueName: \"kubernetes.io/projected/f34ac8f5-7487-4011-83d7-975b26c7a363-kube-api-access-fwqwn\") pod \"controller-manager-5f55469458-8cx2k\" (UID: \"f34ac8f5-7487-4011-83d7-975b26c7a363\") " pod="openshift-controller-manager/controller-manager-5f55469458-8cx2k" Feb 26 15:49:12 crc kubenswrapper[4907]: I0226 15:49:12.262515 4907 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-5f55469458-8cx2k" Feb 26 15:49:12 crc kubenswrapper[4907]: I0226 15:49:12.784993 4907 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-5f55469458-8cx2k"] Feb 26 15:49:13 crc kubenswrapper[4907]: I0226 15:49:13.190232 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-5f55469458-8cx2k" event={"ID":"f34ac8f5-7487-4011-83d7-975b26c7a363","Type":"ContainerStarted","Data":"e28d9d10a276db04b1dd4bdee9e07b9933a84daf302d5529fb0e004d2eac1de1"} Feb 26 15:49:13 crc kubenswrapper[4907]: I0226 15:49:13.190486 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-5f55469458-8cx2k" event={"ID":"f34ac8f5-7487-4011-83d7-975b26c7a363","Type":"ContainerStarted","Data":"649c2ecb996ac1474778666f5afccb8eb957378e30e942a4051ffa1f14904c54"} Feb 26 15:49:13 crc kubenswrapper[4907]: I0226 15:49:13.190720 4907 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" 
pod="openshift-controller-manager/controller-manager-5f55469458-8cx2k" Feb 26 15:49:13 crc kubenswrapper[4907]: I0226 15:49:13.217485 4907 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-controller-manager/controller-manager-5f55469458-8cx2k" podStartSLOduration=5.217469387 podStartE2EDuration="5.217469387s" podCreationTimestamp="2026-02-26 15:49:08 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-26 15:49:13.214563394 +0000 UTC m=+415.733125243" watchObservedRunningTime="2026-02-26 15:49:13.217469387 +0000 UTC m=+415.736031236" Feb 26 15:49:13 crc kubenswrapper[4907]: I0226 15:49:13.224049 4907 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-controller-manager/controller-manager-5f55469458-8cx2k" Feb 26 15:49:48 crc kubenswrapper[4907]: I0226 15:49:48.530113 4907 patch_prober.go:28] interesting pod/machine-config-daemon-v5ng6 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 26 15:49:48 crc kubenswrapper[4907]: I0226 15:49:48.530726 4907 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-v5ng6" podUID="917eebf3-db36-47b8-af0a-b80d042fddab" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 26 15:49:56 crc kubenswrapper[4907]: I0226 15:49:56.307374 4907 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-2v8kx"] Feb 26 15:49:56 crc kubenswrapper[4907]: I0226 15:49:56.308153 4907 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-2v8kx" 
podUID="763dfaad-6b70-4ea8-a5ba-b4729dd1dcf2" containerName="registry-server" containerID="cri-o://98b5dc93a44069ead28dd6acf75ccfb89db095f25683cbe5525ec8594c89d9f7" gracePeriod=2 Feb 26 15:49:56 crc kubenswrapper[4907]: I0226 15:49:56.478244 4907 generic.go:334] "Generic (PLEG): container finished" podID="763dfaad-6b70-4ea8-a5ba-b4729dd1dcf2" containerID="98b5dc93a44069ead28dd6acf75ccfb89db095f25683cbe5525ec8594c89d9f7" exitCode=0 Feb 26 15:49:56 crc kubenswrapper[4907]: I0226 15:49:56.478308 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-2v8kx" event={"ID":"763dfaad-6b70-4ea8-a5ba-b4729dd1dcf2","Type":"ContainerDied","Data":"98b5dc93a44069ead28dd6acf75ccfb89db095f25683cbe5525ec8594c89d9f7"} Feb 26 15:49:56 crc kubenswrapper[4907]: I0226 15:49:56.764606 4907 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-2v8kx" Feb 26 15:49:56 crc kubenswrapper[4907]: I0226 15:49:56.843567 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/763dfaad-6b70-4ea8-a5ba-b4729dd1dcf2-utilities\") pod \"763dfaad-6b70-4ea8-a5ba-b4729dd1dcf2\" (UID: \"763dfaad-6b70-4ea8-a5ba-b4729dd1dcf2\") " Feb 26 15:49:56 crc kubenswrapper[4907]: I0226 15:49:56.843651 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rqtzd\" (UniqueName: \"kubernetes.io/projected/763dfaad-6b70-4ea8-a5ba-b4729dd1dcf2-kube-api-access-rqtzd\") pod \"763dfaad-6b70-4ea8-a5ba-b4729dd1dcf2\" (UID: \"763dfaad-6b70-4ea8-a5ba-b4729dd1dcf2\") " Feb 26 15:49:56 crc kubenswrapper[4907]: I0226 15:49:56.843704 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/763dfaad-6b70-4ea8-a5ba-b4729dd1dcf2-catalog-content\") pod \"763dfaad-6b70-4ea8-a5ba-b4729dd1dcf2\" (UID: 
\"763dfaad-6b70-4ea8-a5ba-b4729dd1dcf2\") " Feb 26 15:49:56 crc kubenswrapper[4907]: I0226 15:49:56.846053 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/763dfaad-6b70-4ea8-a5ba-b4729dd1dcf2-utilities" (OuterVolumeSpecName: "utilities") pod "763dfaad-6b70-4ea8-a5ba-b4729dd1dcf2" (UID: "763dfaad-6b70-4ea8-a5ba-b4729dd1dcf2"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 26 15:49:56 crc kubenswrapper[4907]: I0226 15:49:56.850901 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/763dfaad-6b70-4ea8-a5ba-b4729dd1dcf2-kube-api-access-rqtzd" (OuterVolumeSpecName: "kube-api-access-rqtzd") pod "763dfaad-6b70-4ea8-a5ba-b4729dd1dcf2" (UID: "763dfaad-6b70-4ea8-a5ba-b4729dd1dcf2"). InnerVolumeSpecName "kube-api-access-rqtzd". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 26 15:49:56 crc kubenswrapper[4907]: I0226 15:49:56.903554 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/763dfaad-6b70-4ea8-a5ba-b4729dd1dcf2-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "763dfaad-6b70-4ea8-a5ba-b4729dd1dcf2" (UID: "763dfaad-6b70-4ea8-a5ba-b4729dd1dcf2"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 26 15:49:56 crc kubenswrapper[4907]: I0226 15:49:56.946540 4907 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/763dfaad-6b70-4ea8-a5ba-b4729dd1dcf2-utilities\") on node \"crc\" DevicePath \"\"" Feb 26 15:49:56 crc kubenswrapper[4907]: I0226 15:49:56.946573 4907 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rqtzd\" (UniqueName: \"kubernetes.io/projected/763dfaad-6b70-4ea8-a5ba-b4729dd1dcf2-kube-api-access-rqtzd\") on node \"crc\" DevicePath \"\"" Feb 26 15:49:56 crc kubenswrapper[4907]: I0226 15:49:56.946584 4907 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/763dfaad-6b70-4ea8-a5ba-b4729dd1dcf2-catalog-content\") on node \"crc\" DevicePath \"\"" Feb 26 15:49:57 crc kubenswrapper[4907]: I0226 15:49:57.487910 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-2v8kx" event={"ID":"763dfaad-6b70-4ea8-a5ba-b4729dd1dcf2","Type":"ContainerDied","Data":"9a14fdd18c8fdf23b4449d75fda4a6de3abba541668a78d14925046e7c3a39a7"} Feb 26 15:49:57 crc kubenswrapper[4907]: I0226 15:49:57.488682 4907 scope.go:117] "RemoveContainer" containerID="98b5dc93a44069ead28dd6acf75ccfb89db095f25683cbe5525ec8594c89d9f7" Feb 26 15:49:57 crc kubenswrapper[4907]: I0226 15:49:57.488856 4907 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-2v8kx" Feb 26 15:49:57 crc kubenswrapper[4907]: I0226 15:49:57.528982 4907 scope.go:117] "RemoveContainer" containerID="820d19843e9f8b3dde2255ec1f62105710e7b3e79ad020bae31022182cfd8324" Feb 26 15:49:57 crc kubenswrapper[4907]: I0226 15:49:57.554268 4907 scope.go:117] "RemoveContainer" containerID="819c58d2ad5fa6ba3642ee404f839db9d4e3eb5e6adb6439fa9d1995cac97649" Feb 26 15:49:57 crc kubenswrapper[4907]: I0226 15:49:57.555617 4907 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-2v8kx"] Feb 26 15:49:57 crc kubenswrapper[4907]: I0226 15:49:57.559844 4907 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-2v8kx"] Feb 26 15:49:58 crc kubenswrapper[4907]: I0226 15:49:58.140107 4907 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="763dfaad-6b70-4ea8-a5ba-b4729dd1dcf2" path="/var/lib/kubelet/pods/763dfaad-6b70-4ea8-a5ba-b4729dd1dcf2/volumes" Feb 26 15:50:00 crc kubenswrapper[4907]: I0226 15:50:00.145005 4907 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29535350-924dl"] Feb 26 15:50:00 crc kubenswrapper[4907]: E0226 15:50:00.145468 4907 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="763dfaad-6b70-4ea8-a5ba-b4729dd1dcf2" containerName="registry-server" Feb 26 15:50:00 crc kubenswrapper[4907]: I0226 15:50:00.145480 4907 state_mem.go:107] "Deleted CPUSet assignment" podUID="763dfaad-6b70-4ea8-a5ba-b4729dd1dcf2" containerName="registry-server" Feb 26 15:50:00 crc kubenswrapper[4907]: E0226 15:50:00.145489 4907 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="763dfaad-6b70-4ea8-a5ba-b4729dd1dcf2" containerName="extract-content" Feb 26 15:50:00 crc kubenswrapper[4907]: I0226 15:50:00.145495 4907 state_mem.go:107] "Deleted CPUSet assignment" podUID="763dfaad-6b70-4ea8-a5ba-b4729dd1dcf2" containerName="extract-content" 
Feb 26 15:50:00 crc kubenswrapper[4907]: E0226 15:50:00.145512 4907 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="763dfaad-6b70-4ea8-a5ba-b4729dd1dcf2" containerName="extract-utilities" Feb 26 15:50:00 crc kubenswrapper[4907]: I0226 15:50:00.145518 4907 state_mem.go:107] "Deleted CPUSet assignment" podUID="763dfaad-6b70-4ea8-a5ba-b4729dd1dcf2" containerName="extract-utilities" Feb 26 15:50:00 crc kubenswrapper[4907]: I0226 15:50:00.145626 4907 memory_manager.go:354] "RemoveStaleState removing state" podUID="763dfaad-6b70-4ea8-a5ba-b4729dd1dcf2" containerName="registry-server" Feb 26 15:50:00 crc kubenswrapper[4907]: I0226 15:50:00.145946 4907 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29535350-924dl" Feb 26 15:50:00 crc kubenswrapper[4907]: I0226 15:50:00.149123 4907 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-n2mrp" Feb 26 15:50:00 crc kubenswrapper[4907]: I0226 15:50:00.149197 4907 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Feb 26 15:50:00 crc kubenswrapper[4907]: I0226 15:50:00.150177 4907 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Feb 26 15:50:00 crc kubenswrapper[4907]: I0226 15:50:00.158482 4907 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29535350-924dl"] Feb 26 15:50:00 crc kubenswrapper[4907]: I0226 15:50:00.190495 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-m52vt\" (UniqueName: \"kubernetes.io/projected/20256617-55f3-4228-8200-bd57793ff553-kube-api-access-m52vt\") pod \"auto-csr-approver-29535350-924dl\" (UID: \"20256617-55f3-4228-8200-bd57793ff553\") " pod="openshift-infra/auto-csr-approver-29535350-924dl" Feb 26 15:50:00 crc kubenswrapper[4907]: I0226 
15:50:00.292202 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-m52vt\" (UniqueName: \"kubernetes.io/projected/20256617-55f3-4228-8200-bd57793ff553-kube-api-access-m52vt\") pod \"auto-csr-approver-29535350-924dl\" (UID: \"20256617-55f3-4228-8200-bd57793ff553\") " pod="openshift-infra/auto-csr-approver-29535350-924dl" Feb 26 15:50:00 crc kubenswrapper[4907]: I0226 15:50:00.318285 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-m52vt\" (UniqueName: \"kubernetes.io/projected/20256617-55f3-4228-8200-bd57793ff553-kube-api-access-m52vt\") pod \"auto-csr-approver-29535350-924dl\" (UID: \"20256617-55f3-4228-8200-bd57793ff553\") " pod="openshift-infra/auto-csr-approver-29535350-924dl" Feb 26 15:50:00 crc kubenswrapper[4907]: I0226 15:50:00.460412 4907 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29535350-924dl" Feb 26 15:50:00 crc kubenswrapper[4907]: I0226 15:50:00.941329 4907 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29535350-924dl"] Feb 26 15:50:01 crc kubenswrapper[4907]: I0226 15:50:01.513693 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29535350-924dl" event={"ID":"20256617-55f3-4228-8200-bd57793ff553","Type":"ContainerStarted","Data":"10ef4c0bc8244c713cd7b4c2df74fe63ffe71d1c8f4602e4afc927f80e86cef3"} Feb 26 15:50:02 crc kubenswrapper[4907]: I0226 15:50:02.520651 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29535350-924dl" event={"ID":"20256617-55f3-4228-8200-bd57793ff553","Type":"ContainerStarted","Data":"cf3f3c1e48222cd514f503b65cc470d33e7c3a75035ab2d9d3af36adfe22dc4a"} Feb 26 15:50:03 crc kubenswrapper[4907]: I0226 15:50:03.527732 4907 generic.go:334] "Generic (PLEG): container finished" podID="20256617-55f3-4228-8200-bd57793ff553" 
containerID="cf3f3c1e48222cd514f503b65cc470d33e7c3a75035ab2d9d3af36adfe22dc4a" exitCode=0 Feb 26 15:50:03 crc kubenswrapper[4907]: I0226 15:50:03.527781 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29535350-924dl" event={"ID":"20256617-55f3-4228-8200-bd57793ff553","Type":"ContainerDied","Data":"cf3f3c1e48222cd514f503b65cc470d33e7c3a75035ab2d9d3af36adfe22dc4a"} Feb 26 15:50:04 crc kubenswrapper[4907]: I0226 15:50:04.821977 4907 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29535350-924dl" Feb 26 15:50:04 crc kubenswrapper[4907]: I0226 15:50:04.948284 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-m52vt\" (UniqueName: \"kubernetes.io/projected/20256617-55f3-4228-8200-bd57793ff553-kube-api-access-m52vt\") pod \"20256617-55f3-4228-8200-bd57793ff553\" (UID: \"20256617-55f3-4228-8200-bd57793ff553\") " Feb 26 15:50:04 crc kubenswrapper[4907]: I0226 15:50:04.954881 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/20256617-55f3-4228-8200-bd57793ff553-kube-api-access-m52vt" (OuterVolumeSpecName: "kube-api-access-m52vt") pod "20256617-55f3-4228-8200-bd57793ff553" (UID: "20256617-55f3-4228-8200-bd57793ff553"). InnerVolumeSpecName "kube-api-access-m52vt". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 26 15:50:05 crc kubenswrapper[4907]: I0226 15:50:05.049677 4907 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-m52vt\" (UniqueName: \"kubernetes.io/projected/20256617-55f3-4228-8200-bd57793ff553-kube-api-access-m52vt\") on node \"crc\" DevicePath \"\"" Feb 26 15:50:05 crc kubenswrapper[4907]: I0226 15:50:05.541496 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29535350-924dl" event={"ID":"20256617-55f3-4228-8200-bd57793ff553","Type":"ContainerDied","Data":"10ef4c0bc8244c713cd7b4c2df74fe63ffe71d1c8f4602e4afc927f80e86cef3"} Feb 26 15:50:05 crc kubenswrapper[4907]: I0226 15:50:05.541532 4907 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="10ef4c0bc8244c713cd7b4c2df74fe63ffe71d1c8f4602e4afc927f80e86cef3" Feb 26 15:50:05 crc kubenswrapper[4907]: I0226 15:50:05.541556 4907 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29535350-924dl" Feb 26 15:50:05 crc kubenswrapper[4907]: I0226 15:50:05.598352 4907 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29535344-fsndq"] Feb 26 15:50:05 crc kubenswrapper[4907]: I0226 15:50:05.601231 4907 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29535344-fsndq"] Feb 26 15:50:06 crc kubenswrapper[4907]: I0226 15:50:06.133939 4907 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1b0532e1-9350-435d-bb1f-72bb0931a2e8" path="/var/lib/kubelet/pods/1b0532e1-9350-435d-bb1f-72bb0931a2e8/volumes" Feb 26 15:50:08 crc kubenswrapper[4907]: I0226 15:50:08.392997 4907 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-5f55469458-8cx2k"] Feb 26 15:50:08 crc kubenswrapper[4907]: I0226 15:50:08.393683 4907 kuberuntime_container.go:808] "Killing container with a grace 
period" pod="openshift-controller-manager/controller-manager-5f55469458-8cx2k" podUID="f34ac8f5-7487-4011-83d7-975b26c7a363" containerName="controller-manager" containerID="cri-o://e28d9d10a276db04b1dd4bdee9e07b9933a84daf302d5529fb0e004d2eac1de1" gracePeriod=30 Feb 26 15:50:08 crc kubenswrapper[4907]: I0226 15:50:08.427470 4907 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-cd59f947b-pjl5q"] Feb 26 15:50:08 crc kubenswrapper[4907]: I0226 15:50:08.432072 4907 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-route-controller-manager/route-controller-manager-cd59f947b-pjl5q" podUID="aa37ec9d-a846-4315-91cb-58af7a2f9cbe" containerName="route-controller-manager" containerID="cri-o://4ec1cba8db3899b60cac09d675cd653e23eb7a64c507794a354f97c76f844b7c" gracePeriod=30 Feb 26 15:50:08 crc kubenswrapper[4907]: I0226 15:50:08.560705 4907 generic.go:334] "Generic (PLEG): container finished" podID="f34ac8f5-7487-4011-83d7-975b26c7a363" containerID="e28d9d10a276db04b1dd4bdee9e07b9933a84daf302d5529fb0e004d2eac1de1" exitCode=0 Feb 26 15:50:08 crc kubenswrapper[4907]: I0226 15:50:08.560779 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-5f55469458-8cx2k" event={"ID":"f34ac8f5-7487-4011-83d7-975b26c7a363","Type":"ContainerDied","Data":"e28d9d10a276db04b1dd4bdee9e07b9933a84daf302d5529fb0e004d2eac1de1"} Feb 26 15:50:08 crc kubenswrapper[4907]: I0226 15:50:08.569134 4907 generic.go:334] "Generic (PLEG): container finished" podID="aa37ec9d-a846-4315-91cb-58af7a2f9cbe" containerID="4ec1cba8db3899b60cac09d675cd653e23eb7a64c507794a354f97c76f844b7c" exitCode=0 Feb 26 15:50:08 crc kubenswrapper[4907]: I0226 15:50:08.569193 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-cd59f947b-pjl5q" 
event={"ID":"aa37ec9d-a846-4315-91cb-58af7a2f9cbe","Type":"ContainerDied","Data":"4ec1cba8db3899b60cac09d675cd653e23eb7a64c507794a354f97c76f844b7c"} Feb 26 15:50:09 crc kubenswrapper[4907]: I0226 15:50:08.858659 4907 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-cd59f947b-pjl5q" Feb 26 15:50:09 crc kubenswrapper[4907]: I0226 15:50:08.864875 4907 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-5f55469458-8cx2k" Feb 26 15:50:09 crc kubenswrapper[4907]: I0226 15:50:09.002873 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/f34ac8f5-7487-4011-83d7-975b26c7a363-serving-cert\") pod \"f34ac8f5-7487-4011-83d7-975b26c7a363\" (UID: \"f34ac8f5-7487-4011-83d7-975b26c7a363\") " Feb 26 15:50:09 crc kubenswrapper[4907]: I0226 15:50:09.002954 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f34ac8f5-7487-4011-83d7-975b26c7a363-config\") pod \"f34ac8f5-7487-4011-83d7-975b26c7a363\" (UID: \"f34ac8f5-7487-4011-83d7-975b26c7a363\") " Feb 26 15:50:09 crc kubenswrapper[4907]: I0226 15:50:09.003011 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6nc72\" (UniqueName: \"kubernetes.io/projected/aa37ec9d-a846-4315-91cb-58af7a2f9cbe-kube-api-access-6nc72\") pod \"aa37ec9d-a846-4315-91cb-58af7a2f9cbe\" (UID: \"aa37ec9d-a846-4315-91cb-58af7a2f9cbe\") " Feb 26 15:50:09 crc kubenswrapper[4907]: I0226 15:50:09.003072 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/aa37ec9d-a846-4315-91cb-58af7a2f9cbe-serving-cert\") pod \"aa37ec9d-a846-4315-91cb-58af7a2f9cbe\" (UID: \"aa37ec9d-a846-4315-91cb-58af7a2f9cbe\") " Feb 26 
15:50:09 crc kubenswrapper[4907]: I0226 15:50:09.003122 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/f34ac8f5-7487-4011-83d7-975b26c7a363-client-ca\") pod \"f34ac8f5-7487-4011-83d7-975b26c7a363\" (UID: \"f34ac8f5-7487-4011-83d7-975b26c7a363\") " Feb 26 15:50:09 crc kubenswrapper[4907]: I0226 15:50:09.003183 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fwqwn\" (UniqueName: \"kubernetes.io/projected/f34ac8f5-7487-4011-83d7-975b26c7a363-kube-api-access-fwqwn\") pod \"f34ac8f5-7487-4011-83d7-975b26c7a363\" (UID: \"f34ac8f5-7487-4011-83d7-975b26c7a363\") " Feb 26 15:50:09 crc kubenswrapper[4907]: I0226 15:50:09.003265 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/aa37ec9d-a846-4315-91cb-58af7a2f9cbe-config\") pod \"aa37ec9d-a846-4315-91cb-58af7a2f9cbe\" (UID: \"aa37ec9d-a846-4315-91cb-58af7a2f9cbe\") " Feb 26 15:50:09 crc kubenswrapper[4907]: I0226 15:50:09.003374 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/f34ac8f5-7487-4011-83d7-975b26c7a363-proxy-ca-bundles\") pod \"f34ac8f5-7487-4011-83d7-975b26c7a363\" (UID: \"f34ac8f5-7487-4011-83d7-975b26c7a363\") " Feb 26 15:50:09 crc kubenswrapper[4907]: I0226 15:50:09.003440 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/aa37ec9d-a846-4315-91cb-58af7a2f9cbe-client-ca\") pod \"aa37ec9d-a846-4315-91cb-58af7a2f9cbe\" (UID: \"aa37ec9d-a846-4315-91cb-58af7a2f9cbe\") " Feb 26 15:50:09 crc kubenswrapper[4907]: I0226 15:50:09.004001 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f34ac8f5-7487-4011-83d7-975b26c7a363-client-ca" (OuterVolumeSpecName: "client-ca") pod 
"f34ac8f5-7487-4011-83d7-975b26c7a363" (UID: "f34ac8f5-7487-4011-83d7-975b26c7a363"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 26 15:50:09 crc kubenswrapper[4907]: I0226 15:50:09.004028 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/aa37ec9d-a846-4315-91cb-58af7a2f9cbe-config" (OuterVolumeSpecName: "config") pod "aa37ec9d-a846-4315-91cb-58af7a2f9cbe" (UID: "aa37ec9d-a846-4315-91cb-58af7a2f9cbe"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 26 15:50:09 crc kubenswrapper[4907]: I0226 15:50:09.004475 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f34ac8f5-7487-4011-83d7-975b26c7a363-proxy-ca-bundles" (OuterVolumeSpecName: "proxy-ca-bundles") pod "f34ac8f5-7487-4011-83d7-975b26c7a363" (UID: "f34ac8f5-7487-4011-83d7-975b26c7a363"). InnerVolumeSpecName "proxy-ca-bundles". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 26 15:50:09 crc kubenswrapper[4907]: I0226 15:50:09.004585 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/aa37ec9d-a846-4315-91cb-58af7a2f9cbe-client-ca" (OuterVolumeSpecName: "client-ca") pod "aa37ec9d-a846-4315-91cb-58af7a2f9cbe" (UID: "aa37ec9d-a846-4315-91cb-58af7a2f9cbe"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 26 15:50:09 crc kubenswrapper[4907]: I0226 15:50:09.004990 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f34ac8f5-7487-4011-83d7-975b26c7a363-config" (OuterVolumeSpecName: "config") pod "f34ac8f5-7487-4011-83d7-975b26c7a363" (UID: "f34ac8f5-7487-4011-83d7-975b26c7a363"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 26 15:50:09 crc kubenswrapper[4907]: I0226 15:50:09.007830 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/aa37ec9d-a846-4315-91cb-58af7a2f9cbe-kube-api-access-6nc72" (OuterVolumeSpecName: "kube-api-access-6nc72") pod "aa37ec9d-a846-4315-91cb-58af7a2f9cbe" (UID: "aa37ec9d-a846-4315-91cb-58af7a2f9cbe"). InnerVolumeSpecName "kube-api-access-6nc72". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 26 15:50:09 crc kubenswrapper[4907]: I0226 15:50:09.008006 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/aa37ec9d-a846-4315-91cb-58af7a2f9cbe-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "aa37ec9d-a846-4315-91cb-58af7a2f9cbe" (UID: "aa37ec9d-a846-4315-91cb-58af7a2f9cbe"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 26 15:50:09 crc kubenswrapper[4907]: I0226 15:50:09.008179 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f34ac8f5-7487-4011-83d7-975b26c7a363-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "f34ac8f5-7487-4011-83d7-975b26c7a363" (UID: "f34ac8f5-7487-4011-83d7-975b26c7a363"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 26 15:50:09 crc kubenswrapper[4907]: I0226 15:50:09.009727 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f34ac8f5-7487-4011-83d7-975b26c7a363-kube-api-access-fwqwn" (OuterVolumeSpecName: "kube-api-access-fwqwn") pod "f34ac8f5-7487-4011-83d7-975b26c7a363" (UID: "f34ac8f5-7487-4011-83d7-975b26c7a363"). InnerVolumeSpecName "kube-api-access-fwqwn". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 26 15:50:09 crc kubenswrapper[4907]: I0226 15:50:09.104959 4907 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/f34ac8f5-7487-4011-83d7-975b26c7a363-serving-cert\") on node \"crc\" DevicePath \"\"" Feb 26 15:50:09 crc kubenswrapper[4907]: I0226 15:50:09.105006 4907 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f34ac8f5-7487-4011-83d7-975b26c7a363-config\") on node \"crc\" DevicePath \"\"" Feb 26 15:50:09 crc kubenswrapper[4907]: I0226 15:50:09.105027 4907 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6nc72\" (UniqueName: \"kubernetes.io/projected/aa37ec9d-a846-4315-91cb-58af7a2f9cbe-kube-api-access-6nc72\") on node \"crc\" DevicePath \"\"" Feb 26 15:50:09 crc kubenswrapper[4907]: I0226 15:50:09.105048 4907 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/aa37ec9d-a846-4315-91cb-58af7a2f9cbe-serving-cert\") on node \"crc\" DevicePath \"\"" Feb 26 15:50:09 crc kubenswrapper[4907]: I0226 15:50:09.105067 4907 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/f34ac8f5-7487-4011-83d7-975b26c7a363-client-ca\") on node \"crc\" DevicePath \"\"" Feb 26 15:50:09 crc kubenswrapper[4907]: I0226 15:50:09.105084 4907 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fwqwn\" (UniqueName: \"kubernetes.io/projected/f34ac8f5-7487-4011-83d7-975b26c7a363-kube-api-access-fwqwn\") on node \"crc\" DevicePath \"\"" Feb 26 15:50:09 crc kubenswrapper[4907]: I0226 15:50:09.105100 4907 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/aa37ec9d-a846-4315-91cb-58af7a2f9cbe-config\") on node \"crc\" DevicePath \"\"" Feb 26 15:50:09 crc kubenswrapper[4907]: I0226 15:50:09.105116 4907 reconciler_common.go:293] 
"Volume detached for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/f34ac8f5-7487-4011-83d7-975b26c7a363-proxy-ca-bundles\") on node \"crc\" DevicePath \"\"" Feb 26 15:50:09 crc kubenswrapper[4907]: I0226 15:50:09.105134 4907 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/aa37ec9d-a846-4315-91cb-58af7a2f9cbe-client-ca\") on node \"crc\" DevicePath \"\"" Feb 26 15:50:09 crc kubenswrapper[4907]: I0226 15:50:09.576944 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-cd59f947b-pjl5q" event={"ID":"aa37ec9d-a846-4315-91cb-58af7a2f9cbe","Type":"ContainerDied","Data":"aab7d4ab04262aae1c283c95f4b37162d324ca2c41aeef1f6f149c57eac02ed1"} Feb 26 15:50:09 crc kubenswrapper[4907]: I0226 15:50:09.576990 4907 scope.go:117] "RemoveContainer" containerID="4ec1cba8db3899b60cac09d675cd653e23eb7a64c507794a354f97c76f844b7c" Feb 26 15:50:09 crc kubenswrapper[4907]: I0226 15:50:09.577121 4907 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-cd59f947b-pjl5q" Feb 26 15:50:09 crc kubenswrapper[4907]: I0226 15:50:09.588044 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-5f55469458-8cx2k" event={"ID":"f34ac8f5-7487-4011-83d7-975b26c7a363","Type":"ContainerDied","Data":"649c2ecb996ac1474778666f5afccb8eb957378e30e942a4051ffa1f14904c54"} Feb 26 15:50:09 crc kubenswrapper[4907]: I0226 15:50:09.588107 4907 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-controller-manager/controller-manager-5f55469458-8cx2k" Feb 26 15:50:09 crc kubenswrapper[4907]: I0226 15:50:09.601417 4907 scope.go:117] "RemoveContainer" containerID="e28d9d10a276db04b1dd4bdee9e07b9933a84daf302d5529fb0e004d2eac1de1" Feb 26 15:50:09 crc kubenswrapper[4907]: I0226 15:50:09.626791 4907 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-cd59f947b-pjl5q"] Feb 26 15:50:09 crc kubenswrapper[4907]: I0226 15:50:09.635972 4907 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-cd59f947b-pjl5q"] Feb 26 15:50:09 crc kubenswrapper[4907]: I0226 15:50:09.640612 4907 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-5f55469458-8cx2k"] Feb 26 15:50:09 crc kubenswrapper[4907]: I0226 15:50:09.648861 4907 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-controller-manager/controller-manager-5f55469458-8cx2k"] Feb 26 15:50:09 crc kubenswrapper[4907]: I0226 15:50:09.972683 4907 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-controller-manager/controller-manager-f8f88d7c7-mj5sv"] Feb 26 15:50:09 crc kubenswrapper[4907]: E0226 15:50:09.972956 4907 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="20256617-55f3-4228-8200-bd57793ff553" containerName="oc" Feb 26 15:50:09 crc kubenswrapper[4907]: I0226 15:50:09.972974 4907 state_mem.go:107] "Deleted CPUSet assignment" podUID="20256617-55f3-4228-8200-bd57793ff553" containerName="oc" Feb 26 15:50:09 crc kubenswrapper[4907]: E0226 15:50:09.972992 4907 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f34ac8f5-7487-4011-83d7-975b26c7a363" containerName="controller-manager" Feb 26 15:50:09 crc kubenswrapper[4907]: I0226 15:50:09.973012 4907 state_mem.go:107] "Deleted CPUSet assignment" podUID="f34ac8f5-7487-4011-83d7-975b26c7a363" 
containerName="controller-manager" Feb 26 15:50:09 crc kubenswrapper[4907]: E0226 15:50:09.973025 4907 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="aa37ec9d-a846-4315-91cb-58af7a2f9cbe" containerName="route-controller-manager" Feb 26 15:50:09 crc kubenswrapper[4907]: I0226 15:50:09.973033 4907 state_mem.go:107] "Deleted CPUSet assignment" podUID="aa37ec9d-a846-4315-91cb-58af7a2f9cbe" containerName="route-controller-manager" Feb 26 15:50:09 crc kubenswrapper[4907]: I0226 15:50:09.973146 4907 memory_manager.go:354] "RemoveStaleState removing state" podUID="f34ac8f5-7487-4011-83d7-975b26c7a363" containerName="controller-manager" Feb 26 15:50:09 crc kubenswrapper[4907]: I0226 15:50:09.973159 4907 memory_manager.go:354] "RemoveStaleState removing state" podUID="aa37ec9d-a846-4315-91cb-58af7a2f9cbe" containerName="route-controller-manager" Feb 26 15:50:09 crc kubenswrapper[4907]: I0226 15:50:09.973168 4907 memory_manager.go:354] "RemoveStaleState removing state" podUID="20256617-55f3-4228-8200-bd57793ff553" containerName="oc" Feb 26 15:50:09 crc kubenswrapper[4907]: I0226 15:50:09.973571 4907 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-controller-manager/controller-manager-f8f88d7c7-mj5sv" Feb 26 15:50:09 crc kubenswrapper[4907]: I0226 15:50:09.977884 4907 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"config" Feb 26 15:50:09 crc kubenswrapper[4907]: I0226 15:50:09.978540 4907 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"client-ca" Feb 26 15:50:09 crc kubenswrapper[4907]: I0226 15:50:09.978893 4907 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-service-ca.crt" Feb 26 15:50:09 crc kubenswrapper[4907]: I0226 15:50:09.979106 4907 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"openshift-controller-manager-sa-dockercfg-msq4c" Feb 26 15:50:09 crc kubenswrapper[4907]: I0226 15:50:09.980093 4907 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"kube-root-ca.crt" Feb 26 15:50:09 crc kubenswrapper[4907]: I0226 15:50:09.980131 4907 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-route-controller-manager/route-controller-manager-64dd5fdbcf-drd6b"] Feb 26 15:50:09 crc kubenswrapper[4907]: I0226 15:50:09.980102 4907 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"serving-cert" Feb 26 15:50:09 crc kubenswrapper[4907]: I0226 15:50:09.981048 4907 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-64dd5fdbcf-drd6b" Feb 26 15:50:09 crc kubenswrapper[4907]: I0226 15:50:09.985402 4907 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"openshift-service-ca.crt" Feb 26 15:50:09 crc kubenswrapper[4907]: I0226 15:50:09.985628 4907 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"serving-cert" Feb 26 15:50:09 crc kubenswrapper[4907]: I0226 15:50:09.986076 4907 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"client-ca" Feb 26 15:50:09 crc kubenswrapper[4907]: I0226 15:50:09.986394 4907 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"kube-root-ca.crt" Feb 26 15:50:09 crc kubenswrapper[4907]: I0226 15:50:09.986649 4907 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"config" Feb 26 15:50:09 crc kubenswrapper[4907]: I0226 15:50:09.986891 4907 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"route-controller-manager-sa-dockercfg-h2zr2" Feb 26 15:50:09 crc kubenswrapper[4907]: I0226 15:50:09.988803 4907 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-f8f88d7c7-mj5sv"] Feb 26 15:50:09 crc kubenswrapper[4907]: I0226 15:50:09.994389 4907 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-64dd5fdbcf-drd6b"] Feb 26 15:50:09 crc kubenswrapper[4907]: I0226 15:50:09.999146 4907 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-global-ca" Feb 26 15:50:10 crc kubenswrapper[4907]: I0226 15:50:10.118922 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"serving-cert\" (UniqueName: \"kubernetes.io/secret/d46dac98-b4eb-4e47-949a-2a35a7e966ac-serving-cert\") pod \"route-controller-manager-64dd5fdbcf-drd6b\" (UID: \"d46dac98-b4eb-4e47-949a-2a35a7e966ac\") " pod="openshift-route-controller-manager/route-controller-manager-64dd5fdbcf-drd6b" Feb 26 15:50:10 crc kubenswrapper[4907]: I0226 15:50:10.119306 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/0899f62f-f38e-4f0d-98a3-1e9411790b31-proxy-ca-bundles\") pod \"controller-manager-f8f88d7c7-mj5sv\" (UID: \"0899f62f-f38e-4f0d-98a3-1e9411790b31\") " pod="openshift-controller-manager/controller-manager-f8f88d7c7-mj5sv" Feb 26 15:50:10 crc kubenswrapper[4907]: I0226 15:50:10.119337 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-sr8xx\" (UniqueName: \"kubernetes.io/projected/0899f62f-f38e-4f0d-98a3-1e9411790b31-kube-api-access-sr8xx\") pod \"controller-manager-f8f88d7c7-mj5sv\" (UID: \"0899f62f-f38e-4f0d-98a3-1e9411790b31\") " pod="openshift-controller-manager/controller-manager-f8f88d7c7-mj5sv" Feb 26 15:50:10 crc kubenswrapper[4907]: I0226 15:50:10.119397 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/d46dac98-b4eb-4e47-949a-2a35a7e966ac-client-ca\") pod \"route-controller-manager-64dd5fdbcf-drd6b\" (UID: \"d46dac98-b4eb-4e47-949a-2a35a7e966ac\") " pod="openshift-route-controller-manager/route-controller-manager-64dd5fdbcf-drd6b" Feb 26 15:50:10 crc kubenswrapper[4907]: I0226 15:50:10.119425 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/0899f62f-f38e-4f0d-98a3-1e9411790b31-config\") pod \"controller-manager-f8f88d7c7-mj5sv\" (UID: \"0899f62f-f38e-4f0d-98a3-1e9411790b31\") " 
pod="openshift-controller-manager/controller-manager-f8f88d7c7-mj5sv" Feb 26 15:50:10 crc kubenswrapper[4907]: I0226 15:50:10.119451 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fq7hd\" (UniqueName: \"kubernetes.io/projected/d46dac98-b4eb-4e47-949a-2a35a7e966ac-kube-api-access-fq7hd\") pod \"route-controller-manager-64dd5fdbcf-drd6b\" (UID: \"d46dac98-b4eb-4e47-949a-2a35a7e966ac\") " pod="openshift-route-controller-manager/route-controller-manager-64dd5fdbcf-drd6b" Feb 26 15:50:10 crc kubenswrapper[4907]: I0226 15:50:10.119479 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d46dac98-b4eb-4e47-949a-2a35a7e966ac-config\") pod \"route-controller-manager-64dd5fdbcf-drd6b\" (UID: \"d46dac98-b4eb-4e47-949a-2a35a7e966ac\") " pod="openshift-route-controller-manager/route-controller-manager-64dd5fdbcf-drd6b" Feb 26 15:50:10 crc kubenswrapper[4907]: I0226 15:50:10.119515 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/0899f62f-f38e-4f0d-98a3-1e9411790b31-client-ca\") pod \"controller-manager-f8f88d7c7-mj5sv\" (UID: \"0899f62f-f38e-4f0d-98a3-1e9411790b31\") " pod="openshift-controller-manager/controller-manager-f8f88d7c7-mj5sv" Feb 26 15:50:10 crc kubenswrapper[4907]: I0226 15:50:10.119534 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/0899f62f-f38e-4f0d-98a3-1e9411790b31-serving-cert\") pod \"controller-manager-f8f88d7c7-mj5sv\" (UID: \"0899f62f-f38e-4f0d-98a3-1e9411790b31\") " pod="openshift-controller-manager/controller-manager-f8f88d7c7-mj5sv" Feb 26 15:50:10 crc kubenswrapper[4907]: I0226 15:50:10.138216 4907 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" 
podUID="aa37ec9d-a846-4315-91cb-58af7a2f9cbe" path="/var/lib/kubelet/pods/aa37ec9d-a846-4315-91cb-58af7a2f9cbe/volumes" Feb 26 15:50:10 crc kubenswrapper[4907]: I0226 15:50:10.139334 4907 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f34ac8f5-7487-4011-83d7-975b26c7a363" path="/var/lib/kubelet/pods/f34ac8f5-7487-4011-83d7-975b26c7a363/volumes" Feb 26 15:50:10 crc kubenswrapper[4907]: I0226 15:50:10.220423 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/d46dac98-b4eb-4e47-949a-2a35a7e966ac-client-ca\") pod \"route-controller-manager-64dd5fdbcf-drd6b\" (UID: \"d46dac98-b4eb-4e47-949a-2a35a7e966ac\") " pod="openshift-route-controller-manager/route-controller-manager-64dd5fdbcf-drd6b" Feb 26 15:50:10 crc kubenswrapper[4907]: I0226 15:50:10.220480 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/0899f62f-f38e-4f0d-98a3-1e9411790b31-config\") pod \"controller-manager-f8f88d7c7-mj5sv\" (UID: \"0899f62f-f38e-4f0d-98a3-1e9411790b31\") " pod="openshift-controller-manager/controller-manager-f8f88d7c7-mj5sv" Feb 26 15:50:10 crc kubenswrapper[4907]: I0226 15:50:10.220507 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fq7hd\" (UniqueName: \"kubernetes.io/projected/d46dac98-b4eb-4e47-949a-2a35a7e966ac-kube-api-access-fq7hd\") pod \"route-controller-manager-64dd5fdbcf-drd6b\" (UID: \"d46dac98-b4eb-4e47-949a-2a35a7e966ac\") " pod="openshift-route-controller-manager/route-controller-manager-64dd5fdbcf-drd6b" Feb 26 15:50:10 crc kubenswrapper[4907]: I0226 15:50:10.220538 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d46dac98-b4eb-4e47-949a-2a35a7e966ac-config\") pod \"route-controller-manager-64dd5fdbcf-drd6b\" (UID: \"d46dac98-b4eb-4e47-949a-2a35a7e966ac\") " 
pod="openshift-route-controller-manager/route-controller-manager-64dd5fdbcf-drd6b" Feb 26 15:50:10 crc kubenswrapper[4907]: I0226 15:50:10.220575 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/0899f62f-f38e-4f0d-98a3-1e9411790b31-client-ca\") pod \"controller-manager-f8f88d7c7-mj5sv\" (UID: \"0899f62f-f38e-4f0d-98a3-1e9411790b31\") " pod="openshift-controller-manager/controller-manager-f8f88d7c7-mj5sv" Feb 26 15:50:10 crc kubenswrapper[4907]: I0226 15:50:10.220610 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/0899f62f-f38e-4f0d-98a3-1e9411790b31-serving-cert\") pod \"controller-manager-f8f88d7c7-mj5sv\" (UID: \"0899f62f-f38e-4f0d-98a3-1e9411790b31\") " pod="openshift-controller-manager/controller-manager-f8f88d7c7-mj5sv" Feb 26 15:50:10 crc kubenswrapper[4907]: I0226 15:50:10.220645 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/d46dac98-b4eb-4e47-949a-2a35a7e966ac-serving-cert\") pod \"route-controller-manager-64dd5fdbcf-drd6b\" (UID: \"d46dac98-b4eb-4e47-949a-2a35a7e966ac\") " pod="openshift-route-controller-manager/route-controller-manager-64dd5fdbcf-drd6b" Feb 26 15:50:10 crc kubenswrapper[4907]: I0226 15:50:10.220668 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/0899f62f-f38e-4f0d-98a3-1e9411790b31-proxy-ca-bundles\") pod \"controller-manager-f8f88d7c7-mj5sv\" (UID: \"0899f62f-f38e-4f0d-98a3-1e9411790b31\") " pod="openshift-controller-manager/controller-manager-f8f88d7c7-mj5sv" Feb 26 15:50:10 crc kubenswrapper[4907]: I0226 15:50:10.220692 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-sr8xx\" (UniqueName: 
\"kubernetes.io/projected/0899f62f-f38e-4f0d-98a3-1e9411790b31-kube-api-access-sr8xx\") pod \"controller-manager-f8f88d7c7-mj5sv\" (UID: \"0899f62f-f38e-4f0d-98a3-1e9411790b31\") " pod="openshift-controller-manager/controller-manager-f8f88d7c7-mj5sv" Feb 26 15:50:10 crc kubenswrapper[4907]: I0226 15:50:10.222188 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/0899f62f-f38e-4f0d-98a3-1e9411790b31-client-ca\") pod \"controller-manager-f8f88d7c7-mj5sv\" (UID: \"0899f62f-f38e-4f0d-98a3-1e9411790b31\") " pod="openshift-controller-manager/controller-manager-f8f88d7c7-mj5sv" Feb 26 15:50:10 crc kubenswrapper[4907]: I0226 15:50:10.223094 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/0899f62f-f38e-4f0d-98a3-1e9411790b31-config\") pod \"controller-manager-f8f88d7c7-mj5sv\" (UID: \"0899f62f-f38e-4f0d-98a3-1e9411790b31\") " pod="openshift-controller-manager/controller-manager-f8f88d7c7-mj5sv" Feb 26 15:50:10 crc kubenswrapper[4907]: I0226 15:50:10.223566 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/d46dac98-b4eb-4e47-949a-2a35a7e966ac-client-ca\") pod \"route-controller-manager-64dd5fdbcf-drd6b\" (UID: \"d46dac98-b4eb-4e47-949a-2a35a7e966ac\") " pod="openshift-route-controller-manager/route-controller-manager-64dd5fdbcf-drd6b" Feb 26 15:50:10 crc kubenswrapper[4907]: I0226 15:50:10.223644 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d46dac98-b4eb-4e47-949a-2a35a7e966ac-config\") pod \"route-controller-manager-64dd5fdbcf-drd6b\" (UID: \"d46dac98-b4eb-4e47-949a-2a35a7e966ac\") " pod="openshift-route-controller-manager/route-controller-manager-64dd5fdbcf-drd6b" Feb 26 15:50:10 crc kubenswrapper[4907]: I0226 15:50:10.224423 4907 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/0899f62f-f38e-4f0d-98a3-1e9411790b31-proxy-ca-bundles\") pod \"controller-manager-f8f88d7c7-mj5sv\" (UID: \"0899f62f-f38e-4f0d-98a3-1e9411790b31\") " pod="openshift-controller-manager/controller-manager-f8f88d7c7-mj5sv" Feb 26 15:50:10 crc kubenswrapper[4907]: I0226 15:50:10.226508 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/0899f62f-f38e-4f0d-98a3-1e9411790b31-serving-cert\") pod \"controller-manager-f8f88d7c7-mj5sv\" (UID: \"0899f62f-f38e-4f0d-98a3-1e9411790b31\") " pod="openshift-controller-manager/controller-manager-f8f88d7c7-mj5sv" Feb 26 15:50:10 crc kubenswrapper[4907]: I0226 15:50:10.227872 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/d46dac98-b4eb-4e47-949a-2a35a7e966ac-serving-cert\") pod \"route-controller-manager-64dd5fdbcf-drd6b\" (UID: \"d46dac98-b4eb-4e47-949a-2a35a7e966ac\") " pod="openshift-route-controller-manager/route-controller-manager-64dd5fdbcf-drd6b" Feb 26 15:50:10 crc kubenswrapper[4907]: I0226 15:50:10.246235 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-sr8xx\" (UniqueName: \"kubernetes.io/projected/0899f62f-f38e-4f0d-98a3-1e9411790b31-kube-api-access-sr8xx\") pod \"controller-manager-f8f88d7c7-mj5sv\" (UID: \"0899f62f-f38e-4f0d-98a3-1e9411790b31\") " pod="openshift-controller-manager/controller-manager-f8f88d7c7-mj5sv" Feb 26 15:50:10 crc kubenswrapper[4907]: I0226 15:50:10.249340 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fq7hd\" (UniqueName: \"kubernetes.io/projected/d46dac98-b4eb-4e47-949a-2a35a7e966ac-kube-api-access-fq7hd\") pod \"route-controller-manager-64dd5fdbcf-drd6b\" (UID: \"d46dac98-b4eb-4e47-949a-2a35a7e966ac\") " 
pod="openshift-route-controller-manager/route-controller-manager-64dd5fdbcf-drd6b" Feb 26 15:50:10 crc kubenswrapper[4907]: I0226 15:50:10.313196 4907 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-f8f88d7c7-mj5sv" Feb 26 15:50:10 crc kubenswrapper[4907]: I0226 15:50:10.335980 4907 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-64dd5fdbcf-drd6b" Feb 26 15:50:10 crc kubenswrapper[4907]: I0226 15:50:10.559124 4907 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-f8f88d7c7-mj5sv"] Feb 26 15:50:10 crc kubenswrapper[4907]: I0226 15:50:10.601653 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-f8f88d7c7-mj5sv" event={"ID":"0899f62f-f38e-4f0d-98a3-1e9411790b31","Type":"ContainerStarted","Data":"3df84a1dc549e9933f8b95d47741fd0097f4b78c977bb8645edb9b2a31265794"} Feb 26 15:50:10 crc kubenswrapper[4907]: I0226 15:50:10.801531 4907 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-64dd5fdbcf-drd6b"] Feb 26 15:50:10 crc kubenswrapper[4907]: W0226 15:50:10.804604 4907 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd46dac98_b4eb_4e47_949a_2a35a7e966ac.slice/crio-1726480f2bba8e45ab125d9022b7503ca7731ba908484de73dd247aae7e735f9 WatchSource:0}: Error finding container 1726480f2bba8e45ab125d9022b7503ca7731ba908484de73dd247aae7e735f9: Status 404 returned error can't find the container with id 1726480f2bba8e45ab125d9022b7503ca7731ba908484de73dd247aae7e735f9 Feb 26 15:50:11 crc kubenswrapper[4907]: I0226 15:50:11.619207 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-f8f88d7c7-mj5sv" 
event={"ID":"0899f62f-f38e-4f0d-98a3-1e9411790b31","Type":"ContainerStarted","Data":"2d15eb9d881e6d6abbf99f71eebd0f7665c93114229ae36d524933583d5c58bd"} Feb 26 15:50:11 crc kubenswrapper[4907]: I0226 15:50:11.619516 4907 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-controller-manager/controller-manager-f8f88d7c7-mj5sv" Feb 26 15:50:11 crc kubenswrapper[4907]: I0226 15:50:11.620753 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-64dd5fdbcf-drd6b" event={"ID":"d46dac98-b4eb-4e47-949a-2a35a7e966ac","Type":"ContainerStarted","Data":"354f4de51ec808643b1835807765646d807c18f96f5a5d4f39c13c261da683da"} Feb 26 15:50:11 crc kubenswrapper[4907]: I0226 15:50:11.620791 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-64dd5fdbcf-drd6b" event={"ID":"d46dac98-b4eb-4e47-949a-2a35a7e966ac","Type":"ContainerStarted","Data":"1726480f2bba8e45ab125d9022b7503ca7731ba908484de73dd247aae7e735f9"} Feb 26 15:50:11 crc kubenswrapper[4907]: I0226 15:50:11.621230 4907 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-route-controller-manager/route-controller-manager-64dd5fdbcf-drd6b" Feb 26 15:50:11 crc kubenswrapper[4907]: I0226 15:50:11.626220 4907 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-controller-manager/controller-manager-f8f88d7c7-mj5sv" Feb 26 15:50:11 crc kubenswrapper[4907]: I0226 15:50:11.627418 4907 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-route-controller-manager/route-controller-manager-64dd5fdbcf-drd6b" Feb 26 15:50:11 crc kubenswrapper[4907]: I0226 15:50:11.639967 4907 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-controller-manager/controller-manager-f8f88d7c7-mj5sv" podStartSLOduration=3.639949945 podStartE2EDuration="3.639949945s" 
podCreationTimestamp="2026-02-26 15:50:08 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-26 15:50:11.63694919 +0000 UTC m=+474.155511039" watchObservedRunningTime="2026-02-26 15:50:11.639949945 +0000 UTC m=+474.158511814" Feb 26 15:50:11 crc kubenswrapper[4907]: I0226 15:50:11.657662 4907 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-route-controller-manager/route-controller-manager-64dd5fdbcf-drd6b" podStartSLOduration=3.6576402679999998 podStartE2EDuration="3.657640268s" podCreationTimestamp="2026-02-26 15:50:08 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-26 15:50:11.65331636 +0000 UTC m=+474.171878229" watchObservedRunningTime="2026-02-26 15:50:11.657640268 +0000 UTC m=+474.176202127" Feb 26 15:50:12 crc kubenswrapper[4907]: I0226 15:50:12.780401 4907 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-image-registry/image-registry-66df7c8f76-mh988"] Feb 26 15:50:12 crc kubenswrapper[4907]: I0226 15:50:12.781340 4907 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/image-registry-66df7c8f76-mh988" Feb 26 15:50:12 crc kubenswrapper[4907]: I0226 15:50:12.845354 4907 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-66df7c8f76-mh988"] Feb 26 15:50:12 crc kubenswrapper[4907]: I0226 15:50:12.859194 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/68fd85a1-4435-48b9-9673-6cc6242a7d44-trusted-ca\") pod \"image-registry-66df7c8f76-mh988\" (UID: \"68fd85a1-4435-48b9-9673-6cc6242a7d44\") " pod="openshift-image-registry/image-registry-66df7c8f76-mh988" Feb 26 15:50:12 crc kubenswrapper[4907]: I0226 15:50:12.859245 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/68fd85a1-4435-48b9-9673-6cc6242a7d44-registry-tls\") pod \"image-registry-66df7c8f76-mh988\" (UID: \"68fd85a1-4435-48b9-9673-6cc6242a7d44\") " pod="openshift-image-registry/image-registry-66df7c8f76-mh988" Feb 26 15:50:12 crc kubenswrapper[4907]: I0226 15:50:12.859272 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/68fd85a1-4435-48b9-9673-6cc6242a7d44-installation-pull-secrets\") pod \"image-registry-66df7c8f76-mh988\" (UID: \"68fd85a1-4435-48b9-9673-6cc6242a7d44\") " pod="openshift-image-registry/image-registry-66df7c8f76-mh988" Feb 26 15:50:12 crc kubenswrapper[4907]: I0226 15:50:12.859296 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/68fd85a1-4435-48b9-9673-6cc6242a7d44-bound-sa-token\") pod \"image-registry-66df7c8f76-mh988\" (UID: \"68fd85a1-4435-48b9-9673-6cc6242a7d44\") " pod="openshift-image-registry/image-registry-66df7c8f76-mh988" Feb 26 
15:50:12 crc kubenswrapper[4907]: I0226 15:50:12.859320 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-66df7c8f76-mh988\" (UID: \"68fd85a1-4435-48b9-9673-6cc6242a7d44\") " pod="openshift-image-registry/image-registry-66df7c8f76-mh988" Feb 26 15:50:12 crc kubenswrapper[4907]: I0226 15:50:12.859339 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/68fd85a1-4435-48b9-9673-6cc6242a7d44-registry-certificates\") pod \"image-registry-66df7c8f76-mh988\" (UID: \"68fd85a1-4435-48b9-9673-6cc6242a7d44\") " pod="openshift-image-registry/image-registry-66df7c8f76-mh988" Feb 26 15:50:12 crc kubenswrapper[4907]: I0226 15:50:12.859361 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/68fd85a1-4435-48b9-9673-6cc6242a7d44-ca-trust-extracted\") pod \"image-registry-66df7c8f76-mh988\" (UID: \"68fd85a1-4435-48b9-9673-6cc6242a7d44\") " pod="openshift-image-registry/image-registry-66df7c8f76-mh988" Feb 26 15:50:12 crc kubenswrapper[4907]: I0226 15:50:12.859389 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fmjc2\" (UniqueName: \"kubernetes.io/projected/68fd85a1-4435-48b9-9673-6cc6242a7d44-kube-api-access-fmjc2\") pod \"image-registry-66df7c8f76-mh988\" (UID: \"68fd85a1-4435-48b9-9673-6cc6242a7d44\") " pod="openshift-image-registry/image-registry-66df7c8f76-mh988" Feb 26 15:50:12 crc kubenswrapper[4907]: I0226 15:50:12.879428 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: 
\"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-66df7c8f76-mh988\" (UID: \"68fd85a1-4435-48b9-9673-6cc6242a7d44\") " pod="openshift-image-registry/image-registry-66df7c8f76-mh988" Feb 26 15:50:12 crc kubenswrapper[4907]: I0226 15:50:12.960948 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/68fd85a1-4435-48b9-9673-6cc6242a7d44-trusted-ca\") pod \"image-registry-66df7c8f76-mh988\" (UID: \"68fd85a1-4435-48b9-9673-6cc6242a7d44\") " pod="openshift-image-registry/image-registry-66df7c8f76-mh988" Feb 26 15:50:12 crc kubenswrapper[4907]: I0226 15:50:12.960996 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/68fd85a1-4435-48b9-9673-6cc6242a7d44-registry-tls\") pod \"image-registry-66df7c8f76-mh988\" (UID: \"68fd85a1-4435-48b9-9673-6cc6242a7d44\") " pod="openshift-image-registry/image-registry-66df7c8f76-mh988" Feb 26 15:50:12 crc kubenswrapper[4907]: I0226 15:50:12.961016 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/68fd85a1-4435-48b9-9673-6cc6242a7d44-installation-pull-secrets\") pod \"image-registry-66df7c8f76-mh988\" (UID: \"68fd85a1-4435-48b9-9673-6cc6242a7d44\") " pod="openshift-image-registry/image-registry-66df7c8f76-mh988" Feb 26 15:50:12 crc kubenswrapper[4907]: I0226 15:50:12.961046 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/68fd85a1-4435-48b9-9673-6cc6242a7d44-bound-sa-token\") pod \"image-registry-66df7c8f76-mh988\" (UID: \"68fd85a1-4435-48b9-9673-6cc6242a7d44\") " pod="openshift-image-registry/image-registry-66df7c8f76-mh988" Feb 26 15:50:12 crc kubenswrapper[4907]: I0226 15:50:12.961083 4907 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/68fd85a1-4435-48b9-9673-6cc6242a7d44-registry-certificates\") pod \"image-registry-66df7c8f76-mh988\" (UID: \"68fd85a1-4435-48b9-9673-6cc6242a7d44\") " pod="openshift-image-registry/image-registry-66df7c8f76-mh988"
Feb 26 15:50:12 crc kubenswrapper[4907]: I0226 15:50:12.961107 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/68fd85a1-4435-48b9-9673-6cc6242a7d44-ca-trust-extracted\") pod \"image-registry-66df7c8f76-mh988\" (UID: \"68fd85a1-4435-48b9-9673-6cc6242a7d44\") " pod="openshift-image-registry/image-registry-66df7c8f76-mh988"
Feb 26 15:50:12 crc kubenswrapper[4907]: I0226 15:50:12.961135 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fmjc2\" (UniqueName: \"kubernetes.io/projected/68fd85a1-4435-48b9-9673-6cc6242a7d44-kube-api-access-fmjc2\") pod \"image-registry-66df7c8f76-mh988\" (UID: \"68fd85a1-4435-48b9-9673-6cc6242a7d44\") " pod="openshift-image-registry/image-registry-66df7c8f76-mh988"
Feb 26 15:50:12 crc kubenswrapper[4907]: I0226 15:50:12.961801 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/68fd85a1-4435-48b9-9673-6cc6242a7d44-ca-trust-extracted\") pod \"image-registry-66df7c8f76-mh988\" (UID: \"68fd85a1-4435-48b9-9673-6cc6242a7d44\") " pod="openshift-image-registry/image-registry-66df7c8f76-mh988"
Feb 26 15:50:12 crc kubenswrapper[4907]: I0226 15:50:12.962241 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/68fd85a1-4435-48b9-9673-6cc6242a7d44-trusted-ca\") pod \"image-registry-66df7c8f76-mh988\" (UID: \"68fd85a1-4435-48b9-9673-6cc6242a7d44\") " pod="openshift-image-registry/image-registry-66df7c8f76-mh988"
Feb 26 15:50:12 crc kubenswrapper[4907]: I0226 15:50:12.962619 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/68fd85a1-4435-48b9-9673-6cc6242a7d44-registry-certificates\") pod \"image-registry-66df7c8f76-mh988\" (UID: \"68fd85a1-4435-48b9-9673-6cc6242a7d44\") " pod="openshift-image-registry/image-registry-66df7c8f76-mh988"
Feb 26 15:50:12 crc kubenswrapper[4907]: I0226 15:50:12.967279 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/68fd85a1-4435-48b9-9673-6cc6242a7d44-installation-pull-secrets\") pod \"image-registry-66df7c8f76-mh988\" (UID: \"68fd85a1-4435-48b9-9673-6cc6242a7d44\") " pod="openshift-image-registry/image-registry-66df7c8f76-mh988"
Feb 26 15:50:12 crc kubenswrapper[4907]: I0226 15:50:12.973845 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/68fd85a1-4435-48b9-9673-6cc6242a7d44-registry-tls\") pod \"image-registry-66df7c8f76-mh988\" (UID: \"68fd85a1-4435-48b9-9673-6cc6242a7d44\") " pod="openshift-image-registry/image-registry-66df7c8f76-mh988"
Feb 26 15:50:12 crc kubenswrapper[4907]: I0226 15:50:12.974809 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fmjc2\" (UniqueName: \"kubernetes.io/projected/68fd85a1-4435-48b9-9673-6cc6242a7d44-kube-api-access-fmjc2\") pod \"image-registry-66df7c8f76-mh988\" (UID: \"68fd85a1-4435-48b9-9673-6cc6242a7d44\") " pod="openshift-image-registry/image-registry-66df7c8f76-mh988"
Feb 26 15:50:12 crc kubenswrapper[4907]: I0226 15:50:12.979728 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/68fd85a1-4435-48b9-9673-6cc6242a7d44-bound-sa-token\") pod \"image-registry-66df7c8f76-mh988\" (UID: \"68fd85a1-4435-48b9-9673-6cc6242a7d44\") " pod="openshift-image-registry/image-registry-66df7c8f76-mh988"
Feb 26 15:50:13 crc kubenswrapper[4907]: I0226 15:50:13.101285 4907 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-66df7c8f76-mh988"
Feb 26 15:50:13 crc kubenswrapper[4907]: I0226 15:50:13.531823 4907 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-66df7c8f76-mh988"]
Feb 26 15:50:13 crc kubenswrapper[4907]: I0226 15:50:13.633922 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-66df7c8f76-mh988" event={"ID":"68fd85a1-4435-48b9-9673-6cc6242a7d44","Type":"ContainerStarted","Data":"c15160f10237896aeb06eaad0473f360ad52a9e6d52ef85244ce45b3735ebdc4"}
Feb 26 15:50:14 crc kubenswrapper[4907]: I0226 15:50:14.642738 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-66df7c8f76-mh988" event={"ID":"68fd85a1-4435-48b9-9673-6cc6242a7d44","Type":"ContainerStarted","Data":"59c92d9d864cf1876cacedb01533a87613a3aa7ead8383cfda103b9857cb1745"}
Feb 26 15:50:14 crc kubenswrapper[4907]: I0226 15:50:14.642923 4907 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-image-registry/image-registry-66df7c8f76-mh988"
Feb 26 15:50:14 crc kubenswrapper[4907]: I0226 15:50:14.677054 4907 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/image-registry-66df7c8f76-mh988" podStartSLOduration=2.67703347 podStartE2EDuration="2.67703347s" podCreationTimestamp="2026-02-26 15:50:12 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-26 15:50:14.674485116 +0000 UTC m=+477.193046995" watchObservedRunningTime="2026-02-26 15:50:14.67703347 +0000 UTC m=+477.195595329"
Feb 26 15:50:18 crc kubenswrapper[4907]: I0226 15:50:18.530240 4907 patch_prober.go:28] interesting pod/machine-config-daemon-v5ng6 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Feb 26 15:50:18 crc kubenswrapper[4907]: I0226 15:50:18.530651 4907 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-v5ng6" podUID="917eebf3-db36-47b8-af0a-b80d042fddab" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Feb 26 15:50:33 crc kubenswrapper[4907]: I0226 15:50:33.108012 4907 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-image-registry/image-registry-66df7c8f76-mh988"
Feb 26 15:50:33 crc kubenswrapper[4907]: I0226 15:50:33.190696 4907 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-image-registry/image-registry-697d97f7c8-kqtml"]
Feb 26 15:50:48 crc kubenswrapper[4907]: I0226 15:50:48.530448 4907 patch_prober.go:28] interesting pod/machine-config-daemon-v5ng6 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Feb 26 15:50:48 crc kubenswrapper[4907]: I0226 15:50:48.530978 4907 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-v5ng6" podUID="917eebf3-db36-47b8-af0a-b80d042fddab" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Feb 26 15:50:48 crc kubenswrapper[4907]: I0226 15:50:48.531028 4907 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-v5ng6"
Feb 26 15:50:48 crc kubenswrapper[4907]: I0226 15:50:48.531566 4907 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"5b5fce09e6f67f86221daea08fdd5259aaa4024d9dbe5e7a76056c4c092f3ec2"} pod="openshift-machine-config-operator/machine-config-daemon-v5ng6" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted"
Feb 26 15:50:48 crc kubenswrapper[4907]: I0226 15:50:48.531701 4907 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-v5ng6" podUID="917eebf3-db36-47b8-af0a-b80d042fddab" containerName="machine-config-daemon" containerID="cri-o://5b5fce09e6f67f86221daea08fdd5259aaa4024d9dbe5e7a76056c4c092f3ec2" gracePeriod=600
Feb 26 15:50:48 crc kubenswrapper[4907]: I0226 15:50:48.858094 4907 generic.go:334] "Generic (PLEG): container finished" podID="917eebf3-db36-47b8-af0a-b80d042fddab" containerID="5b5fce09e6f67f86221daea08fdd5259aaa4024d9dbe5e7a76056c4c092f3ec2" exitCode=0
Feb 26 15:50:48 crc kubenswrapper[4907]: I0226 15:50:48.858826 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-v5ng6" event={"ID":"917eebf3-db36-47b8-af0a-b80d042fddab","Type":"ContainerDied","Data":"5b5fce09e6f67f86221daea08fdd5259aaa4024d9dbe5e7a76056c4c092f3ec2"}
Feb 26 15:50:48 crc kubenswrapper[4907]: I0226 15:50:48.858882 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-v5ng6" event={"ID":"917eebf3-db36-47b8-af0a-b80d042fddab","Type":"ContainerStarted","Data":"53be863c74815dd43aa6d07eb234f8fc9300124de620faba3fc31d92226518b6"}
Feb 26 15:50:48 crc kubenswrapper[4907]: I0226 15:50:48.858902 4907 scope.go:117] "RemoveContainer" containerID="178aa71969c1efffd1f234213afe3cf84ffc1f8300112efb368309603695c3ee"
Feb 26 15:50:49 crc kubenswrapper[4907]: I0226 15:50:49.627380 4907 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-tqxjz"]
Feb 26 15:50:49 crc kubenswrapper[4907]: I0226 15:50:49.627973 4907 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-tqxjz" podUID="e0e96b15-45f7-47f1-878e-57914ef18916" containerName="registry-server" containerID="cri-o://f014a26fb915e7edcbb3f7cb78c727a2b21301773c46b1bd95b32e8ed2744a66" gracePeriod=30
Feb 26 15:50:49 crc kubenswrapper[4907]: I0226 15:50:49.641070 4907 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-22zr8"]
Feb 26 15:50:49 crc kubenswrapper[4907]: I0226 15:50:49.641418 4907 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-22zr8" podUID="4d3f9fc7-85b9-4095-af0d-7993e681ab2a" containerName="registry-server" containerID="cri-o://c72c74a6fe179f86c2339265699491607cd58e186d909b6af9e06f9ddcbd3100" gracePeriod=30
Feb 26 15:50:49 crc kubenswrapper[4907]: I0226 15:50:49.652958 4907 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-dvcn5"]
Feb 26 15:50:49 crc kubenswrapper[4907]: I0226 15:50:49.653212 4907 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/marketplace-operator-79b997595-dvcn5" podUID="23df369e-238f-4fbc-99fa-b22c21011db0" containerName="marketplace-operator" containerID="cri-o://696e6ee06370721d0f2fc0767f48826636ddd1416581a40026f5923f1f382ca8" gracePeriod=30
Feb 26 15:50:49 crc kubenswrapper[4907]: I0226 15:50:49.667575 4907 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-fcwbm"]
Feb 26 15:50:49 crc kubenswrapper[4907]: I0226 15:50:49.667948 4907 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-fcwbm" podUID="6c70b66e-978a-4c7e-9892-5579869aa740" containerName="registry-server" containerID="cri-o://27fc85274312d14655440bdd5823fceaeff047f1528a4370fcb212cab5f45070" gracePeriod=30
Feb 26 15:50:49 crc kubenswrapper[4907]: I0226 15:50:49.678500 4907 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-68qpc"]
Feb 26 15:50:49 crc kubenswrapper[4907]: I0226 15:50:49.678789 4907 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-68qpc" podUID="d6b454c4-bdcd-4904-8564-84c414871c6d" containerName="registry-server" containerID="cri-o://ff495918e96a3698db9a9a8dd4dd7887a76c7f0afe3392521131c07da299b110" gracePeriod=30
Feb 26 15:50:49 crc kubenswrapper[4907]: I0226 15:50:49.696233 4907 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-svjkc"]
Feb 26 15:50:49 crc kubenswrapper[4907]: I0226 15:50:49.697002 4907 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/marketplace-operator-79b997595-svjkc"
Feb 26 15:50:49 crc kubenswrapper[4907]: I0226 15:50:49.708984 4907 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-svjkc"]
Feb 26 15:50:49 crc kubenswrapper[4907]: I0226 15:50:49.880981 4907 generic.go:334] "Generic (PLEG): container finished" podID="e0e96b15-45f7-47f1-878e-57914ef18916" containerID="f014a26fb915e7edcbb3f7cb78c727a2b21301773c46b1bd95b32e8ed2744a66" exitCode=0
Feb 26 15:50:49 crc kubenswrapper[4907]: I0226 15:50:49.881036 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-tqxjz" event={"ID":"e0e96b15-45f7-47f1-878e-57914ef18916","Type":"ContainerDied","Data":"f014a26fb915e7edcbb3f7cb78c727a2b21301773c46b1bd95b32e8ed2744a66"}
Feb 26 15:50:49 crc kubenswrapper[4907]: I0226 15:50:49.881110 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cm6z2\" (UniqueName: \"kubernetes.io/projected/77a34fa8-40ba-4944-bd27-03a9a4f7761f-kube-api-access-cm6z2\") pod \"marketplace-operator-79b997595-svjkc\" (UID: \"77a34fa8-40ba-4944-bd27-03a9a4f7761f\") " pod="openshift-marketplace/marketplace-operator-79b997595-svjkc"
Feb 26 15:50:49 crc kubenswrapper[4907]: I0226 15:50:49.881172 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/77a34fa8-40ba-4944-bd27-03a9a4f7761f-marketplace-trusted-ca\") pod \"marketplace-operator-79b997595-svjkc\" (UID: \"77a34fa8-40ba-4944-bd27-03a9a4f7761f\") " pod="openshift-marketplace/marketplace-operator-79b997595-svjkc"
Feb 26 15:50:49 crc kubenswrapper[4907]: I0226 15:50:49.881196 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/77a34fa8-40ba-4944-bd27-03a9a4f7761f-marketplace-operator-metrics\") pod \"marketplace-operator-79b997595-svjkc\" (UID: \"77a34fa8-40ba-4944-bd27-03a9a4f7761f\") " pod="openshift-marketplace/marketplace-operator-79b997595-svjkc"
Feb 26 15:50:49 crc kubenswrapper[4907]: I0226 15:50:49.884028 4907 generic.go:334] "Generic (PLEG): container finished" podID="6c70b66e-978a-4c7e-9892-5579869aa740" containerID="27fc85274312d14655440bdd5823fceaeff047f1528a4370fcb212cab5f45070" exitCode=0
Feb 26 15:50:49 crc kubenswrapper[4907]: I0226 15:50:49.884071 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-fcwbm" event={"ID":"6c70b66e-978a-4c7e-9892-5579869aa740","Type":"ContainerDied","Data":"27fc85274312d14655440bdd5823fceaeff047f1528a4370fcb212cab5f45070"}
Feb 26 15:50:49 crc kubenswrapper[4907]: I0226 15:50:49.889373 4907 generic.go:334] "Generic (PLEG): container finished" podID="23df369e-238f-4fbc-99fa-b22c21011db0" containerID="696e6ee06370721d0f2fc0767f48826636ddd1416581a40026f5923f1f382ca8" exitCode=0
Feb 26 15:50:49 crc kubenswrapper[4907]: I0226 15:50:49.889412 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-dvcn5" event={"ID":"23df369e-238f-4fbc-99fa-b22c21011db0","Type":"ContainerDied","Data":"696e6ee06370721d0f2fc0767f48826636ddd1416581a40026f5923f1f382ca8"}
Feb 26 15:50:49 crc kubenswrapper[4907]: I0226 15:50:49.892812 4907 generic.go:334] "Generic (PLEG): container finished" podID="4d3f9fc7-85b9-4095-af0d-7993e681ab2a" containerID="c72c74a6fe179f86c2339265699491607cd58e186d909b6af9e06f9ddcbd3100" exitCode=0
Feb 26 15:50:49 crc kubenswrapper[4907]: I0226 15:50:49.892891 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-22zr8" event={"ID":"4d3f9fc7-85b9-4095-af0d-7993e681ab2a","Type":"ContainerDied","Data":"c72c74a6fe179f86c2339265699491607cd58e186d909b6af9e06f9ddcbd3100"}
Feb 26 15:50:49 crc kubenswrapper[4907]: I0226 15:50:49.895517 4907 generic.go:334] "Generic (PLEG): container finished" podID="d6b454c4-bdcd-4904-8564-84c414871c6d" containerID="ff495918e96a3698db9a9a8dd4dd7887a76c7f0afe3392521131c07da299b110" exitCode=0
Feb 26 15:50:49 crc kubenswrapper[4907]: I0226 15:50:49.895544 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-68qpc" event={"ID":"d6b454c4-bdcd-4904-8564-84c414871c6d","Type":"ContainerDied","Data":"ff495918e96a3698db9a9a8dd4dd7887a76c7f0afe3392521131c07da299b110"}
Feb 26 15:50:49 crc kubenswrapper[4907]: I0226 15:50:49.981851 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/77a34fa8-40ba-4944-bd27-03a9a4f7761f-marketplace-trusted-ca\") pod \"marketplace-operator-79b997595-svjkc\" (UID: \"77a34fa8-40ba-4944-bd27-03a9a4f7761f\") " pod="openshift-marketplace/marketplace-operator-79b997595-svjkc"
Feb 26 15:50:49 crc kubenswrapper[4907]: I0226 15:50:49.982204 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/77a34fa8-40ba-4944-bd27-03a9a4f7761f-marketplace-operator-metrics\") pod \"marketplace-operator-79b997595-svjkc\" (UID: \"77a34fa8-40ba-4944-bd27-03a9a4f7761f\") " pod="openshift-marketplace/marketplace-operator-79b997595-svjkc"
Feb 26 15:50:49 crc kubenswrapper[4907]: I0226 15:50:49.982299 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cm6z2\" (UniqueName: \"kubernetes.io/projected/77a34fa8-40ba-4944-bd27-03a9a4f7761f-kube-api-access-cm6z2\") pod \"marketplace-operator-79b997595-svjkc\" (UID: \"77a34fa8-40ba-4944-bd27-03a9a4f7761f\") " pod="openshift-marketplace/marketplace-operator-79b997595-svjkc"
Feb 26 15:50:49 crc kubenswrapper[4907]: I0226 15:50:49.983458 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/77a34fa8-40ba-4944-bd27-03a9a4f7761f-marketplace-trusted-ca\") pod \"marketplace-operator-79b997595-svjkc\" (UID: \"77a34fa8-40ba-4944-bd27-03a9a4f7761f\") " pod="openshift-marketplace/marketplace-operator-79b997595-svjkc"
Feb 26 15:50:49 crc kubenswrapper[4907]: I0226 15:50:49.988965 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/77a34fa8-40ba-4944-bd27-03a9a4f7761f-marketplace-operator-metrics\") pod \"marketplace-operator-79b997595-svjkc\" (UID: \"77a34fa8-40ba-4944-bd27-03a9a4f7761f\") " pod="openshift-marketplace/marketplace-operator-79b997595-svjkc"
Feb 26 15:50:49 crc kubenswrapper[4907]: I0226 15:50:49.999852 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cm6z2\" (UniqueName: \"kubernetes.io/projected/77a34fa8-40ba-4944-bd27-03a9a4f7761f-kube-api-access-cm6z2\") pod \"marketplace-operator-79b997595-svjkc\" (UID: \"77a34fa8-40ba-4944-bd27-03a9a4f7761f\") " pod="openshift-marketplace/marketplace-operator-79b997595-svjkc"
Feb 26 15:50:50 crc kubenswrapper[4907]: E0226 15:50:50.005609 4907 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of f014a26fb915e7edcbb3f7cb78c727a2b21301773c46b1bd95b32e8ed2744a66 is running failed: container process not found" containerID="f014a26fb915e7edcbb3f7cb78c727a2b21301773c46b1bd95b32e8ed2744a66" cmd=["grpc_health_probe","-addr=:50051"]
Feb 26 15:50:50 crc kubenswrapper[4907]: E0226 15:50:50.011856 4907 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of f014a26fb915e7edcbb3f7cb78c727a2b21301773c46b1bd95b32e8ed2744a66 is running failed: container process not found" containerID="f014a26fb915e7edcbb3f7cb78c727a2b21301773c46b1bd95b32e8ed2744a66" cmd=["grpc_health_probe","-addr=:50051"]
Feb 26 15:50:50 crc kubenswrapper[4907]: E0226 15:50:50.012611 4907 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of f014a26fb915e7edcbb3f7cb78c727a2b21301773c46b1bd95b32e8ed2744a66 is running failed: container process not found" containerID="f014a26fb915e7edcbb3f7cb78c727a2b21301773c46b1bd95b32e8ed2744a66" cmd=["grpc_health_probe","-addr=:50051"]
Feb 26 15:50:50 crc kubenswrapper[4907]: E0226 15:50:50.012674 4907 prober.go:104] "Probe errored" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of f014a26fb915e7edcbb3f7cb78c727a2b21301773c46b1bd95b32e8ed2744a66 is running failed: container process not found" probeType="Readiness" pod="openshift-marketplace/certified-operators-tqxjz" podUID="e0e96b15-45f7-47f1-878e-57914ef18916" containerName="registry-server"
Feb 26 15:50:50 crc kubenswrapper[4907]: I0226 15:50:50.057104 4907 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/marketplace-operator-79b997595-svjkc"
Feb 26 15:50:50 crc kubenswrapper[4907]: I0226 15:50:50.075800 4907 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-tqxjz"
Feb 26 15:50:50 crc kubenswrapper[4907]: I0226 15:50:50.183514 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e0e96b15-45f7-47f1-878e-57914ef18916-catalog-content\") pod \"e0e96b15-45f7-47f1-878e-57914ef18916\" (UID: \"e0e96b15-45f7-47f1-878e-57914ef18916\") "
Feb 26 15:50:50 crc kubenswrapper[4907]: I0226 15:50:50.183859 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e0e96b15-45f7-47f1-878e-57914ef18916-utilities\") pod \"e0e96b15-45f7-47f1-878e-57914ef18916\" (UID: \"e0e96b15-45f7-47f1-878e-57914ef18916\") "
Feb 26 15:50:50 crc kubenswrapper[4907]: I0226 15:50:50.183883 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xmcgc\" (UniqueName: \"kubernetes.io/projected/e0e96b15-45f7-47f1-878e-57914ef18916-kube-api-access-xmcgc\") pod \"e0e96b15-45f7-47f1-878e-57914ef18916\" (UID: \"e0e96b15-45f7-47f1-878e-57914ef18916\") "
Feb 26 15:50:50 crc kubenswrapper[4907]: I0226 15:50:50.185166 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e0e96b15-45f7-47f1-878e-57914ef18916-utilities" (OuterVolumeSpecName: "utilities") pod "e0e96b15-45f7-47f1-878e-57914ef18916" (UID: "e0e96b15-45f7-47f1-878e-57914ef18916"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Feb 26 15:50:50 crc kubenswrapper[4907]: I0226 15:50:50.189385 4907 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-68qpc"
Feb 26 15:50:50 crc kubenswrapper[4907]: I0226 15:50:50.211301 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e0e96b15-45f7-47f1-878e-57914ef18916-kube-api-access-xmcgc" (OuterVolumeSpecName: "kube-api-access-xmcgc") pod "e0e96b15-45f7-47f1-878e-57914ef18916" (UID: "e0e96b15-45f7-47f1-878e-57914ef18916"). InnerVolumeSpecName "kube-api-access-xmcgc". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 26 15:50:50 crc kubenswrapper[4907]: I0226 15:50:50.222995 4907 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-22zr8"
Feb 26 15:50:50 crc kubenswrapper[4907]: I0226 15:50:50.228269 4907 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/marketplace-operator-79b997595-dvcn5"
Feb 26 15:50:50 crc kubenswrapper[4907]: I0226 15:50:50.274104 4907 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-fcwbm"
Feb 26 15:50:50 crc kubenswrapper[4907]: I0226 15:50:50.277937 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e0e96b15-45f7-47f1-878e-57914ef18916-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "e0e96b15-45f7-47f1-878e-57914ef18916" (UID: "e0e96b15-45f7-47f1-878e-57914ef18916"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Feb 26 15:50:50 crc kubenswrapper[4907]: I0226 15:50:50.287784 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7nzhg\" (UniqueName: \"kubernetes.io/projected/d6b454c4-bdcd-4904-8564-84c414871c6d-kube-api-access-7nzhg\") pod \"d6b454c4-bdcd-4904-8564-84c414871c6d\" (UID: \"d6b454c4-bdcd-4904-8564-84c414871c6d\") "
Feb 26 15:50:50 crc kubenswrapper[4907]: I0226 15:50:50.290936 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d6b454c4-bdcd-4904-8564-84c414871c6d-utilities\") pod \"d6b454c4-bdcd-4904-8564-84c414871c6d\" (UID: \"d6b454c4-bdcd-4904-8564-84c414871c6d\") "
Feb 26 15:50:50 crc kubenswrapper[4907]: I0226 15:50:50.290968 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d6b454c4-bdcd-4904-8564-84c414871c6d-catalog-content\") pod \"d6b454c4-bdcd-4904-8564-84c414871c6d\" (UID: \"d6b454c4-bdcd-4904-8564-84c414871c6d\") "
Feb 26 15:50:50 crc kubenswrapper[4907]: I0226 15:50:50.291219 4907 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e0e96b15-45f7-47f1-878e-57914ef18916-utilities\") on node \"crc\" DevicePath \"\""
Feb 26 15:50:50 crc kubenswrapper[4907]: I0226 15:50:50.291232 4907 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xmcgc\" (UniqueName: \"kubernetes.io/projected/e0e96b15-45f7-47f1-878e-57914ef18916-kube-api-access-xmcgc\") on node \"crc\" DevicePath \"\""
Feb 26 15:50:50 crc kubenswrapper[4907]: I0226 15:50:50.291242 4907 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e0e96b15-45f7-47f1-878e-57914ef18916-catalog-content\") on node \"crc\" DevicePath \"\""
Feb 26 15:50:50 crc kubenswrapper[4907]: I0226 15:50:50.294493 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/d6b454c4-bdcd-4904-8564-84c414871c6d-utilities" (OuterVolumeSpecName: "utilities") pod "d6b454c4-bdcd-4904-8564-84c414871c6d" (UID: "d6b454c4-bdcd-4904-8564-84c414871c6d"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Feb 26 15:50:50 crc kubenswrapper[4907]: I0226 15:50:50.299857 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d6b454c4-bdcd-4904-8564-84c414871c6d-kube-api-access-7nzhg" (OuterVolumeSpecName: "kube-api-access-7nzhg") pod "d6b454c4-bdcd-4904-8564-84c414871c6d" (UID: "d6b454c4-bdcd-4904-8564-84c414871c6d"). InnerVolumeSpecName "kube-api-access-7nzhg". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 26 15:50:50 crc kubenswrapper[4907]: I0226 15:50:50.392253 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/4d3f9fc7-85b9-4095-af0d-7993e681ab2a-utilities\") pod \"4d3f9fc7-85b9-4095-af0d-7993e681ab2a\" (UID: \"4d3f9fc7-85b9-4095-af0d-7993e681ab2a\") "
Feb 26 15:50:50 crc kubenswrapper[4907]: I0226 15:50:50.392306 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/23df369e-238f-4fbc-99fa-b22c21011db0-marketplace-trusted-ca\") pod \"23df369e-238f-4fbc-99fa-b22c21011db0\" (UID: \"23df369e-238f-4fbc-99fa-b22c21011db0\") "
Feb 26 15:50:50 crc kubenswrapper[4907]: I0226 15:50:50.392330 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/6c70b66e-978a-4c7e-9892-5579869aa740-catalog-content\") pod \"6c70b66e-978a-4c7e-9892-5579869aa740\" (UID: \"6c70b66e-978a-4c7e-9892-5579869aa740\") "
Feb 26 15:50:50 crc kubenswrapper[4907]: I0226 15:50:50.392362 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/4d3f9fc7-85b9-4095-af0d-7993e681ab2a-catalog-content\") pod \"4d3f9fc7-85b9-4095-af0d-7993e681ab2a\" (UID: \"4d3f9fc7-85b9-4095-af0d-7993e681ab2a\") "
Feb 26 15:50:50 crc kubenswrapper[4907]: I0226 15:50:50.392421 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/6c70b66e-978a-4c7e-9892-5579869aa740-utilities\") pod \"6c70b66e-978a-4c7e-9892-5579869aa740\" (UID: \"6c70b66e-978a-4c7e-9892-5579869aa740\") "
Feb 26 15:50:50 crc kubenswrapper[4907]: I0226 15:50:50.392443 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-bkv2t\" (UniqueName: \"kubernetes.io/projected/6c70b66e-978a-4c7e-9892-5579869aa740-kube-api-access-bkv2t\") pod \"6c70b66e-978a-4c7e-9892-5579869aa740\" (UID: \"6c70b66e-978a-4c7e-9892-5579869aa740\") "
Feb 26 15:50:50 crc kubenswrapper[4907]: I0226 15:50:50.392505 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-g6kj7\" (UniqueName: \"kubernetes.io/projected/23df369e-238f-4fbc-99fa-b22c21011db0-kube-api-access-g6kj7\") pod \"23df369e-238f-4fbc-99fa-b22c21011db0\" (UID: \"23df369e-238f-4fbc-99fa-b22c21011db0\") "
Feb 26 15:50:50 crc kubenswrapper[4907]: I0226 15:50:50.392549 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/23df369e-238f-4fbc-99fa-b22c21011db0-marketplace-operator-metrics\") pod \"23df369e-238f-4fbc-99fa-b22c21011db0\" (UID: \"23df369e-238f-4fbc-99fa-b22c21011db0\") "
Feb 26 15:50:50 crc kubenswrapper[4907]: I0226 15:50:50.392624 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8pwjf\" (UniqueName: \"kubernetes.io/projected/4d3f9fc7-85b9-4095-af0d-7993e681ab2a-kube-api-access-8pwjf\") pod \"4d3f9fc7-85b9-4095-af0d-7993e681ab2a\" (UID: \"4d3f9fc7-85b9-4095-af0d-7993e681ab2a\") "
Feb 26 15:50:50 crc kubenswrapper[4907]: I0226 15:50:50.392895 4907 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7nzhg\" (UniqueName: \"kubernetes.io/projected/d6b454c4-bdcd-4904-8564-84c414871c6d-kube-api-access-7nzhg\") on node \"crc\" DevicePath \"\""
Feb 26 15:50:50 crc kubenswrapper[4907]: I0226 15:50:50.392924 4907 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d6b454c4-bdcd-4904-8564-84c414871c6d-utilities\") on node \"crc\" DevicePath \"\""
Feb 26 15:50:50 crc kubenswrapper[4907]: I0226 15:50:50.393789 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/23df369e-238f-4fbc-99fa-b22c21011db0-marketplace-trusted-ca" (OuterVolumeSpecName: "marketplace-trusted-ca") pod "23df369e-238f-4fbc-99fa-b22c21011db0" (UID: "23df369e-238f-4fbc-99fa-b22c21011db0"). InnerVolumeSpecName "marketplace-trusted-ca". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Feb 26 15:50:50 crc kubenswrapper[4907]: I0226 15:50:50.394139 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/6c70b66e-978a-4c7e-9892-5579869aa740-utilities" (OuterVolumeSpecName: "utilities") pod "6c70b66e-978a-4c7e-9892-5579869aa740" (UID: "6c70b66e-978a-4c7e-9892-5579869aa740"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Feb 26 15:50:50 crc kubenswrapper[4907]: I0226 15:50:50.394399 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/4d3f9fc7-85b9-4095-af0d-7993e681ab2a-utilities" (OuterVolumeSpecName: "utilities") pod "4d3f9fc7-85b9-4095-af0d-7993e681ab2a" (UID: "4d3f9fc7-85b9-4095-af0d-7993e681ab2a"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Feb 26 15:50:50 crc kubenswrapper[4907]: I0226 15:50:50.396165 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4d3f9fc7-85b9-4095-af0d-7993e681ab2a-kube-api-access-8pwjf" (OuterVolumeSpecName: "kube-api-access-8pwjf") pod "4d3f9fc7-85b9-4095-af0d-7993e681ab2a" (UID: "4d3f9fc7-85b9-4095-af0d-7993e681ab2a"). InnerVolumeSpecName "kube-api-access-8pwjf". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 26 15:50:50 crc kubenswrapper[4907]: I0226 15:50:50.398059 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/23df369e-238f-4fbc-99fa-b22c21011db0-kube-api-access-g6kj7" (OuterVolumeSpecName: "kube-api-access-g6kj7") pod "23df369e-238f-4fbc-99fa-b22c21011db0" (UID: "23df369e-238f-4fbc-99fa-b22c21011db0"). InnerVolumeSpecName "kube-api-access-g6kj7". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 26 15:50:50 crc kubenswrapper[4907]: I0226 15:50:50.398085 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/23df369e-238f-4fbc-99fa-b22c21011db0-marketplace-operator-metrics" (OuterVolumeSpecName: "marketplace-operator-metrics") pod "23df369e-238f-4fbc-99fa-b22c21011db0" (UID: "23df369e-238f-4fbc-99fa-b22c21011db0"). InnerVolumeSpecName "marketplace-operator-metrics". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 26 15:50:50 crc kubenswrapper[4907]: I0226 15:50:50.398147 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6c70b66e-978a-4c7e-9892-5579869aa740-kube-api-access-bkv2t" (OuterVolumeSpecName: "kube-api-access-bkv2t") pod "6c70b66e-978a-4c7e-9892-5579869aa740" (UID: "6c70b66e-978a-4c7e-9892-5579869aa740"). InnerVolumeSpecName "kube-api-access-bkv2t". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 26 15:50:50 crc kubenswrapper[4907]: I0226 15:50:50.439639 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/6c70b66e-978a-4c7e-9892-5579869aa740-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "6c70b66e-978a-4c7e-9892-5579869aa740" (UID: "6c70b66e-978a-4c7e-9892-5579869aa740"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Feb 26 15:50:50 crc kubenswrapper[4907]: I0226 15:50:50.444127 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/d6b454c4-bdcd-4904-8564-84c414871c6d-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "d6b454c4-bdcd-4904-8564-84c414871c6d" (UID: "d6b454c4-bdcd-4904-8564-84c414871c6d"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Feb 26 15:50:50 crc kubenswrapper[4907]: I0226 15:50:50.446228 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/4d3f9fc7-85b9-4095-af0d-7993e681ab2a-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "4d3f9fc7-85b9-4095-af0d-7993e681ab2a" (UID: "4d3f9fc7-85b9-4095-af0d-7993e681ab2a"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Feb 26 15:50:50 crc kubenswrapper[4907]: I0226 15:50:50.493896 4907 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/6c70b66e-978a-4c7e-9892-5579869aa740-utilities\") on node \"crc\" DevicePath \"\""
Feb 26 15:50:50 crc kubenswrapper[4907]: I0226 15:50:50.493925 4907 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-bkv2t\" (UniqueName: \"kubernetes.io/projected/6c70b66e-978a-4c7e-9892-5579869aa740-kube-api-access-bkv2t\") on node \"crc\" DevicePath \"\""
Feb 26 15:50:50 crc kubenswrapper[4907]: I0226 15:50:50.493937 4907 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d6b454c4-bdcd-4904-8564-84c414871c6d-catalog-content\") on node \"crc\" DevicePath \"\""
Feb 26 15:50:50 crc kubenswrapper[4907]: I0226 15:50:50.493946 4907 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-g6kj7\" (UniqueName: \"kubernetes.io/projected/23df369e-238f-4fbc-99fa-b22c21011db0-kube-api-access-g6kj7\") on node \"crc\" DevicePath \"\""
Feb 26 15:50:50 crc kubenswrapper[4907]: I0226 15:50:50.493970 4907 reconciler_common.go:293] "Volume detached for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/23df369e-238f-4fbc-99fa-b22c21011db0-marketplace-operator-metrics\") on node \"crc\" DevicePath \"\""
Feb 26 15:50:50 crc kubenswrapper[4907]: I0226 15:50:50.493980 4907 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8pwjf\" (UniqueName: \"kubernetes.io/projected/4d3f9fc7-85b9-4095-af0d-7993e681ab2a-kube-api-access-8pwjf\") on node \"crc\" DevicePath \"\""
Feb 26 15:50:50 crc kubenswrapper[4907]: I0226 15:50:50.493988 4907 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/4d3f9fc7-85b9-4095-af0d-7993e681ab2a-utilities\") on node \"crc\" DevicePath \"\""
Feb 26 15:50:50 crc kubenswrapper[4907]: I0226 15:50:50.493996 4907 reconciler_common.go:293] "Volume detached for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/23df369e-238f-4fbc-99fa-b22c21011db0-marketplace-trusted-ca\") on node \"crc\" DevicePath \"\""
Feb 26 15:50:50 crc kubenswrapper[4907]: I0226 15:50:50.494004 4907 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/6c70b66e-978a-4c7e-9892-5579869aa740-catalog-content\") on node \"crc\" DevicePath \"\""
Feb 26 15:50:50 crc kubenswrapper[4907]: I0226 15:50:50.494012 4907 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/4d3f9fc7-85b9-4095-af0d-7993e681ab2a-catalog-content\") on node \"crc\" DevicePath \"\""
Feb 26 15:50:50 crc kubenswrapper[4907]: I0226 15:50:50.614636 4907 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-svjkc"]
Feb 26 15:50:50 crc kubenswrapper[4907]: W0226 15:50:50.619866 4907 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod77a34fa8_40ba_4944_bd27_03a9a4f7761f.slice/crio-785434087adcf04b362a1ef9408f06d9dcca9eb7b2ea46c8fe6748ff8d1a8c1d WatchSource:0}: Error finding container 785434087adcf04b362a1ef9408f06d9dcca9eb7b2ea46c8fe6748ff8d1a8c1d: Status 404 returned error can't find the container with id 785434087adcf04b362a1ef9408f06d9dcca9eb7b2ea46c8fe6748ff8d1a8c1d
Feb 26 15:50:50 crc kubenswrapper[4907]: I0226 15:50:50.915073 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-svjkc" event={"ID":"77a34fa8-40ba-4944-bd27-03a9a4f7761f","Type":"ContainerStarted","Data":"d4bcb6114565a983195c09c7b8c0aced9f663d975f7bf2f95fa0269ba791da04"}
Feb 26 15:50:50 crc kubenswrapper[4907]: I0226 15:50:50.915385 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod"
pod="openshift-marketplace/marketplace-operator-79b997595-svjkc" event={"ID":"77a34fa8-40ba-4944-bd27-03a9a4f7761f","Type":"ContainerStarted","Data":"785434087adcf04b362a1ef9408f06d9dcca9eb7b2ea46c8fe6748ff8d1a8c1d"} Feb 26 15:50:50 crc kubenswrapper[4907]: I0226 15:50:50.916525 4907 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/marketplace-operator-79b997595-svjkc" Feb 26 15:50:50 crc kubenswrapper[4907]: I0226 15:50:50.918508 4907 patch_prober.go:28] interesting pod/marketplace-operator-79b997595-svjkc container/marketplace-operator namespace/openshift-marketplace: Readiness probe status=failure output="Get \"http://10.217.0.76:8080/healthz\": dial tcp 10.217.0.76:8080: connect: connection refused" start-of-body= Feb 26 15:50:50 crc kubenswrapper[4907]: I0226 15:50:50.918550 4907 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-marketplace/marketplace-operator-79b997595-svjkc" podUID="77a34fa8-40ba-4944-bd27-03a9a4f7761f" containerName="marketplace-operator" probeResult="failure" output="Get \"http://10.217.0.76:8080/healthz\": dial tcp 10.217.0.76:8080: connect: connection refused" Feb 26 15:50:50 crc kubenswrapper[4907]: I0226 15:50:50.919023 4907 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/marketplace-operator-79b997595-dvcn5" Feb 26 15:50:50 crc kubenswrapper[4907]: I0226 15:50:50.919194 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-dvcn5" event={"ID":"23df369e-238f-4fbc-99fa-b22c21011db0","Type":"ContainerDied","Data":"463fce778766ab780cd80023770cbb1f0ce53f29756763e00f5d14a8a833939e"} Feb 26 15:50:50 crc kubenswrapper[4907]: I0226 15:50:50.919232 4907 scope.go:117] "RemoveContainer" containerID="696e6ee06370721d0f2fc0767f48826636ddd1416581a40026f5923f1f382ca8" Feb 26 15:50:50 crc kubenswrapper[4907]: I0226 15:50:50.923884 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-22zr8" event={"ID":"4d3f9fc7-85b9-4095-af0d-7993e681ab2a","Type":"ContainerDied","Data":"fcfcc54b656f7ed5e451008138c11019c142d336aeba7be471de266a08620106"} Feb 26 15:50:50 crc kubenswrapper[4907]: I0226 15:50:50.923996 4907 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-22zr8" Feb 26 15:50:50 crc kubenswrapper[4907]: I0226 15:50:50.929485 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-68qpc" event={"ID":"d6b454c4-bdcd-4904-8564-84c414871c6d","Type":"ContainerDied","Data":"5cbe46269cbd05823103c8a0dc8d00b43048842aa45a9e8580a1f5c4a8c568dc"} Feb 26 15:50:50 crc kubenswrapper[4907]: I0226 15:50:50.929634 4907 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-68qpc" Feb 26 15:50:50 crc kubenswrapper[4907]: I0226 15:50:50.934938 4907 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/marketplace-operator-79b997595-svjkc" podStartSLOduration=1.934923577 podStartE2EDuration="1.934923577s" podCreationTimestamp="2026-02-26 15:50:49 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-26 15:50:50.934677721 +0000 UTC m=+513.453239570" watchObservedRunningTime="2026-02-26 15:50:50.934923577 +0000 UTC m=+513.453485426" Feb 26 15:50:50 crc kubenswrapper[4907]: I0226 15:50:50.937971 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-tqxjz" event={"ID":"e0e96b15-45f7-47f1-878e-57914ef18916","Type":"ContainerDied","Data":"f2b308fc94ead912b6e64ba7c506bcc5ba9109de65514b1841ba893f7ccf2ca5"} Feb 26 15:50:50 crc kubenswrapper[4907]: I0226 15:50:50.938034 4907 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-tqxjz" Feb 26 15:50:50 crc kubenswrapper[4907]: I0226 15:50:50.939190 4907 scope.go:117] "RemoveContainer" containerID="c72c74a6fe179f86c2339265699491607cd58e186d909b6af9e06f9ddcbd3100" Feb 26 15:50:50 crc kubenswrapper[4907]: I0226 15:50:50.939855 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-fcwbm" event={"ID":"6c70b66e-978a-4c7e-9892-5579869aa740","Type":"ContainerDied","Data":"266f70a4b4e3e3430a1da51f67bd8fb99828c7ab0716557a0762d399b723bf7d"} Feb 26 15:50:50 crc kubenswrapper[4907]: I0226 15:50:50.939996 4907 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-fcwbm" Feb 26 15:50:50 crc kubenswrapper[4907]: I0226 15:50:50.964282 4907 scope.go:117] "RemoveContainer" containerID="337eab21d91536771f9db3b8bc9e6c75eb59aa9d86381d97d7e4d96004617014" Feb 26 15:50:50 crc kubenswrapper[4907]: I0226 15:50:50.993803 4907 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-tqxjz"] Feb 26 15:50:50 crc kubenswrapper[4907]: I0226 15:50:50.997204 4907 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-tqxjz"] Feb 26 15:50:50 crc kubenswrapper[4907]: I0226 15:50:50.999537 4907 scope.go:117] "RemoveContainer" containerID="c2c08520c50b5de1170decee9bdf0e675f941d01781f08e93d921a3eca83bc15" Feb 26 15:50:51 crc kubenswrapper[4907]: I0226 15:50:51.027677 4907 scope.go:117] "RemoveContainer" containerID="ff495918e96a3698db9a9a8dd4dd7887a76c7f0afe3392521131c07da299b110" Feb 26 15:50:51 crc kubenswrapper[4907]: I0226 15:50:51.035349 4907 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-68qpc"] Feb 26 15:50:51 crc kubenswrapper[4907]: I0226 15:50:51.040140 4907 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-68qpc"] Feb 26 15:50:51 crc kubenswrapper[4907]: I0226 15:50:51.049107 4907 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-22zr8"] Feb 26 15:50:51 crc kubenswrapper[4907]: I0226 15:50:51.059799 4907 scope.go:117] "RemoveContainer" containerID="5b6bba62015d7f1e8bca64181979b1590f0fdcc51bc221dc9e17782f8f30c36e" Feb 26 15:50:51 crc kubenswrapper[4907]: I0226 15:50:51.059912 4907 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-22zr8"] Feb 26 15:50:51 crc kubenswrapper[4907]: I0226 15:50:51.062199 4907 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-fcwbm"] 
Feb 26 15:50:51 crc kubenswrapper[4907]: I0226 15:50:51.065190 4907 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-fcwbm"] Feb 26 15:50:51 crc kubenswrapper[4907]: I0226 15:50:51.070406 4907 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-dvcn5"] Feb 26 15:50:51 crc kubenswrapper[4907]: I0226 15:50:51.070456 4907 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-dvcn5"] Feb 26 15:50:51 crc kubenswrapper[4907]: I0226 15:50:51.078789 4907 scope.go:117] "RemoveContainer" containerID="ea695100592dfd1eadf4890e219236bb7912653b40207b5e0dff1b4377913f3c" Feb 26 15:50:51 crc kubenswrapper[4907]: I0226 15:50:51.092658 4907 scope.go:117] "RemoveContainer" containerID="f014a26fb915e7edcbb3f7cb78c727a2b21301773c46b1bd95b32e8ed2744a66" Feb 26 15:50:51 crc kubenswrapper[4907]: I0226 15:50:51.112326 4907 scope.go:117] "RemoveContainer" containerID="b0eccf1b45b5e24d81664d8f91f70b8dbe57b62bf009ff26d5fce1594fe459fc" Feb 26 15:50:51 crc kubenswrapper[4907]: I0226 15:50:51.126741 4907 scope.go:117] "RemoveContainer" containerID="37508190b8d35d7607acf4d938f773e568560ddcd0367749779bcc9bc0dd24b1" Feb 26 15:50:51 crc kubenswrapper[4907]: I0226 15:50:51.147077 4907 scope.go:117] "RemoveContainer" containerID="27fc85274312d14655440bdd5823fceaeff047f1528a4370fcb212cab5f45070" Feb 26 15:50:51 crc kubenswrapper[4907]: I0226 15:50:51.159297 4907 scope.go:117] "RemoveContainer" containerID="0f48aaaabc782b274056eec753def33f5ba9dbd594bd7b6f158793c163222e37" Feb 26 15:50:51 crc kubenswrapper[4907]: I0226 15:50:51.171847 4907 scope.go:117] "RemoveContainer" containerID="a497956b577958b8fef18a1420d2d621f7ae083a86cdbc6716f46357b1608777" Feb 26 15:50:51 crc kubenswrapper[4907]: I0226 15:50:51.839960 4907 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-vcgkv"] Feb 26 15:50:51 crc 
kubenswrapper[4907]: E0226 15:50:51.840452 4907 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d6b454c4-bdcd-4904-8564-84c414871c6d" containerName="extract-content" Feb 26 15:50:51 crc kubenswrapper[4907]: I0226 15:50:51.840468 4907 state_mem.go:107] "Deleted CPUSet assignment" podUID="d6b454c4-bdcd-4904-8564-84c414871c6d" containerName="extract-content" Feb 26 15:50:51 crc kubenswrapper[4907]: E0226 15:50:51.840475 4907 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e0e96b15-45f7-47f1-878e-57914ef18916" containerName="registry-server" Feb 26 15:50:51 crc kubenswrapper[4907]: I0226 15:50:51.840482 4907 state_mem.go:107] "Deleted CPUSet assignment" podUID="e0e96b15-45f7-47f1-878e-57914ef18916" containerName="registry-server" Feb 26 15:50:51 crc kubenswrapper[4907]: E0226 15:50:51.840496 4907 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="23df369e-238f-4fbc-99fa-b22c21011db0" containerName="marketplace-operator" Feb 26 15:50:51 crc kubenswrapper[4907]: I0226 15:50:51.840503 4907 state_mem.go:107] "Deleted CPUSet assignment" podUID="23df369e-238f-4fbc-99fa-b22c21011db0" containerName="marketplace-operator" Feb 26 15:50:51 crc kubenswrapper[4907]: E0226 15:50:51.840511 4907 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6c70b66e-978a-4c7e-9892-5579869aa740" containerName="extract-content" Feb 26 15:50:51 crc kubenswrapper[4907]: I0226 15:50:51.840518 4907 state_mem.go:107] "Deleted CPUSet assignment" podUID="6c70b66e-978a-4c7e-9892-5579869aa740" containerName="extract-content" Feb 26 15:50:51 crc kubenswrapper[4907]: E0226 15:50:51.840528 4907 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e0e96b15-45f7-47f1-878e-57914ef18916" containerName="extract-utilities" Feb 26 15:50:51 crc kubenswrapper[4907]: I0226 15:50:51.840534 4907 state_mem.go:107] "Deleted CPUSet assignment" podUID="e0e96b15-45f7-47f1-878e-57914ef18916" containerName="extract-utilities" Feb 26 15:50:51 crc 
kubenswrapper[4907]: E0226 15:50:51.840544 4907 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d6b454c4-bdcd-4904-8564-84c414871c6d" containerName="extract-utilities" Feb 26 15:50:51 crc kubenswrapper[4907]: I0226 15:50:51.840552 4907 state_mem.go:107] "Deleted CPUSet assignment" podUID="d6b454c4-bdcd-4904-8564-84c414871c6d" containerName="extract-utilities" Feb 26 15:50:51 crc kubenswrapper[4907]: E0226 15:50:51.840559 4907 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6c70b66e-978a-4c7e-9892-5579869aa740" containerName="registry-server" Feb 26 15:50:51 crc kubenswrapper[4907]: I0226 15:50:51.840564 4907 state_mem.go:107] "Deleted CPUSet assignment" podUID="6c70b66e-978a-4c7e-9892-5579869aa740" containerName="registry-server" Feb 26 15:50:51 crc kubenswrapper[4907]: E0226 15:50:51.840574 4907 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e0e96b15-45f7-47f1-878e-57914ef18916" containerName="extract-content" Feb 26 15:50:51 crc kubenswrapper[4907]: I0226 15:50:51.840580 4907 state_mem.go:107] "Deleted CPUSet assignment" podUID="e0e96b15-45f7-47f1-878e-57914ef18916" containerName="extract-content" Feb 26 15:50:51 crc kubenswrapper[4907]: E0226 15:50:51.840602 4907 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6c70b66e-978a-4c7e-9892-5579869aa740" containerName="extract-utilities" Feb 26 15:50:51 crc kubenswrapper[4907]: I0226 15:50:51.840609 4907 state_mem.go:107] "Deleted CPUSet assignment" podUID="6c70b66e-978a-4c7e-9892-5579869aa740" containerName="extract-utilities" Feb 26 15:50:51 crc kubenswrapper[4907]: E0226 15:50:51.840616 4907 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4d3f9fc7-85b9-4095-af0d-7993e681ab2a" containerName="extract-content" Feb 26 15:50:51 crc kubenswrapper[4907]: I0226 15:50:51.840623 4907 state_mem.go:107] "Deleted CPUSet assignment" podUID="4d3f9fc7-85b9-4095-af0d-7993e681ab2a" containerName="extract-content" Feb 26 15:50:51 crc 
kubenswrapper[4907]: E0226 15:50:51.840631 4907 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4d3f9fc7-85b9-4095-af0d-7993e681ab2a" containerName="registry-server" Feb 26 15:50:51 crc kubenswrapper[4907]: I0226 15:50:51.840637 4907 state_mem.go:107] "Deleted CPUSet assignment" podUID="4d3f9fc7-85b9-4095-af0d-7993e681ab2a" containerName="registry-server" Feb 26 15:50:51 crc kubenswrapper[4907]: E0226 15:50:51.840646 4907 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4d3f9fc7-85b9-4095-af0d-7993e681ab2a" containerName="extract-utilities" Feb 26 15:50:51 crc kubenswrapper[4907]: I0226 15:50:51.840652 4907 state_mem.go:107] "Deleted CPUSet assignment" podUID="4d3f9fc7-85b9-4095-af0d-7993e681ab2a" containerName="extract-utilities" Feb 26 15:50:51 crc kubenswrapper[4907]: E0226 15:50:51.840659 4907 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d6b454c4-bdcd-4904-8564-84c414871c6d" containerName="registry-server" Feb 26 15:50:51 crc kubenswrapper[4907]: I0226 15:50:51.840665 4907 state_mem.go:107] "Deleted CPUSet assignment" podUID="d6b454c4-bdcd-4904-8564-84c414871c6d" containerName="registry-server" Feb 26 15:50:51 crc kubenswrapper[4907]: I0226 15:50:51.840752 4907 memory_manager.go:354] "RemoveStaleState removing state" podUID="6c70b66e-978a-4c7e-9892-5579869aa740" containerName="registry-server" Feb 26 15:50:51 crc kubenswrapper[4907]: I0226 15:50:51.840764 4907 memory_manager.go:354] "RemoveStaleState removing state" podUID="d6b454c4-bdcd-4904-8564-84c414871c6d" containerName="registry-server" Feb 26 15:50:51 crc kubenswrapper[4907]: I0226 15:50:51.840772 4907 memory_manager.go:354] "RemoveStaleState removing state" podUID="23df369e-238f-4fbc-99fa-b22c21011db0" containerName="marketplace-operator" Feb 26 15:50:51 crc kubenswrapper[4907]: I0226 15:50:51.840781 4907 memory_manager.go:354] "RemoveStaleState removing state" podUID="e0e96b15-45f7-47f1-878e-57914ef18916" containerName="registry-server" Feb 26 
15:50:51 crc kubenswrapper[4907]: I0226 15:50:51.840788 4907 memory_manager.go:354] "RemoveStaleState removing state" podUID="4d3f9fc7-85b9-4095-af0d-7993e681ab2a" containerName="registry-server" Feb 26 15:50:51 crc kubenswrapper[4907]: I0226 15:50:51.841439 4907 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-vcgkv" Feb 26 15:50:51 crc kubenswrapper[4907]: I0226 15:50:51.844068 4907 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"redhat-marketplace-dockercfg-x2ctb" Feb 26 15:50:51 crc kubenswrapper[4907]: I0226 15:50:51.855076 4907 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-vcgkv"] Feb 26 15:50:51 crc kubenswrapper[4907]: I0226 15:50:51.951162 4907 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/marketplace-operator-79b997595-svjkc" Feb 26 15:50:52 crc kubenswrapper[4907]: I0226 15:50:52.018474 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/012fe452-e0b5-4248-a110-8bf778e9595d-catalog-content\") pod \"redhat-marketplace-vcgkv\" (UID: \"012fe452-e0b5-4248-a110-8bf778e9595d\") " pod="openshift-marketplace/redhat-marketplace-vcgkv" Feb 26 15:50:52 crc kubenswrapper[4907]: I0226 15:50:52.018634 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/012fe452-e0b5-4248-a110-8bf778e9595d-utilities\") pod \"redhat-marketplace-vcgkv\" (UID: \"012fe452-e0b5-4248-a110-8bf778e9595d\") " pod="openshift-marketplace/redhat-marketplace-vcgkv" Feb 26 15:50:52 crc kubenswrapper[4907]: I0226 15:50:52.018680 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-drc9n\" (UniqueName: 
\"kubernetes.io/projected/012fe452-e0b5-4248-a110-8bf778e9595d-kube-api-access-drc9n\") pod \"redhat-marketplace-vcgkv\" (UID: \"012fe452-e0b5-4248-a110-8bf778e9595d\") " pod="openshift-marketplace/redhat-marketplace-vcgkv" Feb 26 15:50:52 crc kubenswrapper[4907]: I0226 15:50:52.045967 4907 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-mr9bt"] Feb 26 15:50:52 crc kubenswrapper[4907]: I0226 15:50:52.048951 4907 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-mr9bt"] Feb 26 15:50:52 crc kubenswrapper[4907]: I0226 15:50:52.048940 4907 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-mr9bt" Feb 26 15:50:52 crc kubenswrapper[4907]: I0226 15:50:52.052062 4907 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"redhat-operators-dockercfg-ct8rh" Feb 26 15:50:52 crc kubenswrapper[4907]: I0226 15:50:52.120244 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/012fe452-e0b5-4248-a110-8bf778e9595d-utilities\") pod \"redhat-marketplace-vcgkv\" (UID: \"012fe452-e0b5-4248-a110-8bf778e9595d\") " pod="openshift-marketplace/redhat-marketplace-vcgkv" Feb 26 15:50:52 crc kubenswrapper[4907]: I0226 15:50:52.120331 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-drc9n\" (UniqueName: \"kubernetes.io/projected/012fe452-e0b5-4248-a110-8bf778e9595d-kube-api-access-drc9n\") pod \"redhat-marketplace-vcgkv\" (UID: \"012fe452-e0b5-4248-a110-8bf778e9595d\") " pod="openshift-marketplace/redhat-marketplace-vcgkv" Feb 26 15:50:52 crc kubenswrapper[4907]: I0226 15:50:52.120439 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/012fe452-e0b5-4248-a110-8bf778e9595d-catalog-content\") pod 
\"redhat-marketplace-vcgkv\" (UID: \"012fe452-e0b5-4248-a110-8bf778e9595d\") " pod="openshift-marketplace/redhat-marketplace-vcgkv" Feb 26 15:50:52 crc kubenswrapper[4907]: I0226 15:50:52.121376 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/012fe452-e0b5-4248-a110-8bf778e9595d-catalog-content\") pod \"redhat-marketplace-vcgkv\" (UID: \"012fe452-e0b5-4248-a110-8bf778e9595d\") " pod="openshift-marketplace/redhat-marketplace-vcgkv" Feb 26 15:50:52 crc kubenswrapper[4907]: I0226 15:50:52.121456 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/012fe452-e0b5-4248-a110-8bf778e9595d-utilities\") pod \"redhat-marketplace-vcgkv\" (UID: \"012fe452-e0b5-4248-a110-8bf778e9595d\") " pod="openshift-marketplace/redhat-marketplace-vcgkv" Feb 26 15:50:52 crc kubenswrapper[4907]: I0226 15:50:52.133571 4907 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="23df369e-238f-4fbc-99fa-b22c21011db0" path="/var/lib/kubelet/pods/23df369e-238f-4fbc-99fa-b22c21011db0/volumes" Feb 26 15:50:52 crc kubenswrapper[4907]: I0226 15:50:52.134165 4907 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4d3f9fc7-85b9-4095-af0d-7993e681ab2a" path="/var/lib/kubelet/pods/4d3f9fc7-85b9-4095-af0d-7993e681ab2a/volumes" Feb 26 15:50:52 crc kubenswrapper[4907]: I0226 15:50:52.134894 4907 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6c70b66e-978a-4c7e-9892-5579869aa740" path="/var/lib/kubelet/pods/6c70b66e-978a-4c7e-9892-5579869aa740/volumes" Feb 26 15:50:52 crc kubenswrapper[4907]: I0226 15:50:52.136063 4907 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d6b454c4-bdcd-4904-8564-84c414871c6d" path="/var/lib/kubelet/pods/d6b454c4-bdcd-4904-8564-84c414871c6d/volumes" Feb 26 15:50:52 crc kubenswrapper[4907]: I0226 15:50:52.136837 4907 kubelet_volumes.go:163] "Cleaned up 
orphaned pod volumes dir" podUID="e0e96b15-45f7-47f1-878e-57914ef18916" path="/var/lib/kubelet/pods/e0e96b15-45f7-47f1-878e-57914ef18916/volumes" Feb 26 15:50:52 crc kubenswrapper[4907]: I0226 15:50:52.137090 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-drc9n\" (UniqueName: \"kubernetes.io/projected/012fe452-e0b5-4248-a110-8bf778e9595d-kube-api-access-drc9n\") pod \"redhat-marketplace-vcgkv\" (UID: \"012fe452-e0b5-4248-a110-8bf778e9595d\") " pod="openshift-marketplace/redhat-marketplace-vcgkv" Feb 26 15:50:52 crc kubenswrapper[4907]: I0226 15:50:52.169838 4907 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-vcgkv" Feb 26 15:50:52 crc kubenswrapper[4907]: I0226 15:50:52.221559 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/01b123c0-d91d-4bed-8fd2-7931cbca4acb-utilities\") pod \"redhat-operators-mr9bt\" (UID: \"01b123c0-d91d-4bed-8fd2-7931cbca4acb\") " pod="openshift-marketplace/redhat-operators-mr9bt" Feb 26 15:50:52 crc kubenswrapper[4907]: I0226 15:50:52.221648 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-z8zxs\" (UniqueName: \"kubernetes.io/projected/01b123c0-d91d-4bed-8fd2-7931cbca4acb-kube-api-access-z8zxs\") pod \"redhat-operators-mr9bt\" (UID: \"01b123c0-d91d-4bed-8fd2-7931cbca4acb\") " pod="openshift-marketplace/redhat-operators-mr9bt" Feb 26 15:50:52 crc kubenswrapper[4907]: I0226 15:50:52.221702 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/01b123c0-d91d-4bed-8fd2-7931cbca4acb-catalog-content\") pod \"redhat-operators-mr9bt\" (UID: \"01b123c0-d91d-4bed-8fd2-7931cbca4acb\") " pod="openshift-marketplace/redhat-operators-mr9bt" Feb 26 15:50:52 crc 
kubenswrapper[4907]: I0226 15:50:52.323281 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/01b123c0-d91d-4bed-8fd2-7931cbca4acb-utilities\") pod \"redhat-operators-mr9bt\" (UID: \"01b123c0-d91d-4bed-8fd2-7931cbca4acb\") " pod="openshift-marketplace/redhat-operators-mr9bt" Feb 26 15:50:52 crc kubenswrapper[4907]: I0226 15:50:52.323799 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-z8zxs\" (UniqueName: \"kubernetes.io/projected/01b123c0-d91d-4bed-8fd2-7931cbca4acb-kube-api-access-z8zxs\") pod \"redhat-operators-mr9bt\" (UID: \"01b123c0-d91d-4bed-8fd2-7931cbca4acb\") " pod="openshift-marketplace/redhat-operators-mr9bt" Feb 26 15:50:52 crc kubenswrapper[4907]: I0226 15:50:52.323830 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/01b123c0-d91d-4bed-8fd2-7931cbca4acb-catalog-content\") pod \"redhat-operators-mr9bt\" (UID: \"01b123c0-d91d-4bed-8fd2-7931cbca4acb\") " pod="openshift-marketplace/redhat-operators-mr9bt" Feb 26 15:50:52 crc kubenswrapper[4907]: I0226 15:50:52.323917 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/01b123c0-d91d-4bed-8fd2-7931cbca4acb-utilities\") pod \"redhat-operators-mr9bt\" (UID: \"01b123c0-d91d-4bed-8fd2-7931cbca4acb\") " pod="openshift-marketplace/redhat-operators-mr9bt" Feb 26 15:50:52 crc kubenswrapper[4907]: I0226 15:50:52.324156 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/01b123c0-d91d-4bed-8fd2-7931cbca4acb-catalog-content\") pod \"redhat-operators-mr9bt\" (UID: \"01b123c0-d91d-4bed-8fd2-7931cbca4acb\") " pod="openshift-marketplace/redhat-operators-mr9bt" Feb 26 15:50:52 crc kubenswrapper[4907]: I0226 15:50:52.354699 4907 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-z8zxs\" (UniqueName: \"kubernetes.io/projected/01b123c0-d91d-4bed-8fd2-7931cbca4acb-kube-api-access-z8zxs\") pod \"redhat-operators-mr9bt\" (UID: \"01b123c0-d91d-4bed-8fd2-7931cbca4acb\") " pod="openshift-marketplace/redhat-operators-mr9bt" Feb 26 15:50:52 crc kubenswrapper[4907]: I0226 15:50:52.377780 4907 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-mr9bt" Feb 26 15:50:52 crc kubenswrapper[4907]: I0226 15:50:52.447406 4907 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-vcgkv"] Feb 26 15:50:52 crc kubenswrapper[4907]: I0226 15:50:52.801147 4907 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-mr9bt"] Feb 26 15:50:52 crc kubenswrapper[4907]: W0226 15:50:52.803731 4907 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod01b123c0_d91d_4bed_8fd2_7931cbca4acb.slice/crio-18b8230bf98b12d4333df4c7bebad122f48467ba8f9ca074ae6a8fb7af336ab3 WatchSource:0}: Error finding container 18b8230bf98b12d4333df4c7bebad122f48467ba8f9ca074ae6a8fb7af336ab3: Status 404 returned error can't find the container with id 18b8230bf98b12d4333df4c7bebad122f48467ba8f9ca074ae6a8fb7af336ab3 Feb 26 15:50:52 crc kubenswrapper[4907]: I0226 15:50:52.959329 4907 generic.go:334] "Generic (PLEG): container finished" podID="01b123c0-d91d-4bed-8fd2-7931cbca4acb" containerID="812da3c9cee47cf79a68fae6112fc68c03242f40011b0ee61f6f750d6a681b59" exitCode=0 Feb 26 15:50:52 crc kubenswrapper[4907]: I0226 15:50:52.959495 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-mr9bt" event={"ID":"01b123c0-d91d-4bed-8fd2-7931cbca4acb","Type":"ContainerDied","Data":"812da3c9cee47cf79a68fae6112fc68c03242f40011b0ee61f6f750d6a681b59"} Feb 26 15:50:52 crc 
kubenswrapper[4907]: I0226 15:50:52.959536 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-mr9bt" event={"ID":"01b123c0-d91d-4bed-8fd2-7931cbca4acb","Type":"ContainerStarted","Data":"18b8230bf98b12d4333df4c7bebad122f48467ba8f9ca074ae6a8fb7af336ab3"} Feb 26 15:50:52 crc kubenswrapper[4907]: I0226 15:50:52.960795 4907 generic.go:334] "Generic (PLEG): container finished" podID="012fe452-e0b5-4248-a110-8bf778e9595d" containerID="1138d39dacf54f317c6a8c1fff06090607f9ac5cc49e1657ea1ce1f65e15623d" exitCode=0 Feb 26 15:50:52 crc kubenswrapper[4907]: I0226 15:50:52.960858 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-vcgkv" event={"ID":"012fe452-e0b5-4248-a110-8bf778e9595d","Type":"ContainerDied","Data":"1138d39dacf54f317c6a8c1fff06090607f9ac5cc49e1657ea1ce1f65e15623d"} Feb 26 15:50:52 crc kubenswrapper[4907]: I0226 15:50:52.960921 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-vcgkv" event={"ID":"012fe452-e0b5-4248-a110-8bf778e9595d","Type":"ContainerStarted","Data":"add87eda723a99f745a2f21f6b49c11247e67a548565bc44b3143ea0457d3091"} Feb 26 15:50:53 crc kubenswrapper[4907]: I0226 15:50:53.966505 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-mr9bt" event={"ID":"01b123c0-d91d-4bed-8fd2-7931cbca4acb","Type":"ContainerStarted","Data":"5fabfddc0abae85e5ef3784a370d90ca5d4b37c7311426548969c5002abfd639"} Feb 26 15:50:53 crc kubenswrapper[4907]: I0226 15:50:53.968753 4907 generic.go:334] "Generic (PLEG): container finished" podID="012fe452-e0b5-4248-a110-8bf778e9595d" containerID="b544b487d56f656aed23c7542f44e99fe6443f79675f516a213afca08d9d3501" exitCode=0 Feb 26 15:50:53 crc kubenswrapper[4907]: I0226 15:50:53.969191 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-vcgkv" 
event={"ID":"012fe452-e0b5-4248-a110-8bf778e9595d","Type":"ContainerDied","Data":"b544b487d56f656aed23c7542f44e99fe6443f79675f516a213afca08d9d3501"} Feb 26 15:50:54 crc kubenswrapper[4907]: I0226 15:50:54.242297 4907 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-xbpfd"] Feb 26 15:50:54 crc kubenswrapper[4907]: I0226 15:50:54.243761 4907 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-xbpfd" Feb 26 15:50:54 crc kubenswrapper[4907]: I0226 15:50:54.247958 4907 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"community-operators-dockercfg-dmngl" Feb 26 15:50:54 crc kubenswrapper[4907]: I0226 15:50:54.259797 4907 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-xbpfd"] Feb 26 15:50:54 crc kubenswrapper[4907]: I0226 15:50:54.353782 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ql5d4\" (UniqueName: \"kubernetes.io/projected/12fc0143-8c96-4837-99ce-f5b7e447f10b-kube-api-access-ql5d4\") pod \"community-operators-xbpfd\" (UID: \"12fc0143-8c96-4837-99ce-f5b7e447f10b\") " pod="openshift-marketplace/community-operators-xbpfd" Feb 26 15:50:54 crc kubenswrapper[4907]: I0226 15:50:54.353835 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/12fc0143-8c96-4837-99ce-f5b7e447f10b-catalog-content\") pod \"community-operators-xbpfd\" (UID: \"12fc0143-8c96-4837-99ce-f5b7e447f10b\") " pod="openshift-marketplace/community-operators-xbpfd" Feb 26 15:50:54 crc kubenswrapper[4907]: I0226 15:50:54.353895 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/12fc0143-8c96-4837-99ce-f5b7e447f10b-utilities\") pod 
\"community-operators-xbpfd\" (UID: \"12fc0143-8c96-4837-99ce-f5b7e447f10b\") " pod="openshift-marketplace/community-operators-xbpfd" Feb 26 15:50:54 crc kubenswrapper[4907]: I0226 15:50:54.455850 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ql5d4\" (UniqueName: \"kubernetes.io/projected/12fc0143-8c96-4837-99ce-f5b7e447f10b-kube-api-access-ql5d4\") pod \"community-operators-xbpfd\" (UID: \"12fc0143-8c96-4837-99ce-f5b7e447f10b\") " pod="openshift-marketplace/community-operators-xbpfd" Feb 26 15:50:54 crc kubenswrapper[4907]: I0226 15:50:54.456252 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/12fc0143-8c96-4837-99ce-f5b7e447f10b-catalog-content\") pod \"community-operators-xbpfd\" (UID: \"12fc0143-8c96-4837-99ce-f5b7e447f10b\") " pod="openshift-marketplace/community-operators-xbpfd" Feb 26 15:50:54 crc kubenswrapper[4907]: I0226 15:50:54.456284 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/12fc0143-8c96-4837-99ce-f5b7e447f10b-utilities\") pod \"community-operators-xbpfd\" (UID: \"12fc0143-8c96-4837-99ce-f5b7e447f10b\") " pod="openshift-marketplace/community-operators-xbpfd" Feb 26 15:50:54 crc kubenswrapper[4907]: I0226 15:50:54.456882 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/12fc0143-8c96-4837-99ce-f5b7e447f10b-utilities\") pod \"community-operators-xbpfd\" (UID: \"12fc0143-8c96-4837-99ce-f5b7e447f10b\") " pod="openshift-marketplace/community-operators-xbpfd" Feb 26 15:50:54 crc kubenswrapper[4907]: I0226 15:50:54.456959 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/12fc0143-8c96-4837-99ce-f5b7e447f10b-catalog-content\") pod \"community-operators-xbpfd\" (UID: 
\"12fc0143-8c96-4837-99ce-f5b7e447f10b\") " pod="openshift-marketplace/community-operators-xbpfd" Feb 26 15:50:54 crc kubenswrapper[4907]: I0226 15:50:54.462380 4907 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-xttzz"] Feb 26 15:50:54 crc kubenswrapper[4907]: I0226 15:50:54.464033 4907 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-xttzz" Feb 26 15:50:54 crc kubenswrapper[4907]: I0226 15:50:54.466725 4907 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-xttzz"] Feb 26 15:50:54 crc kubenswrapper[4907]: I0226 15:50:54.467315 4907 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"certified-operators-dockercfg-4rs5g" Feb 26 15:50:54 crc kubenswrapper[4907]: I0226 15:50:54.486978 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ql5d4\" (UniqueName: \"kubernetes.io/projected/12fc0143-8c96-4837-99ce-f5b7e447f10b-kube-api-access-ql5d4\") pod \"community-operators-xbpfd\" (UID: \"12fc0143-8c96-4837-99ce-f5b7e447f10b\") " pod="openshift-marketplace/community-operators-xbpfd" Feb 26 15:50:54 crc kubenswrapper[4907]: I0226 15:50:54.560239 4907 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-xbpfd" Feb 26 15:50:54 crc kubenswrapper[4907]: I0226 15:50:54.659551 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6g9rc\" (UniqueName: \"kubernetes.io/projected/df2526da-5738-4040-afe3-6019b50203ae-kube-api-access-6g9rc\") pod \"certified-operators-xttzz\" (UID: \"df2526da-5738-4040-afe3-6019b50203ae\") " pod="openshift-marketplace/certified-operators-xttzz" Feb 26 15:50:54 crc kubenswrapper[4907]: I0226 15:50:54.659597 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/df2526da-5738-4040-afe3-6019b50203ae-catalog-content\") pod \"certified-operators-xttzz\" (UID: \"df2526da-5738-4040-afe3-6019b50203ae\") " pod="openshift-marketplace/certified-operators-xttzz" Feb 26 15:50:54 crc kubenswrapper[4907]: I0226 15:50:54.659623 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/df2526da-5738-4040-afe3-6019b50203ae-utilities\") pod \"certified-operators-xttzz\" (UID: \"df2526da-5738-4040-afe3-6019b50203ae\") " pod="openshift-marketplace/certified-operators-xttzz" Feb 26 15:50:54 crc kubenswrapper[4907]: I0226 15:50:54.760473 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6g9rc\" (UniqueName: \"kubernetes.io/projected/df2526da-5738-4040-afe3-6019b50203ae-kube-api-access-6g9rc\") pod \"certified-operators-xttzz\" (UID: \"df2526da-5738-4040-afe3-6019b50203ae\") " pod="openshift-marketplace/certified-operators-xttzz" Feb 26 15:50:54 crc kubenswrapper[4907]: I0226 15:50:54.761435 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/df2526da-5738-4040-afe3-6019b50203ae-catalog-content\") pod 
\"certified-operators-xttzz\" (UID: \"df2526da-5738-4040-afe3-6019b50203ae\") " pod="openshift-marketplace/certified-operators-xttzz" Feb 26 15:50:54 crc kubenswrapper[4907]: I0226 15:50:54.761475 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/df2526da-5738-4040-afe3-6019b50203ae-utilities\") pod \"certified-operators-xttzz\" (UID: \"df2526da-5738-4040-afe3-6019b50203ae\") " pod="openshift-marketplace/certified-operators-xttzz" Feb 26 15:50:54 crc kubenswrapper[4907]: I0226 15:50:54.761941 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/df2526da-5738-4040-afe3-6019b50203ae-utilities\") pod \"certified-operators-xttzz\" (UID: \"df2526da-5738-4040-afe3-6019b50203ae\") " pod="openshift-marketplace/certified-operators-xttzz" Feb 26 15:50:54 crc kubenswrapper[4907]: I0226 15:50:54.762225 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/df2526da-5738-4040-afe3-6019b50203ae-catalog-content\") pod \"certified-operators-xttzz\" (UID: \"df2526da-5738-4040-afe3-6019b50203ae\") " pod="openshift-marketplace/certified-operators-xttzz" Feb 26 15:50:54 crc kubenswrapper[4907]: I0226 15:50:54.780308 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6g9rc\" (UniqueName: \"kubernetes.io/projected/df2526da-5738-4040-afe3-6019b50203ae-kube-api-access-6g9rc\") pod \"certified-operators-xttzz\" (UID: \"df2526da-5738-4040-afe3-6019b50203ae\") " pod="openshift-marketplace/certified-operators-xttzz" Feb 26 15:50:54 crc kubenswrapper[4907]: I0226 15:50:54.785073 4907 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-xttzz" Feb 26 15:50:54 crc kubenswrapper[4907]: I0226 15:50:54.977285 4907 generic.go:334] "Generic (PLEG): container finished" podID="01b123c0-d91d-4bed-8fd2-7931cbca4acb" containerID="5fabfddc0abae85e5ef3784a370d90ca5d4b37c7311426548969c5002abfd639" exitCode=0 Feb 26 15:50:54 crc kubenswrapper[4907]: I0226 15:50:54.977352 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-mr9bt" event={"ID":"01b123c0-d91d-4bed-8fd2-7931cbca4acb","Type":"ContainerDied","Data":"5fabfddc0abae85e5ef3784a370d90ca5d4b37c7311426548969c5002abfd639"} Feb 26 15:50:54 crc kubenswrapper[4907]: I0226 15:50:54.981512 4907 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-xbpfd"] Feb 26 15:50:54 crc kubenswrapper[4907]: I0226 15:50:54.989530 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-vcgkv" event={"ID":"012fe452-e0b5-4248-a110-8bf778e9595d","Type":"ContainerStarted","Data":"d387a274d16a9a4867c43fa45a83cf735b877f11387b398da26dc70e0ddf0e97"} Feb 26 15:50:55 crc kubenswrapper[4907]: W0226 15:50:55.003126 4907 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod12fc0143_8c96_4837_99ce_f5b7e447f10b.slice/crio-2420cf9cc4fe1c6449b7fd8f8ad845747732ce9db48297d0d8bd014ffbc056d0 WatchSource:0}: Error finding container 2420cf9cc4fe1c6449b7fd8f8ad845747732ce9db48297d0d8bd014ffbc056d0: Status 404 returned error can't find the container with id 2420cf9cc4fe1c6449b7fd8f8ad845747732ce9db48297d0d8bd014ffbc056d0 Feb 26 15:50:55 crc kubenswrapper[4907]: I0226 15:50:55.027435 4907 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-vcgkv" podStartSLOduration=2.62642008 podStartE2EDuration="4.027416942s" podCreationTimestamp="2026-02-26 15:50:51 +0000 UTC" 
firstStartedPulling="2026-02-26 15:50:52.963851042 +0000 UTC m=+515.482412881" lastFinishedPulling="2026-02-26 15:50:54.364847894 +0000 UTC m=+516.883409743" observedRunningTime="2026-02-26 15:50:55.025809222 +0000 UTC m=+517.544371061" watchObservedRunningTime="2026-02-26 15:50:55.027416942 +0000 UTC m=+517.545978791" Feb 26 15:50:55 crc kubenswrapper[4907]: I0226 15:50:55.226693 4907 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-xttzz"] Feb 26 15:50:55 crc kubenswrapper[4907]: W0226 15:50:55.235116 4907 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poddf2526da_5738_4040_afe3_6019b50203ae.slice/crio-ef11d6ac008c04c0f1bcbaf74a84156720faa46d02037db4467f8618ad3eb2c6 WatchSource:0}: Error finding container ef11d6ac008c04c0f1bcbaf74a84156720faa46d02037db4467f8618ad3eb2c6: Status 404 returned error can't find the container with id ef11d6ac008c04c0f1bcbaf74a84156720faa46d02037db4467f8618ad3eb2c6 Feb 26 15:50:55 crc kubenswrapper[4907]: I0226 15:50:55.997290 4907 generic.go:334] "Generic (PLEG): container finished" podID="df2526da-5738-4040-afe3-6019b50203ae" containerID="72e0c6b5900d9fec4ce385923f54a12823ce7b6af4835f4925dd404131b9b6b9" exitCode=0 Feb 26 15:50:55 crc kubenswrapper[4907]: I0226 15:50:55.997388 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-xttzz" event={"ID":"df2526da-5738-4040-afe3-6019b50203ae","Type":"ContainerDied","Data":"72e0c6b5900d9fec4ce385923f54a12823ce7b6af4835f4925dd404131b9b6b9"} Feb 26 15:50:55 crc kubenswrapper[4907]: I0226 15:50:55.999052 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-xttzz" event={"ID":"df2526da-5738-4040-afe3-6019b50203ae","Type":"ContainerStarted","Data":"ef11d6ac008c04c0f1bcbaf74a84156720faa46d02037db4467f8618ad3eb2c6"} Feb 26 15:50:56 crc kubenswrapper[4907]: I0226 15:50:56.006283 
4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-mr9bt" event={"ID":"01b123c0-d91d-4bed-8fd2-7931cbca4acb","Type":"ContainerStarted","Data":"b17dcffdadbcfe10a7d41402abc3c5810c50b5a8a7e7fff05d1f6278ac09d5b8"} Feb 26 15:50:56 crc kubenswrapper[4907]: I0226 15:50:56.007508 4907 generic.go:334] "Generic (PLEG): container finished" podID="12fc0143-8c96-4837-99ce-f5b7e447f10b" containerID="b771c4dd6fd3ff78a6e0dc00e39469707473c4fce2d7613af125cef908ad4305" exitCode=0 Feb 26 15:50:56 crc kubenswrapper[4907]: I0226 15:50:56.007576 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-xbpfd" event={"ID":"12fc0143-8c96-4837-99ce-f5b7e447f10b","Type":"ContainerDied","Data":"b771c4dd6fd3ff78a6e0dc00e39469707473c4fce2d7613af125cef908ad4305"} Feb 26 15:50:56 crc kubenswrapper[4907]: I0226 15:50:56.007617 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-xbpfd" event={"ID":"12fc0143-8c96-4837-99ce-f5b7e447f10b","Type":"ContainerStarted","Data":"2420cf9cc4fe1c6449b7fd8f8ad845747732ce9db48297d0d8bd014ffbc056d0"} Feb 26 15:50:56 crc kubenswrapper[4907]: I0226 15:50:56.062692 4907 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-mr9bt" podStartSLOduration=1.63201994 podStartE2EDuration="4.06267397s" podCreationTimestamp="2026-02-26 15:50:52 +0000 UTC" firstStartedPulling="2026-02-26 15:50:52.962736464 +0000 UTC m=+515.481298353" lastFinishedPulling="2026-02-26 15:50:55.393390534 +0000 UTC m=+517.911952383" observedRunningTime="2026-02-26 15:50:56.06108828 +0000 UTC m=+518.579650129" watchObservedRunningTime="2026-02-26 15:50:56.06267397 +0000 UTC m=+518.581235819" Feb 26 15:50:57 crc kubenswrapper[4907]: I0226 15:50:57.016387 4907 generic.go:334] "Generic (PLEG): container finished" podID="12fc0143-8c96-4837-99ce-f5b7e447f10b" 
containerID="fdd23cba2ff93660aa6abf027fcb9873ddef6e684743ee6fb845347d96584dd1" exitCode=0 Feb 26 15:50:57 crc kubenswrapper[4907]: I0226 15:50:57.016485 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-xbpfd" event={"ID":"12fc0143-8c96-4837-99ce-f5b7e447f10b","Type":"ContainerDied","Data":"fdd23cba2ff93660aa6abf027fcb9873ddef6e684743ee6fb845347d96584dd1"} Feb 26 15:50:57 crc kubenswrapper[4907]: I0226 15:50:57.023908 4907 generic.go:334] "Generic (PLEG): container finished" podID="df2526da-5738-4040-afe3-6019b50203ae" containerID="638e02232b59867c76bcfba22e41cc8e709ecd9b0528d4c2747a8718965b83dc" exitCode=0 Feb 26 15:50:57 crc kubenswrapper[4907]: I0226 15:50:57.023963 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-xttzz" event={"ID":"df2526da-5738-4040-afe3-6019b50203ae","Type":"ContainerDied","Data":"638e02232b59867c76bcfba22e41cc8e709ecd9b0528d4c2747a8718965b83dc"} Feb 26 15:50:58 crc kubenswrapper[4907]: I0226 15:50:58.031836 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-xttzz" event={"ID":"df2526da-5738-4040-afe3-6019b50203ae","Type":"ContainerStarted","Data":"bda481b379bdbd85805c07a8c2ee2eb4b559711497fb140d0d305aa996c7c93d"} Feb 26 15:50:58 crc kubenswrapper[4907]: I0226 15:50:58.034577 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-xbpfd" event={"ID":"12fc0143-8c96-4837-99ce-f5b7e447f10b","Type":"ContainerStarted","Data":"4dcad8d9d013faa76d64bcda7f46d3b970f3c480f72603613d6372a62d541260"} Feb 26 15:50:58 crc kubenswrapper[4907]: I0226 15:50:58.061568 4907 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-xttzz" podStartSLOduration=2.604082887 podStartE2EDuration="4.061543152s" podCreationTimestamp="2026-02-26 15:50:54 +0000 UTC" firstStartedPulling="2026-02-26 15:50:55.998519295 
+0000 UTC m=+518.517081144" lastFinishedPulling="2026-02-26 15:50:57.45597956 +0000 UTC m=+519.974541409" observedRunningTime="2026-02-26 15:50:58.053668215 +0000 UTC m=+520.572230074" watchObservedRunningTime="2026-02-26 15:50:58.061543152 +0000 UTC m=+520.580105001" Feb 26 15:50:58 crc kubenswrapper[4907]: I0226 15:50:58.076921 4907 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-xbpfd" podStartSLOduration=2.654646071 podStartE2EDuration="4.076901986s" podCreationTimestamp="2026-02-26 15:50:54 +0000 UTC" firstStartedPulling="2026-02-26 15:50:56.008772141 +0000 UTC m=+518.527333990" lastFinishedPulling="2026-02-26 15:50:57.431028056 +0000 UTC m=+519.949589905" observedRunningTime="2026-02-26 15:50:58.073799108 +0000 UTC m=+520.592360957" watchObservedRunningTime="2026-02-26 15:50:58.076901986 +0000 UTC m=+520.595463845" Feb 26 15:50:58 crc kubenswrapper[4907]: I0226 15:50:58.249337 4907 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-image-registry/image-registry-697d97f7c8-kqtml" podUID="0fefaf3e-d327-41f8-bbbe-94b051a63b19" containerName="registry" containerID="cri-o://fbe7bc3480d104a83cb2bef15d1509d69caca10cb0485cfd980ffd68be5102bd" gracePeriod=30 Feb 26 15:50:58 crc kubenswrapper[4907]: I0226 15:50:58.659185 4907 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/image-registry-697d97f7c8-kqtml" Feb 26 15:50:58 crc kubenswrapper[4907]: I0226 15:50:58.729933 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"registry-storage\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"0fefaf3e-d327-41f8-bbbe-94b051a63b19\" (UID: \"0fefaf3e-d327-41f8-bbbe-94b051a63b19\") " Feb 26 15:50:58 crc kubenswrapper[4907]: I0226 15:50:58.729997 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/0fefaf3e-d327-41f8-bbbe-94b051a63b19-registry-certificates\") pod \"0fefaf3e-d327-41f8-bbbe-94b051a63b19\" (UID: \"0fefaf3e-d327-41f8-bbbe-94b051a63b19\") " Feb 26 15:50:58 crc kubenswrapper[4907]: I0226 15:50:58.730027 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-h4l8g\" (UniqueName: \"kubernetes.io/projected/0fefaf3e-d327-41f8-bbbe-94b051a63b19-kube-api-access-h4l8g\") pod \"0fefaf3e-d327-41f8-bbbe-94b051a63b19\" (UID: \"0fefaf3e-d327-41f8-bbbe-94b051a63b19\") " Feb 26 15:50:58 crc kubenswrapper[4907]: I0226 15:50:58.730083 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/0fefaf3e-d327-41f8-bbbe-94b051a63b19-trusted-ca\") pod \"0fefaf3e-d327-41f8-bbbe-94b051a63b19\" (UID: \"0fefaf3e-d327-41f8-bbbe-94b051a63b19\") " Feb 26 15:50:58 crc kubenswrapper[4907]: I0226 15:50:58.730113 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/0fefaf3e-d327-41f8-bbbe-94b051a63b19-registry-tls\") pod \"0fefaf3e-d327-41f8-bbbe-94b051a63b19\" (UID: \"0fefaf3e-d327-41f8-bbbe-94b051a63b19\") " Feb 26 15:50:58 crc kubenswrapper[4907]: I0226 15:50:58.730145 4907 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/0fefaf3e-d327-41f8-bbbe-94b051a63b19-installation-pull-secrets\") pod \"0fefaf3e-d327-41f8-bbbe-94b051a63b19\" (UID: \"0fefaf3e-d327-41f8-bbbe-94b051a63b19\") " Feb 26 15:50:58 crc kubenswrapper[4907]: I0226 15:50:58.730174 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/0fefaf3e-d327-41f8-bbbe-94b051a63b19-ca-trust-extracted\") pod \"0fefaf3e-d327-41f8-bbbe-94b051a63b19\" (UID: \"0fefaf3e-d327-41f8-bbbe-94b051a63b19\") " Feb 26 15:50:58 crc kubenswrapper[4907]: I0226 15:50:58.730201 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/0fefaf3e-d327-41f8-bbbe-94b051a63b19-bound-sa-token\") pod \"0fefaf3e-d327-41f8-bbbe-94b051a63b19\" (UID: \"0fefaf3e-d327-41f8-bbbe-94b051a63b19\") " Feb 26 15:50:58 crc kubenswrapper[4907]: I0226 15:50:58.731677 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0fefaf3e-d327-41f8-bbbe-94b051a63b19-trusted-ca" (OuterVolumeSpecName: "trusted-ca") pod "0fefaf3e-d327-41f8-bbbe-94b051a63b19" (UID: "0fefaf3e-d327-41f8-bbbe-94b051a63b19"). InnerVolumeSpecName "trusted-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 26 15:50:58 crc kubenswrapper[4907]: I0226 15:50:58.731707 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0fefaf3e-d327-41f8-bbbe-94b051a63b19-registry-certificates" (OuterVolumeSpecName: "registry-certificates") pod "0fefaf3e-d327-41f8-bbbe-94b051a63b19" (UID: "0fefaf3e-d327-41f8-bbbe-94b051a63b19"). InnerVolumeSpecName "registry-certificates". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 26 15:50:58 crc kubenswrapper[4907]: I0226 15:50:58.739351 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0fefaf3e-d327-41f8-bbbe-94b051a63b19-registry-tls" (OuterVolumeSpecName: "registry-tls") pod "0fefaf3e-d327-41f8-bbbe-94b051a63b19" (UID: "0fefaf3e-d327-41f8-bbbe-94b051a63b19"). InnerVolumeSpecName "registry-tls". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 26 15:50:58 crc kubenswrapper[4907]: I0226 15:50:58.739460 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0fefaf3e-d327-41f8-bbbe-94b051a63b19-bound-sa-token" (OuterVolumeSpecName: "bound-sa-token") pod "0fefaf3e-d327-41f8-bbbe-94b051a63b19" (UID: "0fefaf3e-d327-41f8-bbbe-94b051a63b19"). InnerVolumeSpecName "bound-sa-token". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 26 15:50:58 crc kubenswrapper[4907]: I0226 15:50:58.739541 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0fefaf3e-d327-41f8-bbbe-94b051a63b19-kube-api-access-h4l8g" (OuterVolumeSpecName: "kube-api-access-h4l8g") pod "0fefaf3e-d327-41f8-bbbe-94b051a63b19" (UID: "0fefaf3e-d327-41f8-bbbe-94b051a63b19"). InnerVolumeSpecName "kube-api-access-h4l8g". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 26 15:50:58 crc kubenswrapper[4907]: I0226 15:50:58.749620 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (OuterVolumeSpecName: "registry-storage") pod "0fefaf3e-d327-41f8-bbbe-94b051a63b19" (UID: "0fefaf3e-d327-41f8-bbbe-94b051a63b19"). InnerVolumeSpecName "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8". 
PluginName "kubernetes.io/csi", VolumeGidValue "" Feb 26 15:50:58 crc kubenswrapper[4907]: I0226 15:50:58.750776 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0fefaf3e-d327-41f8-bbbe-94b051a63b19-installation-pull-secrets" (OuterVolumeSpecName: "installation-pull-secrets") pod "0fefaf3e-d327-41f8-bbbe-94b051a63b19" (UID: "0fefaf3e-d327-41f8-bbbe-94b051a63b19"). InnerVolumeSpecName "installation-pull-secrets". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 26 15:50:58 crc kubenswrapper[4907]: I0226 15:50:58.759753 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/0fefaf3e-d327-41f8-bbbe-94b051a63b19-ca-trust-extracted" (OuterVolumeSpecName: "ca-trust-extracted") pod "0fefaf3e-d327-41f8-bbbe-94b051a63b19" (UID: "0fefaf3e-d327-41f8-bbbe-94b051a63b19"). InnerVolumeSpecName "ca-trust-extracted". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 26 15:50:58 crc kubenswrapper[4907]: I0226 15:50:58.831631 4907 reconciler_common.go:293] "Volume detached for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/0fefaf3e-d327-41f8-bbbe-94b051a63b19-registry-certificates\") on node \"crc\" DevicePath \"\"" Feb 26 15:50:58 crc kubenswrapper[4907]: I0226 15:50:58.831670 4907 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-h4l8g\" (UniqueName: \"kubernetes.io/projected/0fefaf3e-d327-41f8-bbbe-94b051a63b19-kube-api-access-h4l8g\") on node \"crc\" DevicePath \"\"" Feb 26 15:50:58 crc kubenswrapper[4907]: I0226 15:50:58.831704 4907 reconciler_common.go:293] "Volume detached for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/0fefaf3e-d327-41f8-bbbe-94b051a63b19-trusted-ca\") on node \"crc\" DevicePath \"\"" Feb 26 15:50:58 crc kubenswrapper[4907]: I0226 15:50:58.831723 4907 reconciler_common.go:293] "Volume detached for volume \"registry-tls\" (UniqueName: 
\"kubernetes.io/projected/0fefaf3e-d327-41f8-bbbe-94b051a63b19-registry-tls\") on node \"crc\" DevicePath \"\"" Feb 26 15:50:58 crc kubenswrapper[4907]: I0226 15:50:58.831732 4907 reconciler_common.go:293] "Volume detached for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/0fefaf3e-d327-41f8-bbbe-94b051a63b19-installation-pull-secrets\") on node \"crc\" DevicePath \"\"" Feb 26 15:50:58 crc kubenswrapper[4907]: I0226 15:50:58.831741 4907 reconciler_common.go:293] "Volume detached for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/0fefaf3e-d327-41f8-bbbe-94b051a63b19-ca-trust-extracted\") on node \"crc\" DevicePath \"\"" Feb 26 15:50:58 crc kubenswrapper[4907]: I0226 15:50:58.831749 4907 reconciler_common.go:293] "Volume detached for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/0fefaf3e-d327-41f8-bbbe-94b051a63b19-bound-sa-token\") on node \"crc\" DevicePath \"\"" Feb 26 15:50:59 crc kubenswrapper[4907]: I0226 15:50:59.041941 4907 generic.go:334] "Generic (PLEG): container finished" podID="0fefaf3e-d327-41f8-bbbe-94b051a63b19" containerID="fbe7bc3480d104a83cb2bef15d1509d69caca10cb0485cfd980ffd68be5102bd" exitCode=0 Feb 26 15:50:59 crc kubenswrapper[4907]: I0226 15:50:59.041991 4907 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/image-registry-697d97f7c8-kqtml" Feb 26 15:50:59 crc kubenswrapper[4907]: I0226 15:50:59.042015 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-697d97f7c8-kqtml" event={"ID":"0fefaf3e-d327-41f8-bbbe-94b051a63b19","Type":"ContainerDied","Data":"fbe7bc3480d104a83cb2bef15d1509d69caca10cb0485cfd980ffd68be5102bd"} Feb 26 15:50:59 crc kubenswrapper[4907]: I0226 15:50:59.042054 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-697d97f7c8-kqtml" event={"ID":"0fefaf3e-d327-41f8-bbbe-94b051a63b19","Type":"ContainerDied","Data":"5e4bedb35215aa589170c696338aa1213956872fc1adf190eeb23b94a8c5bc35"} Feb 26 15:50:59 crc kubenswrapper[4907]: I0226 15:50:59.042078 4907 scope.go:117] "RemoveContainer" containerID="fbe7bc3480d104a83cb2bef15d1509d69caca10cb0485cfd980ffd68be5102bd" Feb 26 15:50:59 crc kubenswrapper[4907]: I0226 15:50:59.058371 4907 scope.go:117] "RemoveContainer" containerID="fbe7bc3480d104a83cb2bef15d1509d69caca10cb0485cfd980ffd68be5102bd" Feb 26 15:50:59 crc kubenswrapper[4907]: E0226 15:50:59.059460 4907 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"fbe7bc3480d104a83cb2bef15d1509d69caca10cb0485cfd980ffd68be5102bd\": container with ID starting with fbe7bc3480d104a83cb2bef15d1509d69caca10cb0485cfd980ffd68be5102bd not found: ID does not exist" containerID="fbe7bc3480d104a83cb2bef15d1509d69caca10cb0485cfd980ffd68be5102bd" Feb 26 15:50:59 crc kubenswrapper[4907]: I0226 15:50:59.059498 4907 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"fbe7bc3480d104a83cb2bef15d1509d69caca10cb0485cfd980ffd68be5102bd"} err="failed to get container status \"fbe7bc3480d104a83cb2bef15d1509d69caca10cb0485cfd980ffd68be5102bd\": rpc error: code = NotFound desc = could not find container 
\"fbe7bc3480d104a83cb2bef15d1509d69caca10cb0485cfd980ffd68be5102bd\": container with ID starting with fbe7bc3480d104a83cb2bef15d1509d69caca10cb0485cfd980ffd68be5102bd not found: ID does not exist" Feb 26 15:50:59 crc kubenswrapper[4907]: I0226 15:50:59.077817 4907 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-image-registry/image-registry-697d97f7c8-kqtml"] Feb 26 15:50:59 crc kubenswrapper[4907]: I0226 15:50:59.083740 4907 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-image-registry/image-registry-697d97f7c8-kqtml"] Feb 26 15:51:00 crc kubenswrapper[4907]: I0226 15:51:00.134885 4907 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0fefaf3e-d327-41f8-bbbe-94b051a63b19" path="/var/lib/kubelet/pods/0fefaf3e-d327-41f8-bbbe-94b051a63b19/volumes" Feb 26 15:51:02 crc kubenswrapper[4907]: I0226 15:51:02.170030 4907 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-vcgkv" Feb 26 15:51:02 crc kubenswrapper[4907]: I0226 15:51:02.170429 4907 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-vcgkv" Feb 26 15:51:02 crc kubenswrapper[4907]: I0226 15:51:02.239753 4907 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-vcgkv" Feb 26 15:51:02 crc kubenswrapper[4907]: I0226 15:51:02.378632 4907 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-mr9bt" Feb 26 15:51:02 crc kubenswrapper[4907]: I0226 15:51:02.378693 4907 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-mr9bt" Feb 26 15:51:02 crc kubenswrapper[4907]: I0226 15:51:02.462064 4907 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-mr9bt" Feb 26 15:51:03 crc kubenswrapper[4907]: I0226 
15:51:03.137539 4907 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-vcgkv" Feb 26 15:51:03 crc kubenswrapper[4907]: I0226 15:51:03.140733 4907 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-mr9bt" Feb 26 15:51:04 crc kubenswrapper[4907]: I0226 15:51:04.560656 4907 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-xbpfd" Feb 26 15:51:04 crc kubenswrapper[4907]: I0226 15:51:04.561231 4907 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-xbpfd" Feb 26 15:51:04 crc kubenswrapper[4907]: I0226 15:51:04.634700 4907 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-xbpfd" Feb 26 15:51:04 crc kubenswrapper[4907]: I0226 15:51:04.786064 4907 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-xttzz" Feb 26 15:51:04 crc kubenswrapper[4907]: I0226 15:51:04.786159 4907 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-xttzz" Feb 26 15:51:04 crc kubenswrapper[4907]: I0226 15:51:04.842152 4907 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-xttzz" Feb 26 15:51:05 crc kubenswrapper[4907]: I0226 15:51:05.131204 4907 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-xttzz" Feb 26 15:51:05 crc kubenswrapper[4907]: I0226 15:51:05.141212 4907 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-xbpfd" Feb 26 15:52:00 crc kubenswrapper[4907]: I0226 15:52:00.151460 4907 kubelet.go:2421] "SyncLoop ADD" source="api" 
pods=["openshift-infra/auto-csr-approver-29535352-g24tn"] Feb 26 15:52:00 crc kubenswrapper[4907]: E0226 15:52:00.152693 4907 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0fefaf3e-d327-41f8-bbbe-94b051a63b19" containerName="registry" Feb 26 15:52:00 crc kubenswrapper[4907]: I0226 15:52:00.152726 4907 state_mem.go:107] "Deleted CPUSet assignment" podUID="0fefaf3e-d327-41f8-bbbe-94b051a63b19" containerName="registry" Feb 26 15:52:00 crc kubenswrapper[4907]: I0226 15:52:00.152964 4907 memory_manager.go:354] "RemoveStaleState removing state" podUID="0fefaf3e-d327-41f8-bbbe-94b051a63b19" containerName="registry" Feb 26 15:52:00 crc kubenswrapper[4907]: I0226 15:52:00.153870 4907 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29535352-g24tn" Feb 26 15:52:00 crc kubenswrapper[4907]: I0226 15:52:00.155768 4907 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-n2mrp" Feb 26 15:52:00 crc kubenswrapper[4907]: I0226 15:52:00.158844 4907 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Feb 26 15:52:00 crc kubenswrapper[4907]: I0226 15:52:00.158956 4907 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Feb 26 15:52:00 crc kubenswrapper[4907]: I0226 15:52:00.165445 4907 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29535352-g24tn"] Feb 26 15:52:00 crc kubenswrapper[4907]: I0226 15:52:00.341895 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4tw8w\" (UniqueName: \"kubernetes.io/projected/76681648-110a-4f27-a62c-1e4c06da6564-kube-api-access-4tw8w\") pod \"auto-csr-approver-29535352-g24tn\" (UID: \"76681648-110a-4f27-a62c-1e4c06da6564\") " pod="openshift-infra/auto-csr-approver-29535352-g24tn" Feb 26 15:52:00 crc 
kubenswrapper[4907]: I0226 15:52:00.443445 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4tw8w\" (UniqueName: \"kubernetes.io/projected/76681648-110a-4f27-a62c-1e4c06da6564-kube-api-access-4tw8w\") pod \"auto-csr-approver-29535352-g24tn\" (UID: \"76681648-110a-4f27-a62c-1e4c06da6564\") " pod="openshift-infra/auto-csr-approver-29535352-g24tn" Feb 26 15:52:00 crc kubenswrapper[4907]: I0226 15:52:00.472816 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4tw8w\" (UniqueName: \"kubernetes.io/projected/76681648-110a-4f27-a62c-1e4c06da6564-kube-api-access-4tw8w\") pod \"auto-csr-approver-29535352-g24tn\" (UID: \"76681648-110a-4f27-a62c-1e4c06da6564\") " pod="openshift-infra/auto-csr-approver-29535352-g24tn" Feb 26 15:52:00 crc kubenswrapper[4907]: I0226 15:52:00.473067 4907 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29535352-g24tn" Feb 26 15:52:00 crc kubenswrapper[4907]: I0226 15:52:00.947320 4907 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29535352-g24tn"] Feb 26 15:52:00 crc kubenswrapper[4907]: I0226 15:52:00.959219 4907 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Feb 26 15:52:01 crc kubenswrapper[4907]: I0226 15:52:01.637570 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29535352-g24tn" event={"ID":"76681648-110a-4f27-a62c-1e4c06da6564","Type":"ContainerStarted","Data":"e9efe7c80bb8fa6f26f4a1c4b019db4b5256893b5bf64dfb4d65bde1dd481373"} Feb 26 15:52:02 crc kubenswrapper[4907]: I0226 15:52:02.645444 4907 generic.go:334] "Generic (PLEG): container finished" podID="76681648-110a-4f27-a62c-1e4c06da6564" containerID="1a0c054792c5c726f79413f5f09de926e52ff9f77ea5855b8d3c35b09b90a4c4" exitCode=0 Feb 26 15:52:02 crc kubenswrapper[4907]: I0226 15:52:02.645488 4907 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29535352-g24tn" event={"ID":"76681648-110a-4f27-a62c-1e4c06da6564","Type":"ContainerDied","Data":"1a0c054792c5c726f79413f5f09de926e52ff9f77ea5855b8d3c35b09b90a4c4"} Feb 26 15:52:03 crc kubenswrapper[4907]: I0226 15:52:03.926676 4907 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29535352-g24tn" Feb 26 15:52:04 crc kubenswrapper[4907]: I0226 15:52:04.089431 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4tw8w\" (UniqueName: \"kubernetes.io/projected/76681648-110a-4f27-a62c-1e4c06da6564-kube-api-access-4tw8w\") pod \"76681648-110a-4f27-a62c-1e4c06da6564\" (UID: \"76681648-110a-4f27-a62c-1e4c06da6564\") " Feb 26 15:52:04 crc kubenswrapper[4907]: I0226 15:52:04.097902 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/76681648-110a-4f27-a62c-1e4c06da6564-kube-api-access-4tw8w" (OuterVolumeSpecName: "kube-api-access-4tw8w") pod "76681648-110a-4f27-a62c-1e4c06da6564" (UID: "76681648-110a-4f27-a62c-1e4c06da6564"). InnerVolumeSpecName "kube-api-access-4tw8w". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 26 15:52:04 crc kubenswrapper[4907]: I0226 15:52:04.191075 4907 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4tw8w\" (UniqueName: \"kubernetes.io/projected/76681648-110a-4f27-a62c-1e4c06da6564-kube-api-access-4tw8w\") on node \"crc\" DevicePath \"\"" Feb 26 15:52:04 crc kubenswrapper[4907]: I0226 15:52:04.659516 4907 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29535352-g24tn" Feb 26 15:52:04 crc kubenswrapper[4907]: I0226 15:52:04.659426 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29535352-g24tn" event={"ID":"76681648-110a-4f27-a62c-1e4c06da6564","Type":"ContainerDied","Data":"e9efe7c80bb8fa6f26f4a1c4b019db4b5256893b5bf64dfb4d65bde1dd481373"} Feb 26 15:52:04 crc kubenswrapper[4907]: I0226 15:52:04.660361 4907 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="e9efe7c80bb8fa6f26f4a1c4b019db4b5256893b5bf64dfb4d65bde1dd481373" Feb 26 15:52:04 crc kubenswrapper[4907]: I0226 15:52:04.988213 4907 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29535346-hhrww"] Feb 26 15:52:04 crc kubenswrapper[4907]: I0226 15:52:04.991311 4907 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29535346-hhrww"] Feb 26 15:52:06 crc kubenswrapper[4907]: I0226 15:52:06.133210 4907 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c6986b68-4a8d-4677-bed1-493eb1a231c3" path="/var/lib/kubelet/pods/c6986b68-4a8d-4677-bed1-493eb1a231c3/volumes" Feb 26 15:52:48 crc kubenswrapper[4907]: I0226 15:52:48.530461 4907 patch_prober.go:28] interesting pod/machine-config-daemon-v5ng6 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 26 15:52:48 crc kubenswrapper[4907]: I0226 15:52:48.531052 4907 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-v5ng6" podUID="917eebf3-db36-47b8-af0a-b80d042fddab" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 26 15:53:18 crc 
kubenswrapper[4907]: I0226 15:53:18.530525 4907 patch_prober.go:28] interesting pod/machine-config-daemon-v5ng6 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 26 15:53:18 crc kubenswrapper[4907]: I0226 15:53:18.531040 4907 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-v5ng6" podUID="917eebf3-db36-47b8-af0a-b80d042fddab" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 26 15:53:48 crc kubenswrapper[4907]: I0226 15:53:48.529860 4907 patch_prober.go:28] interesting pod/machine-config-daemon-v5ng6 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 26 15:53:48 crc kubenswrapper[4907]: I0226 15:53:48.530431 4907 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-v5ng6" podUID="917eebf3-db36-47b8-af0a-b80d042fddab" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 26 15:53:48 crc kubenswrapper[4907]: I0226 15:53:48.530471 4907 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-v5ng6" Feb 26 15:53:48 crc kubenswrapper[4907]: I0226 15:53:48.530997 4907 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"53be863c74815dd43aa6d07eb234f8fc9300124de620faba3fc31d92226518b6"} 
pod="openshift-machine-config-operator/machine-config-daemon-v5ng6" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Feb 26 15:53:48 crc kubenswrapper[4907]: I0226 15:53:48.531052 4907 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-v5ng6" podUID="917eebf3-db36-47b8-af0a-b80d042fddab" containerName="machine-config-daemon" containerID="cri-o://53be863c74815dd43aa6d07eb234f8fc9300124de620faba3fc31d92226518b6" gracePeriod=600 Feb 26 15:53:49 crc kubenswrapper[4907]: I0226 15:53:49.357718 4907 generic.go:334] "Generic (PLEG): container finished" podID="917eebf3-db36-47b8-af0a-b80d042fddab" containerID="53be863c74815dd43aa6d07eb234f8fc9300124de620faba3fc31d92226518b6" exitCode=0 Feb 26 15:53:49 crc kubenswrapper[4907]: I0226 15:53:49.357797 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-v5ng6" event={"ID":"917eebf3-db36-47b8-af0a-b80d042fddab","Type":"ContainerDied","Data":"53be863c74815dd43aa6d07eb234f8fc9300124de620faba3fc31d92226518b6"} Feb 26 15:53:49 crc kubenswrapper[4907]: I0226 15:53:49.358113 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-v5ng6" event={"ID":"917eebf3-db36-47b8-af0a-b80d042fddab","Type":"ContainerStarted","Data":"135e9e11cfbaabe55bbe34848f747e715822492af89a2d18c459beb482f280c0"} Feb 26 15:53:49 crc kubenswrapper[4907]: I0226 15:53:49.358142 4907 scope.go:117] "RemoveContainer" containerID="5b5fce09e6f67f86221daea08fdd5259aaa4024d9dbe5e7a76056c4c092f3ec2" Feb 26 15:54:00 crc kubenswrapper[4907]: I0226 15:54:00.151328 4907 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29535354-x5ltf"] Feb 26 15:54:00 crc kubenswrapper[4907]: E0226 15:54:00.152080 4907 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="76681648-110a-4f27-a62c-1e4c06da6564" containerName="oc" Feb 26 15:54:00 crc kubenswrapper[4907]: I0226 15:54:00.152092 4907 state_mem.go:107] "Deleted CPUSet assignment" podUID="76681648-110a-4f27-a62c-1e4c06da6564" containerName="oc" Feb 26 15:54:00 crc kubenswrapper[4907]: I0226 15:54:00.152172 4907 memory_manager.go:354] "RemoveStaleState removing state" podUID="76681648-110a-4f27-a62c-1e4c06da6564" containerName="oc" Feb 26 15:54:00 crc kubenswrapper[4907]: I0226 15:54:00.152525 4907 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29535354-x5ltf" Feb 26 15:54:00 crc kubenswrapper[4907]: I0226 15:54:00.157167 4907 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Feb 26 15:54:00 crc kubenswrapper[4907]: I0226 15:54:00.157313 4907 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-n2mrp" Feb 26 15:54:00 crc kubenswrapper[4907]: I0226 15:54:00.159354 4907 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Feb 26 15:54:00 crc kubenswrapper[4907]: I0226 15:54:00.164893 4907 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29535354-x5ltf"] Feb 26 15:54:00 crc kubenswrapper[4907]: I0226 15:54:00.299235 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ll52n\" (UniqueName: \"kubernetes.io/projected/f905c87c-9059-47e4-918a-b54f36ec1195-kube-api-access-ll52n\") pod \"auto-csr-approver-29535354-x5ltf\" (UID: \"f905c87c-9059-47e4-918a-b54f36ec1195\") " pod="openshift-infra/auto-csr-approver-29535354-x5ltf" Feb 26 15:54:00 crc kubenswrapper[4907]: I0226 15:54:00.400970 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ll52n\" (UniqueName: 
\"kubernetes.io/projected/f905c87c-9059-47e4-918a-b54f36ec1195-kube-api-access-ll52n\") pod \"auto-csr-approver-29535354-x5ltf\" (UID: \"f905c87c-9059-47e4-918a-b54f36ec1195\") " pod="openshift-infra/auto-csr-approver-29535354-x5ltf" Feb 26 15:54:00 crc kubenswrapper[4907]: I0226 15:54:00.438807 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ll52n\" (UniqueName: \"kubernetes.io/projected/f905c87c-9059-47e4-918a-b54f36ec1195-kube-api-access-ll52n\") pod \"auto-csr-approver-29535354-x5ltf\" (UID: \"f905c87c-9059-47e4-918a-b54f36ec1195\") " pod="openshift-infra/auto-csr-approver-29535354-x5ltf" Feb 26 15:54:00 crc kubenswrapper[4907]: I0226 15:54:00.489420 4907 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29535354-x5ltf" Feb 26 15:54:00 crc kubenswrapper[4907]: I0226 15:54:00.725376 4907 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29535354-x5ltf"] Feb 26 15:54:01 crc kubenswrapper[4907]: I0226 15:54:01.447183 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29535354-x5ltf" event={"ID":"f905c87c-9059-47e4-918a-b54f36ec1195","Type":"ContainerStarted","Data":"a99fa1e3a0a8809d20056ac5c8ca97c4d841d72bd2ab439ac6fb9abc9c919247"} Feb 26 15:54:02 crc kubenswrapper[4907]: I0226 15:54:02.453694 4907 generic.go:334] "Generic (PLEG): container finished" podID="f905c87c-9059-47e4-918a-b54f36ec1195" containerID="8807b9d17fb108cc008ef775f45367d18312cad71c70bb4b5cb43eafc391d7df" exitCode=0 Feb 26 15:54:02 crc kubenswrapper[4907]: I0226 15:54:02.453764 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29535354-x5ltf" event={"ID":"f905c87c-9059-47e4-918a-b54f36ec1195","Type":"ContainerDied","Data":"8807b9d17fb108cc008ef775f45367d18312cad71c70bb4b5cb43eafc391d7df"} Feb 26 15:54:03 crc kubenswrapper[4907]: I0226 15:54:03.659803 4907 util.go:48] "No 
ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29535354-x5ltf" Feb 26 15:54:03 crc kubenswrapper[4907]: I0226 15:54:03.847388 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ll52n\" (UniqueName: \"kubernetes.io/projected/f905c87c-9059-47e4-918a-b54f36ec1195-kube-api-access-ll52n\") pod \"f905c87c-9059-47e4-918a-b54f36ec1195\" (UID: \"f905c87c-9059-47e4-918a-b54f36ec1195\") " Feb 26 15:54:03 crc kubenswrapper[4907]: I0226 15:54:03.852240 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f905c87c-9059-47e4-918a-b54f36ec1195-kube-api-access-ll52n" (OuterVolumeSpecName: "kube-api-access-ll52n") pod "f905c87c-9059-47e4-918a-b54f36ec1195" (UID: "f905c87c-9059-47e4-918a-b54f36ec1195"). InnerVolumeSpecName "kube-api-access-ll52n". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 26 15:54:03 crc kubenswrapper[4907]: I0226 15:54:03.948723 4907 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ll52n\" (UniqueName: \"kubernetes.io/projected/f905c87c-9059-47e4-918a-b54f36ec1195-kube-api-access-ll52n\") on node \"crc\" DevicePath \"\"" Feb 26 15:54:04 crc kubenswrapper[4907]: I0226 15:54:04.467795 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29535354-x5ltf" event={"ID":"f905c87c-9059-47e4-918a-b54f36ec1195","Type":"ContainerDied","Data":"a99fa1e3a0a8809d20056ac5c8ca97c4d841d72bd2ab439ac6fb9abc9c919247"} Feb 26 15:54:04 crc kubenswrapper[4907]: I0226 15:54:04.467856 4907 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="a99fa1e3a0a8809d20056ac5c8ca97c4d841d72bd2ab439ac6fb9abc9c919247" Feb 26 15:54:04 crc kubenswrapper[4907]: I0226 15:54:04.467861 4907 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29535354-x5ltf" Feb 26 15:54:04 crc kubenswrapper[4907]: I0226 15:54:04.728992 4907 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29535348-8k2tp"] Feb 26 15:54:04 crc kubenswrapper[4907]: I0226 15:54:04.733759 4907 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29535348-8k2tp"] Feb 26 15:54:06 crc kubenswrapper[4907]: I0226 15:54:06.136949 4907 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6e761f1c-0a31-49e0-aee3-2ecd184291dc" path="/var/lib/kubelet/pods/6e761f1c-0a31-49e0-aee3-2ecd184291dc/volumes" Feb 26 15:54:22 crc kubenswrapper[4907]: I0226 15:54:22.573790 4907 scope.go:117] "RemoveContainer" containerID="b4f9fde9bfda905320bf8cf8897c11cf190969f123ff5644b5c3c62ce53613c4" Feb 26 15:54:22 crc kubenswrapper[4907]: I0226 15:54:22.595293 4907 scope.go:117] "RemoveContainer" containerID="5d03c4417c6bd60e984baf42dd9736b039ec92570e55f16ae134bc81394e032d" Feb 26 15:54:22 crc kubenswrapper[4907]: I0226 15:54:22.649193 4907 scope.go:117] "RemoveContainer" containerID="85ab84dfe988254bcbc6f434e907e2b18a8672cf99d2da69c404eba2f5afbaf8" Feb 26 15:54:22 crc kubenswrapper[4907]: I0226 15:54:22.685185 4907 scope.go:117] "RemoveContainer" containerID="78f0f5bcde8332a66f2bd4defbe69a2ebd04385376ea98cbf6d4028de8d7dd06" Feb 26 15:55:48 crc kubenswrapper[4907]: I0226 15:55:48.530092 4907 patch_prober.go:28] interesting pod/machine-config-daemon-v5ng6 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 26 15:55:48 crc kubenswrapper[4907]: I0226 15:55:48.531219 4907 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-v5ng6" podUID="917eebf3-db36-47b8-af0a-b80d042fddab" 
containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 26 15:56:00 crc kubenswrapper[4907]: I0226 15:56:00.138960 4907 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29535356-g8895"] Feb 26 15:56:00 crc kubenswrapper[4907]: E0226 15:56:00.139771 4907 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f905c87c-9059-47e4-918a-b54f36ec1195" containerName="oc" Feb 26 15:56:00 crc kubenswrapper[4907]: I0226 15:56:00.139786 4907 state_mem.go:107] "Deleted CPUSet assignment" podUID="f905c87c-9059-47e4-918a-b54f36ec1195" containerName="oc" Feb 26 15:56:00 crc kubenswrapper[4907]: I0226 15:56:00.139902 4907 memory_manager.go:354] "RemoveStaleState removing state" podUID="f905c87c-9059-47e4-918a-b54f36ec1195" containerName="oc" Feb 26 15:56:00 crc kubenswrapper[4907]: I0226 15:56:00.140274 4907 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29535356-g8895" Feb 26 15:56:00 crc kubenswrapper[4907]: I0226 15:56:00.146292 4907 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Feb 26 15:56:00 crc kubenswrapper[4907]: I0226 15:56:00.146442 4907 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Feb 26 15:56:00 crc kubenswrapper[4907]: I0226 15:56:00.146529 4907 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-n2mrp" Feb 26 15:56:00 crc kubenswrapper[4907]: I0226 15:56:00.157819 4907 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29535356-g8895"] Feb 26 15:56:00 crc kubenswrapper[4907]: I0226 15:56:00.254475 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-q5m8k\" (UniqueName: \"kubernetes.io/projected/6d0a628a-4c00-4b3a-8710-9ac6a6880844-kube-api-access-q5m8k\") pod \"auto-csr-approver-29535356-g8895\" (UID: \"6d0a628a-4c00-4b3a-8710-9ac6a6880844\") " pod="openshift-infra/auto-csr-approver-29535356-g8895" Feb 26 15:56:00 crc kubenswrapper[4907]: I0226 15:56:00.356418 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-q5m8k\" (UniqueName: \"kubernetes.io/projected/6d0a628a-4c00-4b3a-8710-9ac6a6880844-kube-api-access-q5m8k\") pod \"auto-csr-approver-29535356-g8895\" (UID: \"6d0a628a-4c00-4b3a-8710-9ac6a6880844\") " pod="openshift-infra/auto-csr-approver-29535356-g8895" Feb 26 15:56:00 crc kubenswrapper[4907]: I0226 15:56:00.379826 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-q5m8k\" (UniqueName: \"kubernetes.io/projected/6d0a628a-4c00-4b3a-8710-9ac6a6880844-kube-api-access-q5m8k\") pod \"auto-csr-approver-29535356-g8895\" (UID: \"6d0a628a-4c00-4b3a-8710-9ac6a6880844\") " 
pod="openshift-infra/auto-csr-approver-29535356-g8895" Feb 26 15:56:00 crc kubenswrapper[4907]: I0226 15:56:00.457127 4907 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29535356-g8895" Feb 26 15:56:00 crc kubenswrapper[4907]: I0226 15:56:00.862865 4907 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29535356-g8895"] Feb 26 15:56:01 crc kubenswrapper[4907]: I0226 15:56:01.300549 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29535356-g8895" event={"ID":"6d0a628a-4c00-4b3a-8710-9ac6a6880844","Type":"ContainerStarted","Data":"c0c3f61eae3920dd8003f3e0d2e0c65b92ad38c11e9e5c3a67ddc55fd2784419"} Feb 26 15:56:02 crc kubenswrapper[4907]: I0226 15:56:02.307584 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29535356-g8895" event={"ID":"6d0a628a-4c00-4b3a-8710-9ac6a6880844","Type":"ContainerStarted","Data":"fe9ed9aadb41ed6f130e39ddcdfc955305afcaf1edfdb286a96007db514190d7"} Feb 26 15:56:02 crc kubenswrapper[4907]: I0226 15:56:02.322176 4907 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-infra/auto-csr-approver-29535356-g8895" podStartSLOduration=1.209571153 podStartE2EDuration="2.322154224s" podCreationTimestamp="2026-02-26 15:56:00 +0000 UTC" firstStartedPulling="2026-02-26 15:56:00.870806765 +0000 UTC m=+823.389368624" lastFinishedPulling="2026-02-26 15:56:01.983389836 +0000 UTC m=+824.501951695" observedRunningTime="2026-02-26 15:56:02.321217851 +0000 UTC m=+824.839779710" watchObservedRunningTime="2026-02-26 15:56:02.322154224 +0000 UTC m=+824.840716073" Feb 26 15:56:03 crc kubenswrapper[4907]: I0226 15:56:03.319008 4907 generic.go:334] "Generic (PLEG): container finished" podID="6d0a628a-4c00-4b3a-8710-9ac6a6880844" containerID="fe9ed9aadb41ed6f130e39ddcdfc955305afcaf1edfdb286a96007db514190d7" exitCode=0 Feb 26 15:56:03 crc 
kubenswrapper[4907]: I0226 15:56:03.319091 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29535356-g8895" event={"ID":"6d0a628a-4c00-4b3a-8710-9ac6a6880844","Type":"ContainerDied","Data":"fe9ed9aadb41ed6f130e39ddcdfc955305afcaf1edfdb286a96007db514190d7"} Feb 26 15:56:04 crc kubenswrapper[4907]: I0226 15:56:04.649794 4907 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29535356-g8895" Feb 26 15:56:04 crc kubenswrapper[4907]: I0226 15:56:04.712754 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-q5m8k\" (UniqueName: \"kubernetes.io/projected/6d0a628a-4c00-4b3a-8710-9ac6a6880844-kube-api-access-q5m8k\") pod \"6d0a628a-4c00-4b3a-8710-9ac6a6880844\" (UID: \"6d0a628a-4c00-4b3a-8710-9ac6a6880844\") " Feb 26 15:56:04 crc kubenswrapper[4907]: I0226 15:56:04.720782 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6d0a628a-4c00-4b3a-8710-9ac6a6880844-kube-api-access-q5m8k" (OuterVolumeSpecName: "kube-api-access-q5m8k") pod "6d0a628a-4c00-4b3a-8710-9ac6a6880844" (UID: "6d0a628a-4c00-4b3a-8710-9ac6a6880844"). InnerVolumeSpecName "kube-api-access-q5m8k". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 26 15:56:04 crc kubenswrapper[4907]: I0226 15:56:04.813899 4907 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-q5m8k\" (UniqueName: \"kubernetes.io/projected/6d0a628a-4c00-4b3a-8710-9ac6a6880844-kube-api-access-q5m8k\") on node \"crc\" DevicePath \"\"" Feb 26 15:56:05 crc kubenswrapper[4907]: I0226 15:56:05.337703 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29535356-g8895" event={"ID":"6d0a628a-4c00-4b3a-8710-9ac6a6880844","Type":"ContainerDied","Data":"c0c3f61eae3920dd8003f3e0d2e0c65b92ad38c11e9e5c3a67ddc55fd2784419"} Feb 26 15:56:05 crc kubenswrapper[4907]: I0226 15:56:05.337751 4907 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="c0c3f61eae3920dd8003f3e0d2e0c65b92ad38c11e9e5c3a67ddc55fd2784419" Feb 26 15:56:05 crc kubenswrapper[4907]: I0226 15:56:05.338195 4907 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29535356-g8895" Feb 26 15:56:05 crc kubenswrapper[4907]: I0226 15:56:05.400667 4907 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29535350-924dl"] Feb 26 15:56:05 crc kubenswrapper[4907]: I0226 15:56:05.405499 4907 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29535350-924dl"] Feb 26 15:56:06 crc kubenswrapper[4907]: I0226 15:56:06.141376 4907 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="20256617-55f3-4228-8200-bd57793ff553" path="/var/lib/kubelet/pods/20256617-55f3-4228-8200-bd57793ff553/volumes" Feb 26 15:56:18 crc kubenswrapper[4907]: I0226 15:56:18.530339 4907 patch_prober.go:28] interesting pod/machine-config-daemon-v5ng6 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: 
connection refused" start-of-body= Feb 26 15:56:18 crc kubenswrapper[4907]: I0226 15:56:18.530930 4907 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-v5ng6" podUID="917eebf3-db36-47b8-af0a-b80d042fddab" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 26 15:56:19 crc kubenswrapper[4907]: I0226 15:56:19.952358 4907 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["cert-manager/cert-manager-cainjector-cf98fcc89-8lnvq"] Feb 26 15:56:19 crc kubenswrapper[4907]: E0226 15:56:19.952833 4907 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6d0a628a-4c00-4b3a-8710-9ac6a6880844" containerName="oc" Feb 26 15:56:19 crc kubenswrapper[4907]: I0226 15:56:19.952846 4907 state_mem.go:107] "Deleted CPUSet assignment" podUID="6d0a628a-4c00-4b3a-8710-9ac6a6880844" containerName="oc" Feb 26 15:56:19 crc kubenswrapper[4907]: I0226 15:56:19.952954 4907 memory_manager.go:354] "RemoveStaleState removing state" podUID="6d0a628a-4c00-4b3a-8710-9ac6a6880844" containerName="oc" Feb 26 15:56:19 crc kubenswrapper[4907]: I0226 15:56:19.953275 4907 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="cert-manager/cert-manager-cainjector-cf98fcc89-8lnvq" Feb 26 15:56:19 crc kubenswrapper[4907]: I0226 15:56:19.957458 4907 reflector.go:368] Caches populated for *v1.ConfigMap from object-"cert-manager"/"openshift-service-ca.crt" Feb 26 15:56:19 crc kubenswrapper[4907]: I0226 15:56:19.957975 4907 reflector.go:368] Caches populated for *v1.Secret from object-"cert-manager"/"cert-manager-cainjector-dockercfg-v448j" Feb 26 15:56:19 crc kubenswrapper[4907]: I0226 15:56:19.958188 4907 reflector.go:368] Caches populated for *v1.ConfigMap from object-"cert-manager"/"kube-root-ca.crt" Feb 26 15:56:19 crc kubenswrapper[4907]: I0226 15:56:19.972234 4907 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-cainjector-cf98fcc89-8lnvq"] Feb 26 15:56:19 crc kubenswrapper[4907]: I0226 15:56:19.988034 4907 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["cert-manager/cert-manager-858654f9db-v6vbf"] Feb 26 15:56:19 crc kubenswrapper[4907]: I0226 15:56:19.988686 4907 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="cert-manager/cert-manager-858654f9db-v6vbf" Feb 26 15:56:19 crc kubenswrapper[4907]: I0226 15:56:19.992300 4907 reflector.go:368] Caches populated for *v1.Secret from object-"cert-manager"/"cert-manager-dockercfg-tjwjt" Feb 26 15:56:20 crc kubenswrapper[4907]: I0226 15:56:20.005627 4907 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-858654f9db-v6vbf"] Feb 26 15:56:20 crc kubenswrapper[4907]: I0226 15:56:20.015317 4907 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["cert-manager/cert-manager-webhook-687f57d79b-hdhr9"] Feb 26 15:56:20 crc kubenswrapper[4907]: I0226 15:56:20.016244 4907 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="cert-manager/cert-manager-webhook-687f57d79b-hdhr9" Feb 26 15:56:20 crc kubenswrapper[4907]: I0226 15:56:20.016337 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-744nm\" (UniqueName: \"kubernetes.io/projected/1e1d1a02-d13e-4410-8762-ffa52da94db0-kube-api-access-744nm\") pod \"cert-manager-858654f9db-v6vbf\" (UID: \"1e1d1a02-d13e-4410-8762-ffa52da94db0\") " pod="cert-manager/cert-manager-858654f9db-v6vbf" Feb 26 15:56:20 crc kubenswrapper[4907]: I0226 15:56:20.016441 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pjwp5\" (UniqueName: \"kubernetes.io/projected/2a995506-4e43-40d2-8e85-720648605979-kube-api-access-pjwp5\") pod \"cert-manager-cainjector-cf98fcc89-8lnvq\" (UID: \"2a995506-4e43-40d2-8e85-720648605979\") " pod="cert-manager/cert-manager-cainjector-cf98fcc89-8lnvq" Feb 26 15:56:20 crc kubenswrapper[4907]: I0226 15:56:20.021057 4907 reflector.go:368] Caches populated for *v1.Secret from object-"cert-manager"/"cert-manager-webhook-dockercfg-6qxgz" Feb 26 15:56:20 crc kubenswrapper[4907]: I0226 15:56:20.036493 4907 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-webhook-687f57d79b-hdhr9"] Feb 26 15:56:20 crc kubenswrapper[4907]: I0226 15:56:20.117880 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pjwp5\" (UniqueName: \"kubernetes.io/projected/2a995506-4e43-40d2-8e85-720648605979-kube-api-access-pjwp5\") pod \"cert-manager-cainjector-cf98fcc89-8lnvq\" (UID: \"2a995506-4e43-40d2-8e85-720648605979\") " pod="cert-manager/cert-manager-cainjector-cf98fcc89-8lnvq" Feb 26 15:56:20 crc kubenswrapper[4907]: I0226 15:56:20.117925 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-744nm\" (UniqueName: 
\"kubernetes.io/projected/1e1d1a02-d13e-4410-8762-ffa52da94db0-kube-api-access-744nm\") pod \"cert-manager-858654f9db-v6vbf\" (UID: \"1e1d1a02-d13e-4410-8762-ffa52da94db0\") " pod="cert-manager/cert-manager-858654f9db-v6vbf" Feb 26 15:56:20 crc kubenswrapper[4907]: I0226 15:56:20.117975 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ps4gt\" (UniqueName: \"kubernetes.io/projected/177f40d7-0ed3-43d9-b8db-148511ab9065-kube-api-access-ps4gt\") pod \"cert-manager-webhook-687f57d79b-hdhr9\" (UID: \"177f40d7-0ed3-43d9-b8db-148511ab9065\") " pod="cert-manager/cert-manager-webhook-687f57d79b-hdhr9" Feb 26 15:56:20 crc kubenswrapper[4907]: I0226 15:56:20.141913 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-744nm\" (UniqueName: \"kubernetes.io/projected/1e1d1a02-d13e-4410-8762-ffa52da94db0-kube-api-access-744nm\") pod \"cert-manager-858654f9db-v6vbf\" (UID: \"1e1d1a02-d13e-4410-8762-ffa52da94db0\") " pod="cert-manager/cert-manager-858654f9db-v6vbf" Feb 26 15:56:20 crc kubenswrapper[4907]: I0226 15:56:20.150225 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pjwp5\" (UniqueName: \"kubernetes.io/projected/2a995506-4e43-40d2-8e85-720648605979-kube-api-access-pjwp5\") pod \"cert-manager-cainjector-cf98fcc89-8lnvq\" (UID: \"2a995506-4e43-40d2-8e85-720648605979\") " pod="cert-manager/cert-manager-cainjector-cf98fcc89-8lnvq" Feb 26 15:56:20 crc kubenswrapper[4907]: I0226 15:56:20.218968 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ps4gt\" (UniqueName: \"kubernetes.io/projected/177f40d7-0ed3-43d9-b8db-148511ab9065-kube-api-access-ps4gt\") pod \"cert-manager-webhook-687f57d79b-hdhr9\" (UID: \"177f40d7-0ed3-43d9-b8db-148511ab9065\") " pod="cert-manager/cert-manager-webhook-687f57d79b-hdhr9" Feb 26 15:56:20 crc kubenswrapper[4907]: I0226 15:56:20.239899 4907 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ps4gt\" (UniqueName: \"kubernetes.io/projected/177f40d7-0ed3-43d9-b8db-148511ab9065-kube-api-access-ps4gt\") pod \"cert-manager-webhook-687f57d79b-hdhr9\" (UID: \"177f40d7-0ed3-43d9-b8db-148511ab9065\") " pod="cert-manager/cert-manager-webhook-687f57d79b-hdhr9" Feb 26 15:56:20 crc kubenswrapper[4907]: I0226 15:56:20.270436 4907 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="cert-manager/cert-manager-cainjector-cf98fcc89-8lnvq" Feb 26 15:56:20 crc kubenswrapper[4907]: I0226 15:56:20.308961 4907 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="cert-manager/cert-manager-858654f9db-v6vbf" Feb 26 15:56:20 crc kubenswrapper[4907]: I0226 15:56:20.360274 4907 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="cert-manager/cert-manager-webhook-687f57d79b-hdhr9" Feb 26 15:56:20 crc kubenswrapper[4907]: I0226 15:56:20.493761 4907 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-cainjector-cf98fcc89-8lnvq"] Feb 26 15:56:20 crc kubenswrapper[4907]: I0226 15:56:20.589738 4907 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-858654f9db-v6vbf"] Feb 26 15:56:20 crc kubenswrapper[4907]: W0226 15:56:20.594074 4907 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod1e1d1a02_d13e_4410_8762_ffa52da94db0.slice/crio-65bc791d177adb8cfd6b208db20f1554a69e00715e4cbe8cd05d8a12ad453585 WatchSource:0}: Error finding container 65bc791d177adb8cfd6b208db20f1554a69e00715e4cbe8cd05d8a12ad453585: Status 404 returned error can't find the container with id 65bc791d177adb8cfd6b208db20f1554a69e00715e4cbe8cd05d8a12ad453585 Feb 26 15:56:20 crc kubenswrapper[4907]: I0226 15:56:20.620330 4907 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-webhook-687f57d79b-hdhr9"] Feb 26 
15:56:20 crc kubenswrapper[4907]: W0226 15:56:20.624353 4907 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod177f40d7_0ed3_43d9_b8db_148511ab9065.slice/crio-173bd17b87ef0d59281e050b3357516caf690d7e07da708bb08284b12a4dde9e WatchSource:0}: Error finding container 173bd17b87ef0d59281e050b3357516caf690d7e07da708bb08284b12a4dde9e: Status 404 returned error can't find the container with id 173bd17b87ef0d59281e050b3357516caf690d7e07da708bb08284b12a4dde9e Feb 26 15:56:21 crc kubenswrapper[4907]: I0226 15:56:21.457549 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-858654f9db-v6vbf" event={"ID":"1e1d1a02-d13e-4410-8762-ffa52da94db0","Type":"ContainerStarted","Data":"65bc791d177adb8cfd6b208db20f1554a69e00715e4cbe8cd05d8a12ad453585"} Feb 26 15:56:21 crc kubenswrapper[4907]: I0226 15:56:21.461367 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-webhook-687f57d79b-hdhr9" event={"ID":"177f40d7-0ed3-43d9-b8db-148511ab9065","Type":"ContainerStarted","Data":"173bd17b87ef0d59281e050b3357516caf690d7e07da708bb08284b12a4dde9e"} Feb 26 15:56:21 crc kubenswrapper[4907]: I0226 15:56:21.464397 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-cainjector-cf98fcc89-8lnvq" event={"ID":"2a995506-4e43-40d2-8e85-720648605979","Type":"ContainerStarted","Data":"310f8cb56520ed126f9e269cf0c5688156cccbd80ebca4d36dfd28a85c0934d1"} Feb 26 15:56:22 crc kubenswrapper[4907]: I0226 15:56:22.760754 4907 scope.go:117] "RemoveContainer" containerID="cf3f3c1e48222cd514f503b65cc470d33e7c3a75035ab2d9d3af36adfe22dc4a" Feb 26 15:56:23 crc kubenswrapper[4907]: I0226 15:56:23.481455 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-cainjector-cf98fcc89-8lnvq" 
event={"ID":"2a995506-4e43-40d2-8e85-720648605979","Type":"ContainerStarted","Data":"33bb006a5c9dc102ed03f7bfcf88371a481a48b6925e281788c4321c75142be4"} Feb 26 15:56:24 crc kubenswrapper[4907]: I0226 15:56:24.503109 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-webhook-687f57d79b-hdhr9" event={"ID":"177f40d7-0ed3-43d9-b8db-148511ab9065","Type":"ContainerStarted","Data":"cd7223cbca1f5306f5892c4be1310843d9ea76cafa281fb0f47ab949604059a0"} Feb 26 15:56:24 crc kubenswrapper[4907]: I0226 15:56:24.503624 4907 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="cert-manager/cert-manager-webhook-687f57d79b-hdhr9" Feb 26 15:56:24 crc kubenswrapper[4907]: I0226 15:56:24.506018 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-858654f9db-v6vbf" event={"ID":"1e1d1a02-d13e-4410-8762-ffa52da94db0","Type":"ContainerStarted","Data":"9ca5b8a944f5491aa31727c784808db2edcc5e10690e26782a502eaf6c46d990"} Feb 26 15:56:24 crc kubenswrapper[4907]: I0226 15:56:24.522041 4907 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="cert-manager/cert-manager-webhook-687f57d79b-hdhr9" podStartSLOduration=2.209101945 podStartE2EDuration="5.52201876s" podCreationTimestamp="2026-02-26 15:56:19 +0000 UTC" firstStartedPulling="2026-02-26 15:56:20.631857686 +0000 UTC m=+843.150419535" lastFinishedPulling="2026-02-26 15:56:23.944774461 +0000 UTC m=+846.463336350" observedRunningTime="2026-02-26 15:56:24.51814004 +0000 UTC m=+847.036701889" watchObservedRunningTime="2026-02-26 15:56:24.52201876 +0000 UTC m=+847.040580609" Feb 26 15:56:24 crc kubenswrapper[4907]: I0226 15:56:24.527914 4907 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="cert-manager/cert-manager-cainjector-cf98fcc89-8lnvq" podStartSLOduration=3.422617083 podStartE2EDuration="5.527897491s" podCreationTimestamp="2026-02-26 15:56:19 +0000 UTC" firstStartedPulling="2026-02-26 15:56:20.505514642 +0000 UTC 
m=+843.024076491" lastFinishedPulling="2026-02-26 15:56:22.6107949 +0000 UTC m=+845.129356899" observedRunningTime="2026-02-26 15:56:23.498352583 +0000 UTC m=+846.016914432" watchObservedRunningTime="2026-02-26 15:56:24.527897491 +0000 UTC m=+847.046459340" Feb 26 15:56:24 crc kubenswrapper[4907]: I0226 15:56:24.549261 4907 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="cert-manager/cert-manager-858654f9db-v6vbf" podStartSLOduration=2.2618642270000002 podStartE2EDuration="5.549238632s" podCreationTimestamp="2026-02-26 15:56:19 +0000 UTC" firstStartedPulling="2026-02-26 15:56:20.596933277 +0000 UTC m=+843.115495126" lastFinishedPulling="2026-02-26 15:56:23.884307682 +0000 UTC m=+846.402869531" observedRunningTime="2026-02-26 15:56:24.539781708 +0000 UTC m=+847.058343557" watchObservedRunningTime="2026-02-26 15:56:24.549238632 +0000 UTC m=+847.067800491" Feb 26 15:56:25 crc kubenswrapper[4907]: I0226 15:56:25.269137 4907 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-ovn-kubernetes/ovnkube-node-vsvsw"] Feb 26 15:56:25 crc kubenswrapper[4907]: I0226 15:56:25.269571 4907 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-vsvsw" podUID="49ee65e1-8667-4ad7-a403-c899f0cc6a70" containerName="ovn-controller" containerID="cri-o://eca4b7a72754f7457c608969c5319a498c526ab128b28400d2aed5d0413ff487" gracePeriod=30 Feb 26 15:56:25 crc kubenswrapper[4907]: I0226 15:56:25.269658 4907 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-vsvsw" podUID="49ee65e1-8667-4ad7-a403-c899f0cc6a70" containerName="northd" containerID="cri-o://9e7470d80d872846d4d91e9070becfa3496dca8af1b315e637c34edce0dcd57b" gracePeriod=30 Feb 26 15:56:25 crc kubenswrapper[4907]: I0226 15:56:25.269732 4907 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-vsvsw" 
podUID="49ee65e1-8667-4ad7-a403-c899f0cc6a70" containerName="ovn-acl-logging" containerID="cri-o://17760db3d112b908ad1389e3c28c244e756ef06ec2b4f170e4f52e17f9a75a89" gracePeriod=30 Feb 26 15:56:25 crc kubenswrapper[4907]: I0226 15:56:25.269696 4907 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-vsvsw" podUID="49ee65e1-8667-4ad7-a403-c899f0cc6a70" containerName="sbdb" containerID="cri-o://cc2b19d04bf2ef1455fa049ed09ef927305f1ec89b19b42f39b0d8c1397f69df" gracePeriod=30 Feb 26 15:56:25 crc kubenswrapper[4907]: I0226 15:56:25.269756 4907 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-vsvsw" podUID="49ee65e1-8667-4ad7-a403-c899f0cc6a70" containerName="nbdb" containerID="cri-o://800657f54374550b21f96594e9c9ce4e7dff28c5c09061192a95bb8a668ebbea" gracePeriod=30 Feb 26 15:56:25 crc kubenswrapper[4907]: I0226 15:56:25.269803 4907 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-vsvsw" podUID="49ee65e1-8667-4ad7-a403-c899f0cc6a70" containerName="kube-rbac-proxy-node" containerID="cri-o://c70ed6854442dfb329171dc5c454c036c020cb91e1f6595eb3fbe2d95704d52d" gracePeriod=30 Feb 26 15:56:25 crc kubenswrapper[4907]: I0226 15:56:25.269860 4907 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-vsvsw" podUID="49ee65e1-8667-4ad7-a403-c899f0cc6a70" containerName="kube-rbac-proxy-ovn-metrics" containerID="cri-o://67439cebe8e10e13db8af6bc74e152eb562382fb3b2f026ba3cbfe42e3b4c921" gracePeriod=30 Feb 26 15:56:25 crc kubenswrapper[4907]: I0226 15:56:25.311751 4907 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-vsvsw" podUID="49ee65e1-8667-4ad7-a403-c899f0cc6a70" containerName="ovnkube-controller" 
containerID="cri-o://117dc082982a8b3a3318c864792eff748b564107aeddf5f1ef19f61923a7e1d3" gracePeriod=30 Feb 26 15:56:25 crc kubenswrapper[4907]: I0226 15:56:25.512393 4907 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-2gl5t_51024bd5-00ff-4e2f-927c-8c989b59d7be/kube-multus/2.log" Feb 26 15:56:25 crc kubenswrapper[4907]: I0226 15:56:25.512783 4907 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-2gl5t_51024bd5-00ff-4e2f-927c-8c989b59d7be/kube-multus/1.log" Feb 26 15:56:25 crc kubenswrapper[4907]: I0226 15:56:25.512827 4907 generic.go:334] "Generic (PLEG): container finished" podID="51024bd5-00ff-4e2f-927c-8c989b59d7be" containerID="7485dceccdb2068136cd7e452af5b857fbf4a0321439464c6d537dffff0f08bb" exitCode=2 Feb 26 15:56:25 crc kubenswrapper[4907]: I0226 15:56:25.512892 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-2gl5t" event={"ID":"51024bd5-00ff-4e2f-927c-8c989b59d7be","Type":"ContainerDied","Data":"7485dceccdb2068136cd7e452af5b857fbf4a0321439464c6d537dffff0f08bb"} Feb 26 15:56:25 crc kubenswrapper[4907]: I0226 15:56:25.512943 4907 scope.go:117] "RemoveContainer" containerID="e822f482000a6645405c4c5b3b74d28302ababcc6de59c9d2f392c08d1fd092f" Feb 26 15:56:25 crc kubenswrapper[4907]: I0226 15:56:25.513472 4907 scope.go:117] "RemoveContainer" containerID="7485dceccdb2068136cd7e452af5b857fbf4a0321439464c6d537dffff0f08bb" Feb 26 15:56:25 crc kubenswrapper[4907]: I0226 15:56:25.519973 4907 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-vsvsw_49ee65e1-8667-4ad7-a403-c899f0cc6a70/ovnkube-controller/3.log" Feb 26 15:56:25 crc kubenswrapper[4907]: I0226 15:56:25.523939 4907 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-vsvsw_49ee65e1-8667-4ad7-a403-c899f0cc6a70/ovn-acl-logging/0.log" Feb 26 15:56:25 crc kubenswrapper[4907]: I0226 15:56:25.524521 4907 log.go:25] "Finished 
parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-vsvsw_49ee65e1-8667-4ad7-a403-c899f0cc6a70/ovn-controller/0.log" Feb 26 15:56:25 crc kubenswrapper[4907]: I0226 15:56:25.524895 4907 generic.go:334] "Generic (PLEG): container finished" podID="49ee65e1-8667-4ad7-a403-c899f0cc6a70" containerID="117dc082982a8b3a3318c864792eff748b564107aeddf5f1ef19f61923a7e1d3" exitCode=0 Feb 26 15:56:25 crc kubenswrapper[4907]: I0226 15:56:25.524916 4907 generic.go:334] "Generic (PLEG): container finished" podID="49ee65e1-8667-4ad7-a403-c899f0cc6a70" containerID="cc2b19d04bf2ef1455fa049ed09ef927305f1ec89b19b42f39b0d8c1397f69df" exitCode=0 Feb 26 15:56:25 crc kubenswrapper[4907]: I0226 15:56:25.524923 4907 generic.go:334] "Generic (PLEG): container finished" podID="49ee65e1-8667-4ad7-a403-c899f0cc6a70" containerID="67439cebe8e10e13db8af6bc74e152eb562382fb3b2f026ba3cbfe42e3b4c921" exitCode=0 Feb 26 15:56:25 crc kubenswrapper[4907]: I0226 15:56:25.524930 4907 generic.go:334] "Generic (PLEG): container finished" podID="49ee65e1-8667-4ad7-a403-c899f0cc6a70" containerID="c70ed6854442dfb329171dc5c454c036c020cb91e1f6595eb3fbe2d95704d52d" exitCode=0 Feb 26 15:56:25 crc kubenswrapper[4907]: I0226 15:56:25.524936 4907 generic.go:334] "Generic (PLEG): container finished" podID="49ee65e1-8667-4ad7-a403-c899f0cc6a70" containerID="17760db3d112b908ad1389e3c28c244e756ef06ec2b4f170e4f52e17f9a75a89" exitCode=143 Feb 26 15:56:25 crc kubenswrapper[4907]: I0226 15:56:25.524946 4907 generic.go:334] "Generic (PLEG): container finished" podID="49ee65e1-8667-4ad7-a403-c899f0cc6a70" containerID="eca4b7a72754f7457c608969c5319a498c526ab128b28400d2aed5d0413ff487" exitCode=143 Feb 26 15:56:25 crc kubenswrapper[4907]: I0226 15:56:25.524983 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-vsvsw" event={"ID":"49ee65e1-8667-4ad7-a403-c899f0cc6a70","Type":"ContainerDied","Data":"117dc082982a8b3a3318c864792eff748b564107aeddf5f1ef19f61923a7e1d3"} 
Feb 26 15:56:25 crc kubenswrapper[4907]: I0226 15:56:25.525029 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-vsvsw" event={"ID":"49ee65e1-8667-4ad7-a403-c899f0cc6a70","Type":"ContainerDied","Data":"cc2b19d04bf2ef1455fa049ed09ef927305f1ec89b19b42f39b0d8c1397f69df"} Feb 26 15:56:25 crc kubenswrapper[4907]: I0226 15:56:25.525040 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-vsvsw" event={"ID":"49ee65e1-8667-4ad7-a403-c899f0cc6a70","Type":"ContainerDied","Data":"67439cebe8e10e13db8af6bc74e152eb562382fb3b2f026ba3cbfe42e3b4c921"} Feb 26 15:56:25 crc kubenswrapper[4907]: I0226 15:56:25.525049 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-vsvsw" event={"ID":"49ee65e1-8667-4ad7-a403-c899f0cc6a70","Type":"ContainerDied","Data":"c70ed6854442dfb329171dc5c454c036c020cb91e1f6595eb3fbe2d95704d52d"} Feb 26 15:56:25 crc kubenswrapper[4907]: I0226 15:56:25.525058 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-vsvsw" event={"ID":"49ee65e1-8667-4ad7-a403-c899f0cc6a70","Type":"ContainerDied","Data":"17760db3d112b908ad1389e3c28c244e756ef06ec2b4f170e4f52e17f9a75a89"} Feb 26 15:56:25 crc kubenswrapper[4907]: I0226 15:56:25.525066 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-vsvsw" event={"ID":"49ee65e1-8667-4ad7-a403-c899f0cc6a70","Type":"ContainerDied","Data":"eca4b7a72754f7457c608969c5319a498c526ab128b28400d2aed5d0413ff487"} Feb 26 15:56:25 crc kubenswrapper[4907]: I0226 15:56:25.571977 4907 scope.go:117] "RemoveContainer" containerID="51787a0de7c6993ba3bfd70265cc1718966209053e9703f5fc5b039f3d78abae" Feb 26 15:56:25 crc kubenswrapper[4907]: I0226 15:56:25.610184 4907 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-vsvsw_49ee65e1-8667-4ad7-a403-c899f0cc6a70/ovn-acl-logging/0.log" Feb 
26 15:56:25 crc kubenswrapper[4907]: I0226 15:56:25.611094 4907 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-vsvsw_49ee65e1-8667-4ad7-a403-c899f0cc6a70/ovn-controller/0.log" Feb 26 15:56:25 crc kubenswrapper[4907]: I0226 15:56:25.611534 4907 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-vsvsw" Feb 26 15:56:25 crc kubenswrapper[4907]: I0226 15:56:25.661083 4907 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-ovn-kubernetes/ovnkube-node-m765d"] Feb 26 15:56:25 crc kubenswrapper[4907]: E0226 15:56:25.661267 4907 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="49ee65e1-8667-4ad7-a403-c899f0cc6a70" containerName="northd" Feb 26 15:56:25 crc kubenswrapper[4907]: I0226 15:56:25.661278 4907 state_mem.go:107] "Deleted CPUSet assignment" podUID="49ee65e1-8667-4ad7-a403-c899f0cc6a70" containerName="northd" Feb 26 15:56:25 crc kubenswrapper[4907]: E0226 15:56:25.661291 4907 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="49ee65e1-8667-4ad7-a403-c899f0cc6a70" containerName="ovnkube-controller" Feb 26 15:56:25 crc kubenswrapper[4907]: I0226 15:56:25.661297 4907 state_mem.go:107] "Deleted CPUSet assignment" podUID="49ee65e1-8667-4ad7-a403-c899f0cc6a70" containerName="ovnkube-controller" Feb 26 15:56:25 crc kubenswrapper[4907]: E0226 15:56:25.661305 4907 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="49ee65e1-8667-4ad7-a403-c899f0cc6a70" containerName="ovnkube-controller" Feb 26 15:56:25 crc kubenswrapper[4907]: I0226 15:56:25.661310 4907 state_mem.go:107] "Deleted CPUSet assignment" podUID="49ee65e1-8667-4ad7-a403-c899f0cc6a70" containerName="ovnkube-controller" Feb 26 15:56:25 crc kubenswrapper[4907]: E0226 15:56:25.661317 4907 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="49ee65e1-8667-4ad7-a403-c899f0cc6a70" containerName="kube-rbac-proxy-node" Feb 26 15:56:25 
crc kubenswrapper[4907]: I0226 15:56:25.661323 4907 state_mem.go:107] "Deleted CPUSet assignment" podUID="49ee65e1-8667-4ad7-a403-c899f0cc6a70" containerName="kube-rbac-proxy-node" Feb 26 15:56:25 crc kubenswrapper[4907]: E0226 15:56:25.661332 4907 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="49ee65e1-8667-4ad7-a403-c899f0cc6a70" containerName="ovn-acl-logging" Feb 26 15:56:25 crc kubenswrapper[4907]: I0226 15:56:25.661338 4907 state_mem.go:107] "Deleted CPUSet assignment" podUID="49ee65e1-8667-4ad7-a403-c899f0cc6a70" containerName="ovn-acl-logging" Feb 26 15:56:25 crc kubenswrapper[4907]: E0226 15:56:25.661348 4907 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="49ee65e1-8667-4ad7-a403-c899f0cc6a70" containerName="kubecfg-setup" Feb 26 15:56:25 crc kubenswrapper[4907]: I0226 15:56:25.661354 4907 state_mem.go:107] "Deleted CPUSet assignment" podUID="49ee65e1-8667-4ad7-a403-c899f0cc6a70" containerName="kubecfg-setup" Feb 26 15:56:25 crc kubenswrapper[4907]: E0226 15:56:25.661364 4907 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="49ee65e1-8667-4ad7-a403-c899f0cc6a70" containerName="sbdb" Feb 26 15:56:25 crc kubenswrapper[4907]: I0226 15:56:25.661369 4907 state_mem.go:107] "Deleted CPUSet assignment" podUID="49ee65e1-8667-4ad7-a403-c899f0cc6a70" containerName="sbdb" Feb 26 15:56:25 crc kubenswrapper[4907]: E0226 15:56:25.661377 4907 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="49ee65e1-8667-4ad7-a403-c899f0cc6a70" containerName="nbdb" Feb 26 15:56:25 crc kubenswrapper[4907]: I0226 15:56:25.661382 4907 state_mem.go:107] "Deleted CPUSet assignment" podUID="49ee65e1-8667-4ad7-a403-c899f0cc6a70" containerName="nbdb" Feb 26 15:56:25 crc kubenswrapper[4907]: E0226 15:56:25.661389 4907 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="49ee65e1-8667-4ad7-a403-c899f0cc6a70" containerName="ovn-controller" Feb 26 15:56:25 crc kubenswrapper[4907]: I0226 15:56:25.661395 4907 
state_mem.go:107] "Deleted CPUSet assignment" podUID="49ee65e1-8667-4ad7-a403-c899f0cc6a70" containerName="ovn-controller" Feb 26 15:56:25 crc kubenswrapper[4907]: E0226 15:56:25.661400 4907 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="49ee65e1-8667-4ad7-a403-c899f0cc6a70" containerName="ovnkube-controller" Feb 26 15:56:25 crc kubenswrapper[4907]: I0226 15:56:25.661406 4907 state_mem.go:107] "Deleted CPUSet assignment" podUID="49ee65e1-8667-4ad7-a403-c899f0cc6a70" containerName="ovnkube-controller" Feb 26 15:56:25 crc kubenswrapper[4907]: E0226 15:56:25.661413 4907 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="49ee65e1-8667-4ad7-a403-c899f0cc6a70" containerName="kube-rbac-proxy-ovn-metrics" Feb 26 15:56:25 crc kubenswrapper[4907]: I0226 15:56:25.661419 4907 state_mem.go:107] "Deleted CPUSet assignment" podUID="49ee65e1-8667-4ad7-a403-c899f0cc6a70" containerName="kube-rbac-proxy-ovn-metrics" Feb 26 15:56:25 crc kubenswrapper[4907]: I0226 15:56:25.661499 4907 memory_manager.go:354] "RemoveStaleState removing state" podUID="49ee65e1-8667-4ad7-a403-c899f0cc6a70" containerName="northd" Feb 26 15:56:25 crc kubenswrapper[4907]: I0226 15:56:25.661509 4907 memory_manager.go:354] "RemoveStaleState removing state" podUID="49ee65e1-8667-4ad7-a403-c899f0cc6a70" containerName="kube-rbac-proxy-ovn-metrics" Feb 26 15:56:25 crc kubenswrapper[4907]: I0226 15:56:25.661517 4907 memory_manager.go:354] "RemoveStaleState removing state" podUID="49ee65e1-8667-4ad7-a403-c899f0cc6a70" containerName="nbdb" Feb 26 15:56:25 crc kubenswrapper[4907]: I0226 15:56:25.661525 4907 memory_manager.go:354] "RemoveStaleState removing state" podUID="49ee65e1-8667-4ad7-a403-c899f0cc6a70" containerName="sbdb" Feb 26 15:56:25 crc kubenswrapper[4907]: I0226 15:56:25.661532 4907 memory_manager.go:354] "RemoveStaleState removing state" podUID="49ee65e1-8667-4ad7-a403-c899f0cc6a70" containerName="kube-rbac-proxy-node" Feb 26 15:56:25 crc kubenswrapper[4907]: I0226 
15:56:25.661540 4907 memory_manager.go:354] "RemoveStaleState removing state" podUID="49ee65e1-8667-4ad7-a403-c899f0cc6a70" containerName="ovnkube-controller" Feb 26 15:56:25 crc kubenswrapper[4907]: I0226 15:56:25.661546 4907 memory_manager.go:354] "RemoveStaleState removing state" podUID="49ee65e1-8667-4ad7-a403-c899f0cc6a70" containerName="ovnkube-controller" Feb 26 15:56:25 crc kubenswrapper[4907]: I0226 15:56:25.661552 4907 memory_manager.go:354] "RemoveStaleState removing state" podUID="49ee65e1-8667-4ad7-a403-c899f0cc6a70" containerName="ovn-acl-logging" Feb 26 15:56:25 crc kubenswrapper[4907]: I0226 15:56:25.661563 4907 memory_manager.go:354] "RemoveStaleState removing state" podUID="49ee65e1-8667-4ad7-a403-c899f0cc6a70" containerName="ovnkube-controller" Feb 26 15:56:25 crc kubenswrapper[4907]: I0226 15:56:25.661570 4907 memory_manager.go:354] "RemoveStaleState removing state" podUID="49ee65e1-8667-4ad7-a403-c899f0cc6a70" containerName="ovnkube-controller" Feb 26 15:56:25 crc kubenswrapper[4907]: I0226 15:56:25.661580 4907 memory_manager.go:354] "RemoveStaleState removing state" podUID="49ee65e1-8667-4ad7-a403-c899f0cc6a70" containerName="ovn-controller" Feb 26 15:56:25 crc kubenswrapper[4907]: E0226 15:56:25.661681 4907 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="49ee65e1-8667-4ad7-a403-c899f0cc6a70" containerName="ovnkube-controller" Feb 26 15:56:25 crc kubenswrapper[4907]: I0226 15:56:25.661689 4907 state_mem.go:107] "Deleted CPUSet assignment" podUID="49ee65e1-8667-4ad7-a403-c899f0cc6a70" containerName="ovnkube-controller" Feb 26 15:56:25 crc kubenswrapper[4907]: E0226 15:56:25.661697 4907 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="49ee65e1-8667-4ad7-a403-c899f0cc6a70" containerName="ovnkube-controller" Feb 26 15:56:25 crc kubenswrapper[4907]: I0226 15:56:25.661702 4907 state_mem.go:107] "Deleted CPUSet assignment" podUID="49ee65e1-8667-4ad7-a403-c899f0cc6a70" containerName="ovnkube-controller" Feb 26 
15:56:25 crc kubenswrapper[4907]: I0226 15:56:25.661788 4907 memory_manager.go:354] "RemoveStaleState removing state" podUID="49ee65e1-8667-4ad7-a403-c899f0cc6a70" containerName="ovnkube-controller" Feb 26 15:56:25 crc kubenswrapper[4907]: I0226 15:56:25.663218 4907 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-m765d" Feb 26 15:56:25 crc kubenswrapper[4907]: I0226 15:56:25.694957 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/49ee65e1-8667-4ad7-a403-c899f0cc6a70-etc-openvswitch\") pod \"49ee65e1-8667-4ad7-a403-c899f0cc6a70\" (UID: \"49ee65e1-8667-4ad7-a403-c899f0cc6a70\") " Feb 26 15:56:25 crc kubenswrapper[4907]: I0226 15:56:25.695017 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/49ee65e1-8667-4ad7-a403-c899f0cc6a70-host-run-ovn-kubernetes\") pod \"49ee65e1-8667-4ad7-a403-c899f0cc6a70\" (UID: \"49ee65e1-8667-4ad7-a403-c899f0cc6a70\") " Feb 26 15:56:25 crc kubenswrapper[4907]: I0226 15:56:25.695041 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/49ee65e1-8667-4ad7-a403-c899f0cc6a70-log-socket\") pod \"49ee65e1-8667-4ad7-a403-c899f0cc6a70\" (UID: \"49ee65e1-8667-4ad7-a403-c899f0cc6a70\") " Feb 26 15:56:25 crc kubenswrapper[4907]: I0226 15:56:25.695059 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/49ee65e1-8667-4ad7-a403-c899f0cc6a70-systemd-units\") pod \"49ee65e1-8667-4ad7-a403-c899f0cc6a70\" (UID: \"49ee65e1-8667-4ad7-a403-c899f0cc6a70\") " Feb 26 15:56:25 crc kubenswrapper[4907]: I0226 15:56:25.695085 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"env-overrides\" (UniqueName: 
\"kubernetes.io/configmap/49ee65e1-8667-4ad7-a403-c899f0cc6a70-env-overrides\") pod \"49ee65e1-8667-4ad7-a403-c899f0cc6a70\" (UID: \"49ee65e1-8667-4ad7-a403-c899f0cc6a70\") " Feb 26 15:56:25 crc kubenswrapper[4907]: I0226 15:56:25.695108 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/49ee65e1-8667-4ad7-a403-c899f0cc6a70-node-log\") pod \"49ee65e1-8667-4ad7-a403-c899f0cc6a70\" (UID: \"49ee65e1-8667-4ad7-a403-c899f0cc6a70\") " Feb 26 15:56:25 crc kubenswrapper[4907]: I0226 15:56:25.695125 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/49ee65e1-8667-4ad7-a403-c899f0cc6a70-host-slash\") pod \"49ee65e1-8667-4ad7-a403-c899f0cc6a70\" (UID: \"49ee65e1-8667-4ad7-a403-c899f0cc6a70\") " Feb 26 15:56:25 crc kubenswrapper[4907]: I0226 15:56:25.695146 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/49ee65e1-8667-4ad7-a403-c899f0cc6a70-run-systemd\") pod \"49ee65e1-8667-4ad7-a403-c899f0cc6a70\" (UID: \"49ee65e1-8667-4ad7-a403-c899f0cc6a70\") " Feb 26 15:56:25 crc kubenswrapper[4907]: I0226 15:56:25.695160 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/49ee65e1-8667-4ad7-a403-c899f0cc6a70-host-var-lib-cni-networks-ovn-kubernetes\") pod \"49ee65e1-8667-4ad7-a403-c899f0cc6a70\" (UID: \"49ee65e1-8667-4ad7-a403-c899f0cc6a70\") " Feb 26 15:56:25 crc kubenswrapper[4907]: I0226 15:56:25.695189 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/49ee65e1-8667-4ad7-a403-c899f0cc6a70-run-openvswitch\") pod \"49ee65e1-8667-4ad7-a403-c899f0cc6a70\" (UID: \"49ee65e1-8667-4ad7-a403-c899f0cc6a70\") " Feb 26 15:56:25 crc 
kubenswrapper[4907]: I0226 15:56:25.695206 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/49ee65e1-8667-4ad7-a403-c899f0cc6a70-ovn-node-metrics-cert\") pod \"49ee65e1-8667-4ad7-a403-c899f0cc6a70\" (UID: \"49ee65e1-8667-4ad7-a403-c899f0cc6a70\") " Feb 26 15:56:25 crc kubenswrapper[4907]: I0226 15:56:25.695221 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/49ee65e1-8667-4ad7-a403-c899f0cc6a70-host-cni-netd\") pod \"49ee65e1-8667-4ad7-a403-c899f0cc6a70\" (UID: \"49ee65e1-8667-4ad7-a403-c899f0cc6a70\") " Feb 26 15:56:25 crc kubenswrapper[4907]: I0226 15:56:25.695239 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/49ee65e1-8667-4ad7-a403-c899f0cc6a70-host-cni-bin\") pod \"49ee65e1-8667-4ad7-a403-c899f0cc6a70\" (UID: \"49ee65e1-8667-4ad7-a403-c899f0cc6a70\") " Feb 26 15:56:25 crc kubenswrapper[4907]: I0226 15:56:25.695304 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/49ee65e1-8667-4ad7-a403-c899f0cc6a70-host-kubelet\") pod \"49ee65e1-8667-4ad7-a403-c899f0cc6a70\" (UID: \"49ee65e1-8667-4ad7-a403-c899f0cc6a70\") " Feb 26 15:56:25 crc kubenswrapper[4907]: I0226 15:56:25.695319 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/49ee65e1-8667-4ad7-a403-c899f0cc6a70-ovnkube-script-lib\") pod \"49ee65e1-8667-4ad7-a403-c899f0cc6a70\" (UID: \"49ee65e1-8667-4ad7-a403-c899f0cc6a70\") " Feb 26 15:56:25 crc kubenswrapper[4907]: I0226 15:56:25.695340 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7hmb7\" (UniqueName: 
\"kubernetes.io/projected/49ee65e1-8667-4ad7-a403-c899f0cc6a70-kube-api-access-7hmb7\") pod \"49ee65e1-8667-4ad7-a403-c899f0cc6a70\" (UID: \"49ee65e1-8667-4ad7-a403-c899f0cc6a70\") " Feb 26 15:56:25 crc kubenswrapper[4907]: I0226 15:56:25.695357 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/49ee65e1-8667-4ad7-a403-c899f0cc6a70-ovnkube-config\") pod \"49ee65e1-8667-4ad7-a403-c899f0cc6a70\" (UID: \"49ee65e1-8667-4ad7-a403-c899f0cc6a70\") " Feb 26 15:56:25 crc kubenswrapper[4907]: I0226 15:56:25.695386 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/49ee65e1-8667-4ad7-a403-c899f0cc6a70-var-lib-openvswitch\") pod \"49ee65e1-8667-4ad7-a403-c899f0cc6a70\" (UID: \"49ee65e1-8667-4ad7-a403-c899f0cc6a70\") " Feb 26 15:56:25 crc kubenswrapper[4907]: I0226 15:56:25.695411 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/49ee65e1-8667-4ad7-a403-c899f0cc6a70-host-run-netns\") pod \"49ee65e1-8667-4ad7-a403-c899f0cc6a70\" (UID: \"49ee65e1-8667-4ad7-a403-c899f0cc6a70\") " Feb 26 15:56:25 crc kubenswrapper[4907]: I0226 15:56:25.695428 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/49ee65e1-8667-4ad7-a403-c899f0cc6a70-run-ovn\") pod \"49ee65e1-8667-4ad7-a403-c899f0cc6a70\" (UID: \"49ee65e1-8667-4ad7-a403-c899f0cc6a70\") " Feb 26 15:56:25 crc kubenswrapper[4907]: I0226 15:56:25.695545 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/d1ce2849-824c-4c72-a86c-d0128e548d92-node-log\") pod \"ovnkube-node-m765d\" (UID: \"d1ce2849-824c-4c72-a86c-d0128e548d92\") " pod="openshift-ovn-kubernetes/ovnkube-node-m765d" Feb 26 
15:56:25 crc kubenswrapper[4907]: I0226 15:56:25.695568 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/d1ce2849-824c-4c72-a86c-d0128e548d92-host-kubelet\") pod \"ovnkube-node-m765d\" (UID: \"d1ce2849-824c-4c72-a86c-d0128e548d92\") " pod="openshift-ovn-kubernetes/ovnkube-node-m765d" Feb 26 15:56:25 crc kubenswrapper[4907]: I0226 15:56:25.695609 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/d1ce2849-824c-4c72-a86c-d0128e548d92-etc-openvswitch\") pod \"ovnkube-node-m765d\" (UID: \"d1ce2849-824c-4c72-a86c-d0128e548d92\") " pod="openshift-ovn-kubernetes/ovnkube-node-m765d" Feb 26 15:56:25 crc kubenswrapper[4907]: I0226 15:56:25.695633 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/d1ce2849-824c-4c72-a86c-d0128e548d92-host-run-netns\") pod \"ovnkube-node-m765d\" (UID: \"d1ce2849-824c-4c72-a86c-d0128e548d92\") " pod="openshift-ovn-kubernetes/ovnkube-node-m765d" Feb 26 15:56:25 crc kubenswrapper[4907]: I0226 15:56:25.695649 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/d1ce2849-824c-4c72-a86c-d0128e548d92-var-lib-openvswitch\") pod \"ovnkube-node-m765d\" (UID: \"d1ce2849-824c-4c72-a86c-d0128e548d92\") " pod="openshift-ovn-kubernetes/ovnkube-node-m765d" Feb 26 15:56:25 crc kubenswrapper[4907]: I0226 15:56:25.695668 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/d1ce2849-824c-4c72-a86c-d0128e548d92-run-openvswitch\") pod \"ovnkube-node-m765d\" (UID: \"d1ce2849-824c-4c72-a86c-d0128e548d92\") " 
pod="openshift-ovn-kubernetes/ovnkube-node-m765d" Feb 26 15:56:25 crc kubenswrapper[4907]: I0226 15:56:25.695684 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/d1ce2849-824c-4c72-a86c-d0128e548d92-host-cni-netd\") pod \"ovnkube-node-m765d\" (UID: \"d1ce2849-824c-4c72-a86c-d0128e548d92\") " pod="openshift-ovn-kubernetes/ovnkube-node-m765d" Feb 26 15:56:25 crc kubenswrapper[4907]: I0226 15:56:25.695704 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/d1ce2849-824c-4c72-a86c-d0128e548d92-host-slash\") pod \"ovnkube-node-m765d\" (UID: \"d1ce2849-824c-4c72-a86c-d0128e548d92\") " pod="openshift-ovn-kubernetes/ovnkube-node-m765d" Feb 26 15:56:25 crc kubenswrapper[4907]: I0226 15:56:25.695719 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/d1ce2849-824c-4c72-a86c-d0128e548d92-run-systemd\") pod \"ovnkube-node-m765d\" (UID: \"d1ce2849-824c-4c72-a86c-d0128e548d92\") " pod="openshift-ovn-kubernetes/ovnkube-node-m765d" Feb 26 15:56:25 crc kubenswrapper[4907]: I0226 15:56:25.695736 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/d1ce2849-824c-4c72-a86c-d0128e548d92-log-socket\") pod \"ovnkube-node-m765d\" (UID: \"d1ce2849-824c-4c72-a86c-d0128e548d92\") " pod="openshift-ovn-kubernetes/ovnkube-node-m765d" Feb 26 15:56:25 crc kubenswrapper[4907]: I0226 15:56:25.695760 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/d1ce2849-824c-4c72-a86c-d0128e548d92-run-ovn\") pod \"ovnkube-node-m765d\" (UID: \"d1ce2849-824c-4c72-a86c-d0128e548d92\") " 
pod="openshift-ovn-kubernetes/ovnkube-node-m765d" Feb 26 15:56:25 crc kubenswrapper[4907]: I0226 15:56:25.695788 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/d1ce2849-824c-4c72-a86c-d0128e548d92-env-overrides\") pod \"ovnkube-node-m765d\" (UID: \"d1ce2849-824c-4c72-a86c-d0128e548d92\") " pod="openshift-ovn-kubernetes/ovnkube-node-m765d" Feb 26 15:56:25 crc kubenswrapper[4907]: I0226 15:56:25.695807 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/d1ce2849-824c-4c72-a86c-d0128e548d92-ovn-node-metrics-cert\") pod \"ovnkube-node-m765d\" (UID: \"d1ce2849-824c-4c72-a86c-d0128e548d92\") " pod="openshift-ovn-kubernetes/ovnkube-node-m765d" Feb 26 15:56:25 crc kubenswrapper[4907]: I0226 15:56:25.695826 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/d1ce2849-824c-4c72-a86c-d0128e548d92-ovnkube-script-lib\") pod \"ovnkube-node-m765d\" (UID: \"d1ce2849-824c-4c72-a86c-d0128e548d92\") " pod="openshift-ovn-kubernetes/ovnkube-node-m765d" Feb 26 15:56:25 crc kubenswrapper[4907]: I0226 15:56:25.695858 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/d1ce2849-824c-4c72-a86c-d0128e548d92-host-run-ovn-kubernetes\") pod \"ovnkube-node-m765d\" (UID: \"d1ce2849-824c-4c72-a86c-d0128e548d92\") " pod="openshift-ovn-kubernetes/ovnkube-node-m765d" Feb 26 15:56:25 crc kubenswrapper[4907]: I0226 15:56:25.695881 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: 
\"kubernetes.io/host-path/d1ce2849-824c-4c72-a86c-d0128e548d92-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-m765d\" (UID: \"d1ce2849-824c-4c72-a86c-d0128e548d92\") " pod="openshift-ovn-kubernetes/ovnkube-node-m765d" Feb 26 15:56:25 crc kubenswrapper[4907]: I0226 15:56:25.695898 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/d1ce2849-824c-4c72-a86c-d0128e548d92-host-cni-bin\") pod \"ovnkube-node-m765d\" (UID: \"d1ce2849-824c-4c72-a86c-d0128e548d92\") " pod="openshift-ovn-kubernetes/ovnkube-node-m765d" Feb 26 15:56:25 crc kubenswrapper[4907]: I0226 15:56:25.695918 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/d1ce2849-824c-4c72-a86c-d0128e548d92-ovnkube-config\") pod \"ovnkube-node-m765d\" (UID: \"d1ce2849-824c-4c72-a86c-d0128e548d92\") " pod="openshift-ovn-kubernetes/ovnkube-node-m765d" Feb 26 15:56:25 crc kubenswrapper[4907]: I0226 15:56:25.695935 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/d1ce2849-824c-4c72-a86c-d0128e548d92-systemd-units\") pod \"ovnkube-node-m765d\" (UID: \"d1ce2849-824c-4c72-a86c-d0128e548d92\") " pod="openshift-ovn-kubernetes/ovnkube-node-m765d" Feb 26 15:56:25 crc kubenswrapper[4907]: I0226 15:56:25.695952 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mfs5w\" (UniqueName: \"kubernetes.io/projected/d1ce2849-824c-4c72-a86c-d0128e548d92-kube-api-access-mfs5w\") pod \"ovnkube-node-m765d\" (UID: \"d1ce2849-824c-4c72-a86c-d0128e548d92\") " pod="openshift-ovn-kubernetes/ovnkube-node-m765d" Feb 26 15:56:25 crc kubenswrapper[4907]: I0226 15:56:25.696053 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/host-path/49ee65e1-8667-4ad7-a403-c899f0cc6a70-host-var-lib-cni-networks-ovn-kubernetes" (OuterVolumeSpecName: "host-var-lib-cni-networks-ovn-kubernetes") pod "49ee65e1-8667-4ad7-a403-c899f0cc6a70" (UID: "49ee65e1-8667-4ad7-a403-c899f0cc6a70"). InnerVolumeSpecName "host-var-lib-cni-networks-ovn-kubernetes". PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 26 15:56:25 crc kubenswrapper[4907]: I0226 15:56:25.696078 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/49ee65e1-8667-4ad7-a403-c899f0cc6a70-run-openvswitch" (OuterVolumeSpecName: "run-openvswitch") pod "49ee65e1-8667-4ad7-a403-c899f0cc6a70" (UID: "49ee65e1-8667-4ad7-a403-c899f0cc6a70"). InnerVolumeSpecName "run-openvswitch". PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 26 15:56:25 crc kubenswrapper[4907]: I0226 15:56:25.696552 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/49ee65e1-8667-4ad7-a403-c899f0cc6a70-host-cni-netd" (OuterVolumeSpecName: "host-cni-netd") pod "49ee65e1-8667-4ad7-a403-c899f0cc6a70" (UID: "49ee65e1-8667-4ad7-a403-c899f0cc6a70"). InnerVolumeSpecName "host-cni-netd". PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 26 15:56:25 crc kubenswrapper[4907]: I0226 15:56:25.696601 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/49ee65e1-8667-4ad7-a403-c899f0cc6a70-host-cni-bin" (OuterVolumeSpecName: "host-cni-bin") pod "49ee65e1-8667-4ad7-a403-c899f0cc6a70" (UID: "49ee65e1-8667-4ad7-a403-c899f0cc6a70"). InnerVolumeSpecName "host-cni-bin". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 26 15:56:25 crc kubenswrapper[4907]: I0226 15:56:25.696630 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/49ee65e1-8667-4ad7-a403-c899f0cc6a70-host-kubelet" (OuterVolumeSpecName: "host-kubelet") pod "49ee65e1-8667-4ad7-a403-c899f0cc6a70" (UID: "49ee65e1-8667-4ad7-a403-c899f0cc6a70"). InnerVolumeSpecName "host-kubelet". PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 26 15:56:25 crc kubenswrapper[4907]: I0226 15:56:25.696896 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/49ee65e1-8667-4ad7-a403-c899f0cc6a70-ovnkube-script-lib" (OuterVolumeSpecName: "ovnkube-script-lib") pod "49ee65e1-8667-4ad7-a403-c899f0cc6a70" (UID: "49ee65e1-8667-4ad7-a403-c899f0cc6a70"). InnerVolumeSpecName "ovnkube-script-lib". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 26 15:56:25 crc kubenswrapper[4907]: I0226 15:56:25.697310 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/49ee65e1-8667-4ad7-a403-c899f0cc6a70-systemd-units" (OuterVolumeSpecName: "systemd-units") pod "49ee65e1-8667-4ad7-a403-c899f0cc6a70" (UID: "49ee65e1-8667-4ad7-a403-c899f0cc6a70"). InnerVolumeSpecName "systemd-units". PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 26 15:56:25 crc kubenswrapper[4907]: I0226 15:56:25.697377 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/49ee65e1-8667-4ad7-a403-c899f0cc6a70-etc-openvswitch" (OuterVolumeSpecName: "etc-openvswitch") pod "49ee65e1-8667-4ad7-a403-c899f0cc6a70" (UID: "49ee65e1-8667-4ad7-a403-c899f0cc6a70"). InnerVolumeSpecName "etc-openvswitch". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 26 15:56:25 crc kubenswrapper[4907]: I0226 15:56:25.697409 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/49ee65e1-8667-4ad7-a403-c899f0cc6a70-host-run-ovn-kubernetes" (OuterVolumeSpecName: "host-run-ovn-kubernetes") pod "49ee65e1-8667-4ad7-a403-c899f0cc6a70" (UID: "49ee65e1-8667-4ad7-a403-c899f0cc6a70"). InnerVolumeSpecName "host-run-ovn-kubernetes". PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 26 15:56:25 crc kubenswrapper[4907]: I0226 15:56:25.697435 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/49ee65e1-8667-4ad7-a403-c899f0cc6a70-log-socket" (OuterVolumeSpecName: "log-socket") pod "49ee65e1-8667-4ad7-a403-c899f0cc6a70" (UID: "49ee65e1-8667-4ad7-a403-c899f0cc6a70"). InnerVolumeSpecName "log-socket". PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 26 15:56:25 crc kubenswrapper[4907]: I0226 15:56:25.697461 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/49ee65e1-8667-4ad7-a403-c899f0cc6a70-node-log" (OuterVolumeSpecName: "node-log") pod "49ee65e1-8667-4ad7-a403-c899f0cc6a70" (UID: "49ee65e1-8667-4ad7-a403-c899f0cc6a70"). InnerVolumeSpecName "node-log". PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 26 15:56:25 crc kubenswrapper[4907]: I0226 15:56:25.697614 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/49ee65e1-8667-4ad7-a403-c899f0cc6a70-run-ovn" (OuterVolumeSpecName: "run-ovn") pod "49ee65e1-8667-4ad7-a403-c899f0cc6a70" (UID: "49ee65e1-8667-4ad7-a403-c899f0cc6a70"). InnerVolumeSpecName "run-ovn". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 26 15:56:25 crc kubenswrapper[4907]: I0226 15:56:25.697879 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/49ee65e1-8667-4ad7-a403-c899f0cc6a70-env-overrides" (OuterVolumeSpecName: "env-overrides") pod "49ee65e1-8667-4ad7-a403-c899f0cc6a70" (UID: "49ee65e1-8667-4ad7-a403-c899f0cc6a70"). InnerVolumeSpecName "env-overrides". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 26 15:56:25 crc kubenswrapper[4907]: I0226 15:56:25.697923 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/49ee65e1-8667-4ad7-a403-c899f0cc6a70-host-slash" (OuterVolumeSpecName: "host-slash") pod "49ee65e1-8667-4ad7-a403-c899f0cc6a70" (UID: "49ee65e1-8667-4ad7-a403-c899f0cc6a70"). InnerVolumeSpecName "host-slash". PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 26 15:56:25 crc kubenswrapper[4907]: I0226 15:56:25.697990 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/49ee65e1-8667-4ad7-a403-c899f0cc6a70-var-lib-openvswitch" (OuterVolumeSpecName: "var-lib-openvswitch") pod "49ee65e1-8667-4ad7-a403-c899f0cc6a70" (UID: "49ee65e1-8667-4ad7-a403-c899f0cc6a70"). InnerVolumeSpecName "var-lib-openvswitch". PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 26 15:56:25 crc kubenswrapper[4907]: I0226 15:56:25.698021 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/49ee65e1-8667-4ad7-a403-c899f0cc6a70-host-run-netns" (OuterVolumeSpecName: "host-run-netns") pod "49ee65e1-8667-4ad7-a403-c899f0cc6a70" (UID: "49ee65e1-8667-4ad7-a403-c899f0cc6a70"). InnerVolumeSpecName "host-run-netns". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 26 15:56:25 crc kubenswrapper[4907]: I0226 15:56:25.698323 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/49ee65e1-8667-4ad7-a403-c899f0cc6a70-ovnkube-config" (OuterVolumeSpecName: "ovnkube-config") pod "49ee65e1-8667-4ad7-a403-c899f0cc6a70" (UID: "49ee65e1-8667-4ad7-a403-c899f0cc6a70"). InnerVolumeSpecName "ovnkube-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 26 15:56:25 crc kubenswrapper[4907]: I0226 15:56:25.702443 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49ee65e1-8667-4ad7-a403-c899f0cc6a70-ovn-node-metrics-cert" (OuterVolumeSpecName: "ovn-node-metrics-cert") pod "49ee65e1-8667-4ad7-a403-c899f0cc6a70" (UID: "49ee65e1-8667-4ad7-a403-c899f0cc6a70"). InnerVolumeSpecName "ovn-node-metrics-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 26 15:56:25 crc kubenswrapper[4907]: I0226 15:56:25.703705 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/49ee65e1-8667-4ad7-a403-c899f0cc6a70-kube-api-access-7hmb7" (OuterVolumeSpecName: "kube-api-access-7hmb7") pod "49ee65e1-8667-4ad7-a403-c899f0cc6a70" (UID: "49ee65e1-8667-4ad7-a403-c899f0cc6a70"). InnerVolumeSpecName "kube-api-access-7hmb7". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 26 15:56:25 crc kubenswrapper[4907]: I0226 15:56:25.708937 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/49ee65e1-8667-4ad7-a403-c899f0cc6a70-run-systemd" (OuterVolumeSpecName: "run-systemd") pod "49ee65e1-8667-4ad7-a403-c899f0cc6a70" (UID: "49ee65e1-8667-4ad7-a403-c899f0cc6a70"). InnerVolumeSpecName "run-systemd". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 26 15:56:25 crc kubenswrapper[4907]: I0226 15:56:25.797469 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/d1ce2849-824c-4c72-a86c-d0128e548d92-host-run-netns\") pod \"ovnkube-node-m765d\" (UID: \"d1ce2849-824c-4c72-a86c-d0128e548d92\") " pod="openshift-ovn-kubernetes/ovnkube-node-m765d" Feb 26 15:56:25 crc kubenswrapper[4907]: I0226 15:56:25.797518 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/d1ce2849-824c-4c72-a86c-d0128e548d92-var-lib-openvswitch\") pod \"ovnkube-node-m765d\" (UID: \"d1ce2849-824c-4c72-a86c-d0128e548d92\") " pod="openshift-ovn-kubernetes/ovnkube-node-m765d" Feb 26 15:56:25 crc kubenswrapper[4907]: I0226 15:56:25.797546 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/d1ce2849-824c-4c72-a86c-d0128e548d92-run-openvswitch\") pod \"ovnkube-node-m765d\" (UID: \"d1ce2849-824c-4c72-a86c-d0128e548d92\") " pod="openshift-ovn-kubernetes/ovnkube-node-m765d" Feb 26 15:56:25 crc kubenswrapper[4907]: I0226 15:56:25.797566 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/d1ce2849-824c-4c72-a86c-d0128e548d92-host-cni-netd\") pod \"ovnkube-node-m765d\" (UID: \"d1ce2849-824c-4c72-a86c-d0128e548d92\") " pod="openshift-ovn-kubernetes/ovnkube-node-m765d" Feb 26 15:56:25 crc kubenswrapper[4907]: I0226 15:56:25.797574 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/d1ce2849-824c-4c72-a86c-d0128e548d92-host-run-netns\") pod \"ovnkube-node-m765d\" (UID: \"d1ce2849-824c-4c72-a86c-d0128e548d92\") " pod="openshift-ovn-kubernetes/ovnkube-node-m765d" Feb 26 15:56:25 crc 
kubenswrapper[4907]: I0226 15:56:25.797658 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/d1ce2849-824c-4c72-a86c-d0128e548d92-run-systemd\") pod \"ovnkube-node-m765d\" (UID: \"d1ce2849-824c-4c72-a86c-d0128e548d92\") " pod="openshift-ovn-kubernetes/ovnkube-node-m765d" Feb 26 15:56:25 crc kubenswrapper[4907]: I0226 15:56:25.797687 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/d1ce2849-824c-4c72-a86c-d0128e548d92-run-openvswitch\") pod \"ovnkube-node-m765d\" (UID: \"d1ce2849-824c-4c72-a86c-d0128e548d92\") " pod="openshift-ovn-kubernetes/ovnkube-node-m765d" Feb 26 15:56:25 crc kubenswrapper[4907]: I0226 15:56:25.797689 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/d1ce2849-824c-4c72-a86c-d0128e548d92-host-cni-netd\") pod \"ovnkube-node-m765d\" (UID: \"d1ce2849-824c-4c72-a86c-d0128e548d92\") " pod="openshift-ovn-kubernetes/ovnkube-node-m765d" Feb 26 15:56:25 crc kubenswrapper[4907]: I0226 15:56:25.797608 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/d1ce2849-824c-4c72-a86c-d0128e548d92-run-systemd\") pod \"ovnkube-node-m765d\" (UID: \"d1ce2849-824c-4c72-a86c-d0128e548d92\") " pod="openshift-ovn-kubernetes/ovnkube-node-m765d" Feb 26 15:56:25 crc kubenswrapper[4907]: I0226 15:56:25.797749 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/d1ce2849-824c-4c72-a86c-d0128e548d92-host-slash\") pod \"ovnkube-node-m765d\" (UID: \"d1ce2849-824c-4c72-a86c-d0128e548d92\") " pod="openshift-ovn-kubernetes/ovnkube-node-m765d" Feb 26 15:56:25 crc kubenswrapper[4907]: I0226 15:56:25.797770 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"log-socket\" (UniqueName: \"kubernetes.io/host-path/d1ce2849-824c-4c72-a86c-d0128e548d92-log-socket\") pod \"ovnkube-node-m765d\" (UID: \"d1ce2849-824c-4c72-a86c-d0128e548d92\") " pod="openshift-ovn-kubernetes/ovnkube-node-m765d" Feb 26 15:56:25 crc kubenswrapper[4907]: I0226 15:56:25.797817 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/d1ce2849-824c-4c72-a86c-d0128e548d92-run-ovn\") pod \"ovnkube-node-m765d\" (UID: \"d1ce2849-824c-4c72-a86c-d0128e548d92\") " pod="openshift-ovn-kubernetes/ovnkube-node-m765d" Feb 26 15:56:25 crc kubenswrapper[4907]: I0226 15:56:25.797849 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/d1ce2849-824c-4c72-a86c-d0128e548d92-env-overrides\") pod \"ovnkube-node-m765d\" (UID: \"d1ce2849-824c-4c72-a86c-d0128e548d92\") " pod="openshift-ovn-kubernetes/ovnkube-node-m765d" Feb 26 15:56:25 crc kubenswrapper[4907]: I0226 15:56:25.797885 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/d1ce2849-824c-4c72-a86c-d0128e548d92-ovn-node-metrics-cert\") pod \"ovnkube-node-m765d\" (UID: \"d1ce2849-824c-4c72-a86c-d0128e548d92\") " pod="openshift-ovn-kubernetes/ovnkube-node-m765d" Feb 26 15:56:25 crc kubenswrapper[4907]: I0226 15:56:25.797919 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/d1ce2849-824c-4c72-a86c-d0128e548d92-ovnkube-script-lib\") pod \"ovnkube-node-m765d\" (UID: \"d1ce2849-824c-4c72-a86c-d0128e548d92\") " pod="openshift-ovn-kubernetes/ovnkube-node-m765d" Feb 26 15:56:25 crc kubenswrapper[4907]: I0226 15:56:25.797964 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: 
\"kubernetes.io/host-path/d1ce2849-824c-4c72-a86c-d0128e548d92-host-run-ovn-kubernetes\") pod \"ovnkube-node-m765d\" (UID: \"d1ce2849-824c-4c72-a86c-d0128e548d92\") " pod="openshift-ovn-kubernetes/ovnkube-node-m765d" Feb 26 15:56:25 crc kubenswrapper[4907]: I0226 15:56:25.797999 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/d1ce2849-824c-4c72-a86c-d0128e548d92-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-m765d\" (UID: \"d1ce2849-824c-4c72-a86c-d0128e548d92\") " pod="openshift-ovn-kubernetes/ovnkube-node-m765d" Feb 26 15:56:25 crc kubenswrapper[4907]: I0226 15:56:25.798025 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/d1ce2849-824c-4c72-a86c-d0128e548d92-host-cni-bin\") pod \"ovnkube-node-m765d\" (UID: \"d1ce2849-824c-4c72-a86c-d0128e548d92\") " pod="openshift-ovn-kubernetes/ovnkube-node-m765d" Feb 26 15:56:25 crc kubenswrapper[4907]: I0226 15:56:25.798061 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/d1ce2849-824c-4c72-a86c-d0128e548d92-ovnkube-config\") pod \"ovnkube-node-m765d\" (UID: \"d1ce2849-824c-4c72-a86c-d0128e548d92\") " pod="openshift-ovn-kubernetes/ovnkube-node-m765d" Feb 26 15:56:25 crc kubenswrapper[4907]: I0226 15:56:25.798092 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/d1ce2849-824c-4c72-a86c-d0128e548d92-systemd-units\") pod \"ovnkube-node-m765d\" (UID: \"d1ce2849-824c-4c72-a86c-d0128e548d92\") " pod="openshift-ovn-kubernetes/ovnkube-node-m765d" Feb 26 15:56:25 crc kubenswrapper[4907]: I0226 15:56:25.798118 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mfs5w\" (UniqueName: 
\"kubernetes.io/projected/d1ce2849-824c-4c72-a86c-d0128e548d92-kube-api-access-mfs5w\") pod \"ovnkube-node-m765d\" (UID: \"d1ce2849-824c-4c72-a86c-d0128e548d92\") " pod="openshift-ovn-kubernetes/ovnkube-node-m765d" Feb 26 15:56:25 crc kubenswrapper[4907]: I0226 15:56:25.798174 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/d1ce2849-824c-4c72-a86c-d0128e548d92-node-log\") pod \"ovnkube-node-m765d\" (UID: \"d1ce2849-824c-4c72-a86c-d0128e548d92\") " pod="openshift-ovn-kubernetes/ovnkube-node-m765d" Feb 26 15:56:25 crc kubenswrapper[4907]: I0226 15:56:25.798210 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/d1ce2849-824c-4c72-a86c-d0128e548d92-host-kubelet\") pod \"ovnkube-node-m765d\" (UID: \"d1ce2849-824c-4c72-a86c-d0128e548d92\") " pod="openshift-ovn-kubernetes/ovnkube-node-m765d" Feb 26 15:56:25 crc kubenswrapper[4907]: I0226 15:56:25.798237 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/d1ce2849-824c-4c72-a86c-d0128e548d92-etc-openvswitch\") pod \"ovnkube-node-m765d\" (UID: \"d1ce2849-824c-4c72-a86c-d0128e548d92\") " pod="openshift-ovn-kubernetes/ovnkube-node-m765d" Feb 26 15:56:25 crc kubenswrapper[4907]: I0226 15:56:25.798304 4907 reconciler_common.go:293] "Volume detached for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/49ee65e1-8667-4ad7-a403-c899f0cc6a70-host-cni-bin\") on node \"crc\" DevicePath \"\"" Feb 26 15:56:25 crc kubenswrapper[4907]: I0226 15:56:25.798320 4907 reconciler_common.go:293] "Volume detached for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/49ee65e1-8667-4ad7-a403-c899f0cc6a70-host-kubelet\") on node \"crc\" DevicePath \"\"" Feb 26 15:56:25 crc kubenswrapper[4907]: I0226 15:56:25.798333 4907 reconciler_common.go:293] "Volume detached for 
volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/49ee65e1-8667-4ad7-a403-c899f0cc6a70-ovnkube-script-lib\") on node \"crc\" DevicePath \"\"" Feb 26 15:56:25 crc kubenswrapper[4907]: I0226 15:56:25.798361 4907 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7hmb7\" (UniqueName: \"kubernetes.io/projected/49ee65e1-8667-4ad7-a403-c899f0cc6a70-kube-api-access-7hmb7\") on node \"crc\" DevicePath \"\"" Feb 26 15:56:25 crc kubenswrapper[4907]: I0226 15:56:25.798373 4907 reconciler_common.go:293] "Volume detached for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/49ee65e1-8667-4ad7-a403-c899f0cc6a70-ovnkube-config\") on node \"crc\" DevicePath \"\"" Feb 26 15:56:25 crc kubenswrapper[4907]: I0226 15:56:25.798384 4907 reconciler_common.go:293] "Volume detached for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/49ee65e1-8667-4ad7-a403-c899f0cc6a70-var-lib-openvswitch\") on node \"crc\" DevicePath \"\"" Feb 26 15:56:25 crc kubenswrapper[4907]: I0226 15:56:25.798396 4907 reconciler_common.go:293] "Volume detached for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/49ee65e1-8667-4ad7-a403-c899f0cc6a70-host-run-netns\") on node \"crc\" DevicePath \"\"" Feb 26 15:56:25 crc kubenswrapper[4907]: I0226 15:56:25.798406 4907 reconciler_common.go:293] "Volume detached for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/49ee65e1-8667-4ad7-a403-c899f0cc6a70-run-ovn\") on node \"crc\" DevicePath \"\"" Feb 26 15:56:25 crc kubenswrapper[4907]: I0226 15:56:25.798417 4907 reconciler_common.go:293] "Volume detached for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/49ee65e1-8667-4ad7-a403-c899f0cc6a70-etc-openvswitch\") on node \"crc\" DevicePath \"\"" Feb 26 15:56:25 crc kubenswrapper[4907]: I0226 15:56:25.798428 4907 reconciler_common.go:293] "Volume detached for volume \"host-run-ovn-kubernetes\" (UniqueName: 
\"kubernetes.io/host-path/49ee65e1-8667-4ad7-a403-c899f0cc6a70-host-run-ovn-kubernetes\") on node \"crc\" DevicePath \"\"" Feb 26 15:56:25 crc kubenswrapper[4907]: I0226 15:56:25.798439 4907 reconciler_common.go:293] "Volume detached for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/49ee65e1-8667-4ad7-a403-c899f0cc6a70-log-socket\") on node \"crc\" DevicePath \"\"" Feb 26 15:56:25 crc kubenswrapper[4907]: I0226 15:56:25.798449 4907 reconciler_common.go:293] "Volume detached for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/49ee65e1-8667-4ad7-a403-c899f0cc6a70-systemd-units\") on node \"crc\" DevicePath \"\"" Feb 26 15:56:25 crc kubenswrapper[4907]: I0226 15:56:25.798473 4907 reconciler_common.go:293] "Volume detached for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/49ee65e1-8667-4ad7-a403-c899f0cc6a70-env-overrides\") on node \"crc\" DevicePath \"\"" Feb 26 15:56:25 crc kubenswrapper[4907]: I0226 15:56:25.798486 4907 reconciler_common.go:293] "Volume detached for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/49ee65e1-8667-4ad7-a403-c899f0cc6a70-node-log\") on node \"crc\" DevicePath \"\"" Feb 26 15:56:25 crc kubenswrapper[4907]: I0226 15:56:25.798497 4907 reconciler_common.go:293] "Volume detached for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/49ee65e1-8667-4ad7-a403-c899f0cc6a70-host-slash\") on node \"crc\" DevicePath \"\"" Feb 26 15:56:25 crc kubenswrapper[4907]: I0226 15:56:25.798509 4907 reconciler_common.go:293] "Volume detached for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/49ee65e1-8667-4ad7-a403-c899f0cc6a70-run-systemd\") on node \"crc\" DevicePath \"\"" Feb 26 15:56:25 crc kubenswrapper[4907]: I0226 15:56:25.798520 4907 reconciler_common.go:293] "Volume detached for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/49ee65e1-8667-4ad7-a403-c899f0cc6a70-host-var-lib-cni-networks-ovn-kubernetes\") on node \"crc\" 
DevicePath \"\"" Feb 26 15:56:25 crc kubenswrapper[4907]: I0226 15:56:25.798534 4907 reconciler_common.go:293] "Volume detached for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/49ee65e1-8667-4ad7-a403-c899f0cc6a70-run-openvswitch\") on node \"crc\" DevicePath \"\"" Feb 26 15:56:25 crc kubenswrapper[4907]: I0226 15:56:25.798546 4907 reconciler_common.go:293] "Volume detached for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/49ee65e1-8667-4ad7-a403-c899f0cc6a70-ovn-node-metrics-cert\") on node \"crc\" DevicePath \"\"" Feb 26 15:56:25 crc kubenswrapper[4907]: I0226 15:56:25.798558 4907 reconciler_common.go:293] "Volume detached for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/49ee65e1-8667-4ad7-a403-c899f0cc6a70-host-cni-netd\") on node \"crc\" DevicePath \"\"" Feb 26 15:56:25 crc kubenswrapper[4907]: I0226 15:56:25.798615 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/d1ce2849-824c-4c72-a86c-d0128e548d92-etc-openvswitch\") pod \"ovnkube-node-m765d\" (UID: \"d1ce2849-824c-4c72-a86c-d0128e548d92\") " pod="openshift-ovn-kubernetes/ovnkube-node-m765d" Feb 26 15:56:25 crc kubenswrapper[4907]: I0226 15:56:25.797657 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/d1ce2849-824c-4c72-a86c-d0128e548d92-var-lib-openvswitch\") pod \"ovnkube-node-m765d\" (UID: \"d1ce2849-824c-4c72-a86c-d0128e548d92\") " pod="openshift-ovn-kubernetes/ovnkube-node-m765d" Feb 26 15:56:25 crc kubenswrapper[4907]: I0226 15:56:25.798674 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/d1ce2849-824c-4c72-a86c-d0128e548d92-host-slash\") pod \"ovnkube-node-m765d\" (UID: \"d1ce2849-824c-4c72-a86c-d0128e548d92\") " pod="openshift-ovn-kubernetes/ovnkube-node-m765d" Feb 26 15:56:25 crc 
kubenswrapper[4907]: I0226 15:56:25.798705 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/d1ce2849-824c-4c72-a86c-d0128e548d92-log-socket\") pod \"ovnkube-node-m765d\" (UID: \"d1ce2849-824c-4c72-a86c-d0128e548d92\") " pod="openshift-ovn-kubernetes/ovnkube-node-m765d" Feb 26 15:56:25 crc kubenswrapper[4907]: I0226 15:56:25.798734 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/d1ce2849-824c-4c72-a86c-d0128e548d92-run-ovn\") pod \"ovnkube-node-m765d\" (UID: \"d1ce2849-824c-4c72-a86c-d0128e548d92\") " pod="openshift-ovn-kubernetes/ovnkube-node-m765d" Feb 26 15:56:25 crc kubenswrapper[4907]: I0226 15:56:25.799447 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/d1ce2849-824c-4c72-a86c-d0128e548d92-env-overrides\") pod \"ovnkube-node-m765d\" (UID: \"d1ce2849-824c-4c72-a86c-d0128e548d92\") " pod="openshift-ovn-kubernetes/ovnkube-node-m765d" Feb 26 15:56:25 crc kubenswrapper[4907]: I0226 15:56:25.799945 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/d1ce2849-824c-4c72-a86c-d0128e548d92-host-run-ovn-kubernetes\") pod \"ovnkube-node-m765d\" (UID: \"d1ce2849-824c-4c72-a86c-d0128e548d92\") " pod="openshift-ovn-kubernetes/ovnkube-node-m765d" Feb 26 15:56:25 crc kubenswrapper[4907]: I0226 15:56:25.799994 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/d1ce2849-824c-4c72-a86c-d0128e548d92-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-m765d\" (UID: \"d1ce2849-824c-4c72-a86c-d0128e548d92\") " pod="openshift-ovn-kubernetes/ovnkube-node-m765d" Feb 26 15:56:25 crc kubenswrapper[4907]: I0226 15:56:25.800153 4907 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/d1ce2849-824c-4c72-a86c-d0128e548d92-systemd-units\") pod \"ovnkube-node-m765d\" (UID: \"d1ce2849-824c-4c72-a86c-d0128e548d92\") " pod="openshift-ovn-kubernetes/ovnkube-node-m765d" Feb 26 15:56:25 crc kubenswrapper[4907]: I0226 15:56:25.800157 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/d1ce2849-824c-4c72-a86c-d0128e548d92-host-cni-bin\") pod \"ovnkube-node-m765d\" (UID: \"d1ce2849-824c-4c72-a86c-d0128e548d92\") " pod="openshift-ovn-kubernetes/ovnkube-node-m765d" Feb 26 15:56:25 crc kubenswrapper[4907]: I0226 15:56:25.800189 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/d1ce2849-824c-4c72-a86c-d0128e548d92-host-kubelet\") pod \"ovnkube-node-m765d\" (UID: \"d1ce2849-824c-4c72-a86c-d0128e548d92\") " pod="openshift-ovn-kubernetes/ovnkube-node-m765d" Feb 26 15:56:25 crc kubenswrapper[4907]: I0226 15:56:25.800354 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/d1ce2849-824c-4c72-a86c-d0128e548d92-ovnkube-config\") pod \"ovnkube-node-m765d\" (UID: \"d1ce2849-824c-4c72-a86c-d0128e548d92\") " pod="openshift-ovn-kubernetes/ovnkube-node-m765d" Feb 26 15:56:25 crc kubenswrapper[4907]: I0226 15:56:25.800358 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/d1ce2849-824c-4c72-a86c-d0128e548d92-node-log\") pod \"ovnkube-node-m765d\" (UID: \"d1ce2849-824c-4c72-a86c-d0128e548d92\") " pod="openshift-ovn-kubernetes/ovnkube-node-m765d" Feb 26 15:56:25 crc kubenswrapper[4907]: I0226 15:56:25.800636 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-script-lib\" (UniqueName: 
\"kubernetes.io/configmap/d1ce2849-824c-4c72-a86c-d0128e548d92-ovnkube-script-lib\") pod \"ovnkube-node-m765d\" (UID: \"d1ce2849-824c-4c72-a86c-d0128e548d92\") " pod="openshift-ovn-kubernetes/ovnkube-node-m765d" Feb 26 15:56:25 crc kubenswrapper[4907]: I0226 15:56:25.803844 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/d1ce2849-824c-4c72-a86c-d0128e548d92-ovn-node-metrics-cert\") pod \"ovnkube-node-m765d\" (UID: \"d1ce2849-824c-4c72-a86c-d0128e548d92\") " pod="openshift-ovn-kubernetes/ovnkube-node-m765d" Feb 26 15:56:25 crc kubenswrapper[4907]: I0226 15:56:25.819316 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mfs5w\" (UniqueName: \"kubernetes.io/projected/d1ce2849-824c-4c72-a86c-d0128e548d92-kube-api-access-mfs5w\") pod \"ovnkube-node-m765d\" (UID: \"d1ce2849-824c-4c72-a86c-d0128e548d92\") " pod="openshift-ovn-kubernetes/ovnkube-node-m765d" Feb 26 15:56:25 crc kubenswrapper[4907]: I0226 15:56:25.976056 4907 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-m765d" Feb 26 15:56:25 crc kubenswrapper[4907]: W0226 15:56:25.993229 4907 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd1ce2849_824c_4c72_a86c_d0128e548d92.slice/crio-971364e40aed20f012b1a0952483c4807d071f30d3c3e5478687eeb2a8f3fda2 WatchSource:0}: Error finding container 971364e40aed20f012b1a0952483c4807d071f30d3c3e5478687eeb2a8f3fda2: Status 404 returned error can't find the container with id 971364e40aed20f012b1a0952483c4807d071f30d3c3e5478687eeb2a8f3fda2 Feb 26 15:56:26 crc kubenswrapper[4907]: I0226 15:56:26.533630 4907 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-2gl5t_51024bd5-00ff-4e2f-927c-8c989b59d7be/kube-multus/2.log" Feb 26 15:56:26 crc kubenswrapper[4907]: I0226 15:56:26.533730 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-2gl5t" event={"ID":"51024bd5-00ff-4e2f-927c-8c989b59d7be","Type":"ContainerStarted","Data":"a275f2d51b9d89d5c170f0e1bf65ade004b9b130bdb459f6e41834eb6a1a7544"} Feb 26 15:56:26 crc kubenswrapper[4907]: I0226 15:56:26.539792 4907 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-vsvsw_49ee65e1-8667-4ad7-a403-c899f0cc6a70/ovn-acl-logging/0.log" Feb 26 15:56:26 crc kubenswrapper[4907]: I0226 15:56:26.540858 4907 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-vsvsw_49ee65e1-8667-4ad7-a403-c899f0cc6a70/ovn-controller/0.log" Feb 26 15:56:26 crc kubenswrapper[4907]: I0226 15:56:26.541389 4907 generic.go:334] "Generic (PLEG): container finished" podID="49ee65e1-8667-4ad7-a403-c899f0cc6a70" containerID="800657f54374550b21f96594e9c9ce4e7dff28c5c09061192a95bb8a668ebbea" exitCode=0 Feb 26 15:56:26 crc kubenswrapper[4907]: I0226 15:56:26.541419 4907 generic.go:334] "Generic (PLEG): container finished" 
podID="49ee65e1-8667-4ad7-a403-c899f0cc6a70" containerID="9e7470d80d872846d4d91e9070becfa3496dca8af1b315e637c34edce0dcd57b" exitCode=0 Feb 26 15:56:26 crc kubenswrapper[4907]: I0226 15:56:26.541483 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-vsvsw" event={"ID":"49ee65e1-8667-4ad7-a403-c899f0cc6a70","Type":"ContainerDied","Data":"800657f54374550b21f96594e9c9ce4e7dff28c5c09061192a95bb8a668ebbea"} Feb 26 15:56:26 crc kubenswrapper[4907]: I0226 15:56:26.541509 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-vsvsw" event={"ID":"49ee65e1-8667-4ad7-a403-c899f0cc6a70","Type":"ContainerDied","Data":"9e7470d80d872846d4d91e9070becfa3496dca8af1b315e637c34edce0dcd57b"} Feb 26 15:56:26 crc kubenswrapper[4907]: I0226 15:56:26.541521 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-vsvsw" event={"ID":"49ee65e1-8667-4ad7-a403-c899f0cc6a70","Type":"ContainerDied","Data":"aaeb89d604bd4111c33ef90df0a3c2bc5e324f383e85eb567b6b409b8ed966d8"} Feb 26 15:56:26 crc kubenswrapper[4907]: I0226 15:56:26.541538 4907 scope.go:117] "RemoveContainer" containerID="117dc082982a8b3a3318c864792eff748b564107aeddf5f1ef19f61923a7e1d3" Feb 26 15:56:26 crc kubenswrapper[4907]: I0226 15:56:26.541804 4907 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-vsvsw" Feb 26 15:56:26 crc kubenswrapper[4907]: I0226 15:56:26.545896 4907 generic.go:334] "Generic (PLEG): container finished" podID="d1ce2849-824c-4c72-a86c-d0128e548d92" containerID="732b68bda2bff15ccf67f1a7088f4b17341a01ecdcacda780e6e8cf0cbb86c01" exitCode=0 Feb 26 15:56:26 crc kubenswrapper[4907]: I0226 15:56:26.546019 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-m765d" event={"ID":"d1ce2849-824c-4c72-a86c-d0128e548d92","Type":"ContainerDied","Data":"732b68bda2bff15ccf67f1a7088f4b17341a01ecdcacda780e6e8cf0cbb86c01"} Feb 26 15:56:26 crc kubenswrapper[4907]: I0226 15:56:26.546098 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-m765d" event={"ID":"d1ce2849-824c-4c72-a86c-d0128e548d92","Type":"ContainerStarted","Data":"971364e40aed20f012b1a0952483c4807d071f30d3c3e5478687eeb2a8f3fda2"} Feb 26 15:56:26 crc kubenswrapper[4907]: I0226 15:56:26.573309 4907 scope.go:117] "RemoveContainer" containerID="cc2b19d04bf2ef1455fa049ed09ef927305f1ec89b19b42f39b0d8c1397f69df" Feb 26 15:56:26 crc kubenswrapper[4907]: I0226 15:56:26.595308 4907 scope.go:117] "RemoveContainer" containerID="800657f54374550b21f96594e9c9ce4e7dff28c5c09061192a95bb8a668ebbea" Feb 26 15:56:26 crc kubenswrapper[4907]: I0226 15:56:26.645263 4907 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-ovn-kubernetes/ovnkube-node-vsvsw"] Feb 26 15:56:26 crc kubenswrapper[4907]: I0226 15:56:26.651516 4907 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-ovn-kubernetes/ovnkube-node-vsvsw"] Feb 26 15:56:26 crc kubenswrapper[4907]: I0226 15:56:26.659282 4907 scope.go:117] "RemoveContainer" containerID="9e7470d80d872846d4d91e9070becfa3496dca8af1b315e637c34edce0dcd57b" Feb 26 15:56:26 crc kubenswrapper[4907]: I0226 15:56:26.680528 4907 scope.go:117] "RemoveContainer" 
containerID="67439cebe8e10e13db8af6bc74e152eb562382fb3b2f026ba3cbfe42e3b4c921" Feb 26 15:56:26 crc kubenswrapper[4907]: I0226 15:56:26.694511 4907 scope.go:117] "RemoveContainer" containerID="c70ed6854442dfb329171dc5c454c036c020cb91e1f6595eb3fbe2d95704d52d" Feb 26 15:56:26 crc kubenswrapper[4907]: I0226 15:56:26.707942 4907 scope.go:117] "RemoveContainer" containerID="17760db3d112b908ad1389e3c28c244e756ef06ec2b4f170e4f52e17f9a75a89" Feb 26 15:56:26 crc kubenswrapper[4907]: I0226 15:56:26.720671 4907 scope.go:117] "RemoveContainer" containerID="eca4b7a72754f7457c608969c5319a498c526ab128b28400d2aed5d0413ff487" Feb 26 15:56:26 crc kubenswrapper[4907]: I0226 15:56:26.732842 4907 scope.go:117] "RemoveContainer" containerID="b7621667d7c9c119893fe930093d4e1d2256a13aadc196023df28d1a78aef68c" Feb 26 15:56:26 crc kubenswrapper[4907]: I0226 15:56:26.762671 4907 scope.go:117] "RemoveContainer" containerID="117dc082982a8b3a3318c864792eff748b564107aeddf5f1ef19f61923a7e1d3" Feb 26 15:56:26 crc kubenswrapper[4907]: E0226 15:56:26.763410 4907 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"117dc082982a8b3a3318c864792eff748b564107aeddf5f1ef19f61923a7e1d3\": container with ID starting with 117dc082982a8b3a3318c864792eff748b564107aeddf5f1ef19f61923a7e1d3 not found: ID does not exist" containerID="117dc082982a8b3a3318c864792eff748b564107aeddf5f1ef19f61923a7e1d3" Feb 26 15:56:26 crc kubenswrapper[4907]: I0226 15:56:26.763441 4907 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"117dc082982a8b3a3318c864792eff748b564107aeddf5f1ef19f61923a7e1d3"} err="failed to get container status \"117dc082982a8b3a3318c864792eff748b564107aeddf5f1ef19f61923a7e1d3\": rpc error: code = NotFound desc = could not find container \"117dc082982a8b3a3318c864792eff748b564107aeddf5f1ef19f61923a7e1d3\": container with ID starting with 
117dc082982a8b3a3318c864792eff748b564107aeddf5f1ef19f61923a7e1d3 not found: ID does not exist" Feb 26 15:56:26 crc kubenswrapper[4907]: I0226 15:56:26.763460 4907 scope.go:117] "RemoveContainer" containerID="cc2b19d04bf2ef1455fa049ed09ef927305f1ec89b19b42f39b0d8c1397f69df" Feb 26 15:56:26 crc kubenswrapper[4907]: E0226 15:56:26.763929 4907 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"cc2b19d04bf2ef1455fa049ed09ef927305f1ec89b19b42f39b0d8c1397f69df\": container with ID starting with cc2b19d04bf2ef1455fa049ed09ef927305f1ec89b19b42f39b0d8c1397f69df not found: ID does not exist" containerID="cc2b19d04bf2ef1455fa049ed09ef927305f1ec89b19b42f39b0d8c1397f69df" Feb 26 15:56:26 crc kubenswrapper[4907]: I0226 15:56:26.763950 4907 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"cc2b19d04bf2ef1455fa049ed09ef927305f1ec89b19b42f39b0d8c1397f69df"} err="failed to get container status \"cc2b19d04bf2ef1455fa049ed09ef927305f1ec89b19b42f39b0d8c1397f69df\": rpc error: code = NotFound desc = could not find container \"cc2b19d04bf2ef1455fa049ed09ef927305f1ec89b19b42f39b0d8c1397f69df\": container with ID starting with cc2b19d04bf2ef1455fa049ed09ef927305f1ec89b19b42f39b0d8c1397f69df not found: ID does not exist" Feb 26 15:56:26 crc kubenswrapper[4907]: I0226 15:56:26.763962 4907 scope.go:117] "RemoveContainer" containerID="800657f54374550b21f96594e9c9ce4e7dff28c5c09061192a95bb8a668ebbea" Feb 26 15:56:26 crc kubenswrapper[4907]: E0226 15:56:26.764295 4907 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"800657f54374550b21f96594e9c9ce4e7dff28c5c09061192a95bb8a668ebbea\": container with ID starting with 800657f54374550b21f96594e9c9ce4e7dff28c5c09061192a95bb8a668ebbea not found: ID does not exist" containerID="800657f54374550b21f96594e9c9ce4e7dff28c5c09061192a95bb8a668ebbea" Feb 26 15:56:26 crc 
kubenswrapper[4907]: I0226 15:56:26.764311 4907 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"800657f54374550b21f96594e9c9ce4e7dff28c5c09061192a95bb8a668ebbea"} err="failed to get container status \"800657f54374550b21f96594e9c9ce4e7dff28c5c09061192a95bb8a668ebbea\": rpc error: code = NotFound desc = could not find container \"800657f54374550b21f96594e9c9ce4e7dff28c5c09061192a95bb8a668ebbea\": container with ID starting with 800657f54374550b21f96594e9c9ce4e7dff28c5c09061192a95bb8a668ebbea not found: ID does not exist" Feb 26 15:56:26 crc kubenswrapper[4907]: I0226 15:56:26.764327 4907 scope.go:117] "RemoveContainer" containerID="9e7470d80d872846d4d91e9070becfa3496dca8af1b315e637c34edce0dcd57b" Feb 26 15:56:26 crc kubenswrapper[4907]: E0226 15:56:26.764542 4907 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"9e7470d80d872846d4d91e9070becfa3496dca8af1b315e637c34edce0dcd57b\": container with ID starting with 9e7470d80d872846d4d91e9070becfa3496dca8af1b315e637c34edce0dcd57b not found: ID does not exist" containerID="9e7470d80d872846d4d91e9070becfa3496dca8af1b315e637c34edce0dcd57b" Feb 26 15:56:26 crc kubenswrapper[4907]: I0226 15:56:26.764559 4907 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9e7470d80d872846d4d91e9070becfa3496dca8af1b315e637c34edce0dcd57b"} err="failed to get container status \"9e7470d80d872846d4d91e9070becfa3496dca8af1b315e637c34edce0dcd57b\": rpc error: code = NotFound desc = could not find container \"9e7470d80d872846d4d91e9070becfa3496dca8af1b315e637c34edce0dcd57b\": container with ID starting with 9e7470d80d872846d4d91e9070becfa3496dca8af1b315e637c34edce0dcd57b not found: ID does not exist" Feb 26 15:56:26 crc kubenswrapper[4907]: I0226 15:56:26.764570 4907 scope.go:117] "RemoveContainer" containerID="67439cebe8e10e13db8af6bc74e152eb562382fb3b2f026ba3cbfe42e3b4c921" Feb 26 
15:56:26 crc kubenswrapper[4907]: E0226 15:56:26.764923 4907 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"67439cebe8e10e13db8af6bc74e152eb562382fb3b2f026ba3cbfe42e3b4c921\": container with ID starting with 67439cebe8e10e13db8af6bc74e152eb562382fb3b2f026ba3cbfe42e3b4c921 not found: ID does not exist" containerID="67439cebe8e10e13db8af6bc74e152eb562382fb3b2f026ba3cbfe42e3b4c921" Feb 26 15:56:26 crc kubenswrapper[4907]: I0226 15:56:26.764948 4907 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"67439cebe8e10e13db8af6bc74e152eb562382fb3b2f026ba3cbfe42e3b4c921"} err="failed to get container status \"67439cebe8e10e13db8af6bc74e152eb562382fb3b2f026ba3cbfe42e3b4c921\": rpc error: code = NotFound desc = could not find container \"67439cebe8e10e13db8af6bc74e152eb562382fb3b2f026ba3cbfe42e3b4c921\": container with ID starting with 67439cebe8e10e13db8af6bc74e152eb562382fb3b2f026ba3cbfe42e3b4c921 not found: ID does not exist" Feb 26 15:56:26 crc kubenswrapper[4907]: I0226 15:56:26.764960 4907 scope.go:117] "RemoveContainer" containerID="c70ed6854442dfb329171dc5c454c036c020cb91e1f6595eb3fbe2d95704d52d" Feb 26 15:56:26 crc kubenswrapper[4907]: E0226 15:56:26.765170 4907 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c70ed6854442dfb329171dc5c454c036c020cb91e1f6595eb3fbe2d95704d52d\": container with ID starting with c70ed6854442dfb329171dc5c454c036c020cb91e1f6595eb3fbe2d95704d52d not found: ID does not exist" containerID="c70ed6854442dfb329171dc5c454c036c020cb91e1f6595eb3fbe2d95704d52d" Feb 26 15:56:26 crc kubenswrapper[4907]: I0226 15:56:26.765187 4907 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c70ed6854442dfb329171dc5c454c036c020cb91e1f6595eb3fbe2d95704d52d"} err="failed to get container status 
\"c70ed6854442dfb329171dc5c454c036c020cb91e1f6595eb3fbe2d95704d52d\": rpc error: code = NotFound desc = could not find container \"c70ed6854442dfb329171dc5c454c036c020cb91e1f6595eb3fbe2d95704d52d\": container with ID starting with c70ed6854442dfb329171dc5c454c036c020cb91e1f6595eb3fbe2d95704d52d not found: ID does not exist" Feb 26 15:56:26 crc kubenswrapper[4907]: I0226 15:56:26.765197 4907 scope.go:117] "RemoveContainer" containerID="17760db3d112b908ad1389e3c28c244e756ef06ec2b4f170e4f52e17f9a75a89" Feb 26 15:56:26 crc kubenswrapper[4907]: E0226 15:56:26.765362 4907 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"17760db3d112b908ad1389e3c28c244e756ef06ec2b4f170e4f52e17f9a75a89\": container with ID starting with 17760db3d112b908ad1389e3c28c244e756ef06ec2b4f170e4f52e17f9a75a89 not found: ID does not exist" containerID="17760db3d112b908ad1389e3c28c244e756ef06ec2b4f170e4f52e17f9a75a89" Feb 26 15:56:26 crc kubenswrapper[4907]: I0226 15:56:26.765377 4907 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"17760db3d112b908ad1389e3c28c244e756ef06ec2b4f170e4f52e17f9a75a89"} err="failed to get container status \"17760db3d112b908ad1389e3c28c244e756ef06ec2b4f170e4f52e17f9a75a89\": rpc error: code = NotFound desc = could not find container \"17760db3d112b908ad1389e3c28c244e756ef06ec2b4f170e4f52e17f9a75a89\": container with ID starting with 17760db3d112b908ad1389e3c28c244e756ef06ec2b4f170e4f52e17f9a75a89 not found: ID does not exist" Feb 26 15:56:26 crc kubenswrapper[4907]: I0226 15:56:26.765388 4907 scope.go:117] "RemoveContainer" containerID="eca4b7a72754f7457c608969c5319a498c526ab128b28400d2aed5d0413ff487" Feb 26 15:56:26 crc kubenswrapper[4907]: E0226 15:56:26.765561 4907 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container 
\"eca4b7a72754f7457c608969c5319a498c526ab128b28400d2aed5d0413ff487\": container with ID starting with eca4b7a72754f7457c608969c5319a498c526ab128b28400d2aed5d0413ff487 not found: ID does not exist" containerID="eca4b7a72754f7457c608969c5319a498c526ab128b28400d2aed5d0413ff487" Feb 26 15:56:26 crc kubenswrapper[4907]: I0226 15:56:26.765579 4907 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"eca4b7a72754f7457c608969c5319a498c526ab128b28400d2aed5d0413ff487"} err="failed to get container status \"eca4b7a72754f7457c608969c5319a498c526ab128b28400d2aed5d0413ff487\": rpc error: code = NotFound desc = could not find container \"eca4b7a72754f7457c608969c5319a498c526ab128b28400d2aed5d0413ff487\": container with ID starting with eca4b7a72754f7457c608969c5319a498c526ab128b28400d2aed5d0413ff487 not found: ID does not exist" Feb 26 15:56:26 crc kubenswrapper[4907]: I0226 15:56:26.765631 4907 scope.go:117] "RemoveContainer" containerID="b7621667d7c9c119893fe930093d4e1d2256a13aadc196023df28d1a78aef68c" Feb 26 15:56:26 crc kubenswrapper[4907]: E0226 15:56:26.765870 4907 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b7621667d7c9c119893fe930093d4e1d2256a13aadc196023df28d1a78aef68c\": container with ID starting with b7621667d7c9c119893fe930093d4e1d2256a13aadc196023df28d1a78aef68c not found: ID does not exist" containerID="b7621667d7c9c119893fe930093d4e1d2256a13aadc196023df28d1a78aef68c" Feb 26 15:56:26 crc kubenswrapper[4907]: I0226 15:56:26.765886 4907 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b7621667d7c9c119893fe930093d4e1d2256a13aadc196023df28d1a78aef68c"} err="failed to get container status \"b7621667d7c9c119893fe930093d4e1d2256a13aadc196023df28d1a78aef68c\": rpc error: code = NotFound desc = could not find container \"b7621667d7c9c119893fe930093d4e1d2256a13aadc196023df28d1a78aef68c\": container with ID 
starting with b7621667d7c9c119893fe930093d4e1d2256a13aadc196023df28d1a78aef68c not found: ID does not exist" Feb 26 15:56:26 crc kubenswrapper[4907]: I0226 15:56:26.765897 4907 scope.go:117] "RemoveContainer" containerID="117dc082982a8b3a3318c864792eff748b564107aeddf5f1ef19f61923a7e1d3" Feb 26 15:56:26 crc kubenswrapper[4907]: I0226 15:56:26.766389 4907 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"117dc082982a8b3a3318c864792eff748b564107aeddf5f1ef19f61923a7e1d3"} err="failed to get container status \"117dc082982a8b3a3318c864792eff748b564107aeddf5f1ef19f61923a7e1d3\": rpc error: code = NotFound desc = could not find container \"117dc082982a8b3a3318c864792eff748b564107aeddf5f1ef19f61923a7e1d3\": container with ID starting with 117dc082982a8b3a3318c864792eff748b564107aeddf5f1ef19f61923a7e1d3 not found: ID does not exist" Feb 26 15:56:26 crc kubenswrapper[4907]: I0226 15:56:26.766405 4907 scope.go:117] "RemoveContainer" containerID="cc2b19d04bf2ef1455fa049ed09ef927305f1ec89b19b42f39b0d8c1397f69df" Feb 26 15:56:26 crc kubenswrapper[4907]: I0226 15:56:26.766706 4907 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"cc2b19d04bf2ef1455fa049ed09ef927305f1ec89b19b42f39b0d8c1397f69df"} err="failed to get container status \"cc2b19d04bf2ef1455fa049ed09ef927305f1ec89b19b42f39b0d8c1397f69df\": rpc error: code = NotFound desc = could not find container \"cc2b19d04bf2ef1455fa049ed09ef927305f1ec89b19b42f39b0d8c1397f69df\": container with ID starting with cc2b19d04bf2ef1455fa049ed09ef927305f1ec89b19b42f39b0d8c1397f69df not found: ID does not exist" Feb 26 15:56:26 crc kubenswrapper[4907]: I0226 15:56:26.766723 4907 scope.go:117] "RemoveContainer" containerID="800657f54374550b21f96594e9c9ce4e7dff28c5c09061192a95bb8a668ebbea" Feb 26 15:56:26 crc kubenswrapper[4907]: I0226 15:56:26.766909 4907 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"800657f54374550b21f96594e9c9ce4e7dff28c5c09061192a95bb8a668ebbea"} err="failed to get container status \"800657f54374550b21f96594e9c9ce4e7dff28c5c09061192a95bb8a668ebbea\": rpc error: code = NotFound desc = could not find container \"800657f54374550b21f96594e9c9ce4e7dff28c5c09061192a95bb8a668ebbea\": container with ID starting with 800657f54374550b21f96594e9c9ce4e7dff28c5c09061192a95bb8a668ebbea not found: ID does not exist" Feb 26 15:56:26 crc kubenswrapper[4907]: I0226 15:56:26.766926 4907 scope.go:117] "RemoveContainer" containerID="9e7470d80d872846d4d91e9070becfa3496dca8af1b315e637c34edce0dcd57b" Feb 26 15:56:26 crc kubenswrapper[4907]: I0226 15:56:26.767356 4907 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9e7470d80d872846d4d91e9070becfa3496dca8af1b315e637c34edce0dcd57b"} err="failed to get container status \"9e7470d80d872846d4d91e9070becfa3496dca8af1b315e637c34edce0dcd57b\": rpc error: code = NotFound desc = could not find container \"9e7470d80d872846d4d91e9070becfa3496dca8af1b315e637c34edce0dcd57b\": container with ID starting with 9e7470d80d872846d4d91e9070becfa3496dca8af1b315e637c34edce0dcd57b not found: ID does not exist" Feb 26 15:56:26 crc kubenswrapper[4907]: I0226 15:56:26.767383 4907 scope.go:117] "RemoveContainer" containerID="67439cebe8e10e13db8af6bc74e152eb562382fb3b2f026ba3cbfe42e3b4c921" Feb 26 15:56:26 crc kubenswrapper[4907]: I0226 15:56:26.767551 4907 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"67439cebe8e10e13db8af6bc74e152eb562382fb3b2f026ba3cbfe42e3b4c921"} err="failed to get container status \"67439cebe8e10e13db8af6bc74e152eb562382fb3b2f026ba3cbfe42e3b4c921\": rpc error: code = NotFound desc = could not find container \"67439cebe8e10e13db8af6bc74e152eb562382fb3b2f026ba3cbfe42e3b4c921\": container with ID starting with 67439cebe8e10e13db8af6bc74e152eb562382fb3b2f026ba3cbfe42e3b4c921 not found: ID does not 
exist" Feb 26 15:56:26 crc kubenswrapper[4907]: I0226 15:56:26.767566 4907 scope.go:117] "RemoveContainer" containerID="c70ed6854442dfb329171dc5c454c036c020cb91e1f6595eb3fbe2d95704d52d" Feb 26 15:56:26 crc kubenswrapper[4907]: I0226 15:56:26.767793 4907 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c70ed6854442dfb329171dc5c454c036c020cb91e1f6595eb3fbe2d95704d52d"} err="failed to get container status \"c70ed6854442dfb329171dc5c454c036c020cb91e1f6595eb3fbe2d95704d52d\": rpc error: code = NotFound desc = could not find container \"c70ed6854442dfb329171dc5c454c036c020cb91e1f6595eb3fbe2d95704d52d\": container with ID starting with c70ed6854442dfb329171dc5c454c036c020cb91e1f6595eb3fbe2d95704d52d not found: ID does not exist" Feb 26 15:56:26 crc kubenswrapper[4907]: I0226 15:56:26.767811 4907 scope.go:117] "RemoveContainer" containerID="17760db3d112b908ad1389e3c28c244e756ef06ec2b4f170e4f52e17f9a75a89" Feb 26 15:56:26 crc kubenswrapper[4907]: I0226 15:56:26.768148 4907 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"17760db3d112b908ad1389e3c28c244e756ef06ec2b4f170e4f52e17f9a75a89"} err="failed to get container status \"17760db3d112b908ad1389e3c28c244e756ef06ec2b4f170e4f52e17f9a75a89\": rpc error: code = NotFound desc = could not find container \"17760db3d112b908ad1389e3c28c244e756ef06ec2b4f170e4f52e17f9a75a89\": container with ID starting with 17760db3d112b908ad1389e3c28c244e756ef06ec2b4f170e4f52e17f9a75a89 not found: ID does not exist" Feb 26 15:56:26 crc kubenswrapper[4907]: I0226 15:56:26.768163 4907 scope.go:117] "RemoveContainer" containerID="eca4b7a72754f7457c608969c5319a498c526ab128b28400d2aed5d0413ff487" Feb 26 15:56:26 crc kubenswrapper[4907]: I0226 15:56:26.768346 4907 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"eca4b7a72754f7457c608969c5319a498c526ab128b28400d2aed5d0413ff487"} err="failed to get container status 
\"eca4b7a72754f7457c608969c5319a498c526ab128b28400d2aed5d0413ff487\": rpc error: code = NotFound desc = could not find container \"eca4b7a72754f7457c608969c5319a498c526ab128b28400d2aed5d0413ff487\": container with ID starting with eca4b7a72754f7457c608969c5319a498c526ab128b28400d2aed5d0413ff487 not found: ID does not exist" Feb 26 15:56:26 crc kubenswrapper[4907]: I0226 15:56:26.768359 4907 scope.go:117] "RemoveContainer" containerID="b7621667d7c9c119893fe930093d4e1d2256a13aadc196023df28d1a78aef68c" Feb 26 15:56:26 crc kubenswrapper[4907]: I0226 15:56:26.768580 4907 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b7621667d7c9c119893fe930093d4e1d2256a13aadc196023df28d1a78aef68c"} err="failed to get container status \"b7621667d7c9c119893fe930093d4e1d2256a13aadc196023df28d1a78aef68c\": rpc error: code = NotFound desc = could not find container \"b7621667d7c9c119893fe930093d4e1d2256a13aadc196023df28d1a78aef68c\": container with ID starting with b7621667d7c9c119893fe930093d4e1d2256a13aadc196023df28d1a78aef68c not found: ID does not exist" Feb 26 15:56:27 crc kubenswrapper[4907]: I0226 15:56:27.557014 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-m765d" event={"ID":"d1ce2849-824c-4c72-a86c-d0128e548d92","Type":"ContainerStarted","Data":"0468bb4a86a03381007710e177a135da849b10ef87fc4b5b409667c34f42583b"} Feb 26 15:56:27 crc kubenswrapper[4907]: I0226 15:56:27.557320 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-m765d" event={"ID":"d1ce2849-824c-4c72-a86c-d0128e548d92","Type":"ContainerStarted","Data":"cd8fa1742e9517e0969cc3bf7d57d2b0432987be911b083685c9e41b7c485f34"} Feb 26 15:56:27 crc kubenswrapper[4907]: I0226 15:56:27.557330 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-m765d" 
event={"ID":"d1ce2849-824c-4c72-a86c-d0128e548d92","Type":"ContainerStarted","Data":"74fbb9bd2867653e8ba31ec044e8b22e21913a5376defaa6f1e8f24c27497a56"} Feb 26 15:56:27 crc kubenswrapper[4907]: I0226 15:56:27.557339 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-m765d" event={"ID":"d1ce2849-824c-4c72-a86c-d0128e548d92","Type":"ContainerStarted","Data":"6a44300cb75e28bd44e5d0303f165f02dda70201bf54f31c0545dad4b501f553"} Feb 26 15:56:27 crc kubenswrapper[4907]: I0226 15:56:27.557348 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-m765d" event={"ID":"d1ce2849-824c-4c72-a86c-d0128e548d92","Type":"ContainerStarted","Data":"a2e8572bbc1ec51abc0492bece521ff7584fb64454aa34ad43b8fa6ca94652a9"} Feb 26 15:56:27 crc kubenswrapper[4907]: I0226 15:56:27.557356 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-m765d" event={"ID":"d1ce2849-824c-4c72-a86c-d0128e548d92","Type":"ContainerStarted","Data":"292532642b3653ef57fefdc6ffe403d82d2d723a40db914db7d0b92085b298ca"} Feb 26 15:56:28 crc kubenswrapper[4907]: I0226 15:56:28.132974 4907 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="49ee65e1-8667-4ad7-a403-c899f0cc6a70" path="/var/lib/kubelet/pods/49ee65e1-8667-4ad7-a403-c899f0cc6a70/volumes" Feb 26 15:56:29 crc kubenswrapper[4907]: I0226 15:56:29.573171 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-m765d" event={"ID":"d1ce2849-824c-4c72-a86c-d0128e548d92","Type":"ContainerStarted","Data":"58cacdaabbfbd5871c0527767707595cd52fedb6bb027089070ced2c90ff0320"} Feb 26 15:56:30 crc kubenswrapper[4907]: I0226 15:56:30.366748 4907 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="cert-manager/cert-manager-webhook-687f57d79b-hdhr9" Feb 26 15:56:32 crc kubenswrapper[4907]: I0226 15:56:32.598738 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-ovn-kubernetes/ovnkube-node-m765d" event={"ID":"d1ce2849-824c-4c72-a86c-d0128e548d92","Type":"ContainerStarted","Data":"36c3157f1b413b3d2cda74908c6192b560234c47b99ad52155a844edd1ee5aba"} Feb 26 15:56:32 crc kubenswrapper[4907]: I0226 15:56:32.599361 4907 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-m765d" Feb 26 15:56:32 crc kubenswrapper[4907]: I0226 15:56:32.599382 4907 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-m765d" Feb 26 15:56:32 crc kubenswrapper[4907]: I0226 15:56:32.637793 4907 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ovn-kubernetes/ovnkube-node-m765d" podStartSLOduration=7.637772459 podStartE2EDuration="7.637772459s" podCreationTimestamp="2026-02-26 15:56:25 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-26 15:56:32.629396603 +0000 UTC m=+855.147958492" watchObservedRunningTime="2026-02-26 15:56:32.637772459 +0000 UTC m=+855.156334308" Feb 26 15:56:32 crc kubenswrapper[4907]: I0226 15:56:32.648158 4907 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-m765d" Feb 26 15:56:33 crc kubenswrapper[4907]: I0226 15:56:33.604290 4907 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-m765d" Feb 26 15:56:33 crc kubenswrapper[4907]: I0226 15:56:33.636286 4907 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-m765d" Feb 26 15:56:36 crc kubenswrapper[4907]: I0226 15:56:36.089693 4907 dynamic_cafile_content.go:123] "Loaded a new CA Bundle and Verifier" name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt" Feb 26 15:56:48 crc kubenswrapper[4907]: I0226 15:56:48.530314 4907 patch_prober.go:28] interesting 
pod/machine-config-daemon-v5ng6 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 26 15:56:48 crc kubenswrapper[4907]: I0226 15:56:48.530869 4907 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-v5ng6" podUID="917eebf3-db36-47b8-af0a-b80d042fddab" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 26 15:56:48 crc kubenswrapper[4907]: I0226 15:56:48.530923 4907 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-v5ng6" Feb 26 15:56:48 crc kubenswrapper[4907]: I0226 15:56:48.531651 4907 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"135e9e11cfbaabe55bbe34848f747e715822492af89a2d18c459beb482f280c0"} pod="openshift-machine-config-operator/machine-config-daemon-v5ng6" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Feb 26 15:56:48 crc kubenswrapper[4907]: I0226 15:56:48.531726 4907 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-v5ng6" podUID="917eebf3-db36-47b8-af0a-b80d042fddab" containerName="machine-config-daemon" containerID="cri-o://135e9e11cfbaabe55bbe34848f747e715822492af89a2d18c459beb482f280c0" gracePeriod=600 Feb 26 15:56:48 crc kubenswrapper[4907]: I0226 15:56:48.704480 4907 generic.go:334] "Generic (PLEG): container finished" podID="917eebf3-db36-47b8-af0a-b80d042fddab" containerID="135e9e11cfbaabe55bbe34848f747e715822492af89a2d18c459beb482f280c0" exitCode=0 Feb 26 15:56:48 crc kubenswrapper[4907]: I0226 15:56:48.704827 
4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-v5ng6" event={"ID":"917eebf3-db36-47b8-af0a-b80d042fddab","Type":"ContainerDied","Data":"135e9e11cfbaabe55bbe34848f747e715822492af89a2d18c459beb482f280c0"} Feb 26 15:56:48 crc kubenswrapper[4907]: I0226 15:56:48.704911 4907 scope.go:117] "RemoveContainer" containerID="53be863c74815dd43aa6d07eb234f8fc9300124de620faba3fc31d92226518b6" Feb 26 15:56:49 crc kubenswrapper[4907]: I0226 15:56:49.714928 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-v5ng6" event={"ID":"917eebf3-db36-47b8-af0a-b80d042fddab","Type":"ContainerStarted","Data":"9e579d2506f44ad3d5c29d72a7fa0d983bb32b89f28c090014c2276378479cce"} Feb 26 15:56:55 crc kubenswrapper[4907]: I0226 15:56:55.999265 4907 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-m765d" Feb 26 15:57:09 crc kubenswrapper[4907]: I0226 15:57:09.263879 4907 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/0e94e7566f739476ccec6d16e58de3f1c434cfa3060893f90f3e473a828zxwh"] Feb 26 15:57:09 crc kubenswrapper[4907]: I0226 15:57:09.265419 4907 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/0e94e7566f739476ccec6d16e58de3f1c434cfa3060893f90f3e473a828zxwh" Feb 26 15:57:09 crc kubenswrapper[4907]: I0226 15:57:09.276799 4907 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"default-dockercfg-vmwhc" Feb 26 15:57:09 crc kubenswrapper[4907]: I0226 15:57:09.279359 4907 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/0e94e7566f739476ccec6d16e58de3f1c434cfa3060893f90f3e473a828zxwh"] Feb 26 15:57:09 crc kubenswrapper[4907]: I0226 15:57:09.447353 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lhtsq\" (UniqueName: \"kubernetes.io/projected/d9bc1ab0-f219-4ba0-adc8-07a7167bbaa0-kube-api-access-lhtsq\") pod \"0e94e7566f739476ccec6d16e58de3f1c434cfa3060893f90f3e473a828zxwh\" (UID: \"d9bc1ab0-f219-4ba0-adc8-07a7167bbaa0\") " pod="openshift-marketplace/0e94e7566f739476ccec6d16e58de3f1c434cfa3060893f90f3e473a828zxwh" Feb 26 15:57:09 crc kubenswrapper[4907]: I0226 15:57:09.447502 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/d9bc1ab0-f219-4ba0-adc8-07a7167bbaa0-bundle\") pod \"0e94e7566f739476ccec6d16e58de3f1c434cfa3060893f90f3e473a828zxwh\" (UID: \"d9bc1ab0-f219-4ba0-adc8-07a7167bbaa0\") " pod="openshift-marketplace/0e94e7566f739476ccec6d16e58de3f1c434cfa3060893f90f3e473a828zxwh" Feb 26 15:57:09 crc kubenswrapper[4907]: I0226 15:57:09.447538 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/d9bc1ab0-f219-4ba0-adc8-07a7167bbaa0-util\") pod \"0e94e7566f739476ccec6d16e58de3f1c434cfa3060893f90f3e473a828zxwh\" (UID: \"d9bc1ab0-f219-4ba0-adc8-07a7167bbaa0\") " pod="openshift-marketplace/0e94e7566f739476ccec6d16e58de3f1c434cfa3060893f90f3e473a828zxwh" Feb 26 15:57:09 crc kubenswrapper[4907]: 
I0226 15:57:09.548456 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/d9bc1ab0-f219-4ba0-adc8-07a7167bbaa0-bundle\") pod \"0e94e7566f739476ccec6d16e58de3f1c434cfa3060893f90f3e473a828zxwh\" (UID: \"d9bc1ab0-f219-4ba0-adc8-07a7167bbaa0\") " pod="openshift-marketplace/0e94e7566f739476ccec6d16e58de3f1c434cfa3060893f90f3e473a828zxwh" Feb 26 15:57:09 crc kubenswrapper[4907]: I0226 15:57:09.548540 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/d9bc1ab0-f219-4ba0-adc8-07a7167bbaa0-util\") pod \"0e94e7566f739476ccec6d16e58de3f1c434cfa3060893f90f3e473a828zxwh\" (UID: \"d9bc1ab0-f219-4ba0-adc8-07a7167bbaa0\") " pod="openshift-marketplace/0e94e7566f739476ccec6d16e58de3f1c434cfa3060893f90f3e473a828zxwh" Feb 26 15:57:09 crc kubenswrapper[4907]: I0226 15:57:09.548773 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lhtsq\" (UniqueName: \"kubernetes.io/projected/d9bc1ab0-f219-4ba0-adc8-07a7167bbaa0-kube-api-access-lhtsq\") pod \"0e94e7566f739476ccec6d16e58de3f1c434cfa3060893f90f3e473a828zxwh\" (UID: \"d9bc1ab0-f219-4ba0-adc8-07a7167bbaa0\") " pod="openshift-marketplace/0e94e7566f739476ccec6d16e58de3f1c434cfa3060893f90f3e473a828zxwh" Feb 26 15:57:09 crc kubenswrapper[4907]: I0226 15:57:09.549249 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/d9bc1ab0-f219-4ba0-adc8-07a7167bbaa0-bundle\") pod \"0e94e7566f739476ccec6d16e58de3f1c434cfa3060893f90f3e473a828zxwh\" (UID: \"d9bc1ab0-f219-4ba0-adc8-07a7167bbaa0\") " pod="openshift-marketplace/0e94e7566f739476ccec6d16e58de3f1c434cfa3060893f90f3e473a828zxwh" Feb 26 15:57:09 crc kubenswrapper[4907]: I0226 15:57:09.549264 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: 
\"kubernetes.io/empty-dir/d9bc1ab0-f219-4ba0-adc8-07a7167bbaa0-util\") pod \"0e94e7566f739476ccec6d16e58de3f1c434cfa3060893f90f3e473a828zxwh\" (UID: \"d9bc1ab0-f219-4ba0-adc8-07a7167bbaa0\") " pod="openshift-marketplace/0e94e7566f739476ccec6d16e58de3f1c434cfa3060893f90f3e473a828zxwh" Feb 26 15:57:09 crc kubenswrapper[4907]: I0226 15:57:09.581511 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lhtsq\" (UniqueName: \"kubernetes.io/projected/d9bc1ab0-f219-4ba0-adc8-07a7167bbaa0-kube-api-access-lhtsq\") pod \"0e94e7566f739476ccec6d16e58de3f1c434cfa3060893f90f3e473a828zxwh\" (UID: \"d9bc1ab0-f219-4ba0-adc8-07a7167bbaa0\") " pod="openshift-marketplace/0e94e7566f739476ccec6d16e58de3f1c434cfa3060893f90f3e473a828zxwh" Feb 26 15:57:09 crc kubenswrapper[4907]: I0226 15:57:09.581867 4907 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/0e94e7566f739476ccec6d16e58de3f1c434cfa3060893f90f3e473a828zxwh" Feb 26 15:57:10 crc kubenswrapper[4907]: I0226 15:57:10.027158 4907 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/0e94e7566f739476ccec6d16e58de3f1c434cfa3060893f90f3e473a828zxwh"] Feb 26 15:57:10 crc kubenswrapper[4907]: W0226 15:57:10.037685 4907 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd9bc1ab0_f219_4ba0_adc8_07a7167bbaa0.slice/crio-2cf7538f3c68962e8c2dc7bb2465cac8166dcf08b94b4fa4b55adb101d405cf4 WatchSource:0}: Error finding container 2cf7538f3c68962e8c2dc7bb2465cac8166dcf08b94b4fa4b55adb101d405cf4: Status 404 returned error can't find the container with id 2cf7538f3c68962e8c2dc7bb2465cac8166dcf08b94b4fa4b55adb101d405cf4 Feb 26 15:57:10 crc kubenswrapper[4907]: I0226 15:57:10.850308 4907 generic.go:334] "Generic (PLEG): container finished" podID="d9bc1ab0-f219-4ba0-adc8-07a7167bbaa0" containerID="68e2f538d95d53c9899f7fef35f03ef26f07ac55bf73a4ad8e3c596f5c2dd044" exitCode=0 
Feb 26 15:57:10 crc kubenswrapper[4907]: I0226 15:57:10.850384 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/0e94e7566f739476ccec6d16e58de3f1c434cfa3060893f90f3e473a828zxwh" event={"ID":"d9bc1ab0-f219-4ba0-adc8-07a7167bbaa0","Type":"ContainerDied","Data":"68e2f538d95d53c9899f7fef35f03ef26f07ac55bf73a4ad8e3c596f5c2dd044"} Feb 26 15:57:10 crc kubenswrapper[4907]: I0226 15:57:10.850746 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/0e94e7566f739476ccec6d16e58de3f1c434cfa3060893f90f3e473a828zxwh" event={"ID":"d9bc1ab0-f219-4ba0-adc8-07a7167bbaa0","Type":"ContainerStarted","Data":"2cf7538f3c68962e8c2dc7bb2465cac8166dcf08b94b4fa4b55adb101d405cf4"} Feb 26 15:57:10 crc kubenswrapper[4907]: I0226 15:57:10.854944 4907 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Feb 26 15:57:11 crc kubenswrapper[4907]: I0226 15:57:11.572069 4907 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-nmbst"] Feb 26 15:57:11 crc kubenswrapper[4907]: I0226 15:57:11.573332 4907 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-nmbst" Feb 26 15:57:11 crc kubenswrapper[4907]: I0226 15:57:11.579317 4907 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-nmbst"] Feb 26 15:57:11 crc kubenswrapper[4907]: I0226 15:57:11.673334 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gvkb5\" (UniqueName: \"kubernetes.io/projected/a230e9ba-409c-4093-99d2-1a897eadfbaa-kube-api-access-gvkb5\") pod \"redhat-operators-nmbst\" (UID: \"a230e9ba-409c-4093-99d2-1a897eadfbaa\") " pod="openshift-marketplace/redhat-operators-nmbst" Feb 26 15:57:11 crc kubenswrapper[4907]: I0226 15:57:11.673616 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a230e9ba-409c-4093-99d2-1a897eadfbaa-catalog-content\") pod \"redhat-operators-nmbst\" (UID: \"a230e9ba-409c-4093-99d2-1a897eadfbaa\") " pod="openshift-marketplace/redhat-operators-nmbst" Feb 26 15:57:11 crc kubenswrapper[4907]: I0226 15:57:11.673798 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a230e9ba-409c-4093-99d2-1a897eadfbaa-utilities\") pod \"redhat-operators-nmbst\" (UID: \"a230e9ba-409c-4093-99d2-1a897eadfbaa\") " pod="openshift-marketplace/redhat-operators-nmbst" Feb 26 15:57:11 crc kubenswrapper[4907]: I0226 15:57:11.775034 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a230e9ba-409c-4093-99d2-1a897eadfbaa-utilities\") pod \"redhat-operators-nmbst\" (UID: \"a230e9ba-409c-4093-99d2-1a897eadfbaa\") " pod="openshift-marketplace/redhat-operators-nmbst" Feb 26 15:57:11 crc kubenswrapper[4907]: I0226 15:57:11.775110 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"kube-api-access-gvkb5\" (UniqueName: \"kubernetes.io/projected/a230e9ba-409c-4093-99d2-1a897eadfbaa-kube-api-access-gvkb5\") pod \"redhat-operators-nmbst\" (UID: \"a230e9ba-409c-4093-99d2-1a897eadfbaa\") " pod="openshift-marketplace/redhat-operators-nmbst" Feb 26 15:57:11 crc kubenswrapper[4907]: I0226 15:57:11.775144 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a230e9ba-409c-4093-99d2-1a897eadfbaa-catalog-content\") pod \"redhat-operators-nmbst\" (UID: \"a230e9ba-409c-4093-99d2-1a897eadfbaa\") " pod="openshift-marketplace/redhat-operators-nmbst" Feb 26 15:57:11 crc kubenswrapper[4907]: I0226 15:57:11.775777 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a230e9ba-409c-4093-99d2-1a897eadfbaa-catalog-content\") pod \"redhat-operators-nmbst\" (UID: \"a230e9ba-409c-4093-99d2-1a897eadfbaa\") " pod="openshift-marketplace/redhat-operators-nmbst" Feb 26 15:57:11 crc kubenswrapper[4907]: I0226 15:57:11.776106 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a230e9ba-409c-4093-99d2-1a897eadfbaa-utilities\") pod \"redhat-operators-nmbst\" (UID: \"a230e9ba-409c-4093-99d2-1a897eadfbaa\") " pod="openshift-marketplace/redhat-operators-nmbst" Feb 26 15:57:11 crc kubenswrapper[4907]: I0226 15:57:11.794378 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gvkb5\" (UniqueName: \"kubernetes.io/projected/a230e9ba-409c-4093-99d2-1a897eadfbaa-kube-api-access-gvkb5\") pod \"redhat-operators-nmbst\" (UID: \"a230e9ba-409c-4093-99d2-1a897eadfbaa\") " pod="openshift-marketplace/redhat-operators-nmbst" Feb 26 15:57:11 crc kubenswrapper[4907]: I0226 15:57:11.902827 4907 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-nmbst" Feb 26 15:57:12 crc kubenswrapper[4907]: I0226 15:57:12.149673 4907 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-nmbst"] Feb 26 15:57:12 crc kubenswrapper[4907]: I0226 15:57:12.865284 4907 generic.go:334] "Generic (PLEG): container finished" podID="a230e9ba-409c-4093-99d2-1a897eadfbaa" containerID="d66826570cd73a564f034822d0feb2b934b836c3983a8bd7e9c7d177126f9e54" exitCode=0 Feb 26 15:57:12 crc kubenswrapper[4907]: I0226 15:57:12.865456 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-nmbst" event={"ID":"a230e9ba-409c-4093-99d2-1a897eadfbaa","Type":"ContainerDied","Data":"d66826570cd73a564f034822d0feb2b934b836c3983a8bd7e9c7d177126f9e54"} Feb 26 15:57:12 crc kubenswrapper[4907]: I0226 15:57:12.865640 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-nmbst" event={"ID":"a230e9ba-409c-4093-99d2-1a897eadfbaa","Type":"ContainerStarted","Data":"bb4deed55eb769d74d271be2cb6635d3ac4e52151a370e1025b9b3febbdad45b"} Feb 26 15:57:12 crc kubenswrapper[4907]: I0226 15:57:12.869025 4907 generic.go:334] "Generic (PLEG): container finished" podID="d9bc1ab0-f219-4ba0-adc8-07a7167bbaa0" containerID="4f16abbd92bad20b3a04ac3c60f99977164e7f617ec02e6f6ab2c72a7c22c194" exitCode=0 Feb 26 15:57:12 crc kubenswrapper[4907]: I0226 15:57:12.869068 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/0e94e7566f739476ccec6d16e58de3f1c434cfa3060893f90f3e473a828zxwh" event={"ID":"d9bc1ab0-f219-4ba0-adc8-07a7167bbaa0","Type":"ContainerDied","Data":"4f16abbd92bad20b3a04ac3c60f99977164e7f617ec02e6f6ab2c72a7c22c194"} Feb 26 15:57:13 crc kubenswrapper[4907]: I0226 15:57:13.876464 4907 generic.go:334] "Generic (PLEG): container finished" podID="d9bc1ab0-f219-4ba0-adc8-07a7167bbaa0" 
containerID="5bf635b963385a09aef5f604fd234bb69d2fc2d0a11a656a777695c87a10e0b7" exitCode=0 Feb 26 15:57:13 crc kubenswrapper[4907]: I0226 15:57:13.876509 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/0e94e7566f739476ccec6d16e58de3f1c434cfa3060893f90f3e473a828zxwh" event={"ID":"d9bc1ab0-f219-4ba0-adc8-07a7167bbaa0","Type":"ContainerDied","Data":"5bf635b963385a09aef5f604fd234bb69d2fc2d0a11a656a777695c87a10e0b7"} Feb 26 15:57:14 crc kubenswrapper[4907]: I0226 15:57:14.885203 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-nmbst" event={"ID":"a230e9ba-409c-4093-99d2-1a897eadfbaa","Type":"ContainerStarted","Data":"77c472c89b2229b8c0f14e0e6d68f27a4a400235513b73a5553353a336cf4c5f"} Feb 26 15:57:15 crc kubenswrapper[4907]: I0226 15:57:15.113892 4907 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/0e94e7566f739476ccec6d16e58de3f1c434cfa3060893f90f3e473a828zxwh" Feb 26 15:57:15 crc kubenswrapper[4907]: I0226 15:57:15.132715 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/d9bc1ab0-f219-4ba0-adc8-07a7167bbaa0-bundle\") pod \"d9bc1ab0-f219-4ba0-adc8-07a7167bbaa0\" (UID: \"d9bc1ab0-f219-4ba0-adc8-07a7167bbaa0\") " Feb 26 15:57:15 crc kubenswrapper[4907]: I0226 15:57:15.132782 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lhtsq\" (UniqueName: \"kubernetes.io/projected/d9bc1ab0-f219-4ba0-adc8-07a7167bbaa0-kube-api-access-lhtsq\") pod \"d9bc1ab0-f219-4ba0-adc8-07a7167bbaa0\" (UID: \"d9bc1ab0-f219-4ba0-adc8-07a7167bbaa0\") " Feb 26 15:57:15 crc kubenswrapper[4907]: I0226 15:57:15.132879 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/d9bc1ab0-f219-4ba0-adc8-07a7167bbaa0-util\") pod \"d9bc1ab0-f219-4ba0-adc8-07a7167bbaa0\" 
(UID: \"d9bc1ab0-f219-4ba0-adc8-07a7167bbaa0\") " Feb 26 15:57:15 crc kubenswrapper[4907]: I0226 15:57:15.133489 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/d9bc1ab0-f219-4ba0-adc8-07a7167bbaa0-bundle" (OuterVolumeSpecName: "bundle") pod "d9bc1ab0-f219-4ba0-adc8-07a7167bbaa0" (UID: "d9bc1ab0-f219-4ba0-adc8-07a7167bbaa0"). InnerVolumeSpecName "bundle". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 26 15:57:15 crc kubenswrapper[4907]: I0226 15:57:15.139905 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d9bc1ab0-f219-4ba0-adc8-07a7167bbaa0-kube-api-access-lhtsq" (OuterVolumeSpecName: "kube-api-access-lhtsq") pod "d9bc1ab0-f219-4ba0-adc8-07a7167bbaa0" (UID: "d9bc1ab0-f219-4ba0-adc8-07a7167bbaa0"). InnerVolumeSpecName "kube-api-access-lhtsq". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 26 15:57:15 crc kubenswrapper[4907]: I0226 15:57:15.165725 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/d9bc1ab0-f219-4ba0-adc8-07a7167bbaa0-util" (OuterVolumeSpecName: "util") pod "d9bc1ab0-f219-4ba0-adc8-07a7167bbaa0" (UID: "d9bc1ab0-f219-4ba0-adc8-07a7167bbaa0"). InnerVolumeSpecName "util". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 26 15:57:15 crc kubenswrapper[4907]: I0226 15:57:15.234419 4907 reconciler_common.go:293] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/d9bc1ab0-f219-4ba0-adc8-07a7167bbaa0-bundle\") on node \"crc\" DevicePath \"\"" Feb 26 15:57:15 crc kubenswrapper[4907]: I0226 15:57:15.234452 4907 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-lhtsq\" (UniqueName: \"kubernetes.io/projected/d9bc1ab0-f219-4ba0-adc8-07a7167bbaa0-kube-api-access-lhtsq\") on node \"crc\" DevicePath \"\"" Feb 26 15:57:15 crc kubenswrapper[4907]: I0226 15:57:15.234462 4907 reconciler_common.go:293] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/d9bc1ab0-f219-4ba0-adc8-07a7167bbaa0-util\") on node \"crc\" DevicePath \"\"" Feb 26 15:57:15 crc kubenswrapper[4907]: I0226 15:57:15.894131 4907 generic.go:334] "Generic (PLEG): container finished" podID="a230e9ba-409c-4093-99d2-1a897eadfbaa" containerID="77c472c89b2229b8c0f14e0e6d68f27a4a400235513b73a5553353a336cf4c5f" exitCode=0 Feb 26 15:57:15 crc kubenswrapper[4907]: I0226 15:57:15.894194 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-nmbst" event={"ID":"a230e9ba-409c-4093-99d2-1a897eadfbaa","Type":"ContainerDied","Data":"77c472c89b2229b8c0f14e0e6d68f27a4a400235513b73a5553353a336cf4c5f"} Feb 26 15:57:15 crc kubenswrapper[4907]: I0226 15:57:15.897029 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/0e94e7566f739476ccec6d16e58de3f1c434cfa3060893f90f3e473a828zxwh" event={"ID":"d9bc1ab0-f219-4ba0-adc8-07a7167bbaa0","Type":"ContainerDied","Data":"2cf7538f3c68962e8c2dc7bb2465cac8166dcf08b94b4fa4b55adb101d405cf4"} Feb 26 15:57:15 crc kubenswrapper[4907]: I0226 15:57:15.897067 4907 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="2cf7538f3c68962e8c2dc7bb2465cac8166dcf08b94b4fa4b55adb101d405cf4" Feb 26 
15:57:15 crc kubenswrapper[4907]: I0226 15:57:15.897091 4907 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/0e94e7566f739476ccec6d16e58de3f1c434cfa3060893f90f3e473a828zxwh" Feb 26 15:57:17 crc kubenswrapper[4907]: I0226 15:57:17.909448 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-nmbst" event={"ID":"a230e9ba-409c-4093-99d2-1a897eadfbaa","Type":"ContainerStarted","Data":"64341ead822501e8c3c2636b72353229daa43e43f5cdd51aefb6b0b5c42c1767"} Feb 26 15:57:17 crc kubenswrapper[4907]: I0226 15:57:17.930050 4907 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-nmbst" podStartSLOduration=2.585240543 podStartE2EDuration="6.930033618s" podCreationTimestamp="2026-02-26 15:57:11 +0000 UTC" firstStartedPulling="2026-02-26 15:57:12.867094161 +0000 UTC m=+895.385656010" lastFinishedPulling="2026-02-26 15:57:17.211887196 +0000 UTC m=+899.730449085" observedRunningTime="2026-02-26 15:57:17.927625326 +0000 UTC m=+900.446187215" watchObservedRunningTime="2026-02-26 15:57:17.930033618 +0000 UTC m=+900.448595467" Feb 26 15:57:19 crc kubenswrapper[4907]: I0226 15:57:19.883324 4907 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-nmstate/nmstate-operator-75c5dccd6c-hpw5x"] Feb 26 15:57:19 crc kubenswrapper[4907]: E0226 15:57:19.883586 4907 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d9bc1ab0-f219-4ba0-adc8-07a7167bbaa0" containerName="util" Feb 26 15:57:19 crc kubenswrapper[4907]: I0226 15:57:19.883620 4907 state_mem.go:107] "Deleted CPUSet assignment" podUID="d9bc1ab0-f219-4ba0-adc8-07a7167bbaa0" containerName="util" Feb 26 15:57:19 crc kubenswrapper[4907]: E0226 15:57:19.883636 4907 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d9bc1ab0-f219-4ba0-adc8-07a7167bbaa0" containerName="extract" Feb 26 15:57:19 crc kubenswrapper[4907]: I0226 15:57:19.883645 4907 
state_mem.go:107] "Deleted CPUSet assignment" podUID="d9bc1ab0-f219-4ba0-adc8-07a7167bbaa0" containerName="extract" Feb 26 15:57:19 crc kubenswrapper[4907]: E0226 15:57:19.883658 4907 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d9bc1ab0-f219-4ba0-adc8-07a7167bbaa0" containerName="pull" Feb 26 15:57:19 crc kubenswrapper[4907]: I0226 15:57:19.883665 4907 state_mem.go:107] "Deleted CPUSet assignment" podUID="d9bc1ab0-f219-4ba0-adc8-07a7167bbaa0" containerName="pull" Feb 26 15:57:19 crc kubenswrapper[4907]: I0226 15:57:19.883804 4907 memory_manager.go:354] "RemoveStaleState removing state" podUID="d9bc1ab0-f219-4ba0-adc8-07a7167bbaa0" containerName="extract" Feb 26 15:57:19 crc kubenswrapper[4907]: I0226 15:57:19.884290 4907 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-nmstate/nmstate-operator-75c5dccd6c-hpw5x" Feb 26 15:57:19 crc kubenswrapper[4907]: I0226 15:57:19.887303 4907 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-nmstate"/"kube-root-ca.crt" Feb 26 15:57:19 crc kubenswrapper[4907]: I0226 15:57:19.887337 4907 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-nmstate"/"openshift-service-ca.crt" Feb 26 15:57:19 crc kubenswrapper[4907]: I0226 15:57:19.890065 4907 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-nmstate"/"nmstate-operator-dockercfg-wrdgb" Feb 26 15:57:19 crc kubenswrapper[4907]: I0226 15:57:19.896677 4907 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-operator-75c5dccd6c-hpw5x"] Feb 26 15:57:19 crc kubenswrapper[4907]: I0226 15:57:19.994310 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jbqmt\" (UniqueName: \"kubernetes.io/projected/383794c0-581b-4b48-bf74-876cfe097c2e-kube-api-access-jbqmt\") pod \"nmstate-operator-75c5dccd6c-hpw5x\" (UID: \"383794c0-581b-4b48-bf74-876cfe097c2e\") " 
pod="openshift-nmstate/nmstate-operator-75c5dccd6c-hpw5x" Feb 26 15:57:20 crc kubenswrapper[4907]: I0226 15:57:20.098662 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jbqmt\" (UniqueName: \"kubernetes.io/projected/383794c0-581b-4b48-bf74-876cfe097c2e-kube-api-access-jbqmt\") pod \"nmstate-operator-75c5dccd6c-hpw5x\" (UID: \"383794c0-581b-4b48-bf74-876cfe097c2e\") " pod="openshift-nmstate/nmstate-operator-75c5dccd6c-hpw5x" Feb 26 15:57:20 crc kubenswrapper[4907]: I0226 15:57:20.128717 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jbqmt\" (UniqueName: \"kubernetes.io/projected/383794c0-581b-4b48-bf74-876cfe097c2e-kube-api-access-jbqmt\") pod \"nmstate-operator-75c5dccd6c-hpw5x\" (UID: \"383794c0-581b-4b48-bf74-876cfe097c2e\") " pod="openshift-nmstate/nmstate-operator-75c5dccd6c-hpw5x" Feb 26 15:57:20 crc kubenswrapper[4907]: I0226 15:57:20.198527 4907 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-nmstate/nmstate-operator-75c5dccd6c-hpw5x" Feb 26 15:57:20 crc kubenswrapper[4907]: I0226 15:57:20.681134 4907 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-operator-75c5dccd6c-hpw5x"] Feb 26 15:57:20 crc kubenswrapper[4907]: W0226 15:57:20.689979 4907 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod383794c0_581b_4b48_bf74_876cfe097c2e.slice/crio-1f47a8cca72dbeebb33ac8b595c8348def7eaafccb669007c30f4b6956a1ea4d WatchSource:0}: Error finding container 1f47a8cca72dbeebb33ac8b595c8348def7eaafccb669007c30f4b6956a1ea4d: Status 404 returned error can't find the container with id 1f47a8cca72dbeebb33ac8b595c8348def7eaafccb669007c30f4b6956a1ea4d Feb 26 15:57:20 crc kubenswrapper[4907]: I0226 15:57:20.929508 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-operator-75c5dccd6c-hpw5x" event={"ID":"383794c0-581b-4b48-bf74-876cfe097c2e","Type":"ContainerStarted","Data":"1f47a8cca72dbeebb33ac8b595c8348def7eaafccb669007c30f4b6956a1ea4d"} Feb 26 15:57:21 crc kubenswrapper[4907]: I0226 15:57:21.903257 4907 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-nmbst" Feb 26 15:57:21 crc kubenswrapper[4907]: I0226 15:57:21.903410 4907 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-nmbst" Feb 26 15:57:22 crc kubenswrapper[4907]: I0226 15:57:22.955225 4907 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-nmbst" podUID="a230e9ba-409c-4093-99d2-1a897eadfbaa" containerName="registry-server" probeResult="failure" output=< Feb 26 15:57:22 crc kubenswrapper[4907]: timeout: failed to connect service ":50051" within 1s Feb 26 15:57:22 crc kubenswrapper[4907]: > Feb 26 15:57:26 crc kubenswrapper[4907]: I0226 15:57:26.978477 4907 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-operator-75c5dccd6c-hpw5x" event={"ID":"383794c0-581b-4b48-bf74-876cfe097c2e","Type":"ContainerStarted","Data":"af4f2b1c0c1fb0ff0e1f30490c617063df71c07c32d14d06b43695eef3a30978"} Feb 26 15:57:27 crc kubenswrapper[4907]: I0226 15:57:27.004842 4907 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-nmstate/nmstate-operator-75c5dccd6c-hpw5x" podStartSLOduration=2.312282273 podStartE2EDuration="8.004819867s" podCreationTimestamp="2026-02-26 15:57:19 +0000 UTC" firstStartedPulling="2026-02-26 15:57:20.692262209 +0000 UTC m=+903.210824058" lastFinishedPulling="2026-02-26 15:57:26.384799793 +0000 UTC m=+908.903361652" observedRunningTime="2026-02-26 15:57:26.995095391 +0000 UTC m=+909.513657260" watchObservedRunningTime="2026-02-26 15:57:27.004819867 +0000 UTC m=+909.523381726" Feb 26 15:57:29 crc kubenswrapper[4907]: I0226 15:57:29.651188 4907 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-nmstate/nmstate-metrics-69594cc75-fpz9z"] Feb 26 15:57:29 crc kubenswrapper[4907]: I0226 15:57:29.652517 4907 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-nmstate/nmstate-metrics-69594cc75-fpz9z" Feb 26 15:57:29 crc kubenswrapper[4907]: I0226 15:57:29.656601 4907 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-nmstate"/"nmstate-handler-dockercfg-qzcmt" Feb 26 15:57:29 crc kubenswrapper[4907]: I0226 15:57:29.665371 4907 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-metrics-69594cc75-fpz9z"] Feb 26 15:57:29 crc kubenswrapper[4907]: I0226 15:57:29.679137 4907 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-nmstate/nmstate-webhook-786f45cff4-4cghj"] Feb 26 15:57:29 crc kubenswrapper[4907]: I0226 15:57:29.680129 4907 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-nmstate/nmstate-webhook-786f45cff4-4cghj" Feb 26 15:57:29 crc kubenswrapper[4907]: I0226 15:57:29.683330 4907 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-nmstate"/"openshift-nmstate-webhook" Feb 26 15:57:29 crc kubenswrapper[4907]: I0226 15:57:29.690394 4907 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-webhook-786f45cff4-4cghj"] Feb 26 15:57:29 crc kubenswrapper[4907]: I0226 15:57:29.697212 4907 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-nmstate/nmstate-handler-5zh9k"] Feb 26 15:57:29 crc kubenswrapper[4907]: I0226 15:57:29.698038 4907 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-nmstate/nmstate-handler-5zh9k" Feb 26 15:57:29 crc kubenswrapper[4907]: I0226 15:57:29.714724 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lpw8w\" (UniqueName: \"kubernetes.io/projected/ad43a6fa-206d-43e4-8364-7902ff853e8c-kube-api-access-lpw8w\") pod \"nmstate-metrics-69594cc75-fpz9z\" (UID: \"ad43a6fa-206d-43e4-8364-7902ff853e8c\") " pod="openshift-nmstate/nmstate-metrics-69594cc75-fpz9z" Feb 26 15:57:29 crc kubenswrapper[4907]: I0226 15:57:29.816432 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nmstate-lock\" (UniqueName: \"kubernetes.io/host-path/3b24ed4b-d8ad-40c5-8b97-1a23a9fd8097-nmstate-lock\") pod \"nmstate-handler-5zh9k\" (UID: \"3b24ed4b-d8ad-40c5-8b97-1a23a9fd8097\") " pod="openshift-nmstate/nmstate-handler-5zh9k" Feb 26 15:57:29 crc kubenswrapper[4907]: I0226 15:57:29.816492 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lpw8w\" (UniqueName: \"kubernetes.io/projected/ad43a6fa-206d-43e4-8364-7902ff853e8c-kube-api-access-lpw8w\") pod \"nmstate-metrics-69594cc75-fpz9z\" (UID: \"ad43a6fa-206d-43e4-8364-7902ff853e8c\") " 
pod="openshift-nmstate/nmstate-metrics-69594cc75-fpz9z" Feb 26 15:57:29 crc kubenswrapper[4907]: I0226 15:57:29.816545 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dbus-socket\" (UniqueName: \"kubernetes.io/host-path/3b24ed4b-d8ad-40c5-8b97-1a23a9fd8097-dbus-socket\") pod \"nmstate-handler-5zh9k\" (UID: \"3b24ed4b-d8ad-40c5-8b97-1a23a9fd8097\") " pod="openshift-nmstate/nmstate-handler-5zh9k" Feb 26 15:57:29 crc kubenswrapper[4907]: I0226 15:57:29.816616 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-n895l\" (UniqueName: \"kubernetes.io/projected/aae13e12-a0b1-40c1-bdd6-844b790cb79c-kube-api-access-n895l\") pod \"nmstate-webhook-786f45cff4-4cghj\" (UID: \"aae13e12-a0b1-40c1-bdd6-844b790cb79c\") " pod="openshift-nmstate/nmstate-webhook-786f45cff4-4cghj" Feb 26 15:57:29 crc kubenswrapper[4907]: I0226 15:57:29.816646 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-key-pair\" (UniqueName: \"kubernetes.io/secret/aae13e12-a0b1-40c1-bdd6-844b790cb79c-tls-key-pair\") pod \"nmstate-webhook-786f45cff4-4cghj\" (UID: \"aae13e12-a0b1-40c1-bdd6-844b790cb79c\") " pod="openshift-nmstate/nmstate-webhook-786f45cff4-4cghj" Feb 26 15:57:29 crc kubenswrapper[4907]: I0226 15:57:29.816762 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovs-socket\" (UniqueName: \"kubernetes.io/host-path/3b24ed4b-d8ad-40c5-8b97-1a23a9fd8097-ovs-socket\") pod \"nmstate-handler-5zh9k\" (UID: \"3b24ed4b-d8ad-40c5-8b97-1a23a9fd8097\") " pod="openshift-nmstate/nmstate-handler-5zh9k" Feb 26 15:57:29 crc kubenswrapper[4907]: I0226 15:57:29.816852 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9phts\" (UniqueName: \"kubernetes.io/projected/3b24ed4b-d8ad-40c5-8b97-1a23a9fd8097-kube-api-access-9phts\") pod 
\"nmstate-handler-5zh9k\" (UID: \"3b24ed4b-d8ad-40c5-8b97-1a23a9fd8097\") " pod="openshift-nmstate/nmstate-handler-5zh9k" Feb 26 15:57:29 crc kubenswrapper[4907]: I0226 15:57:29.819101 4907 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-nmstate/nmstate-console-plugin-5dcbbd79cf-85mn5"] Feb 26 15:57:29 crc kubenswrapper[4907]: I0226 15:57:29.819891 4907 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-nmstate/nmstate-console-plugin-5dcbbd79cf-85mn5" Feb 26 15:57:29 crc kubenswrapper[4907]: I0226 15:57:29.823890 4907 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-nmstate"/"plugin-serving-cert" Feb 26 15:57:29 crc kubenswrapper[4907]: I0226 15:57:29.823959 4907 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-nmstate"/"default-dockercfg-cx6ls" Feb 26 15:57:29 crc kubenswrapper[4907]: I0226 15:57:29.823968 4907 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-nmstate"/"nginx-conf" Feb 26 15:57:29 crc kubenswrapper[4907]: I0226 15:57:29.835719 4907 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-console-plugin-5dcbbd79cf-85mn5"] Feb 26 15:57:29 crc kubenswrapper[4907]: I0226 15:57:29.851764 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lpw8w\" (UniqueName: \"kubernetes.io/projected/ad43a6fa-206d-43e4-8364-7902ff853e8c-kube-api-access-lpw8w\") pod \"nmstate-metrics-69594cc75-fpz9z\" (UID: \"ad43a6fa-206d-43e4-8364-7902ff853e8c\") " pod="openshift-nmstate/nmstate-metrics-69594cc75-fpz9z" Feb 26 15:57:29 crc kubenswrapper[4907]: I0226 15:57:29.917869 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9phts\" (UniqueName: \"kubernetes.io/projected/3b24ed4b-d8ad-40c5-8b97-1a23a9fd8097-kube-api-access-9phts\") pod \"nmstate-handler-5zh9k\" (UID: \"3b24ed4b-d8ad-40c5-8b97-1a23a9fd8097\") " 
pod="openshift-nmstate/nmstate-handler-5zh9k" Feb 26 15:57:29 crc kubenswrapper[4907]: I0226 15:57:29.918326 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"plugin-serving-cert\" (UniqueName: \"kubernetes.io/secret/c973ae22-7363-4e9d-abbe-a519875d412c-plugin-serving-cert\") pod \"nmstate-console-plugin-5dcbbd79cf-85mn5\" (UID: \"c973ae22-7363-4e9d-abbe-a519875d412c\") " pod="openshift-nmstate/nmstate-console-plugin-5dcbbd79cf-85mn5" Feb 26 15:57:29 crc kubenswrapper[4907]: I0226 15:57:29.918391 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nmstate-lock\" (UniqueName: \"kubernetes.io/host-path/3b24ed4b-d8ad-40c5-8b97-1a23a9fd8097-nmstate-lock\") pod \"nmstate-handler-5zh9k\" (UID: \"3b24ed4b-d8ad-40c5-8b97-1a23a9fd8097\") " pod="openshift-nmstate/nmstate-handler-5zh9k" Feb 26 15:57:29 crc kubenswrapper[4907]: I0226 15:57:29.918455 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dbus-socket\" (UniqueName: \"kubernetes.io/host-path/3b24ed4b-d8ad-40c5-8b97-1a23a9fd8097-dbus-socket\") pod \"nmstate-handler-5zh9k\" (UID: \"3b24ed4b-d8ad-40c5-8b97-1a23a9fd8097\") " pod="openshift-nmstate/nmstate-handler-5zh9k" Feb 26 15:57:29 crc kubenswrapper[4907]: I0226 15:57:29.918517 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-n895l\" (UniqueName: \"kubernetes.io/projected/aae13e12-a0b1-40c1-bdd6-844b790cb79c-kube-api-access-n895l\") pod \"nmstate-webhook-786f45cff4-4cghj\" (UID: \"aae13e12-a0b1-40c1-bdd6-844b790cb79c\") " pod="openshift-nmstate/nmstate-webhook-786f45cff4-4cghj" Feb 26 15:57:29 crc kubenswrapper[4907]: I0226 15:57:29.918544 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tls-key-pair\" (UniqueName: \"kubernetes.io/secret/aae13e12-a0b1-40c1-bdd6-844b790cb79c-tls-key-pair\") pod \"nmstate-webhook-786f45cff4-4cghj\" (UID: 
\"aae13e12-a0b1-40c1-bdd6-844b790cb79c\") " pod="openshift-nmstate/nmstate-webhook-786f45cff4-4cghj" Feb 26 15:57:29 crc kubenswrapper[4907]: I0226 15:57:29.918580 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nmstate-lock\" (UniqueName: \"kubernetes.io/host-path/3b24ed4b-d8ad-40c5-8b97-1a23a9fd8097-nmstate-lock\") pod \"nmstate-handler-5zh9k\" (UID: \"3b24ed4b-d8ad-40c5-8b97-1a23a9fd8097\") " pod="openshift-nmstate/nmstate-handler-5zh9k" Feb 26 15:57:29 crc kubenswrapper[4907]: I0226 15:57:29.918823 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dbus-socket\" (UniqueName: \"kubernetes.io/host-path/3b24ed4b-d8ad-40c5-8b97-1a23a9fd8097-dbus-socket\") pod \"nmstate-handler-5zh9k\" (UID: \"3b24ed4b-d8ad-40c5-8b97-1a23a9fd8097\") " pod="openshift-nmstate/nmstate-handler-5zh9k" Feb 26 15:57:29 crc kubenswrapper[4907]: E0226 15:57:29.919055 4907 secret.go:188] Couldn't get secret openshift-nmstate/openshift-nmstate-webhook: secret "openshift-nmstate-webhook" not found Feb 26 15:57:29 crc kubenswrapper[4907]: I0226 15:57:29.919130 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/c973ae22-7363-4e9d-abbe-a519875d412c-nginx-conf\") pod \"nmstate-console-plugin-5dcbbd79cf-85mn5\" (UID: \"c973ae22-7363-4e9d-abbe-a519875d412c\") " pod="openshift-nmstate/nmstate-console-plugin-5dcbbd79cf-85mn5" Feb 26 15:57:29 crc kubenswrapper[4907]: I0226 15:57:29.919158 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-d6chv\" (UniqueName: \"kubernetes.io/projected/c973ae22-7363-4e9d-abbe-a519875d412c-kube-api-access-d6chv\") pod \"nmstate-console-plugin-5dcbbd79cf-85mn5\" (UID: \"c973ae22-7363-4e9d-abbe-a519875d412c\") " pod="openshift-nmstate/nmstate-console-plugin-5dcbbd79cf-85mn5" Feb 26 15:57:29 crc kubenswrapper[4907]: I0226 15:57:29.919180 4907 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovs-socket\" (UniqueName: \"kubernetes.io/host-path/3b24ed4b-d8ad-40c5-8b97-1a23a9fd8097-ovs-socket\") pod \"nmstate-handler-5zh9k\" (UID: \"3b24ed4b-d8ad-40c5-8b97-1a23a9fd8097\") " pod="openshift-nmstate/nmstate-handler-5zh9k" Feb 26 15:57:29 crc kubenswrapper[4907]: I0226 15:57:29.919241 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovs-socket\" (UniqueName: \"kubernetes.io/host-path/3b24ed4b-d8ad-40c5-8b97-1a23a9fd8097-ovs-socket\") pod \"nmstate-handler-5zh9k\" (UID: \"3b24ed4b-d8ad-40c5-8b97-1a23a9fd8097\") " pod="openshift-nmstate/nmstate-handler-5zh9k" Feb 26 15:57:29 crc kubenswrapper[4907]: E0226 15:57:29.919277 4907 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/aae13e12-a0b1-40c1-bdd6-844b790cb79c-tls-key-pair podName:aae13e12-a0b1-40c1-bdd6-844b790cb79c nodeName:}" failed. No retries permitted until 2026-02-26 15:57:30.419257441 +0000 UTC m=+912.937819290 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "tls-key-pair" (UniqueName: "kubernetes.io/secret/aae13e12-a0b1-40c1-bdd6-844b790cb79c-tls-key-pair") pod "nmstate-webhook-786f45cff4-4cghj" (UID: "aae13e12-a0b1-40c1-bdd6-844b790cb79c") : secret "openshift-nmstate-webhook" not found Feb 26 15:57:29 crc kubenswrapper[4907]: I0226 15:57:29.940820 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9phts\" (UniqueName: \"kubernetes.io/projected/3b24ed4b-d8ad-40c5-8b97-1a23a9fd8097-kube-api-access-9phts\") pod \"nmstate-handler-5zh9k\" (UID: \"3b24ed4b-d8ad-40c5-8b97-1a23a9fd8097\") " pod="openshift-nmstate/nmstate-handler-5zh9k" Feb 26 15:57:29 crc kubenswrapper[4907]: I0226 15:57:29.943706 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-n895l\" (UniqueName: \"kubernetes.io/projected/aae13e12-a0b1-40c1-bdd6-844b790cb79c-kube-api-access-n895l\") pod \"nmstate-webhook-786f45cff4-4cghj\" (UID: \"aae13e12-a0b1-40c1-bdd6-844b790cb79c\") " pod="openshift-nmstate/nmstate-webhook-786f45cff4-4cghj" Feb 26 15:57:29 crc kubenswrapper[4907]: I0226 15:57:29.970580 4907 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-nmstate/nmstate-metrics-69594cc75-fpz9z" Feb 26 15:57:30 crc kubenswrapper[4907]: I0226 15:57:30.015624 4907 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-nmstate/nmstate-handler-5zh9k" Feb 26 15:57:30 crc kubenswrapper[4907]: I0226 15:57:30.020071 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/c973ae22-7363-4e9d-abbe-a519875d412c-nginx-conf\") pod \"nmstate-console-plugin-5dcbbd79cf-85mn5\" (UID: \"c973ae22-7363-4e9d-abbe-a519875d412c\") " pod="openshift-nmstate/nmstate-console-plugin-5dcbbd79cf-85mn5" Feb 26 15:57:30 crc kubenswrapper[4907]: I0226 15:57:30.020120 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-d6chv\" (UniqueName: \"kubernetes.io/projected/c973ae22-7363-4e9d-abbe-a519875d412c-kube-api-access-d6chv\") pod \"nmstate-console-plugin-5dcbbd79cf-85mn5\" (UID: \"c973ae22-7363-4e9d-abbe-a519875d412c\") " pod="openshift-nmstate/nmstate-console-plugin-5dcbbd79cf-85mn5" Feb 26 15:57:30 crc kubenswrapper[4907]: I0226 15:57:30.020159 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"plugin-serving-cert\" (UniqueName: \"kubernetes.io/secret/c973ae22-7363-4e9d-abbe-a519875d412c-plugin-serving-cert\") pod \"nmstate-console-plugin-5dcbbd79cf-85mn5\" (UID: \"c973ae22-7363-4e9d-abbe-a519875d412c\") " pod="openshift-nmstate/nmstate-console-plugin-5dcbbd79cf-85mn5" Feb 26 15:57:30 crc kubenswrapper[4907]: I0226 15:57:30.021197 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/c973ae22-7363-4e9d-abbe-a519875d412c-nginx-conf\") pod \"nmstate-console-plugin-5dcbbd79cf-85mn5\" (UID: \"c973ae22-7363-4e9d-abbe-a519875d412c\") " pod="openshift-nmstate/nmstate-console-plugin-5dcbbd79cf-85mn5" Feb 26 15:57:30 crc kubenswrapper[4907]: I0226 15:57:30.026263 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"plugin-serving-cert\" (UniqueName: \"kubernetes.io/secret/c973ae22-7363-4e9d-abbe-a519875d412c-plugin-serving-cert\") 
pod \"nmstate-console-plugin-5dcbbd79cf-85mn5\" (UID: \"c973ae22-7363-4e9d-abbe-a519875d412c\") " pod="openshift-nmstate/nmstate-console-plugin-5dcbbd79cf-85mn5" Feb 26 15:57:30 crc kubenswrapper[4907]: I0226 15:57:30.038360 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-d6chv\" (UniqueName: \"kubernetes.io/projected/c973ae22-7363-4e9d-abbe-a519875d412c-kube-api-access-d6chv\") pod \"nmstate-console-plugin-5dcbbd79cf-85mn5\" (UID: \"c973ae22-7363-4e9d-abbe-a519875d412c\") " pod="openshift-nmstate/nmstate-console-plugin-5dcbbd79cf-85mn5" Feb 26 15:57:30 crc kubenswrapper[4907]: I0226 15:57:30.075265 4907 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-console/console-77fcfc7895-pqx99"] Feb 26 15:57:30 crc kubenswrapper[4907]: I0226 15:57:30.076199 4907 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-77fcfc7895-pqx99" Feb 26 15:57:30 crc kubenswrapper[4907]: I0226 15:57:30.086123 4907 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-77fcfc7895-pqx99"] Feb 26 15:57:30 crc kubenswrapper[4907]: I0226 15:57:30.138827 4907 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-nmstate/nmstate-console-plugin-5dcbbd79cf-85mn5" Feb 26 15:57:30 crc kubenswrapper[4907]: I0226 15:57:30.228225 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/2d809bdd-7e10-4838-89f5-6d1b5cfab7f7-console-serving-cert\") pod \"console-77fcfc7895-pqx99\" (UID: \"2d809bdd-7e10-4838-89f5-6d1b5cfab7f7\") " pod="openshift-console/console-77fcfc7895-pqx99" Feb 26 15:57:30 crc kubenswrapper[4907]: I0226 15:57:30.228313 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/2d809bdd-7e10-4838-89f5-6d1b5cfab7f7-console-config\") pod \"console-77fcfc7895-pqx99\" (UID: \"2d809bdd-7e10-4838-89f5-6d1b5cfab7f7\") " pod="openshift-console/console-77fcfc7895-pqx99" Feb 26 15:57:30 crc kubenswrapper[4907]: I0226 15:57:30.228332 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/2d809bdd-7e10-4838-89f5-6d1b5cfab7f7-service-ca\") pod \"console-77fcfc7895-pqx99\" (UID: \"2d809bdd-7e10-4838-89f5-6d1b5cfab7f7\") " pod="openshift-console/console-77fcfc7895-pqx99" Feb 26 15:57:30 crc kubenswrapper[4907]: I0226 15:57:30.228347 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/2d809bdd-7e10-4838-89f5-6d1b5cfab7f7-oauth-serving-cert\") pod \"console-77fcfc7895-pqx99\" (UID: \"2d809bdd-7e10-4838-89f5-6d1b5cfab7f7\") " pod="openshift-console/console-77fcfc7895-pqx99" Feb 26 15:57:30 crc kubenswrapper[4907]: I0226 15:57:30.228383 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7gp7c\" (UniqueName: 
\"kubernetes.io/projected/2d809bdd-7e10-4838-89f5-6d1b5cfab7f7-kube-api-access-7gp7c\") pod \"console-77fcfc7895-pqx99\" (UID: \"2d809bdd-7e10-4838-89f5-6d1b5cfab7f7\") " pod="openshift-console/console-77fcfc7895-pqx99" Feb 26 15:57:30 crc kubenswrapper[4907]: I0226 15:57:30.228396 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/2d809bdd-7e10-4838-89f5-6d1b5cfab7f7-trusted-ca-bundle\") pod \"console-77fcfc7895-pqx99\" (UID: \"2d809bdd-7e10-4838-89f5-6d1b5cfab7f7\") " pod="openshift-console/console-77fcfc7895-pqx99" Feb 26 15:57:30 crc kubenswrapper[4907]: I0226 15:57:30.228418 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/2d809bdd-7e10-4838-89f5-6d1b5cfab7f7-console-oauth-config\") pod \"console-77fcfc7895-pqx99\" (UID: \"2d809bdd-7e10-4838-89f5-6d1b5cfab7f7\") " pod="openshift-console/console-77fcfc7895-pqx99" Feb 26 15:57:30 crc kubenswrapper[4907]: I0226 15:57:30.302375 4907 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-metrics-69594cc75-fpz9z"] Feb 26 15:57:30 crc kubenswrapper[4907]: W0226 15:57:30.306924 4907 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podad43a6fa_206d_43e4_8364_7902ff853e8c.slice/crio-61a57784300901f79b4f601b47eb3a42f4dfb1afb561fee2cf305bb5dbc17494 WatchSource:0}: Error finding container 61a57784300901f79b4f601b47eb3a42f4dfb1afb561fee2cf305bb5dbc17494: Status 404 returned error can't find the container with id 61a57784300901f79b4f601b47eb3a42f4dfb1afb561fee2cf305bb5dbc17494 Feb 26 15:57:30 crc kubenswrapper[4907]: I0226 15:57:30.329883 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-serving-cert\" (UniqueName: 
\"kubernetes.io/secret/2d809bdd-7e10-4838-89f5-6d1b5cfab7f7-console-serving-cert\") pod \"console-77fcfc7895-pqx99\" (UID: \"2d809bdd-7e10-4838-89f5-6d1b5cfab7f7\") " pod="openshift-console/console-77fcfc7895-pqx99" Feb 26 15:57:30 crc kubenswrapper[4907]: I0226 15:57:30.329961 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/2d809bdd-7e10-4838-89f5-6d1b5cfab7f7-console-config\") pod \"console-77fcfc7895-pqx99\" (UID: \"2d809bdd-7e10-4838-89f5-6d1b5cfab7f7\") " pod="openshift-console/console-77fcfc7895-pqx99" Feb 26 15:57:30 crc kubenswrapper[4907]: I0226 15:57:30.329983 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/2d809bdd-7e10-4838-89f5-6d1b5cfab7f7-service-ca\") pod \"console-77fcfc7895-pqx99\" (UID: \"2d809bdd-7e10-4838-89f5-6d1b5cfab7f7\") " pod="openshift-console/console-77fcfc7895-pqx99" Feb 26 15:57:30 crc kubenswrapper[4907]: I0226 15:57:30.329998 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/2d809bdd-7e10-4838-89f5-6d1b5cfab7f7-oauth-serving-cert\") pod \"console-77fcfc7895-pqx99\" (UID: \"2d809bdd-7e10-4838-89f5-6d1b5cfab7f7\") " pod="openshift-console/console-77fcfc7895-pqx99" Feb 26 15:57:30 crc kubenswrapper[4907]: I0226 15:57:30.330031 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/2d809bdd-7e10-4838-89f5-6d1b5cfab7f7-trusted-ca-bundle\") pod \"console-77fcfc7895-pqx99\" (UID: \"2d809bdd-7e10-4838-89f5-6d1b5cfab7f7\") " pod="openshift-console/console-77fcfc7895-pqx99" Feb 26 15:57:30 crc kubenswrapper[4907]: I0226 15:57:30.330048 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7gp7c\" (UniqueName: 
\"kubernetes.io/projected/2d809bdd-7e10-4838-89f5-6d1b5cfab7f7-kube-api-access-7gp7c\") pod \"console-77fcfc7895-pqx99\" (UID: \"2d809bdd-7e10-4838-89f5-6d1b5cfab7f7\") " pod="openshift-console/console-77fcfc7895-pqx99" Feb 26 15:57:30 crc kubenswrapper[4907]: I0226 15:57:30.330071 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/2d809bdd-7e10-4838-89f5-6d1b5cfab7f7-console-oauth-config\") pod \"console-77fcfc7895-pqx99\" (UID: \"2d809bdd-7e10-4838-89f5-6d1b5cfab7f7\") " pod="openshift-console/console-77fcfc7895-pqx99" Feb 26 15:57:30 crc kubenswrapper[4907]: I0226 15:57:30.332100 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/2d809bdd-7e10-4838-89f5-6d1b5cfab7f7-service-ca\") pod \"console-77fcfc7895-pqx99\" (UID: \"2d809bdd-7e10-4838-89f5-6d1b5cfab7f7\") " pod="openshift-console/console-77fcfc7895-pqx99" Feb 26 15:57:30 crc kubenswrapper[4907]: I0226 15:57:30.335244 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/2d809bdd-7e10-4838-89f5-6d1b5cfab7f7-console-oauth-config\") pod \"console-77fcfc7895-pqx99\" (UID: \"2d809bdd-7e10-4838-89f5-6d1b5cfab7f7\") " pod="openshift-console/console-77fcfc7895-pqx99" Feb 26 15:57:30 crc kubenswrapper[4907]: I0226 15:57:30.336648 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/2d809bdd-7e10-4838-89f5-6d1b5cfab7f7-oauth-serving-cert\") pod \"console-77fcfc7895-pqx99\" (UID: \"2d809bdd-7e10-4838-89f5-6d1b5cfab7f7\") " pod="openshift-console/console-77fcfc7895-pqx99" Feb 26 15:57:30 crc kubenswrapper[4907]: I0226 15:57:30.337244 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-config\" (UniqueName: 
\"kubernetes.io/configmap/2d809bdd-7e10-4838-89f5-6d1b5cfab7f7-console-config\") pod \"console-77fcfc7895-pqx99\" (UID: \"2d809bdd-7e10-4838-89f5-6d1b5cfab7f7\") " pod="openshift-console/console-77fcfc7895-pqx99" Feb 26 15:57:30 crc kubenswrapper[4907]: I0226 15:57:30.338162 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/2d809bdd-7e10-4838-89f5-6d1b5cfab7f7-trusted-ca-bundle\") pod \"console-77fcfc7895-pqx99\" (UID: \"2d809bdd-7e10-4838-89f5-6d1b5cfab7f7\") " pod="openshift-console/console-77fcfc7895-pqx99" Feb 26 15:57:30 crc kubenswrapper[4907]: I0226 15:57:30.340361 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/2d809bdd-7e10-4838-89f5-6d1b5cfab7f7-console-serving-cert\") pod \"console-77fcfc7895-pqx99\" (UID: \"2d809bdd-7e10-4838-89f5-6d1b5cfab7f7\") " pod="openshift-console/console-77fcfc7895-pqx99" Feb 26 15:57:30 crc kubenswrapper[4907]: I0226 15:57:30.351465 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7gp7c\" (UniqueName: \"kubernetes.io/projected/2d809bdd-7e10-4838-89f5-6d1b5cfab7f7-kube-api-access-7gp7c\") pod \"console-77fcfc7895-pqx99\" (UID: \"2d809bdd-7e10-4838-89f5-6d1b5cfab7f7\") " pod="openshift-console/console-77fcfc7895-pqx99" Feb 26 15:57:30 crc kubenswrapper[4907]: I0226 15:57:30.372423 4907 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-console-plugin-5dcbbd79cf-85mn5"] Feb 26 15:57:30 crc kubenswrapper[4907]: W0226 15:57:30.375454 4907 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podc973ae22_7363_4e9d_abbe_a519875d412c.slice/crio-efb6bd192b763de0fb6e9c6f47b736f20ea7f012b6a881d4579dd9743203d890 WatchSource:0}: Error finding container efb6bd192b763de0fb6e9c6f47b736f20ea7f012b6a881d4579dd9743203d890: Status 404 
returned error can't find the container with id efb6bd192b763de0fb6e9c6f47b736f20ea7f012b6a881d4579dd9743203d890 Feb 26 15:57:30 crc kubenswrapper[4907]: I0226 15:57:30.396259 4907 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-77fcfc7895-pqx99" Feb 26 15:57:30 crc kubenswrapper[4907]: I0226 15:57:30.431644 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tls-key-pair\" (UniqueName: \"kubernetes.io/secret/aae13e12-a0b1-40c1-bdd6-844b790cb79c-tls-key-pair\") pod \"nmstate-webhook-786f45cff4-4cghj\" (UID: \"aae13e12-a0b1-40c1-bdd6-844b790cb79c\") " pod="openshift-nmstate/nmstate-webhook-786f45cff4-4cghj" Feb 26 15:57:30 crc kubenswrapper[4907]: I0226 15:57:30.435489 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"tls-key-pair\" (UniqueName: \"kubernetes.io/secret/aae13e12-a0b1-40c1-bdd6-844b790cb79c-tls-key-pair\") pod \"nmstate-webhook-786f45cff4-4cghj\" (UID: \"aae13e12-a0b1-40c1-bdd6-844b790cb79c\") " pod="openshift-nmstate/nmstate-webhook-786f45cff4-4cghj" Feb 26 15:57:30 crc kubenswrapper[4907]: I0226 15:57:30.602367 4907 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-nmstate/nmstate-webhook-786f45cff4-4cghj" Feb 26 15:57:30 crc kubenswrapper[4907]: I0226 15:57:30.611504 4907 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-77fcfc7895-pqx99"] Feb 26 15:57:30 crc kubenswrapper[4907]: I0226 15:57:30.840495 4907 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-webhook-786f45cff4-4cghj"] Feb 26 15:57:30 crc kubenswrapper[4907]: W0226 15:57:30.864697 4907 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podaae13e12_a0b1_40c1_bdd6_844b790cb79c.slice/crio-b0d47596dc1e6351d9365a4eb40bd62f5fa00ab1a8570a4bd816f602cf234a12 WatchSource:0}: Error finding container b0d47596dc1e6351d9365a4eb40bd62f5fa00ab1a8570a4bd816f602cf234a12: Status 404 returned error can't find the container with id b0d47596dc1e6351d9365a4eb40bd62f5fa00ab1a8570a4bd816f602cf234a12 Feb 26 15:57:31 crc kubenswrapper[4907]: I0226 15:57:30.999983 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-77fcfc7895-pqx99" event={"ID":"2d809bdd-7e10-4838-89f5-6d1b5cfab7f7","Type":"ContainerStarted","Data":"1be8f739554b09a0f8a512eb0bf04df4ce54bd05460ba3982b7115c0293feb1b"} Feb 26 15:57:31 crc kubenswrapper[4907]: I0226 15:57:31.000022 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-77fcfc7895-pqx99" event={"ID":"2d809bdd-7e10-4838-89f5-6d1b5cfab7f7","Type":"ContainerStarted","Data":"2ba1c3548d98e124637d11cd54caff57525f130c14d8bd01d7c327c9ebe8e3dc"} Feb 26 15:57:31 crc kubenswrapper[4907]: I0226 15:57:31.001740 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-metrics-69594cc75-fpz9z" event={"ID":"ad43a6fa-206d-43e4-8364-7902ff853e8c","Type":"ContainerStarted","Data":"61a57784300901f79b4f601b47eb3a42f4dfb1afb561fee2cf305bb5dbc17494"} Feb 26 15:57:31 crc kubenswrapper[4907]: I0226 15:57:31.003503 4907 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-handler-5zh9k" event={"ID":"3b24ed4b-d8ad-40c5-8b97-1a23a9fd8097","Type":"ContainerStarted","Data":"12c10aa812606b67143e69eea62997a1949c0bee385e7b7e1c2b077f611ec1c8"} Feb 26 15:57:31 crc kubenswrapper[4907]: I0226 15:57:31.005725 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-console-plugin-5dcbbd79cf-85mn5" event={"ID":"c973ae22-7363-4e9d-abbe-a519875d412c","Type":"ContainerStarted","Data":"efb6bd192b763de0fb6e9c6f47b736f20ea7f012b6a881d4579dd9743203d890"} Feb 26 15:57:31 crc kubenswrapper[4907]: I0226 15:57:31.006774 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-webhook-786f45cff4-4cghj" event={"ID":"aae13e12-a0b1-40c1-bdd6-844b790cb79c","Type":"ContainerStarted","Data":"b0d47596dc1e6351d9365a4eb40bd62f5fa00ab1a8570a4bd816f602cf234a12"} Feb 26 15:57:31 crc kubenswrapper[4907]: I0226 15:57:31.017616 4907 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/console-77fcfc7895-pqx99" podStartSLOduration=1.017600882 podStartE2EDuration="1.017600882s" podCreationTimestamp="2026-02-26 15:57:30 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-26 15:57:31.016066015 +0000 UTC m=+913.534627864" watchObservedRunningTime="2026-02-26 15:57:31.017600882 +0000 UTC m=+913.536162731" Feb 26 15:57:31 crc kubenswrapper[4907]: I0226 15:57:31.944795 4907 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-nmbst" Feb 26 15:57:31 crc kubenswrapper[4907]: I0226 15:57:31.992133 4907 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-nmbst" Feb 26 15:57:32 crc kubenswrapper[4907]: I0226 15:57:32.176161 4907 kubelet.go:2437] "SyncLoop DELETE" source="api" 
pods=["openshift-marketplace/redhat-operators-nmbst"] Feb 26 15:57:33 crc kubenswrapper[4907]: I0226 15:57:33.019779 4907 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-nmbst" podUID="a230e9ba-409c-4093-99d2-1a897eadfbaa" containerName="registry-server" containerID="cri-o://64341ead822501e8c3c2636b72353229daa43e43f5cdd51aefb6b0b5c42c1767" gracePeriod=2 Feb 26 15:57:34 crc kubenswrapper[4907]: I0226 15:57:34.026679 4907 generic.go:334] "Generic (PLEG): container finished" podID="a230e9ba-409c-4093-99d2-1a897eadfbaa" containerID="64341ead822501e8c3c2636b72353229daa43e43f5cdd51aefb6b0b5c42c1767" exitCode=0 Feb 26 15:57:34 crc kubenswrapper[4907]: I0226 15:57:34.026761 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-nmbst" event={"ID":"a230e9ba-409c-4093-99d2-1a897eadfbaa","Type":"ContainerDied","Data":"64341ead822501e8c3c2636b72353229daa43e43f5cdd51aefb6b0b5c42c1767"} Feb 26 15:57:34 crc kubenswrapper[4907]: I0226 15:57:34.247239 4907 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-nmbst" Feb 26 15:57:34 crc kubenswrapper[4907]: I0226 15:57:34.384818 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a230e9ba-409c-4093-99d2-1a897eadfbaa-utilities\") pod \"a230e9ba-409c-4093-99d2-1a897eadfbaa\" (UID: \"a230e9ba-409c-4093-99d2-1a897eadfbaa\") " Feb 26 15:57:34 crc kubenswrapper[4907]: I0226 15:57:34.384871 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-gvkb5\" (UniqueName: \"kubernetes.io/projected/a230e9ba-409c-4093-99d2-1a897eadfbaa-kube-api-access-gvkb5\") pod \"a230e9ba-409c-4093-99d2-1a897eadfbaa\" (UID: \"a230e9ba-409c-4093-99d2-1a897eadfbaa\") " Feb 26 15:57:34 crc kubenswrapper[4907]: I0226 15:57:34.384989 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a230e9ba-409c-4093-99d2-1a897eadfbaa-catalog-content\") pod \"a230e9ba-409c-4093-99d2-1a897eadfbaa\" (UID: \"a230e9ba-409c-4093-99d2-1a897eadfbaa\") " Feb 26 15:57:34 crc kubenswrapper[4907]: I0226 15:57:34.385648 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/a230e9ba-409c-4093-99d2-1a897eadfbaa-utilities" (OuterVolumeSpecName: "utilities") pod "a230e9ba-409c-4093-99d2-1a897eadfbaa" (UID: "a230e9ba-409c-4093-99d2-1a897eadfbaa"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 26 15:57:34 crc kubenswrapper[4907]: I0226 15:57:34.392508 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a230e9ba-409c-4093-99d2-1a897eadfbaa-kube-api-access-gvkb5" (OuterVolumeSpecName: "kube-api-access-gvkb5") pod "a230e9ba-409c-4093-99d2-1a897eadfbaa" (UID: "a230e9ba-409c-4093-99d2-1a897eadfbaa"). InnerVolumeSpecName "kube-api-access-gvkb5". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 26 15:57:34 crc kubenswrapper[4907]: I0226 15:57:34.488128 4907 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a230e9ba-409c-4093-99d2-1a897eadfbaa-utilities\") on node \"crc\" DevicePath \"\"" Feb 26 15:57:34 crc kubenswrapper[4907]: I0226 15:57:34.488176 4907 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-gvkb5\" (UniqueName: \"kubernetes.io/projected/a230e9ba-409c-4093-99d2-1a897eadfbaa-kube-api-access-gvkb5\") on node \"crc\" DevicePath \"\"" Feb 26 15:57:34 crc kubenswrapper[4907]: I0226 15:57:34.515424 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/a230e9ba-409c-4093-99d2-1a897eadfbaa-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "a230e9ba-409c-4093-99d2-1a897eadfbaa" (UID: "a230e9ba-409c-4093-99d2-1a897eadfbaa"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 26 15:57:34 crc kubenswrapper[4907]: I0226 15:57:34.593772 4907 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a230e9ba-409c-4093-99d2-1a897eadfbaa-catalog-content\") on node \"crc\" DevicePath \"\"" Feb 26 15:57:35 crc kubenswrapper[4907]: I0226 15:57:35.671852 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-nmbst" event={"ID":"a230e9ba-409c-4093-99d2-1a897eadfbaa","Type":"ContainerDied","Data":"bb4deed55eb769d74d271be2cb6635d3ac4e52151a370e1025b9b3febbdad45b"} Feb 26 15:57:35 crc kubenswrapper[4907]: I0226 15:57:35.671945 4907 scope.go:117] "RemoveContainer" containerID="64341ead822501e8c3c2636b72353229daa43e43f5cdd51aefb6b0b5c42c1767" Feb 26 15:57:35 crc kubenswrapper[4907]: I0226 15:57:35.672166 4907 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-nmbst" Feb 26 15:57:35 crc kubenswrapper[4907]: I0226 15:57:35.726043 4907 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-nmbst"] Feb 26 15:57:35 crc kubenswrapper[4907]: I0226 15:57:35.734245 4907 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-nmbst"] Feb 26 15:57:36 crc kubenswrapper[4907]: I0226 15:57:36.137887 4907 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a230e9ba-409c-4093-99d2-1a897eadfbaa" path="/var/lib/kubelet/pods/a230e9ba-409c-4093-99d2-1a897eadfbaa/volumes" Feb 26 15:57:38 crc kubenswrapper[4907]: I0226 15:57:38.649842 4907 scope.go:117] "RemoveContainer" containerID="77c472c89b2229b8c0f14e0e6d68f27a4a400235513b73a5553353a336cf4c5f" Feb 26 15:57:38 crc kubenswrapper[4907]: I0226 15:57:38.765885 4907 scope.go:117] "RemoveContainer" containerID="d66826570cd73a564f034822d0feb2b934b836c3983a8bd7e9c7d177126f9e54" Feb 26 15:57:39 crc kubenswrapper[4907]: I0226 15:57:39.704979 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-console-plugin-5dcbbd79cf-85mn5" event={"ID":"c973ae22-7363-4e9d-abbe-a519875d412c","Type":"ContainerStarted","Data":"6fb8cbab5b03f420fdc2c6dcda1a6b21ae7c8e667a38acca5d96b7695c87d1af"} Feb 26 15:57:39 crc kubenswrapper[4907]: I0226 15:57:39.708129 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-webhook-786f45cff4-4cghj" event={"ID":"aae13e12-a0b1-40c1-bdd6-844b790cb79c","Type":"ContainerStarted","Data":"84f38384c98eb9bde12ff35531f669cdaf80a049c2acdb2e340b7419e5476791"} Feb 26 15:57:39 crc kubenswrapper[4907]: I0226 15:57:39.708624 4907 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-nmstate/nmstate-webhook-786f45cff4-4cghj" Feb 26 15:57:39 crc kubenswrapper[4907]: I0226 15:57:39.710913 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-nmstate/nmstate-metrics-69594cc75-fpz9z" event={"ID":"ad43a6fa-206d-43e4-8364-7902ff853e8c","Type":"ContainerStarted","Data":"7f7f59adc87931c679723690496d806f735ac3cba6c468f0e105fb7880354e80"} Feb 26 15:57:39 crc kubenswrapper[4907]: I0226 15:57:39.711924 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-handler-5zh9k" event={"ID":"3b24ed4b-d8ad-40c5-8b97-1a23a9fd8097","Type":"ContainerStarted","Data":"e259906ba631730abcac3a88d883390d8db66f9d9d3d43d705935fd7d350ca99"} Feb 26 15:57:39 crc kubenswrapper[4907]: I0226 15:57:39.712126 4907 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-nmstate/nmstate-handler-5zh9k" Feb 26 15:57:39 crc kubenswrapper[4907]: I0226 15:57:39.731681 4907 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-nmstate/nmstate-console-plugin-5dcbbd79cf-85mn5" podStartSLOduration=2.22077544 podStartE2EDuration="10.731660306s" podCreationTimestamp="2026-02-26 15:57:29 +0000 UTC" firstStartedPulling="2026-02-26 15:57:30.378000193 +0000 UTC m=+912.896562042" lastFinishedPulling="2026-02-26 15:57:38.888885019 +0000 UTC m=+921.407446908" observedRunningTime="2026-02-26 15:57:39.7214851 +0000 UTC m=+922.240046959" watchObservedRunningTime="2026-02-26 15:57:39.731660306 +0000 UTC m=+922.250222165" Feb 26 15:57:39 crc kubenswrapper[4907]: I0226 15:57:39.764984 4907 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-nmstate/nmstate-webhook-786f45cff4-4cghj" podStartSLOduration=2.744236158 podStartE2EDuration="10.764963472s" podCreationTimestamp="2026-02-26 15:57:29 +0000 UTC" firstStartedPulling="2026-02-26 15:57:30.869323583 +0000 UTC m=+913.387885432" lastFinishedPulling="2026-02-26 15:57:38.890050887 +0000 UTC m=+921.408612746" observedRunningTime="2026-02-26 15:57:39.760987876 +0000 UTC m=+922.279549725" watchObservedRunningTime="2026-02-26 15:57:39.764963472 +0000 UTC m=+922.283525321" Feb 26 15:57:39 crc 
kubenswrapper[4907]: I0226 15:57:39.786944 4907 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-nmstate/nmstate-handler-5zh9k" podStartSLOduration=1.948377698 podStartE2EDuration="10.786924723s" podCreationTimestamp="2026-02-26 15:57:29 +0000 UTC" firstStartedPulling="2026-02-26 15:57:30.050738683 +0000 UTC m=+912.569300532" lastFinishedPulling="2026-02-26 15:57:38.889285698 +0000 UTC m=+921.407847557" observedRunningTime="2026-02-26 15:57:39.783579032 +0000 UTC m=+922.302140911" watchObservedRunningTime="2026-02-26 15:57:39.786924723 +0000 UTC m=+922.305486582" Feb 26 15:57:40 crc kubenswrapper[4907]: I0226 15:57:40.396646 4907 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-console/console-77fcfc7895-pqx99" Feb 26 15:57:40 crc kubenswrapper[4907]: I0226 15:57:40.397158 4907 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-console/console-77fcfc7895-pqx99" Feb 26 15:57:40 crc kubenswrapper[4907]: I0226 15:57:40.401963 4907 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-console/console-77fcfc7895-pqx99" Feb 26 15:57:40 crc kubenswrapper[4907]: I0226 15:57:40.726464 4907 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console/console-77fcfc7895-pqx99" Feb 26 15:57:40 crc kubenswrapper[4907]: I0226 15:57:40.791278 4907 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-console/console-f9d7485db-9lx5z"] Feb 26 15:57:44 crc kubenswrapper[4907]: I0226 15:57:44.752493 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-metrics-69594cc75-fpz9z" event={"ID":"ad43a6fa-206d-43e4-8364-7902ff853e8c","Type":"ContainerStarted","Data":"5b5dc2536a4f9c842fc1895ef31e1f237ab558f3e9fd4051910acb4cdd2db625"} Feb 26 15:57:44 crc kubenswrapper[4907]: I0226 15:57:44.775395 4907 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="openshift-nmstate/nmstate-metrics-69594cc75-fpz9z" podStartSLOduration=2.118372891 podStartE2EDuration="15.775361291s" podCreationTimestamp="2026-02-26 15:57:29 +0000 UTC" firstStartedPulling="2026-02-26 15:57:30.309712661 +0000 UTC m=+912.828274510" lastFinishedPulling="2026-02-26 15:57:43.966701061 +0000 UTC m=+926.485262910" observedRunningTime="2026-02-26 15:57:44.771007546 +0000 UTC m=+927.289569435" watchObservedRunningTime="2026-02-26 15:57:44.775361291 +0000 UTC m=+927.293923210" Feb 26 15:57:45 crc kubenswrapper[4907]: I0226 15:57:45.051313 4907 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-nmstate/nmstate-handler-5zh9k" Feb 26 15:57:50 crc kubenswrapper[4907]: I0226 15:57:50.609301 4907 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-nmstate/nmstate-webhook-786f45cff4-4cghj" Feb 26 15:58:00 crc kubenswrapper[4907]: I0226 15:58:00.136969 4907 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29535358-mk4kx"] Feb 26 15:58:00 crc kubenswrapper[4907]: E0226 15:58:00.137923 4907 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a230e9ba-409c-4093-99d2-1a897eadfbaa" containerName="extract-utilities" Feb 26 15:58:00 crc kubenswrapper[4907]: I0226 15:58:00.137940 4907 state_mem.go:107] "Deleted CPUSet assignment" podUID="a230e9ba-409c-4093-99d2-1a897eadfbaa" containerName="extract-utilities" Feb 26 15:58:00 crc kubenswrapper[4907]: E0226 15:58:00.137957 4907 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a230e9ba-409c-4093-99d2-1a897eadfbaa" containerName="registry-server" Feb 26 15:58:00 crc kubenswrapper[4907]: I0226 15:58:00.137965 4907 state_mem.go:107] "Deleted CPUSet assignment" podUID="a230e9ba-409c-4093-99d2-1a897eadfbaa" containerName="registry-server" Feb 26 15:58:00 crc kubenswrapper[4907]: E0226 15:58:00.137986 4907 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="a230e9ba-409c-4093-99d2-1a897eadfbaa" containerName="extract-content" Feb 26 15:58:00 crc kubenswrapper[4907]: I0226 15:58:00.137993 4907 state_mem.go:107] "Deleted CPUSet assignment" podUID="a230e9ba-409c-4093-99d2-1a897eadfbaa" containerName="extract-content" Feb 26 15:58:00 crc kubenswrapper[4907]: I0226 15:58:00.138119 4907 memory_manager.go:354] "RemoveStaleState removing state" podUID="a230e9ba-409c-4093-99d2-1a897eadfbaa" containerName="registry-server" Feb 26 15:58:00 crc kubenswrapper[4907]: I0226 15:58:00.138639 4907 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29535358-mk4kx" Feb 26 15:58:00 crc kubenswrapper[4907]: I0226 15:58:00.141072 4907 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Feb 26 15:58:00 crc kubenswrapper[4907]: I0226 15:58:00.146529 4907 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-n2mrp" Feb 26 15:58:00 crc kubenswrapper[4907]: I0226 15:58:00.146808 4907 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Feb 26 15:58:00 crc kubenswrapper[4907]: I0226 15:58:00.150996 4907 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29535358-mk4kx"] Feb 26 15:58:00 crc kubenswrapper[4907]: I0226 15:58:00.230229 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ddbd8\" (UniqueName: \"kubernetes.io/projected/ac3d86bc-1eb6-4d67-a762-2000e20fcbd5-kube-api-access-ddbd8\") pod \"auto-csr-approver-29535358-mk4kx\" (UID: \"ac3d86bc-1eb6-4d67-a762-2000e20fcbd5\") " pod="openshift-infra/auto-csr-approver-29535358-mk4kx" Feb 26 15:58:00 crc kubenswrapper[4907]: I0226 15:58:00.331080 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ddbd8\" (UniqueName: 
\"kubernetes.io/projected/ac3d86bc-1eb6-4d67-a762-2000e20fcbd5-kube-api-access-ddbd8\") pod \"auto-csr-approver-29535358-mk4kx\" (UID: \"ac3d86bc-1eb6-4d67-a762-2000e20fcbd5\") " pod="openshift-infra/auto-csr-approver-29535358-mk4kx" Feb 26 15:58:00 crc kubenswrapper[4907]: I0226 15:58:00.357860 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ddbd8\" (UniqueName: \"kubernetes.io/projected/ac3d86bc-1eb6-4d67-a762-2000e20fcbd5-kube-api-access-ddbd8\") pod \"auto-csr-approver-29535358-mk4kx\" (UID: \"ac3d86bc-1eb6-4d67-a762-2000e20fcbd5\") " pod="openshift-infra/auto-csr-approver-29535358-mk4kx" Feb 26 15:58:00 crc kubenswrapper[4907]: I0226 15:58:00.463652 4907 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29535358-mk4kx" Feb 26 15:58:00 crc kubenswrapper[4907]: I0226 15:58:00.862563 4907 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29535358-mk4kx"] Feb 26 15:58:01 crc kubenswrapper[4907]: I0226 15:58:01.878372 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29535358-mk4kx" event={"ID":"ac3d86bc-1eb6-4d67-a762-2000e20fcbd5","Type":"ContainerStarted","Data":"8f3948e7db069cbfa270447761b9c2aa7a6eda146d9997f8ec3364b5b219502b"} Feb 26 15:58:02 crc kubenswrapper[4907]: I0226 15:58:02.884804 4907 generic.go:334] "Generic (PLEG): container finished" podID="ac3d86bc-1eb6-4d67-a762-2000e20fcbd5" containerID="4f97dc5b43bb9e39af17c85fe883c1f94bba5cd5baf28e733e55dd9e924078b1" exitCode=0 Feb 26 15:58:02 crc kubenswrapper[4907]: I0226 15:58:02.884903 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29535358-mk4kx" event={"ID":"ac3d86bc-1eb6-4d67-a762-2000e20fcbd5","Type":"ContainerDied","Data":"4f97dc5b43bb9e39af17c85fe883c1f94bba5cd5baf28e733e55dd9e924078b1"} Feb 26 15:58:03 crc kubenswrapper[4907]: I0226 15:58:03.487439 4907 kubelet.go:2421] 
"SyncLoop ADD" source="api" pods=["openshift-marketplace/d146760600e43041070ad4572d9c23f31a62e3aefc01a54998863bc5f4ss77t"] Feb 26 15:58:03 crc kubenswrapper[4907]: I0226 15:58:03.488546 4907 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/d146760600e43041070ad4572d9c23f31a62e3aefc01a54998863bc5f4ss77t" Feb 26 15:58:03 crc kubenswrapper[4907]: I0226 15:58:03.490783 4907 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"default-dockercfg-vmwhc" Feb 26 15:58:03 crc kubenswrapper[4907]: I0226 15:58:03.497418 4907 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/d146760600e43041070ad4572d9c23f31a62e3aefc01a54998863bc5f4ss77t"] Feb 26 15:58:03 crc kubenswrapper[4907]: I0226 15:58:03.574865 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/dea9effb-0863-442e-85b0-ac5bade13bdb-bundle\") pod \"d146760600e43041070ad4572d9c23f31a62e3aefc01a54998863bc5f4ss77t\" (UID: \"dea9effb-0863-442e-85b0-ac5bade13bdb\") " pod="openshift-marketplace/d146760600e43041070ad4572d9c23f31a62e3aefc01a54998863bc5f4ss77t" Feb 26 15:58:03 crc kubenswrapper[4907]: I0226 15:58:03.574901 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/dea9effb-0863-442e-85b0-ac5bade13bdb-util\") pod \"d146760600e43041070ad4572d9c23f31a62e3aefc01a54998863bc5f4ss77t\" (UID: \"dea9effb-0863-442e-85b0-ac5bade13bdb\") " pod="openshift-marketplace/d146760600e43041070ad4572d9c23f31a62e3aefc01a54998863bc5f4ss77t" Feb 26 15:58:03 crc kubenswrapper[4907]: I0226 15:58:03.574940 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-sghrs\" (UniqueName: \"kubernetes.io/projected/dea9effb-0863-442e-85b0-ac5bade13bdb-kube-api-access-sghrs\") pod 
\"d146760600e43041070ad4572d9c23f31a62e3aefc01a54998863bc5f4ss77t\" (UID: \"dea9effb-0863-442e-85b0-ac5bade13bdb\") " pod="openshift-marketplace/d146760600e43041070ad4572d9c23f31a62e3aefc01a54998863bc5f4ss77t" Feb 26 15:58:03 crc kubenswrapper[4907]: I0226 15:58:03.675959 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/dea9effb-0863-442e-85b0-ac5bade13bdb-bundle\") pod \"d146760600e43041070ad4572d9c23f31a62e3aefc01a54998863bc5f4ss77t\" (UID: \"dea9effb-0863-442e-85b0-ac5bade13bdb\") " pod="openshift-marketplace/d146760600e43041070ad4572d9c23f31a62e3aefc01a54998863bc5f4ss77t" Feb 26 15:58:03 crc kubenswrapper[4907]: I0226 15:58:03.676026 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/dea9effb-0863-442e-85b0-ac5bade13bdb-util\") pod \"d146760600e43041070ad4572d9c23f31a62e3aefc01a54998863bc5f4ss77t\" (UID: \"dea9effb-0863-442e-85b0-ac5bade13bdb\") " pod="openshift-marketplace/d146760600e43041070ad4572d9c23f31a62e3aefc01a54998863bc5f4ss77t" Feb 26 15:58:03 crc kubenswrapper[4907]: I0226 15:58:03.676090 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-sghrs\" (UniqueName: \"kubernetes.io/projected/dea9effb-0863-442e-85b0-ac5bade13bdb-kube-api-access-sghrs\") pod \"d146760600e43041070ad4572d9c23f31a62e3aefc01a54998863bc5f4ss77t\" (UID: \"dea9effb-0863-442e-85b0-ac5bade13bdb\") " pod="openshift-marketplace/d146760600e43041070ad4572d9c23f31a62e3aefc01a54998863bc5f4ss77t" Feb 26 15:58:03 crc kubenswrapper[4907]: I0226 15:58:03.677032 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/dea9effb-0863-442e-85b0-ac5bade13bdb-bundle\") pod \"d146760600e43041070ad4572d9c23f31a62e3aefc01a54998863bc5f4ss77t\" (UID: \"dea9effb-0863-442e-85b0-ac5bade13bdb\") " 
pod="openshift-marketplace/d146760600e43041070ad4572d9c23f31a62e3aefc01a54998863bc5f4ss77t" Feb 26 15:58:03 crc kubenswrapper[4907]: I0226 15:58:03.677071 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/dea9effb-0863-442e-85b0-ac5bade13bdb-util\") pod \"d146760600e43041070ad4572d9c23f31a62e3aefc01a54998863bc5f4ss77t\" (UID: \"dea9effb-0863-442e-85b0-ac5bade13bdb\") " pod="openshift-marketplace/d146760600e43041070ad4572d9c23f31a62e3aefc01a54998863bc5f4ss77t" Feb 26 15:58:03 crc kubenswrapper[4907]: I0226 15:58:03.697475 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-sghrs\" (UniqueName: \"kubernetes.io/projected/dea9effb-0863-442e-85b0-ac5bade13bdb-kube-api-access-sghrs\") pod \"d146760600e43041070ad4572d9c23f31a62e3aefc01a54998863bc5f4ss77t\" (UID: \"dea9effb-0863-442e-85b0-ac5bade13bdb\") " pod="openshift-marketplace/d146760600e43041070ad4572d9c23f31a62e3aefc01a54998863bc5f4ss77t" Feb 26 15:58:03 crc kubenswrapper[4907]: I0226 15:58:03.811320 4907 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/d146760600e43041070ad4572d9c23f31a62e3aefc01a54998863bc5f4ss77t" Feb 26 15:58:04 crc kubenswrapper[4907]: I0226 15:58:04.169718 4907 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29535358-mk4kx" Feb 26 15:58:04 crc kubenswrapper[4907]: I0226 15:58:04.241512 4907 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/d146760600e43041070ad4572d9c23f31a62e3aefc01a54998863bc5f4ss77t"] Feb 26 15:58:04 crc kubenswrapper[4907]: W0226 15:58:04.249470 4907 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poddea9effb_0863_442e_85b0_ac5bade13bdb.slice/crio-588a8455c32141e910f1343b9d67991d40508063be777e98c22b60fc1515308b WatchSource:0}: Error finding container 588a8455c32141e910f1343b9d67991d40508063be777e98c22b60fc1515308b: Status 404 returned error can't find the container with id 588a8455c32141e910f1343b9d67991d40508063be777e98c22b60fc1515308b Feb 26 15:58:04 crc kubenswrapper[4907]: I0226 15:58:04.285846 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ddbd8\" (UniqueName: \"kubernetes.io/projected/ac3d86bc-1eb6-4d67-a762-2000e20fcbd5-kube-api-access-ddbd8\") pod \"ac3d86bc-1eb6-4d67-a762-2000e20fcbd5\" (UID: \"ac3d86bc-1eb6-4d67-a762-2000e20fcbd5\") " Feb 26 15:58:04 crc kubenswrapper[4907]: I0226 15:58:04.289060 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ac3d86bc-1eb6-4d67-a762-2000e20fcbd5-kube-api-access-ddbd8" (OuterVolumeSpecName: "kube-api-access-ddbd8") pod "ac3d86bc-1eb6-4d67-a762-2000e20fcbd5" (UID: "ac3d86bc-1eb6-4d67-a762-2000e20fcbd5"). InnerVolumeSpecName "kube-api-access-ddbd8". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 26 15:58:04 crc kubenswrapper[4907]: I0226 15:58:04.388188 4907 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ddbd8\" (UniqueName: \"kubernetes.io/projected/ac3d86bc-1eb6-4d67-a762-2000e20fcbd5-kube-api-access-ddbd8\") on node \"crc\" DevicePath \"\"" Feb 26 15:58:04 crc kubenswrapper[4907]: I0226 15:58:04.898378 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29535358-mk4kx" event={"ID":"ac3d86bc-1eb6-4d67-a762-2000e20fcbd5","Type":"ContainerDied","Data":"8f3948e7db069cbfa270447761b9c2aa7a6eda146d9997f8ec3364b5b219502b"} Feb 26 15:58:04 crc kubenswrapper[4907]: I0226 15:58:04.898971 4907 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="8f3948e7db069cbfa270447761b9c2aa7a6eda146d9997f8ec3364b5b219502b" Feb 26 15:58:04 crc kubenswrapper[4907]: I0226 15:58:04.898411 4907 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29535358-mk4kx" Feb 26 15:58:04 crc kubenswrapper[4907]: I0226 15:58:04.901277 4907 generic.go:334] "Generic (PLEG): container finished" podID="dea9effb-0863-442e-85b0-ac5bade13bdb" containerID="25d73267604820779f437020e0ff3dc67348d837a86ce56c1f081d57a46bf869" exitCode=0 Feb 26 15:58:04 crc kubenswrapper[4907]: I0226 15:58:04.901318 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/d146760600e43041070ad4572d9c23f31a62e3aefc01a54998863bc5f4ss77t" event={"ID":"dea9effb-0863-442e-85b0-ac5bade13bdb","Type":"ContainerDied","Data":"25d73267604820779f437020e0ff3dc67348d837a86ce56c1f081d57a46bf869"} Feb 26 15:58:04 crc kubenswrapper[4907]: I0226 15:58:04.901347 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/d146760600e43041070ad4572d9c23f31a62e3aefc01a54998863bc5f4ss77t" 
event={"ID":"dea9effb-0863-442e-85b0-ac5bade13bdb","Type":"ContainerStarted","Data":"588a8455c32141e910f1343b9d67991d40508063be777e98c22b60fc1515308b"} Feb 26 15:58:05 crc kubenswrapper[4907]: I0226 15:58:05.226044 4907 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29535352-g24tn"] Feb 26 15:58:05 crc kubenswrapper[4907]: I0226 15:58:05.229687 4907 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29535352-g24tn"] Feb 26 15:58:05 crc kubenswrapper[4907]: I0226 15:58:05.845463 4907 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-console/console-f9d7485db-9lx5z" podUID="0eb5d98d-ab9b-47bd-b8ca-27340a4d6a4f" containerName="console" containerID="cri-o://1e8efb94e29b2f5eec4ee383a5c98c3e9316e1d77b271be0304c238e08f98c9f" gracePeriod=15 Feb 26 15:58:06 crc kubenswrapper[4907]: I0226 15:58:06.132822 4907 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="76681648-110a-4f27-a62c-1e4c06da6564" path="/var/lib/kubelet/pods/76681648-110a-4f27-a62c-1e4c06da6564/volumes" Feb 26 15:58:06 crc kubenswrapper[4907]: I0226 15:58:06.191189 4907 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-f9d7485db-9lx5z_0eb5d98d-ab9b-47bd-b8ca-27340a4d6a4f/console/0.log" Feb 26 15:58:06 crc kubenswrapper[4907]: I0226 15:58:06.191279 4907 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-f9d7485db-9lx5z" Feb 26 15:58:06 crc kubenswrapper[4907]: I0226 15:58:06.312748 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-cpn27\" (UniqueName: \"kubernetes.io/projected/0eb5d98d-ab9b-47bd-b8ca-27340a4d6a4f-kube-api-access-cpn27\") pod \"0eb5d98d-ab9b-47bd-b8ca-27340a4d6a4f\" (UID: \"0eb5d98d-ab9b-47bd-b8ca-27340a4d6a4f\") " Feb 26 15:58:06 crc kubenswrapper[4907]: I0226 15:58:06.312835 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/0eb5d98d-ab9b-47bd-b8ca-27340a4d6a4f-console-oauth-config\") pod \"0eb5d98d-ab9b-47bd-b8ca-27340a4d6a4f\" (UID: \"0eb5d98d-ab9b-47bd-b8ca-27340a4d6a4f\") " Feb 26 15:58:06 crc kubenswrapper[4907]: I0226 15:58:06.312872 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/0eb5d98d-ab9b-47bd-b8ca-27340a4d6a4f-trusted-ca-bundle\") pod \"0eb5d98d-ab9b-47bd-b8ca-27340a4d6a4f\" (UID: \"0eb5d98d-ab9b-47bd-b8ca-27340a4d6a4f\") " Feb 26 15:58:06 crc kubenswrapper[4907]: I0226 15:58:06.312909 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/0eb5d98d-ab9b-47bd-b8ca-27340a4d6a4f-service-ca\") pod \"0eb5d98d-ab9b-47bd-b8ca-27340a4d6a4f\" (UID: \"0eb5d98d-ab9b-47bd-b8ca-27340a4d6a4f\") " Feb 26 15:58:06 crc kubenswrapper[4907]: I0226 15:58:06.312945 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/0eb5d98d-ab9b-47bd-b8ca-27340a4d6a4f-oauth-serving-cert\") pod \"0eb5d98d-ab9b-47bd-b8ca-27340a4d6a4f\" (UID: \"0eb5d98d-ab9b-47bd-b8ca-27340a4d6a4f\") " Feb 26 15:58:06 crc kubenswrapper[4907]: I0226 15:58:06.312965 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume 
started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/0eb5d98d-ab9b-47bd-b8ca-27340a4d6a4f-console-config\") pod \"0eb5d98d-ab9b-47bd-b8ca-27340a4d6a4f\" (UID: \"0eb5d98d-ab9b-47bd-b8ca-27340a4d6a4f\") " Feb 26 15:58:06 crc kubenswrapper[4907]: I0226 15:58:06.312995 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/0eb5d98d-ab9b-47bd-b8ca-27340a4d6a4f-console-serving-cert\") pod \"0eb5d98d-ab9b-47bd-b8ca-27340a4d6a4f\" (UID: \"0eb5d98d-ab9b-47bd-b8ca-27340a4d6a4f\") " Feb 26 15:58:06 crc kubenswrapper[4907]: I0226 15:58:06.314093 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0eb5d98d-ab9b-47bd-b8ca-27340a4d6a4f-oauth-serving-cert" (OuterVolumeSpecName: "oauth-serving-cert") pod "0eb5d98d-ab9b-47bd-b8ca-27340a4d6a4f" (UID: "0eb5d98d-ab9b-47bd-b8ca-27340a4d6a4f"). InnerVolumeSpecName "oauth-serving-cert". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 26 15:58:06 crc kubenswrapper[4907]: I0226 15:58:06.314180 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0eb5d98d-ab9b-47bd-b8ca-27340a4d6a4f-service-ca" (OuterVolumeSpecName: "service-ca") pod "0eb5d98d-ab9b-47bd-b8ca-27340a4d6a4f" (UID: "0eb5d98d-ab9b-47bd-b8ca-27340a4d6a4f"). InnerVolumeSpecName "service-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 26 15:58:06 crc kubenswrapper[4907]: I0226 15:58:06.314328 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0eb5d98d-ab9b-47bd-b8ca-27340a4d6a4f-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "0eb5d98d-ab9b-47bd-b8ca-27340a4d6a4f" (UID: "0eb5d98d-ab9b-47bd-b8ca-27340a4d6a4f"). InnerVolumeSpecName "trusted-ca-bundle". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 26 15:58:06 crc kubenswrapper[4907]: I0226 15:58:06.314921 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0eb5d98d-ab9b-47bd-b8ca-27340a4d6a4f-console-config" (OuterVolumeSpecName: "console-config") pod "0eb5d98d-ab9b-47bd-b8ca-27340a4d6a4f" (UID: "0eb5d98d-ab9b-47bd-b8ca-27340a4d6a4f"). InnerVolumeSpecName "console-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 26 15:58:06 crc kubenswrapper[4907]: I0226 15:58:06.319787 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0eb5d98d-ab9b-47bd-b8ca-27340a4d6a4f-kube-api-access-cpn27" (OuterVolumeSpecName: "kube-api-access-cpn27") pod "0eb5d98d-ab9b-47bd-b8ca-27340a4d6a4f" (UID: "0eb5d98d-ab9b-47bd-b8ca-27340a4d6a4f"). InnerVolumeSpecName "kube-api-access-cpn27". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 26 15:58:06 crc kubenswrapper[4907]: I0226 15:58:06.320759 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0eb5d98d-ab9b-47bd-b8ca-27340a4d6a4f-console-oauth-config" (OuterVolumeSpecName: "console-oauth-config") pod "0eb5d98d-ab9b-47bd-b8ca-27340a4d6a4f" (UID: "0eb5d98d-ab9b-47bd-b8ca-27340a4d6a4f"). InnerVolumeSpecName "console-oauth-config". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 26 15:58:06 crc kubenswrapper[4907]: I0226 15:58:06.323389 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0eb5d98d-ab9b-47bd-b8ca-27340a4d6a4f-console-serving-cert" (OuterVolumeSpecName: "console-serving-cert") pod "0eb5d98d-ab9b-47bd-b8ca-27340a4d6a4f" (UID: "0eb5d98d-ab9b-47bd-b8ca-27340a4d6a4f"). InnerVolumeSpecName "console-serving-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 26 15:58:06 crc kubenswrapper[4907]: I0226 15:58:06.414250 4907 reconciler_common.go:293] "Volume detached for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/0eb5d98d-ab9b-47bd-b8ca-27340a4d6a4f-console-oauth-config\") on node \"crc\" DevicePath \"\"" Feb 26 15:58:06 crc kubenswrapper[4907]: I0226 15:58:06.414281 4907 reconciler_common.go:293] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/0eb5d98d-ab9b-47bd-b8ca-27340a4d6a4f-trusted-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 26 15:58:06 crc kubenswrapper[4907]: I0226 15:58:06.414290 4907 reconciler_common.go:293] "Volume detached for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/0eb5d98d-ab9b-47bd-b8ca-27340a4d6a4f-service-ca\") on node \"crc\" DevicePath \"\"" Feb 26 15:58:06 crc kubenswrapper[4907]: I0226 15:58:06.414297 4907 reconciler_common.go:293] "Volume detached for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/0eb5d98d-ab9b-47bd-b8ca-27340a4d6a4f-oauth-serving-cert\") on node \"crc\" DevicePath \"\"" Feb 26 15:58:06 crc kubenswrapper[4907]: I0226 15:58:06.414306 4907 reconciler_common.go:293] "Volume detached for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/0eb5d98d-ab9b-47bd-b8ca-27340a4d6a4f-console-config\") on node \"crc\" DevicePath \"\"" Feb 26 15:58:06 crc kubenswrapper[4907]: I0226 15:58:06.414314 4907 reconciler_common.go:293] "Volume detached for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/0eb5d98d-ab9b-47bd-b8ca-27340a4d6a4f-console-serving-cert\") on node \"crc\" DevicePath \"\"" Feb 26 15:58:06 crc kubenswrapper[4907]: I0226 15:58:06.414322 4907 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-cpn27\" (UniqueName: \"kubernetes.io/projected/0eb5d98d-ab9b-47bd-b8ca-27340a4d6a4f-kube-api-access-cpn27\") on node \"crc\" DevicePath \"\"" Feb 26 15:58:06 crc 
kubenswrapper[4907]: I0226 15:58:06.915567 4907 generic.go:334] "Generic (PLEG): container finished" podID="dea9effb-0863-442e-85b0-ac5bade13bdb" containerID="8ba0b61bd4147acf34735df5230fd8d99a93e20404c14e05d32a2d8e6d232943" exitCode=0 Feb 26 15:58:06 crc kubenswrapper[4907]: I0226 15:58:06.915637 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/d146760600e43041070ad4572d9c23f31a62e3aefc01a54998863bc5f4ss77t" event={"ID":"dea9effb-0863-442e-85b0-ac5bade13bdb","Type":"ContainerDied","Data":"8ba0b61bd4147acf34735df5230fd8d99a93e20404c14e05d32a2d8e6d232943"} Feb 26 15:58:06 crc kubenswrapper[4907]: I0226 15:58:06.918197 4907 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-f9d7485db-9lx5z_0eb5d98d-ab9b-47bd-b8ca-27340a4d6a4f/console/0.log" Feb 26 15:58:06 crc kubenswrapper[4907]: I0226 15:58:06.918238 4907 generic.go:334] "Generic (PLEG): container finished" podID="0eb5d98d-ab9b-47bd-b8ca-27340a4d6a4f" containerID="1e8efb94e29b2f5eec4ee383a5c98c3e9316e1d77b271be0304c238e08f98c9f" exitCode=2 Feb 26 15:58:06 crc kubenswrapper[4907]: I0226 15:58:06.918268 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-f9d7485db-9lx5z" event={"ID":"0eb5d98d-ab9b-47bd-b8ca-27340a4d6a4f","Type":"ContainerDied","Data":"1e8efb94e29b2f5eec4ee383a5c98c3e9316e1d77b271be0304c238e08f98c9f"} Feb 26 15:58:06 crc kubenswrapper[4907]: I0226 15:58:06.918288 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-f9d7485db-9lx5z" event={"ID":"0eb5d98d-ab9b-47bd-b8ca-27340a4d6a4f","Type":"ContainerDied","Data":"be31115ab65a29efb189d2a6c53c2b7546be6f8e45a326225af6e1c0cb24b3d4"} Feb 26 15:58:06 crc kubenswrapper[4907]: I0226 15:58:06.918304 4907 scope.go:117] "RemoveContainer" containerID="1e8efb94e29b2f5eec4ee383a5c98c3e9316e1d77b271be0304c238e08f98c9f" Feb 26 15:58:06 crc kubenswrapper[4907]: I0226 15:58:06.918420 4907 util.go:48] "No ready sandbox for pod can be 
found. Need to start a new one" pod="openshift-console/console-f9d7485db-9lx5z" Feb 26 15:58:06 crc kubenswrapper[4907]: I0226 15:58:06.941829 4907 scope.go:117] "RemoveContainer" containerID="1e8efb94e29b2f5eec4ee383a5c98c3e9316e1d77b271be0304c238e08f98c9f" Feb 26 15:58:06 crc kubenswrapper[4907]: E0226 15:58:06.942149 4907 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"1e8efb94e29b2f5eec4ee383a5c98c3e9316e1d77b271be0304c238e08f98c9f\": container with ID starting with 1e8efb94e29b2f5eec4ee383a5c98c3e9316e1d77b271be0304c238e08f98c9f not found: ID does not exist" containerID="1e8efb94e29b2f5eec4ee383a5c98c3e9316e1d77b271be0304c238e08f98c9f" Feb 26 15:58:06 crc kubenswrapper[4907]: I0226 15:58:06.942176 4907 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1e8efb94e29b2f5eec4ee383a5c98c3e9316e1d77b271be0304c238e08f98c9f"} err="failed to get container status \"1e8efb94e29b2f5eec4ee383a5c98c3e9316e1d77b271be0304c238e08f98c9f\": rpc error: code = NotFound desc = could not find container \"1e8efb94e29b2f5eec4ee383a5c98c3e9316e1d77b271be0304c238e08f98c9f\": container with ID starting with 1e8efb94e29b2f5eec4ee383a5c98c3e9316e1d77b271be0304c238e08f98c9f not found: ID does not exist" Feb 26 15:58:06 crc kubenswrapper[4907]: I0226 15:58:06.954451 4907 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-console/console-f9d7485db-9lx5z"] Feb 26 15:58:06 crc kubenswrapper[4907]: I0226 15:58:06.961726 4907 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-console/console-f9d7485db-9lx5z"] Feb 26 15:58:07 crc kubenswrapper[4907]: I0226 15:58:07.928713 4907 generic.go:334] "Generic (PLEG): container finished" podID="dea9effb-0863-442e-85b0-ac5bade13bdb" containerID="13af0a7532b50ac893a96956b7d7a57e4c951c741f7b731b1b0ef29c587868ca" exitCode=0 Feb 26 15:58:07 crc kubenswrapper[4907]: I0226 15:58:07.928776 4907 kubelet.go:2453] 
"SyncLoop (PLEG): event for pod" pod="openshift-marketplace/d146760600e43041070ad4572d9c23f31a62e3aefc01a54998863bc5f4ss77t" event={"ID":"dea9effb-0863-442e-85b0-ac5bade13bdb","Type":"ContainerDied","Data":"13af0a7532b50ac893a96956b7d7a57e4c951c741f7b731b1b0ef29c587868ca"} Feb 26 15:58:08 crc kubenswrapper[4907]: I0226 15:58:08.133863 4907 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0eb5d98d-ab9b-47bd-b8ca-27340a4d6a4f" path="/var/lib/kubelet/pods/0eb5d98d-ab9b-47bd-b8ca-27340a4d6a4f/volumes" Feb 26 15:58:09 crc kubenswrapper[4907]: I0226 15:58:09.214442 4907 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/d146760600e43041070ad4572d9c23f31a62e3aefc01a54998863bc5f4ss77t" Feb 26 15:58:09 crc kubenswrapper[4907]: I0226 15:58:09.252397 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/dea9effb-0863-442e-85b0-ac5bade13bdb-bundle\") pod \"dea9effb-0863-442e-85b0-ac5bade13bdb\" (UID: \"dea9effb-0863-442e-85b0-ac5bade13bdb\") " Feb 26 15:58:09 crc kubenswrapper[4907]: I0226 15:58:09.252522 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-sghrs\" (UniqueName: \"kubernetes.io/projected/dea9effb-0863-442e-85b0-ac5bade13bdb-kube-api-access-sghrs\") pod \"dea9effb-0863-442e-85b0-ac5bade13bdb\" (UID: \"dea9effb-0863-442e-85b0-ac5bade13bdb\") " Feb 26 15:58:09 crc kubenswrapper[4907]: I0226 15:58:09.252552 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/dea9effb-0863-442e-85b0-ac5bade13bdb-util\") pod \"dea9effb-0863-442e-85b0-ac5bade13bdb\" (UID: \"dea9effb-0863-442e-85b0-ac5bade13bdb\") " Feb 26 15:58:09 crc kubenswrapper[4907]: I0226 15:58:09.253676 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/empty-dir/dea9effb-0863-442e-85b0-ac5bade13bdb-bundle" (OuterVolumeSpecName: "bundle") pod "dea9effb-0863-442e-85b0-ac5bade13bdb" (UID: "dea9effb-0863-442e-85b0-ac5bade13bdb"). InnerVolumeSpecName "bundle". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 26 15:58:09 crc kubenswrapper[4907]: I0226 15:58:09.260810 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/dea9effb-0863-442e-85b0-ac5bade13bdb-kube-api-access-sghrs" (OuterVolumeSpecName: "kube-api-access-sghrs") pod "dea9effb-0863-442e-85b0-ac5bade13bdb" (UID: "dea9effb-0863-442e-85b0-ac5bade13bdb"). InnerVolumeSpecName "kube-api-access-sghrs". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 26 15:58:09 crc kubenswrapper[4907]: I0226 15:58:09.273080 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/dea9effb-0863-442e-85b0-ac5bade13bdb-util" (OuterVolumeSpecName: "util") pod "dea9effb-0863-442e-85b0-ac5bade13bdb" (UID: "dea9effb-0863-442e-85b0-ac5bade13bdb"). InnerVolumeSpecName "util". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 26 15:58:09 crc kubenswrapper[4907]: I0226 15:58:09.353755 4907 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-sghrs\" (UniqueName: \"kubernetes.io/projected/dea9effb-0863-442e-85b0-ac5bade13bdb-kube-api-access-sghrs\") on node \"crc\" DevicePath \"\"" Feb 26 15:58:09 crc kubenswrapper[4907]: I0226 15:58:09.353785 4907 reconciler_common.go:293] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/dea9effb-0863-442e-85b0-ac5bade13bdb-util\") on node \"crc\" DevicePath \"\"" Feb 26 15:58:09 crc kubenswrapper[4907]: I0226 15:58:09.353794 4907 reconciler_common.go:293] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/dea9effb-0863-442e-85b0-ac5bade13bdb-bundle\") on node \"crc\" DevicePath \"\"" Feb 26 15:58:09 crc kubenswrapper[4907]: I0226 15:58:09.945485 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/d146760600e43041070ad4572d9c23f31a62e3aefc01a54998863bc5f4ss77t" event={"ID":"dea9effb-0863-442e-85b0-ac5bade13bdb","Type":"ContainerDied","Data":"588a8455c32141e910f1343b9d67991d40508063be777e98c22b60fc1515308b"} Feb 26 15:58:09 crc kubenswrapper[4907]: I0226 15:58:09.946138 4907 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="588a8455c32141e910f1343b9d67991d40508063be777e98c22b60fc1515308b" Feb 26 15:58:09 crc kubenswrapper[4907]: I0226 15:58:09.946093 4907 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/d146760600e43041070ad4572d9c23f31a62e3aefc01a54998863bc5f4ss77t" Feb 26 15:58:18 crc kubenswrapper[4907]: I0226 15:58:18.498307 4907 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["metallb-system/metallb-operator-controller-manager-569bd5c9fd-jkl8q"] Feb 26 15:58:18 crc kubenswrapper[4907]: E0226 15:58:18.499042 4907 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ac3d86bc-1eb6-4d67-a762-2000e20fcbd5" containerName="oc" Feb 26 15:58:18 crc kubenswrapper[4907]: I0226 15:58:18.499055 4907 state_mem.go:107] "Deleted CPUSet assignment" podUID="ac3d86bc-1eb6-4d67-a762-2000e20fcbd5" containerName="oc" Feb 26 15:58:18 crc kubenswrapper[4907]: E0226 15:58:18.499066 4907 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="dea9effb-0863-442e-85b0-ac5bade13bdb" containerName="util" Feb 26 15:58:18 crc kubenswrapper[4907]: I0226 15:58:18.499071 4907 state_mem.go:107] "Deleted CPUSet assignment" podUID="dea9effb-0863-442e-85b0-ac5bade13bdb" containerName="util" Feb 26 15:58:18 crc kubenswrapper[4907]: E0226 15:58:18.499083 4907 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="dea9effb-0863-442e-85b0-ac5bade13bdb" containerName="pull" Feb 26 15:58:18 crc kubenswrapper[4907]: I0226 15:58:18.499090 4907 state_mem.go:107] "Deleted CPUSet assignment" podUID="dea9effb-0863-442e-85b0-ac5bade13bdb" containerName="pull" Feb 26 15:58:18 crc kubenswrapper[4907]: E0226 15:58:18.499099 4907 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="dea9effb-0863-442e-85b0-ac5bade13bdb" containerName="extract" Feb 26 15:58:18 crc kubenswrapper[4907]: I0226 15:58:18.499105 4907 state_mem.go:107] "Deleted CPUSet assignment" podUID="dea9effb-0863-442e-85b0-ac5bade13bdb" containerName="extract" Feb 26 15:58:18 crc kubenswrapper[4907]: E0226 15:58:18.499121 4907 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0eb5d98d-ab9b-47bd-b8ca-27340a4d6a4f" 
containerName="console" Feb 26 15:58:18 crc kubenswrapper[4907]: I0226 15:58:18.499128 4907 state_mem.go:107] "Deleted CPUSet assignment" podUID="0eb5d98d-ab9b-47bd-b8ca-27340a4d6a4f" containerName="console" Feb 26 15:58:18 crc kubenswrapper[4907]: I0226 15:58:18.499227 4907 memory_manager.go:354] "RemoveStaleState removing state" podUID="ac3d86bc-1eb6-4d67-a762-2000e20fcbd5" containerName="oc" Feb 26 15:58:18 crc kubenswrapper[4907]: I0226 15:58:18.499241 4907 memory_manager.go:354] "RemoveStaleState removing state" podUID="dea9effb-0863-442e-85b0-ac5bade13bdb" containerName="extract" Feb 26 15:58:18 crc kubenswrapper[4907]: I0226 15:58:18.499252 4907 memory_manager.go:354] "RemoveStaleState removing state" podUID="0eb5d98d-ab9b-47bd-b8ca-27340a4d6a4f" containerName="console" Feb 26 15:58:18 crc kubenswrapper[4907]: I0226 15:58:18.499734 4907 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/metallb-operator-controller-manager-569bd5c9fd-jkl8q" Feb 26 15:58:18 crc kubenswrapper[4907]: I0226 15:58:18.502356 4907 reflector.go:368] Caches populated for *v1.ConfigMap from object-"metallb-system"/"openshift-service-ca.crt" Feb 26 15:58:18 crc kubenswrapper[4907]: I0226 15:58:18.502783 4907 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"metallb-operator-controller-manager-service-cert" Feb 26 15:58:18 crc kubenswrapper[4907]: I0226 15:58:18.502808 4907 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"metallb-operator-webhook-server-cert" Feb 26 15:58:18 crc kubenswrapper[4907]: I0226 15:58:18.503201 4907 reflector.go:368] Caches populated for *v1.ConfigMap from object-"metallb-system"/"kube-root-ca.crt" Feb 26 15:58:18 crc kubenswrapper[4907]: I0226 15:58:18.512424 4907 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"manager-account-dockercfg-wfqpg" Feb 26 15:58:18 crc kubenswrapper[4907]: I0226 15:58:18.522260 4907 kubelet.go:2428] 
"SyncLoop UPDATE" source="api" pods=["metallb-system/metallb-operator-controller-manager-569bd5c9fd-jkl8q"] Feb 26 15:58:18 crc kubenswrapper[4907]: I0226 15:58:18.573879 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8kz65\" (UniqueName: \"kubernetes.io/projected/93c2e5d2-ce7d-47db-a76b-85f1988e1864-kube-api-access-8kz65\") pod \"metallb-operator-controller-manager-569bd5c9fd-jkl8q\" (UID: \"93c2e5d2-ce7d-47db-a76b-85f1988e1864\") " pod="metallb-system/metallb-operator-controller-manager-569bd5c9fd-jkl8q" Feb 26 15:58:18 crc kubenswrapper[4907]: I0226 15:58:18.573929 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/93c2e5d2-ce7d-47db-a76b-85f1988e1864-webhook-cert\") pod \"metallb-operator-controller-manager-569bd5c9fd-jkl8q\" (UID: \"93c2e5d2-ce7d-47db-a76b-85f1988e1864\") " pod="metallb-system/metallb-operator-controller-manager-569bd5c9fd-jkl8q" Feb 26 15:58:18 crc kubenswrapper[4907]: I0226 15:58:18.574058 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/93c2e5d2-ce7d-47db-a76b-85f1988e1864-apiservice-cert\") pod \"metallb-operator-controller-manager-569bd5c9fd-jkl8q\" (UID: \"93c2e5d2-ce7d-47db-a76b-85f1988e1864\") " pod="metallb-system/metallb-operator-controller-manager-569bd5c9fd-jkl8q" Feb 26 15:58:18 crc kubenswrapper[4907]: I0226 15:58:18.674729 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/93c2e5d2-ce7d-47db-a76b-85f1988e1864-webhook-cert\") pod \"metallb-operator-controller-manager-569bd5c9fd-jkl8q\" (UID: \"93c2e5d2-ce7d-47db-a76b-85f1988e1864\") " pod="metallb-system/metallb-operator-controller-manager-569bd5c9fd-jkl8q" Feb 26 15:58:18 crc kubenswrapper[4907]: I0226 15:58:18.675060 4907 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/93c2e5d2-ce7d-47db-a76b-85f1988e1864-apiservice-cert\") pod \"metallb-operator-controller-manager-569bd5c9fd-jkl8q\" (UID: \"93c2e5d2-ce7d-47db-a76b-85f1988e1864\") " pod="metallb-system/metallb-operator-controller-manager-569bd5c9fd-jkl8q" Feb 26 15:58:18 crc kubenswrapper[4907]: I0226 15:58:18.675186 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8kz65\" (UniqueName: \"kubernetes.io/projected/93c2e5d2-ce7d-47db-a76b-85f1988e1864-kube-api-access-8kz65\") pod \"metallb-operator-controller-manager-569bd5c9fd-jkl8q\" (UID: \"93c2e5d2-ce7d-47db-a76b-85f1988e1864\") " pod="metallb-system/metallb-operator-controller-manager-569bd5c9fd-jkl8q" Feb 26 15:58:18 crc kubenswrapper[4907]: I0226 15:58:18.682236 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/93c2e5d2-ce7d-47db-a76b-85f1988e1864-webhook-cert\") pod \"metallb-operator-controller-manager-569bd5c9fd-jkl8q\" (UID: \"93c2e5d2-ce7d-47db-a76b-85f1988e1864\") " pod="metallb-system/metallb-operator-controller-manager-569bd5c9fd-jkl8q" Feb 26 15:58:18 crc kubenswrapper[4907]: I0226 15:58:18.684278 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/93c2e5d2-ce7d-47db-a76b-85f1988e1864-apiservice-cert\") pod \"metallb-operator-controller-manager-569bd5c9fd-jkl8q\" (UID: \"93c2e5d2-ce7d-47db-a76b-85f1988e1864\") " pod="metallb-system/metallb-operator-controller-manager-569bd5c9fd-jkl8q" Feb 26 15:58:18 crc kubenswrapper[4907]: I0226 15:58:18.696327 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8kz65\" (UniqueName: \"kubernetes.io/projected/93c2e5d2-ce7d-47db-a76b-85f1988e1864-kube-api-access-8kz65\") pod 
\"metallb-operator-controller-manager-569bd5c9fd-jkl8q\" (UID: \"93c2e5d2-ce7d-47db-a76b-85f1988e1864\") " pod="metallb-system/metallb-operator-controller-manager-569bd5c9fd-jkl8q" Feb 26 15:58:18 crc kubenswrapper[4907]: I0226 15:58:18.812621 4907 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/metallb-operator-controller-manager-569bd5c9fd-jkl8q" Feb 26 15:58:18 crc kubenswrapper[4907]: I0226 15:58:18.864982 4907 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["metallb-system/metallb-operator-webhook-server-5448c47665-smhgw"] Feb 26 15:58:18 crc kubenswrapper[4907]: I0226 15:58:18.865693 4907 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/metallb-operator-webhook-server-5448c47665-smhgw" Feb 26 15:58:18 crc kubenswrapper[4907]: I0226 15:58:18.869024 4907 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"metallb-webhook-cert" Feb 26 15:58:18 crc kubenswrapper[4907]: I0226 15:58:18.869239 4907 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"metallb-operator-webhook-server-service-cert" Feb 26 15:58:18 crc kubenswrapper[4907]: I0226 15:58:18.869369 4907 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"controller-dockercfg-2ppxv" Feb 26 15:58:18 crc kubenswrapper[4907]: I0226 15:58:18.895198 4907 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/metallb-operator-webhook-server-5448c47665-smhgw"] Feb 26 15:58:18 crc kubenswrapper[4907]: I0226 15:58:18.977648 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/04570909-662d-4a9e-9f62-fbca4b92bfa7-webhook-cert\") pod \"metallb-operator-webhook-server-5448c47665-smhgw\" (UID: \"04570909-662d-4a9e-9f62-fbca4b92bfa7\") " pod="metallb-system/metallb-operator-webhook-server-5448c47665-smhgw" Feb 26 15:58:18 crc 
kubenswrapper[4907]: I0226 15:58:18.977698 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/04570909-662d-4a9e-9f62-fbca4b92bfa7-apiservice-cert\") pod \"metallb-operator-webhook-server-5448c47665-smhgw\" (UID: \"04570909-662d-4a9e-9f62-fbca4b92bfa7\") " pod="metallb-system/metallb-operator-webhook-server-5448c47665-smhgw" Feb 26 15:58:18 crc kubenswrapper[4907]: I0226 15:58:18.977734 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lk2gg\" (UniqueName: \"kubernetes.io/projected/04570909-662d-4a9e-9f62-fbca4b92bfa7-kube-api-access-lk2gg\") pod \"metallb-operator-webhook-server-5448c47665-smhgw\" (UID: \"04570909-662d-4a9e-9f62-fbca4b92bfa7\") " pod="metallb-system/metallb-operator-webhook-server-5448c47665-smhgw" Feb 26 15:58:19 crc kubenswrapper[4907]: I0226 15:58:19.079016 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/04570909-662d-4a9e-9f62-fbca4b92bfa7-apiservice-cert\") pod \"metallb-operator-webhook-server-5448c47665-smhgw\" (UID: \"04570909-662d-4a9e-9f62-fbca4b92bfa7\") " pod="metallb-system/metallb-operator-webhook-server-5448c47665-smhgw" Feb 26 15:58:19 crc kubenswrapper[4907]: I0226 15:58:19.079377 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lk2gg\" (UniqueName: \"kubernetes.io/projected/04570909-662d-4a9e-9f62-fbca4b92bfa7-kube-api-access-lk2gg\") pod \"metallb-operator-webhook-server-5448c47665-smhgw\" (UID: \"04570909-662d-4a9e-9f62-fbca4b92bfa7\") " pod="metallb-system/metallb-operator-webhook-server-5448c47665-smhgw" Feb 26 15:58:19 crc kubenswrapper[4907]: I0226 15:58:19.081256 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-cert\" (UniqueName: 
\"kubernetes.io/secret/04570909-662d-4a9e-9f62-fbca4b92bfa7-webhook-cert\") pod \"metallb-operator-webhook-server-5448c47665-smhgw\" (UID: \"04570909-662d-4a9e-9f62-fbca4b92bfa7\") " pod="metallb-system/metallb-operator-webhook-server-5448c47665-smhgw" Feb 26 15:58:19 crc kubenswrapper[4907]: I0226 15:58:19.095027 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/04570909-662d-4a9e-9f62-fbca4b92bfa7-webhook-cert\") pod \"metallb-operator-webhook-server-5448c47665-smhgw\" (UID: \"04570909-662d-4a9e-9f62-fbca4b92bfa7\") " pod="metallb-system/metallb-operator-webhook-server-5448c47665-smhgw" Feb 26 15:58:19 crc kubenswrapper[4907]: I0226 15:58:19.106473 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/04570909-662d-4a9e-9f62-fbca4b92bfa7-apiservice-cert\") pod \"metallb-operator-webhook-server-5448c47665-smhgw\" (UID: \"04570909-662d-4a9e-9f62-fbca4b92bfa7\") " pod="metallb-system/metallb-operator-webhook-server-5448c47665-smhgw" Feb 26 15:58:19 crc kubenswrapper[4907]: I0226 15:58:19.120357 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lk2gg\" (UniqueName: \"kubernetes.io/projected/04570909-662d-4a9e-9f62-fbca4b92bfa7-kube-api-access-lk2gg\") pod \"metallb-operator-webhook-server-5448c47665-smhgw\" (UID: \"04570909-662d-4a9e-9f62-fbca4b92bfa7\") " pod="metallb-system/metallb-operator-webhook-server-5448c47665-smhgw" Feb 26 15:58:19 crc kubenswrapper[4907]: I0226 15:58:19.180709 4907 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="metallb-system/metallb-operator-webhook-server-5448c47665-smhgw" Feb 26 15:58:19 crc kubenswrapper[4907]: I0226 15:58:19.184816 4907 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/metallb-operator-controller-manager-569bd5c9fd-jkl8q"] Feb 26 15:58:19 crc kubenswrapper[4907]: I0226 15:58:19.505057 4907 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/metallb-operator-webhook-server-5448c47665-smhgw"] Feb 26 15:58:19 crc kubenswrapper[4907]: W0226 15:58:19.508079 4907 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod04570909_662d_4a9e_9f62_fbca4b92bfa7.slice/crio-7ef68a95b3da9393e5d4d96376e8a78840f145a8d278d35c369166bb6fc9b630 WatchSource:0}: Error finding container 7ef68a95b3da9393e5d4d96376e8a78840f145a8d278d35c369166bb6fc9b630: Status 404 returned error can't find the container with id 7ef68a95b3da9393e5d4d96376e8a78840f145a8d278d35c369166bb6fc9b630 Feb 26 15:58:20 crc kubenswrapper[4907]: I0226 15:58:20.033429 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/metallb-operator-controller-manager-569bd5c9fd-jkl8q" event={"ID":"93c2e5d2-ce7d-47db-a76b-85f1988e1864","Type":"ContainerStarted","Data":"cd47e8ca8ef2e72aaba54671a3ce85297fa4ba015cf048b11d92f4e7e775438e"} Feb 26 15:58:20 crc kubenswrapper[4907]: I0226 15:58:20.037015 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/metallb-operator-webhook-server-5448c47665-smhgw" event={"ID":"04570909-662d-4a9e-9f62-fbca4b92bfa7","Type":"ContainerStarted","Data":"7ef68a95b3da9393e5d4d96376e8a78840f145a8d278d35c369166bb6fc9b630"} Feb 26 15:58:23 crc kubenswrapper[4907]: I0226 15:58:23.865160 4907 scope.go:117] "RemoveContainer" containerID="1a0c054792c5c726f79413f5f09de926e52ff9f77ea5855b8d3c35b09b90a4c4" Feb 26 15:58:25 crc kubenswrapper[4907]: I0226 15:58:25.063667 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="metallb-system/metallb-operator-controller-manager-569bd5c9fd-jkl8q" event={"ID":"93c2e5d2-ce7d-47db-a76b-85f1988e1864","Type":"ContainerStarted","Data":"ccfbf488603dcf22372e464c9f6d37aef406a69da5951856650d4c7b3de82d34"} Feb 26 15:58:25 crc kubenswrapper[4907]: I0226 15:58:25.063981 4907 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="metallb-system/metallb-operator-controller-manager-569bd5c9fd-jkl8q" Feb 26 15:58:25 crc kubenswrapper[4907]: I0226 15:58:25.065418 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/metallb-operator-webhook-server-5448c47665-smhgw" event={"ID":"04570909-662d-4a9e-9f62-fbca4b92bfa7","Type":"ContainerStarted","Data":"7f26d7f004450ef18e238d6a28eb62bc30b3e65b85387305f397fbe253912243"} Feb 26 15:58:25 crc kubenswrapper[4907]: I0226 15:58:25.065971 4907 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="metallb-system/metallb-operator-webhook-server-5448c47665-smhgw" Feb 26 15:58:25 crc kubenswrapper[4907]: I0226 15:58:25.085784 4907 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="metallb-system/metallb-operator-controller-manager-569bd5c9fd-jkl8q" podStartSLOduration=1.7889219509999998 podStartE2EDuration="7.085765213s" podCreationTimestamp="2026-02-26 15:58:18 +0000 UTC" firstStartedPulling="2026-02-26 15:58:19.199948328 +0000 UTC m=+961.718510177" lastFinishedPulling="2026-02-26 15:58:24.49679159 +0000 UTC m=+967.015353439" observedRunningTime="2026-02-26 15:58:25.083836787 +0000 UTC m=+967.602398656" watchObservedRunningTime="2026-02-26 15:58:25.085765213 +0000 UTC m=+967.604327062" Feb 26 15:58:25 crc kubenswrapper[4907]: I0226 15:58:25.121256 4907 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="metallb-system/metallb-operator-webhook-server-5448c47665-smhgw" podStartSLOduration=2.122337312 podStartE2EDuration="7.121237282s" podCreationTimestamp="2026-02-26 15:58:18 +0000 UTC" firstStartedPulling="2026-02-26 
15:58:19.510662898 +0000 UTC m=+962.029224747" lastFinishedPulling="2026-02-26 15:58:24.509562868 +0000 UTC m=+967.028124717" observedRunningTime="2026-02-26 15:58:25.120645947 +0000 UTC m=+967.639207796" watchObservedRunningTime="2026-02-26 15:58:25.121237282 +0000 UTC m=+967.639799131" Feb 26 15:58:39 crc kubenswrapper[4907]: I0226 15:58:39.186637 4907 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="metallb-system/metallb-operator-webhook-server-5448c47665-smhgw" Feb 26 15:58:48 crc kubenswrapper[4907]: I0226 15:58:48.530828 4907 patch_prober.go:28] interesting pod/machine-config-daemon-v5ng6 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 26 15:58:48 crc kubenswrapper[4907]: I0226 15:58:48.531496 4907 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-v5ng6" podUID="917eebf3-db36-47b8-af0a-b80d042fddab" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 26 15:58:58 crc kubenswrapper[4907]: I0226 15:58:58.816011 4907 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="metallb-system/metallb-operator-controller-manager-569bd5c9fd-jkl8q" Feb 26 15:58:59 crc kubenswrapper[4907]: I0226 15:58:59.617781 4907 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["metallb-system/frr-k8s-2kml2"] Feb 26 15:58:59 crc kubenswrapper[4907]: I0226 15:58:59.620011 4907 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="metallb-system/frr-k8s-2kml2" Feb 26 15:58:59 crc kubenswrapper[4907]: I0226 15:58:59.630028 4907 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["metallb-system/frr-k8s-webhook-server-7f989f654f-8kcg5"] Feb 26 15:58:59 crc kubenswrapper[4907]: I0226 15:58:59.630642 4907 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/frr-k8s-webhook-server-7f989f654f-8kcg5" Feb 26 15:58:59 crc kubenswrapper[4907]: I0226 15:58:59.633696 4907 reflector.go:368] Caches populated for *v1.ConfigMap from object-"metallb-system"/"frr-startup" Feb 26 15:58:59 crc kubenswrapper[4907]: I0226 15:58:59.634545 4907 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"frr-k8s-daemon-dockercfg-65klr" Feb 26 15:58:59 crc kubenswrapper[4907]: I0226 15:58:59.634673 4907 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"frr-k8s-certs-secret" Feb 26 15:58:59 crc kubenswrapper[4907]: I0226 15:58:59.634797 4907 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"frr-k8s-webhook-server-cert" Feb 26 15:58:59 crc kubenswrapper[4907]: I0226 15:58:59.645301 4907 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/frr-k8s-webhook-server-7f989f654f-8kcg5"] Feb 26 15:58:59 crc kubenswrapper[4907]: I0226 15:58:59.728455 4907 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["metallb-system/speaker-7hcct"] Feb 26 15:58:59 crc kubenswrapper[4907]: I0226 15:58:59.729282 4907 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="metallb-system/speaker-7hcct" Feb 26 15:58:59 crc kubenswrapper[4907]: I0226 15:58:59.730800 4907 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"metallb-memberlist" Feb 26 15:58:59 crc kubenswrapper[4907]: I0226 15:58:59.731198 4907 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"speaker-certs-secret" Feb 26 15:58:59 crc kubenswrapper[4907]: I0226 15:58:59.731758 4907 reflector.go:368] Caches populated for *v1.ConfigMap from object-"metallb-system"/"metallb-excludel2" Feb 26 15:58:59 crc kubenswrapper[4907]: I0226 15:58:59.732123 4907 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"speaker-dockercfg-z5t4z" Feb 26 15:58:59 crc kubenswrapper[4907]: I0226 15:58:59.733245 4907 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["metallb-system/controller-86ddb6bd46-qwvw9"] Feb 26 15:58:59 crc kubenswrapper[4907]: I0226 15:58:59.734143 4907 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="metallb-system/controller-86ddb6bd46-qwvw9" Feb 26 15:58:59 crc kubenswrapper[4907]: I0226 15:58:59.743798 4907 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"controller-certs-secret" Feb 26 15:58:59 crc kubenswrapper[4907]: I0226 15:58:59.764089 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"frr-conf\" (UniqueName: \"kubernetes.io/empty-dir/aedab463-da2b-4bf1-a67d-16439f225983-frr-conf\") pod \"frr-k8s-2kml2\" (UID: \"aedab463-da2b-4bf1-a67d-16439f225983\") " pod="metallb-system/frr-k8s-2kml2" Feb 26 15:58:59 crc kubenswrapper[4907]: I0226 15:58:59.764181 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"frr-startup\" (UniqueName: \"kubernetes.io/configmap/aedab463-da2b-4bf1-a67d-16439f225983-frr-startup\") pod \"frr-k8s-2kml2\" (UID: \"aedab463-da2b-4bf1-a67d-16439f225983\") " pod="metallb-system/frr-k8s-2kml2" Feb 26 15:58:59 crc kubenswrapper[4907]: I0226 15:58:59.764209 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"reloader\" (UniqueName: \"kubernetes.io/empty-dir/aedab463-da2b-4bf1-a67d-16439f225983-reloader\") pod \"frr-k8s-2kml2\" (UID: \"aedab463-da2b-4bf1-a67d-16439f225983\") " pod="metallb-system/frr-k8s-2kml2" Feb 26 15:58:59 crc kubenswrapper[4907]: I0226 15:58:59.764240 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/e3fa6e66-60dc-44b8-a6a6-47a7ec18424f-cert\") pod \"frr-k8s-webhook-server-7f989f654f-8kcg5\" (UID: \"e3fa6e66-60dc-44b8-a6a6-47a7ec18424f\") " pod="metallb-system/frr-k8s-webhook-server-7f989f654f-8kcg5" Feb 26 15:58:59 crc kubenswrapper[4907]: I0226 15:58:59.764263 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: 
\"kubernetes.io/secret/aedab463-da2b-4bf1-a67d-16439f225983-metrics-certs\") pod \"frr-k8s-2kml2\" (UID: \"aedab463-da2b-4bf1-a67d-16439f225983\") " pod="metallb-system/frr-k8s-2kml2" Feb 26 15:58:59 crc kubenswrapper[4907]: I0226 15:58:59.764304 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jk4cm\" (UniqueName: \"kubernetes.io/projected/aedab463-da2b-4bf1-a67d-16439f225983-kube-api-access-jk4cm\") pod \"frr-k8s-2kml2\" (UID: \"aedab463-da2b-4bf1-a67d-16439f225983\") " pod="metallb-system/frr-k8s-2kml2" Feb 26 15:58:59 crc kubenswrapper[4907]: I0226 15:58:59.764332 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pjnp4\" (UniqueName: \"kubernetes.io/projected/e3fa6e66-60dc-44b8-a6a6-47a7ec18424f-kube-api-access-pjnp4\") pod \"frr-k8s-webhook-server-7f989f654f-8kcg5\" (UID: \"e3fa6e66-60dc-44b8-a6a6-47a7ec18424f\") " pod="metallb-system/frr-k8s-webhook-server-7f989f654f-8kcg5" Feb 26 15:58:59 crc kubenswrapper[4907]: I0226 15:58:59.764363 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics\" (UniqueName: \"kubernetes.io/empty-dir/aedab463-da2b-4bf1-a67d-16439f225983-metrics\") pod \"frr-k8s-2kml2\" (UID: \"aedab463-da2b-4bf1-a67d-16439f225983\") " pod="metallb-system/frr-k8s-2kml2" Feb 26 15:58:59 crc kubenswrapper[4907]: I0226 15:58:59.764390 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"frr-sockets\" (UniqueName: \"kubernetes.io/empty-dir/aedab463-da2b-4bf1-a67d-16439f225983-frr-sockets\") pod \"frr-k8s-2kml2\" (UID: \"aedab463-da2b-4bf1-a67d-16439f225983\") " pod="metallb-system/frr-k8s-2kml2" Feb 26 15:58:59 crc kubenswrapper[4907]: I0226 15:58:59.780098 4907 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/controller-86ddb6bd46-qwvw9"] Feb 26 15:58:59 crc kubenswrapper[4907]: I0226 
15:58:59.865989 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pjnp4\" (UniqueName: \"kubernetes.io/projected/e3fa6e66-60dc-44b8-a6a6-47a7ec18424f-kube-api-access-pjnp4\") pod \"frr-k8s-webhook-server-7f989f654f-8kcg5\" (UID: \"e3fa6e66-60dc-44b8-a6a6-47a7ec18424f\") " pod="metallb-system/frr-k8s-webhook-server-7f989f654f-8kcg5" Feb 26 15:58:59 crc kubenswrapper[4907]: I0226 15:58:59.866040 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jk4cm\" (UniqueName: \"kubernetes.io/projected/aedab463-da2b-4bf1-a67d-16439f225983-kube-api-access-jk4cm\") pod \"frr-k8s-2kml2\" (UID: \"aedab463-da2b-4bf1-a67d-16439f225983\") " pod="metallb-system/frr-k8s-2kml2" Feb 26 15:58:59 crc kubenswrapper[4907]: I0226 15:58:59.866078 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics\" (UniqueName: \"kubernetes.io/empty-dir/aedab463-da2b-4bf1-a67d-16439f225983-metrics\") pod \"frr-k8s-2kml2\" (UID: \"aedab463-da2b-4bf1-a67d-16439f225983\") " pod="metallb-system/frr-k8s-2kml2" Feb 26 15:58:59 crc kubenswrapper[4907]: I0226 15:58:59.866104 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"frr-sockets\" (UniqueName: \"kubernetes.io/empty-dir/aedab463-da2b-4bf1-a67d-16439f225983-frr-sockets\") pod \"frr-k8s-2kml2\" (UID: \"aedab463-da2b-4bf1-a67d-16439f225983\") " pod="metallb-system/frr-k8s-2kml2" Feb 26 15:58:59 crc kubenswrapper[4907]: I0226 15:58:59.866142 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/05e6312b-9683-44bf-9368-cb234744fd33-cert\") pod \"controller-86ddb6bd46-qwvw9\" (UID: \"05e6312b-9683-44bf-9368-cb234744fd33\") " pod="metallb-system/controller-86ddb6bd46-qwvw9" Feb 26 15:58:59 crc kubenswrapper[4907]: I0226 15:58:59.866165 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for 
volume \"frr-conf\" (UniqueName: \"kubernetes.io/empty-dir/aedab463-da2b-4bf1-a67d-16439f225983-frr-conf\") pod \"frr-k8s-2kml2\" (UID: \"aedab463-da2b-4bf1-a67d-16439f225983\") " pod="metallb-system/frr-k8s-2kml2" Feb 26 15:58:59 crc kubenswrapper[4907]: I0226 15:58:59.866194 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/05e6312b-9683-44bf-9368-cb234744fd33-metrics-certs\") pod \"controller-86ddb6bd46-qwvw9\" (UID: \"05e6312b-9683-44bf-9368-cb234744fd33\") " pod="metallb-system/controller-86ddb6bd46-qwvw9" Feb 26 15:58:59 crc kubenswrapper[4907]: I0226 15:58:59.866230 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"frr-startup\" (UniqueName: \"kubernetes.io/configmap/aedab463-da2b-4bf1-a67d-16439f225983-frr-startup\") pod \"frr-k8s-2kml2\" (UID: \"aedab463-da2b-4bf1-a67d-16439f225983\") " pod="metallb-system/frr-k8s-2kml2" Feb 26 15:58:59 crc kubenswrapper[4907]: I0226 15:58:59.866255 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"reloader\" (UniqueName: \"kubernetes.io/empty-dir/aedab463-da2b-4bf1-a67d-16439f225983-reloader\") pod \"frr-k8s-2kml2\" (UID: \"aedab463-da2b-4bf1-a67d-16439f225983\") " pod="metallb-system/frr-k8s-2kml2" Feb 26 15:58:59 crc kubenswrapper[4907]: I0226 15:58:59.866281 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dj68h\" (UniqueName: \"kubernetes.io/projected/05e6312b-9683-44bf-9368-cb234744fd33-kube-api-access-dj68h\") pod \"controller-86ddb6bd46-qwvw9\" (UID: \"05e6312b-9683-44bf-9368-cb234744fd33\") " pod="metallb-system/controller-86ddb6bd46-qwvw9" Feb 26 15:58:59 crc kubenswrapper[4907]: I0226 15:58:59.866304 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/e3fa6e66-60dc-44b8-a6a6-47a7ec18424f-cert\") pod 
\"frr-k8s-webhook-server-7f989f654f-8kcg5\" (UID: \"e3fa6e66-60dc-44b8-a6a6-47a7ec18424f\") " pod="metallb-system/frr-k8s-webhook-server-7f989f654f-8kcg5" Feb 26 15:58:59 crc kubenswrapper[4907]: I0226 15:58:59.866322 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/aedab463-da2b-4bf1-a67d-16439f225983-metrics-certs\") pod \"frr-k8s-2kml2\" (UID: \"aedab463-da2b-4bf1-a67d-16439f225983\") " pod="metallb-system/frr-k8s-2kml2" Feb 26 15:58:59 crc kubenswrapper[4907]: I0226 15:58:59.866345 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"memberlist\" (UniqueName: \"kubernetes.io/secret/b4841c1c-c56d-4abe-b6a7-92211b5c4a19-memberlist\") pod \"speaker-7hcct\" (UID: \"b4841c1c-c56d-4abe-b6a7-92211b5c4a19\") " pod="metallb-system/speaker-7hcct" Feb 26 15:58:59 crc kubenswrapper[4907]: I0226 15:58:59.866382 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metallb-excludel2\" (UniqueName: \"kubernetes.io/configmap/b4841c1c-c56d-4abe-b6a7-92211b5c4a19-metallb-excludel2\") pod \"speaker-7hcct\" (UID: \"b4841c1c-c56d-4abe-b6a7-92211b5c4a19\") " pod="metallb-system/speaker-7hcct" Feb 26 15:58:59 crc kubenswrapper[4907]: I0226 15:58:59.866417 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/b4841c1c-c56d-4abe-b6a7-92211b5c4a19-metrics-certs\") pod \"speaker-7hcct\" (UID: \"b4841c1c-c56d-4abe-b6a7-92211b5c4a19\") " pod="metallb-system/speaker-7hcct" Feb 26 15:58:59 crc kubenswrapper[4907]: I0226 15:58:59.866438 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6qg4k\" (UniqueName: \"kubernetes.io/projected/b4841c1c-c56d-4abe-b6a7-92211b5c4a19-kube-api-access-6qg4k\") pod \"speaker-7hcct\" (UID: 
\"b4841c1c-c56d-4abe-b6a7-92211b5c4a19\") " pod="metallb-system/speaker-7hcct" Feb 26 15:58:59 crc kubenswrapper[4907]: I0226 15:58:59.866806 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics\" (UniqueName: \"kubernetes.io/empty-dir/aedab463-da2b-4bf1-a67d-16439f225983-metrics\") pod \"frr-k8s-2kml2\" (UID: \"aedab463-da2b-4bf1-a67d-16439f225983\") " pod="metallb-system/frr-k8s-2kml2" Feb 26 15:58:59 crc kubenswrapper[4907]: I0226 15:58:59.867058 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"frr-conf\" (UniqueName: \"kubernetes.io/empty-dir/aedab463-da2b-4bf1-a67d-16439f225983-frr-conf\") pod \"frr-k8s-2kml2\" (UID: \"aedab463-da2b-4bf1-a67d-16439f225983\") " pod="metallb-system/frr-k8s-2kml2" Feb 26 15:58:59 crc kubenswrapper[4907]: I0226 15:58:59.867060 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"frr-sockets\" (UniqueName: \"kubernetes.io/empty-dir/aedab463-da2b-4bf1-a67d-16439f225983-frr-sockets\") pod \"frr-k8s-2kml2\" (UID: \"aedab463-da2b-4bf1-a67d-16439f225983\") " pod="metallb-system/frr-k8s-2kml2" Feb 26 15:58:59 crc kubenswrapper[4907]: E0226 15:58:59.867126 4907 secret.go:188] Couldn't get secret metallb-system/frr-k8s-certs-secret: secret "frr-k8s-certs-secret" not found Feb 26 15:58:59 crc kubenswrapper[4907]: E0226 15:58:59.867161 4907 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/aedab463-da2b-4bf1-a67d-16439f225983-metrics-certs podName:aedab463-da2b-4bf1-a67d-16439f225983 nodeName:}" failed. No retries permitted until 2026-02-26 15:59:00.367149785 +0000 UTC m=+1002.885711634 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/aedab463-da2b-4bf1-a67d-16439f225983-metrics-certs") pod "frr-k8s-2kml2" (UID: "aedab463-da2b-4bf1-a67d-16439f225983") : secret "frr-k8s-certs-secret" not found Feb 26 15:58:59 crc kubenswrapper[4907]: I0226 15:58:59.867395 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"reloader\" (UniqueName: \"kubernetes.io/empty-dir/aedab463-da2b-4bf1-a67d-16439f225983-reloader\") pod \"frr-k8s-2kml2\" (UID: \"aedab463-da2b-4bf1-a67d-16439f225983\") " pod="metallb-system/frr-k8s-2kml2" Feb 26 15:58:59 crc kubenswrapper[4907]: I0226 15:58:59.867453 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"frr-startup\" (UniqueName: \"kubernetes.io/configmap/aedab463-da2b-4bf1-a67d-16439f225983-frr-startup\") pod \"frr-k8s-2kml2\" (UID: \"aedab463-da2b-4bf1-a67d-16439f225983\") " pod="metallb-system/frr-k8s-2kml2" Feb 26 15:58:59 crc kubenswrapper[4907]: I0226 15:58:59.873392 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/e3fa6e66-60dc-44b8-a6a6-47a7ec18424f-cert\") pod \"frr-k8s-webhook-server-7f989f654f-8kcg5\" (UID: \"e3fa6e66-60dc-44b8-a6a6-47a7ec18424f\") " pod="metallb-system/frr-k8s-webhook-server-7f989f654f-8kcg5" Feb 26 15:58:59 crc kubenswrapper[4907]: I0226 15:58:59.888701 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pjnp4\" (UniqueName: \"kubernetes.io/projected/e3fa6e66-60dc-44b8-a6a6-47a7ec18424f-kube-api-access-pjnp4\") pod \"frr-k8s-webhook-server-7f989f654f-8kcg5\" (UID: \"e3fa6e66-60dc-44b8-a6a6-47a7ec18424f\") " pod="metallb-system/frr-k8s-webhook-server-7f989f654f-8kcg5" Feb 26 15:58:59 crc kubenswrapper[4907]: I0226 15:58:59.900942 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jk4cm\" (UniqueName: 
\"kubernetes.io/projected/aedab463-da2b-4bf1-a67d-16439f225983-kube-api-access-jk4cm\") pod \"frr-k8s-2kml2\" (UID: \"aedab463-da2b-4bf1-a67d-16439f225983\") " pod="metallb-system/frr-k8s-2kml2" Feb 26 15:58:59 crc kubenswrapper[4907]: I0226 15:58:59.951056 4907 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/frr-k8s-webhook-server-7f989f654f-8kcg5" Feb 26 15:58:59 crc kubenswrapper[4907]: I0226 15:58:59.967821 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/05e6312b-9683-44bf-9368-cb234744fd33-cert\") pod \"controller-86ddb6bd46-qwvw9\" (UID: \"05e6312b-9683-44bf-9368-cb234744fd33\") " pod="metallb-system/controller-86ddb6bd46-qwvw9" Feb 26 15:58:59 crc kubenswrapper[4907]: I0226 15:58:59.968097 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/05e6312b-9683-44bf-9368-cb234744fd33-metrics-certs\") pod \"controller-86ddb6bd46-qwvw9\" (UID: \"05e6312b-9683-44bf-9368-cb234744fd33\") " pod="metallb-system/controller-86ddb6bd46-qwvw9" Feb 26 15:58:59 crc kubenswrapper[4907]: I0226 15:58:59.968243 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dj68h\" (UniqueName: \"kubernetes.io/projected/05e6312b-9683-44bf-9368-cb234744fd33-kube-api-access-dj68h\") pod \"controller-86ddb6bd46-qwvw9\" (UID: \"05e6312b-9683-44bf-9368-cb234744fd33\") " pod="metallb-system/controller-86ddb6bd46-qwvw9" Feb 26 15:58:59 crc kubenswrapper[4907]: I0226 15:58:59.968356 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"memberlist\" (UniqueName: \"kubernetes.io/secret/b4841c1c-c56d-4abe-b6a7-92211b5c4a19-memberlist\") pod \"speaker-7hcct\" (UID: \"b4841c1c-c56d-4abe-b6a7-92211b5c4a19\") " pod="metallb-system/speaker-7hcct" Feb 26 15:58:59 crc kubenswrapper[4907]: I0226 15:58:59.968465 4907 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metallb-excludel2\" (UniqueName: \"kubernetes.io/configmap/b4841c1c-c56d-4abe-b6a7-92211b5c4a19-metallb-excludel2\") pod \"speaker-7hcct\" (UID: \"b4841c1c-c56d-4abe-b6a7-92211b5c4a19\") " pod="metallb-system/speaker-7hcct" Feb 26 15:58:59 crc kubenswrapper[4907]: I0226 15:58:59.968559 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/b4841c1c-c56d-4abe-b6a7-92211b5c4a19-metrics-certs\") pod \"speaker-7hcct\" (UID: \"b4841c1c-c56d-4abe-b6a7-92211b5c4a19\") " pod="metallb-system/speaker-7hcct" Feb 26 15:58:59 crc kubenswrapper[4907]: I0226 15:58:59.968680 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6qg4k\" (UniqueName: \"kubernetes.io/projected/b4841c1c-c56d-4abe-b6a7-92211b5c4a19-kube-api-access-6qg4k\") pod \"speaker-7hcct\" (UID: \"b4841c1c-c56d-4abe-b6a7-92211b5c4a19\") " pod="metallb-system/speaker-7hcct" Feb 26 15:58:59 crc kubenswrapper[4907]: E0226 15:58:59.968645 4907 secret.go:188] Couldn't get secret metallb-system/metallb-memberlist: secret "metallb-memberlist" not found Feb 26 15:58:59 crc kubenswrapper[4907]: E0226 15:58:59.969010 4907 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/b4841c1c-c56d-4abe-b6a7-92211b5c4a19-memberlist podName:b4841c1c-c56d-4abe-b6a7-92211b5c4a19 nodeName:}" failed. No retries permitted until 2026-02-26 15:59:00.468988259 +0000 UTC m=+1002.987550108 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "memberlist" (UniqueName: "kubernetes.io/secret/b4841c1c-c56d-4abe-b6a7-92211b5c4a19-memberlist") pod "speaker-7hcct" (UID: "b4841c1c-c56d-4abe-b6a7-92211b5c4a19") : secret "metallb-memberlist" not found Feb 26 15:58:59 crc kubenswrapper[4907]: I0226 15:58:59.969209 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metallb-excludel2\" (UniqueName: \"kubernetes.io/configmap/b4841c1c-c56d-4abe-b6a7-92211b5c4a19-metallb-excludel2\") pod \"speaker-7hcct\" (UID: \"b4841c1c-c56d-4abe-b6a7-92211b5c4a19\") " pod="metallb-system/speaker-7hcct" Feb 26 15:58:59 crc kubenswrapper[4907]: E0226 15:58:59.968786 4907 secret.go:188] Couldn't get secret metallb-system/speaker-certs-secret: secret "speaker-certs-secret" not found Feb 26 15:58:59 crc kubenswrapper[4907]: E0226 15:58:59.969298 4907 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/b4841c1c-c56d-4abe-b6a7-92211b5c4a19-metrics-certs podName:b4841c1c-c56d-4abe-b6a7-92211b5c4a19 nodeName:}" failed. No retries permitted until 2026-02-26 15:59:00.469283877 +0000 UTC m=+1002.987845716 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/b4841c1c-c56d-4abe-b6a7-92211b5c4a19-metrics-certs") pod "speaker-7hcct" (UID: "b4841c1c-c56d-4abe-b6a7-92211b5c4a19") : secret "speaker-certs-secret" not found Feb 26 15:58:59 crc kubenswrapper[4907]: I0226 15:58:59.971869 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/05e6312b-9683-44bf-9368-cb234744fd33-cert\") pod \"controller-86ddb6bd46-qwvw9\" (UID: \"05e6312b-9683-44bf-9368-cb234744fd33\") " pod="metallb-system/controller-86ddb6bd46-qwvw9" Feb 26 15:58:59 crc kubenswrapper[4907]: I0226 15:58:59.972887 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/05e6312b-9683-44bf-9368-cb234744fd33-metrics-certs\") pod \"controller-86ddb6bd46-qwvw9\" (UID: \"05e6312b-9683-44bf-9368-cb234744fd33\") " pod="metallb-system/controller-86ddb6bd46-qwvw9" Feb 26 15:58:59 crc kubenswrapper[4907]: I0226 15:58:59.987792 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6qg4k\" (UniqueName: \"kubernetes.io/projected/b4841c1c-c56d-4abe-b6a7-92211b5c4a19-kube-api-access-6qg4k\") pod \"speaker-7hcct\" (UID: \"b4841c1c-c56d-4abe-b6a7-92211b5c4a19\") " pod="metallb-system/speaker-7hcct" Feb 26 15:58:59 crc kubenswrapper[4907]: I0226 15:58:59.990522 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dj68h\" (UniqueName: \"kubernetes.io/projected/05e6312b-9683-44bf-9368-cb234744fd33-kube-api-access-dj68h\") pod \"controller-86ddb6bd46-qwvw9\" (UID: \"05e6312b-9683-44bf-9368-cb234744fd33\") " pod="metallb-system/controller-86ddb6bd46-qwvw9" Feb 26 15:59:00 crc kubenswrapper[4907]: I0226 15:59:00.049627 4907 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="metallb-system/controller-86ddb6bd46-qwvw9" Feb 26 15:59:00 crc kubenswrapper[4907]: I0226 15:59:00.315192 4907 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/controller-86ddb6bd46-qwvw9"] Feb 26 15:59:00 crc kubenswrapper[4907]: W0226 15:59:00.317518 4907 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod05e6312b_9683_44bf_9368_cb234744fd33.slice/crio-fc8c3db1df957b7a09756a392678c106bec9902a8d5144e8659bbbc0dc318a27 WatchSource:0}: Error finding container fc8c3db1df957b7a09756a392678c106bec9902a8d5144e8659bbbc0dc318a27: Status 404 returned error can't find the container with id fc8c3db1df957b7a09756a392678c106bec9902a8d5144e8659bbbc0dc318a27 Feb 26 15:59:00 crc kubenswrapper[4907]: I0226 15:59:00.373658 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/aedab463-da2b-4bf1-a67d-16439f225983-metrics-certs\") pod \"frr-k8s-2kml2\" (UID: \"aedab463-da2b-4bf1-a67d-16439f225983\") " pod="metallb-system/frr-k8s-2kml2" Feb 26 15:59:00 crc kubenswrapper[4907]: I0226 15:59:00.378581 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/aedab463-da2b-4bf1-a67d-16439f225983-metrics-certs\") pod \"frr-k8s-2kml2\" (UID: \"aedab463-da2b-4bf1-a67d-16439f225983\") " pod="metallb-system/frr-k8s-2kml2" Feb 26 15:59:00 crc kubenswrapper[4907]: I0226 15:59:00.385405 4907 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/frr-k8s-webhook-server-7f989f654f-8kcg5"] Feb 26 15:59:00 crc kubenswrapper[4907]: W0226 15:59:00.393740 4907 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pode3fa6e66_60dc_44b8_a6a6_47a7ec18424f.slice/crio-7eedd8bad132b95656e0ebcba74931ad2e110b503d83f41aef593ac6be7d5803 WatchSource:0}: 
Error finding container 7eedd8bad132b95656e0ebcba74931ad2e110b503d83f41aef593ac6be7d5803: Status 404 returned error can't find the container with id 7eedd8bad132b95656e0ebcba74931ad2e110b503d83f41aef593ac6be7d5803 Feb 26 15:59:00 crc kubenswrapper[4907]: I0226 15:59:00.475327 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"memberlist\" (UniqueName: \"kubernetes.io/secret/b4841c1c-c56d-4abe-b6a7-92211b5c4a19-memberlist\") pod \"speaker-7hcct\" (UID: \"b4841c1c-c56d-4abe-b6a7-92211b5c4a19\") " pod="metallb-system/speaker-7hcct" Feb 26 15:59:00 crc kubenswrapper[4907]: I0226 15:59:00.475689 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/b4841c1c-c56d-4abe-b6a7-92211b5c4a19-metrics-certs\") pod \"speaker-7hcct\" (UID: \"b4841c1c-c56d-4abe-b6a7-92211b5c4a19\") " pod="metallb-system/speaker-7hcct" Feb 26 15:59:00 crc kubenswrapper[4907]: E0226 15:59:00.475549 4907 secret.go:188] Couldn't get secret metallb-system/metallb-memberlist: secret "metallb-memberlist" not found Feb 26 15:59:00 crc kubenswrapper[4907]: E0226 15:59:00.475824 4907 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/b4841c1c-c56d-4abe-b6a7-92211b5c4a19-memberlist podName:b4841c1c-c56d-4abe-b6a7-92211b5c4a19 nodeName:}" failed. No retries permitted until 2026-02-26 15:59:01.475801865 +0000 UTC m=+1003.994363794 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "memberlist" (UniqueName: "kubernetes.io/secret/b4841c1c-c56d-4abe-b6a7-92211b5c4a19-memberlist") pod "speaker-7hcct" (UID: "b4841c1c-c56d-4abe-b6a7-92211b5c4a19") : secret "metallb-memberlist" not found Feb 26 15:59:00 crc kubenswrapper[4907]: I0226 15:59:00.479449 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/b4841c1c-c56d-4abe-b6a7-92211b5c4a19-metrics-certs\") pod \"speaker-7hcct\" (UID: \"b4841c1c-c56d-4abe-b6a7-92211b5c4a19\") " pod="metallb-system/speaker-7hcct" Feb 26 15:59:00 crc kubenswrapper[4907]: I0226 15:59:00.538794 4907 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/frr-k8s-2kml2" Feb 26 15:59:01 crc kubenswrapper[4907]: I0226 15:59:01.288524 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-webhook-server-7f989f654f-8kcg5" event={"ID":"e3fa6e66-60dc-44b8-a6a6-47a7ec18424f","Type":"ContainerStarted","Data":"7eedd8bad132b95656e0ebcba74931ad2e110b503d83f41aef593ac6be7d5803"} Feb 26 15:59:01 crc kubenswrapper[4907]: I0226 15:59:01.290659 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/controller-86ddb6bd46-qwvw9" event={"ID":"05e6312b-9683-44bf-9368-cb234744fd33","Type":"ContainerStarted","Data":"d61c662f4f22677db3db860ca060f400aa2e4aa1feacd5a15c5a0eee70cedd2a"} Feb 26 15:59:01 crc kubenswrapper[4907]: I0226 15:59:01.290709 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/controller-86ddb6bd46-qwvw9" event={"ID":"05e6312b-9683-44bf-9368-cb234744fd33","Type":"ContainerStarted","Data":"0b0ed690b1b252f0ea2772e2b3083f6bfdb8cdb2e763f84d19e3c7f12b135f50"} Feb 26 15:59:01 crc kubenswrapper[4907]: I0226 15:59:01.290723 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/controller-86ddb6bd46-qwvw9" 
event={"ID":"05e6312b-9683-44bf-9368-cb234744fd33","Type":"ContainerStarted","Data":"fc8c3db1df957b7a09756a392678c106bec9902a8d5144e8659bbbc0dc318a27"} Feb 26 15:59:01 crc kubenswrapper[4907]: I0226 15:59:01.292081 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-2kml2" event={"ID":"aedab463-da2b-4bf1-a67d-16439f225983","Type":"ContainerStarted","Data":"1b83d5478f2b44f145caeb5d49f09921945c797d017fc3f9bddc78db5fe507d6"} Feb 26 15:59:01 crc kubenswrapper[4907]: I0226 15:59:01.324988 4907 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="metallb-system/controller-86ddb6bd46-qwvw9" podStartSLOduration=2.324958526 podStartE2EDuration="2.324958526s" podCreationTimestamp="2026-02-26 15:58:59 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-26 15:59:01.314717079 +0000 UTC m=+1003.833278928" watchObservedRunningTime="2026-02-26 15:59:01.324958526 +0000 UTC m=+1003.843520405" Feb 26 15:59:01 crc kubenswrapper[4907]: I0226 15:59:01.489647 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"memberlist\" (UniqueName: \"kubernetes.io/secret/b4841c1c-c56d-4abe-b6a7-92211b5c4a19-memberlist\") pod \"speaker-7hcct\" (UID: \"b4841c1c-c56d-4abe-b6a7-92211b5c4a19\") " pod="metallb-system/speaker-7hcct" Feb 26 15:59:01 crc kubenswrapper[4907]: I0226 15:59:01.507907 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"memberlist\" (UniqueName: \"kubernetes.io/secret/b4841c1c-c56d-4abe-b6a7-92211b5c4a19-memberlist\") pod \"speaker-7hcct\" (UID: \"b4841c1c-c56d-4abe-b6a7-92211b5c4a19\") " pod="metallb-system/speaker-7hcct" Feb 26 15:59:01 crc kubenswrapper[4907]: I0226 15:59:01.542510 4907 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="metallb-system/speaker-7hcct" Feb 26 15:59:01 crc kubenswrapper[4907]: W0226 15:59:01.576375 4907 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podb4841c1c_c56d_4abe_b6a7_92211b5c4a19.slice/crio-db9b4ce3fcc01a2ecb959c22aa80394018a0db4843f9a803d434875d89275129 WatchSource:0}: Error finding container db9b4ce3fcc01a2ecb959c22aa80394018a0db4843f9a803d434875d89275129: Status 404 returned error can't find the container with id db9b4ce3fcc01a2ecb959c22aa80394018a0db4843f9a803d434875d89275129 Feb 26 15:59:02 crc kubenswrapper[4907]: I0226 15:59:02.362463 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/speaker-7hcct" event={"ID":"b4841c1c-c56d-4abe-b6a7-92211b5c4a19","Type":"ContainerStarted","Data":"74ab3fecbcab2cec6b3a49bdad2f13c2eba014f1436c31e217f03202ade9512f"} Feb 26 15:59:02 crc kubenswrapper[4907]: I0226 15:59:02.362836 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/speaker-7hcct" event={"ID":"b4841c1c-c56d-4abe-b6a7-92211b5c4a19","Type":"ContainerStarted","Data":"bd50d4231ba169411e06da2c481f3de64a1fae3c293ed76e1a8e51ce30ee508b"} Feb 26 15:59:02 crc kubenswrapper[4907]: I0226 15:59:02.362848 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/speaker-7hcct" event={"ID":"b4841c1c-c56d-4abe-b6a7-92211b5c4a19","Type":"ContainerStarted","Data":"db9b4ce3fcc01a2ecb959c22aa80394018a0db4843f9a803d434875d89275129"} Feb 26 15:59:02 crc kubenswrapper[4907]: I0226 15:59:02.362932 4907 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="metallb-system/controller-86ddb6bd46-qwvw9" Feb 26 15:59:02 crc kubenswrapper[4907]: I0226 15:59:02.362974 4907 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="metallb-system/speaker-7hcct" Feb 26 15:59:08 crc kubenswrapper[4907]: I0226 15:59:08.160549 4907 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="metallb-system/speaker-7hcct" podStartSLOduration=9.160532357 podStartE2EDuration="9.160532357s" podCreationTimestamp="2026-02-26 15:58:59 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-26 15:59:02.397271658 +0000 UTC m=+1004.915833507" watchObservedRunningTime="2026-02-26 15:59:08.160532357 +0000 UTC m=+1010.679094206" Feb 26 15:59:08 crc kubenswrapper[4907]: I0226 15:59:08.403946 4907 generic.go:334] "Generic (PLEG): container finished" podID="aedab463-da2b-4bf1-a67d-16439f225983" containerID="1009245fd111110e6bccd4f70d8d9ed9feda31703a278adaebce8de3539c474e" exitCode=0 Feb 26 15:59:08 crc kubenswrapper[4907]: I0226 15:59:08.404041 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-2kml2" event={"ID":"aedab463-da2b-4bf1-a67d-16439f225983","Type":"ContainerDied","Data":"1009245fd111110e6bccd4f70d8d9ed9feda31703a278adaebce8de3539c474e"} Feb 26 15:59:08 crc kubenswrapper[4907]: I0226 15:59:08.408291 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-webhook-server-7f989f654f-8kcg5" event={"ID":"e3fa6e66-60dc-44b8-a6a6-47a7ec18424f","Type":"ContainerStarted","Data":"0de0f92a876b73d98baa5f743beb73d3d6196a2c2211b12caed7f9f227fbf7f9"} Feb 26 15:59:08 crc kubenswrapper[4907]: I0226 15:59:08.408560 4907 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="metallb-system/frr-k8s-webhook-server-7f989f654f-8kcg5" Feb 26 15:59:08 crc kubenswrapper[4907]: I0226 15:59:08.484173 4907 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="metallb-system/frr-k8s-webhook-server-7f989f654f-8kcg5" podStartSLOduration=1.782083997 podStartE2EDuration="9.484157609s" podCreationTimestamp="2026-02-26 15:58:59 +0000 UTC" firstStartedPulling="2026-02-26 15:59:00.395566633 +0000 UTC m=+1002.914128482" lastFinishedPulling="2026-02-26 15:59:08.097640245 +0000 UTC m=+1010.616202094" 
observedRunningTime="2026-02-26 15:59:08.480097181 +0000 UTC m=+1010.998659030" watchObservedRunningTime="2026-02-26 15:59:08.484157609 +0000 UTC m=+1011.002719458" Feb 26 15:59:09 crc kubenswrapper[4907]: I0226 15:59:09.415355 4907 generic.go:334] "Generic (PLEG): container finished" podID="aedab463-da2b-4bf1-a67d-16439f225983" containerID="e50e22ade4892b4390bd0adffeda76b662eaa882498cb44725c036f0ec3f6356" exitCode=0 Feb 26 15:59:09 crc kubenswrapper[4907]: I0226 15:59:09.415761 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-2kml2" event={"ID":"aedab463-da2b-4bf1-a67d-16439f225983","Type":"ContainerDied","Data":"e50e22ade4892b4390bd0adffeda76b662eaa882498cb44725c036f0ec3f6356"} Feb 26 15:59:10 crc kubenswrapper[4907]: I0226 15:59:10.053723 4907 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="metallb-system/controller-86ddb6bd46-qwvw9" Feb 26 15:59:10 crc kubenswrapper[4907]: I0226 15:59:10.424528 4907 generic.go:334] "Generic (PLEG): container finished" podID="aedab463-da2b-4bf1-a67d-16439f225983" containerID="8890b8c8f68669d2851f6c83c716d044c8b3016e46457acd28eaaa0f25da2d98" exitCode=0 Feb 26 15:59:10 crc kubenswrapper[4907]: I0226 15:59:10.424578 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-2kml2" event={"ID":"aedab463-da2b-4bf1-a67d-16439f225983","Type":"ContainerDied","Data":"8890b8c8f68669d2851f6c83c716d044c8b3016e46457acd28eaaa0f25da2d98"} Feb 26 15:59:11 crc kubenswrapper[4907]: I0226 15:59:11.432201 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-2kml2" event={"ID":"aedab463-da2b-4bf1-a67d-16439f225983","Type":"ContainerStarted","Data":"e3d3672e3b40967babf2f333b014ab2ea51060b407cb7882d6e0ba727320974e"} Feb 26 15:59:11 crc kubenswrapper[4907]: I0226 15:59:11.432523 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-2kml2" 
event={"ID":"aedab463-da2b-4bf1-a67d-16439f225983","Type":"ContainerStarted","Data":"a04f1bac76f2622356b7b4e54a315cae4388e9ad3eb557a924a1500811726725"} Feb 26 15:59:11 crc kubenswrapper[4907]: I0226 15:59:11.546237 4907 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="metallb-system/speaker-7hcct" Feb 26 15:59:12 crc kubenswrapper[4907]: I0226 15:59:12.440914 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-2kml2" event={"ID":"aedab463-da2b-4bf1-a67d-16439f225983","Type":"ContainerStarted","Data":"6b09c55762250b9ff20d7543d4fe8fd0a363f44f7c73b8b08cd063fbc70b4e29"} Feb 26 15:59:12 crc kubenswrapper[4907]: I0226 15:59:12.441192 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-2kml2" event={"ID":"aedab463-da2b-4bf1-a67d-16439f225983","Type":"ContainerStarted","Data":"87f925c81eebdc679405f1dbc52e6a715f383a7aa29052b536fe82e4874a2a12"} Feb 26 15:59:13 crc kubenswrapper[4907]: I0226 15:59:13.450787 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-2kml2" event={"ID":"aedab463-da2b-4bf1-a67d-16439f225983","Type":"ContainerStarted","Data":"9a7afc6ea3794ed0700361db9d5c170d25db242adcb55ea55c7aa0a7f486e79f"} Feb 26 15:59:13 crc kubenswrapper[4907]: I0226 15:59:13.450826 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-2kml2" event={"ID":"aedab463-da2b-4bf1-a67d-16439f225983","Type":"ContainerStarted","Data":"2cccbf17451fb428e885bd67e63e3d7ae2ae0dbc27030892cad02411fba8e2c8"} Feb 26 15:59:13 crc kubenswrapper[4907]: I0226 15:59:13.450980 4907 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="metallb-system/frr-k8s-2kml2" Feb 26 15:59:13 crc kubenswrapper[4907]: I0226 15:59:13.477731 4907 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="metallb-system/frr-k8s-2kml2" podStartSLOduration=7.094920378 podStartE2EDuration="14.477714552s" podCreationTimestamp="2026-02-26 15:58:59 +0000 
UTC" firstStartedPulling="2026-02-26 15:59:00.676713598 +0000 UTC m=+1003.195275447" lastFinishedPulling="2026-02-26 15:59:08.059507772 +0000 UTC m=+1010.578069621" observedRunningTime="2026-02-26 15:59:13.474218316 +0000 UTC m=+1015.992780195" watchObservedRunningTime="2026-02-26 15:59:13.477714552 +0000 UTC m=+1015.996276401" Feb 26 15:59:14 crc kubenswrapper[4907]: I0226 15:59:14.437452 4907 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/openstack-operator-index-ldb8j"] Feb 26 15:59:14 crc kubenswrapper[4907]: I0226 15:59:14.438488 4907 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/openstack-operator-index-ldb8j" Feb 26 15:59:14 crc kubenswrapper[4907]: I0226 15:59:14.442529 4907 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"openstack-operator-index-dockercfg-xhn6z" Feb 26 15:59:14 crc kubenswrapper[4907]: I0226 15:59:14.443234 4907 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack-operators"/"kube-root-ca.crt" Feb 26 15:59:14 crc kubenswrapper[4907]: I0226 15:59:14.445000 4907 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack-operators"/"openshift-service-ca.crt" Feb 26 15:59:14 crc kubenswrapper[4907]: I0226 15:59:14.451787 4907 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-index-ldb8j"] Feb 26 15:59:14 crc kubenswrapper[4907]: I0226 15:59:14.589620 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8rvvb\" (UniqueName: \"kubernetes.io/projected/7a83fde4-3660-4aa5-8bdd-ad32bfcc704c-kube-api-access-8rvvb\") pod \"openstack-operator-index-ldb8j\" (UID: \"7a83fde4-3660-4aa5-8bdd-ad32bfcc704c\") " pod="openstack-operators/openstack-operator-index-ldb8j" Feb 26 15:59:14 crc kubenswrapper[4907]: I0226 15:59:14.691780 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for 
volume \"kube-api-access-8rvvb\" (UniqueName: \"kubernetes.io/projected/7a83fde4-3660-4aa5-8bdd-ad32bfcc704c-kube-api-access-8rvvb\") pod \"openstack-operator-index-ldb8j\" (UID: \"7a83fde4-3660-4aa5-8bdd-ad32bfcc704c\") " pod="openstack-operators/openstack-operator-index-ldb8j" Feb 26 15:59:14 crc kubenswrapper[4907]: I0226 15:59:14.711533 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8rvvb\" (UniqueName: \"kubernetes.io/projected/7a83fde4-3660-4aa5-8bdd-ad32bfcc704c-kube-api-access-8rvvb\") pod \"openstack-operator-index-ldb8j\" (UID: \"7a83fde4-3660-4aa5-8bdd-ad32bfcc704c\") " pod="openstack-operators/openstack-operator-index-ldb8j" Feb 26 15:59:14 crc kubenswrapper[4907]: I0226 15:59:14.752389 4907 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/openstack-operator-index-ldb8j" Feb 26 15:59:14 crc kubenswrapper[4907]: I0226 15:59:14.939075 4907 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-index-ldb8j"] Feb 26 15:59:15 crc kubenswrapper[4907]: I0226 15:59:15.464341 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-index-ldb8j" event={"ID":"7a83fde4-3660-4aa5-8bdd-ad32bfcc704c","Type":"ContainerStarted","Data":"21c535c41499adbc3ea7192333e4f05e3bcc896ee3e86df1327afdced81a2e66"} Feb 26 15:59:15 crc kubenswrapper[4907]: I0226 15:59:15.539266 4907 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="metallb-system/frr-k8s-2kml2" Feb 26 15:59:15 crc kubenswrapper[4907]: I0226 15:59:15.647966 4907 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="metallb-system/frr-k8s-2kml2" Feb 26 15:59:18 crc kubenswrapper[4907]: I0226 15:59:18.530566 4907 patch_prober.go:28] interesting pod/machine-config-daemon-v5ng6 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get 
\"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 26 15:59:18 crc kubenswrapper[4907]: I0226 15:59:18.530894 4907 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-v5ng6" podUID="917eebf3-db36-47b8-af0a-b80d042fddab" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 26 15:59:19 crc kubenswrapper[4907]: I0226 15:59:19.497266 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-index-ldb8j" event={"ID":"7a83fde4-3660-4aa5-8bdd-ad32bfcc704c","Type":"ContainerStarted","Data":"88369092f8c2f79a3059c559a7fce52fdbf3a3163b11dece5876163a8efd5f16"} Feb 26 15:59:19 crc kubenswrapper[4907]: I0226 15:59:19.516946 4907 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/openstack-operator-index-ldb8j" podStartSLOduration=1.7638571490000001 podStartE2EDuration="5.516924709s" podCreationTimestamp="2026-02-26 15:59:14 +0000 UTC" firstStartedPulling="2026-02-26 15:59:14.956420748 +0000 UTC m=+1017.474982587" lastFinishedPulling="2026-02-26 15:59:18.709488298 +0000 UTC m=+1021.228050147" observedRunningTime="2026-02-26 15:59:19.515193787 +0000 UTC m=+1022.033755656" watchObservedRunningTime="2026-02-26 15:59:19.516924709 +0000 UTC m=+1022.035486558" Feb 26 15:59:19 crc kubenswrapper[4907]: I0226 15:59:19.958126 4907 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="metallb-system/frr-k8s-webhook-server-7f989f654f-8kcg5" Feb 26 15:59:20 crc kubenswrapper[4907]: I0226 15:59:20.541599 4907 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="metallb-system/frr-k8s-2kml2" Feb 26 15:59:21 crc kubenswrapper[4907]: I0226 15:59:21.626755 4907 kubelet.go:2421] "SyncLoop ADD" source="api" 
pods=["openshift-marketplace/redhat-marketplace-sfqpl"] Feb 26 15:59:21 crc kubenswrapper[4907]: I0226 15:59:21.628457 4907 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-sfqpl" Feb 26 15:59:21 crc kubenswrapper[4907]: I0226 15:59:21.642746 4907 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-sfqpl"] Feb 26 15:59:21 crc kubenswrapper[4907]: I0226 15:59:21.694059 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b7a3bde7-a988-429e-a74a-bc06d59143a4-catalog-content\") pod \"redhat-marketplace-sfqpl\" (UID: \"b7a3bde7-a988-429e-a74a-bc06d59143a4\") " pod="openshift-marketplace/redhat-marketplace-sfqpl" Feb 26 15:59:21 crc kubenswrapper[4907]: I0226 15:59:21.694193 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b7a3bde7-a988-429e-a74a-bc06d59143a4-utilities\") pod \"redhat-marketplace-sfqpl\" (UID: \"b7a3bde7-a988-429e-a74a-bc06d59143a4\") " pod="openshift-marketplace/redhat-marketplace-sfqpl" Feb 26 15:59:21 crc kubenswrapper[4907]: I0226 15:59:21.694233 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-b5wkk\" (UniqueName: \"kubernetes.io/projected/b7a3bde7-a988-429e-a74a-bc06d59143a4-kube-api-access-b5wkk\") pod \"redhat-marketplace-sfqpl\" (UID: \"b7a3bde7-a988-429e-a74a-bc06d59143a4\") " pod="openshift-marketplace/redhat-marketplace-sfqpl" Feb 26 15:59:21 crc kubenswrapper[4907]: I0226 15:59:21.795291 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b7a3bde7-a988-429e-a74a-bc06d59143a4-utilities\") pod \"redhat-marketplace-sfqpl\" (UID: \"b7a3bde7-a988-429e-a74a-bc06d59143a4\") " 
pod="openshift-marketplace/redhat-marketplace-sfqpl" Feb 26 15:59:21 crc kubenswrapper[4907]: I0226 15:59:21.795357 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-b5wkk\" (UniqueName: \"kubernetes.io/projected/b7a3bde7-a988-429e-a74a-bc06d59143a4-kube-api-access-b5wkk\") pod \"redhat-marketplace-sfqpl\" (UID: \"b7a3bde7-a988-429e-a74a-bc06d59143a4\") " pod="openshift-marketplace/redhat-marketplace-sfqpl" Feb 26 15:59:21 crc kubenswrapper[4907]: I0226 15:59:21.795399 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b7a3bde7-a988-429e-a74a-bc06d59143a4-catalog-content\") pod \"redhat-marketplace-sfqpl\" (UID: \"b7a3bde7-a988-429e-a74a-bc06d59143a4\") " pod="openshift-marketplace/redhat-marketplace-sfqpl" Feb 26 15:59:21 crc kubenswrapper[4907]: I0226 15:59:21.795885 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b7a3bde7-a988-429e-a74a-bc06d59143a4-utilities\") pod \"redhat-marketplace-sfqpl\" (UID: \"b7a3bde7-a988-429e-a74a-bc06d59143a4\") " pod="openshift-marketplace/redhat-marketplace-sfqpl" Feb 26 15:59:21 crc kubenswrapper[4907]: I0226 15:59:21.795930 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b7a3bde7-a988-429e-a74a-bc06d59143a4-catalog-content\") pod \"redhat-marketplace-sfqpl\" (UID: \"b7a3bde7-a988-429e-a74a-bc06d59143a4\") " pod="openshift-marketplace/redhat-marketplace-sfqpl" Feb 26 15:59:21 crc kubenswrapper[4907]: I0226 15:59:21.815575 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-b5wkk\" (UniqueName: \"kubernetes.io/projected/b7a3bde7-a988-429e-a74a-bc06d59143a4-kube-api-access-b5wkk\") pod \"redhat-marketplace-sfqpl\" (UID: \"b7a3bde7-a988-429e-a74a-bc06d59143a4\") " 
pod="openshift-marketplace/redhat-marketplace-sfqpl" Feb 26 15:59:21 crc kubenswrapper[4907]: I0226 15:59:21.950122 4907 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-sfqpl" Feb 26 15:59:22 crc kubenswrapper[4907]: I0226 15:59:22.189514 4907 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-sfqpl"] Feb 26 15:59:22 crc kubenswrapper[4907]: I0226 15:59:22.516801 4907 generic.go:334] "Generic (PLEG): container finished" podID="b7a3bde7-a988-429e-a74a-bc06d59143a4" containerID="bcc575f2079d596184b5cdd9b49d3cb54bdb72030bcd46bd523f2431abd52011" exitCode=0 Feb 26 15:59:22 crc kubenswrapper[4907]: I0226 15:59:22.516871 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-sfqpl" event={"ID":"b7a3bde7-a988-429e-a74a-bc06d59143a4","Type":"ContainerDied","Data":"bcc575f2079d596184b5cdd9b49d3cb54bdb72030bcd46bd523f2431abd52011"} Feb 26 15:59:22 crc kubenswrapper[4907]: I0226 15:59:22.517089 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-sfqpl" event={"ID":"b7a3bde7-a988-429e-a74a-bc06d59143a4","Type":"ContainerStarted","Data":"525a38d68865c0eceff4fccad499b56abee41951a6d4d3b4c135d3ca342ca251"} Feb 26 15:59:23 crc kubenswrapper[4907]: I0226 15:59:23.524198 4907 generic.go:334] "Generic (PLEG): container finished" podID="b7a3bde7-a988-429e-a74a-bc06d59143a4" containerID="1b3d3592d094d0994abd2883961c225438189e3e1847c8070d616b5904cfc717" exitCode=0 Feb 26 15:59:23 crc kubenswrapper[4907]: I0226 15:59:23.524240 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-sfqpl" event={"ID":"b7a3bde7-a988-429e-a74a-bc06d59143a4","Type":"ContainerDied","Data":"1b3d3592d094d0994abd2883961c225438189e3e1847c8070d616b5904cfc717"} Feb 26 15:59:24 crc kubenswrapper[4907]: I0226 15:59:24.531392 4907 kubelet.go:2453] "SyncLoop (PLEG): 
event for pod" pod="openshift-marketplace/redhat-marketplace-sfqpl" event={"ID":"b7a3bde7-a988-429e-a74a-bc06d59143a4","Type":"ContainerStarted","Data":"836b6141e1e7c843de40f94acabd601186ceaf30ca261ee57f39d690fa032e25"} Feb 26 15:59:24 crc kubenswrapper[4907]: I0226 15:59:24.558551 4907 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-sfqpl" podStartSLOduration=2.120041541 podStartE2EDuration="3.558531074s" podCreationTimestamp="2026-02-26 15:59:21 +0000 UTC" firstStartedPulling="2026-02-26 15:59:22.518379549 +0000 UTC m=+1025.036941398" lastFinishedPulling="2026-02-26 15:59:23.956869082 +0000 UTC m=+1026.475430931" observedRunningTime="2026-02-26 15:59:24.554060945 +0000 UTC m=+1027.072622804" watchObservedRunningTime="2026-02-26 15:59:24.558531074 +0000 UTC m=+1027.077092923" Feb 26 15:59:24 crc kubenswrapper[4907]: I0226 15:59:24.752614 4907 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/openstack-operator-index-ldb8j" Feb 26 15:59:24 crc kubenswrapper[4907]: I0226 15:59:24.752690 4907 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack-operators/openstack-operator-index-ldb8j" Feb 26 15:59:24 crc kubenswrapper[4907]: I0226 15:59:24.780799 4907 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack-operators/openstack-operator-index-ldb8j" Feb 26 15:59:25 crc kubenswrapper[4907]: I0226 15:59:25.559267 4907 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/openstack-operator-index-ldb8j" Feb 26 15:59:30 crc kubenswrapper[4907]: I0226 15:59:30.848524 4907 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/ffa8797d5133dab2efd18584e0d5d66ef5b78c6ec08e68212a6e3345c6rw6x4"] Feb 26 15:59:30 crc kubenswrapper[4907]: I0226 15:59:30.849863 4907 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/ffa8797d5133dab2efd18584e0d5d66ef5b78c6ec08e68212a6e3345c6rw6x4" Feb 26 15:59:30 crc kubenswrapper[4907]: I0226 15:59:30.856828 4907 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"default-dockercfg-6k2xh" Feb 26 15:59:30 crc kubenswrapper[4907]: I0226 15:59:30.866911 4907 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/ffa8797d5133dab2efd18584e0d5d66ef5b78c6ec08e68212a6e3345c6rw6x4"] Feb 26 15:59:30 crc kubenswrapper[4907]: I0226 15:59:30.930095 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-99grb\" (UniqueName: \"kubernetes.io/projected/8a2e47e7-4347-4860-8c91-5a2b12ae1066-kube-api-access-99grb\") pod \"ffa8797d5133dab2efd18584e0d5d66ef5b78c6ec08e68212a6e3345c6rw6x4\" (UID: \"8a2e47e7-4347-4860-8c91-5a2b12ae1066\") " pod="openstack-operators/ffa8797d5133dab2efd18584e0d5d66ef5b78c6ec08e68212a6e3345c6rw6x4" Feb 26 15:59:30 crc kubenswrapper[4907]: I0226 15:59:30.930350 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/8a2e47e7-4347-4860-8c91-5a2b12ae1066-util\") pod \"ffa8797d5133dab2efd18584e0d5d66ef5b78c6ec08e68212a6e3345c6rw6x4\" (UID: \"8a2e47e7-4347-4860-8c91-5a2b12ae1066\") " pod="openstack-operators/ffa8797d5133dab2efd18584e0d5d66ef5b78c6ec08e68212a6e3345c6rw6x4" Feb 26 15:59:30 crc kubenswrapper[4907]: I0226 15:59:30.930392 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/8a2e47e7-4347-4860-8c91-5a2b12ae1066-bundle\") pod \"ffa8797d5133dab2efd18584e0d5d66ef5b78c6ec08e68212a6e3345c6rw6x4\" (UID: \"8a2e47e7-4347-4860-8c91-5a2b12ae1066\") " pod="openstack-operators/ffa8797d5133dab2efd18584e0d5d66ef5b78c6ec08e68212a6e3345c6rw6x4" Feb 26 15:59:31 crc kubenswrapper[4907]: I0226 
15:59:31.031735 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/8a2e47e7-4347-4860-8c91-5a2b12ae1066-util\") pod \"ffa8797d5133dab2efd18584e0d5d66ef5b78c6ec08e68212a6e3345c6rw6x4\" (UID: \"8a2e47e7-4347-4860-8c91-5a2b12ae1066\") " pod="openstack-operators/ffa8797d5133dab2efd18584e0d5d66ef5b78c6ec08e68212a6e3345c6rw6x4" Feb 26 15:59:31 crc kubenswrapper[4907]: I0226 15:59:31.031827 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/8a2e47e7-4347-4860-8c91-5a2b12ae1066-bundle\") pod \"ffa8797d5133dab2efd18584e0d5d66ef5b78c6ec08e68212a6e3345c6rw6x4\" (UID: \"8a2e47e7-4347-4860-8c91-5a2b12ae1066\") " pod="openstack-operators/ffa8797d5133dab2efd18584e0d5d66ef5b78c6ec08e68212a6e3345c6rw6x4" Feb 26 15:59:31 crc kubenswrapper[4907]: I0226 15:59:31.031901 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-99grb\" (UniqueName: \"kubernetes.io/projected/8a2e47e7-4347-4860-8c91-5a2b12ae1066-kube-api-access-99grb\") pod \"ffa8797d5133dab2efd18584e0d5d66ef5b78c6ec08e68212a6e3345c6rw6x4\" (UID: \"8a2e47e7-4347-4860-8c91-5a2b12ae1066\") " pod="openstack-operators/ffa8797d5133dab2efd18584e0d5d66ef5b78c6ec08e68212a6e3345c6rw6x4" Feb 26 15:59:31 crc kubenswrapper[4907]: I0226 15:59:31.032305 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/8a2e47e7-4347-4860-8c91-5a2b12ae1066-util\") pod \"ffa8797d5133dab2efd18584e0d5d66ef5b78c6ec08e68212a6e3345c6rw6x4\" (UID: \"8a2e47e7-4347-4860-8c91-5a2b12ae1066\") " pod="openstack-operators/ffa8797d5133dab2efd18584e0d5d66ef5b78c6ec08e68212a6e3345c6rw6x4" Feb 26 15:59:31 crc kubenswrapper[4907]: I0226 15:59:31.032404 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: 
\"kubernetes.io/empty-dir/8a2e47e7-4347-4860-8c91-5a2b12ae1066-bundle\") pod \"ffa8797d5133dab2efd18584e0d5d66ef5b78c6ec08e68212a6e3345c6rw6x4\" (UID: \"8a2e47e7-4347-4860-8c91-5a2b12ae1066\") " pod="openstack-operators/ffa8797d5133dab2efd18584e0d5d66ef5b78c6ec08e68212a6e3345c6rw6x4" Feb 26 15:59:31 crc kubenswrapper[4907]: I0226 15:59:31.050438 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-99grb\" (UniqueName: \"kubernetes.io/projected/8a2e47e7-4347-4860-8c91-5a2b12ae1066-kube-api-access-99grb\") pod \"ffa8797d5133dab2efd18584e0d5d66ef5b78c6ec08e68212a6e3345c6rw6x4\" (UID: \"8a2e47e7-4347-4860-8c91-5a2b12ae1066\") " pod="openstack-operators/ffa8797d5133dab2efd18584e0d5d66ef5b78c6ec08e68212a6e3345c6rw6x4" Feb 26 15:59:31 crc kubenswrapper[4907]: I0226 15:59:31.166856 4907 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/ffa8797d5133dab2efd18584e0d5d66ef5b78c6ec08e68212a6e3345c6rw6x4" Feb 26 15:59:31 crc kubenswrapper[4907]: I0226 15:59:31.411341 4907 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/ffa8797d5133dab2efd18584e0d5d66ef5b78c6ec08e68212a6e3345c6rw6x4"] Feb 26 15:59:31 crc kubenswrapper[4907]: I0226 15:59:31.573064 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/ffa8797d5133dab2efd18584e0d5d66ef5b78c6ec08e68212a6e3345c6rw6x4" event={"ID":"8a2e47e7-4347-4860-8c91-5a2b12ae1066","Type":"ContainerStarted","Data":"ba4b2231983fb5e7405f025e48d411c8839f832175e8f2275db6ee67de7367c4"} Feb 26 15:59:31 crc kubenswrapper[4907]: I0226 15:59:31.951061 4907 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-sfqpl" Feb 26 15:59:31 crc kubenswrapper[4907]: I0226 15:59:31.951116 4907 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-sfqpl" Feb 26 15:59:32 crc kubenswrapper[4907]: I0226 15:59:32.007904 
4907 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-sfqpl" Feb 26 15:59:32 crc kubenswrapper[4907]: I0226 15:59:32.601636 4907 generic.go:334] "Generic (PLEG): container finished" podID="8a2e47e7-4347-4860-8c91-5a2b12ae1066" containerID="c6d59a93daacd43c131f91e3dc2802a0fae18f9cb61894edcdadad779b61aae8" exitCode=0 Feb 26 15:59:32 crc kubenswrapper[4907]: I0226 15:59:32.601715 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/ffa8797d5133dab2efd18584e0d5d66ef5b78c6ec08e68212a6e3345c6rw6x4" event={"ID":"8a2e47e7-4347-4860-8c91-5a2b12ae1066","Type":"ContainerDied","Data":"c6d59a93daacd43c131f91e3dc2802a0fae18f9cb61894edcdadad779b61aae8"} Feb 26 15:59:32 crc kubenswrapper[4907]: I0226 15:59:32.642871 4907 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-sfqpl" Feb 26 15:59:33 crc kubenswrapper[4907]: I0226 15:59:33.609633 4907 generic.go:334] "Generic (PLEG): container finished" podID="8a2e47e7-4347-4860-8c91-5a2b12ae1066" containerID="b1e7361932452d929dfa730233e6bac3c41271e5ba8b85076d366e5f7eb7bcec" exitCode=0 Feb 26 15:59:33 crc kubenswrapper[4907]: I0226 15:59:33.609740 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/ffa8797d5133dab2efd18584e0d5d66ef5b78c6ec08e68212a6e3345c6rw6x4" event={"ID":"8a2e47e7-4347-4860-8c91-5a2b12ae1066","Type":"ContainerDied","Data":"b1e7361932452d929dfa730233e6bac3c41271e5ba8b85076d366e5f7eb7bcec"} Feb 26 15:59:34 crc kubenswrapper[4907]: I0226 15:59:34.221988 4907 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-g5kc9"] Feb 26 15:59:34 crc kubenswrapper[4907]: I0226 15:59:34.223074 4907 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-g5kc9" Feb 26 15:59:34 crc kubenswrapper[4907]: I0226 15:59:34.240192 4907 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-g5kc9"] Feb 26 15:59:34 crc kubenswrapper[4907]: I0226 15:59:34.276092 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jp2f4\" (UniqueName: \"kubernetes.io/projected/015f7bce-7a88-46a1-a851-3d1aad21abc8-kube-api-access-jp2f4\") pod \"community-operators-g5kc9\" (UID: \"015f7bce-7a88-46a1-a851-3d1aad21abc8\") " pod="openshift-marketplace/community-operators-g5kc9" Feb 26 15:59:34 crc kubenswrapper[4907]: I0226 15:59:34.276236 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/015f7bce-7a88-46a1-a851-3d1aad21abc8-utilities\") pod \"community-operators-g5kc9\" (UID: \"015f7bce-7a88-46a1-a851-3d1aad21abc8\") " pod="openshift-marketplace/community-operators-g5kc9" Feb 26 15:59:34 crc kubenswrapper[4907]: I0226 15:59:34.276260 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/015f7bce-7a88-46a1-a851-3d1aad21abc8-catalog-content\") pod \"community-operators-g5kc9\" (UID: \"015f7bce-7a88-46a1-a851-3d1aad21abc8\") " pod="openshift-marketplace/community-operators-g5kc9" Feb 26 15:59:34 crc kubenswrapper[4907]: I0226 15:59:34.377888 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/015f7bce-7a88-46a1-a851-3d1aad21abc8-utilities\") pod \"community-operators-g5kc9\" (UID: \"015f7bce-7a88-46a1-a851-3d1aad21abc8\") " pod="openshift-marketplace/community-operators-g5kc9" Feb 26 15:59:34 crc kubenswrapper[4907]: I0226 15:59:34.377930 4907 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/015f7bce-7a88-46a1-a851-3d1aad21abc8-catalog-content\") pod \"community-operators-g5kc9\" (UID: \"015f7bce-7a88-46a1-a851-3d1aad21abc8\") " pod="openshift-marketplace/community-operators-g5kc9" Feb 26 15:59:34 crc kubenswrapper[4907]: I0226 15:59:34.378001 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jp2f4\" (UniqueName: \"kubernetes.io/projected/015f7bce-7a88-46a1-a851-3d1aad21abc8-kube-api-access-jp2f4\") pod \"community-operators-g5kc9\" (UID: \"015f7bce-7a88-46a1-a851-3d1aad21abc8\") " pod="openshift-marketplace/community-operators-g5kc9" Feb 26 15:59:34 crc kubenswrapper[4907]: I0226 15:59:34.378458 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/015f7bce-7a88-46a1-a851-3d1aad21abc8-utilities\") pod \"community-operators-g5kc9\" (UID: \"015f7bce-7a88-46a1-a851-3d1aad21abc8\") " pod="openshift-marketplace/community-operators-g5kc9" Feb 26 15:59:34 crc kubenswrapper[4907]: I0226 15:59:34.378699 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/015f7bce-7a88-46a1-a851-3d1aad21abc8-catalog-content\") pod \"community-operators-g5kc9\" (UID: \"015f7bce-7a88-46a1-a851-3d1aad21abc8\") " pod="openshift-marketplace/community-operators-g5kc9" Feb 26 15:59:34 crc kubenswrapper[4907]: I0226 15:59:34.399844 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jp2f4\" (UniqueName: \"kubernetes.io/projected/015f7bce-7a88-46a1-a851-3d1aad21abc8-kube-api-access-jp2f4\") pod \"community-operators-g5kc9\" (UID: \"015f7bce-7a88-46a1-a851-3d1aad21abc8\") " pod="openshift-marketplace/community-operators-g5kc9" Feb 26 15:59:34 crc kubenswrapper[4907]: I0226 15:59:34.561422 4907 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-g5kc9" Feb 26 15:59:34 crc kubenswrapper[4907]: I0226 15:59:34.623267 4907 generic.go:334] "Generic (PLEG): container finished" podID="8a2e47e7-4347-4860-8c91-5a2b12ae1066" containerID="bad29965b077811580ccdd546034bf9e65d8dc7a0266abce9a41efcca69273c2" exitCode=0 Feb 26 15:59:34 crc kubenswrapper[4907]: I0226 15:59:34.623527 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/ffa8797d5133dab2efd18584e0d5d66ef5b78c6ec08e68212a6e3345c6rw6x4" event={"ID":"8a2e47e7-4347-4860-8c91-5a2b12ae1066","Type":"ContainerDied","Data":"bad29965b077811580ccdd546034bf9e65d8dc7a0266abce9a41efcca69273c2"} Feb 26 15:59:34 crc kubenswrapper[4907]: I0226 15:59:34.932917 4907 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-g5kc9"] Feb 26 15:59:35 crc kubenswrapper[4907]: I0226 15:59:35.631360 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-g5kc9" event={"ID":"015f7bce-7a88-46a1-a851-3d1aad21abc8","Type":"ContainerStarted","Data":"5c4559ecc4ecabe32219bca5ea1324b4885529a60dd4c02cde971044aa268a0c"} Feb 26 15:59:36 crc kubenswrapper[4907]: I0226 15:59:36.132152 4907 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/ffa8797d5133dab2efd18584e0d5d66ef5b78c6ec08e68212a6e3345c6rw6x4" Feb 26 15:59:36 crc kubenswrapper[4907]: I0226 15:59:36.205926 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/8a2e47e7-4347-4860-8c91-5a2b12ae1066-bundle\") pod \"8a2e47e7-4347-4860-8c91-5a2b12ae1066\" (UID: \"8a2e47e7-4347-4860-8c91-5a2b12ae1066\") " Feb 26 15:59:36 crc kubenswrapper[4907]: I0226 15:59:36.206552 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-99grb\" (UniqueName: \"kubernetes.io/projected/8a2e47e7-4347-4860-8c91-5a2b12ae1066-kube-api-access-99grb\") pod \"8a2e47e7-4347-4860-8c91-5a2b12ae1066\" (UID: \"8a2e47e7-4347-4860-8c91-5a2b12ae1066\") " Feb 26 15:59:36 crc kubenswrapper[4907]: I0226 15:59:36.206810 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/8a2e47e7-4347-4860-8c91-5a2b12ae1066-util\") pod \"8a2e47e7-4347-4860-8c91-5a2b12ae1066\" (UID: \"8a2e47e7-4347-4860-8c91-5a2b12ae1066\") " Feb 26 15:59:36 crc kubenswrapper[4907]: I0226 15:59:36.206898 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/8a2e47e7-4347-4860-8c91-5a2b12ae1066-bundle" (OuterVolumeSpecName: "bundle") pod "8a2e47e7-4347-4860-8c91-5a2b12ae1066" (UID: "8a2e47e7-4347-4860-8c91-5a2b12ae1066"). InnerVolumeSpecName "bundle". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 26 15:59:36 crc kubenswrapper[4907]: I0226 15:59:36.207478 4907 reconciler_common.go:293] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/8a2e47e7-4347-4860-8c91-5a2b12ae1066-bundle\") on node \"crc\" DevicePath \"\"" Feb 26 15:59:36 crc kubenswrapper[4907]: I0226 15:59:36.227810 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8a2e47e7-4347-4860-8c91-5a2b12ae1066-kube-api-access-99grb" (OuterVolumeSpecName: "kube-api-access-99grb") pod "8a2e47e7-4347-4860-8c91-5a2b12ae1066" (UID: "8a2e47e7-4347-4860-8c91-5a2b12ae1066"). InnerVolumeSpecName "kube-api-access-99grb". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 26 15:59:36 crc kubenswrapper[4907]: I0226 15:59:36.230490 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/8a2e47e7-4347-4860-8c91-5a2b12ae1066-util" (OuterVolumeSpecName: "util") pod "8a2e47e7-4347-4860-8c91-5a2b12ae1066" (UID: "8a2e47e7-4347-4860-8c91-5a2b12ae1066"). InnerVolumeSpecName "util". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 26 15:59:36 crc kubenswrapper[4907]: I0226 15:59:36.309255 4907 reconciler_common.go:293] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/8a2e47e7-4347-4860-8c91-5a2b12ae1066-util\") on node \"crc\" DevicePath \"\"" Feb 26 15:59:36 crc kubenswrapper[4907]: I0226 15:59:36.309303 4907 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-99grb\" (UniqueName: \"kubernetes.io/projected/8a2e47e7-4347-4860-8c91-5a2b12ae1066-kube-api-access-99grb\") on node \"crc\" DevicePath \"\"" Feb 26 15:59:36 crc kubenswrapper[4907]: I0226 15:59:36.638038 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/ffa8797d5133dab2efd18584e0d5d66ef5b78c6ec08e68212a6e3345c6rw6x4" event={"ID":"8a2e47e7-4347-4860-8c91-5a2b12ae1066","Type":"ContainerDied","Data":"ba4b2231983fb5e7405f025e48d411c8839f832175e8f2275db6ee67de7367c4"} Feb 26 15:59:36 crc kubenswrapper[4907]: I0226 15:59:36.639139 4907 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="ba4b2231983fb5e7405f025e48d411c8839f832175e8f2275db6ee67de7367c4" Feb 26 15:59:36 crc kubenswrapper[4907]: I0226 15:59:36.638101 4907 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/ffa8797d5133dab2efd18584e0d5d66ef5b78c6ec08e68212a6e3345c6rw6x4" Feb 26 15:59:36 crc kubenswrapper[4907]: I0226 15:59:36.640081 4907 generic.go:334] "Generic (PLEG): container finished" podID="015f7bce-7a88-46a1-a851-3d1aad21abc8" containerID="63af04d57ed9cc39f271afa43a24190045e041335f968ae0654cebbc1b8a1aa1" exitCode=0 Feb 26 15:59:36 crc kubenswrapper[4907]: I0226 15:59:36.640128 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-g5kc9" event={"ID":"015f7bce-7a88-46a1-a851-3d1aad21abc8","Type":"ContainerDied","Data":"63af04d57ed9cc39f271afa43a24190045e041335f968ae0654cebbc1b8a1aa1"} Feb 26 15:59:36 crc kubenswrapper[4907]: I0226 15:59:36.818101 4907 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-sfqpl"] Feb 26 15:59:36 crc kubenswrapper[4907]: I0226 15:59:36.818766 4907 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-sfqpl" podUID="b7a3bde7-a988-429e-a74a-bc06d59143a4" containerName="registry-server" containerID="cri-o://836b6141e1e7c843de40f94acabd601186ceaf30ca261ee57f39d690fa032e25" gracePeriod=2 Feb 26 15:59:37 crc kubenswrapper[4907]: I0226 15:59:37.178669 4907 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-sfqpl" Feb 26 15:59:37 crc kubenswrapper[4907]: I0226 15:59:37.221670 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-b5wkk\" (UniqueName: \"kubernetes.io/projected/b7a3bde7-a988-429e-a74a-bc06d59143a4-kube-api-access-b5wkk\") pod \"b7a3bde7-a988-429e-a74a-bc06d59143a4\" (UID: \"b7a3bde7-a988-429e-a74a-bc06d59143a4\") " Feb 26 15:59:37 crc kubenswrapper[4907]: I0226 15:59:37.221754 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b7a3bde7-a988-429e-a74a-bc06d59143a4-catalog-content\") pod \"b7a3bde7-a988-429e-a74a-bc06d59143a4\" (UID: \"b7a3bde7-a988-429e-a74a-bc06d59143a4\") " Feb 26 15:59:37 crc kubenswrapper[4907]: I0226 15:59:37.221896 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b7a3bde7-a988-429e-a74a-bc06d59143a4-utilities\") pod \"b7a3bde7-a988-429e-a74a-bc06d59143a4\" (UID: \"b7a3bde7-a988-429e-a74a-bc06d59143a4\") " Feb 26 15:59:37 crc kubenswrapper[4907]: I0226 15:59:37.226330 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b7a3bde7-a988-429e-a74a-bc06d59143a4-utilities" (OuterVolumeSpecName: "utilities") pod "b7a3bde7-a988-429e-a74a-bc06d59143a4" (UID: "b7a3bde7-a988-429e-a74a-bc06d59143a4"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 26 15:59:37 crc kubenswrapper[4907]: I0226 15:59:37.229787 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b7a3bde7-a988-429e-a74a-bc06d59143a4-kube-api-access-b5wkk" (OuterVolumeSpecName: "kube-api-access-b5wkk") pod "b7a3bde7-a988-429e-a74a-bc06d59143a4" (UID: "b7a3bde7-a988-429e-a74a-bc06d59143a4"). InnerVolumeSpecName "kube-api-access-b5wkk". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 26 15:59:37 crc kubenswrapper[4907]: I0226 15:59:37.244755 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b7a3bde7-a988-429e-a74a-bc06d59143a4-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "b7a3bde7-a988-429e-a74a-bc06d59143a4" (UID: "b7a3bde7-a988-429e-a74a-bc06d59143a4"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 26 15:59:37 crc kubenswrapper[4907]: I0226 15:59:37.323291 4907 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-b5wkk\" (UniqueName: \"kubernetes.io/projected/b7a3bde7-a988-429e-a74a-bc06d59143a4-kube-api-access-b5wkk\") on node \"crc\" DevicePath \"\"" Feb 26 15:59:37 crc kubenswrapper[4907]: I0226 15:59:37.323686 4907 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b7a3bde7-a988-429e-a74a-bc06d59143a4-catalog-content\") on node \"crc\" DevicePath \"\"" Feb 26 15:59:37 crc kubenswrapper[4907]: I0226 15:59:37.323717 4907 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b7a3bde7-a988-429e-a74a-bc06d59143a4-utilities\") on node \"crc\" DevicePath \"\"" Feb 26 15:59:37 crc kubenswrapper[4907]: I0226 15:59:37.646933 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-g5kc9" event={"ID":"015f7bce-7a88-46a1-a851-3d1aad21abc8","Type":"ContainerStarted","Data":"df81ac1eb953c99901fcb1c36c3dc7618e336844b1a46c5f28b1f9b46c7375ec"} Feb 26 15:59:37 crc kubenswrapper[4907]: I0226 15:59:37.648884 4907 generic.go:334] "Generic (PLEG): container finished" podID="b7a3bde7-a988-429e-a74a-bc06d59143a4" containerID="836b6141e1e7c843de40f94acabd601186ceaf30ca261ee57f39d690fa032e25" exitCode=0 Feb 26 15:59:37 crc kubenswrapper[4907]: I0226 15:59:37.648935 4907 kubelet.go:2453] "SyncLoop (PLEG): event 
for pod" pod="openshift-marketplace/redhat-marketplace-sfqpl" event={"ID":"b7a3bde7-a988-429e-a74a-bc06d59143a4","Type":"ContainerDied","Data":"836b6141e1e7c843de40f94acabd601186ceaf30ca261ee57f39d690fa032e25"} Feb 26 15:59:37 crc kubenswrapper[4907]: I0226 15:59:37.648968 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-sfqpl" event={"ID":"b7a3bde7-a988-429e-a74a-bc06d59143a4","Type":"ContainerDied","Data":"525a38d68865c0eceff4fccad499b56abee41951a6d4d3b4c135d3ca342ca251"} Feb 26 15:59:37 crc kubenswrapper[4907]: I0226 15:59:37.649001 4907 scope.go:117] "RemoveContainer" containerID="836b6141e1e7c843de40f94acabd601186ceaf30ca261ee57f39d690fa032e25" Feb 26 15:59:37 crc kubenswrapper[4907]: I0226 15:59:37.649212 4907 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-sfqpl" Feb 26 15:59:37 crc kubenswrapper[4907]: I0226 15:59:37.662266 4907 scope.go:117] "RemoveContainer" containerID="1b3d3592d094d0994abd2883961c225438189e3e1847c8070d616b5904cfc717" Feb 26 15:59:37 crc kubenswrapper[4907]: I0226 15:59:37.679939 4907 scope.go:117] "RemoveContainer" containerID="bcc575f2079d596184b5cdd9b49d3cb54bdb72030bcd46bd523f2431abd52011" Feb 26 15:59:37 crc kubenswrapper[4907]: I0226 15:59:37.704233 4907 scope.go:117] "RemoveContainer" containerID="836b6141e1e7c843de40f94acabd601186ceaf30ca261ee57f39d690fa032e25" Feb 26 15:59:37 crc kubenswrapper[4907]: I0226 15:59:37.704332 4907 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-sfqpl"] Feb 26 15:59:37 crc kubenswrapper[4907]: E0226 15:59:37.709148 4907 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"836b6141e1e7c843de40f94acabd601186ceaf30ca261ee57f39d690fa032e25\": container with ID starting with 836b6141e1e7c843de40f94acabd601186ceaf30ca261ee57f39d690fa032e25 not found: ID does not 
exist" containerID="836b6141e1e7c843de40f94acabd601186ceaf30ca261ee57f39d690fa032e25" Feb 26 15:59:37 crc kubenswrapper[4907]: I0226 15:59:37.709210 4907 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"836b6141e1e7c843de40f94acabd601186ceaf30ca261ee57f39d690fa032e25"} err="failed to get container status \"836b6141e1e7c843de40f94acabd601186ceaf30ca261ee57f39d690fa032e25\": rpc error: code = NotFound desc = could not find container \"836b6141e1e7c843de40f94acabd601186ceaf30ca261ee57f39d690fa032e25\": container with ID starting with 836b6141e1e7c843de40f94acabd601186ceaf30ca261ee57f39d690fa032e25 not found: ID does not exist" Feb 26 15:59:37 crc kubenswrapper[4907]: I0226 15:59:37.709244 4907 scope.go:117] "RemoveContainer" containerID="1b3d3592d094d0994abd2883961c225438189e3e1847c8070d616b5904cfc717" Feb 26 15:59:37 crc kubenswrapper[4907]: I0226 15:59:37.710468 4907 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-sfqpl"] Feb 26 15:59:37 crc kubenswrapper[4907]: E0226 15:59:37.712421 4907 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"1b3d3592d094d0994abd2883961c225438189e3e1847c8070d616b5904cfc717\": container with ID starting with 1b3d3592d094d0994abd2883961c225438189e3e1847c8070d616b5904cfc717 not found: ID does not exist" containerID="1b3d3592d094d0994abd2883961c225438189e3e1847c8070d616b5904cfc717" Feb 26 15:59:37 crc kubenswrapper[4907]: I0226 15:59:37.712461 4907 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1b3d3592d094d0994abd2883961c225438189e3e1847c8070d616b5904cfc717"} err="failed to get container status \"1b3d3592d094d0994abd2883961c225438189e3e1847c8070d616b5904cfc717\": rpc error: code = NotFound desc = could not find container \"1b3d3592d094d0994abd2883961c225438189e3e1847c8070d616b5904cfc717\": container with ID starting with 
1b3d3592d094d0994abd2883961c225438189e3e1847c8070d616b5904cfc717 not found: ID does not exist" Feb 26 15:59:37 crc kubenswrapper[4907]: I0226 15:59:37.712486 4907 scope.go:117] "RemoveContainer" containerID="bcc575f2079d596184b5cdd9b49d3cb54bdb72030bcd46bd523f2431abd52011" Feb 26 15:59:37 crc kubenswrapper[4907]: E0226 15:59:37.712822 4907 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"bcc575f2079d596184b5cdd9b49d3cb54bdb72030bcd46bd523f2431abd52011\": container with ID starting with bcc575f2079d596184b5cdd9b49d3cb54bdb72030bcd46bd523f2431abd52011 not found: ID does not exist" containerID="bcc575f2079d596184b5cdd9b49d3cb54bdb72030bcd46bd523f2431abd52011" Feb 26 15:59:37 crc kubenswrapper[4907]: I0226 15:59:37.712843 4907 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"bcc575f2079d596184b5cdd9b49d3cb54bdb72030bcd46bd523f2431abd52011"} err="failed to get container status \"bcc575f2079d596184b5cdd9b49d3cb54bdb72030bcd46bd523f2431abd52011\": rpc error: code = NotFound desc = could not find container \"bcc575f2079d596184b5cdd9b49d3cb54bdb72030bcd46bd523f2431abd52011\": container with ID starting with bcc575f2079d596184b5cdd9b49d3cb54bdb72030bcd46bd523f2431abd52011 not found: ID does not exist" Feb 26 15:59:38 crc kubenswrapper[4907]: I0226 15:59:38.134640 4907 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b7a3bde7-a988-429e-a74a-bc06d59143a4" path="/var/lib/kubelet/pods/b7a3bde7-a988-429e-a74a-bc06d59143a4/volumes" Feb 26 15:59:38 crc kubenswrapper[4907]: I0226 15:59:38.659050 4907 generic.go:334] "Generic (PLEG): container finished" podID="015f7bce-7a88-46a1-a851-3d1aad21abc8" containerID="df81ac1eb953c99901fcb1c36c3dc7618e336844b1a46c5f28b1f9b46c7375ec" exitCode=0 Feb 26 15:59:38 crc kubenswrapper[4907]: I0226 15:59:38.659209 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-marketplace/community-operators-g5kc9" event={"ID":"015f7bce-7a88-46a1-a851-3d1aad21abc8","Type":"ContainerDied","Data":"df81ac1eb953c99901fcb1c36c3dc7618e336844b1a46c5f28b1f9b46c7375ec"} Feb 26 15:59:39 crc kubenswrapper[4907]: I0226 15:59:39.383378 4907 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/openstack-operator-controller-init-66fc5dfc5b-4l68j"] Feb 26 15:59:39 crc kubenswrapper[4907]: E0226 15:59:39.383657 4907 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b7a3bde7-a988-429e-a74a-bc06d59143a4" containerName="extract-utilities" Feb 26 15:59:39 crc kubenswrapper[4907]: I0226 15:59:39.383672 4907 state_mem.go:107] "Deleted CPUSet assignment" podUID="b7a3bde7-a988-429e-a74a-bc06d59143a4" containerName="extract-utilities" Feb 26 15:59:39 crc kubenswrapper[4907]: E0226 15:59:39.383686 4907 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b7a3bde7-a988-429e-a74a-bc06d59143a4" containerName="extract-content" Feb 26 15:59:39 crc kubenswrapper[4907]: I0226 15:59:39.383695 4907 state_mem.go:107] "Deleted CPUSet assignment" podUID="b7a3bde7-a988-429e-a74a-bc06d59143a4" containerName="extract-content" Feb 26 15:59:39 crc kubenswrapper[4907]: E0226 15:59:39.383706 4907 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8a2e47e7-4347-4860-8c91-5a2b12ae1066" containerName="util" Feb 26 15:59:39 crc kubenswrapper[4907]: I0226 15:59:39.383714 4907 state_mem.go:107] "Deleted CPUSet assignment" podUID="8a2e47e7-4347-4860-8c91-5a2b12ae1066" containerName="util" Feb 26 15:59:39 crc kubenswrapper[4907]: E0226 15:59:39.383735 4907 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8a2e47e7-4347-4860-8c91-5a2b12ae1066" containerName="extract" Feb 26 15:59:39 crc kubenswrapper[4907]: I0226 15:59:39.383744 4907 state_mem.go:107] "Deleted CPUSet assignment" podUID="8a2e47e7-4347-4860-8c91-5a2b12ae1066" containerName="extract" Feb 26 15:59:39 crc kubenswrapper[4907]: E0226 
15:59:39.383764 4907 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b7a3bde7-a988-429e-a74a-bc06d59143a4" containerName="registry-server" Feb 26 15:59:39 crc kubenswrapper[4907]: I0226 15:59:39.383772 4907 state_mem.go:107] "Deleted CPUSet assignment" podUID="b7a3bde7-a988-429e-a74a-bc06d59143a4" containerName="registry-server" Feb 26 15:59:39 crc kubenswrapper[4907]: E0226 15:59:39.383785 4907 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8a2e47e7-4347-4860-8c91-5a2b12ae1066" containerName="pull" Feb 26 15:59:39 crc kubenswrapper[4907]: I0226 15:59:39.383792 4907 state_mem.go:107] "Deleted CPUSet assignment" podUID="8a2e47e7-4347-4860-8c91-5a2b12ae1066" containerName="pull" Feb 26 15:59:39 crc kubenswrapper[4907]: I0226 15:59:39.383996 4907 memory_manager.go:354] "RemoveStaleState removing state" podUID="b7a3bde7-a988-429e-a74a-bc06d59143a4" containerName="registry-server" Feb 26 15:59:39 crc kubenswrapper[4907]: I0226 15:59:39.384017 4907 memory_manager.go:354] "RemoveStaleState removing state" podUID="8a2e47e7-4347-4860-8c91-5a2b12ae1066" containerName="extract" Feb 26 15:59:39 crc kubenswrapper[4907]: I0226 15:59:39.384496 4907 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/openstack-operator-controller-init-66fc5dfc5b-4l68j" Feb 26 15:59:39 crc kubenswrapper[4907]: I0226 15:59:39.387458 4907 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"openstack-operator-controller-init-dockercfg-qp2zj" Feb 26 15:59:39 crc kubenswrapper[4907]: I0226 15:59:39.420775 4907 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-controller-init-66fc5dfc5b-4l68j"] Feb 26 15:59:39 crc kubenswrapper[4907]: I0226 15:59:39.455786 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-d5fcj\" (UniqueName: \"kubernetes.io/projected/76bf7541-fa3f-471d-8a14-99300afab6c1-kube-api-access-d5fcj\") pod \"openstack-operator-controller-init-66fc5dfc5b-4l68j\" (UID: \"76bf7541-fa3f-471d-8a14-99300afab6c1\") " pod="openstack-operators/openstack-operator-controller-init-66fc5dfc5b-4l68j" Feb 26 15:59:39 crc kubenswrapper[4907]: I0226 15:59:39.556542 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-d5fcj\" (UniqueName: \"kubernetes.io/projected/76bf7541-fa3f-471d-8a14-99300afab6c1-kube-api-access-d5fcj\") pod \"openstack-operator-controller-init-66fc5dfc5b-4l68j\" (UID: \"76bf7541-fa3f-471d-8a14-99300afab6c1\") " pod="openstack-operators/openstack-operator-controller-init-66fc5dfc5b-4l68j" Feb 26 15:59:39 crc kubenswrapper[4907]: I0226 15:59:39.584739 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-d5fcj\" (UniqueName: \"kubernetes.io/projected/76bf7541-fa3f-471d-8a14-99300afab6c1-kube-api-access-d5fcj\") pod \"openstack-operator-controller-init-66fc5dfc5b-4l68j\" (UID: \"76bf7541-fa3f-471d-8a14-99300afab6c1\") " pod="openstack-operators/openstack-operator-controller-init-66fc5dfc5b-4l68j" Feb 26 15:59:39 crc kubenswrapper[4907]: I0226 15:59:39.670565 4907 kubelet.go:2453] "SyncLoop (PLEG): 
event for pod" pod="openshift-marketplace/community-operators-g5kc9" event={"ID":"015f7bce-7a88-46a1-a851-3d1aad21abc8","Type":"ContainerStarted","Data":"670cb4dae5e8342f3b11f8cda1996eef8bb4bdde3a3604869be4f1085dfde3b4"} Feb 26 15:59:39 crc kubenswrapper[4907]: I0226 15:59:39.690226 4907 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-g5kc9" podStartSLOduration=3.014428695 podStartE2EDuration="5.690209623s" podCreationTimestamp="2026-02-26 15:59:34 +0000 UTC" firstStartedPulling="2026-02-26 15:59:36.641733005 +0000 UTC m=+1039.160294854" lastFinishedPulling="2026-02-26 15:59:39.317513923 +0000 UTC m=+1041.836075782" observedRunningTime="2026-02-26 15:59:39.686243506 +0000 UTC m=+1042.204805355" watchObservedRunningTime="2026-02-26 15:59:39.690209623 +0000 UTC m=+1042.208771472" Feb 26 15:59:39 crc kubenswrapper[4907]: I0226 15:59:39.702145 4907 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/openstack-operator-controller-init-66fc5dfc5b-4l68j" Feb 26 15:59:39 crc kubenswrapper[4907]: I0226 15:59:39.947700 4907 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-controller-init-66fc5dfc5b-4l68j"] Feb 26 15:59:40 crc kubenswrapper[4907]: I0226 15:59:40.678975 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-controller-init-66fc5dfc5b-4l68j" event={"ID":"76bf7541-fa3f-471d-8a14-99300afab6c1","Type":"ContainerStarted","Data":"2ef79f20c3f94cbaae331bbc406df7c6e8087c606d7ec55c8e95bc9c2c66b331"} Feb 26 15:59:44 crc kubenswrapper[4907]: I0226 15:59:44.562772 4907 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-g5kc9" Feb 26 15:59:44 crc kubenswrapper[4907]: I0226 15:59:44.563089 4907 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-g5kc9" 
Feb 26 15:59:44 crc kubenswrapper[4907]: I0226 15:59:44.609983 4907 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-g5kc9" Feb 26 15:59:44 crc kubenswrapper[4907]: I0226 15:59:44.744325 4907 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-g5kc9" Feb 26 15:59:45 crc kubenswrapper[4907]: I0226 15:59:45.709018 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-controller-init-66fc5dfc5b-4l68j" event={"ID":"76bf7541-fa3f-471d-8a14-99300afab6c1","Type":"ContainerStarted","Data":"a4d5f0d26a4a3c534373c1cc7d44f213797f46485e34028b1181ef0f8bc99062"} Feb 26 15:59:45 crc kubenswrapper[4907]: I0226 15:59:45.741970 4907 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/openstack-operator-controller-init-66fc5dfc5b-4l68j" podStartSLOduration=1.6344377030000001 podStartE2EDuration="6.741952523s" podCreationTimestamp="2026-02-26 15:59:39 +0000 UTC" firstStartedPulling="2026-02-26 15:59:39.959359246 +0000 UTC m=+1042.477921095" lastFinishedPulling="2026-02-26 15:59:45.066874066 +0000 UTC m=+1047.585435915" observedRunningTime="2026-02-26 15:59:45.737201009 +0000 UTC m=+1048.255762878" watchObservedRunningTime="2026-02-26 15:59:45.741952523 +0000 UTC m=+1048.260514392" Feb 26 15:59:46 crc kubenswrapper[4907]: I0226 15:59:46.713791 4907 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/openstack-operator-controller-init-66fc5dfc5b-4l68j" Feb 26 15:59:47 crc kubenswrapper[4907]: I0226 15:59:47.013266 4907 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-g5kc9"] Feb 26 15:59:47 crc kubenswrapper[4907]: I0226 15:59:47.013913 4907 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-g5kc9" 
podUID="015f7bce-7a88-46a1-a851-3d1aad21abc8" containerName="registry-server" containerID="cri-o://670cb4dae5e8342f3b11f8cda1996eef8bb4bdde3a3604869be4f1085dfde3b4" gracePeriod=2 Feb 26 15:59:47 crc kubenswrapper[4907]: I0226 15:59:47.433884 4907 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-g5kc9" Feb 26 15:59:47 crc kubenswrapper[4907]: I0226 15:59:47.481403 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jp2f4\" (UniqueName: \"kubernetes.io/projected/015f7bce-7a88-46a1-a851-3d1aad21abc8-kube-api-access-jp2f4\") pod \"015f7bce-7a88-46a1-a851-3d1aad21abc8\" (UID: \"015f7bce-7a88-46a1-a851-3d1aad21abc8\") " Feb 26 15:59:47 crc kubenswrapper[4907]: I0226 15:59:47.481453 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/015f7bce-7a88-46a1-a851-3d1aad21abc8-catalog-content\") pod \"015f7bce-7a88-46a1-a851-3d1aad21abc8\" (UID: \"015f7bce-7a88-46a1-a851-3d1aad21abc8\") " Feb 26 15:59:47 crc kubenswrapper[4907]: I0226 15:59:47.481502 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/015f7bce-7a88-46a1-a851-3d1aad21abc8-utilities\") pod \"015f7bce-7a88-46a1-a851-3d1aad21abc8\" (UID: \"015f7bce-7a88-46a1-a851-3d1aad21abc8\") " Feb 26 15:59:47 crc kubenswrapper[4907]: I0226 15:59:47.482400 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/015f7bce-7a88-46a1-a851-3d1aad21abc8-utilities" (OuterVolumeSpecName: "utilities") pod "015f7bce-7a88-46a1-a851-3d1aad21abc8" (UID: "015f7bce-7a88-46a1-a851-3d1aad21abc8"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 26 15:59:47 crc kubenswrapper[4907]: I0226 15:59:47.488403 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/015f7bce-7a88-46a1-a851-3d1aad21abc8-kube-api-access-jp2f4" (OuterVolumeSpecName: "kube-api-access-jp2f4") pod "015f7bce-7a88-46a1-a851-3d1aad21abc8" (UID: "015f7bce-7a88-46a1-a851-3d1aad21abc8"). InnerVolumeSpecName "kube-api-access-jp2f4". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 26 15:59:47 crc kubenswrapper[4907]: I0226 15:59:47.543415 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/015f7bce-7a88-46a1-a851-3d1aad21abc8-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "015f7bce-7a88-46a1-a851-3d1aad21abc8" (UID: "015f7bce-7a88-46a1-a851-3d1aad21abc8"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 26 15:59:47 crc kubenswrapper[4907]: I0226 15:59:47.583231 4907 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jp2f4\" (UniqueName: \"kubernetes.io/projected/015f7bce-7a88-46a1-a851-3d1aad21abc8-kube-api-access-jp2f4\") on node \"crc\" DevicePath \"\"" Feb 26 15:59:47 crc kubenswrapper[4907]: I0226 15:59:47.583261 4907 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/015f7bce-7a88-46a1-a851-3d1aad21abc8-catalog-content\") on node \"crc\" DevicePath \"\"" Feb 26 15:59:47 crc kubenswrapper[4907]: I0226 15:59:47.583270 4907 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/015f7bce-7a88-46a1-a851-3d1aad21abc8-utilities\") on node \"crc\" DevicePath \"\"" Feb 26 15:59:47 crc kubenswrapper[4907]: I0226 15:59:47.724338 4907 generic.go:334] "Generic (PLEG): container finished" podID="015f7bce-7a88-46a1-a851-3d1aad21abc8" 
containerID="670cb4dae5e8342f3b11f8cda1996eef8bb4bdde3a3604869be4f1085dfde3b4" exitCode=0 Feb 26 15:59:47 crc kubenswrapper[4907]: I0226 15:59:47.724419 4907 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-g5kc9" Feb 26 15:59:47 crc kubenswrapper[4907]: I0226 15:59:47.724426 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-g5kc9" event={"ID":"015f7bce-7a88-46a1-a851-3d1aad21abc8","Type":"ContainerDied","Data":"670cb4dae5e8342f3b11f8cda1996eef8bb4bdde3a3604869be4f1085dfde3b4"} Feb 26 15:59:47 crc kubenswrapper[4907]: I0226 15:59:47.725424 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-g5kc9" event={"ID":"015f7bce-7a88-46a1-a851-3d1aad21abc8","Type":"ContainerDied","Data":"5c4559ecc4ecabe32219bca5ea1324b4885529a60dd4c02cde971044aa268a0c"} Feb 26 15:59:47 crc kubenswrapper[4907]: I0226 15:59:47.725465 4907 scope.go:117] "RemoveContainer" containerID="670cb4dae5e8342f3b11f8cda1996eef8bb4bdde3a3604869be4f1085dfde3b4" Feb 26 15:59:47 crc kubenswrapper[4907]: I0226 15:59:47.744315 4907 scope.go:117] "RemoveContainer" containerID="df81ac1eb953c99901fcb1c36c3dc7618e336844b1a46c5f28b1f9b46c7375ec" Feb 26 15:59:47 crc kubenswrapper[4907]: I0226 15:59:47.777138 4907 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-g5kc9"] Feb 26 15:59:47 crc kubenswrapper[4907]: I0226 15:59:47.781668 4907 scope.go:117] "RemoveContainer" containerID="63af04d57ed9cc39f271afa43a24190045e041335f968ae0654cebbc1b8a1aa1" Feb 26 15:59:47 crc kubenswrapper[4907]: I0226 15:59:47.791504 4907 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-g5kc9"] Feb 26 15:59:47 crc kubenswrapper[4907]: I0226 15:59:47.802494 4907 scope.go:117] "RemoveContainer" containerID="670cb4dae5e8342f3b11f8cda1996eef8bb4bdde3a3604869be4f1085dfde3b4" Feb 26 
15:59:47 crc kubenswrapper[4907]: E0226 15:59:47.803988 4907 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"670cb4dae5e8342f3b11f8cda1996eef8bb4bdde3a3604869be4f1085dfde3b4\": container with ID starting with 670cb4dae5e8342f3b11f8cda1996eef8bb4bdde3a3604869be4f1085dfde3b4 not found: ID does not exist" containerID="670cb4dae5e8342f3b11f8cda1996eef8bb4bdde3a3604869be4f1085dfde3b4" Feb 26 15:59:47 crc kubenswrapper[4907]: I0226 15:59:47.804028 4907 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"670cb4dae5e8342f3b11f8cda1996eef8bb4bdde3a3604869be4f1085dfde3b4"} err="failed to get container status \"670cb4dae5e8342f3b11f8cda1996eef8bb4bdde3a3604869be4f1085dfde3b4\": rpc error: code = NotFound desc = could not find container \"670cb4dae5e8342f3b11f8cda1996eef8bb4bdde3a3604869be4f1085dfde3b4\": container with ID starting with 670cb4dae5e8342f3b11f8cda1996eef8bb4bdde3a3604869be4f1085dfde3b4 not found: ID does not exist" Feb 26 15:59:47 crc kubenswrapper[4907]: I0226 15:59:47.804055 4907 scope.go:117] "RemoveContainer" containerID="df81ac1eb953c99901fcb1c36c3dc7618e336844b1a46c5f28b1f9b46c7375ec" Feb 26 15:59:47 crc kubenswrapper[4907]: E0226 15:59:47.804296 4907 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"df81ac1eb953c99901fcb1c36c3dc7618e336844b1a46c5f28b1f9b46c7375ec\": container with ID starting with df81ac1eb953c99901fcb1c36c3dc7618e336844b1a46c5f28b1f9b46c7375ec not found: ID does not exist" containerID="df81ac1eb953c99901fcb1c36c3dc7618e336844b1a46c5f28b1f9b46c7375ec" Feb 26 15:59:47 crc kubenswrapper[4907]: I0226 15:59:47.804316 4907 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"df81ac1eb953c99901fcb1c36c3dc7618e336844b1a46c5f28b1f9b46c7375ec"} err="failed to get container status 
\"df81ac1eb953c99901fcb1c36c3dc7618e336844b1a46c5f28b1f9b46c7375ec\": rpc error: code = NotFound desc = could not find container \"df81ac1eb953c99901fcb1c36c3dc7618e336844b1a46c5f28b1f9b46c7375ec\": container with ID starting with df81ac1eb953c99901fcb1c36c3dc7618e336844b1a46c5f28b1f9b46c7375ec not found: ID does not exist" Feb 26 15:59:47 crc kubenswrapper[4907]: I0226 15:59:47.804330 4907 scope.go:117] "RemoveContainer" containerID="63af04d57ed9cc39f271afa43a24190045e041335f968ae0654cebbc1b8a1aa1" Feb 26 15:59:47 crc kubenswrapper[4907]: E0226 15:59:47.804558 4907 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"63af04d57ed9cc39f271afa43a24190045e041335f968ae0654cebbc1b8a1aa1\": container with ID starting with 63af04d57ed9cc39f271afa43a24190045e041335f968ae0654cebbc1b8a1aa1 not found: ID does not exist" containerID="63af04d57ed9cc39f271afa43a24190045e041335f968ae0654cebbc1b8a1aa1" Feb 26 15:59:47 crc kubenswrapper[4907]: I0226 15:59:47.804610 4907 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"63af04d57ed9cc39f271afa43a24190045e041335f968ae0654cebbc1b8a1aa1"} err="failed to get container status \"63af04d57ed9cc39f271afa43a24190045e041335f968ae0654cebbc1b8a1aa1\": rpc error: code = NotFound desc = could not find container \"63af04d57ed9cc39f271afa43a24190045e041335f968ae0654cebbc1b8a1aa1\": container with ID starting with 63af04d57ed9cc39f271afa43a24190045e041335f968ae0654cebbc1b8a1aa1 not found: ID does not exist" Feb 26 15:59:48 crc kubenswrapper[4907]: I0226 15:59:48.136517 4907 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="015f7bce-7a88-46a1-a851-3d1aad21abc8" path="/var/lib/kubelet/pods/015f7bce-7a88-46a1-a851-3d1aad21abc8/volumes" Feb 26 15:59:48 crc kubenswrapper[4907]: I0226 15:59:48.531237 4907 patch_prober.go:28] interesting pod/machine-config-daemon-v5ng6 container/machine-config-daemon 
namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 26 15:59:48 crc kubenswrapper[4907]: I0226 15:59:48.531303 4907 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-v5ng6" podUID="917eebf3-db36-47b8-af0a-b80d042fddab" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 26 15:59:48 crc kubenswrapper[4907]: I0226 15:59:48.531353 4907 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-v5ng6" Feb 26 15:59:48 crc kubenswrapper[4907]: I0226 15:59:48.531943 4907 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"9e579d2506f44ad3d5c29d72a7fa0d983bb32b89f28c090014c2276378479cce"} pod="openshift-machine-config-operator/machine-config-daemon-v5ng6" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Feb 26 15:59:48 crc kubenswrapper[4907]: I0226 15:59:48.531998 4907 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-v5ng6" podUID="917eebf3-db36-47b8-af0a-b80d042fddab" containerName="machine-config-daemon" containerID="cri-o://9e579d2506f44ad3d5c29d72a7fa0d983bb32b89f28c090014c2276378479cce" gracePeriod=600 Feb 26 15:59:48 crc kubenswrapper[4907]: I0226 15:59:48.738393 4907 generic.go:334] "Generic (PLEG): container finished" podID="917eebf3-db36-47b8-af0a-b80d042fddab" containerID="9e579d2506f44ad3d5c29d72a7fa0d983bb32b89f28c090014c2276378479cce" exitCode=0 Feb 26 15:59:48 crc kubenswrapper[4907]: I0226 15:59:48.738717 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-machine-config-operator/machine-config-daemon-v5ng6" event={"ID":"917eebf3-db36-47b8-af0a-b80d042fddab","Type":"ContainerDied","Data":"9e579d2506f44ad3d5c29d72a7fa0d983bb32b89f28c090014c2276378479cce"} Feb 26 15:59:48 crc kubenswrapper[4907]: I0226 15:59:48.738745 4907 scope.go:117] "RemoveContainer" containerID="135e9e11cfbaabe55bbe34848f747e715822492af89a2d18c459beb482f280c0" Feb 26 15:59:49 crc kubenswrapper[4907]: I0226 15:59:49.745887 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-v5ng6" event={"ID":"917eebf3-db36-47b8-af0a-b80d042fddab","Type":"ContainerStarted","Data":"2db300a26f9a65971b75abb9b1132aae00d9a358285f4cb580b858c6563b8062"} Feb 26 15:59:53 crc kubenswrapper[4907]: I0226 15:59:53.104060 4907 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-8xhlb"] Feb 26 15:59:53 crc kubenswrapper[4907]: E0226 15:59:53.104808 4907 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="015f7bce-7a88-46a1-a851-3d1aad21abc8" containerName="extract-utilities" Feb 26 15:59:53 crc kubenswrapper[4907]: I0226 15:59:53.104822 4907 state_mem.go:107] "Deleted CPUSet assignment" podUID="015f7bce-7a88-46a1-a851-3d1aad21abc8" containerName="extract-utilities" Feb 26 15:59:53 crc kubenswrapper[4907]: E0226 15:59:53.104837 4907 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="015f7bce-7a88-46a1-a851-3d1aad21abc8" containerName="extract-content" Feb 26 15:59:53 crc kubenswrapper[4907]: I0226 15:59:53.104843 4907 state_mem.go:107] "Deleted CPUSet assignment" podUID="015f7bce-7a88-46a1-a851-3d1aad21abc8" containerName="extract-content" Feb 26 15:59:53 crc kubenswrapper[4907]: E0226 15:59:53.104852 4907 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="015f7bce-7a88-46a1-a851-3d1aad21abc8" containerName="registry-server" Feb 26 15:59:53 crc kubenswrapper[4907]: I0226 15:59:53.104858 4907 state_mem.go:107] 
"Deleted CPUSet assignment" podUID="015f7bce-7a88-46a1-a851-3d1aad21abc8" containerName="registry-server" Feb 26 15:59:53 crc kubenswrapper[4907]: I0226 15:59:53.104982 4907 memory_manager.go:354] "RemoveStaleState removing state" podUID="015f7bce-7a88-46a1-a851-3d1aad21abc8" containerName="registry-server" Feb 26 15:59:53 crc kubenswrapper[4907]: I0226 15:59:53.105831 4907 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-8xhlb" Feb 26 15:59:53 crc kubenswrapper[4907]: I0226 15:59:53.117940 4907 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-8xhlb"] Feb 26 15:59:53 crc kubenswrapper[4907]: I0226 15:59:53.163324 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/91cbebe7-f2b7-4aae-8164-1df1a4e56e0d-utilities\") pod \"certified-operators-8xhlb\" (UID: \"91cbebe7-f2b7-4aae-8164-1df1a4e56e0d\") " pod="openshift-marketplace/certified-operators-8xhlb" Feb 26 15:59:53 crc kubenswrapper[4907]: I0226 15:59:53.163430 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/91cbebe7-f2b7-4aae-8164-1df1a4e56e0d-catalog-content\") pod \"certified-operators-8xhlb\" (UID: \"91cbebe7-f2b7-4aae-8164-1df1a4e56e0d\") " pod="openshift-marketplace/certified-operators-8xhlb" Feb 26 15:59:53 crc kubenswrapper[4907]: I0226 15:59:53.163455 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cbdft\" (UniqueName: \"kubernetes.io/projected/91cbebe7-f2b7-4aae-8164-1df1a4e56e0d-kube-api-access-cbdft\") pod \"certified-operators-8xhlb\" (UID: \"91cbebe7-f2b7-4aae-8164-1df1a4e56e0d\") " pod="openshift-marketplace/certified-operators-8xhlb" Feb 26 15:59:53 crc kubenswrapper[4907]: I0226 15:59:53.265115 4907 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/91cbebe7-f2b7-4aae-8164-1df1a4e56e0d-utilities\") pod \"certified-operators-8xhlb\" (UID: \"91cbebe7-f2b7-4aae-8164-1df1a4e56e0d\") " pod="openshift-marketplace/certified-operators-8xhlb" Feb 26 15:59:53 crc kubenswrapper[4907]: I0226 15:59:53.265217 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/91cbebe7-f2b7-4aae-8164-1df1a4e56e0d-catalog-content\") pod \"certified-operators-8xhlb\" (UID: \"91cbebe7-f2b7-4aae-8164-1df1a4e56e0d\") " pod="openshift-marketplace/certified-operators-8xhlb" Feb 26 15:59:53 crc kubenswrapper[4907]: I0226 15:59:53.265239 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cbdft\" (UniqueName: \"kubernetes.io/projected/91cbebe7-f2b7-4aae-8164-1df1a4e56e0d-kube-api-access-cbdft\") pod \"certified-operators-8xhlb\" (UID: \"91cbebe7-f2b7-4aae-8164-1df1a4e56e0d\") " pod="openshift-marketplace/certified-operators-8xhlb" Feb 26 15:59:53 crc kubenswrapper[4907]: I0226 15:59:53.265723 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/91cbebe7-f2b7-4aae-8164-1df1a4e56e0d-utilities\") pod \"certified-operators-8xhlb\" (UID: \"91cbebe7-f2b7-4aae-8164-1df1a4e56e0d\") " pod="openshift-marketplace/certified-operators-8xhlb" Feb 26 15:59:53 crc kubenswrapper[4907]: I0226 15:59:53.266028 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/91cbebe7-f2b7-4aae-8164-1df1a4e56e0d-catalog-content\") pod \"certified-operators-8xhlb\" (UID: \"91cbebe7-f2b7-4aae-8164-1df1a4e56e0d\") " pod="openshift-marketplace/certified-operators-8xhlb" Feb 26 15:59:53 crc kubenswrapper[4907]: I0226 15:59:53.287065 4907 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"kube-api-access-cbdft\" (UniqueName: \"kubernetes.io/projected/91cbebe7-f2b7-4aae-8164-1df1a4e56e0d-kube-api-access-cbdft\") pod \"certified-operators-8xhlb\" (UID: \"91cbebe7-f2b7-4aae-8164-1df1a4e56e0d\") " pod="openshift-marketplace/certified-operators-8xhlb" Feb 26 15:59:53 crc kubenswrapper[4907]: I0226 15:59:53.458914 4907 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-8xhlb" Feb 26 15:59:53 crc kubenswrapper[4907]: I0226 15:59:53.798793 4907 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-8xhlb"] Feb 26 15:59:54 crc kubenswrapper[4907]: I0226 15:59:54.779732 4907 generic.go:334] "Generic (PLEG): container finished" podID="91cbebe7-f2b7-4aae-8164-1df1a4e56e0d" containerID="d48d2e5e494d83c0e251463b374c7b84043c73b4684becb2d47ef217057e35e1" exitCode=0 Feb 26 15:59:54 crc kubenswrapper[4907]: I0226 15:59:54.779943 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-8xhlb" event={"ID":"91cbebe7-f2b7-4aae-8164-1df1a4e56e0d","Type":"ContainerDied","Data":"d48d2e5e494d83c0e251463b374c7b84043c73b4684becb2d47ef217057e35e1"} Feb 26 15:59:54 crc kubenswrapper[4907]: I0226 15:59:54.780801 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-8xhlb" event={"ID":"91cbebe7-f2b7-4aae-8164-1df1a4e56e0d","Type":"ContainerStarted","Data":"d19483c2b7b77c03252117dd8b9a04085a84e14536783f329795db0cc46defc9"} Feb 26 15:59:56 crc kubenswrapper[4907]: I0226 15:59:56.796696 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-8xhlb" event={"ID":"91cbebe7-f2b7-4aae-8164-1df1a4e56e0d","Type":"ContainerStarted","Data":"04c45b86d4a27d1ccadb4a271a503c6889f1a340d597742ca0cb6d19689fc5b3"} Feb 26 15:59:57 crc kubenswrapper[4907]: I0226 15:59:57.806723 4907 generic.go:334] "Generic (PLEG): 
container finished" podID="91cbebe7-f2b7-4aae-8164-1df1a4e56e0d" containerID="04c45b86d4a27d1ccadb4a271a503c6889f1a340d597742ca0cb6d19689fc5b3" exitCode=0 Feb 26 15:59:57 crc kubenswrapper[4907]: I0226 15:59:57.807102 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-8xhlb" event={"ID":"91cbebe7-f2b7-4aae-8164-1df1a4e56e0d","Type":"ContainerDied","Data":"04c45b86d4a27d1ccadb4a271a503c6889f1a340d597742ca0cb6d19689fc5b3"} Feb 26 15:59:58 crc kubenswrapper[4907]: I0226 15:59:58.814445 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-8xhlb" event={"ID":"91cbebe7-f2b7-4aae-8164-1df1a4e56e0d","Type":"ContainerStarted","Data":"d7b91ba37b0c7e61217ccf5f609176bb09241e195925f00a51fe18fe99fe4963"} Feb 26 15:59:58 crc kubenswrapper[4907]: I0226 15:59:58.840652 4907 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-8xhlb" podStartSLOduration=2.4035727639999998 podStartE2EDuration="5.840631434s" podCreationTimestamp="2026-02-26 15:59:53 +0000 UTC" firstStartedPulling="2026-02-26 15:59:54.784787829 +0000 UTC m=+1057.303349678" lastFinishedPulling="2026-02-26 15:59:58.221846499 +0000 UTC m=+1060.740408348" observedRunningTime="2026-02-26 15:59:58.835078746 +0000 UTC m=+1061.353640615" watchObservedRunningTime="2026-02-26 15:59:58.840631434 +0000 UTC m=+1061.359193283" Feb 26 15:59:59 crc kubenswrapper[4907]: I0226 15:59:59.710212 4907 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/openstack-operator-controller-init-66fc5dfc5b-4l68j" Feb 26 16:00:00 crc kubenswrapper[4907]: I0226 16:00:00.138230 4907 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29535360-drhn6"] Feb 26 16:00:00 crc kubenswrapper[4907]: I0226 16:00:00.138975 4907 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29535360-drhn6" Feb 26 16:00:00 crc kubenswrapper[4907]: I0226 16:00:00.141275 4907 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Feb 26 16:00:00 crc kubenswrapper[4907]: I0226 16:00:00.141332 4907 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Feb 26 16:00:00 crc kubenswrapper[4907]: I0226 16:00:00.144753 4907 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29535360-wvhft"] Feb 26 16:00:00 crc kubenswrapper[4907]: I0226 16:00:00.145554 4907 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29535360-wvhft" Feb 26 16:00:00 crc kubenswrapper[4907]: I0226 16:00:00.145781 4907 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-n2mrp" Feb 26 16:00:00 crc kubenswrapper[4907]: I0226 16:00:00.148242 4907 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t" Feb 26 16:00:00 crc kubenswrapper[4907]: I0226 16:00:00.149454 4907 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config" Feb 26 16:00:00 crc kubenswrapper[4907]: I0226 16:00:00.151721 4907 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29535360-drhn6"] Feb 26 16:00:00 crc kubenswrapper[4907]: I0226 16:00:00.176080 4907 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29535360-wvhft"] Feb 26 16:00:00 crc kubenswrapper[4907]: I0226 16:00:00.189232 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: 
\"kubernetes.io/configmap/cfe0ded7-c52b-497d-8b97-d396cee606cf-config-volume\") pod \"collect-profiles-29535360-wvhft\" (UID: \"cfe0ded7-c52b-497d-8b97-d396cee606cf\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29535360-wvhft" Feb 26 16:00:00 crc kubenswrapper[4907]: I0226 16:00:00.189280 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/cfe0ded7-c52b-497d-8b97-d396cee606cf-secret-volume\") pod \"collect-profiles-29535360-wvhft\" (UID: \"cfe0ded7-c52b-497d-8b97-d396cee606cf\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29535360-wvhft" Feb 26 16:00:00 crc kubenswrapper[4907]: I0226 16:00:00.189321 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-twhc4\" (UniqueName: \"kubernetes.io/projected/e9bb11b5-c26b-4877-bb98-a7a5a22654d6-kube-api-access-twhc4\") pod \"auto-csr-approver-29535360-drhn6\" (UID: \"e9bb11b5-c26b-4877-bb98-a7a5a22654d6\") " pod="openshift-infra/auto-csr-approver-29535360-drhn6" Feb 26 16:00:00 crc kubenswrapper[4907]: I0226 16:00:00.189363 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4lr6k\" (UniqueName: \"kubernetes.io/projected/cfe0ded7-c52b-497d-8b97-d396cee606cf-kube-api-access-4lr6k\") pod \"collect-profiles-29535360-wvhft\" (UID: \"cfe0ded7-c52b-497d-8b97-d396cee606cf\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29535360-wvhft" Feb 26 16:00:00 crc kubenswrapper[4907]: I0226 16:00:00.290893 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-twhc4\" (UniqueName: \"kubernetes.io/projected/e9bb11b5-c26b-4877-bb98-a7a5a22654d6-kube-api-access-twhc4\") pod \"auto-csr-approver-29535360-drhn6\" (UID: \"e9bb11b5-c26b-4877-bb98-a7a5a22654d6\") " pod="openshift-infra/auto-csr-approver-29535360-drhn6" Feb 26 
16:00:00 crc kubenswrapper[4907]: I0226 16:00:00.291790 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4lr6k\" (UniqueName: \"kubernetes.io/projected/cfe0ded7-c52b-497d-8b97-d396cee606cf-kube-api-access-4lr6k\") pod \"collect-profiles-29535360-wvhft\" (UID: \"cfe0ded7-c52b-497d-8b97-d396cee606cf\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29535360-wvhft" Feb 26 16:00:00 crc kubenswrapper[4907]: I0226 16:00:00.291895 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/cfe0ded7-c52b-497d-8b97-d396cee606cf-config-volume\") pod \"collect-profiles-29535360-wvhft\" (UID: \"cfe0ded7-c52b-497d-8b97-d396cee606cf\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29535360-wvhft" Feb 26 16:00:00 crc kubenswrapper[4907]: I0226 16:00:00.291933 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/cfe0ded7-c52b-497d-8b97-d396cee606cf-secret-volume\") pod \"collect-profiles-29535360-wvhft\" (UID: \"cfe0ded7-c52b-497d-8b97-d396cee606cf\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29535360-wvhft" Feb 26 16:00:00 crc kubenswrapper[4907]: I0226 16:00:00.293496 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/cfe0ded7-c52b-497d-8b97-d396cee606cf-config-volume\") pod \"collect-profiles-29535360-wvhft\" (UID: \"cfe0ded7-c52b-497d-8b97-d396cee606cf\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29535360-wvhft" Feb 26 16:00:00 crc kubenswrapper[4907]: I0226 16:00:00.306387 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/cfe0ded7-c52b-497d-8b97-d396cee606cf-secret-volume\") pod \"collect-profiles-29535360-wvhft\" (UID: 
\"cfe0ded7-c52b-497d-8b97-d396cee606cf\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29535360-wvhft" Feb 26 16:00:00 crc kubenswrapper[4907]: I0226 16:00:00.308227 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-twhc4\" (UniqueName: \"kubernetes.io/projected/e9bb11b5-c26b-4877-bb98-a7a5a22654d6-kube-api-access-twhc4\") pod \"auto-csr-approver-29535360-drhn6\" (UID: \"e9bb11b5-c26b-4877-bb98-a7a5a22654d6\") " pod="openshift-infra/auto-csr-approver-29535360-drhn6" Feb 26 16:00:00 crc kubenswrapper[4907]: I0226 16:00:00.315766 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4lr6k\" (UniqueName: \"kubernetes.io/projected/cfe0ded7-c52b-497d-8b97-d396cee606cf-kube-api-access-4lr6k\") pod \"collect-profiles-29535360-wvhft\" (UID: \"cfe0ded7-c52b-497d-8b97-d396cee606cf\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29535360-wvhft" Feb 26 16:00:00 crc kubenswrapper[4907]: I0226 16:00:00.465202 4907 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29535360-drhn6" Feb 26 16:00:00 crc kubenswrapper[4907]: I0226 16:00:00.474784 4907 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29535360-wvhft" Feb 26 16:00:00 crc kubenswrapper[4907]: I0226 16:00:00.786691 4907 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29535360-wvhft"] Feb 26 16:00:00 crc kubenswrapper[4907]: W0226 16:00:00.795541 4907 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podcfe0ded7_c52b_497d_8b97_d396cee606cf.slice/crio-5314e6e1e7fdf59a4e5a5841169e2ed45a81d1f117e1b8962128fc9aca0addd3 WatchSource:0}: Error finding container 5314e6e1e7fdf59a4e5a5841169e2ed45a81d1f117e1b8962128fc9aca0addd3: Status 404 returned error can't find the container with id 5314e6e1e7fdf59a4e5a5841169e2ed45a81d1f117e1b8962128fc9aca0addd3 Feb 26 16:00:00 crc kubenswrapper[4907]: I0226 16:00:00.834169 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29535360-wvhft" event={"ID":"cfe0ded7-c52b-497d-8b97-d396cee606cf","Type":"ContainerStarted","Data":"5314e6e1e7fdf59a4e5a5841169e2ed45a81d1f117e1b8962128fc9aca0addd3"} Feb 26 16:00:00 crc kubenswrapper[4907]: I0226 16:00:00.850164 4907 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29535360-drhn6"] Feb 26 16:00:00 crc kubenswrapper[4907]: W0226 16:00:00.868773 4907 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pode9bb11b5_c26b_4877_bb98_a7a5a22654d6.slice/crio-a8f7787277f6de5f6a7d2489754a7650e50691df740d11cb6728c94e9ee60ebd WatchSource:0}: Error finding container a8f7787277f6de5f6a7d2489754a7650e50691df740d11cb6728c94e9ee60ebd: Status 404 returned error can't find the container with id a8f7787277f6de5f6a7d2489754a7650e50691df740d11cb6728c94e9ee60ebd Feb 26 16:00:01 crc kubenswrapper[4907]: I0226 16:00:01.841639 4907 kubelet.go:2453] "SyncLoop (PLEG): 
event for pod" pod="openshift-infra/auto-csr-approver-29535360-drhn6" event={"ID":"e9bb11b5-c26b-4877-bb98-a7a5a22654d6","Type":"ContainerStarted","Data":"a8f7787277f6de5f6a7d2489754a7650e50691df740d11cb6728c94e9ee60ebd"} Feb 26 16:00:01 crc kubenswrapper[4907]: I0226 16:00:01.843624 4907 generic.go:334] "Generic (PLEG): container finished" podID="cfe0ded7-c52b-497d-8b97-d396cee606cf" containerID="ba102bcbd95042121729c4d4231b031bff4fbcbaf014d4a2ff4b599115138431" exitCode=0 Feb 26 16:00:01 crc kubenswrapper[4907]: I0226 16:00:01.843666 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29535360-wvhft" event={"ID":"cfe0ded7-c52b-497d-8b97-d396cee606cf","Type":"ContainerDied","Data":"ba102bcbd95042121729c4d4231b031bff4fbcbaf014d4a2ff4b599115138431"} Feb 26 16:00:03 crc kubenswrapper[4907]: I0226 16:00:03.145760 4907 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29535360-wvhft" Feb 26 16:00:03 crc kubenswrapper[4907]: I0226 16:00:03.233015 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/cfe0ded7-c52b-497d-8b97-d396cee606cf-config-volume\") pod \"cfe0ded7-c52b-497d-8b97-d396cee606cf\" (UID: \"cfe0ded7-c52b-497d-8b97-d396cee606cf\") " Feb 26 16:00:03 crc kubenswrapper[4907]: I0226 16:00:03.233111 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/cfe0ded7-c52b-497d-8b97-d396cee606cf-secret-volume\") pod \"cfe0ded7-c52b-497d-8b97-d396cee606cf\" (UID: \"cfe0ded7-c52b-497d-8b97-d396cee606cf\") " Feb 26 16:00:03 crc kubenswrapper[4907]: I0226 16:00:03.233288 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4lr6k\" (UniqueName: 
\"kubernetes.io/projected/cfe0ded7-c52b-497d-8b97-d396cee606cf-kube-api-access-4lr6k\") pod \"cfe0ded7-c52b-497d-8b97-d396cee606cf\" (UID: \"cfe0ded7-c52b-497d-8b97-d396cee606cf\") " Feb 26 16:00:03 crc kubenswrapper[4907]: I0226 16:00:03.233479 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/cfe0ded7-c52b-497d-8b97-d396cee606cf-config-volume" (OuterVolumeSpecName: "config-volume") pod "cfe0ded7-c52b-497d-8b97-d396cee606cf" (UID: "cfe0ded7-c52b-497d-8b97-d396cee606cf"). InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 26 16:00:03 crc kubenswrapper[4907]: I0226 16:00:03.233755 4907 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/cfe0ded7-c52b-497d-8b97-d396cee606cf-config-volume\") on node \"crc\" DevicePath \"\"" Feb 26 16:00:03 crc kubenswrapper[4907]: I0226 16:00:03.270633 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/cfe0ded7-c52b-497d-8b97-d396cee606cf-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "cfe0ded7-c52b-497d-8b97-d396cee606cf" (UID: "cfe0ded7-c52b-497d-8b97-d396cee606cf"). InnerVolumeSpecName "secret-volume". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 26 16:00:03 crc kubenswrapper[4907]: I0226 16:00:03.270813 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/cfe0ded7-c52b-497d-8b97-d396cee606cf-kube-api-access-4lr6k" (OuterVolumeSpecName: "kube-api-access-4lr6k") pod "cfe0ded7-c52b-497d-8b97-d396cee606cf" (UID: "cfe0ded7-c52b-497d-8b97-d396cee606cf"). InnerVolumeSpecName "kube-api-access-4lr6k". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 26 16:00:03 crc kubenswrapper[4907]: I0226 16:00:03.334948 4907 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/cfe0ded7-c52b-497d-8b97-d396cee606cf-secret-volume\") on node \"crc\" DevicePath \"\"" Feb 26 16:00:03 crc kubenswrapper[4907]: I0226 16:00:03.334986 4907 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4lr6k\" (UniqueName: \"kubernetes.io/projected/cfe0ded7-c52b-497d-8b97-d396cee606cf-kube-api-access-4lr6k\") on node \"crc\" DevicePath \"\"" Feb 26 16:00:03 crc kubenswrapper[4907]: I0226 16:00:03.459399 4907 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-8xhlb" Feb 26 16:00:03 crc kubenswrapper[4907]: I0226 16:00:03.459442 4907 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-8xhlb" Feb 26 16:00:03 crc kubenswrapper[4907]: I0226 16:00:03.568111 4907 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-8xhlb" Feb 26 16:00:03 crc kubenswrapper[4907]: I0226 16:00:03.857887 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29535360-drhn6" event={"ID":"e9bb11b5-c26b-4877-bb98-a7a5a22654d6","Type":"ContainerStarted","Data":"22bd2d9f71b46a4b332ba0298c4dd9f15469626f8dd9e9a0f20e7e2952e083f9"} Feb 26 16:00:03 crc kubenswrapper[4907]: I0226 16:00:03.859543 4907 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29535360-wvhft" Feb 26 16:00:03 crc kubenswrapper[4907]: I0226 16:00:03.859543 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29535360-wvhft" event={"ID":"cfe0ded7-c52b-497d-8b97-d396cee606cf","Type":"ContainerDied","Data":"5314e6e1e7fdf59a4e5a5841169e2ed45a81d1f117e1b8962128fc9aca0addd3"} Feb 26 16:00:03 crc kubenswrapper[4907]: I0226 16:00:03.859620 4907 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="5314e6e1e7fdf59a4e5a5841169e2ed45a81d1f117e1b8962128fc9aca0addd3" Feb 26 16:00:03 crc kubenswrapper[4907]: I0226 16:00:03.876513 4907 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-infra/auto-csr-approver-29535360-drhn6" podStartSLOduration=1.5764189530000001 podStartE2EDuration="3.87649479s" podCreationTimestamp="2026-02-26 16:00:00 +0000 UTC" firstStartedPulling="2026-02-26 16:00:00.870775992 +0000 UTC m=+1063.389337831" lastFinishedPulling="2026-02-26 16:00:03.170851819 +0000 UTC m=+1065.689413668" observedRunningTime="2026-02-26 16:00:03.875743092 +0000 UTC m=+1066.394304941" watchObservedRunningTime="2026-02-26 16:00:03.87649479 +0000 UTC m=+1066.395056639" Feb 26 16:00:03 crc kubenswrapper[4907]: I0226 16:00:03.927214 4907 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-8xhlb" Feb 26 16:00:04 crc kubenswrapper[4907]: I0226 16:00:04.024069 4907 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-8xhlb"] Feb 26 16:00:04 crc kubenswrapper[4907]: I0226 16:00:04.867383 4907 generic.go:334] "Generic (PLEG): container finished" podID="e9bb11b5-c26b-4877-bb98-a7a5a22654d6" containerID="22bd2d9f71b46a4b332ba0298c4dd9f15469626f8dd9e9a0f20e7e2952e083f9" exitCode=0 Feb 26 16:00:04 crc kubenswrapper[4907]: I0226 16:00:04.867469 4907 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29535360-drhn6" event={"ID":"e9bb11b5-c26b-4877-bb98-a7a5a22654d6","Type":"ContainerDied","Data":"22bd2d9f71b46a4b332ba0298c4dd9f15469626f8dd9e9a0f20e7e2952e083f9"} Feb 26 16:00:05 crc kubenswrapper[4907]: I0226 16:00:05.872623 4907 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-8xhlb" podUID="91cbebe7-f2b7-4aae-8164-1df1a4e56e0d" containerName="registry-server" containerID="cri-o://d7b91ba37b0c7e61217ccf5f609176bb09241e195925f00a51fe18fe99fe4963" gracePeriod=2 Feb 26 16:00:06 crc kubenswrapper[4907]: I0226 16:00:06.328072 4907 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29535360-drhn6" Feb 26 16:00:06 crc kubenswrapper[4907]: I0226 16:00:06.372908 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-twhc4\" (UniqueName: \"kubernetes.io/projected/e9bb11b5-c26b-4877-bb98-a7a5a22654d6-kube-api-access-twhc4\") pod \"e9bb11b5-c26b-4877-bb98-a7a5a22654d6\" (UID: \"e9bb11b5-c26b-4877-bb98-a7a5a22654d6\") " Feb 26 16:00:06 crc kubenswrapper[4907]: I0226 16:00:06.395122 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e9bb11b5-c26b-4877-bb98-a7a5a22654d6-kube-api-access-twhc4" (OuterVolumeSpecName: "kube-api-access-twhc4") pod "e9bb11b5-c26b-4877-bb98-a7a5a22654d6" (UID: "e9bb11b5-c26b-4877-bb98-a7a5a22654d6"). InnerVolumeSpecName "kube-api-access-twhc4". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 26 16:00:06 crc kubenswrapper[4907]: I0226 16:00:06.474177 4907 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-twhc4\" (UniqueName: \"kubernetes.io/projected/e9bb11b5-c26b-4877-bb98-a7a5a22654d6-kube-api-access-twhc4\") on node \"crc\" DevicePath \"\"" Feb 26 16:00:06 crc kubenswrapper[4907]: I0226 16:00:06.691400 4907 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-8xhlb" Feb 26 16:00:06 crc kubenswrapper[4907]: I0226 16:00:06.776871 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/91cbebe7-f2b7-4aae-8164-1df1a4e56e0d-catalog-content\") pod \"91cbebe7-f2b7-4aae-8164-1df1a4e56e0d\" (UID: \"91cbebe7-f2b7-4aae-8164-1df1a4e56e0d\") " Feb 26 16:00:06 crc kubenswrapper[4907]: I0226 16:00:06.776971 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-cbdft\" (UniqueName: \"kubernetes.io/projected/91cbebe7-f2b7-4aae-8164-1df1a4e56e0d-kube-api-access-cbdft\") pod \"91cbebe7-f2b7-4aae-8164-1df1a4e56e0d\" (UID: \"91cbebe7-f2b7-4aae-8164-1df1a4e56e0d\") " Feb 26 16:00:06 crc kubenswrapper[4907]: I0226 16:00:06.776996 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/91cbebe7-f2b7-4aae-8164-1df1a4e56e0d-utilities\") pod \"91cbebe7-f2b7-4aae-8164-1df1a4e56e0d\" (UID: \"91cbebe7-f2b7-4aae-8164-1df1a4e56e0d\") " Feb 26 16:00:06 crc kubenswrapper[4907]: I0226 16:00:06.777984 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/91cbebe7-f2b7-4aae-8164-1df1a4e56e0d-utilities" (OuterVolumeSpecName: "utilities") pod "91cbebe7-f2b7-4aae-8164-1df1a4e56e0d" (UID: "91cbebe7-f2b7-4aae-8164-1df1a4e56e0d"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 26 16:00:06 crc kubenswrapper[4907]: I0226 16:00:06.780914 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/91cbebe7-f2b7-4aae-8164-1df1a4e56e0d-kube-api-access-cbdft" (OuterVolumeSpecName: "kube-api-access-cbdft") pod "91cbebe7-f2b7-4aae-8164-1df1a4e56e0d" (UID: "91cbebe7-f2b7-4aae-8164-1df1a4e56e0d"). InnerVolumeSpecName "kube-api-access-cbdft". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 26 16:00:06 crc kubenswrapper[4907]: I0226 16:00:06.841181 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/91cbebe7-f2b7-4aae-8164-1df1a4e56e0d-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "91cbebe7-f2b7-4aae-8164-1df1a4e56e0d" (UID: "91cbebe7-f2b7-4aae-8164-1df1a4e56e0d"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 26 16:00:06 crc kubenswrapper[4907]: I0226 16:00:06.878496 4907 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/91cbebe7-f2b7-4aae-8164-1df1a4e56e0d-catalog-content\") on node \"crc\" DevicePath \"\"" Feb 26 16:00:06 crc kubenswrapper[4907]: I0226 16:00:06.878835 4907 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-cbdft\" (UniqueName: \"kubernetes.io/projected/91cbebe7-f2b7-4aae-8164-1df1a4e56e0d-kube-api-access-cbdft\") on node \"crc\" DevicePath \"\"" Feb 26 16:00:06 crc kubenswrapper[4907]: I0226 16:00:06.878853 4907 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/91cbebe7-f2b7-4aae-8164-1df1a4e56e0d-utilities\") on node \"crc\" DevicePath \"\"" Feb 26 16:00:06 crc kubenswrapper[4907]: I0226 16:00:06.879740 4907 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29535360-drhn6" Feb 26 16:00:06 crc kubenswrapper[4907]: I0226 16:00:06.879759 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29535360-drhn6" event={"ID":"e9bb11b5-c26b-4877-bb98-a7a5a22654d6","Type":"ContainerDied","Data":"a8f7787277f6de5f6a7d2489754a7650e50691df740d11cb6728c94e9ee60ebd"} Feb 26 16:00:06 crc kubenswrapper[4907]: I0226 16:00:06.879795 4907 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="a8f7787277f6de5f6a7d2489754a7650e50691df740d11cb6728c94e9ee60ebd" Feb 26 16:00:06 crc kubenswrapper[4907]: I0226 16:00:06.884343 4907 generic.go:334] "Generic (PLEG): container finished" podID="91cbebe7-f2b7-4aae-8164-1df1a4e56e0d" containerID="d7b91ba37b0c7e61217ccf5f609176bb09241e195925f00a51fe18fe99fe4963" exitCode=0 Feb 26 16:00:06 crc kubenswrapper[4907]: I0226 16:00:06.884383 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-8xhlb" event={"ID":"91cbebe7-f2b7-4aae-8164-1df1a4e56e0d","Type":"ContainerDied","Data":"d7b91ba37b0c7e61217ccf5f609176bb09241e195925f00a51fe18fe99fe4963"} Feb 26 16:00:06 crc kubenswrapper[4907]: I0226 16:00:06.884408 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-8xhlb" event={"ID":"91cbebe7-f2b7-4aae-8164-1df1a4e56e0d","Type":"ContainerDied","Data":"d19483c2b7b77c03252117dd8b9a04085a84e14536783f329795db0cc46defc9"} Feb 26 16:00:06 crc kubenswrapper[4907]: I0226 16:00:06.884421 4907 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-8xhlb" Feb 26 16:00:06 crc kubenswrapper[4907]: I0226 16:00:06.884425 4907 scope.go:117] "RemoveContainer" containerID="d7b91ba37b0c7e61217ccf5f609176bb09241e195925f00a51fe18fe99fe4963" Feb 26 16:00:06 crc kubenswrapper[4907]: I0226 16:00:06.913307 4907 scope.go:117] "RemoveContainer" containerID="04c45b86d4a27d1ccadb4a271a503c6889f1a340d597742ca0cb6d19689fc5b3" Feb 26 16:00:06 crc kubenswrapper[4907]: I0226 16:00:06.929563 4907 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-8xhlb"] Feb 26 16:00:06 crc kubenswrapper[4907]: I0226 16:00:06.938361 4907 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-8xhlb"] Feb 26 16:00:06 crc kubenswrapper[4907]: I0226 16:00:06.949402 4907 scope.go:117] "RemoveContainer" containerID="d48d2e5e494d83c0e251463b374c7b84043c73b4684becb2d47ef217057e35e1" Feb 26 16:00:06 crc kubenswrapper[4907]: I0226 16:00:06.961444 4907 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29535354-x5ltf"] Feb 26 16:00:06 crc kubenswrapper[4907]: I0226 16:00:06.965652 4907 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29535354-x5ltf"] Feb 26 16:00:06 crc kubenswrapper[4907]: I0226 16:00:06.967657 4907 scope.go:117] "RemoveContainer" containerID="d7b91ba37b0c7e61217ccf5f609176bb09241e195925f00a51fe18fe99fe4963" Feb 26 16:00:06 crc kubenswrapper[4907]: E0226 16:00:06.967999 4907 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d7b91ba37b0c7e61217ccf5f609176bb09241e195925f00a51fe18fe99fe4963\": container with ID starting with d7b91ba37b0c7e61217ccf5f609176bb09241e195925f00a51fe18fe99fe4963 not found: ID does not exist" containerID="d7b91ba37b0c7e61217ccf5f609176bb09241e195925f00a51fe18fe99fe4963" Feb 26 16:00:06 crc kubenswrapper[4907]: 
I0226 16:00:06.968042 4907 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d7b91ba37b0c7e61217ccf5f609176bb09241e195925f00a51fe18fe99fe4963"} err="failed to get container status \"d7b91ba37b0c7e61217ccf5f609176bb09241e195925f00a51fe18fe99fe4963\": rpc error: code = NotFound desc = could not find container \"d7b91ba37b0c7e61217ccf5f609176bb09241e195925f00a51fe18fe99fe4963\": container with ID starting with d7b91ba37b0c7e61217ccf5f609176bb09241e195925f00a51fe18fe99fe4963 not found: ID does not exist" Feb 26 16:00:06 crc kubenswrapper[4907]: I0226 16:00:06.968069 4907 scope.go:117] "RemoveContainer" containerID="04c45b86d4a27d1ccadb4a271a503c6889f1a340d597742ca0cb6d19689fc5b3" Feb 26 16:00:06 crc kubenswrapper[4907]: E0226 16:00:06.968333 4907 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"04c45b86d4a27d1ccadb4a271a503c6889f1a340d597742ca0cb6d19689fc5b3\": container with ID starting with 04c45b86d4a27d1ccadb4a271a503c6889f1a340d597742ca0cb6d19689fc5b3 not found: ID does not exist" containerID="04c45b86d4a27d1ccadb4a271a503c6889f1a340d597742ca0cb6d19689fc5b3" Feb 26 16:00:06 crc kubenswrapper[4907]: I0226 16:00:06.968366 4907 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"04c45b86d4a27d1ccadb4a271a503c6889f1a340d597742ca0cb6d19689fc5b3"} err="failed to get container status \"04c45b86d4a27d1ccadb4a271a503c6889f1a340d597742ca0cb6d19689fc5b3\": rpc error: code = NotFound desc = could not find container \"04c45b86d4a27d1ccadb4a271a503c6889f1a340d597742ca0cb6d19689fc5b3\": container with ID starting with 04c45b86d4a27d1ccadb4a271a503c6889f1a340d597742ca0cb6d19689fc5b3 not found: ID does not exist" Feb 26 16:00:06 crc kubenswrapper[4907]: I0226 16:00:06.968384 4907 scope.go:117] "RemoveContainer" containerID="d48d2e5e494d83c0e251463b374c7b84043c73b4684becb2d47ef217057e35e1" Feb 26 16:00:06 crc 
kubenswrapper[4907]: E0226 16:00:06.968635 4907 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d48d2e5e494d83c0e251463b374c7b84043c73b4684becb2d47ef217057e35e1\": container with ID starting with d48d2e5e494d83c0e251463b374c7b84043c73b4684becb2d47ef217057e35e1 not found: ID does not exist" containerID="d48d2e5e494d83c0e251463b374c7b84043c73b4684becb2d47ef217057e35e1" Feb 26 16:00:06 crc kubenswrapper[4907]: I0226 16:00:06.968656 4907 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d48d2e5e494d83c0e251463b374c7b84043c73b4684becb2d47ef217057e35e1"} err="failed to get container status \"d48d2e5e494d83c0e251463b374c7b84043c73b4684becb2d47ef217057e35e1\": rpc error: code = NotFound desc = could not find container \"d48d2e5e494d83c0e251463b374c7b84043c73b4684becb2d47ef217057e35e1\": container with ID starting with d48d2e5e494d83c0e251463b374c7b84043c73b4684becb2d47ef217057e35e1 not found: ID does not exist" Feb 26 16:00:08 crc kubenswrapper[4907]: I0226 16:00:08.133388 4907 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="91cbebe7-f2b7-4aae-8164-1df1a4e56e0d" path="/var/lib/kubelet/pods/91cbebe7-f2b7-4aae-8164-1df1a4e56e0d/volumes" Feb 26 16:00:08 crc kubenswrapper[4907]: I0226 16:00:08.133970 4907 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f905c87c-9059-47e4-918a-b54f36ec1195" path="/var/lib/kubelet/pods/f905c87c-9059-47e4-918a-b54f36ec1195/volumes" Feb 26 16:00:17 crc kubenswrapper[4907]: I0226 16:00:17.252839 4907 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/cinder-operator-controller-manager-768c8b45bb-k4hzr"] Feb 26 16:00:17 crc kubenswrapper[4907]: E0226 16:00:17.253765 4907 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="91cbebe7-f2b7-4aae-8164-1df1a4e56e0d" containerName="extract-content" Feb 26 16:00:17 crc kubenswrapper[4907]: I0226 16:00:17.253782 
4907 state_mem.go:107] "Deleted CPUSet assignment" podUID="91cbebe7-f2b7-4aae-8164-1df1a4e56e0d" containerName="extract-content" Feb 26 16:00:17 crc kubenswrapper[4907]: E0226 16:00:17.253801 4907 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="cfe0ded7-c52b-497d-8b97-d396cee606cf" containerName="collect-profiles" Feb 26 16:00:17 crc kubenswrapper[4907]: I0226 16:00:17.253809 4907 state_mem.go:107] "Deleted CPUSet assignment" podUID="cfe0ded7-c52b-497d-8b97-d396cee606cf" containerName="collect-profiles" Feb 26 16:00:17 crc kubenswrapper[4907]: E0226 16:00:17.253819 4907 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="91cbebe7-f2b7-4aae-8164-1df1a4e56e0d" containerName="registry-server" Feb 26 16:00:17 crc kubenswrapper[4907]: I0226 16:00:17.253827 4907 state_mem.go:107] "Deleted CPUSet assignment" podUID="91cbebe7-f2b7-4aae-8164-1df1a4e56e0d" containerName="registry-server" Feb 26 16:00:17 crc kubenswrapper[4907]: E0226 16:00:17.253836 4907 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="91cbebe7-f2b7-4aae-8164-1df1a4e56e0d" containerName="extract-utilities" Feb 26 16:00:17 crc kubenswrapper[4907]: I0226 16:00:17.253844 4907 state_mem.go:107] "Deleted CPUSet assignment" podUID="91cbebe7-f2b7-4aae-8164-1df1a4e56e0d" containerName="extract-utilities" Feb 26 16:00:17 crc kubenswrapper[4907]: E0226 16:00:17.253853 4907 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e9bb11b5-c26b-4877-bb98-a7a5a22654d6" containerName="oc" Feb 26 16:00:17 crc kubenswrapper[4907]: I0226 16:00:17.253860 4907 state_mem.go:107] "Deleted CPUSet assignment" podUID="e9bb11b5-c26b-4877-bb98-a7a5a22654d6" containerName="oc" Feb 26 16:00:17 crc kubenswrapper[4907]: I0226 16:00:17.254000 4907 memory_manager.go:354] "RemoveStaleState removing state" podUID="e9bb11b5-c26b-4877-bb98-a7a5a22654d6" containerName="oc" Feb 26 16:00:17 crc kubenswrapper[4907]: I0226 16:00:17.254011 4907 memory_manager.go:354] "RemoveStaleState 
removing state" podUID="cfe0ded7-c52b-497d-8b97-d396cee606cf" containerName="collect-profiles" Feb 26 16:00:17 crc kubenswrapper[4907]: I0226 16:00:17.254025 4907 memory_manager.go:354] "RemoveStaleState removing state" podUID="91cbebe7-f2b7-4aae-8164-1df1a4e56e0d" containerName="registry-server" Feb 26 16:00:17 crc kubenswrapper[4907]: I0226 16:00:17.254544 4907 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/cinder-operator-controller-manager-768c8b45bb-k4hzr" Feb 26 16:00:17 crc kubenswrapper[4907]: I0226 16:00:17.259840 4907 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/barbican-operator-controller-manager-c4b7d6946-58hjs"] Feb 26 16:00:17 crc kubenswrapper[4907]: I0226 16:00:17.260734 4907 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/barbican-operator-controller-manager-c4b7d6946-58hjs" Feb 26 16:00:17 crc kubenswrapper[4907]: I0226 16:00:17.261935 4907 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"cinder-operator-controller-manager-dockercfg-wttt5" Feb 26 16:00:17 crc kubenswrapper[4907]: I0226 16:00:17.270275 4907 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"barbican-operator-controller-manager-dockercfg-2lxrs" Feb 26 16:00:17 crc kubenswrapper[4907]: I0226 16:00:17.271685 4907 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/cinder-operator-controller-manager-768c8b45bb-k4hzr"] Feb 26 16:00:17 crc kubenswrapper[4907]: I0226 16:00:17.277820 4907 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/barbican-operator-controller-manager-c4b7d6946-58hjs"] Feb 26 16:00:17 crc kubenswrapper[4907]: I0226 16:00:17.282775 4907 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/designate-operator-controller-manager-55cc45767f-nxx6j"] Feb 26 16:00:17 crc kubenswrapper[4907]: I0226 16:00:17.284036 4907 
util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/designate-operator-controller-manager-55cc45767f-nxx6j" Feb 26 16:00:17 crc kubenswrapper[4907]: I0226 16:00:17.289985 4907 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"designate-operator-controller-manager-dockercfg-vpksl" Feb 26 16:00:17 crc kubenswrapper[4907]: I0226 16:00:17.321342 4907 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/glance-operator-controller-manager-7f748f8b74-q55xl"] Feb 26 16:00:17 crc kubenswrapper[4907]: I0226 16:00:17.322206 4907 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/glance-operator-controller-manager-7f748f8b74-q55xl" Feb 26 16:00:17 crc kubenswrapper[4907]: I0226 16:00:17.333164 4907 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"glance-operator-controller-manager-dockercfg-56cdp" Feb 26 16:00:17 crc kubenswrapper[4907]: I0226 16:00:17.337657 4907 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/heat-operator-controller-manager-9595d6797-m4jb4"] Feb 26 16:00:17 crc kubenswrapper[4907]: I0226 16:00:17.338448 4907 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/heat-operator-controller-manager-9595d6797-m4jb4" Feb 26 16:00:17 crc kubenswrapper[4907]: I0226 16:00:17.340744 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-g6ztw\" (UniqueName: \"kubernetes.io/projected/41934925-b8e2-4927-a9a6-07defdda378c-kube-api-access-g6ztw\") pod \"designate-operator-controller-manager-55cc45767f-nxx6j\" (UID: \"41934925-b8e2-4927-a9a6-07defdda378c\") " pod="openstack-operators/designate-operator-controller-manager-55cc45767f-nxx6j" Feb 26 16:00:17 crc kubenswrapper[4907]: I0226 16:00:17.340796 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xzjdg\" (UniqueName: \"kubernetes.io/projected/a9988ddc-f970-4dac-bcd0-92266f0c7494-kube-api-access-xzjdg\") pod \"cinder-operator-controller-manager-768c8b45bb-k4hzr\" (UID: \"a9988ddc-f970-4dac-bcd0-92266f0c7494\") " pod="openstack-operators/cinder-operator-controller-manager-768c8b45bb-k4hzr" Feb 26 16:00:17 crc kubenswrapper[4907]: I0226 16:00:17.340818 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rmgpr\" (UniqueName: \"kubernetes.io/projected/7fc27253-f8a7-4b6c-b83a-d32cdadb162d-kube-api-access-rmgpr\") pod \"barbican-operator-controller-manager-c4b7d6946-58hjs\" (UID: \"7fc27253-f8a7-4b6c-b83a-d32cdadb162d\") " pod="openstack-operators/barbican-operator-controller-manager-c4b7d6946-58hjs" Feb 26 16:00:17 crc kubenswrapper[4907]: I0226 16:00:17.347615 4907 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"heat-operator-controller-manager-dockercfg-99267" Feb 26 16:00:17 crc kubenswrapper[4907]: I0226 16:00:17.351739 4907 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/horizon-operator-controller-manager-54fb488b88-6hchw"] Feb 26 16:00:17 crc kubenswrapper[4907]: I0226 
16:00:17.352402 4907 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/horizon-operator-controller-manager-54fb488b88-6hchw" Feb 26 16:00:17 crc kubenswrapper[4907]: I0226 16:00:17.361785 4907 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"horizon-operator-controller-manager-dockercfg-dprc8" Feb 26 16:00:17 crc kubenswrapper[4907]: I0226 16:00:17.367903 4907 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/horizon-operator-controller-manager-54fb488b88-6hchw"] Feb 26 16:00:17 crc kubenswrapper[4907]: I0226 16:00:17.372647 4907 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/glance-operator-controller-manager-7f748f8b74-q55xl"] Feb 26 16:00:17 crc kubenswrapper[4907]: I0226 16:00:17.378086 4907 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/infra-operator-controller-manager-66d6b5f488-g7cb4"] Feb 26 16:00:17 crc kubenswrapper[4907]: I0226 16:00:17.378811 4907 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/infra-operator-controller-manager-66d6b5f488-g7cb4" Feb 26 16:00:17 crc kubenswrapper[4907]: I0226 16:00:17.382315 4907 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"infra-operator-controller-manager-dockercfg-t9gtw" Feb 26 16:00:17 crc kubenswrapper[4907]: I0226 16:00:17.382474 4907 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"infra-operator-webhook-server-cert" Feb 26 16:00:17 crc kubenswrapper[4907]: I0226 16:00:17.403655 4907 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/designate-operator-controller-manager-55cc45767f-nxx6j"] Feb 26 16:00:17 crc kubenswrapper[4907]: I0226 16:00:17.414298 4907 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/ironic-operator-controller-manager-6494cdbf8f-2r2t2"] Feb 26 16:00:17 crc kubenswrapper[4907]: I0226 16:00:17.415037 4907 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/ironic-operator-controller-manager-6494cdbf8f-2r2t2" Feb 26 16:00:17 crc kubenswrapper[4907]: I0226 16:00:17.416167 4907 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/infra-operator-controller-manager-66d6b5f488-g7cb4"] Feb 26 16:00:17 crc kubenswrapper[4907]: I0226 16:00:17.419318 4907 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"ironic-operator-controller-manager-dockercfg-88cnp" Feb 26 16:00:17 crc kubenswrapper[4907]: I0226 16:00:17.422422 4907 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/keystone-operator-controller-manager-6c78d668d5-245bf"] Feb 26 16:00:17 crc kubenswrapper[4907]: I0226 16:00:17.429154 4907 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/keystone-operator-controller-manager-6c78d668d5-245bf" Feb 26 16:00:17 crc kubenswrapper[4907]: I0226 16:00:17.436907 4907 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"keystone-operator-controller-manager-dockercfg-bkvmg" Feb 26 16:00:17 crc kubenswrapper[4907]: I0226 16:00:17.441890 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mk225\" (UniqueName: \"kubernetes.io/projected/44c123c9-ac46-4afe-b6d8-773f70ecc033-kube-api-access-mk225\") pod \"heat-operator-controller-manager-9595d6797-m4jb4\" (UID: \"44c123c9-ac46-4afe-b6d8-773f70ecc033\") " pod="openstack-operators/heat-operator-controller-manager-9595d6797-m4jb4" Feb 26 16:00:17 crc kubenswrapper[4907]: I0226 16:00:17.441936 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6ljc9\" (UniqueName: \"kubernetes.io/projected/e57bde5d-eca0-458a-af67-2f45ce85c54f-kube-api-access-6ljc9\") pod \"glance-operator-controller-manager-7f748f8b74-q55xl\" (UID: \"e57bde5d-eca0-458a-af67-2f45ce85c54f\") " pod="openstack-operators/glance-operator-controller-manager-7f748f8b74-q55xl" Feb 26 16:00:17 crc kubenswrapper[4907]: I0226 16:00:17.441986 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-g6ztw\" (UniqueName: \"kubernetes.io/projected/41934925-b8e2-4927-a9a6-07defdda378c-kube-api-access-g6ztw\") pod \"designate-operator-controller-manager-55cc45767f-nxx6j\" (UID: \"41934925-b8e2-4927-a9a6-07defdda378c\") " pod="openstack-operators/designate-operator-controller-manager-55cc45767f-nxx6j" Feb 26 16:00:17 crc kubenswrapper[4907]: I0226 16:00:17.442030 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/13df9f9f-0740-41d3-b193-0517c76d2830-cert\") pod 
\"infra-operator-controller-manager-66d6b5f488-g7cb4\" (UID: \"13df9f9f-0740-41d3-b193-0517c76d2830\") " pod="openstack-operators/infra-operator-controller-manager-66d6b5f488-g7cb4" Feb 26 16:00:17 crc kubenswrapper[4907]: I0226 16:00:17.442057 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-76tf6\" (UniqueName: \"kubernetes.io/projected/13df9f9f-0740-41d3-b193-0517c76d2830-kube-api-access-76tf6\") pod \"infra-operator-controller-manager-66d6b5f488-g7cb4\" (UID: \"13df9f9f-0740-41d3-b193-0517c76d2830\") " pod="openstack-operators/infra-operator-controller-manager-66d6b5f488-g7cb4" Feb 26 16:00:17 crc kubenswrapper[4907]: I0226 16:00:17.442096 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xzjdg\" (UniqueName: \"kubernetes.io/projected/a9988ddc-f970-4dac-bcd0-92266f0c7494-kube-api-access-xzjdg\") pod \"cinder-operator-controller-manager-768c8b45bb-k4hzr\" (UID: \"a9988ddc-f970-4dac-bcd0-92266f0c7494\") " pod="openstack-operators/cinder-operator-controller-manager-768c8b45bb-k4hzr" Feb 26 16:00:17 crc kubenswrapper[4907]: I0226 16:00:17.442128 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rmgpr\" (UniqueName: \"kubernetes.io/projected/7fc27253-f8a7-4b6c-b83a-d32cdadb162d-kube-api-access-rmgpr\") pod \"barbican-operator-controller-manager-c4b7d6946-58hjs\" (UID: \"7fc27253-f8a7-4b6c-b83a-d32cdadb162d\") " pod="openstack-operators/barbican-operator-controller-manager-c4b7d6946-58hjs" Feb 26 16:00:17 crc kubenswrapper[4907]: I0226 16:00:17.442164 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5cldh\" (UniqueName: \"kubernetes.io/projected/ac6b0a27-6eaf-4d88-af65-94c64180c950-kube-api-access-5cldh\") pod \"horizon-operator-controller-manager-54fb488b88-6hchw\" (UID: \"ac6b0a27-6eaf-4d88-af65-94c64180c950\") " 
pod="openstack-operators/horizon-operator-controller-manager-54fb488b88-6hchw" Feb 26 16:00:17 crc kubenswrapper[4907]: I0226 16:00:17.455157 4907 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/manila-operator-controller-manager-76fd76856-pk8zs"] Feb 26 16:00:17 crc kubenswrapper[4907]: I0226 16:00:17.460298 4907 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/manila-operator-controller-manager-76fd76856-pk8zs" Feb 26 16:00:17 crc kubenswrapper[4907]: I0226 16:00:17.464204 4907 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"manila-operator-controller-manager-dockercfg-7km48" Feb 26 16:00:17 crc kubenswrapper[4907]: I0226 16:00:17.484451 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xzjdg\" (UniqueName: \"kubernetes.io/projected/a9988ddc-f970-4dac-bcd0-92266f0c7494-kube-api-access-xzjdg\") pod \"cinder-operator-controller-manager-768c8b45bb-k4hzr\" (UID: \"a9988ddc-f970-4dac-bcd0-92266f0c7494\") " pod="openstack-operators/cinder-operator-controller-manager-768c8b45bb-k4hzr" Feb 26 16:00:17 crc kubenswrapper[4907]: I0226 16:00:17.489862 4907 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/ironic-operator-controller-manager-6494cdbf8f-2r2t2"] Feb 26 16:00:17 crc kubenswrapper[4907]: I0226 16:00:17.494333 4907 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/heat-operator-controller-manager-9595d6797-m4jb4"] Feb 26 16:00:17 crc kubenswrapper[4907]: I0226 16:00:17.501032 4907 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/keystone-operator-controller-manager-6c78d668d5-245bf"] Feb 26 16:00:17 crc kubenswrapper[4907]: I0226 16:00:17.503499 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-g6ztw\" (UniqueName: \"kubernetes.io/projected/41934925-b8e2-4927-a9a6-07defdda378c-kube-api-access-g6ztw\") 
pod \"designate-operator-controller-manager-55cc45767f-nxx6j\" (UID: \"41934925-b8e2-4927-a9a6-07defdda378c\") " pod="openstack-operators/designate-operator-controller-manager-55cc45767f-nxx6j" Feb 26 16:00:17 crc kubenswrapper[4907]: I0226 16:00:17.523221 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rmgpr\" (UniqueName: \"kubernetes.io/projected/7fc27253-f8a7-4b6c-b83a-d32cdadb162d-kube-api-access-rmgpr\") pod \"barbican-operator-controller-manager-c4b7d6946-58hjs\" (UID: \"7fc27253-f8a7-4b6c-b83a-d32cdadb162d\") " pod="openstack-operators/barbican-operator-controller-manager-c4b7d6946-58hjs" Feb 26 16:00:17 crc kubenswrapper[4907]: I0226 16:00:17.539184 4907 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/mariadb-operator-controller-manager-6dc9b6ff89-vtc25"] Feb 26 16:00:17 crc kubenswrapper[4907]: I0226 16:00:17.540217 4907 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/mariadb-operator-controller-manager-6dc9b6ff89-vtc25" Feb 26 16:00:17 crc kubenswrapper[4907]: I0226 16:00:17.545451 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-z8gc5\" (UniqueName: \"kubernetes.io/projected/9fb09a9c-025a-4bc0-81a0-c127fee3f6f3-kube-api-access-z8gc5\") pod \"manila-operator-controller-manager-76fd76856-pk8zs\" (UID: \"9fb09a9c-025a-4bc0-81a0-c127fee3f6f3\") " pod="openstack-operators/manila-operator-controller-manager-76fd76856-pk8zs" Feb 26 16:00:17 crc kubenswrapper[4907]: I0226 16:00:17.545488 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/13df9f9f-0740-41d3-b193-0517c76d2830-cert\") pod \"infra-operator-controller-manager-66d6b5f488-g7cb4\" (UID: \"13df9f9f-0740-41d3-b193-0517c76d2830\") " pod="openstack-operators/infra-operator-controller-manager-66d6b5f488-g7cb4" Feb 26 16:00:17 crc kubenswrapper[4907]: 
I0226 16:00:17.545510 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-76tf6\" (UniqueName: \"kubernetes.io/projected/13df9f9f-0740-41d3-b193-0517c76d2830-kube-api-access-76tf6\") pod \"infra-operator-controller-manager-66d6b5f488-g7cb4\" (UID: \"13df9f9f-0740-41d3-b193-0517c76d2830\") " pod="openstack-operators/infra-operator-controller-manager-66d6b5f488-g7cb4" Feb 26 16:00:17 crc kubenswrapper[4907]: I0226 16:00:17.545552 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5cldh\" (UniqueName: \"kubernetes.io/projected/ac6b0a27-6eaf-4d88-af65-94c64180c950-kube-api-access-5cldh\") pod \"horizon-operator-controller-manager-54fb488b88-6hchw\" (UID: \"ac6b0a27-6eaf-4d88-af65-94c64180c950\") " pod="openstack-operators/horizon-operator-controller-manager-54fb488b88-6hchw" Feb 26 16:00:17 crc kubenswrapper[4907]: I0226 16:00:17.545599 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rq5ls\" (UniqueName: \"kubernetes.io/projected/142a17bc-42dd-41ab-a97c-21350948ca5d-kube-api-access-rq5ls\") pod \"ironic-operator-controller-manager-6494cdbf8f-2r2t2\" (UID: \"142a17bc-42dd-41ab-a97c-21350948ca5d\") " pod="openstack-operators/ironic-operator-controller-manager-6494cdbf8f-2r2t2" Feb 26 16:00:17 crc kubenswrapper[4907]: I0226 16:00:17.545618 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-w897r\" (UniqueName: \"kubernetes.io/projected/f7c1fe7a-3983-49ff-bcde-36338aadc657-kube-api-access-w897r\") pod \"keystone-operator-controller-manager-6c78d668d5-245bf\" (UID: \"f7c1fe7a-3983-49ff-bcde-36338aadc657\") " pod="openstack-operators/keystone-operator-controller-manager-6c78d668d5-245bf" Feb 26 16:00:17 crc kubenswrapper[4907]: I0226 16:00:17.545653 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"kube-api-access-mk225\" (UniqueName: \"kubernetes.io/projected/44c123c9-ac46-4afe-b6d8-773f70ecc033-kube-api-access-mk225\") pod \"heat-operator-controller-manager-9595d6797-m4jb4\" (UID: \"44c123c9-ac46-4afe-b6d8-773f70ecc033\") " pod="openstack-operators/heat-operator-controller-manager-9595d6797-m4jb4" Feb 26 16:00:17 crc kubenswrapper[4907]: I0226 16:00:17.545669 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6ljc9\" (UniqueName: \"kubernetes.io/projected/e57bde5d-eca0-458a-af67-2f45ce85c54f-kube-api-access-6ljc9\") pod \"glance-operator-controller-manager-7f748f8b74-q55xl\" (UID: \"e57bde5d-eca0-458a-af67-2f45ce85c54f\") " pod="openstack-operators/glance-operator-controller-manager-7f748f8b74-q55xl" Feb 26 16:00:17 crc kubenswrapper[4907]: E0226 16:00:17.545955 4907 secret.go:188] Couldn't get secret openstack-operators/infra-operator-webhook-server-cert: secret "infra-operator-webhook-server-cert" not found Feb 26 16:00:17 crc kubenswrapper[4907]: E0226 16:00:17.545992 4907 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/13df9f9f-0740-41d3-b193-0517c76d2830-cert podName:13df9f9f-0740-41d3-b193-0517c76d2830 nodeName:}" failed. No retries permitted until 2026-02-26 16:00:18.045978459 +0000 UTC m=+1080.564540298 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/13df9f9f-0740-41d3-b193-0517c76d2830-cert") pod "infra-operator-controller-manager-66d6b5f488-g7cb4" (UID: "13df9f9f-0740-41d3-b193-0517c76d2830") : secret "infra-operator-webhook-server-cert" not found Feb 26 16:00:17 crc kubenswrapper[4907]: I0226 16:00:17.554424 4907 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/nova-operator-controller-manager-5d56fd956f-6znnd"] Feb 26 16:00:17 crc kubenswrapper[4907]: I0226 16:00:17.555397 4907 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/nova-operator-controller-manager-5d56fd956f-6znnd" Feb 26 16:00:17 crc kubenswrapper[4907]: I0226 16:00:17.560862 4907 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"mariadb-operator-controller-manager-dockercfg-j7gn7" Feb 26 16:00:17 crc kubenswrapper[4907]: I0226 16:00:17.569396 4907 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"nova-operator-controller-manager-dockercfg-cdlnj" Feb 26 16:00:17 crc kubenswrapper[4907]: I0226 16:00:17.569562 4907 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/neutron-operator-controller-manager-54967dbbdf-24rjt"] Feb 26 16:00:17 crc kubenswrapper[4907]: I0226 16:00:17.570333 4907 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/neutron-operator-controller-manager-54967dbbdf-24rjt" Feb 26 16:00:17 crc kubenswrapper[4907]: I0226 16:00:17.571950 4907 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"neutron-operator-controller-manager-dockercfg-9t2g4" Feb 26 16:00:17 crc kubenswrapper[4907]: I0226 16:00:17.580098 4907 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/mariadb-operator-controller-manager-6dc9b6ff89-vtc25"] Feb 26 16:00:17 crc kubenswrapper[4907]: I0226 16:00:17.583519 4907 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/cinder-operator-controller-manager-768c8b45bb-k4hzr" Feb 26 16:00:17 crc kubenswrapper[4907]: I0226 16:00:17.607274 4907 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/manila-operator-controller-manager-76fd76856-pk8zs"] Feb 26 16:00:17 crc kubenswrapper[4907]: I0226 16:00:17.608027 4907 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/barbican-operator-controller-manager-c4b7d6946-58hjs" Feb 26 16:00:17 crc kubenswrapper[4907]: I0226 16:00:17.620055 4907 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/designate-operator-controller-manager-55cc45767f-nxx6j" Feb 26 16:00:17 crc kubenswrapper[4907]: I0226 16:00:17.620259 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-76tf6\" (UniqueName: \"kubernetes.io/projected/13df9f9f-0740-41d3-b193-0517c76d2830-kube-api-access-76tf6\") pod \"infra-operator-controller-manager-66d6b5f488-g7cb4\" (UID: \"13df9f9f-0740-41d3-b193-0517c76d2830\") " pod="openstack-operators/infra-operator-controller-manager-66d6b5f488-g7cb4" Feb 26 16:00:17 crc kubenswrapper[4907]: I0226 16:00:17.634888 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6ljc9\" (UniqueName: \"kubernetes.io/projected/e57bde5d-eca0-458a-af67-2f45ce85c54f-kube-api-access-6ljc9\") pod \"glance-operator-controller-manager-7f748f8b74-q55xl\" (UID: \"e57bde5d-eca0-458a-af67-2f45ce85c54f\") " pod="openstack-operators/glance-operator-controller-manager-7f748f8b74-q55xl" Feb 26 16:00:17 crc kubenswrapper[4907]: I0226 16:00:17.635397 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5cldh\" (UniqueName: \"kubernetes.io/projected/ac6b0a27-6eaf-4d88-af65-94c64180c950-kube-api-access-5cldh\") pod \"horizon-operator-controller-manager-54fb488b88-6hchw\" (UID: \"ac6b0a27-6eaf-4d88-af65-94c64180c950\") " pod="openstack-operators/horizon-operator-controller-manager-54fb488b88-6hchw" Feb 26 16:00:17 crc kubenswrapper[4907]: I0226 16:00:17.643445 4907 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/glance-operator-controller-manager-7f748f8b74-q55xl" Feb 26 16:00:17 crc kubenswrapper[4907]: I0226 16:00:17.644703 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mk225\" (UniqueName: \"kubernetes.io/projected/44c123c9-ac46-4afe-b6d8-773f70ecc033-kube-api-access-mk225\") pod \"heat-operator-controller-manager-9595d6797-m4jb4\" (UID: \"44c123c9-ac46-4afe-b6d8-773f70ecc033\") " pod="openstack-operators/heat-operator-controller-manager-9595d6797-m4jb4" Feb 26 16:00:17 crc kubenswrapper[4907]: I0226 16:00:17.647631 4907 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/neutron-operator-controller-manager-54967dbbdf-24rjt"] Feb 26 16:00:17 crc kubenswrapper[4907]: I0226 16:00:17.648175 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nhg5t\" (UniqueName: \"kubernetes.io/projected/3c5efb12-7704-4d2a-9ea6-aa35436391ae-kube-api-access-nhg5t\") pod \"mariadb-operator-controller-manager-6dc9b6ff89-vtc25\" (UID: \"3c5efb12-7704-4d2a-9ea6-aa35436391ae\") " pod="openstack-operators/mariadb-operator-controller-manager-6dc9b6ff89-vtc25" Feb 26 16:00:17 crc kubenswrapper[4907]: I0226 16:00:17.648203 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-klnrd\" (UniqueName: \"kubernetes.io/projected/9a69dc6a-4034-4e7d-8b6f-576ccd828cf6-kube-api-access-klnrd\") pod \"nova-operator-controller-manager-5d56fd956f-6znnd\" (UID: \"9a69dc6a-4034-4e7d-8b6f-576ccd828cf6\") " pod="openstack-operators/nova-operator-controller-manager-5d56fd956f-6znnd" Feb 26 16:00:17 crc kubenswrapper[4907]: I0226 16:00:17.648236 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-z8gc5\" (UniqueName: \"kubernetes.io/projected/9fb09a9c-025a-4bc0-81a0-c127fee3f6f3-kube-api-access-z8gc5\") pod 
\"manila-operator-controller-manager-76fd76856-pk8zs\" (UID: \"9fb09a9c-025a-4bc0-81a0-c127fee3f6f3\") " pod="openstack-operators/manila-operator-controller-manager-76fd76856-pk8zs" Feb 26 16:00:17 crc kubenswrapper[4907]: I0226 16:00:17.648278 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fc2q8\" (UniqueName: \"kubernetes.io/projected/25c72e04-6714-4c5b-a273-a21a1415c4ac-kube-api-access-fc2q8\") pod \"neutron-operator-controller-manager-54967dbbdf-24rjt\" (UID: \"25c72e04-6714-4c5b-a273-a21a1415c4ac\") " pod="openstack-operators/neutron-operator-controller-manager-54967dbbdf-24rjt" Feb 26 16:00:17 crc kubenswrapper[4907]: I0226 16:00:17.648320 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rq5ls\" (UniqueName: \"kubernetes.io/projected/142a17bc-42dd-41ab-a97c-21350948ca5d-kube-api-access-rq5ls\") pod \"ironic-operator-controller-manager-6494cdbf8f-2r2t2\" (UID: \"142a17bc-42dd-41ab-a97c-21350948ca5d\") " pod="openstack-operators/ironic-operator-controller-manager-6494cdbf8f-2r2t2" Feb 26 16:00:17 crc kubenswrapper[4907]: I0226 16:00:17.648338 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-w897r\" (UniqueName: \"kubernetes.io/projected/f7c1fe7a-3983-49ff-bcde-36338aadc657-kube-api-access-w897r\") pod \"keystone-operator-controller-manager-6c78d668d5-245bf\" (UID: \"f7c1fe7a-3983-49ff-bcde-36338aadc657\") " pod="openstack-operators/keystone-operator-controller-manager-6c78d668d5-245bf" Feb 26 16:00:17 crc kubenswrapper[4907]: I0226 16:00:17.679137 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-z8gc5\" (UniqueName: \"kubernetes.io/projected/9fb09a9c-025a-4bc0-81a0-c127fee3f6f3-kube-api-access-z8gc5\") pod \"manila-operator-controller-manager-76fd76856-pk8zs\" (UID: \"9fb09a9c-025a-4bc0-81a0-c127fee3f6f3\") " 
pod="openstack-operators/manila-operator-controller-manager-76fd76856-pk8zs" Feb 26 16:00:17 crc kubenswrapper[4907]: I0226 16:00:17.681773 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-w897r\" (UniqueName: \"kubernetes.io/projected/f7c1fe7a-3983-49ff-bcde-36338aadc657-kube-api-access-w897r\") pod \"keystone-operator-controller-manager-6c78d668d5-245bf\" (UID: \"f7c1fe7a-3983-49ff-bcde-36338aadc657\") " pod="openstack-operators/keystone-operator-controller-manager-6c78d668d5-245bf" Feb 26 16:00:17 crc kubenswrapper[4907]: I0226 16:00:17.681822 4907 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/nova-operator-controller-manager-5d56fd956f-6znnd"] Feb 26 16:00:17 crc kubenswrapper[4907]: I0226 16:00:17.682226 4907 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/horizon-operator-controller-manager-54fb488b88-6hchw" Feb 26 16:00:17 crc kubenswrapper[4907]: I0226 16:00:17.682315 4907 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/heat-operator-controller-manager-9595d6797-m4jb4" Feb 26 16:00:17 crc kubenswrapper[4907]: I0226 16:00:17.700077 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rq5ls\" (UniqueName: \"kubernetes.io/projected/142a17bc-42dd-41ab-a97c-21350948ca5d-kube-api-access-rq5ls\") pod \"ironic-operator-controller-manager-6494cdbf8f-2r2t2\" (UID: \"142a17bc-42dd-41ab-a97c-21350948ca5d\") " pod="openstack-operators/ironic-operator-controller-manager-6494cdbf8f-2r2t2" Feb 26 16:00:17 crc kubenswrapper[4907]: I0226 16:00:17.711503 4907 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/octavia-operator-controller-manager-77b8b67585-x8222"] Feb 26 16:00:17 crc kubenswrapper[4907]: I0226 16:00:17.730037 4907 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/octavia-operator-controller-manager-77b8b67585-x8222" Feb 26 16:00:17 crc kubenswrapper[4907]: I0226 16:00:17.733362 4907 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"octavia-operator-controller-manager-dockercfg-9c4qs" Feb 26 16:00:17 crc kubenswrapper[4907]: I0226 16:00:17.736804 4907 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/ironic-operator-controller-manager-6494cdbf8f-2r2t2" Feb 26 16:00:17 crc kubenswrapper[4907]: I0226 16:00:17.750249 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fc2q8\" (UniqueName: \"kubernetes.io/projected/25c72e04-6714-4c5b-a273-a21a1415c4ac-kube-api-access-fc2q8\") pod \"neutron-operator-controller-manager-54967dbbdf-24rjt\" (UID: \"25c72e04-6714-4c5b-a273-a21a1415c4ac\") " pod="openstack-operators/neutron-operator-controller-manager-54967dbbdf-24rjt" Feb 26 16:00:17 crc kubenswrapper[4907]: I0226 16:00:17.750370 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nhg5t\" (UniqueName: \"kubernetes.io/projected/3c5efb12-7704-4d2a-9ea6-aa35436391ae-kube-api-access-nhg5t\") pod \"mariadb-operator-controller-manager-6dc9b6ff89-vtc25\" (UID: \"3c5efb12-7704-4d2a-9ea6-aa35436391ae\") " pod="openstack-operators/mariadb-operator-controller-manager-6dc9b6ff89-vtc25" Feb 26 16:00:17 crc kubenswrapper[4907]: I0226 16:00:17.750404 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-klnrd\" (UniqueName: \"kubernetes.io/projected/9a69dc6a-4034-4e7d-8b6f-576ccd828cf6-kube-api-access-klnrd\") pod \"nova-operator-controller-manager-5d56fd956f-6znnd\" (UID: \"9a69dc6a-4034-4e7d-8b6f-576ccd828cf6\") " pod="openstack-operators/nova-operator-controller-manager-5d56fd956f-6znnd" Feb 26 16:00:17 crc kubenswrapper[4907]: I0226 16:00:17.760747 4907 kubelet.go:2421] "SyncLoop 
ADD" source="api" pods=["openstack-operators/openstack-baremetal-operator-controller-manager-c5677dc5d-mmllt"] Feb 26 16:00:17 crc kubenswrapper[4907]: I0226 16:00:17.762317 4907 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/openstack-baremetal-operator-controller-manager-c5677dc5d-mmllt" Feb 26 16:00:17 crc kubenswrapper[4907]: I0226 16:00:17.771084 4907 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"openstack-baremetal-operator-controller-manager-dockercfg-bpkhm" Feb 26 16:00:17 crc kubenswrapper[4907]: I0226 16:00:17.771295 4907 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"openstack-baremetal-operator-webhook-server-cert" Feb 26 16:00:17 crc kubenswrapper[4907]: I0226 16:00:17.787216 4907 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/ovn-operator-controller-manager-85c99d655-t27pd"] Feb 26 16:00:17 crc kubenswrapper[4907]: I0226 16:00:17.788782 4907 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/ovn-operator-controller-manager-85c99d655-t27pd" Feb 26 16:00:17 crc kubenswrapper[4907]: I0226 16:00:17.792408 4907 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/keystone-operator-controller-manager-6c78d668d5-245bf" Feb 26 16:00:17 crc kubenswrapper[4907]: I0226 16:00:17.813441 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nhg5t\" (UniqueName: \"kubernetes.io/projected/3c5efb12-7704-4d2a-9ea6-aa35436391ae-kube-api-access-nhg5t\") pod \"mariadb-operator-controller-manager-6dc9b6ff89-vtc25\" (UID: \"3c5efb12-7704-4d2a-9ea6-aa35436391ae\") " pod="openstack-operators/mariadb-operator-controller-manager-6dc9b6ff89-vtc25" Feb 26 16:00:17 crc kubenswrapper[4907]: I0226 16:00:17.813999 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fc2q8\" (UniqueName: \"kubernetes.io/projected/25c72e04-6714-4c5b-a273-a21a1415c4ac-kube-api-access-fc2q8\") pod \"neutron-operator-controller-manager-54967dbbdf-24rjt\" (UID: \"25c72e04-6714-4c5b-a273-a21a1415c4ac\") " pod="openstack-operators/neutron-operator-controller-manager-54967dbbdf-24rjt" Feb 26 16:00:17 crc kubenswrapper[4907]: I0226 16:00:17.816145 4907 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"ovn-operator-controller-manager-dockercfg-ctfh5" Feb 26 16:00:17 crc kubenswrapper[4907]: I0226 16:00:17.816467 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-klnrd\" (UniqueName: \"kubernetes.io/projected/9a69dc6a-4034-4e7d-8b6f-576ccd828cf6-kube-api-access-klnrd\") pod \"nova-operator-controller-manager-5d56fd956f-6znnd\" (UID: \"9a69dc6a-4034-4e7d-8b6f-576ccd828cf6\") " pod="openstack-operators/nova-operator-controller-manager-5d56fd956f-6znnd" Feb 26 16:00:17 crc kubenswrapper[4907]: I0226 16:00:17.820671 4907 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/octavia-operator-controller-manager-77b8b67585-x8222"] Feb 26 16:00:17 crc kubenswrapper[4907]: I0226 16:00:17.831884 4907 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/manila-operator-controller-manager-76fd76856-pk8zs" Feb 26 16:00:17 crc kubenswrapper[4907]: I0226 16:00:17.843001 4907 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-baremetal-operator-controller-manager-c5677dc5d-mmllt"] Feb 26 16:00:17 crc kubenswrapper[4907]: I0226 16:00:17.852777 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/1bcfd62b-212e-4efc-b0be-f0542e186f07-cert\") pod \"openstack-baremetal-operator-controller-manager-c5677dc5d-mmllt\" (UID: \"1bcfd62b-212e-4efc-b0be-f0542e186f07\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-c5677dc5d-mmllt" Feb 26 16:00:17 crc kubenswrapper[4907]: I0226 16:00:17.852884 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dzh64\" (UniqueName: \"kubernetes.io/projected/1bcfd62b-212e-4efc-b0be-f0542e186f07-kube-api-access-dzh64\") pod \"openstack-baremetal-operator-controller-manager-c5677dc5d-mmllt\" (UID: \"1bcfd62b-212e-4efc-b0be-f0542e186f07\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-c5677dc5d-mmllt" Feb 26 16:00:17 crc kubenswrapper[4907]: I0226 16:00:17.852932 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-f8bjd\" (UniqueName: \"kubernetes.io/projected/51842918-6f0f-4599-b288-84c75e4390ad-kube-api-access-f8bjd\") pod \"ovn-operator-controller-manager-85c99d655-t27pd\" (UID: \"51842918-6f0f-4599-b288-84c75e4390ad\") " pod="openstack-operators/ovn-operator-controller-manager-85c99d655-t27pd" Feb 26 16:00:17 crc kubenswrapper[4907]: I0226 16:00:17.853015 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mjdqn\" (UniqueName: 
\"kubernetes.io/projected/5dac7dc1-cf0e-4962-956e-800b57e369e1-kube-api-access-mjdqn\") pod \"octavia-operator-controller-manager-77b8b67585-x8222\" (UID: \"5dac7dc1-cf0e-4962-956e-800b57e369e1\") " pod="openstack-operators/octavia-operator-controller-manager-77b8b67585-x8222" Feb 26 16:00:17 crc kubenswrapper[4907]: I0226 16:00:17.882908 4907 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/mariadb-operator-controller-manager-6dc9b6ff89-vtc25" Feb 26 16:00:17 crc kubenswrapper[4907]: I0226 16:00:17.914195 4907 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/nova-operator-controller-manager-5d56fd956f-6znnd" Feb 26 16:00:17 crc kubenswrapper[4907]: I0226 16:00:17.931764 4907 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/ovn-operator-controller-manager-85c99d655-t27pd"] Feb 26 16:00:17 crc kubenswrapper[4907]: I0226 16:00:17.960230 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dzh64\" (UniqueName: \"kubernetes.io/projected/1bcfd62b-212e-4efc-b0be-f0542e186f07-kube-api-access-dzh64\") pod \"openstack-baremetal-operator-controller-manager-c5677dc5d-mmllt\" (UID: \"1bcfd62b-212e-4efc-b0be-f0542e186f07\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-c5677dc5d-mmllt" Feb 26 16:00:17 crc kubenswrapper[4907]: I0226 16:00:17.960272 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-f8bjd\" (UniqueName: \"kubernetes.io/projected/51842918-6f0f-4599-b288-84c75e4390ad-kube-api-access-f8bjd\") pod \"ovn-operator-controller-manager-85c99d655-t27pd\" (UID: \"51842918-6f0f-4599-b288-84c75e4390ad\") " pod="openstack-operators/ovn-operator-controller-manager-85c99d655-t27pd" Feb 26 16:00:17 crc kubenswrapper[4907]: I0226 16:00:17.960335 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"kube-api-access-mjdqn\" (UniqueName: \"kubernetes.io/projected/5dac7dc1-cf0e-4962-956e-800b57e369e1-kube-api-access-mjdqn\") pod \"octavia-operator-controller-manager-77b8b67585-x8222\" (UID: \"5dac7dc1-cf0e-4962-956e-800b57e369e1\") " pod="openstack-operators/octavia-operator-controller-manager-77b8b67585-x8222" Feb 26 16:00:17 crc kubenswrapper[4907]: I0226 16:00:17.960363 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/1bcfd62b-212e-4efc-b0be-f0542e186f07-cert\") pod \"openstack-baremetal-operator-controller-manager-c5677dc5d-mmllt\" (UID: \"1bcfd62b-212e-4efc-b0be-f0542e186f07\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-c5677dc5d-mmllt" Feb 26 16:00:17 crc kubenswrapper[4907]: E0226 16:00:17.960477 4907 secret.go:188] Couldn't get secret openstack-operators/openstack-baremetal-operator-webhook-server-cert: secret "openstack-baremetal-operator-webhook-server-cert" not found Feb 26 16:00:17 crc kubenswrapper[4907]: E0226 16:00:17.960521 4907 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/1bcfd62b-212e-4efc-b0be-f0542e186f07-cert podName:1bcfd62b-212e-4efc-b0be-f0542e186f07 nodeName:}" failed. No retries permitted until 2026-02-26 16:00:18.460507745 +0000 UTC m=+1080.979069594 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/1bcfd62b-212e-4efc-b0be-f0542e186f07-cert") pod "openstack-baremetal-operator-controller-manager-c5677dc5d-mmllt" (UID: "1bcfd62b-212e-4efc-b0be-f0542e186f07") : secret "openstack-baremetal-operator-webhook-server-cert" not found Feb 26 16:00:17 crc kubenswrapper[4907]: I0226 16:00:17.997705 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-f8bjd\" (UniqueName: \"kubernetes.io/projected/51842918-6f0f-4599-b288-84c75e4390ad-kube-api-access-f8bjd\") pod \"ovn-operator-controller-manager-85c99d655-t27pd\" (UID: \"51842918-6f0f-4599-b288-84c75e4390ad\") " pod="openstack-operators/ovn-operator-controller-manager-85c99d655-t27pd" Feb 26 16:00:17 crc kubenswrapper[4907]: I0226 16:00:17.998129 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dzh64\" (UniqueName: \"kubernetes.io/projected/1bcfd62b-212e-4efc-b0be-f0542e186f07-kube-api-access-dzh64\") pod \"openstack-baremetal-operator-controller-manager-c5677dc5d-mmllt\" (UID: \"1bcfd62b-212e-4efc-b0be-f0542e186f07\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-c5677dc5d-mmllt" Feb 26 16:00:17 crc kubenswrapper[4907]: I0226 16:00:17.999964 4907 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/placement-operator-controller-manager-57bd55f9b7-mxbcg"] Feb 26 16:00:18 crc kubenswrapper[4907]: I0226 16:00:18.001642 4907 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/placement-operator-controller-manager-57bd55f9b7-mxbcg" Feb 26 16:00:18 crc kubenswrapper[4907]: I0226 16:00:18.007156 4907 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"placement-operator-controller-manager-dockercfg-n9jgl" Feb 26 16:00:18 crc kubenswrapper[4907]: I0226 16:00:18.011441 4907 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/placement-operator-controller-manager-57bd55f9b7-mxbcg"] Feb 26 16:00:18 crc kubenswrapper[4907]: I0226 16:00:18.011709 4907 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/neutron-operator-controller-manager-54967dbbdf-24rjt" Feb 26 16:00:18 crc kubenswrapper[4907]: I0226 16:00:18.055064 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mjdqn\" (UniqueName: \"kubernetes.io/projected/5dac7dc1-cf0e-4962-956e-800b57e369e1-kube-api-access-mjdqn\") pod \"octavia-operator-controller-manager-77b8b67585-x8222\" (UID: \"5dac7dc1-cf0e-4962-956e-800b57e369e1\") " pod="openstack-operators/octavia-operator-controller-manager-77b8b67585-x8222" Feb 26 16:00:18 crc kubenswrapper[4907]: I0226 16:00:18.060990 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5j28p\" (UniqueName: \"kubernetes.io/projected/1bba2156-1275-4aa3-8eba-3ce7c3c85d72-kube-api-access-5j28p\") pod \"placement-operator-controller-manager-57bd55f9b7-mxbcg\" (UID: \"1bba2156-1275-4aa3-8eba-3ce7c3c85d72\") " pod="openstack-operators/placement-operator-controller-manager-57bd55f9b7-mxbcg" Feb 26 16:00:18 crc kubenswrapper[4907]: I0226 16:00:18.061105 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/13df9f9f-0740-41d3-b193-0517c76d2830-cert\") pod \"infra-operator-controller-manager-66d6b5f488-g7cb4\" (UID: 
\"13df9f9f-0740-41d3-b193-0517c76d2830\") " pod="openstack-operators/infra-operator-controller-manager-66d6b5f488-g7cb4" Feb 26 16:00:18 crc kubenswrapper[4907]: E0226 16:00:18.061214 4907 secret.go:188] Couldn't get secret openstack-operators/infra-operator-webhook-server-cert: secret "infra-operator-webhook-server-cert" not found Feb 26 16:00:18 crc kubenswrapper[4907]: E0226 16:00:18.061250 4907 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/13df9f9f-0740-41d3-b193-0517c76d2830-cert podName:13df9f9f-0740-41d3-b193-0517c76d2830 nodeName:}" failed. No retries permitted until 2026-02-26 16:00:19.061238005 +0000 UTC m=+1081.579799854 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/13df9f9f-0740-41d3-b193-0517c76d2830-cert") pod "infra-operator-controller-manager-66d6b5f488-g7cb4" (UID: "13df9f9f-0740-41d3-b193-0517c76d2830") : secret "infra-operator-webhook-server-cert" not found Feb 26 16:00:18 crc kubenswrapper[4907]: I0226 16:00:18.066437 4907 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/swift-operator-controller-manager-79558bbfbf-g2mlp"] Feb 26 16:00:18 crc kubenswrapper[4907]: I0226 16:00:18.076949 4907 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/swift-operator-controller-manager-79558bbfbf-g2mlp" Feb 26 16:00:18 crc kubenswrapper[4907]: I0226 16:00:18.077626 4907 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/telemetry-operator-controller-manager-56dc67d744-w7qpb"] Feb 26 16:00:18 crc kubenswrapper[4907]: I0226 16:00:18.078640 4907 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/telemetry-operator-controller-manager-56dc67d744-w7qpb" Feb 26 16:00:18 crc kubenswrapper[4907]: I0226 16:00:18.083386 4907 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"swift-operator-controller-manager-dockercfg-c88x9" Feb 26 16:00:18 crc kubenswrapper[4907]: I0226 16:00:18.083724 4907 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"telemetry-operator-controller-manager-dockercfg-gr8l4" Feb 26 16:00:18 crc kubenswrapper[4907]: I0226 16:00:18.104246 4907 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"octavia-operator-controller-manager-dockercfg-9c4qs" Feb 26 16:00:18 crc kubenswrapper[4907]: I0226 16:00:18.113031 4907 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/octavia-operator-controller-manager-77b8b67585-x8222" Feb 26 16:00:18 crc kubenswrapper[4907]: I0226 16:00:18.139363 4907 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/swift-operator-controller-manager-79558bbfbf-g2mlp"] Feb 26 16:00:18 crc kubenswrapper[4907]: I0226 16:00:18.170764 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bms45\" (UniqueName: \"kubernetes.io/projected/2c9290e8-c587-48aa-8ea2-66b772c9341c-kube-api-access-bms45\") pod \"telemetry-operator-controller-manager-56dc67d744-w7qpb\" (UID: \"2c9290e8-c587-48aa-8ea2-66b772c9341c\") " pod="openstack-operators/telemetry-operator-controller-manager-56dc67d744-w7qpb" Feb 26 16:00:18 crc kubenswrapper[4907]: I0226 16:00:18.170853 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-h29md\" (UniqueName: \"kubernetes.io/projected/311f46b9-23bf-49b6-a2a5-919c8e42c62a-kube-api-access-h29md\") pod \"swift-operator-controller-manager-79558bbfbf-g2mlp\" (UID: 
\"311f46b9-23bf-49b6-a2a5-919c8e42c62a\") " pod="openstack-operators/swift-operator-controller-manager-79558bbfbf-g2mlp" Feb 26 16:00:18 crc kubenswrapper[4907]: I0226 16:00:18.170881 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5j28p\" (UniqueName: \"kubernetes.io/projected/1bba2156-1275-4aa3-8eba-3ce7c3c85d72-kube-api-access-5j28p\") pod \"placement-operator-controller-manager-57bd55f9b7-mxbcg\" (UID: \"1bba2156-1275-4aa3-8eba-3ce7c3c85d72\") " pod="openstack-operators/placement-operator-controller-manager-57bd55f9b7-mxbcg" Feb 26 16:00:18 crc kubenswrapper[4907]: I0226 16:00:18.217173 4907 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/test-operator-controller-manager-8467ccb4c8-r6ndg"] Feb 26 16:00:18 crc kubenswrapper[4907]: I0226 16:00:18.217800 4907 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/telemetry-operator-controller-manager-56dc67d744-w7qpb"] Feb 26 16:00:18 crc kubenswrapper[4907]: I0226 16:00:18.217818 4907 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/test-operator-controller-manager-8467ccb4c8-r6ndg"] Feb 26 16:00:18 crc kubenswrapper[4907]: I0226 16:00:18.217882 4907 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/test-operator-controller-manager-8467ccb4c8-r6ndg" Feb 26 16:00:18 crc kubenswrapper[4907]: I0226 16:00:18.218678 4907 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/watcher-operator-controller-manager-76bcb69745-v2z8v"] Feb 26 16:00:18 crc kubenswrapper[4907]: I0226 16:00:18.219116 4907 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"ovn-operator-controller-manager-dockercfg-ctfh5" Feb 26 16:00:18 crc kubenswrapper[4907]: I0226 16:00:18.219511 4907 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"test-operator-controller-manager-dockercfg-ctntp" Feb 26 16:00:18 crc kubenswrapper[4907]: I0226 16:00:18.219701 4907 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/watcher-operator-controller-manager-76bcb69745-v2z8v" Feb 26 16:00:18 crc kubenswrapper[4907]: I0226 16:00:18.226085 4907 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"watcher-operator-controller-manager-dockercfg-gcw4x" Feb 26 16:00:18 crc kubenswrapper[4907]: I0226 16:00:18.226734 4907 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/ovn-operator-controller-manager-85c99d655-t27pd" Feb 26 16:00:18 crc kubenswrapper[4907]: I0226 16:00:18.236891 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5j28p\" (UniqueName: \"kubernetes.io/projected/1bba2156-1275-4aa3-8eba-3ce7c3c85d72-kube-api-access-5j28p\") pod \"placement-operator-controller-manager-57bd55f9b7-mxbcg\" (UID: \"1bba2156-1275-4aa3-8eba-3ce7c3c85d72\") " pod="openstack-operators/placement-operator-controller-manager-57bd55f9b7-mxbcg" Feb 26 16:00:18 crc kubenswrapper[4907]: I0226 16:00:18.271714 4907 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/watcher-operator-controller-manager-76bcb69745-v2z8v"] Feb 26 16:00:18 crc kubenswrapper[4907]: I0226 16:00:18.272578 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bms45\" (UniqueName: \"kubernetes.io/projected/2c9290e8-c587-48aa-8ea2-66b772c9341c-kube-api-access-bms45\") pod \"telemetry-operator-controller-manager-56dc67d744-w7qpb\" (UID: \"2c9290e8-c587-48aa-8ea2-66b772c9341c\") " pod="openstack-operators/telemetry-operator-controller-manager-56dc67d744-w7qpb" Feb 26 16:00:18 crc kubenswrapper[4907]: I0226 16:00:18.272688 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6s2s2\" (UniqueName: \"kubernetes.io/projected/872f261b-cbf5-47b6-99ce-ee5c0d9794a3-kube-api-access-6s2s2\") pod \"test-operator-controller-manager-8467ccb4c8-r6ndg\" (UID: \"872f261b-cbf5-47b6-99ce-ee5c0d9794a3\") " pod="openstack-operators/test-operator-controller-manager-8467ccb4c8-r6ndg" Feb 26 16:00:18 crc kubenswrapper[4907]: I0226 16:00:18.272733 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hg8qx\" (UniqueName: \"kubernetes.io/projected/edeb6783-da9a-4f17-8ebe-e234aeeb35fd-kube-api-access-hg8qx\") pod 
\"watcher-operator-controller-manager-76bcb69745-v2z8v\" (UID: \"edeb6783-da9a-4f17-8ebe-e234aeeb35fd\") " pod="openstack-operators/watcher-operator-controller-manager-76bcb69745-v2z8v" Feb 26 16:00:18 crc kubenswrapper[4907]: I0226 16:00:18.272753 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-h29md\" (UniqueName: \"kubernetes.io/projected/311f46b9-23bf-49b6-a2a5-919c8e42c62a-kube-api-access-h29md\") pod \"swift-operator-controller-manager-79558bbfbf-g2mlp\" (UID: \"311f46b9-23bf-49b6-a2a5-919c8e42c62a\") " pod="openstack-operators/swift-operator-controller-manager-79558bbfbf-g2mlp" Feb 26 16:00:18 crc kubenswrapper[4907]: I0226 16:00:18.327171 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-h29md\" (UniqueName: \"kubernetes.io/projected/311f46b9-23bf-49b6-a2a5-919c8e42c62a-kube-api-access-h29md\") pod \"swift-operator-controller-manager-79558bbfbf-g2mlp\" (UID: \"311f46b9-23bf-49b6-a2a5-919c8e42c62a\") " pod="openstack-operators/swift-operator-controller-manager-79558bbfbf-g2mlp" Feb 26 16:00:18 crc kubenswrapper[4907]: I0226 16:00:18.330107 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bms45\" (UniqueName: \"kubernetes.io/projected/2c9290e8-c587-48aa-8ea2-66b772c9341c-kube-api-access-bms45\") pod \"telemetry-operator-controller-manager-56dc67d744-w7qpb\" (UID: \"2c9290e8-c587-48aa-8ea2-66b772c9341c\") " pod="openstack-operators/telemetry-operator-controller-manager-56dc67d744-w7qpb" Feb 26 16:00:18 crc kubenswrapper[4907]: I0226 16:00:18.376505 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6s2s2\" (UniqueName: \"kubernetes.io/projected/872f261b-cbf5-47b6-99ce-ee5c0d9794a3-kube-api-access-6s2s2\") pod \"test-operator-controller-manager-8467ccb4c8-r6ndg\" (UID: \"872f261b-cbf5-47b6-99ce-ee5c0d9794a3\") " 
pod="openstack-operators/test-operator-controller-manager-8467ccb4c8-r6ndg" Feb 26 16:00:18 crc kubenswrapper[4907]: I0226 16:00:18.376623 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hg8qx\" (UniqueName: \"kubernetes.io/projected/edeb6783-da9a-4f17-8ebe-e234aeeb35fd-kube-api-access-hg8qx\") pod \"watcher-operator-controller-manager-76bcb69745-v2z8v\" (UID: \"edeb6783-da9a-4f17-8ebe-e234aeeb35fd\") " pod="openstack-operators/watcher-operator-controller-manager-76bcb69745-v2z8v" Feb 26 16:00:18 crc kubenswrapper[4907]: I0226 16:00:18.394378 4907 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/openstack-operator-controller-manager-5c89c59655-dbrxn"] Feb 26 16:00:18 crc kubenswrapper[4907]: I0226 16:00:18.396493 4907 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/openstack-operator-controller-manager-5c89c59655-dbrxn" Feb 26 16:00:18 crc kubenswrapper[4907]: I0226 16:00:18.397889 4907 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"placement-operator-controller-manager-dockercfg-n9jgl" Feb 26 16:00:18 crc kubenswrapper[4907]: I0226 16:00:18.399863 4907 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"webhook-server-cert" Feb 26 16:00:18 crc kubenswrapper[4907]: I0226 16:00:18.400266 4907 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"openstack-operator-controller-manager-dockercfg-knhz6" Feb 26 16:00:18 crc kubenswrapper[4907]: I0226 16:00:18.400447 4907 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"metrics-server-cert" Feb 26 16:00:18 crc kubenswrapper[4907]: I0226 16:00:18.401120 4907 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/placement-operator-controller-manager-57bd55f9b7-mxbcg" Feb 26 16:00:18 crc kubenswrapper[4907]: I0226 16:00:18.451291 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6s2s2\" (UniqueName: \"kubernetes.io/projected/872f261b-cbf5-47b6-99ce-ee5c0d9794a3-kube-api-access-6s2s2\") pod \"test-operator-controller-manager-8467ccb4c8-r6ndg\" (UID: \"872f261b-cbf5-47b6-99ce-ee5c0d9794a3\") " pod="openstack-operators/test-operator-controller-manager-8467ccb4c8-r6ndg" Feb 26 16:00:18 crc kubenswrapper[4907]: I0226 16:00:18.480108 4907 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/swift-operator-controller-manager-79558bbfbf-g2mlp" Feb 26 16:00:18 crc kubenswrapper[4907]: I0226 16:00:18.483797 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hg8qx\" (UniqueName: \"kubernetes.io/projected/edeb6783-da9a-4f17-8ebe-e234aeeb35fd-kube-api-access-hg8qx\") pod \"watcher-operator-controller-manager-76bcb69745-v2z8v\" (UID: \"edeb6783-da9a-4f17-8ebe-e234aeeb35fd\") " pod="openstack-operators/watcher-operator-controller-manager-76bcb69745-v2z8v" Feb 26 16:00:18 crc kubenswrapper[4907]: I0226 16:00:18.497497 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/1bcfd62b-212e-4efc-b0be-f0542e186f07-cert\") pod \"openstack-baremetal-operator-controller-manager-c5677dc5d-mmllt\" (UID: \"1bcfd62b-212e-4efc-b0be-f0542e186f07\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-c5677dc5d-mmllt" Feb 26 16:00:18 crc kubenswrapper[4907]: E0226 16:00:18.497766 4907 secret.go:188] Couldn't get secret openstack-operators/openstack-baremetal-operator-webhook-server-cert: secret "openstack-baremetal-operator-webhook-server-cert" not found Feb 26 16:00:18 crc kubenswrapper[4907]: E0226 16:00:18.497827 4907 
nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/1bcfd62b-212e-4efc-b0be-f0542e186f07-cert podName:1bcfd62b-212e-4efc-b0be-f0542e186f07 nodeName:}" failed. No retries permitted until 2026-02-26 16:00:19.497811029 +0000 UTC m=+1082.016372878 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/1bcfd62b-212e-4efc-b0be-f0542e186f07-cert") pod "openstack-baremetal-operator-controller-manager-c5677dc5d-mmllt" (UID: "1bcfd62b-212e-4efc-b0be-f0542e186f07") : secret "openstack-baremetal-operator-webhook-server-cert" not found Feb 26 16:00:18 crc kubenswrapper[4907]: I0226 16:00:18.515298 4907 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-controller-manager-5c89c59655-dbrxn"] Feb 26 16:00:18 crc kubenswrapper[4907]: I0226 16:00:18.603954 4907 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/telemetry-operator-controller-manager-56dc67d744-w7qpb" Feb 26 16:00:18 crc kubenswrapper[4907]: I0226 16:00:18.633273 4907 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/test-operator-controller-manager-8467ccb4c8-r6ndg" Feb 26 16:00:18 crc kubenswrapper[4907]: I0226 16:00:18.633899 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/e8f0195b-740f-4219-a422-9b99f2841ee5-webhook-certs\") pod \"openstack-operator-controller-manager-5c89c59655-dbrxn\" (UID: \"e8f0195b-740f-4219-a422-9b99f2841ee5\") " pod="openstack-operators/openstack-operator-controller-manager-5c89c59655-dbrxn" Feb 26 16:00:18 crc kubenswrapper[4907]: I0226 16:00:18.634048 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-t4627\" (UniqueName: \"kubernetes.io/projected/e8f0195b-740f-4219-a422-9b99f2841ee5-kube-api-access-t4627\") pod \"openstack-operator-controller-manager-5c89c59655-dbrxn\" (UID: \"e8f0195b-740f-4219-a422-9b99f2841ee5\") " pod="openstack-operators/openstack-operator-controller-manager-5c89c59655-dbrxn" Feb 26 16:00:18 crc kubenswrapper[4907]: I0226 16:00:18.634084 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/e8f0195b-740f-4219-a422-9b99f2841ee5-metrics-certs\") pod \"openstack-operator-controller-manager-5c89c59655-dbrxn\" (UID: \"e8f0195b-740f-4219-a422-9b99f2841ee5\") " pod="openstack-operators/openstack-operator-controller-manager-5c89c59655-dbrxn" Feb 26 16:00:18 crc kubenswrapper[4907]: I0226 16:00:18.637369 4907 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-psxq9"] Feb 26 16:00:18 crc kubenswrapper[4907]: I0226 16:00:18.638248 4907 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-psxq9" Feb 26 16:00:18 crc kubenswrapper[4907]: I0226 16:00:18.644518 4907 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"rabbitmq-cluster-operator-controller-manager-dockercfg-d54ph" Feb 26 16:00:18 crc kubenswrapper[4907]: I0226 16:00:18.667972 4907 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/watcher-operator-controller-manager-76bcb69745-v2z8v" Feb 26 16:00:18 crc kubenswrapper[4907]: I0226 16:00:18.671175 4907 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-psxq9"] Feb 26 16:00:18 crc kubenswrapper[4907]: I0226 16:00:18.738120 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/e8f0195b-740f-4219-a422-9b99f2841ee5-metrics-certs\") pod \"openstack-operator-controller-manager-5c89c59655-dbrxn\" (UID: \"e8f0195b-740f-4219-a422-9b99f2841ee5\") " pod="openstack-operators/openstack-operator-controller-manager-5c89c59655-dbrxn" Feb 26 16:00:18 crc kubenswrapper[4907]: I0226 16:00:18.738342 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hqpls\" (UniqueName: \"kubernetes.io/projected/876cfd39-7856-438c-923e-1eb89fae62b0-kube-api-access-hqpls\") pod \"rabbitmq-cluster-operator-manager-668c99d594-psxq9\" (UID: \"876cfd39-7856-438c-923e-1eb89fae62b0\") " pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-psxq9" Feb 26 16:00:18 crc kubenswrapper[4907]: I0226 16:00:18.738365 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/e8f0195b-740f-4219-a422-9b99f2841ee5-webhook-certs\") pod \"openstack-operator-controller-manager-5c89c59655-dbrxn\" (UID: 
\"e8f0195b-740f-4219-a422-9b99f2841ee5\") " pod="openstack-operators/openstack-operator-controller-manager-5c89c59655-dbrxn" Feb 26 16:00:18 crc kubenswrapper[4907]: I0226 16:00:18.738433 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-t4627\" (UniqueName: \"kubernetes.io/projected/e8f0195b-740f-4219-a422-9b99f2841ee5-kube-api-access-t4627\") pod \"openstack-operator-controller-manager-5c89c59655-dbrxn\" (UID: \"e8f0195b-740f-4219-a422-9b99f2841ee5\") " pod="openstack-operators/openstack-operator-controller-manager-5c89c59655-dbrxn" Feb 26 16:00:18 crc kubenswrapper[4907]: E0226 16:00:18.738830 4907 secret.go:188] Couldn't get secret openstack-operators/metrics-server-cert: secret "metrics-server-cert" not found Feb 26 16:00:18 crc kubenswrapper[4907]: E0226 16:00:18.739289 4907 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/e8f0195b-740f-4219-a422-9b99f2841ee5-metrics-certs podName:e8f0195b-740f-4219-a422-9b99f2841ee5 nodeName:}" failed. No retries permitted until 2026-02-26 16:00:19.23885938 +0000 UTC m=+1081.757421229 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/e8f0195b-740f-4219-a422-9b99f2841ee5-metrics-certs") pod "openstack-operator-controller-manager-5c89c59655-dbrxn" (UID: "e8f0195b-740f-4219-a422-9b99f2841ee5") : secret "metrics-server-cert" not found Feb 26 16:00:18 crc kubenswrapper[4907]: E0226 16:00:18.739400 4907 secret.go:188] Couldn't get secret openstack-operators/webhook-server-cert: secret "webhook-server-cert" not found Feb 26 16:00:18 crc kubenswrapper[4907]: E0226 16:00:18.739427 4907 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/e8f0195b-740f-4219-a422-9b99f2841ee5-webhook-certs podName:e8f0195b-740f-4219-a422-9b99f2841ee5 nodeName:}" failed. 
No retries permitted until 2026-02-26 16:00:19.239417784 +0000 UTC m=+1081.757979633 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "webhook-certs" (UniqueName: "kubernetes.io/secret/e8f0195b-740f-4219-a422-9b99f2841ee5-webhook-certs") pod "openstack-operator-controller-manager-5c89c59655-dbrxn" (UID: "e8f0195b-740f-4219-a422-9b99f2841ee5") : secret "webhook-server-cert" not found Feb 26 16:00:18 crc kubenswrapper[4907]: I0226 16:00:18.758162 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-t4627\" (UniqueName: \"kubernetes.io/projected/e8f0195b-740f-4219-a422-9b99f2841ee5-kube-api-access-t4627\") pod \"openstack-operator-controller-manager-5c89c59655-dbrxn\" (UID: \"e8f0195b-740f-4219-a422-9b99f2841ee5\") " pod="openstack-operators/openstack-operator-controller-manager-5c89c59655-dbrxn" Feb 26 16:00:18 crc kubenswrapper[4907]: I0226 16:00:18.839987 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hqpls\" (UniqueName: \"kubernetes.io/projected/876cfd39-7856-438c-923e-1eb89fae62b0-kube-api-access-hqpls\") pod \"rabbitmq-cluster-operator-manager-668c99d594-psxq9\" (UID: \"876cfd39-7856-438c-923e-1eb89fae62b0\") " pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-psxq9" Feb 26 16:00:18 crc kubenswrapper[4907]: I0226 16:00:18.870659 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hqpls\" (UniqueName: \"kubernetes.io/projected/876cfd39-7856-438c-923e-1eb89fae62b0-kube-api-access-hqpls\") pod \"rabbitmq-cluster-operator-manager-668c99d594-psxq9\" (UID: \"876cfd39-7856-438c-923e-1eb89fae62b0\") " pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-psxq9" Feb 26 16:00:19 crc kubenswrapper[4907]: I0226 16:00:19.016864 4907 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/glance-operator-controller-manager-7f748f8b74-q55xl"] Feb 26 16:00:19 crc 
kubenswrapper[4907]: W0226 16:00:19.024199 4907 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pode57bde5d_eca0_458a_af67_2f45ce85c54f.slice/crio-9ed13b695f9a4276998a5d4b4984c7e3c019d3280c13597d4de75080791c238a WatchSource:0}: Error finding container 9ed13b695f9a4276998a5d4b4984c7e3c019d3280c13597d4de75080791c238a: Status 404 returned error can't find the container with id 9ed13b695f9a4276998a5d4b4984c7e3c019d3280c13597d4de75080791c238a Feb 26 16:00:19 crc kubenswrapper[4907]: W0226 16:00:19.039566 4907 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod41934925_b8e2_4927_a9a6_07defdda378c.slice/crio-025f5cada1b8cb5d6c70b0ee9d26a9139d8720253ae2b54a25f3c6d6794b1c1e WatchSource:0}: Error finding container 025f5cada1b8cb5d6c70b0ee9d26a9139d8720253ae2b54a25f3c6d6794b1c1e: Status 404 returned error can't find the container with id 025f5cada1b8cb5d6c70b0ee9d26a9139d8720253ae2b54a25f3c6d6794b1c1e Feb 26 16:00:19 crc kubenswrapper[4907]: I0226 16:00:19.039608 4907 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-psxq9" Feb 26 16:00:19 crc kubenswrapper[4907]: I0226 16:00:19.040735 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/glance-operator-controller-manager-7f748f8b74-q55xl" event={"ID":"e57bde5d-eca0-458a-af67-2f45ce85c54f","Type":"ContainerStarted","Data":"9ed13b695f9a4276998a5d4b4984c7e3c019d3280c13597d4de75080791c238a"} Feb 26 16:00:19 crc kubenswrapper[4907]: I0226 16:00:19.040960 4907 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/designate-operator-controller-manager-55cc45767f-nxx6j"] Feb 26 16:00:19 crc kubenswrapper[4907]: I0226 16:00:19.060759 4907 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/cinder-operator-controller-manager-768c8b45bb-k4hzr"] Feb 26 16:00:19 crc kubenswrapper[4907]: W0226 16:00:19.067752 4907 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poda9988ddc_f970_4dac_bcd0_92266f0c7494.slice/crio-3cafe5d4937d54fc6d34e1823a829ff624606e7ee46460cbc7d85f8cb464a6df WatchSource:0}: Error finding container 3cafe5d4937d54fc6d34e1823a829ff624606e7ee46460cbc7d85f8cb464a6df: Status 404 returned error can't find the container with id 3cafe5d4937d54fc6d34e1823a829ff624606e7ee46460cbc7d85f8cb464a6df Feb 26 16:00:19 crc kubenswrapper[4907]: I0226 16:00:19.090537 4907 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/heat-operator-controller-manager-9595d6797-m4jb4"] Feb 26 16:00:19 crc kubenswrapper[4907]: W0226 16:00:19.095489 4907 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod7fc27253_f8a7_4b6c_b83a_d32cdadb162d.slice/crio-e444e6052428abd342d14705bdbc8a875371ef1c718c2a9639de784806dada31 WatchSource:0}: Error finding container e444e6052428abd342d14705bdbc8a875371ef1c718c2a9639de784806dada31: 
Status 404 returned error can't find the container with id e444e6052428abd342d14705bdbc8a875371ef1c718c2a9639de784806dada31 Feb 26 16:00:19 crc kubenswrapper[4907]: W0226 16:00:19.097926 4907 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod44c123c9_ac46_4afe_b6d8_773f70ecc033.slice/crio-b807109340c7e1708adc4e366ef43fb99475cb362eb0a35f9ced619e3cb7dead WatchSource:0}: Error finding container b807109340c7e1708adc4e366ef43fb99475cb362eb0a35f9ced619e3cb7dead: Status 404 returned error can't find the container with id b807109340c7e1708adc4e366ef43fb99475cb362eb0a35f9ced619e3cb7dead Feb 26 16:00:19 crc kubenswrapper[4907]: I0226 16:00:19.100264 4907 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/barbican-operator-controller-manager-c4b7d6946-58hjs"] Feb 26 16:00:19 crc kubenswrapper[4907]: I0226 16:00:19.143208 4907 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/horizon-operator-controller-manager-54fb488b88-6hchw"] Feb 26 16:00:19 crc kubenswrapper[4907]: I0226 16:00:19.143776 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/13df9f9f-0740-41d3-b193-0517c76d2830-cert\") pod \"infra-operator-controller-manager-66d6b5f488-g7cb4\" (UID: \"13df9f9f-0740-41d3-b193-0517c76d2830\") " pod="openstack-operators/infra-operator-controller-manager-66d6b5f488-g7cb4" Feb 26 16:00:19 crc kubenswrapper[4907]: E0226 16:00:19.145310 4907 secret.go:188] Couldn't get secret openstack-operators/infra-operator-webhook-server-cert: secret "infra-operator-webhook-server-cert" not found Feb 26 16:00:19 crc kubenswrapper[4907]: E0226 16:00:19.145395 4907 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/13df9f9f-0740-41d3-b193-0517c76d2830-cert podName:13df9f9f-0740-41d3-b193-0517c76d2830 nodeName:}" failed. 
No retries permitted until 2026-02-26 16:00:21.145381108 +0000 UTC m=+1083.663942957 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/13df9f9f-0740-41d3-b193-0517c76d2830-cert") pod "infra-operator-controller-manager-66d6b5f488-g7cb4" (UID: "13df9f9f-0740-41d3-b193-0517c76d2830") : secret "infra-operator-webhook-server-cert" not found Feb 26 16:00:19 crc kubenswrapper[4907]: I0226 16:00:19.245319 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/e8f0195b-740f-4219-a422-9b99f2841ee5-metrics-certs\") pod \"openstack-operator-controller-manager-5c89c59655-dbrxn\" (UID: \"e8f0195b-740f-4219-a422-9b99f2841ee5\") " pod="openstack-operators/openstack-operator-controller-manager-5c89c59655-dbrxn" Feb 26 16:00:19 crc kubenswrapper[4907]: I0226 16:00:19.245394 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/e8f0195b-740f-4219-a422-9b99f2841ee5-webhook-certs\") pod \"openstack-operator-controller-manager-5c89c59655-dbrxn\" (UID: \"e8f0195b-740f-4219-a422-9b99f2841ee5\") " pod="openstack-operators/openstack-operator-controller-manager-5c89c59655-dbrxn" Feb 26 16:00:19 crc kubenswrapper[4907]: E0226 16:00:19.245531 4907 secret.go:188] Couldn't get secret openstack-operators/webhook-server-cert: secret "webhook-server-cert" not found Feb 26 16:00:19 crc kubenswrapper[4907]: E0226 16:00:19.245575 4907 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/e8f0195b-740f-4219-a422-9b99f2841ee5-webhook-certs podName:e8f0195b-740f-4219-a422-9b99f2841ee5 nodeName:}" failed. No retries permitted until 2026-02-26 16:00:20.245561814 +0000 UTC m=+1082.764123663 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "webhook-certs" (UniqueName: "kubernetes.io/secret/e8f0195b-740f-4219-a422-9b99f2841ee5-webhook-certs") pod "openstack-operator-controller-manager-5c89c59655-dbrxn" (UID: "e8f0195b-740f-4219-a422-9b99f2841ee5") : secret "webhook-server-cert" not found Feb 26 16:00:19 crc kubenswrapper[4907]: E0226 16:00:19.245875 4907 secret.go:188] Couldn't get secret openstack-operators/metrics-server-cert: secret "metrics-server-cert" not found Feb 26 16:00:19 crc kubenswrapper[4907]: E0226 16:00:19.245898 4907 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/e8f0195b-740f-4219-a422-9b99f2841ee5-metrics-certs podName:e8f0195b-740f-4219-a422-9b99f2841ee5 nodeName:}" failed. No retries permitted until 2026-02-26 16:00:20.245890162 +0000 UTC m=+1082.764452011 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/e8f0195b-740f-4219-a422-9b99f2841ee5-metrics-certs") pod "openstack-operator-controller-manager-5c89c59655-dbrxn" (UID: "e8f0195b-740f-4219-a422-9b99f2841ee5") : secret "metrics-server-cert" not found Feb 26 16:00:19 crc kubenswrapper[4907]: I0226 16:00:19.263420 4907 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/keystone-operator-controller-manager-6c78d668d5-245bf"] Feb 26 16:00:19 crc kubenswrapper[4907]: I0226 16:00:19.274219 4907 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/ironic-operator-controller-manager-6494cdbf8f-2r2t2"] Feb 26 16:00:19 crc kubenswrapper[4907]: I0226 16:00:19.286247 4907 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/ovn-operator-controller-manager-85c99d655-t27pd"] Feb 26 16:00:19 crc kubenswrapper[4907]: W0226 16:00:19.288224 4907 manager.go:1169] Failed to process watch event {EventType:0 
Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod51842918_6f0f_4599_b288_84c75e4390ad.slice/crio-f2b8d99a2317f228b352574ef09e80eec75778aea76c52ba8fd8ac6cc469db03 WatchSource:0}: Error finding container f2b8d99a2317f228b352574ef09e80eec75778aea76c52ba8fd8ac6cc469db03: Status 404 returned error can't find the container with id f2b8d99a2317f228b352574ef09e80eec75778aea76c52ba8fd8ac6cc469db03 Feb 26 16:00:19 crc kubenswrapper[4907]: I0226 16:00:19.456101 4907 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/octavia-operator-controller-manager-77b8b67585-x8222"] Feb 26 16:00:19 crc kubenswrapper[4907]: W0226 16:00:19.464811 4907 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod5dac7dc1_cf0e_4962_956e_800b57e369e1.slice/crio-847e939673bc42494918000432a5564b959bb895456d77e5053243addc56e8d9 WatchSource:0}: Error finding container 847e939673bc42494918000432a5564b959bb895456d77e5053243addc56e8d9: Status 404 returned error can't find the container with id 847e939673bc42494918000432a5564b959bb895456d77e5053243addc56e8d9 Feb 26 16:00:19 crc kubenswrapper[4907]: I0226 16:00:19.465947 4907 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/mariadb-operator-controller-manager-6dc9b6ff89-vtc25"] Feb 26 16:00:19 crc kubenswrapper[4907]: W0226 16:00:19.483849 4907 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod3c5efb12_7704_4d2a_9ea6_aa35436391ae.slice/crio-40b0f7f5515f527f3259d0e4db846ca424b9806e1eb04d6eed5568e5f387e051 WatchSource:0}: Error finding container 40b0f7f5515f527f3259d0e4db846ca424b9806e1eb04d6eed5568e5f387e051: Status 404 returned error can't find the container with id 40b0f7f5515f527f3259d0e4db846ca424b9806e1eb04d6eed5568e5f387e051 Feb 26 16:00:19 crc kubenswrapper[4907]: I0226 16:00:19.487181 4907 kubelet.go:2428] "SyncLoop UPDATE" 
source="api" pods=["openstack-operators/swift-operator-controller-manager-79558bbfbf-g2mlp"] Feb 26 16:00:19 crc kubenswrapper[4907]: I0226 16:00:19.508272 4907 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/neutron-operator-controller-manager-54967dbbdf-24rjt"] Feb 26 16:00:19 crc kubenswrapper[4907]: I0226 16:00:19.521943 4907 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/manila-operator-controller-manager-76fd76856-pk8zs"] Feb 26 16:00:19 crc kubenswrapper[4907]: I0226 16:00:19.533735 4907 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/placement-operator-controller-manager-57bd55f9b7-mxbcg"] Feb 26 16:00:19 crc kubenswrapper[4907]: I0226 16:00:19.538614 4907 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/nova-operator-controller-manager-5d56fd956f-6znnd"] Feb 26 16:00:19 crc kubenswrapper[4907]: I0226 16:00:19.548059 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/1bcfd62b-212e-4efc-b0be-f0542e186f07-cert\") pod \"openstack-baremetal-operator-controller-manager-c5677dc5d-mmllt\" (UID: \"1bcfd62b-212e-4efc-b0be-f0542e186f07\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-c5677dc5d-mmllt" Feb 26 16:00:19 crc kubenswrapper[4907]: E0226 16:00:19.548295 4907 secret.go:188] Couldn't get secret openstack-operators/openstack-baremetal-operator-webhook-server-cert: secret "openstack-baremetal-operator-webhook-server-cert" not found Feb 26 16:00:19 crc kubenswrapper[4907]: E0226 16:00:19.548340 4907 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/1bcfd62b-212e-4efc-b0be-f0542e186f07-cert podName:1bcfd62b-212e-4efc-b0be-f0542e186f07 nodeName:}" failed. No retries permitted until 2026-02-26 16:00:21.548327117 +0000 UTC m=+1084.066888966 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/1bcfd62b-212e-4efc-b0be-f0542e186f07-cert") pod "openstack-baremetal-operator-controller-manager-c5677dc5d-mmllt" (UID: "1bcfd62b-212e-4efc-b0be-f0542e186f07") : secret "openstack-baremetal-operator-webhook-server-cert" not found Feb 26 16:00:19 crc kubenswrapper[4907]: W0226 16:00:19.548995 4907 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod1bba2156_1275_4aa3_8eba_3ce7c3c85d72.slice/crio-7b229d06630833eb9db65ad2c0768c348eb6f32ff408997d8a048eb280200196 WatchSource:0}: Error finding container 7b229d06630833eb9db65ad2c0768c348eb6f32ff408997d8a048eb280200196: Status 404 returned error can't find the container with id 7b229d06630833eb9db65ad2c0768c348eb6f32ff408997d8a048eb280200196 Feb 26 16:00:19 crc kubenswrapper[4907]: E0226 16:00:19.549200 4907 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/neutron-operator@sha256:8d65a2becf279bb8b6b1a09e273d9a2cb1ff41f85bc42ef2e4d573cbb8cbac89,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} 
BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-fc2q8,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod neutron-operator-controller-manager-54967dbbdf-24rjt_openstack-operators(25c72e04-6714-4c5b-a273-a21a1415c4ac): ErrImagePull: pull QPS exceeded" logger="UnhandledError" Feb 26 16:00:19 crc kubenswrapper[4907]: E0226 16:00:19.550787 4907 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"pull QPS exceeded\"" pod="openstack-operators/neutron-operator-controller-manager-54967dbbdf-24rjt" podUID="25c72e04-6714-4c5b-a273-a21a1415c4ac" Feb 26 16:00:19 crc 
kubenswrapper[4907]: W0226 16:00:19.553750 4907 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod9fb09a9c_025a_4bc0_81a0_c127fee3f6f3.slice/crio-3d99a34f7fb6570583e6ea914522088722cf31bdb7a0ad3d603b9aa28d98fd78 WatchSource:0}: Error finding container 3d99a34f7fb6570583e6ea914522088722cf31bdb7a0ad3d603b9aa28d98fd78: Status 404 returned error can't find the container with id 3d99a34f7fb6570583e6ea914522088722cf31bdb7a0ad3d603b9aa28d98fd78 Feb 26 16:00:19 crc kubenswrapper[4907]: E0226 16:00:19.553824 4907 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/placement-operator@sha256:d800f1288d1517d84a45ddd475c3c0b4e8686fd900c9edf1e20b662b15218b89,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-5j28p,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 
},Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod placement-operator-controller-manager-57bd55f9b7-mxbcg_openstack-operators(1bba2156-1275-4aa3-8eba-3ce7c3c85d72): ErrImagePull: pull QPS exceeded" logger="UnhandledError" Feb 26 16:00:19 crc kubenswrapper[4907]: E0226 16:00:19.556109 4907 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"pull QPS exceeded\"" pod="openstack-operators/placement-operator-controller-manager-57bd55f9b7-mxbcg" podUID="1bba2156-1275-4aa3-8eba-3ce7c3c85d72" Feb 26 16:00:19 crc kubenswrapper[4907]: E0226 16:00:19.578337 4907 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/nova-operator@sha256:134ba6286a71d80b32e0acad212d905cbe6a87c8d7aebdca2dd4a7a9ce09e529,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 
--metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-klnrd,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 
},Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod nova-operator-controller-manager-5d56fd956f-6znnd_openstack-operators(9a69dc6a-4034-4e7d-8b6f-576ccd828cf6): ErrImagePull: pull QPS exceeded" logger="UnhandledError" Feb 26 16:00:19 crc kubenswrapper[4907]: E0226 16:00:19.578467 4907 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/manila-operator@sha256:a5396a8d7e5ca6ddabfa92744f0d4adab9de0bbe712e8cdab1bf13576b7ac8c8,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} 
BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-z8gc5,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod manila-operator-controller-manager-76fd76856-pk8zs_openstack-operators(9fb09a9c-025a-4bc0-81a0-c127fee3f6f3): ErrImagePull: pull QPS exceeded" logger="UnhandledError" Feb 26 16:00:19 crc kubenswrapper[4907]: E0226 16:00:19.579624 4907 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"pull QPS exceeded\"" pod="openstack-operators/manila-operator-controller-manager-76fd76856-pk8zs" podUID="9fb09a9c-025a-4bc0-81a0-c127fee3f6f3" Feb 26 16:00:19 crc 
kubenswrapper[4907]: E0226 16:00:19.579641 4907 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"pull QPS exceeded\"" pod="openstack-operators/nova-operator-controller-manager-5d56fd956f-6znnd" podUID="9a69dc6a-4034-4e7d-8b6f-576ccd828cf6" Feb 26 16:00:19 crc kubenswrapper[4907]: I0226 16:00:19.658648 4907 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-psxq9"] Feb 26 16:00:19 crc kubenswrapper[4907]: I0226 16:00:19.674167 4907 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/telemetry-operator-controller-manager-56dc67d744-w7qpb"] Feb 26 16:00:19 crc kubenswrapper[4907]: I0226 16:00:19.686030 4907 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/test-operator-controller-manager-8467ccb4c8-r6ndg"] Feb 26 16:00:19 crc kubenswrapper[4907]: I0226 16:00:19.692238 4907 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/watcher-operator-controller-manager-76bcb69745-v2z8v"] Feb 26 16:00:19 crc kubenswrapper[4907]: E0226 16:00:19.693127 4907 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/test-operator@sha256:f9b2e00617c7f219932ea0d5e2bb795cc4361a335a72743077948d8108695c27,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: 
{{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-6s2s2,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod test-operator-controller-manager-8467ccb4c8-r6ndg_openstack-operators(872f261b-cbf5-47b6-99ce-ee5c0d9794a3): ErrImagePull: pull QPS exceeded" logger="UnhandledError" Feb 26 16:00:19 crc kubenswrapper[4907]: E0226 16:00:19.694301 4907 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"pull QPS exceeded\"" pod="openstack-operators/test-operator-controller-manager-8467ccb4c8-r6ndg" podUID="872f261b-cbf5-47b6-99ce-ee5c0d9794a3" Feb 26 
16:00:19 crc kubenswrapper[4907]: W0226 16:00:19.700524 4907 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podedeb6783_da9a_4f17_8ebe_e234aeeb35fd.slice/crio-ca20fd59ee13a2d82dbf8e9ddf3d0031e59ffb300d348842980375d0e5ad9a66 WatchSource:0}: Error finding container ca20fd59ee13a2d82dbf8e9ddf3d0031e59ffb300d348842980375d0e5ad9a66: Status 404 returned error can't find the container with id ca20fd59ee13a2d82dbf8e9ddf3d0031e59ffb300d348842980375d0e5ad9a66 Feb 26 16:00:19 crc kubenswrapper[4907]: E0226 16:00:19.707358 4907 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/watcher-operator@sha256:f33fc1f2e53ff4baa4e16b41f37aaa7273dcea0ef4b5a3949411e2f105c73e5e,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-hg8qx,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 
},Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod watcher-operator-controller-manager-76bcb69745-v2z8v_openstack-operators(edeb6783-da9a-4f17-8ebe-e234aeeb35fd): ErrImagePull: pull QPS exceeded" logger="UnhandledError" Feb 26 16:00:19 crc kubenswrapper[4907]: E0226 16:00:19.708484 4907 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"pull QPS exceeded\"" pod="openstack-operators/watcher-operator-controller-manager-76bcb69745-v2z8v" podUID="edeb6783-da9a-4f17-8ebe-e234aeeb35fd" Feb 26 16:00:19 crc kubenswrapper[4907]: E0226 16:00:19.710441 4907 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/telemetry-operator@sha256:4b10e23983c3ec518c35aeabb33ac228063e56c81b4d7a100c5d91139ad7d7fc,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 
--metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-bms45,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 
},Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod telemetry-operator-controller-manager-56dc67d744-w7qpb_openstack-operators(2c9290e8-c587-48aa-8ea2-66b772c9341c): ErrImagePull: pull QPS exceeded" logger="UnhandledError" Feb 26 16:00:19 crc kubenswrapper[4907]: E0226 16:00:19.711561 4907 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"pull QPS exceeded\"" pod="openstack-operators/telemetry-operator-controller-manager-56dc67d744-w7qpb" podUID="2c9290e8-c587-48aa-8ea2-66b772c9341c" Feb 26 16:00:20 crc kubenswrapper[4907]: I0226 16:00:20.052543 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/heat-operator-controller-manager-9595d6797-m4jb4" event={"ID":"44c123c9-ac46-4afe-b6d8-773f70ecc033","Type":"ContainerStarted","Data":"b807109340c7e1708adc4e366ef43fb99475cb362eb0a35f9ced619e3cb7dead"} Feb 26 16:00:20 crc kubenswrapper[4907]: I0226 16:00:20.054701 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/telemetry-operator-controller-manager-56dc67d744-w7qpb" 
event={"ID":"2c9290e8-c587-48aa-8ea2-66b772c9341c","Type":"ContainerStarted","Data":"3ceadedb13b4bca4374833f88302bfc2c82b3d9802b0101f5d6b55e613ecda85"} Feb 26 16:00:20 crc kubenswrapper[4907]: E0226 16:00:20.056303 4907 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/telemetry-operator@sha256:4b10e23983c3ec518c35aeabb33ac228063e56c81b4d7a100c5d91139ad7d7fc\\\"\"" pod="openstack-operators/telemetry-operator-controller-manager-56dc67d744-w7qpb" podUID="2c9290e8-c587-48aa-8ea2-66b772c9341c" Feb 26 16:00:20 crc kubenswrapper[4907]: I0226 16:00:20.057983 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/nova-operator-controller-manager-5d56fd956f-6znnd" event={"ID":"9a69dc6a-4034-4e7d-8b6f-576ccd828cf6","Type":"ContainerStarted","Data":"390f9beedd54c45566ef3aa555e3db4e62d55f53899ade79e0e81e99bd2d43a9"} Feb 26 16:00:20 crc kubenswrapper[4907]: E0226 16:00:20.060117 4907 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/nova-operator@sha256:134ba6286a71d80b32e0acad212d905cbe6a87c8d7aebdca2dd4a7a9ce09e529\\\"\"" pod="openstack-operators/nova-operator-controller-manager-5d56fd956f-6znnd" podUID="9a69dc6a-4034-4e7d-8b6f-576ccd828cf6" Feb 26 16:00:20 crc kubenswrapper[4907]: I0226 16:00:20.064997 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/watcher-operator-controller-manager-76bcb69745-v2z8v" event={"ID":"edeb6783-da9a-4f17-8ebe-e234aeeb35fd","Type":"ContainerStarted","Data":"ca20fd59ee13a2d82dbf8e9ddf3d0031e59ffb300d348842980375d0e5ad9a66"} Feb 26 16:00:20 crc kubenswrapper[4907]: E0226 16:00:20.067644 4907 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off 
pulling image \\\"quay.io/openstack-k8s-operators/watcher-operator@sha256:f33fc1f2e53ff4baa4e16b41f37aaa7273dcea0ef4b5a3949411e2f105c73e5e\\\"\"" pod="openstack-operators/watcher-operator-controller-manager-76bcb69745-v2z8v" podUID="edeb6783-da9a-4f17-8ebe-e234aeeb35fd" Feb 26 16:00:20 crc kubenswrapper[4907]: I0226 16:00:20.077555 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/horizon-operator-controller-manager-54fb488b88-6hchw" event={"ID":"ac6b0a27-6eaf-4d88-af65-94c64180c950","Type":"ContainerStarted","Data":"178dc6afc588ff1d91bf17c3f6b7c26e7e9654b8d599824ce35e976a84d2a1f3"} Feb 26 16:00:20 crc kubenswrapper[4907]: I0226 16:00:20.088113 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/mariadb-operator-controller-manager-6dc9b6ff89-vtc25" event={"ID":"3c5efb12-7704-4d2a-9ea6-aa35436391ae","Type":"ContainerStarted","Data":"40b0f7f5515f527f3259d0e4db846ca424b9806e1eb04d6eed5568e5f387e051"} Feb 26 16:00:20 crc kubenswrapper[4907]: I0226 16:00:20.090308 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/neutron-operator-controller-manager-54967dbbdf-24rjt" event={"ID":"25c72e04-6714-4c5b-a273-a21a1415c4ac","Type":"ContainerStarted","Data":"c8b5a4a0f66a9452f1646db9fb353730d79f8ba70b0269fc3dbddb82cfe754b5"} Feb 26 16:00:20 crc kubenswrapper[4907]: E0226 16:00:20.092718 4907 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/neutron-operator@sha256:8d65a2becf279bb8b6b1a09e273d9a2cb1ff41f85bc42ef2e4d573cbb8cbac89\\\"\"" pod="openstack-operators/neutron-operator-controller-manager-54967dbbdf-24rjt" podUID="25c72e04-6714-4c5b-a273-a21a1415c4ac" Feb 26 16:00:20 crc kubenswrapper[4907]: I0226 16:00:20.094036 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/swift-operator-controller-manager-79558bbfbf-g2mlp" 
event={"ID":"311f46b9-23bf-49b6-a2a5-919c8e42c62a","Type":"ContainerStarted","Data":"d59267985933c290945d7a2d9365d0c96b589419481e9967438237819f66cb14"} Feb 26 16:00:20 crc kubenswrapper[4907]: I0226 16:00:20.095616 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/test-operator-controller-manager-8467ccb4c8-r6ndg" event={"ID":"872f261b-cbf5-47b6-99ce-ee5c0d9794a3","Type":"ContainerStarted","Data":"14cf0d6830080b9466ebe3f5af10728939ea522266a66c561a4f28ebcef14fd7"} Feb 26 16:00:20 crc kubenswrapper[4907]: I0226 16:00:20.106975 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/keystone-operator-controller-manager-6c78d668d5-245bf" event={"ID":"f7c1fe7a-3983-49ff-bcde-36338aadc657","Type":"ContainerStarted","Data":"52a7bcf1509e1ada7c01aa0ad937374508180643c572e50dc739e094be80bbf9"} Feb 26 16:00:20 crc kubenswrapper[4907]: E0226 16:00:20.108284 4907 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/test-operator@sha256:f9b2e00617c7f219932ea0d5e2bb795cc4361a335a72743077948d8108695c27\\\"\"" pod="openstack-operators/test-operator-controller-manager-8467ccb4c8-r6ndg" podUID="872f261b-cbf5-47b6-99ce-ee5c0d9794a3" Feb 26 16:00:20 crc kubenswrapper[4907]: I0226 16:00:20.111847 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/designate-operator-controller-manager-55cc45767f-nxx6j" event={"ID":"41934925-b8e2-4927-a9a6-07defdda378c","Type":"ContainerStarted","Data":"025f5cada1b8cb5d6c70b0ee9d26a9139d8720253ae2b54a25f3c6d6794b1c1e"} Feb 26 16:00:20 crc kubenswrapper[4907]: I0226 16:00:20.115481 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/octavia-operator-controller-manager-77b8b67585-x8222" 
event={"ID":"5dac7dc1-cf0e-4962-956e-800b57e369e1","Type":"ContainerStarted","Data":"847e939673bc42494918000432a5564b959bb895456d77e5053243addc56e8d9"} Feb 26 16:00:20 crc kubenswrapper[4907]: I0226 16:00:20.120720 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/manila-operator-controller-manager-76fd76856-pk8zs" event={"ID":"9fb09a9c-025a-4bc0-81a0-c127fee3f6f3","Type":"ContainerStarted","Data":"3d99a34f7fb6570583e6ea914522088722cf31bdb7a0ad3d603b9aa28d98fd78"} Feb 26 16:00:20 crc kubenswrapper[4907]: I0226 16:00:20.121831 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-psxq9" event={"ID":"876cfd39-7856-438c-923e-1eb89fae62b0","Type":"ContainerStarted","Data":"d2c1294ec8862afae51d340ca19f3e7b46931832b678c78f2e0e4f70ad450561"} Feb 26 16:00:20 crc kubenswrapper[4907]: I0226 16:00:20.122983 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/ovn-operator-controller-manager-85c99d655-t27pd" event={"ID":"51842918-6f0f-4599-b288-84c75e4390ad","Type":"ContainerStarted","Data":"f2b8d99a2317f228b352574ef09e80eec75778aea76c52ba8fd8ac6cc469db03"} Feb 26 16:00:20 crc kubenswrapper[4907]: E0226 16:00:20.126927 4907 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/manila-operator@sha256:a5396a8d7e5ca6ddabfa92744f0d4adab9de0bbe712e8cdab1bf13576b7ac8c8\\\"\"" pod="openstack-operators/manila-operator-controller-manager-76fd76856-pk8zs" podUID="9fb09a9c-025a-4bc0-81a0-c127fee3f6f3" Feb 26 16:00:20 crc kubenswrapper[4907]: E0226 16:00:20.156956 4907 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image 
\\\"quay.io/openstack-k8s-operators/placement-operator@sha256:d800f1288d1517d84a45ddd475c3c0b4e8686fd900c9edf1e20b662b15218b89\\\"\"" pod="openstack-operators/placement-operator-controller-manager-57bd55f9b7-mxbcg" podUID="1bba2156-1275-4aa3-8eba-3ce7c3c85d72" Feb 26 16:00:20 crc kubenswrapper[4907]: I0226 16:00:20.181660 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/ironic-operator-controller-manager-6494cdbf8f-2r2t2" event={"ID":"142a17bc-42dd-41ab-a97c-21350948ca5d","Type":"ContainerStarted","Data":"800417a90fda36f559c2298246c6ca9f8622f57a762bba6b727ef28876804eb8"} Feb 26 16:00:20 crc kubenswrapper[4907]: I0226 16:00:20.181719 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/placement-operator-controller-manager-57bd55f9b7-mxbcg" event={"ID":"1bba2156-1275-4aa3-8eba-3ce7c3c85d72","Type":"ContainerStarted","Data":"7b229d06630833eb9db65ad2c0768c348eb6f32ff408997d8a048eb280200196"} Feb 26 16:00:20 crc kubenswrapper[4907]: I0226 16:00:20.181751 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/barbican-operator-controller-manager-c4b7d6946-58hjs" event={"ID":"7fc27253-f8a7-4b6c-b83a-d32cdadb162d","Type":"ContainerStarted","Data":"e444e6052428abd342d14705bdbc8a875371ef1c718c2a9639de784806dada31"} Feb 26 16:00:20 crc kubenswrapper[4907]: I0226 16:00:20.181765 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/cinder-operator-controller-manager-768c8b45bb-k4hzr" event={"ID":"a9988ddc-f970-4dac-bcd0-92266f0c7494","Type":"ContainerStarted","Data":"3cafe5d4937d54fc6d34e1823a829ff624606e7ee46460cbc7d85f8cb464a6df"} Feb 26 16:00:20 crc kubenswrapper[4907]: I0226 16:00:20.264024 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/e8f0195b-740f-4219-a422-9b99f2841ee5-metrics-certs\") pod \"openstack-operator-controller-manager-5c89c59655-dbrxn\" (UID: 
\"e8f0195b-740f-4219-a422-9b99f2841ee5\") " pod="openstack-operators/openstack-operator-controller-manager-5c89c59655-dbrxn" Feb 26 16:00:20 crc kubenswrapper[4907]: I0226 16:00:20.264316 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/e8f0195b-740f-4219-a422-9b99f2841ee5-webhook-certs\") pod \"openstack-operator-controller-manager-5c89c59655-dbrxn\" (UID: \"e8f0195b-740f-4219-a422-9b99f2841ee5\") " pod="openstack-operators/openstack-operator-controller-manager-5c89c59655-dbrxn" Feb 26 16:00:20 crc kubenswrapper[4907]: E0226 16:00:20.264205 4907 secret.go:188] Couldn't get secret openstack-operators/metrics-server-cert: secret "metrics-server-cert" not found Feb 26 16:00:20 crc kubenswrapper[4907]: E0226 16:00:20.264763 4907 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/e8f0195b-740f-4219-a422-9b99f2841ee5-metrics-certs podName:e8f0195b-740f-4219-a422-9b99f2841ee5 nodeName:}" failed. No retries permitted until 2026-02-26 16:00:22.264738845 +0000 UTC m=+1084.783300704 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/e8f0195b-740f-4219-a422-9b99f2841ee5-metrics-certs") pod "openstack-operator-controller-manager-5c89c59655-dbrxn" (UID: "e8f0195b-740f-4219-a422-9b99f2841ee5") : secret "metrics-server-cert" not found Feb 26 16:00:20 crc kubenswrapper[4907]: E0226 16:00:20.264855 4907 secret.go:188] Couldn't get secret openstack-operators/webhook-server-cert: secret "webhook-server-cert" not found Feb 26 16:00:20 crc kubenswrapper[4907]: E0226 16:00:20.264913 4907 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/e8f0195b-740f-4219-a422-9b99f2841ee5-webhook-certs podName:e8f0195b-740f-4219-a422-9b99f2841ee5 nodeName:}" failed. No retries permitted until 2026-02-26 16:00:22.264897839 +0000 UTC m=+1084.783459688 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "webhook-certs" (UniqueName: "kubernetes.io/secret/e8f0195b-740f-4219-a422-9b99f2841ee5-webhook-certs") pod "openstack-operator-controller-manager-5c89c59655-dbrxn" (UID: "e8f0195b-740f-4219-a422-9b99f2841ee5") : secret "webhook-server-cert" not found Feb 26 16:00:21 crc kubenswrapper[4907]: I0226 16:00:21.178970 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/13df9f9f-0740-41d3-b193-0517c76d2830-cert\") pod \"infra-operator-controller-manager-66d6b5f488-g7cb4\" (UID: \"13df9f9f-0740-41d3-b193-0517c76d2830\") " pod="openstack-operators/infra-operator-controller-manager-66d6b5f488-g7cb4" Feb 26 16:00:21 crc kubenswrapper[4907]: E0226 16:00:21.179665 4907 secret.go:188] Couldn't get secret openstack-operators/infra-operator-webhook-server-cert: secret "infra-operator-webhook-server-cert" not found Feb 26 16:00:21 crc kubenswrapper[4907]: E0226 16:00:21.179716 4907 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/13df9f9f-0740-41d3-b193-0517c76d2830-cert podName:13df9f9f-0740-41d3-b193-0517c76d2830 nodeName:}" failed. No retries permitted until 2026-02-26 16:00:25.179700699 +0000 UTC m=+1087.698262548 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/13df9f9f-0740-41d3-b193-0517c76d2830-cert") pod "infra-operator-controller-manager-66d6b5f488-g7cb4" (UID: "13df9f9f-0740-41d3-b193-0517c76d2830") : secret "infra-operator-webhook-server-cert" not found Feb 26 16:00:21 crc kubenswrapper[4907]: E0226 16:00:21.189900 4907 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/placement-operator@sha256:d800f1288d1517d84a45ddd475c3c0b4e8686fd900c9edf1e20b662b15218b89\\\"\"" pod="openstack-operators/placement-operator-controller-manager-57bd55f9b7-mxbcg" podUID="1bba2156-1275-4aa3-8eba-3ce7c3c85d72" Feb 26 16:00:21 crc kubenswrapper[4907]: E0226 16:00:21.189960 4907 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/manila-operator@sha256:a5396a8d7e5ca6ddabfa92744f0d4adab9de0bbe712e8cdab1bf13576b7ac8c8\\\"\"" pod="openstack-operators/manila-operator-controller-manager-76fd76856-pk8zs" podUID="9fb09a9c-025a-4bc0-81a0-c127fee3f6f3" Feb 26 16:00:21 crc kubenswrapper[4907]: E0226 16:00:21.189992 4907 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/test-operator@sha256:f9b2e00617c7f219932ea0d5e2bb795cc4361a335a72743077948d8108695c27\\\"\"" pod="openstack-operators/test-operator-controller-manager-8467ccb4c8-r6ndg" podUID="872f261b-cbf5-47b6-99ce-ee5c0d9794a3" Feb 26 16:00:21 crc kubenswrapper[4907]: E0226 16:00:21.190026 4907 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image 
\\\"quay.io/openstack-k8s-operators/telemetry-operator@sha256:4b10e23983c3ec518c35aeabb33ac228063e56c81b4d7a100c5d91139ad7d7fc\\\"\"" pod="openstack-operators/telemetry-operator-controller-manager-56dc67d744-w7qpb" podUID="2c9290e8-c587-48aa-8ea2-66b772c9341c" Feb 26 16:00:21 crc kubenswrapper[4907]: E0226 16:00:21.190062 4907 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/nova-operator@sha256:134ba6286a71d80b32e0acad212d905cbe6a87c8d7aebdca2dd4a7a9ce09e529\\\"\"" pod="openstack-operators/nova-operator-controller-manager-5d56fd956f-6znnd" podUID="9a69dc6a-4034-4e7d-8b6f-576ccd828cf6" Feb 26 16:00:21 crc kubenswrapper[4907]: E0226 16:00:21.190092 4907 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/neutron-operator@sha256:8d65a2becf279bb8b6b1a09e273d9a2cb1ff41f85bc42ef2e4d573cbb8cbac89\\\"\"" pod="openstack-operators/neutron-operator-controller-manager-54967dbbdf-24rjt" podUID="25c72e04-6714-4c5b-a273-a21a1415c4ac" Feb 26 16:00:21 crc kubenswrapper[4907]: E0226 16:00:21.194757 4907 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/watcher-operator@sha256:f33fc1f2e53ff4baa4e16b41f37aaa7273dcea0ef4b5a3949411e2f105c73e5e\\\"\"" pod="openstack-operators/watcher-operator-controller-manager-76bcb69745-v2z8v" podUID="edeb6783-da9a-4f17-8ebe-e234aeeb35fd" Feb 26 16:00:21 crc kubenswrapper[4907]: I0226 16:00:21.590290 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/1bcfd62b-212e-4efc-b0be-f0542e186f07-cert\") pod \"openstack-baremetal-operator-controller-manager-c5677dc5d-mmllt\" (UID: 
\"1bcfd62b-212e-4efc-b0be-f0542e186f07\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-c5677dc5d-mmllt" Feb 26 16:00:21 crc kubenswrapper[4907]: E0226 16:00:21.590473 4907 secret.go:188] Couldn't get secret openstack-operators/openstack-baremetal-operator-webhook-server-cert: secret "openstack-baremetal-operator-webhook-server-cert" not found Feb 26 16:00:21 crc kubenswrapper[4907]: E0226 16:00:21.590550 4907 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/1bcfd62b-212e-4efc-b0be-f0542e186f07-cert podName:1bcfd62b-212e-4efc-b0be-f0542e186f07 nodeName:}" failed. No retries permitted until 2026-02-26 16:00:25.590531715 +0000 UTC m=+1088.109093564 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/1bcfd62b-212e-4efc-b0be-f0542e186f07-cert") pod "openstack-baremetal-operator-controller-manager-c5677dc5d-mmllt" (UID: "1bcfd62b-212e-4efc-b0be-f0542e186f07") : secret "openstack-baremetal-operator-webhook-server-cert" not found Feb 26 16:00:22 crc kubenswrapper[4907]: I0226 16:00:22.310673 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/e8f0195b-740f-4219-a422-9b99f2841ee5-metrics-certs\") pod \"openstack-operator-controller-manager-5c89c59655-dbrxn\" (UID: \"e8f0195b-740f-4219-a422-9b99f2841ee5\") " pod="openstack-operators/openstack-operator-controller-manager-5c89c59655-dbrxn" Feb 26 16:00:22 crc kubenswrapper[4907]: I0226 16:00:22.310779 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/e8f0195b-740f-4219-a422-9b99f2841ee5-webhook-certs\") pod \"openstack-operator-controller-manager-5c89c59655-dbrxn\" (UID: \"e8f0195b-740f-4219-a422-9b99f2841ee5\") " pod="openstack-operators/openstack-operator-controller-manager-5c89c59655-dbrxn" Feb 26 16:00:22 crc kubenswrapper[4907]: E0226 
16:00:22.310949 4907 secret.go:188] Couldn't get secret openstack-operators/webhook-server-cert: secret "webhook-server-cert" not found Feb 26 16:00:22 crc kubenswrapper[4907]: E0226 16:00:22.311031 4907 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/e8f0195b-740f-4219-a422-9b99f2841ee5-webhook-certs podName:e8f0195b-740f-4219-a422-9b99f2841ee5 nodeName:}" failed. No retries permitted until 2026-02-26 16:00:26.311012473 +0000 UTC m=+1088.829574322 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "webhook-certs" (UniqueName: "kubernetes.io/secret/e8f0195b-740f-4219-a422-9b99f2841ee5-webhook-certs") pod "openstack-operator-controller-manager-5c89c59655-dbrxn" (UID: "e8f0195b-740f-4219-a422-9b99f2841ee5") : secret "webhook-server-cert" not found Feb 26 16:00:22 crc kubenswrapper[4907]: E0226 16:00:22.311358 4907 secret.go:188] Couldn't get secret openstack-operators/metrics-server-cert: secret "metrics-server-cert" not found Feb 26 16:00:22 crc kubenswrapper[4907]: E0226 16:00:22.311399 4907 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/e8f0195b-740f-4219-a422-9b99f2841ee5-metrics-certs podName:e8f0195b-740f-4219-a422-9b99f2841ee5 nodeName:}" failed. No retries permitted until 2026-02-26 16:00:26.311391153 +0000 UTC m=+1088.829953002 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/e8f0195b-740f-4219-a422-9b99f2841ee5-metrics-certs") pod "openstack-operator-controller-manager-5c89c59655-dbrxn" (UID: "e8f0195b-740f-4219-a422-9b99f2841ee5") : secret "metrics-server-cert" not found Feb 26 16:00:24 crc kubenswrapper[4907]: I0226 16:00:24.513999 4907 scope.go:117] "RemoveContainer" containerID="8807b9d17fb108cc008ef775f45367d18312cad71c70bb4b5cb43eafc391d7df" Feb 26 16:00:25 crc kubenswrapper[4907]: I0226 16:00:25.262053 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/13df9f9f-0740-41d3-b193-0517c76d2830-cert\") pod \"infra-operator-controller-manager-66d6b5f488-g7cb4\" (UID: \"13df9f9f-0740-41d3-b193-0517c76d2830\") " pod="openstack-operators/infra-operator-controller-manager-66d6b5f488-g7cb4" Feb 26 16:00:25 crc kubenswrapper[4907]: E0226 16:00:25.262241 4907 secret.go:188] Couldn't get secret openstack-operators/infra-operator-webhook-server-cert: secret "infra-operator-webhook-server-cert" not found Feb 26 16:00:25 crc kubenswrapper[4907]: E0226 16:00:25.262305 4907 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/13df9f9f-0740-41d3-b193-0517c76d2830-cert podName:13df9f9f-0740-41d3-b193-0517c76d2830 nodeName:}" failed. No retries permitted until 2026-02-26 16:00:33.262287209 +0000 UTC m=+1095.780849068 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/13df9f9f-0740-41d3-b193-0517c76d2830-cert") pod "infra-operator-controller-manager-66d6b5f488-g7cb4" (UID: "13df9f9f-0740-41d3-b193-0517c76d2830") : secret "infra-operator-webhook-server-cert" not found Feb 26 16:00:25 crc kubenswrapper[4907]: I0226 16:00:25.668354 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/1bcfd62b-212e-4efc-b0be-f0542e186f07-cert\") pod \"openstack-baremetal-operator-controller-manager-c5677dc5d-mmllt\" (UID: \"1bcfd62b-212e-4efc-b0be-f0542e186f07\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-c5677dc5d-mmllt" Feb 26 16:00:25 crc kubenswrapper[4907]: E0226 16:00:25.668556 4907 secret.go:188] Couldn't get secret openstack-operators/openstack-baremetal-operator-webhook-server-cert: secret "openstack-baremetal-operator-webhook-server-cert" not found Feb 26 16:00:25 crc kubenswrapper[4907]: E0226 16:00:25.668654 4907 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/1bcfd62b-212e-4efc-b0be-f0542e186f07-cert podName:1bcfd62b-212e-4efc-b0be-f0542e186f07 nodeName:}" failed. No retries permitted until 2026-02-26 16:00:33.668636953 +0000 UTC m=+1096.187198802 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/1bcfd62b-212e-4efc-b0be-f0542e186f07-cert") pod "openstack-baremetal-operator-controller-manager-c5677dc5d-mmllt" (UID: "1bcfd62b-212e-4efc-b0be-f0542e186f07") : secret "openstack-baremetal-operator-webhook-server-cert" not found Feb 26 16:00:26 crc kubenswrapper[4907]: I0226 16:00:26.377484 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/e8f0195b-740f-4219-a422-9b99f2841ee5-metrics-certs\") pod \"openstack-operator-controller-manager-5c89c59655-dbrxn\" (UID: \"e8f0195b-740f-4219-a422-9b99f2841ee5\") " pod="openstack-operators/openstack-operator-controller-manager-5c89c59655-dbrxn" Feb 26 16:00:26 crc kubenswrapper[4907]: I0226 16:00:26.377846 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/e8f0195b-740f-4219-a422-9b99f2841ee5-webhook-certs\") pod \"openstack-operator-controller-manager-5c89c59655-dbrxn\" (UID: \"e8f0195b-740f-4219-a422-9b99f2841ee5\") " pod="openstack-operators/openstack-operator-controller-manager-5c89c59655-dbrxn" Feb 26 16:00:26 crc kubenswrapper[4907]: E0226 16:00:26.377715 4907 secret.go:188] Couldn't get secret openstack-operators/metrics-server-cert: secret "metrics-server-cert" not found Feb 26 16:00:26 crc kubenswrapper[4907]: E0226 16:00:26.377950 4907 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/e8f0195b-740f-4219-a422-9b99f2841ee5-metrics-certs podName:e8f0195b-740f-4219-a422-9b99f2841ee5 nodeName:}" failed. No retries permitted until 2026-02-26 16:00:34.377933194 +0000 UTC m=+1096.896495043 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/e8f0195b-740f-4219-a422-9b99f2841ee5-metrics-certs") pod "openstack-operator-controller-manager-5c89c59655-dbrxn" (UID: "e8f0195b-740f-4219-a422-9b99f2841ee5") : secret "metrics-server-cert" not found Feb 26 16:00:26 crc kubenswrapper[4907]: E0226 16:00:26.377966 4907 secret.go:188] Couldn't get secret openstack-operators/webhook-server-cert: secret "webhook-server-cert" not found Feb 26 16:00:26 crc kubenswrapper[4907]: E0226 16:00:26.378013 4907 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/e8f0195b-740f-4219-a422-9b99f2841ee5-webhook-certs podName:e8f0195b-740f-4219-a422-9b99f2841ee5 nodeName:}" failed. No retries permitted until 2026-02-26 16:00:34.378000016 +0000 UTC m=+1096.896561855 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "webhook-certs" (UniqueName: "kubernetes.io/secret/e8f0195b-740f-4219-a422-9b99f2841ee5-webhook-certs") pod "openstack-operator-controller-manager-5c89c59655-dbrxn" (UID: "e8f0195b-740f-4219-a422-9b99f2841ee5") : secret "webhook-server-cert" not found Feb 26 16:00:32 crc kubenswrapper[4907]: E0226 16:00:32.970120 4907 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/openstack-k8s-operators/horizon-operator@sha256:00e0076b910b180d2ee76f7fa74f058fd1e2bee9e313f3a87c5f84bdd2600e2a" Feb 26 16:00:32 crc kubenswrapper[4907]: E0226 16:00:32.971899 4907 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/horizon-operator@sha256:00e0076b910b180d2ee76f7fa74f058fd1e2bee9e313f3a87c5f84bdd2600e2a,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 
--metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-5cldh,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 
},Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod horizon-operator-controller-manager-54fb488b88-6hchw_openstack-operators(ac6b0a27-6eaf-4d88-af65-94c64180c950): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Feb 26 16:00:32 crc kubenswrapper[4907]: E0226 16:00:32.973123 4907 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack-operators/horizon-operator-controller-manager-54fb488b88-6hchw" podUID="ac6b0a27-6eaf-4d88-af65-94c64180c950" Feb 26 16:00:33 crc kubenswrapper[4907]: E0226 16:00:33.274759 4907 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/horizon-operator@sha256:00e0076b910b180d2ee76f7fa74f058fd1e2bee9e313f3a87c5f84bdd2600e2a\\\"\"" pod="openstack-operators/horizon-operator-controller-manager-54fb488b88-6hchw" podUID="ac6b0a27-6eaf-4d88-af65-94c64180c950" Feb 26 16:00:33 crc kubenswrapper[4907]: I0226 16:00:33.287435 4907 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/13df9f9f-0740-41d3-b193-0517c76d2830-cert\") pod \"infra-operator-controller-manager-66d6b5f488-g7cb4\" (UID: \"13df9f9f-0740-41d3-b193-0517c76d2830\") " pod="openstack-operators/infra-operator-controller-manager-66d6b5f488-g7cb4" Feb 26 16:00:33 crc kubenswrapper[4907]: I0226 16:00:33.302514 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/13df9f9f-0740-41d3-b193-0517c76d2830-cert\") pod \"infra-operator-controller-manager-66d6b5f488-g7cb4\" (UID: \"13df9f9f-0740-41d3-b193-0517c76d2830\") " pod="openstack-operators/infra-operator-controller-manager-66d6b5f488-g7cb4" Feb 26 16:00:33 crc kubenswrapper[4907]: I0226 16:00:33.319744 4907 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"infra-operator-controller-manager-dockercfg-t9gtw" Feb 26 16:00:33 crc kubenswrapper[4907]: I0226 16:00:33.328644 4907 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/infra-operator-controller-manager-66d6b5f488-g7cb4" Feb 26 16:00:33 crc kubenswrapper[4907]: I0226 16:00:33.694366 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/1bcfd62b-212e-4efc-b0be-f0542e186f07-cert\") pod \"openstack-baremetal-operator-controller-manager-c5677dc5d-mmllt\" (UID: \"1bcfd62b-212e-4efc-b0be-f0542e186f07\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-c5677dc5d-mmllt" Feb 26 16:00:33 crc kubenswrapper[4907]: I0226 16:00:33.706023 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/1bcfd62b-212e-4efc-b0be-f0542e186f07-cert\") pod \"openstack-baremetal-operator-controller-manager-c5677dc5d-mmllt\" (UID: \"1bcfd62b-212e-4efc-b0be-f0542e186f07\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-c5677dc5d-mmllt" Feb 26 16:00:33 crc kubenswrapper[4907]: I0226 16:00:33.751645 4907 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"openstack-baremetal-operator-controller-manager-dockercfg-bpkhm" Feb 26 16:00:33 crc kubenswrapper[4907]: I0226 16:00:33.760099 4907 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/openstack-baremetal-operator-controller-manager-c5677dc5d-mmllt" Feb 26 16:00:34 crc kubenswrapper[4907]: I0226 16:00:34.404689 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/e8f0195b-740f-4219-a422-9b99f2841ee5-metrics-certs\") pod \"openstack-operator-controller-manager-5c89c59655-dbrxn\" (UID: \"e8f0195b-740f-4219-a422-9b99f2841ee5\") " pod="openstack-operators/openstack-operator-controller-manager-5c89c59655-dbrxn" Feb 26 16:00:34 crc kubenswrapper[4907]: I0226 16:00:34.404778 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/e8f0195b-740f-4219-a422-9b99f2841ee5-webhook-certs\") pod \"openstack-operator-controller-manager-5c89c59655-dbrxn\" (UID: \"e8f0195b-740f-4219-a422-9b99f2841ee5\") " pod="openstack-operators/openstack-operator-controller-manager-5c89c59655-dbrxn" Feb 26 16:00:34 crc kubenswrapper[4907]: E0226 16:00:34.404910 4907 secret.go:188] Couldn't get secret openstack-operators/webhook-server-cert: secret "webhook-server-cert" not found Feb 26 16:00:34 crc kubenswrapper[4907]: E0226 16:00:34.405006 4907 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/e8f0195b-740f-4219-a422-9b99f2841ee5-webhook-certs podName:e8f0195b-740f-4219-a422-9b99f2841ee5 nodeName:}" failed. No retries permitted until 2026-02-26 16:00:50.404983219 +0000 UTC m=+1112.923545078 (durationBeforeRetry 16s). 
Error: MountVolume.SetUp failed for volume "webhook-certs" (UniqueName: "kubernetes.io/secret/e8f0195b-740f-4219-a422-9b99f2841ee5-webhook-certs") pod "openstack-operator-controller-manager-5c89c59655-dbrxn" (UID: "e8f0195b-740f-4219-a422-9b99f2841ee5") : secret "webhook-server-cert" not found Feb 26 16:00:34 crc kubenswrapper[4907]: I0226 16:00:34.408693 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/e8f0195b-740f-4219-a422-9b99f2841ee5-metrics-certs\") pod \"openstack-operator-controller-manager-5c89c59655-dbrxn\" (UID: \"e8f0195b-740f-4219-a422-9b99f2841ee5\") " pod="openstack-operators/openstack-operator-controller-manager-5c89c59655-dbrxn" Feb 26 16:00:34 crc kubenswrapper[4907]: E0226 16:00:34.958662 4907 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/openstack-k8s-operators/octavia-operator@sha256:ba1d61d3e7f9410541fa1d04110d3859b22fe7de9f2a570e932be8d0e312d5fe" Feb 26 16:00:34 crc kubenswrapper[4907]: E0226 16:00:34.958866 4907 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/octavia-operator@sha256:ba1d61d3e7f9410541fa1d04110d3859b22fe7de9f2a570e932be8d0e312d5fe,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} 
BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-mjdqn,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod octavia-operator-controller-manager-77b8b67585-x8222_openstack-operators(5dac7dc1-cf0e-4962-956e-800b57e369e1): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Feb 26 16:00:34 crc kubenswrapper[4907]: E0226 16:00:34.960044 4907 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" 
pod="openstack-operators/octavia-operator-controller-manager-77b8b67585-x8222" podUID="5dac7dc1-cf0e-4962-956e-800b57e369e1" Feb 26 16:00:35 crc kubenswrapper[4907]: E0226 16:00:35.288206 4907 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/octavia-operator@sha256:ba1d61d3e7f9410541fa1d04110d3859b22fe7de9f2a570e932be8d0e312d5fe\\\"\"" pod="openstack-operators/octavia-operator-controller-manager-77b8b67585-x8222" podUID="5dac7dc1-cf0e-4962-956e-800b57e369e1" Feb 26 16:00:38 crc kubenswrapper[4907]: E0226 16:00:38.003751 4907 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/openstack-k8s-operators/ironic-operator@sha256:6859cc0cf730f0780b90700447d91238a618c05465420960d07aa894abaf05e4" Feb 26 16:00:38 crc kubenswrapper[4907]: E0226 16:00:38.004151 4907 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/ironic-operator@sha256:6859cc0cf730f0780b90700447d91238a618c05465420960d07aa894abaf05e4,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} 
BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-rq5ls,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod ironic-operator-controller-manager-6494cdbf8f-2r2t2_openstack-operators(142a17bc-42dd-41ab-a97c-21350948ca5d): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Feb 26 16:00:38 crc kubenswrapper[4907]: E0226 16:00:38.005317 4907 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" 
pod="openstack-operators/ironic-operator-controller-manager-6494cdbf8f-2r2t2" podUID="142a17bc-42dd-41ab-a97c-21350948ca5d" Feb 26 16:00:38 crc kubenswrapper[4907]: E0226 16:00:38.306220 4907 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/ironic-operator@sha256:6859cc0cf730f0780b90700447d91238a618c05465420960d07aa894abaf05e4\\\"\"" pod="openstack-operators/ironic-operator-controller-manager-6494cdbf8f-2r2t2" podUID="142a17bc-42dd-41ab-a97c-21350948ca5d" Feb 26 16:00:42 crc kubenswrapper[4907]: E0226 16:00:42.628250 4907 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/openstack-k8s-operators/designate-operator@sha256:5007f87a2869468db06d6257c17e389b587a095a087466c69c0c92328e699546" Feb 26 16:00:42 crc kubenswrapper[4907]: E0226 16:00:42.628820 4907 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/designate-operator@sha256:5007f87a2869468db06d6257c17e389b587a095a087466c69c0c92328e699546,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} 
BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-g6ztw,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod designate-operator-controller-manager-55cc45767f-nxx6j_openstack-operators(41934925-b8e2-4927-a9a6-07defdda378c): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Feb 26 16:00:42 crc kubenswrapper[4907]: E0226 16:00:42.630405 4907 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" 
pod="openstack-operators/designate-operator-controller-manager-55cc45767f-nxx6j" podUID="41934925-b8e2-4927-a9a6-07defdda378c" Feb 26 16:00:43 crc kubenswrapper[4907]: E0226 16:00:43.261979 4907 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/openstack-k8s-operators/heat-operator@sha256:afb68925f208ca401020ca8b7812de075a77dafe3dc30fae5c095dcbe5acbc8a" Feb 26 16:00:43 crc kubenswrapper[4907]: E0226 16:00:43.262188 4907 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/heat-operator@sha256:afb68925f208ca401020ca8b7812de075a77dafe3dc30fae5c095dcbe5acbc8a,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-mk225,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 
},Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod heat-operator-controller-manager-9595d6797-m4jb4_openstack-operators(44c123c9-ac46-4afe-b6d8-773f70ecc033): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Feb 26 16:00:43 crc kubenswrapper[4907]: E0226 16:00:43.263316 4907 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack-operators/heat-operator-controller-manager-9595d6797-m4jb4" podUID="44c123c9-ac46-4afe-b6d8-773f70ecc033" Feb 26 16:00:43 crc kubenswrapper[4907]: E0226 16:00:43.340246 4907 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image 
\\\"quay.io/openstack-k8s-operators/designate-operator@sha256:5007f87a2869468db06d6257c17e389b587a095a087466c69c0c92328e699546\\\"\"" pod="openstack-operators/designate-operator-controller-manager-55cc45767f-nxx6j" podUID="41934925-b8e2-4927-a9a6-07defdda378c" Feb 26 16:00:43 crc kubenswrapper[4907]: E0226 16:00:43.340260 4907 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/heat-operator@sha256:afb68925f208ca401020ca8b7812de075a77dafe3dc30fae5c095dcbe5acbc8a\\\"\"" pod="openstack-operators/heat-operator-controller-manager-9595d6797-m4jb4" podUID="44c123c9-ac46-4afe-b6d8-773f70ecc033" Feb 26 16:00:43 crc kubenswrapper[4907]: E0226 16:00:43.753730 4907 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/openstack-k8s-operators/swift-operator@sha256:015f7f2d8b5afc85e51dd3b2e02a4cfb8294b543437315b291006d2416764db9" Feb 26 16:00:43 crc kubenswrapper[4907]: E0226 16:00:43.753902 4907 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/swift-operator@sha256:015f7f2d8b5afc85e51dd3b2e02a4cfb8294b543437315b291006d2416764db9,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} 
BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-h29md,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000660000,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod swift-operator-controller-manager-79558bbfbf-g2mlp_openstack-operators(311f46b9-23bf-49b6-a2a5-919c8e42c62a): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Feb 26 16:00:43 crc kubenswrapper[4907]: E0226 16:00:43.759977 4907 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" 
pod="openstack-operators/swift-operator-controller-manager-79558bbfbf-g2mlp" podUID="311f46b9-23bf-49b6-a2a5-919c8e42c62a" Feb 26 16:00:44 crc kubenswrapper[4907]: E0226 16:00:44.350225 4907 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/swift-operator@sha256:015f7f2d8b5afc85e51dd3b2e02a4cfb8294b543437315b291006d2416764db9\\\"\"" pod="openstack-operators/swift-operator-controller-manager-79558bbfbf-g2mlp" podUID="311f46b9-23bf-49b6-a2a5-919c8e42c62a" Feb 26 16:00:45 crc kubenswrapper[4907]: E0226 16:00:45.660575 4907 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/openstack-k8s-operators/ovn-operator@sha256:4d3b6d259005ea30eee9c134d5fdf3d67eaacad8568ed105a34674e510086816" Feb 26 16:00:45 crc kubenswrapper[4907]: E0226 16:00:45.661125 4907 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/ovn-operator@sha256:4d3b6d259005ea30eee9c134d5fdf3d67eaacad8568ed105a34674e510086816,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} 
BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-f8bjd,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod ovn-operator-controller-manager-85c99d655-t27pd_openstack-operators(51842918-6f0f-4599-b288-84c75e4390ad): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Feb 26 16:00:45 crc kubenswrapper[4907]: E0226 16:00:45.662322 4907 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" 
pod="openstack-operators/ovn-operator-controller-manager-85c99d655-t27pd" podUID="51842918-6f0f-4599-b288-84c75e4390ad" Feb 26 16:00:46 crc kubenswrapper[4907]: E0226 16:00:46.375938 4907 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/ovn-operator@sha256:4d3b6d259005ea30eee9c134d5fdf3d67eaacad8568ed105a34674e510086816\\\"\"" pod="openstack-operators/ovn-operator-controller-manager-85c99d655-t27pd" podUID="51842918-6f0f-4599-b288-84c75e4390ad" Feb 26 16:00:47 crc kubenswrapper[4907]: E0226 16:00:47.956177 4907 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/openstack-k8s-operators/rabbitmq-cluster-operator@sha256:893e66303c1b0bc1d00a299a3f0380bad55c8dc813c8a1c6a4aab379f5aa12a2" Feb 26 16:00:47 crc kubenswrapper[4907]: E0226 16:00:47.956709 4907 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:operator,Image:quay.io/openstack-k8s-operators/rabbitmq-cluster-operator@sha256:893e66303c1b0bc1d00a299a3f0380bad55c8dc813c8a1c6a4aab379f5aa12a2,Command:[/manager],Args:[],WorkingDir:,Ports:[]ContainerPort{ContainerPort{Name:metrics,HostPort:0,ContainerPort:9782,Protocol:TCP,HostIP:,},},Env:[]EnvVar{EnvVar{Name:OPERATOR_NAMESPACE,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:metadata.namespace,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{200 -3} {} 200m DecimalSI},memory: {{524288000 0} {} 500Mi BinarySI},},Requests:ResourceList{cpu: {{5 -3} {} 5m 
DecimalSI},memory: {{67108864 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-hqpls,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000660000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod rabbitmq-cluster-operator-manager-668c99d594-psxq9_openstack-operators(876cfd39-7856-438c-923e-1eb89fae62b0): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Feb 26 16:00:47 crc kubenswrapper[4907]: E0226 16:00:47.959559 4907 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"operator\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-psxq9" podUID="876cfd39-7856-438c-923e-1eb89fae62b0" Feb 26 16:00:48 crc kubenswrapper[4907]: E0226 16:00:48.389189 4907 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"operator\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/rabbitmq-cluster-operator@sha256:893e66303c1b0bc1d00a299a3f0380bad55c8dc813c8a1c6a4aab379f5aa12a2\\\"\"" pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-psxq9" 
podUID="876cfd39-7856-438c-923e-1eb89fae62b0" Feb 26 16:00:50 crc kubenswrapper[4907]: I0226 16:00:50.461580 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/e8f0195b-740f-4219-a422-9b99f2841ee5-webhook-certs\") pod \"openstack-operator-controller-manager-5c89c59655-dbrxn\" (UID: \"e8f0195b-740f-4219-a422-9b99f2841ee5\") " pod="openstack-operators/openstack-operator-controller-manager-5c89c59655-dbrxn" Feb 26 16:00:50 crc kubenswrapper[4907]: I0226 16:00:50.482666 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/e8f0195b-740f-4219-a422-9b99f2841ee5-webhook-certs\") pod \"openstack-operator-controller-manager-5c89c59655-dbrxn\" (UID: \"e8f0195b-740f-4219-a422-9b99f2841ee5\") " pod="openstack-operators/openstack-operator-controller-manager-5c89c59655-dbrxn" Feb 26 16:00:50 crc kubenswrapper[4907]: I0226 16:00:50.569946 4907 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/openstack-operator-controller-manager-5c89c59655-dbrxn" Feb 26 16:00:54 crc kubenswrapper[4907]: E0226 16:00:54.780187 4907 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/openstack-k8s-operators/manila-operator@sha256:a5396a8d7e5ca6ddabfa92744f0d4adab9de0bbe712e8cdab1bf13576b7ac8c8" Feb 26 16:00:54 crc kubenswrapper[4907]: E0226 16:00:54.780983 4907 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/manila-operator@sha256:a5396a8d7e5ca6ddabfa92744f0d4adab9de0bbe712e8cdab1bf13576b7ac8c8,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-z8gc5,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 
},Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod manila-operator-controller-manager-76fd76856-pk8zs_openstack-operators(9fb09a9c-025a-4bc0-81a0-c127fee3f6f3): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Feb 26 16:00:54 crc kubenswrapper[4907]: E0226 16:00:54.782231 4907 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack-operators/manila-operator-controller-manager-76fd76856-pk8zs" podUID="9fb09a9c-025a-4bc0-81a0-c127fee3f6f3" Feb 26 16:00:55 crc kubenswrapper[4907]: I0226 16:00:55.591200 4907 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-baremetal-operator-controller-manager-c5677dc5d-mmllt"] Feb 26 16:00:55 crc kubenswrapper[4907]: W0226 16:00:55.646573 4907 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod1bcfd62b_212e_4efc_b0be_f0542e186f07.slice/crio-7b5d99312916249096de6221672cfbcab66233b15295b1bf3924e32b1c65bb2a WatchSource:0}: Error finding container 
7b5d99312916249096de6221672cfbcab66233b15295b1bf3924e32b1c65bb2a: Status 404 returned error can't find the container with id 7b5d99312916249096de6221672cfbcab66233b15295b1bf3924e32b1c65bb2a Feb 26 16:00:55 crc kubenswrapper[4907]: I0226 16:00:55.695818 4907 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/infra-operator-controller-manager-66d6b5f488-g7cb4"] Feb 26 16:00:55 crc kubenswrapper[4907]: W0226 16:00:55.742297 4907 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod13df9f9f_0740_41d3_b193_0517c76d2830.slice/crio-6f0ba93236f4b23b0f5e5e44975967696ff90b7c9e9c7b9f7e78743c76abb91e WatchSource:0}: Error finding container 6f0ba93236f4b23b0f5e5e44975967696ff90b7c9e9c7b9f7e78743c76abb91e: Status 404 returned error can't find the container with id 6f0ba93236f4b23b0f5e5e44975967696ff90b7c9e9c7b9f7e78743c76abb91e Feb 26 16:00:56 crc kubenswrapper[4907]: I0226 16:00:56.020205 4907 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-controller-manager-5c89c59655-dbrxn"] Feb 26 16:00:56 crc kubenswrapper[4907]: W0226 16:00:56.104755 4907 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pode8f0195b_740f_4219_a422_9b99f2841ee5.slice/crio-a9e61f8bdbbfa90871c09a9625f8c691990914c196f9f8ad41a02296664872f3 WatchSource:0}: Error finding container a9e61f8bdbbfa90871c09a9625f8c691990914c196f9f8ad41a02296664872f3: Status 404 returned error can't find the container with id a9e61f8bdbbfa90871c09a9625f8c691990914c196f9f8ad41a02296664872f3 Feb 26 16:00:56 crc kubenswrapper[4907]: I0226 16:00:56.461180 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/nova-operator-controller-manager-5d56fd956f-6znnd" 
event={"ID":"9a69dc6a-4034-4e7d-8b6f-576ccd828cf6","Type":"ContainerStarted","Data":"ab317770ece2df30b13092229261ee2b73d4b1c00dacf7a9bc04143a59bf9c36"} Feb 26 16:00:56 crc kubenswrapper[4907]: I0226 16:00:56.472788 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/glance-operator-controller-manager-7f748f8b74-q55xl" event={"ID":"e57bde5d-eca0-458a-af67-2f45ce85c54f","Type":"ContainerStarted","Data":"f9bb900505e124bedd97837979f95370ffe0bf0db3a8e0455ebf676bc6f3e76c"} Feb 26 16:00:56 crc kubenswrapper[4907]: I0226 16:00:56.472862 4907 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/glance-operator-controller-manager-7f748f8b74-q55xl" Feb 26 16:00:56 crc kubenswrapper[4907]: I0226 16:00:56.485145 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/keystone-operator-controller-manager-6c78d668d5-245bf" event={"ID":"f7c1fe7a-3983-49ff-bcde-36338aadc657","Type":"ContainerStarted","Data":"1b8fa7786772f42e0f4f4148b13dfc4beb23d6a9df1fde955604e6a33c741fd0"} Feb 26 16:00:56 crc kubenswrapper[4907]: I0226 16:00:56.485913 4907 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/keystone-operator-controller-manager-6c78d668d5-245bf" Feb 26 16:00:56 crc kubenswrapper[4907]: I0226 16:00:56.491504 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/watcher-operator-controller-manager-76bcb69745-v2z8v" event={"ID":"edeb6783-da9a-4f17-8ebe-e234aeeb35fd","Type":"ContainerStarted","Data":"7ad118e4dcc33797350d55cfbedc98d418bc21d82bfd096ee44a3d92bca89207"} Feb 26 16:00:56 crc kubenswrapper[4907]: I0226 16:00:56.492174 4907 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/watcher-operator-controller-manager-76bcb69745-v2z8v" Feb 26 16:00:56 crc kubenswrapper[4907]: I0226 16:00:56.494014 4907 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="openstack-operators/glance-operator-controller-manager-7f748f8b74-q55xl" podStartSLOduration=15.273399181 podStartE2EDuration="39.494001272s" podCreationTimestamp="2026-02-26 16:00:17 +0000 UTC" firstStartedPulling="2026-02-26 16:00:19.031642605 +0000 UTC m=+1081.550204454" lastFinishedPulling="2026-02-26 16:00:43.252244706 +0000 UTC m=+1105.770806545" observedRunningTime="2026-02-26 16:00:56.490130795 +0000 UTC m=+1119.008692644" watchObservedRunningTime="2026-02-26 16:00:56.494001272 +0000 UTC m=+1119.012563131" Feb 26 16:00:56 crc kubenswrapper[4907]: I0226 16:00:56.535924 4907 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/keystone-operator-controller-manager-6c78d668d5-245bf" podStartSLOduration=11.247956689 podStartE2EDuration="39.535904951s" podCreationTimestamp="2026-02-26 16:00:17 +0000 UTC" firstStartedPulling="2026-02-26 16:00:19.266805901 +0000 UTC m=+1081.785367750" lastFinishedPulling="2026-02-26 16:00:47.554754163 +0000 UTC m=+1110.073316012" observedRunningTime="2026-02-26 16:00:56.519470103 +0000 UTC m=+1119.038031952" watchObservedRunningTime="2026-02-26 16:00:56.535904951 +0000 UTC m=+1119.054466800" Feb 26 16:00:56 crc kubenswrapper[4907]: I0226 16:00:56.567863 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/cinder-operator-controller-manager-768c8b45bb-k4hzr" event={"ID":"a9988ddc-f970-4dac-bcd0-92266f0c7494","Type":"ContainerStarted","Data":"1fcff608228d8f1d966bdf177b91c31e9d7fab77cf09d34bbdd9ff27781ac6d3"} Feb 26 16:00:56 crc kubenswrapper[4907]: I0226 16:00:56.568511 4907 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/cinder-operator-controller-manager-768c8b45bb-k4hzr" Feb 26 16:00:56 crc kubenswrapper[4907]: I0226 16:00:56.575709 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/octavia-operator-controller-manager-77b8b67585-x8222" 
event={"ID":"5dac7dc1-cf0e-4962-956e-800b57e369e1","Type":"ContainerStarted","Data":"e881597fa20ae8be3612f02d23ac4dce3db234c6f96841a440c72e24e9fa2de1"} Feb 26 16:00:56 crc kubenswrapper[4907]: I0226 16:00:56.576516 4907 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/octavia-operator-controller-manager-77b8b67585-x8222" Feb 26 16:00:56 crc kubenswrapper[4907]: I0226 16:00:56.584152 4907 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/watcher-operator-controller-manager-76bcb69745-v2z8v" podStartSLOduration=4.011994059 podStartE2EDuration="39.584123869s" podCreationTimestamp="2026-02-26 16:00:17 +0000 UTC" firstStartedPulling="2026-02-26 16:00:19.707233581 +0000 UTC m=+1082.225795430" lastFinishedPulling="2026-02-26 16:00:55.279363391 +0000 UTC m=+1117.797925240" observedRunningTime="2026-02-26 16:00:56.582100278 +0000 UTC m=+1119.100662127" watchObservedRunningTime="2026-02-26 16:00:56.584123869 +0000 UTC m=+1119.102685718" Feb 26 16:00:56 crc kubenswrapper[4907]: I0226 16:00:56.587142 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-baremetal-operator-controller-manager-c5677dc5d-mmllt" event={"ID":"1bcfd62b-212e-4efc-b0be-f0542e186f07","Type":"ContainerStarted","Data":"7b5d99312916249096de6221672cfbcab66233b15295b1bf3924e32b1c65bb2a"} Feb 26 16:00:56 crc kubenswrapper[4907]: I0226 16:00:56.589105 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-controller-manager-5c89c59655-dbrxn" event={"ID":"e8f0195b-740f-4219-a422-9b99f2841ee5","Type":"ContainerStarted","Data":"a9e61f8bdbbfa90871c09a9625f8c691990914c196f9f8ad41a02296664872f3"} Feb 26 16:00:56 crc kubenswrapper[4907]: I0226 16:00:56.589339 4907 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/openstack-operator-controller-manager-5c89c59655-dbrxn" Feb 26 16:00:56 crc kubenswrapper[4907]: I0226 
16:00:56.601017 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/telemetry-operator-controller-manager-56dc67d744-w7qpb" event={"ID":"2c9290e8-c587-48aa-8ea2-66b772c9341c","Type":"ContainerStarted","Data":"2d2a81a32292a54d0608995242cc98d5a9dc28275e186f75a1a4133b4a6cf84b"} Feb 26 16:00:56 crc kubenswrapper[4907]: I0226 16:00:56.601870 4907 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/telemetry-operator-controller-manager-56dc67d744-w7qpb" Feb 26 16:00:56 crc kubenswrapper[4907]: I0226 16:00:56.613847 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/placement-operator-controller-manager-57bd55f9b7-mxbcg" event={"ID":"1bba2156-1275-4aa3-8eba-3ce7c3c85d72","Type":"ContainerStarted","Data":"bc4696b6bf831fc0e3b2e7c5aa2ad521a0c5b64114f8b6af4a5d20a7e967e01c"} Feb 26 16:00:56 crc kubenswrapper[4907]: I0226 16:00:56.614688 4907 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/placement-operator-controller-manager-57bd55f9b7-mxbcg" Feb 26 16:00:56 crc kubenswrapper[4907]: I0226 16:00:56.617022 4907 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/cinder-operator-controller-manager-768c8b45bb-k4hzr" podStartSLOduration=13.050210392 podStartE2EDuration="39.617006854s" podCreationTimestamp="2026-02-26 16:00:17 +0000 UTC" firstStartedPulling="2026-02-26 16:00:19.07253515 +0000 UTC m=+1081.591097009" lastFinishedPulling="2026-02-26 16:00:45.639331582 +0000 UTC m=+1108.157893471" observedRunningTime="2026-02-26 16:00:56.614918843 +0000 UTC m=+1119.133480682" watchObservedRunningTime="2026-02-26 16:00:56.617006854 +0000 UTC m=+1119.135568703" Feb 26 16:00:56 crc kubenswrapper[4907]: I0226 16:00:56.632904 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/mariadb-operator-controller-manager-6dc9b6ff89-vtc25" 
event={"ID":"3c5efb12-7704-4d2a-9ea6-aa35436391ae","Type":"ContainerStarted","Data":"ebd7ffac76b6b4e62ae1e21b7c5f0d32d8775f250f76a9063b35f4cd21b9a994"} Feb 26 16:00:56 crc kubenswrapper[4907]: I0226 16:00:56.633377 4907 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/mariadb-operator-controller-manager-6dc9b6ff89-vtc25" Feb 26 16:00:56 crc kubenswrapper[4907]: I0226 16:00:56.645806 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/ironic-operator-controller-manager-6494cdbf8f-2r2t2" event={"ID":"142a17bc-42dd-41ab-a97c-21350948ca5d","Type":"ContainerStarted","Data":"c9bc7a36060f0b4d438b0fae1a1aaea2e484022bd3ad700dd9b192f53833daec"} Feb 26 16:00:56 crc kubenswrapper[4907]: I0226 16:00:56.646780 4907 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/ironic-operator-controller-manager-6494cdbf8f-2r2t2" Feb 26 16:00:56 crc kubenswrapper[4907]: I0226 16:00:56.660240 4907 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/octavia-operator-controller-manager-77b8b67585-x8222" podStartSLOduration=3.48818623 podStartE2EDuration="39.660220827s" podCreationTimestamp="2026-02-26 16:00:17 +0000 UTC" firstStartedPulling="2026-02-26 16:00:19.503122495 +0000 UTC m=+1082.021684344" lastFinishedPulling="2026-02-26 16:00:55.675157092 +0000 UTC m=+1118.193718941" observedRunningTime="2026-02-26 16:00:56.657743915 +0000 UTC m=+1119.176305764" watchObservedRunningTime="2026-02-26 16:00:56.660220827 +0000 UTC m=+1119.178782676" Feb 26 16:00:56 crc kubenswrapper[4907]: I0226 16:00:56.665284 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/barbican-operator-controller-manager-c4b7d6946-58hjs" event={"ID":"7fc27253-f8a7-4b6c-b83a-d32cdadb162d","Type":"ContainerStarted","Data":"6c4709122f71993b1b9838da0e01177f9b28f7b1f05b244629ee5a9ea242508e"} Feb 26 16:00:56 crc kubenswrapper[4907]: I0226 16:00:56.665960 4907 
kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/barbican-operator-controller-manager-c4b7d6946-58hjs" Feb 26 16:00:56 crc kubenswrapper[4907]: I0226 16:00:56.685369 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/neutron-operator-controller-manager-54967dbbdf-24rjt" event={"ID":"25c72e04-6714-4c5b-a273-a21a1415c4ac","Type":"ContainerStarted","Data":"a287068617cb1ae55ee33e881925a3252aa012c6affaa557f8105e8262611955"} Feb 26 16:00:56 crc kubenswrapper[4907]: I0226 16:00:56.686014 4907 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/neutron-operator-controller-manager-54967dbbdf-24rjt" Feb 26 16:00:56 crc kubenswrapper[4907]: I0226 16:00:56.706239 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/horizon-operator-controller-manager-54fb488b88-6hchw" event={"ID":"ac6b0a27-6eaf-4d88-af65-94c64180c950","Type":"ContainerStarted","Data":"efac1304ba6e509eccc8c681ce62b1728e4119b80bd41541e146b3d18a975d20"} Feb 26 16:00:56 crc kubenswrapper[4907]: I0226 16:00:56.706860 4907 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/horizon-operator-controller-manager-54fb488b88-6hchw" Feb 26 16:00:56 crc kubenswrapper[4907]: I0226 16:00:56.707933 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/infra-operator-controller-manager-66d6b5f488-g7cb4" event={"ID":"13df9f9f-0740-41d3-b193-0517c76d2830","Type":"ContainerStarted","Data":"6f0ba93236f4b23b0f5e5e44975967696ff90b7c9e9c7b9f7e78743c76abb91e"} Feb 26 16:00:56 crc kubenswrapper[4907]: I0226 16:00:56.743971 4907 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/openstack-operator-controller-manager-5c89c59655-dbrxn" podStartSLOduration=38.743949174 podStartE2EDuration="38.743949174s" podCreationTimestamp="2026-02-26 16:00:18 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 
UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-26 16:00:56.725092796 +0000 UTC m=+1119.243654645" watchObservedRunningTime="2026-02-26 16:00:56.743949174 +0000 UTC m=+1119.262511023" Feb 26 16:00:56 crc kubenswrapper[4907]: I0226 16:00:56.794387 4907 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/ironic-operator-controller-manager-6494cdbf8f-2r2t2" podStartSLOduration=3.564283649 podStartE2EDuration="39.794364826s" podCreationTimestamp="2026-02-26 16:00:17 +0000 UTC" firstStartedPulling="2026-02-26 16:00:19.287497575 +0000 UTC m=+1081.806059424" lastFinishedPulling="2026-02-26 16:00:55.517578752 +0000 UTC m=+1118.036140601" observedRunningTime="2026-02-26 16:00:56.78566434 +0000 UTC m=+1119.304226199" watchObservedRunningTime="2026-02-26 16:00:56.794364826 +0000 UTC m=+1119.312926675" Feb 26 16:00:56 crc kubenswrapper[4907]: I0226 16:00:56.845079 4907 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/placement-operator-controller-manager-57bd55f9b7-mxbcg" podStartSLOduration=4.095449188 podStartE2EDuration="39.845058943s" podCreationTimestamp="2026-02-26 16:00:17 +0000 UTC" firstStartedPulling="2026-02-26 16:00:19.55368552 +0000 UTC m=+1082.072247369" lastFinishedPulling="2026-02-26 16:00:55.303295275 +0000 UTC m=+1117.821857124" observedRunningTime="2026-02-26 16:00:56.839106346 +0000 UTC m=+1119.357668185" watchObservedRunningTime="2026-02-26 16:00:56.845058943 +0000 UTC m=+1119.363620782" Feb 26 16:00:56 crc kubenswrapper[4907]: I0226 16:00:56.898278 4907 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/telemetry-operator-controller-manager-56dc67d744-w7qpb" podStartSLOduration=4.314788782 podStartE2EDuration="39.898261644s" podCreationTimestamp="2026-02-26 16:00:17 +0000 UTC" firstStartedPulling="2026-02-26 16:00:19.710328157 +0000 UTC m=+1082.228890006" lastFinishedPulling="2026-02-26 
16:00:55.293801019 +0000 UTC m=+1117.812362868" observedRunningTime="2026-02-26 16:00:56.894777517 +0000 UTC m=+1119.413339366" watchObservedRunningTime="2026-02-26 16:00:56.898261644 +0000 UTC m=+1119.416823493" Feb 26 16:00:57 crc kubenswrapper[4907]: I0226 16:00:57.022581 4907 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/mariadb-operator-controller-manager-6dc9b6ff89-vtc25" podStartSLOduration=11.962372898 podStartE2EDuration="40.022565648s" podCreationTimestamp="2026-02-26 16:00:17 +0000 UTC" firstStartedPulling="2026-02-26 16:00:19.493518327 +0000 UTC m=+1082.012080186" lastFinishedPulling="2026-02-26 16:00:47.553711087 +0000 UTC m=+1110.072272936" observedRunningTime="2026-02-26 16:00:56.959545765 +0000 UTC m=+1119.478107614" watchObservedRunningTime="2026-02-26 16:00:57.022565648 +0000 UTC m=+1119.541127497" Feb 26 16:00:57 crc kubenswrapper[4907]: I0226 16:00:57.025276 4907 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/horizon-operator-controller-manager-54fb488b88-6hchw" podStartSLOduration=3.7943820969999997 podStartE2EDuration="40.025262975s" podCreationTimestamp="2026-02-26 16:00:17 +0000 UTC" firstStartedPulling="2026-02-26 16:00:19.157726604 +0000 UTC m=+1081.676288453" lastFinishedPulling="2026-02-26 16:00:55.388607482 +0000 UTC m=+1117.907169331" observedRunningTime="2026-02-26 16:00:57.017604475 +0000 UTC m=+1119.536166324" watchObservedRunningTime="2026-02-26 16:00:57.025262975 +0000 UTC m=+1119.543824824" Feb 26 16:00:57 crc kubenswrapper[4907]: I0226 16:00:57.073958 4907 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/barbican-operator-controller-manager-c4b7d6946-58hjs" podStartSLOduration=11.617435159 podStartE2EDuration="40.073939323s" podCreationTimestamp="2026-02-26 16:00:17 +0000 UTC" firstStartedPulling="2026-02-26 16:00:19.097717575 +0000 UTC m=+1081.616279424" lastFinishedPulling="2026-02-26 
16:00:47.554221739 +0000 UTC m=+1110.072783588" observedRunningTime="2026-02-26 16:00:57.067638267 +0000 UTC m=+1119.586200116" watchObservedRunningTime="2026-02-26 16:00:57.073939323 +0000 UTC m=+1119.592501172" Feb 26 16:00:57 crc kubenswrapper[4907]: I0226 16:00:57.135176 4907 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/neutron-operator-controller-manager-54967dbbdf-24rjt" podStartSLOduration=4.399510934 podStartE2EDuration="40.135155922s" podCreationTimestamp="2026-02-26 16:00:17 +0000 UTC" firstStartedPulling="2026-02-26 16:00:19.549063906 +0000 UTC m=+1082.067625755" lastFinishedPulling="2026-02-26 16:00:55.284708894 +0000 UTC m=+1117.803270743" observedRunningTime="2026-02-26 16:00:57.12096288 +0000 UTC m=+1119.639524729" watchObservedRunningTime="2026-02-26 16:00:57.135155922 +0000 UTC m=+1119.653717771" Feb 26 16:00:57 crc kubenswrapper[4907]: I0226 16:00:57.724777 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/designate-operator-controller-manager-55cc45767f-nxx6j" event={"ID":"41934925-b8e2-4927-a9a6-07defdda378c","Type":"ContainerStarted","Data":"3e9d2f750a787a5d6dfb345402fe080bb0651f143c793047e137eb8ed12a179f"} Feb 26 16:00:57 crc kubenswrapper[4907]: I0226 16:00:57.725204 4907 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/designate-operator-controller-manager-55cc45767f-nxx6j" Feb 26 16:00:57 crc kubenswrapper[4907]: I0226 16:00:57.730905 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/heat-operator-controller-manager-9595d6797-m4jb4" event={"ID":"44c123c9-ac46-4afe-b6d8-773f70ecc033","Type":"ContainerStarted","Data":"97c177f34cd4f4d3e062773458d05aa2ce5a4e01525b67ecec9b9ef33b321fef"} Feb 26 16:00:57 crc kubenswrapper[4907]: I0226 16:00:57.731558 4907 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/heat-operator-controller-manager-9595d6797-m4jb4" Feb 26 
16:00:57 crc kubenswrapper[4907]: I0226 16:00:57.737073 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/test-operator-controller-manager-8467ccb4c8-r6ndg" event={"ID":"872f261b-cbf5-47b6-99ce-ee5c0d9794a3","Type":"ContainerStarted","Data":"0ca330d7615a4b7786ff07356ffa4e7be8ddc8ec9ecd78a42196b65c94d2b9b1"} Feb 26 16:00:57 crc kubenswrapper[4907]: I0226 16:00:57.737368 4907 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/test-operator-controller-manager-8467ccb4c8-r6ndg" Feb 26 16:00:57 crc kubenswrapper[4907]: I0226 16:00:57.743939 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-controller-manager-5c89c59655-dbrxn" event={"ID":"e8f0195b-740f-4219-a422-9b99f2841ee5","Type":"ContainerStarted","Data":"d7985abea75e545dc5cc5dfe4aff716d6dbd454e58f786b6f365857dd5734d73"} Feb 26 16:00:57 crc kubenswrapper[4907]: I0226 16:00:57.758844 4907 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/designate-operator-controller-manager-55cc45767f-nxx6j" podStartSLOduration=3.876823155 podStartE2EDuration="40.758821829s" podCreationTimestamp="2026-02-26 16:00:17 +0000 UTC" firstStartedPulling="2026-02-26 16:00:19.065885326 +0000 UTC m=+1081.584447165" lastFinishedPulling="2026-02-26 16:00:55.94788398 +0000 UTC m=+1118.466445839" observedRunningTime="2026-02-26 16:00:57.750006539 +0000 UTC m=+1120.268568388" watchObservedRunningTime="2026-02-26 16:00:57.758821829 +0000 UTC m=+1120.277383688" Feb 26 16:00:57 crc kubenswrapper[4907]: I0226 16:00:57.803973 4907 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/nova-operator-controller-manager-5d56fd956f-6znnd" podStartSLOduration=5.076166926 podStartE2EDuration="40.803958329s" podCreationTimestamp="2026-02-26 16:00:17 +0000 UTC" firstStartedPulling="2026-02-26 16:00:19.578188628 +0000 UTC m=+1082.096750477" 
lastFinishedPulling="2026-02-26 16:00:55.305980031 +0000 UTC m=+1117.824541880" observedRunningTime="2026-02-26 16:00:57.802193635 +0000 UTC m=+1120.320755494" watchObservedRunningTime="2026-02-26 16:00:57.803958329 +0000 UTC m=+1120.322520178" Feb 26 16:00:57 crc kubenswrapper[4907]: I0226 16:00:57.805859 4907 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/heat-operator-controller-manager-9595d6797-m4jb4" podStartSLOduration=3.793369453 podStartE2EDuration="40.805847605s" podCreationTimestamp="2026-02-26 16:00:17 +0000 UTC" firstStartedPulling="2026-02-26 16:00:19.103743015 +0000 UTC m=+1081.622304854" lastFinishedPulling="2026-02-26 16:00:56.116221157 +0000 UTC m=+1118.634783006" observedRunningTime="2026-02-26 16:00:57.774900147 +0000 UTC m=+1120.293461996" watchObservedRunningTime="2026-02-26 16:00:57.805847605 +0000 UTC m=+1120.324409454" Feb 26 16:00:57 crc kubenswrapper[4907]: I0226 16:00:57.833007 4907 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/test-operator-controller-manager-8467ccb4c8-r6ndg" podStartSLOduration=5.224507917 podStartE2EDuration="40.832981839s" podCreationTimestamp="2026-02-26 16:00:17 +0000 UTC" firstStartedPulling="2026-02-26 16:00:19.692959676 +0000 UTC m=+1082.211521525" lastFinishedPulling="2026-02-26 16:00:55.301433598 +0000 UTC m=+1117.819995447" observedRunningTime="2026-02-26 16:00:57.826217481 +0000 UTC m=+1120.344779330" watchObservedRunningTime="2026-02-26 16:00:57.832981839 +0000 UTC m=+1120.351543688" Feb 26 16:00:57 crc kubenswrapper[4907]: I0226 16:00:57.915405 4907 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/nova-operator-controller-manager-5d56fd956f-6znnd" Feb 26 16:00:58 crc kubenswrapper[4907]: I0226 16:00:58.771865 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/swift-operator-controller-manager-79558bbfbf-g2mlp" 
event={"ID":"311f46b9-23bf-49b6-a2a5-919c8e42c62a","Type":"ContainerStarted","Data":"7c23ef5ad23fd227bd5ab022ff6690da063c8e3ed41938b884ebe6e99fa76308"} Feb 26 16:00:58 crc kubenswrapper[4907]: I0226 16:00:58.775478 4907 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/swift-operator-controller-manager-79558bbfbf-g2mlp" Feb 26 16:00:58 crc kubenswrapper[4907]: I0226 16:00:58.799921 4907 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/swift-operator-controller-manager-79558bbfbf-g2mlp" podStartSLOduration=3.541300768 podStartE2EDuration="41.799906203s" podCreationTimestamp="2026-02-26 16:00:17 +0000 UTC" firstStartedPulling="2026-02-26 16:00:19.502866739 +0000 UTC m=+1082.021428588" lastFinishedPulling="2026-02-26 16:00:57.761472174 +0000 UTC m=+1120.280034023" observedRunningTime="2026-02-26 16:00:58.794202301 +0000 UTC m=+1121.312764150" watchObservedRunningTime="2026-02-26 16:00:58.799906203 +0000 UTC m=+1121.318468052" Feb 26 16:00:59 crc kubenswrapper[4907]: I0226 16:00:59.798753 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/ovn-operator-controller-manager-85c99d655-t27pd" event={"ID":"51842918-6f0f-4599-b288-84c75e4390ad","Type":"ContainerStarted","Data":"30535fe29d0acd5f329b9314f622e4bcebb30add00cf37570ebdbee6b903c1cf"} Feb 26 16:00:59 crc kubenswrapper[4907]: I0226 16:00:59.799095 4907 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/ovn-operator-controller-manager-85c99d655-t27pd" Feb 26 16:00:59 crc kubenswrapper[4907]: I0226 16:00:59.823440 4907 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/ovn-operator-controller-manager-85c99d655-t27pd" podStartSLOduration=3.4161099 podStartE2EDuration="42.823423601s" podCreationTimestamp="2026-02-26 16:00:17 +0000 UTC" firstStartedPulling="2026-02-26 16:00:19.293735549 +0000 UTC m=+1081.812297398" 
lastFinishedPulling="2026-02-26 16:00:58.70104925 +0000 UTC m=+1121.219611099" observedRunningTime="2026-02-26 16:00:59.820412277 +0000 UTC m=+1122.338974136" watchObservedRunningTime="2026-02-26 16:00:59.823423601 +0000 UTC m=+1122.341985450" Feb 26 16:01:03 crc kubenswrapper[4907]: I0226 16:01:03.827694 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-psxq9" event={"ID":"876cfd39-7856-438c-923e-1eb89fae62b0","Type":"ContainerStarted","Data":"517d55b2513111ce9ffcc76e8a77386294762567c5236547188cce36e089de07"} Feb 26 16:01:03 crc kubenswrapper[4907]: I0226 16:01:03.830771 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-baremetal-operator-controller-manager-c5677dc5d-mmllt" event={"ID":"1bcfd62b-212e-4efc-b0be-f0542e186f07","Type":"ContainerStarted","Data":"7e35e269c3f8ce93196a378b42dd84d9f20bf554f79cc5f1f88a003cf6853138"} Feb 26 16:01:03 crc kubenswrapper[4907]: I0226 16:01:03.830875 4907 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/openstack-baremetal-operator-controller-manager-c5677dc5d-mmllt" Feb 26 16:01:03 crc kubenswrapper[4907]: I0226 16:01:03.832206 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/infra-operator-controller-manager-66d6b5f488-g7cb4" event={"ID":"13df9f9f-0740-41d3-b193-0517c76d2830","Type":"ContainerStarted","Data":"c40c6311e3af6782131fe368e81370dc5a8fd556bbb5ca8a3c961ac2df9477cf"} Feb 26 16:01:03 crc kubenswrapper[4907]: I0226 16:01:03.832356 4907 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/infra-operator-controller-manager-66d6b5f488-g7cb4" Feb 26 16:01:03 crc kubenswrapper[4907]: I0226 16:01:03.845648 4907 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-psxq9" podStartSLOduration=1.8914860020000002 
podStartE2EDuration="45.845626603s" podCreationTimestamp="2026-02-26 16:00:18 +0000 UTC" firstStartedPulling="2026-02-26 16:00:19.675468212 +0000 UTC m=+1082.194030061" lastFinishedPulling="2026-02-26 16:01:03.629608813 +0000 UTC m=+1126.148170662" observedRunningTime="2026-02-26 16:01:03.841790647 +0000 UTC m=+1126.360352486" watchObservedRunningTime="2026-02-26 16:01:03.845626603 +0000 UTC m=+1126.364188452" Feb 26 16:01:03 crc kubenswrapper[4907]: I0226 16:01:03.903870 4907 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/infra-operator-controller-manager-66d6b5f488-g7cb4" podStartSLOduration=39.527081732 podStartE2EDuration="46.903854907s" podCreationTimestamp="2026-02-26 16:00:17 +0000 UTC" firstStartedPulling="2026-02-26 16:00:55.744302948 +0000 UTC m=+1118.262864787" lastFinishedPulling="2026-02-26 16:01:03.121076073 +0000 UTC m=+1125.639637962" observedRunningTime="2026-02-26 16:01:03.901882829 +0000 UTC m=+1126.420444678" watchObservedRunningTime="2026-02-26 16:01:03.903854907 +0000 UTC m=+1126.422416756" Feb 26 16:01:03 crc kubenswrapper[4907]: I0226 16:01:03.905404 4907 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/openstack-baremetal-operator-controller-manager-c5677dc5d-mmllt" podStartSLOduration=39.464046538 podStartE2EDuration="46.905397336s" podCreationTimestamp="2026-02-26 16:00:17 +0000 UTC" firstStartedPulling="2026-02-26 16:00:55.660532769 +0000 UTC m=+1118.179094618" lastFinishedPulling="2026-02-26 16:01:03.101883567 +0000 UTC m=+1125.620445416" observedRunningTime="2026-02-26 16:01:03.884441066 +0000 UTC m=+1126.403002915" watchObservedRunningTime="2026-02-26 16:01:03.905397336 +0000 UTC m=+1126.423959175" Feb 26 16:01:07 crc kubenswrapper[4907]: I0226 16:01:07.587514 4907 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/cinder-operator-controller-manager-768c8b45bb-k4hzr" Feb 26 16:01:07 crc 
kubenswrapper[4907]: I0226 16:01:07.611198 4907 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/barbican-operator-controller-manager-c4b7d6946-58hjs" Feb 26 16:01:07 crc kubenswrapper[4907]: I0226 16:01:07.630055 4907 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/designate-operator-controller-manager-55cc45767f-nxx6j" Feb 26 16:01:07 crc kubenswrapper[4907]: I0226 16:01:07.654386 4907 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/glance-operator-controller-manager-7f748f8b74-q55xl" Feb 26 16:01:07 crc kubenswrapper[4907]: I0226 16:01:07.687833 4907 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/horizon-operator-controller-manager-54fb488b88-6hchw" Feb 26 16:01:07 crc kubenswrapper[4907]: I0226 16:01:07.691184 4907 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/heat-operator-controller-manager-9595d6797-m4jb4" Feb 26 16:01:07 crc kubenswrapper[4907]: I0226 16:01:07.739268 4907 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/ironic-operator-controller-manager-6494cdbf8f-2r2t2" Feb 26 16:01:07 crc kubenswrapper[4907]: I0226 16:01:07.815151 4907 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/keystone-operator-controller-manager-6c78d668d5-245bf" Feb 26 16:01:07 crc kubenswrapper[4907]: I0226 16:01:07.886491 4907 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/mariadb-operator-controller-manager-6dc9b6ff89-vtc25" Feb 26 16:01:07 crc kubenswrapper[4907]: I0226 16:01:07.917790 4907 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/nova-operator-controller-manager-5d56fd956f-6znnd" Feb 26 16:01:08 crc kubenswrapper[4907]: I0226 16:01:08.021450 4907 
kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/neutron-operator-controller-manager-54967dbbdf-24rjt" Feb 26 16:01:08 crc kubenswrapper[4907]: I0226 16:01:08.117322 4907 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/octavia-operator-controller-manager-77b8b67585-x8222" Feb 26 16:01:08 crc kubenswrapper[4907]: I0226 16:01:08.231124 4907 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/ovn-operator-controller-manager-85c99d655-t27pd" Feb 26 16:01:08 crc kubenswrapper[4907]: I0226 16:01:08.404878 4907 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/placement-operator-controller-manager-57bd55f9b7-mxbcg" Feb 26 16:01:08 crc kubenswrapper[4907]: I0226 16:01:08.483220 4907 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/swift-operator-controller-manager-79558bbfbf-g2mlp" Feb 26 16:01:08 crc kubenswrapper[4907]: I0226 16:01:08.607074 4907 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/telemetry-operator-controller-manager-56dc67d744-w7qpb" Feb 26 16:01:08 crc kubenswrapper[4907]: I0226 16:01:08.682227 4907 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/test-operator-controller-manager-8467ccb4c8-r6ndg" Feb 26 16:01:08 crc kubenswrapper[4907]: I0226 16:01:08.685698 4907 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/watcher-operator-controller-manager-76bcb69745-v2z8v" Feb 26 16:01:10 crc kubenswrapper[4907]: E0226 16:01:10.128554 4907 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image 
\\\"quay.io/openstack-k8s-operators/manila-operator@sha256:a5396a8d7e5ca6ddabfa92744f0d4adab9de0bbe712e8cdab1bf13576b7ac8c8\\\"\"" pod="openstack-operators/manila-operator-controller-manager-76fd76856-pk8zs" podUID="9fb09a9c-025a-4bc0-81a0-c127fee3f6f3" Feb 26 16:01:10 crc kubenswrapper[4907]: I0226 16:01:10.576029 4907 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/openstack-operator-controller-manager-5c89c59655-dbrxn" Feb 26 16:01:13 crc kubenswrapper[4907]: I0226 16:01:13.336039 4907 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/infra-operator-controller-manager-66d6b5f488-g7cb4" Feb 26 16:01:13 crc kubenswrapper[4907]: I0226 16:01:13.769374 4907 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/openstack-baremetal-operator-controller-manager-c5677dc5d-mmllt" Feb 26 16:01:24 crc kubenswrapper[4907]: I0226 16:01:24.992776 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/manila-operator-controller-manager-76fd76856-pk8zs" event={"ID":"9fb09a9c-025a-4bc0-81a0-c127fee3f6f3","Type":"ContainerStarted","Data":"fbd4bd6c6f8b548040c65080ce82bdfffe22399633fc0391959582b19a1fada9"} Feb 26 16:01:24 crc kubenswrapper[4907]: I0226 16:01:24.993451 4907 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/manila-operator-controller-manager-76fd76856-pk8zs" Feb 26 16:01:25 crc kubenswrapper[4907]: I0226 16:01:25.011133 4907 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/manila-operator-controller-manager-76fd76856-pk8zs" podStartSLOduration=3.042711476 podStartE2EDuration="1m8.0111174s" podCreationTimestamp="2026-02-26 16:00:17 +0000 UTC" firstStartedPulling="2026-02-26 16:00:19.578402054 +0000 UTC m=+1082.096963903" lastFinishedPulling="2026-02-26 16:01:24.546807968 +0000 UTC m=+1147.065369827" observedRunningTime="2026-02-26 
16:01:25.008648639 +0000 UTC m=+1147.527210498" watchObservedRunningTime="2026-02-26 16:01:25.0111174 +0000 UTC m=+1147.529679249" Feb 26 16:01:37 crc kubenswrapper[4907]: I0226 16:01:37.834693 4907 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/manila-operator-controller-manager-76fd76856-pk8zs" Feb 26 16:01:48 crc kubenswrapper[4907]: I0226 16:01:48.530733 4907 patch_prober.go:28] interesting pod/machine-config-daemon-v5ng6 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 26 16:01:48 crc kubenswrapper[4907]: I0226 16:01:48.531306 4907 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-v5ng6" podUID="917eebf3-db36-47b8-af0a-b80d042fddab" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 26 16:01:58 crc kubenswrapper[4907]: I0226 16:01:58.265947 4907 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-675f4bcbfc-b2427"] Feb 26 16:01:58 crc kubenswrapper[4907]: I0226 16:01:58.267910 4907 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-675f4bcbfc-b2427" Feb 26 16:01:58 crc kubenswrapper[4907]: I0226 16:01:58.272074 4907 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openshift-service-ca.crt" Feb 26 16:01:58 crc kubenswrapper[4907]: I0226 16:01:58.272113 4907 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"kube-root-ca.crt" Feb 26 16:01:58 crc kubenswrapper[4907]: I0226 16:01:58.272314 4907 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"dns" Feb 26 16:01:58 crc kubenswrapper[4907]: I0226 16:01:58.272470 4907 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dnsmasq-dns-dockercfg-7ds85" Feb 26 16:01:58 crc kubenswrapper[4907]: I0226 16:01:58.298525 4907 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-675f4bcbfc-b2427"] Feb 26 16:01:58 crc kubenswrapper[4907]: I0226 16:01:58.329098 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5e2f3aaf-955c-42d3-aa43-da773d92499a-config\") pod \"dnsmasq-dns-675f4bcbfc-b2427\" (UID: \"5e2f3aaf-955c-42d3-aa43-da773d92499a\") " pod="openstack/dnsmasq-dns-675f4bcbfc-b2427" Feb 26 16:01:58 crc kubenswrapper[4907]: I0226 16:01:58.329167 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-l84pn\" (UniqueName: \"kubernetes.io/projected/5e2f3aaf-955c-42d3-aa43-da773d92499a-kube-api-access-l84pn\") pod \"dnsmasq-dns-675f4bcbfc-b2427\" (UID: \"5e2f3aaf-955c-42d3-aa43-da773d92499a\") " pod="openstack/dnsmasq-dns-675f4bcbfc-b2427" Feb 26 16:01:58 crc kubenswrapper[4907]: I0226 16:01:58.415874 4907 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-78dd6ddcc-49q42"] Feb 26 16:01:58 crc kubenswrapper[4907]: I0226 16:01:58.417253 4907 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-78dd6ddcc-49q42" Feb 26 16:01:58 crc kubenswrapper[4907]: I0226 16:01:58.419497 4907 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"dns-svc" Feb 26 16:01:58 crc kubenswrapper[4907]: I0226 16:01:58.430231 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5e2f3aaf-955c-42d3-aa43-da773d92499a-config\") pod \"dnsmasq-dns-675f4bcbfc-b2427\" (UID: \"5e2f3aaf-955c-42d3-aa43-da773d92499a\") " pod="openstack/dnsmasq-dns-675f4bcbfc-b2427" Feb 26 16:01:58 crc kubenswrapper[4907]: I0226 16:01:58.430306 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-l84pn\" (UniqueName: \"kubernetes.io/projected/5e2f3aaf-955c-42d3-aa43-da773d92499a-kube-api-access-l84pn\") pod \"dnsmasq-dns-675f4bcbfc-b2427\" (UID: \"5e2f3aaf-955c-42d3-aa43-da773d92499a\") " pod="openstack/dnsmasq-dns-675f4bcbfc-b2427" Feb 26 16:01:58 crc kubenswrapper[4907]: I0226 16:01:58.431869 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5e2f3aaf-955c-42d3-aa43-da773d92499a-config\") pod \"dnsmasq-dns-675f4bcbfc-b2427\" (UID: \"5e2f3aaf-955c-42d3-aa43-da773d92499a\") " pod="openstack/dnsmasq-dns-675f4bcbfc-b2427" Feb 26 16:01:58 crc kubenswrapper[4907]: I0226 16:01:58.481443 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-l84pn\" (UniqueName: \"kubernetes.io/projected/5e2f3aaf-955c-42d3-aa43-da773d92499a-kube-api-access-l84pn\") pod \"dnsmasq-dns-675f4bcbfc-b2427\" (UID: \"5e2f3aaf-955c-42d3-aa43-da773d92499a\") " pod="openstack/dnsmasq-dns-675f4bcbfc-b2427" Feb 26 16:01:58 crc kubenswrapper[4907]: I0226 16:01:58.517449 4907 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-78dd6ddcc-49q42"] Feb 26 16:01:58 crc kubenswrapper[4907]: I0226 16:01:58.531265 4907 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qg58l\" (UniqueName: \"kubernetes.io/projected/d6a59f65-50b1-41f9-99df-2813f1c439bf-kube-api-access-qg58l\") pod \"dnsmasq-dns-78dd6ddcc-49q42\" (UID: \"d6a59f65-50b1-41f9-99df-2813f1c439bf\") " pod="openstack/dnsmasq-dns-78dd6ddcc-49q42" Feb 26 16:01:58 crc kubenswrapper[4907]: I0226 16:01:58.531616 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/d6a59f65-50b1-41f9-99df-2813f1c439bf-dns-svc\") pod \"dnsmasq-dns-78dd6ddcc-49q42\" (UID: \"d6a59f65-50b1-41f9-99df-2813f1c439bf\") " pod="openstack/dnsmasq-dns-78dd6ddcc-49q42" Feb 26 16:01:58 crc kubenswrapper[4907]: I0226 16:01:58.531747 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d6a59f65-50b1-41f9-99df-2813f1c439bf-config\") pod \"dnsmasq-dns-78dd6ddcc-49q42\" (UID: \"d6a59f65-50b1-41f9-99df-2813f1c439bf\") " pod="openstack/dnsmasq-dns-78dd6ddcc-49q42" Feb 26 16:01:58 crc kubenswrapper[4907]: I0226 16:01:58.592458 4907 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-675f4bcbfc-b2427" Feb 26 16:01:58 crc kubenswrapper[4907]: I0226 16:01:58.633283 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qg58l\" (UniqueName: \"kubernetes.io/projected/d6a59f65-50b1-41f9-99df-2813f1c439bf-kube-api-access-qg58l\") pod \"dnsmasq-dns-78dd6ddcc-49q42\" (UID: \"d6a59f65-50b1-41f9-99df-2813f1c439bf\") " pod="openstack/dnsmasq-dns-78dd6ddcc-49q42" Feb 26 16:01:58 crc kubenswrapper[4907]: I0226 16:01:58.633387 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/d6a59f65-50b1-41f9-99df-2813f1c439bf-dns-svc\") pod \"dnsmasq-dns-78dd6ddcc-49q42\" (UID: \"d6a59f65-50b1-41f9-99df-2813f1c439bf\") " pod="openstack/dnsmasq-dns-78dd6ddcc-49q42" Feb 26 16:01:58 crc kubenswrapper[4907]: I0226 16:01:58.633422 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d6a59f65-50b1-41f9-99df-2813f1c439bf-config\") pod \"dnsmasq-dns-78dd6ddcc-49q42\" (UID: \"d6a59f65-50b1-41f9-99df-2813f1c439bf\") " pod="openstack/dnsmasq-dns-78dd6ddcc-49q42" Feb 26 16:01:58 crc kubenswrapper[4907]: I0226 16:01:58.634482 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d6a59f65-50b1-41f9-99df-2813f1c439bf-config\") pod \"dnsmasq-dns-78dd6ddcc-49q42\" (UID: \"d6a59f65-50b1-41f9-99df-2813f1c439bf\") " pod="openstack/dnsmasq-dns-78dd6ddcc-49q42" Feb 26 16:01:58 crc kubenswrapper[4907]: I0226 16:01:58.634988 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/d6a59f65-50b1-41f9-99df-2813f1c439bf-dns-svc\") pod \"dnsmasq-dns-78dd6ddcc-49q42\" (UID: \"d6a59f65-50b1-41f9-99df-2813f1c439bf\") " pod="openstack/dnsmasq-dns-78dd6ddcc-49q42" Feb 26 16:01:58 crc kubenswrapper[4907]: I0226 
16:01:58.652328 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qg58l\" (UniqueName: \"kubernetes.io/projected/d6a59f65-50b1-41f9-99df-2813f1c439bf-kube-api-access-qg58l\") pod \"dnsmasq-dns-78dd6ddcc-49q42\" (UID: \"d6a59f65-50b1-41f9-99df-2813f1c439bf\") " pod="openstack/dnsmasq-dns-78dd6ddcc-49q42" Feb 26 16:01:58 crc kubenswrapper[4907]: I0226 16:01:58.735692 4907 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-78dd6ddcc-49q42" Feb 26 16:01:59 crc kubenswrapper[4907]: I0226 16:01:59.036308 4907 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-78dd6ddcc-49q42"] Feb 26 16:01:59 crc kubenswrapper[4907]: W0226 16:01:59.042420 4907 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podd6a59f65_50b1_41f9_99df_2813f1c439bf.slice/crio-e8855212fbe7887e2ace10fbdcf150ff74139fb3805f91e8cee4f0f7a4c5e741 WatchSource:0}: Error finding container e8855212fbe7887e2ace10fbdcf150ff74139fb3805f91e8cee4f0f7a4c5e741: Status 404 returned error can't find the container with id e8855212fbe7887e2ace10fbdcf150ff74139fb3805f91e8cee4f0f7a4c5e741 Feb 26 16:01:59 crc kubenswrapper[4907]: I0226 16:01:59.095717 4907 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-675f4bcbfc-b2427"] Feb 26 16:01:59 crc kubenswrapper[4907]: W0226 16:01:59.103783 4907 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod5e2f3aaf_955c_42d3_aa43_da773d92499a.slice/crio-c864f5ab359ed36cea19b39a5a891ae293124e5e6c5b3fb1719b33f6e2b58504 WatchSource:0}: Error finding container c864f5ab359ed36cea19b39a5a891ae293124e5e6c5b3fb1719b33f6e2b58504: Status 404 returned error can't find the container with id c864f5ab359ed36cea19b39a5a891ae293124e5e6c5b3fb1719b33f6e2b58504 Feb 26 16:01:59 crc kubenswrapper[4907]: I0226 
16:01:59.211875 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-78dd6ddcc-49q42" event={"ID":"d6a59f65-50b1-41f9-99df-2813f1c439bf","Type":"ContainerStarted","Data":"e8855212fbe7887e2ace10fbdcf150ff74139fb3805f91e8cee4f0f7a4c5e741"} Feb 26 16:01:59 crc kubenswrapper[4907]: I0226 16:01:59.212851 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-675f4bcbfc-b2427" event={"ID":"5e2f3aaf-955c-42d3-aa43-da773d92499a","Type":"ContainerStarted","Data":"c864f5ab359ed36cea19b39a5a891ae293124e5e6c5b3fb1719b33f6e2b58504"} Feb 26 16:02:00 crc kubenswrapper[4907]: I0226 16:02:00.137519 4907 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29535362-gch27"] Feb 26 16:02:00 crc kubenswrapper[4907]: I0226 16:02:00.138473 4907 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29535362-gch27" Feb 26 16:02:00 crc kubenswrapper[4907]: I0226 16:02:00.140913 4907 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29535362-gch27"] Feb 26 16:02:00 crc kubenswrapper[4907]: I0226 16:02:00.142057 4907 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Feb 26 16:02:00 crc kubenswrapper[4907]: I0226 16:02:00.143817 4907 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Feb 26 16:02:00 crc kubenswrapper[4907]: I0226 16:02:00.144084 4907 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-n2mrp" Feb 26 16:02:00 crc kubenswrapper[4907]: I0226 16:02:00.172030 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wjmlf\" (UniqueName: \"kubernetes.io/projected/1a0a1c21-7e2d-4053-b478-d6c6387f88d5-kube-api-access-wjmlf\") pod \"auto-csr-approver-29535362-gch27\" (UID: 
\"1a0a1c21-7e2d-4053-b478-d6c6387f88d5\") " pod="openshift-infra/auto-csr-approver-29535362-gch27" Feb 26 16:02:00 crc kubenswrapper[4907]: I0226 16:02:00.273297 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wjmlf\" (UniqueName: \"kubernetes.io/projected/1a0a1c21-7e2d-4053-b478-d6c6387f88d5-kube-api-access-wjmlf\") pod \"auto-csr-approver-29535362-gch27\" (UID: \"1a0a1c21-7e2d-4053-b478-d6c6387f88d5\") " pod="openshift-infra/auto-csr-approver-29535362-gch27" Feb 26 16:02:00 crc kubenswrapper[4907]: I0226 16:02:00.314008 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wjmlf\" (UniqueName: \"kubernetes.io/projected/1a0a1c21-7e2d-4053-b478-d6c6387f88d5-kube-api-access-wjmlf\") pod \"auto-csr-approver-29535362-gch27\" (UID: \"1a0a1c21-7e2d-4053-b478-d6c6387f88d5\") " pod="openshift-infra/auto-csr-approver-29535362-gch27" Feb 26 16:02:00 crc kubenswrapper[4907]: I0226 16:02:00.492036 4907 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29535362-gch27" Feb 26 16:02:00 crc kubenswrapper[4907]: I0226 16:02:00.957421 4907 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-675f4bcbfc-b2427"] Feb 26 16:02:00 crc kubenswrapper[4907]: I0226 16:02:00.998146 4907 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-666b6646f7-glzzt"] Feb 26 16:02:00 crc kubenswrapper[4907]: I0226 16:02:00.999296 4907 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-666b6646f7-glzzt" Feb 26 16:02:01 crc kubenswrapper[4907]: I0226 16:02:01.019389 4907 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-666b6646f7-glzzt"] Feb 26 16:02:01 crc kubenswrapper[4907]: I0226 16:02:01.099521 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/b95ff69f-8d73-415b-8dbb-84ed7e1f3a91-dns-svc\") pod \"dnsmasq-dns-666b6646f7-glzzt\" (UID: \"b95ff69f-8d73-415b-8dbb-84ed7e1f3a91\") " pod="openstack/dnsmasq-dns-666b6646f7-glzzt" Feb 26 16:02:01 crc kubenswrapper[4907]: I0226 16:02:01.099576 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-p6hdx\" (UniqueName: \"kubernetes.io/projected/b95ff69f-8d73-415b-8dbb-84ed7e1f3a91-kube-api-access-p6hdx\") pod \"dnsmasq-dns-666b6646f7-glzzt\" (UID: \"b95ff69f-8d73-415b-8dbb-84ed7e1f3a91\") " pod="openstack/dnsmasq-dns-666b6646f7-glzzt" Feb 26 16:02:01 crc kubenswrapper[4907]: I0226 16:02:01.099692 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b95ff69f-8d73-415b-8dbb-84ed7e1f3a91-config\") pod \"dnsmasq-dns-666b6646f7-glzzt\" (UID: \"b95ff69f-8d73-415b-8dbb-84ed7e1f3a91\") " pod="openstack/dnsmasq-dns-666b6646f7-glzzt" Feb 26 16:02:01 crc kubenswrapper[4907]: I0226 16:02:01.181998 4907 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29535362-gch27"] Feb 26 16:02:01 crc kubenswrapper[4907]: I0226 16:02:01.201862 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-p6hdx\" (UniqueName: \"kubernetes.io/projected/b95ff69f-8d73-415b-8dbb-84ed7e1f3a91-kube-api-access-p6hdx\") pod \"dnsmasq-dns-666b6646f7-glzzt\" (UID: \"b95ff69f-8d73-415b-8dbb-84ed7e1f3a91\") " 
pod="openstack/dnsmasq-dns-666b6646f7-glzzt" Feb 26 16:02:01 crc kubenswrapper[4907]: I0226 16:02:01.202004 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b95ff69f-8d73-415b-8dbb-84ed7e1f3a91-config\") pod \"dnsmasq-dns-666b6646f7-glzzt\" (UID: \"b95ff69f-8d73-415b-8dbb-84ed7e1f3a91\") " pod="openstack/dnsmasq-dns-666b6646f7-glzzt" Feb 26 16:02:01 crc kubenswrapper[4907]: I0226 16:02:01.202037 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/b95ff69f-8d73-415b-8dbb-84ed7e1f3a91-dns-svc\") pod \"dnsmasq-dns-666b6646f7-glzzt\" (UID: \"b95ff69f-8d73-415b-8dbb-84ed7e1f3a91\") " pod="openstack/dnsmasq-dns-666b6646f7-glzzt" Feb 26 16:02:01 crc kubenswrapper[4907]: I0226 16:02:01.205043 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b95ff69f-8d73-415b-8dbb-84ed7e1f3a91-config\") pod \"dnsmasq-dns-666b6646f7-glzzt\" (UID: \"b95ff69f-8d73-415b-8dbb-84ed7e1f3a91\") " pod="openstack/dnsmasq-dns-666b6646f7-glzzt" Feb 26 16:02:01 crc kubenswrapper[4907]: I0226 16:02:01.205695 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/b95ff69f-8d73-415b-8dbb-84ed7e1f3a91-dns-svc\") pod \"dnsmasq-dns-666b6646f7-glzzt\" (UID: \"b95ff69f-8d73-415b-8dbb-84ed7e1f3a91\") " pod="openstack/dnsmasq-dns-666b6646f7-glzzt" Feb 26 16:02:01 crc kubenswrapper[4907]: I0226 16:02:01.242618 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29535362-gch27" event={"ID":"1a0a1c21-7e2d-4053-b478-d6c6387f88d5","Type":"ContainerStarted","Data":"bcd7ebd631d509077ebeac269f4681696c6e2ea03021ee20af3ae8db33ceb410"} Feb 26 16:02:01 crc kubenswrapper[4907]: I0226 16:02:01.246951 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"kube-api-access-p6hdx\" (UniqueName: \"kubernetes.io/projected/b95ff69f-8d73-415b-8dbb-84ed7e1f3a91-kube-api-access-p6hdx\") pod \"dnsmasq-dns-666b6646f7-glzzt\" (UID: \"b95ff69f-8d73-415b-8dbb-84ed7e1f3a91\") " pod="openstack/dnsmasq-dns-666b6646f7-glzzt" Feb 26 16:02:01 crc kubenswrapper[4907]: I0226 16:02:01.340974 4907 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-666b6646f7-glzzt" Feb 26 16:02:01 crc kubenswrapper[4907]: I0226 16:02:01.360960 4907 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-78dd6ddcc-49q42"] Feb 26 16:02:01 crc kubenswrapper[4907]: I0226 16:02:01.407380 4907 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-57d769cc4f-4pkdt"] Feb 26 16:02:01 crc kubenswrapper[4907]: I0226 16:02:01.408878 4907 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-57d769cc4f-4pkdt" Feb 26 16:02:01 crc kubenswrapper[4907]: I0226 16:02:01.426345 4907 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-57d769cc4f-4pkdt"] Feb 26 16:02:01 crc kubenswrapper[4907]: I0226 16:02:01.509131 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/0d202a81-23b7-45d1-847c-81375db1f908-config\") pod \"dnsmasq-dns-57d769cc4f-4pkdt\" (UID: \"0d202a81-23b7-45d1-847c-81375db1f908\") " pod="openstack/dnsmasq-dns-57d769cc4f-4pkdt" Feb 26 16:02:01 crc kubenswrapper[4907]: I0226 16:02:01.509184 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/0d202a81-23b7-45d1-847c-81375db1f908-dns-svc\") pod \"dnsmasq-dns-57d769cc4f-4pkdt\" (UID: \"0d202a81-23b7-45d1-847c-81375db1f908\") " pod="openstack/dnsmasq-dns-57d769cc4f-4pkdt" Feb 26 16:02:01 crc kubenswrapper[4907]: I0226 16:02:01.509292 4907 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7bjjt\" (UniqueName: \"kubernetes.io/projected/0d202a81-23b7-45d1-847c-81375db1f908-kube-api-access-7bjjt\") pod \"dnsmasq-dns-57d769cc4f-4pkdt\" (UID: \"0d202a81-23b7-45d1-847c-81375db1f908\") " pod="openstack/dnsmasq-dns-57d769cc4f-4pkdt" Feb 26 16:02:01 crc kubenswrapper[4907]: I0226 16:02:01.610339 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/0d202a81-23b7-45d1-847c-81375db1f908-config\") pod \"dnsmasq-dns-57d769cc4f-4pkdt\" (UID: \"0d202a81-23b7-45d1-847c-81375db1f908\") " pod="openstack/dnsmasq-dns-57d769cc4f-4pkdt" Feb 26 16:02:01 crc kubenswrapper[4907]: I0226 16:02:01.610778 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/0d202a81-23b7-45d1-847c-81375db1f908-dns-svc\") pod \"dnsmasq-dns-57d769cc4f-4pkdt\" (UID: \"0d202a81-23b7-45d1-847c-81375db1f908\") " pod="openstack/dnsmasq-dns-57d769cc4f-4pkdt" Feb 26 16:02:01 crc kubenswrapper[4907]: I0226 16:02:01.610840 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7bjjt\" (UniqueName: \"kubernetes.io/projected/0d202a81-23b7-45d1-847c-81375db1f908-kube-api-access-7bjjt\") pod \"dnsmasq-dns-57d769cc4f-4pkdt\" (UID: \"0d202a81-23b7-45d1-847c-81375db1f908\") " pod="openstack/dnsmasq-dns-57d769cc4f-4pkdt" Feb 26 16:02:01 crc kubenswrapper[4907]: I0226 16:02:01.611492 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/0d202a81-23b7-45d1-847c-81375db1f908-dns-svc\") pod \"dnsmasq-dns-57d769cc4f-4pkdt\" (UID: \"0d202a81-23b7-45d1-847c-81375db1f908\") " pod="openstack/dnsmasq-dns-57d769cc4f-4pkdt" Feb 26 16:02:01 crc kubenswrapper[4907]: I0226 16:02:01.612118 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" 
(UniqueName: \"kubernetes.io/configmap/0d202a81-23b7-45d1-847c-81375db1f908-config\") pod \"dnsmasq-dns-57d769cc4f-4pkdt\" (UID: \"0d202a81-23b7-45d1-847c-81375db1f908\") " pod="openstack/dnsmasq-dns-57d769cc4f-4pkdt" Feb 26 16:02:01 crc kubenswrapper[4907]: I0226 16:02:01.641948 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7bjjt\" (UniqueName: \"kubernetes.io/projected/0d202a81-23b7-45d1-847c-81375db1f908-kube-api-access-7bjjt\") pod \"dnsmasq-dns-57d769cc4f-4pkdt\" (UID: \"0d202a81-23b7-45d1-847c-81375db1f908\") " pod="openstack/dnsmasq-dns-57d769cc4f-4pkdt" Feb 26 16:02:01 crc kubenswrapper[4907]: I0226 16:02:01.747901 4907 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-57d769cc4f-4pkdt" Feb 26 16:02:02 crc kubenswrapper[4907]: I0226 16:02:02.006667 4907 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-666b6646f7-glzzt"] Feb 26 16:02:02 crc kubenswrapper[4907]: I0226 16:02:02.187607 4907 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/rabbitmq-server-0"] Feb 26 16:02:02 crc kubenswrapper[4907]: I0226 16:02:02.188938 4907 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-server-0"] Feb 26 16:02:02 crc kubenswrapper[4907]: I0226 16:02:02.189014 4907 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/rabbitmq-server-0" Feb 26 16:02:02 crc kubenswrapper[4907]: I0226 16:02:02.192063 4907 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-server-dockercfg-ptp6b" Feb 26 16:02:02 crc kubenswrapper[4907]: I0226 16:02:02.192279 4907 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-rabbitmq-svc" Feb 26 16:02:02 crc kubenswrapper[4907]: I0226 16:02:02.195428 4907 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-config-data" Feb 26 16:02:02 crc kubenswrapper[4907]: I0226 16:02:02.195610 4907 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-plugins-conf" Feb 26 16:02:02 crc kubenswrapper[4907]: I0226 16:02:02.195716 4907 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-default-user" Feb 26 16:02:02 crc kubenswrapper[4907]: I0226 16:02:02.195856 4907 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-erlang-cookie" Feb 26 16:02:02 crc kubenswrapper[4907]: I0226 16:02:02.195951 4907 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-server-conf" Feb 26 16:02:02 crc kubenswrapper[4907]: I0226 16:02:02.237362 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/96ba881c-449c-4300-b67f-8a1e952af508-rabbitmq-erlang-cookie\") pod \"rabbitmq-server-0\" (UID: \"96ba881c-449c-4300-b67f-8a1e952af508\") " pod="openstack/rabbitmq-server-0" Feb 26 16:02:02 crc kubenswrapper[4907]: I0226 16:02:02.237438 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/96ba881c-449c-4300-b67f-8a1e952af508-plugins-conf\") pod \"rabbitmq-server-0\" (UID: \"96ba881c-449c-4300-b67f-8a1e952af508\") " 
pod="openstack/rabbitmq-server-0" Feb 26 16:02:02 crc kubenswrapper[4907]: I0226 16:02:02.237455 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/96ba881c-449c-4300-b67f-8a1e952af508-config-data\") pod \"rabbitmq-server-0\" (UID: \"96ba881c-449c-4300-b67f-8a1e952af508\") " pod="openstack/rabbitmq-server-0" Feb 26 16:02:02 crc kubenswrapper[4907]: I0226 16:02:02.237475 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/96ba881c-449c-4300-b67f-8a1e952af508-pod-info\") pod \"rabbitmq-server-0\" (UID: \"96ba881c-449c-4300-b67f-8a1e952af508\") " pod="openstack/rabbitmq-server-0" Feb 26 16:02:02 crc kubenswrapper[4907]: I0226 16:02:02.237503 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-98k2n\" (UniqueName: \"kubernetes.io/projected/96ba881c-449c-4300-b67f-8a1e952af508-kube-api-access-98k2n\") pod \"rabbitmq-server-0\" (UID: \"96ba881c-449c-4300-b67f-8a1e952af508\") " pod="openstack/rabbitmq-server-0" Feb 26 16:02:02 crc kubenswrapper[4907]: I0226 16:02:02.237517 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/96ba881c-449c-4300-b67f-8a1e952af508-server-conf\") pod \"rabbitmq-server-0\" (UID: \"96ba881c-449c-4300-b67f-8a1e952af508\") " pod="openstack/rabbitmq-server-0" Feb 26 16:02:02 crc kubenswrapper[4907]: I0226 16:02:02.237531 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/96ba881c-449c-4300-b67f-8a1e952af508-rabbitmq-confd\") pod \"rabbitmq-server-0\" (UID: \"96ba881c-449c-4300-b67f-8a1e952af508\") " pod="openstack/rabbitmq-server-0" Feb 26 16:02:02 crc kubenswrapper[4907]: 
I0226 16:02:02.237646 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/96ba881c-449c-4300-b67f-8a1e952af508-rabbitmq-plugins\") pod \"rabbitmq-server-0\" (UID: \"96ba881c-449c-4300-b67f-8a1e952af508\") " pod="openstack/rabbitmq-server-0" Feb 26 16:02:02 crc kubenswrapper[4907]: I0226 16:02:02.237671 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/96ba881c-449c-4300-b67f-8a1e952af508-rabbitmq-tls\") pod \"rabbitmq-server-0\" (UID: \"96ba881c-449c-4300-b67f-8a1e952af508\") " pod="openstack/rabbitmq-server-0" Feb 26 16:02:02 crc kubenswrapper[4907]: I0226 16:02:02.237695 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/96ba881c-449c-4300-b67f-8a1e952af508-erlang-cookie-secret\") pod \"rabbitmq-server-0\" (UID: \"96ba881c-449c-4300-b67f-8a1e952af508\") " pod="openstack/rabbitmq-server-0" Feb 26 16:02:02 crc kubenswrapper[4907]: I0226 16:02:02.237716 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage03-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") pod \"rabbitmq-server-0\" (UID: \"96ba881c-449c-4300-b67f-8a1e952af508\") " pod="openstack/rabbitmq-server-0" Feb 26 16:02:02 crc kubenswrapper[4907]: I0226 16:02:02.253646 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-666b6646f7-glzzt" event={"ID":"b95ff69f-8d73-415b-8dbb-84ed7e1f3a91","Type":"ContainerStarted","Data":"311d8973bfb95480ded0481cec6452568e7c60c1a186c233fc6f4b743fb84252"} Feb 26 16:02:02 crc kubenswrapper[4907]: I0226 16:02:02.273847 4907 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-57d769cc4f-4pkdt"] Feb 26 16:02:02 crc 
kubenswrapper[4907]: W0226 16:02:02.285105 4907 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod0d202a81_23b7_45d1_847c_81375db1f908.slice/crio-d2686a32fcdc47bdfa958b34ce2024f28a32c81ef797c262f375e32a1a24b1eb WatchSource:0}: Error finding container d2686a32fcdc47bdfa958b34ce2024f28a32c81ef797c262f375e32a1a24b1eb: Status 404 returned error can't find the container with id d2686a32fcdc47bdfa958b34ce2024f28a32c81ef797c262f375e32a1a24b1eb Feb 26 16:02:02 crc kubenswrapper[4907]: I0226 16:02:02.339447 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/96ba881c-449c-4300-b67f-8a1e952af508-erlang-cookie-secret\") pod \"rabbitmq-server-0\" (UID: \"96ba881c-449c-4300-b67f-8a1e952af508\") " pod="openstack/rabbitmq-server-0" Feb 26 16:02:02 crc kubenswrapper[4907]: I0226 16:02:02.339493 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage03-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") pod \"rabbitmq-server-0\" (UID: \"96ba881c-449c-4300-b67f-8a1e952af508\") " pod="openstack/rabbitmq-server-0" Feb 26 16:02:02 crc kubenswrapper[4907]: I0226 16:02:02.339527 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/96ba881c-449c-4300-b67f-8a1e952af508-rabbitmq-erlang-cookie\") pod \"rabbitmq-server-0\" (UID: \"96ba881c-449c-4300-b67f-8a1e952af508\") " pod="openstack/rabbitmq-server-0" Feb 26 16:02:02 crc kubenswrapper[4907]: I0226 16:02:02.339559 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/96ba881c-449c-4300-b67f-8a1e952af508-plugins-conf\") pod \"rabbitmq-server-0\" (UID: \"96ba881c-449c-4300-b67f-8a1e952af508\") " pod="openstack/rabbitmq-server-0" Feb 26 
16:02:02 crc kubenswrapper[4907]: I0226 16:02:02.339574 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/96ba881c-449c-4300-b67f-8a1e952af508-config-data\") pod \"rabbitmq-server-0\" (UID: \"96ba881c-449c-4300-b67f-8a1e952af508\") " pod="openstack/rabbitmq-server-0" Feb 26 16:02:02 crc kubenswrapper[4907]: I0226 16:02:02.339614 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/96ba881c-449c-4300-b67f-8a1e952af508-pod-info\") pod \"rabbitmq-server-0\" (UID: \"96ba881c-449c-4300-b67f-8a1e952af508\") " pod="openstack/rabbitmq-server-0" Feb 26 16:02:02 crc kubenswrapper[4907]: I0226 16:02:02.339637 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-98k2n\" (UniqueName: \"kubernetes.io/projected/96ba881c-449c-4300-b67f-8a1e952af508-kube-api-access-98k2n\") pod \"rabbitmq-server-0\" (UID: \"96ba881c-449c-4300-b67f-8a1e952af508\") " pod="openstack/rabbitmq-server-0" Feb 26 16:02:02 crc kubenswrapper[4907]: I0226 16:02:02.339653 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/96ba881c-449c-4300-b67f-8a1e952af508-server-conf\") pod \"rabbitmq-server-0\" (UID: \"96ba881c-449c-4300-b67f-8a1e952af508\") " pod="openstack/rabbitmq-server-0" Feb 26 16:02:02 crc kubenswrapper[4907]: I0226 16:02:02.339668 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/96ba881c-449c-4300-b67f-8a1e952af508-rabbitmq-confd\") pod \"rabbitmq-server-0\" (UID: \"96ba881c-449c-4300-b67f-8a1e952af508\") " pod="openstack/rabbitmq-server-0" Feb 26 16:02:02 crc kubenswrapper[4907]: I0226 16:02:02.339710 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-plugins\" (UniqueName: 
\"kubernetes.io/empty-dir/96ba881c-449c-4300-b67f-8a1e952af508-rabbitmq-plugins\") pod \"rabbitmq-server-0\" (UID: \"96ba881c-449c-4300-b67f-8a1e952af508\") " pod="openstack/rabbitmq-server-0" Feb 26 16:02:02 crc kubenswrapper[4907]: I0226 16:02:02.339730 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/96ba881c-449c-4300-b67f-8a1e952af508-rabbitmq-tls\") pod \"rabbitmq-server-0\" (UID: \"96ba881c-449c-4300-b67f-8a1e952af508\") " pod="openstack/rabbitmq-server-0" Feb 26 16:02:02 crc kubenswrapper[4907]: I0226 16:02:02.341101 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/96ba881c-449c-4300-b67f-8a1e952af508-config-data\") pod \"rabbitmq-server-0\" (UID: \"96ba881c-449c-4300-b67f-8a1e952af508\") " pod="openstack/rabbitmq-server-0" Feb 26 16:02:02 crc kubenswrapper[4907]: I0226 16:02:02.341340 4907 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage03-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") pod \"rabbitmq-server-0\" (UID: \"96ba881c-449c-4300-b67f-8a1e952af508\") device mount path \"/mnt/openstack/pv03\"" pod="openstack/rabbitmq-server-0" Feb 26 16:02:02 crc kubenswrapper[4907]: I0226 16:02:02.341941 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/96ba881c-449c-4300-b67f-8a1e952af508-rabbitmq-erlang-cookie\") pod \"rabbitmq-server-0\" (UID: \"96ba881c-449c-4300-b67f-8a1e952af508\") " pod="openstack/rabbitmq-server-0" Feb 26 16:02:02 crc kubenswrapper[4907]: I0226 16:02:02.343273 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/96ba881c-449c-4300-b67f-8a1e952af508-server-conf\") pod \"rabbitmq-server-0\" (UID: \"96ba881c-449c-4300-b67f-8a1e952af508\") " 
pod="openstack/rabbitmq-server-0" Feb 26 16:02:02 crc kubenswrapper[4907]: I0226 16:02:02.343527 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/96ba881c-449c-4300-b67f-8a1e952af508-plugins-conf\") pod \"rabbitmq-server-0\" (UID: \"96ba881c-449c-4300-b67f-8a1e952af508\") " pod="openstack/rabbitmq-server-0" Feb 26 16:02:02 crc kubenswrapper[4907]: I0226 16:02:02.347021 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/96ba881c-449c-4300-b67f-8a1e952af508-rabbitmq-plugins\") pod \"rabbitmq-server-0\" (UID: \"96ba881c-449c-4300-b67f-8a1e952af508\") " pod="openstack/rabbitmq-server-0" Feb 26 16:02:02 crc kubenswrapper[4907]: I0226 16:02:02.347369 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/96ba881c-449c-4300-b67f-8a1e952af508-rabbitmq-confd\") pod \"rabbitmq-server-0\" (UID: \"96ba881c-449c-4300-b67f-8a1e952af508\") " pod="openstack/rabbitmq-server-0" Feb 26 16:02:02 crc kubenswrapper[4907]: I0226 16:02:02.350360 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/96ba881c-449c-4300-b67f-8a1e952af508-rabbitmq-tls\") pod \"rabbitmq-server-0\" (UID: \"96ba881c-449c-4300-b67f-8a1e952af508\") " pod="openstack/rabbitmq-server-0" Feb 26 16:02:02 crc kubenswrapper[4907]: I0226 16:02:02.359830 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/96ba881c-449c-4300-b67f-8a1e952af508-erlang-cookie-secret\") pod \"rabbitmq-server-0\" (UID: \"96ba881c-449c-4300-b67f-8a1e952af508\") " pod="openstack/rabbitmq-server-0" Feb 26 16:02:02 crc kubenswrapper[4907]: I0226 16:02:02.363267 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pod-info\" (UniqueName: 
\"kubernetes.io/downward-api/96ba881c-449c-4300-b67f-8a1e952af508-pod-info\") pod \"rabbitmq-server-0\" (UID: \"96ba881c-449c-4300-b67f-8a1e952af508\") " pod="openstack/rabbitmq-server-0" Feb 26 16:02:02 crc kubenswrapper[4907]: I0226 16:02:02.363344 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-98k2n\" (UniqueName: \"kubernetes.io/projected/96ba881c-449c-4300-b67f-8a1e952af508-kube-api-access-98k2n\") pod \"rabbitmq-server-0\" (UID: \"96ba881c-449c-4300-b67f-8a1e952af508\") " pod="openstack/rabbitmq-server-0" Feb 26 16:02:02 crc kubenswrapper[4907]: I0226 16:02:02.371236 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage03-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") pod \"rabbitmq-server-0\" (UID: \"96ba881c-449c-4300-b67f-8a1e952af508\") " pod="openstack/rabbitmq-server-0" Feb 26 16:02:02 crc kubenswrapper[4907]: I0226 16:02:02.533886 4907 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-server-0" Feb 26 16:02:02 crc kubenswrapper[4907]: I0226 16:02:02.546338 4907 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Feb 26 16:02:02 crc kubenswrapper[4907]: I0226 16:02:02.547709 4907 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/rabbitmq-cell1-server-0" Feb 26 16:02:02 crc kubenswrapper[4907]: I0226 16:02:02.557148 4907 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-cell1-default-user" Feb 26 16:02:02 crc kubenswrapper[4907]: I0226 16:02:02.557212 4907 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-cell1-config-data" Feb 26 16:02:02 crc kubenswrapper[4907]: I0226 16:02:02.557501 4907 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-cell1-erlang-cookie" Feb 26 16:02:02 crc kubenswrapper[4907]: I0226 16:02:02.557617 4907 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-cell1-plugins-conf" Feb 26 16:02:02 crc kubenswrapper[4907]: I0226 16:02:02.557662 4907 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-cell1-server-dockercfg-kqxnc" Feb 26 16:02:02 crc kubenswrapper[4907]: I0226 16:02:02.557502 4907 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-rabbitmq-cell1-svc" Feb 26 16:02:02 crc kubenswrapper[4907]: I0226 16:02:02.557831 4907 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-cell1-server-conf" Feb 26 16:02:02 crc kubenswrapper[4907]: I0226 16:02:02.574642 4907 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Feb 26 16:02:02 crc kubenswrapper[4907]: I0226 16:02:02.646290 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/cca4ff23-cabb-466c-80a0-dbcc1f005123-config-data\") pod \"rabbitmq-cell1-server-0\" (UID: \"cca4ff23-cabb-466c-80a0-dbcc1f005123\") " pod="openstack/rabbitmq-cell1-server-0" Feb 26 16:02:02 crc kubenswrapper[4907]: I0226 16:02:02.646359 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"kube-api-access-k6lgm\" (UniqueName: \"kubernetes.io/projected/cca4ff23-cabb-466c-80a0-dbcc1f005123-kube-api-access-k6lgm\") pod \"rabbitmq-cell1-server-0\" (UID: \"cca4ff23-cabb-466c-80a0-dbcc1f005123\") " pod="openstack/rabbitmq-cell1-server-0" Feb 26 16:02:02 crc kubenswrapper[4907]: I0226 16:02:02.646395 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/cca4ff23-cabb-466c-80a0-dbcc1f005123-rabbitmq-confd\") pod \"rabbitmq-cell1-server-0\" (UID: \"cca4ff23-cabb-466c-80a0-dbcc1f005123\") " pod="openstack/rabbitmq-cell1-server-0" Feb 26 16:02:02 crc kubenswrapper[4907]: I0226 16:02:02.646483 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/cca4ff23-cabb-466c-80a0-dbcc1f005123-plugins-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"cca4ff23-cabb-466c-80a0-dbcc1f005123\") " pod="openstack/rabbitmq-cell1-server-0" Feb 26 16:02:02 crc kubenswrapper[4907]: I0226 16:02:02.646509 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/cca4ff23-cabb-466c-80a0-dbcc1f005123-rabbitmq-plugins\") pod \"rabbitmq-cell1-server-0\" (UID: \"cca4ff23-cabb-466c-80a0-dbcc1f005123\") " pod="openstack/rabbitmq-cell1-server-0" Feb 26 16:02:02 crc kubenswrapper[4907]: I0226 16:02:02.646531 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/cca4ff23-cabb-466c-80a0-dbcc1f005123-rabbitmq-tls\") pod \"rabbitmq-cell1-server-0\" (UID: \"cca4ff23-cabb-466c-80a0-dbcc1f005123\") " pod="openstack/rabbitmq-cell1-server-0" Feb 26 16:02:02 crc kubenswrapper[4907]: I0226 16:02:02.646577 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"pod-info\" (UniqueName: \"kubernetes.io/downward-api/cca4ff23-cabb-466c-80a0-dbcc1f005123-pod-info\") pod \"rabbitmq-cell1-server-0\" (UID: \"cca4ff23-cabb-466c-80a0-dbcc1f005123\") " pod="openstack/rabbitmq-cell1-server-0" Feb 26 16:02:02 crc kubenswrapper[4907]: I0226 16:02:02.646628 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/cca4ff23-cabb-466c-80a0-dbcc1f005123-rabbitmq-erlang-cookie\") pod \"rabbitmq-cell1-server-0\" (UID: \"cca4ff23-cabb-466c-80a0-dbcc1f005123\") " pod="openstack/rabbitmq-cell1-server-0" Feb 26 16:02:02 crc kubenswrapper[4907]: I0226 16:02:02.646660 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/cca4ff23-cabb-466c-80a0-dbcc1f005123-erlang-cookie-secret\") pod \"rabbitmq-cell1-server-0\" (UID: \"cca4ff23-cabb-466c-80a0-dbcc1f005123\") " pod="openstack/rabbitmq-cell1-server-0" Feb 26 16:02:02 crc kubenswrapper[4907]: I0226 16:02:02.646687 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/cca4ff23-cabb-466c-80a0-dbcc1f005123-server-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"cca4ff23-cabb-466c-80a0-dbcc1f005123\") " pod="openstack/rabbitmq-cell1-server-0" Feb 26 16:02:02 crc kubenswrapper[4907]: I0226 16:02:02.646800 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage08-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage08-crc\") pod \"rabbitmq-cell1-server-0\" (UID: \"cca4ff23-cabb-466c-80a0-dbcc1f005123\") " pod="openstack/rabbitmq-cell1-server-0" Feb 26 16:02:02 crc kubenswrapper[4907]: I0226 16:02:02.748692 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-k6lgm\" (UniqueName: 
\"kubernetes.io/projected/cca4ff23-cabb-466c-80a0-dbcc1f005123-kube-api-access-k6lgm\") pod \"rabbitmq-cell1-server-0\" (UID: \"cca4ff23-cabb-466c-80a0-dbcc1f005123\") " pod="openstack/rabbitmq-cell1-server-0" Feb 26 16:02:02 crc kubenswrapper[4907]: I0226 16:02:02.748742 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/cca4ff23-cabb-466c-80a0-dbcc1f005123-rabbitmq-confd\") pod \"rabbitmq-cell1-server-0\" (UID: \"cca4ff23-cabb-466c-80a0-dbcc1f005123\") " pod="openstack/rabbitmq-cell1-server-0" Feb 26 16:02:02 crc kubenswrapper[4907]: I0226 16:02:02.748776 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/cca4ff23-cabb-466c-80a0-dbcc1f005123-plugins-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"cca4ff23-cabb-466c-80a0-dbcc1f005123\") " pod="openstack/rabbitmq-cell1-server-0" Feb 26 16:02:02 crc kubenswrapper[4907]: I0226 16:02:02.748792 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/cca4ff23-cabb-466c-80a0-dbcc1f005123-rabbitmq-plugins\") pod \"rabbitmq-cell1-server-0\" (UID: \"cca4ff23-cabb-466c-80a0-dbcc1f005123\") " pod="openstack/rabbitmq-cell1-server-0" Feb 26 16:02:02 crc kubenswrapper[4907]: I0226 16:02:02.748816 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/cca4ff23-cabb-466c-80a0-dbcc1f005123-rabbitmq-tls\") pod \"rabbitmq-cell1-server-0\" (UID: \"cca4ff23-cabb-466c-80a0-dbcc1f005123\") " pod="openstack/rabbitmq-cell1-server-0" Feb 26 16:02:02 crc kubenswrapper[4907]: I0226 16:02:02.748832 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/cca4ff23-cabb-466c-80a0-dbcc1f005123-pod-info\") pod 
\"rabbitmq-cell1-server-0\" (UID: \"cca4ff23-cabb-466c-80a0-dbcc1f005123\") " pod="openstack/rabbitmq-cell1-server-0" Feb 26 16:02:02 crc kubenswrapper[4907]: I0226 16:02:02.748860 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/cca4ff23-cabb-466c-80a0-dbcc1f005123-rabbitmq-erlang-cookie\") pod \"rabbitmq-cell1-server-0\" (UID: \"cca4ff23-cabb-466c-80a0-dbcc1f005123\") " pod="openstack/rabbitmq-cell1-server-0" Feb 26 16:02:02 crc kubenswrapper[4907]: I0226 16:02:02.748886 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/cca4ff23-cabb-466c-80a0-dbcc1f005123-erlang-cookie-secret\") pod \"rabbitmq-cell1-server-0\" (UID: \"cca4ff23-cabb-466c-80a0-dbcc1f005123\") " pod="openstack/rabbitmq-cell1-server-0" Feb 26 16:02:02 crc kubenswrapper[4907]: I0226 16:02:02.748905 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/cca4ff23-cabb-466c-80a0-dbcc1f005123-server-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"cca4ff23-cabb-466c-80a0-dbcc1f005123\") " pod="openstack/rabbitmq-cell1-server-0" Feb 26 16:02:02 crc kubenswrapper[4907]: I0226 16:02:02.748937 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage08-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage08-crc\") pod \"rabbitmq-cell1-server-0\" (UID: \"cca4ff23-cabb-466c-80a0-dbcc1f005123\") " pod="openstack/rabbitmq-cell1-server-0" Feb 26 16:02:02 crc kubenswrapper[4907]: I0226 16:02:02.748967 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/cca4ff23-cabb-466c-80a0-dbcc1f005123-config-data\") pod \"rabbitmq-cell1-server-0\" (UID: \"cca4ff23-cabb-466c-80a0-dbcc1f005123\") " 
pod="openstack/rabbitmq-cell1-server-0" Feb 26 16:02:02 crc kubenswrapper[4907]: I0226 16:02:02.749837 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/cca4ff23-cabb-466c-80a0-dbcc1f005123-config-data\") pod \"rabbitmq-cell1-server-0\" (UID: \"cca4ff23-cabb-466c-80a0-dbcc1f005123\") " pod="openstack/rabbitmq-cell1-server-0" Feb 26 16:02:02 crc kubenswrapper[4907]: I0226 16:02:02.750168 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/cca4ff23-cabb-466c-80a0-dbcc1f005123-rabbitmq-erlang-cookie\") pod \"rabbitmq-cell1-server-0\" (UID: \"cca4ff23-cabb-466c-80a0-dbcc1f005123\") " pod="openstack/rabbitmq-cell1-server-0" Feb 26 16:02:02 crc kubenswrapper[4907]: I0226 16:02:02.751493 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/cca4ff23-cabb-466c-80a0-dbcc1f005123-server-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"cca4ff23-cabb-466c-80a0-dbcc1f005123\") " pod="openstack/rabbitmq-cell1-server-0" Feb 26 16:02:02 crc kubenswrapper[4907]: I0226 16:02:02.751685 4907 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage08-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage08-crc\") pod \"rabbitmq-cell1-server-0\" (UID: \"cca4ff23-cabb-466c-80a0-dbcc1f005123\") device mount path \"/mnt/openstack/pv08\"" pod="openstack/rabbitmq-cell1-server-0" Feb 26 16:02:02 crc kubenswrapper[4907]: I0226 16:02:02.752994 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/cca4ff23-cabb-466c-80a0-dbcc1f005123-rabbitmq-plugins\") pod \"rabbitmq-cell1-server-0\" (UID: \"cca4ff23-cabb-466c-80a0-dbcc1f005123\") " pod="openstack/rabbitmq-cell1-server-0" Feb 26 16:02:02 crc kubenswrapper[4907]: I0226 16:02:02.753503 4907 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/cca4ff23-cabb-466c-80a0-dbcc1f005123-plugins-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"cca4ff23-cabb-466c-80a0-dbcc1f005123\") " pod="openstack/rabbitmq-cell1-server-0" Feb 26 16:02:02 crc kubenswrapper[4907]: I0226 16:02:02.757623 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/cca4ff23-cabb-466c-80a0-dbcc1f005123-erlang-cookie-secret\") pod \"rabbitmq-cell1-server-0\" (UID: \"cca4ff23-cabb-466c-80a0-dbcc1f005123\") " pod="openstack/rabbitmq-cell1-server-0" Feb 26 16:02:02 crc kubenswrapper[4907]: I0226 16:02:02.760946 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/cca4ff23-cabb-466c-80a0-dbcc1f005123-rabbitmq-confd\") pod \"rabbitmq-cell1-server-0\" (UID: \"cca4ff23-cabb-466c-80a0-dbcc1f005123\") " pod="openstack/rabbitmq-cell1-server-0" Feb 26 16:02:02 crc kubenswrapper[4907]: I0226 16:02:02.764894 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/cca4ff23-cabb-466c-80a0-dbcc1f005123-rabbitmq-tls\") pod \"rabbitmq-cell1-server-0\" (UID: \"cca4ff23-cabb-466c-80a0-dbcc1f005123\") " pod="openstack/rabbitmq-cell1-server-0" Feb 26 16:02:02 crc kubenswrapper[4907]: I0226 16:02:02.785298 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/cca4ff23-cabb-466c-80a0-dbcc1f005123-pod-info\") pod \"rabbitmq-cell1-server-0\" (UID: \"cca4ff23-cabb-466c-80a0-dbcc1f005123\") " pod="openstack/rabbitmq-cell1-server-0" Feb 26 16:02:02 crc kubenswrapper[4907]: I0226 16:02:02.785499 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-k6lgm\" (UniqueName: 
\"kubernetes.io/projected/cca4ff23-cabb-466c-80a0-dbcc1f005123-kube-api-access-k6lgm\") pod \"rabbitmq-cell1-server-0\" (UID: \"cca4ff23-cabb-466c-80a0-dbcc1f005123\") " pod="openstack/rabbitmq-cell1-server-0" Feb 26 16:02:02 crc kubenswrapper[4907]: I0226 16:02:02.809292 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage08-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage08-crc\") pod \"rabbitmq-cell1-server-0\" (UID: \"cca4ff23-cabb-466c-80a0-dbcc1f005123\") " pod="openstack/rabbitmq-cell1-server-0" Feb 26 16:02:02 crc kubenswrapper[4907]: I0226 16:02:02.883474 4907 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-cell1-server-0" Feb 26 16:02:03 crc kubenswrapper[4907]: I0226 16:02:03.234604 4907 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-server-0"] Feb 26 16:02:03 crc kubenswrapper[4907]: I0226 16:02:03.296313 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-57d769cc4f-4pkdt" event={"ID":"0d202a81-23b7-45d1-847c-81375db1f908","Type":"ContainerStarted","Data":"d2686a32fcdc47bdfa958b34ce2024f28a32c81ef797c262f375e32a1a24b1eb"} Feb 26 16:02:03 crc kubenswrapper[4907]: W0226 16:02:03.328149 4907 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod96ba881c_449c_4300_b67f_8a1e952af508.slice/crio-f30b4ab8c4da28e28cac478e136dd20082245886273226ca75977b1b06a3ebe1 WatchSource:0}: Error finding container f30b4ab8c4da28e28cac478e136dd20082245886273226ca75977b1b06a3ebe1: Status 404 returned error can't find the container with id f30b4ab8c4da28e28cac478e136dd20082245886273226ca75977b1b06a3ebe1 Feb 26 16:02:03 crc kubenswrapper[4907]: I0226 16:02:03.524333 4907 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Feb 26 16:02:03 crc kubenswrapper[4907]: I0226 16:02:03.760735 4907 kubelet.go:2421] "SyncLoop 
ADD" source="api" pods=["openstack/openstack-galera-0"] Feb 26 16:02:03 crc kubenswrapper[4907]: I0226 16:02:03.764190 4907 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/openstack-galera-0" Feb 26 16:02:03 crc kubenswrapper[4907]: I0226 16:02:03.767744 4907 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstack-galera-0"] Feb 26 16:02:03 crc kubenswrapper[4907]: I0226 16:02:03.769891 4907 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-galera-openstack-svc" Feb 26 16:02:03 crc kubenswrapper[4907]: I0226 16:02:03.769936 4907 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"galera-openstack-dockercfg-9cx9d" Feb 26 16:02:03 crc kubenswrapper[4907]: I0226 16:02:03.770242 4907 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-scripts" Feb 26 16:02:03 crc kubenswrapper[4907]: I0226 16:02:03.770262 4907 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-config-data" Feb 26 16:02:03 crc kubenswrapper[4907]: I0226 16:02:03.782239 4907 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"combined-ca-bundle" Feb 26 16:02:03 crc kubenswrapper[4907]: I0226 16:02:03.878854 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3fdde055-1569-4b2a-bc9f-893b93ee63b1-combined-ca-bundle\") pod \"openstack-galera-0\" (UID: \"3fdde055-1569-4b2a-bc9f-893b93ee63b1\") " pod="openstack/openstack-galera-0" Feb 26 16:02:03 crc kubenswrapper[4907]: I0226 16:02:03.878907 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-g5qs6\" (UniqueName: \"kubernetes.io/projected/3fdde055-1569-4b2a-bc9f-893b93ee63b1-kube-api-access-g5qs6\") pod \"openstack-galera-0\" (UID: \"3fdde055-1569-4b2a-bc9f-893b93ee63b1\") " 
pod="openstack/openstack-galera-0" Feb 26 16:02:03 crc kubenswrapper[4907]: I0226 16:02:03.878951 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage12-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage12-crc\") pod \"openstack-galera-0\" (UID: \"3fdde055-1569-4b2a-bc9f-893b93ee63b1\") " pod="openstack/openstack-galera-0" Feb 26 16:02:03 crc kubenswrapper[4907]: I0226 16:02:03.878976 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/3fdde055-1569-4b2a-bc9f-893b93ee63b1-kolla-config\") pod \"openstack-galera-0\" (UID: \"3fdde055-1569-4b2a-bc9f-893b93ee63b1\") " pod="openstack/openstack-galera-0" Feb 26 16:02:03 crc kubenswrapper[4907]: I0226 16:02:03.878998 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/3fdde055-1569-4b2a-bc9f-893b93ee63b1-config-data-generated\") pod \"openstack-galera-0\" (UID: \"3fdde055-1569-4b2a-bc9f-893b93ee63b1\") " pod="openstack/openstack-galera-0" Feb 26 16:02:03 crc kubenswrapper[4907]: I0226 16:02:03.879013 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/3fdde055-1569-4b2a-bc9f-893b93ee63b1-operator-scripts\") pod \"openstack-galera-0\" (UID: \"3fdde055-1569-4b2a-bc9f-893b93ee63b1\") " pod="openstack/openstack-galera-0" Feb 26 16:02:03 crc kubenswrapper[4907]: I0226 16:02:03.879035 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/3fdde055-1569-4b2a-bc9f-893b93ee63b1-config-data-default\") pod \"openstack-galera-0\" (UID: \"3fdde055-1569-4b2a-bc9f-893b93ee63b1\") " pod="openstack/openstack-galera-0" Feb 26 16:02:03 crc 
kubenswrapper[4907]: I0226 16:02:03.879061 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/3fdde055-1569-4b2a-bc9f-893b93ee63b1-galera-tls-certs\") pod \"openstack-galera-0\" (UID: \"3fdde055-1569-4b2a-bc9f-893b93ee63b1\") " pod="openstack/openstack-galera-0" Feb 26 16:02:03 crc kubenswrapper[4907]: I0226 16:02:03.987128 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3fdde055-1569-4b2a-bc9f-893b93ee63b1-combined-ca-bundle\") pod \"openstack-galera-0\" (UID: \"3fdde055-1569-4b2a-bc9f-893b93ee63b1\") " pod="openstack/openstack-galera-0" Feb 26 16:02:03 crc kubenswrapper[4907]: I0226 16:02:03.987180 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-g5qs6\" (UniqueName: \"kubernetes.io/projected/3fdde055-1569-4b2a-bc9f-893b93ee63b1-kube-api-access-g5qs6\") pod \"openstack-galera-0\" (UID: \"3fdde055-1569-4b2a-bc9f-893b93ee63b1\") " pod="openstack/openstack-galera-0" Feb 26 16:02:03 crc kubenswrapper[4907]: I0226 16:02:03.987224 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage12-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage12-crc\") pod \"openstack-galera-0\" (UID: \"3fdde055-1569-4b2a-bc9f-893b93ee63b1\") " pod="openstack/openstack-galera-0" Feb 26 16:02:03 crc kubenswrapper[4907]: I0226 16:02:03.987251 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/3fdde055-1569-4b2a-bc9f-893b93ee63b1-kolla-config\") pod \"openstack-galera-0\" (UID: \"3fdde055-1569-4b2a-bc9f-893b93ee63b1\") " pod="openstack/openstack-galera-0" Feb 26 16:02:03 crc kubenswrapper[4907]: I0226 16:02:03.987274 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/3fdde055-1569-4b2a-bc9f-893b93ee63b1-config-data-generated\") pod \"openstack-galera-0\" (UID: \"3fdde055-1569-4b2a-bc9f-893b93ee63b1\") " pod="openstack/openstack-galera-0" Feb 26 16:02:03 crc kubenswrapper[4907]: I0226 16:02:03.987290 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/3fdde055-1569-4b2a-bc9f-893b93ee63b1-operator-scripts\") pod \"openstack-galera-0\" (UID: \"3fdde055-1569-4b2a-bc9f-893b93ee63b1\") " pod="openstack/openstack-galera-0" Feb 26 16:02:03 crc kubenswrapper[4907]: I0226 16:02:03.988275 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/3fdde055-1569-4b2a-bc9f-893b93ee63b1-config-data-default\") pod \"openstack-galera-0\" (UID: \"3fdde055-1569-4b2a-bc9f-893b93ee63b1\") " pod="openstack/openstack-galera-0" Feb 26 16:02:03 crc kubenswrapper[4907]: I0226 16:02:03.988365 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/3fdde055-1569-4b2a-bc9f-893b93ee63b1-galera-tls-certs\") pod \"openstack-galera-0\" (UID: \"3fdde055-1569-4b2a-bc9f-893b93ee63b1\") " pod="openstack/openstack-galera-0" Feb 26 16:02:03 crc kubenswrapper[4907]: I0226 16:02:03.988559 4907 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage12-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage12-crc\") pod \"openstack-galera-0\" (UID: \"3fdde055-1569-4b2a-bc9f-893b93ee63b1\") device mount path \"/mnt/openstack/pv12\"" pod="openstack/openstack-galera-0" Feb 26 16:02:03 crc kubenswrapper[4907]: I0226 16:02:03.993686 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/3fdde055-1569-4b2a-bc9f-893b93ee63b1-kolla-config\") pod 
\"openstack-galera-0\" (UID: \"3fdde055-1569-4b2a-bc9f-893b93ee63b1\") " pod="openstack/openstack-galera-0" Feb 26 16:02:03 crc kubenswrapper[4907]: I0226 16:02:03.994042 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/3fdde055-1569-4b2a-bc9f-893b93ee63b1-operator-scripts\") pod \"openstack-galera-0\" (UID: \"3fdde055-1569-4b2a-bc9f-893b93ee63b1\") " pod="openstack/openstack-galera-0" Feb 26 16:02:03 crc kubenswrapper[4907]: I0226 16:02:03.994318 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/3fdde055-1569-4b2a-bc9f-893b93ee63b1-config-data-generated\") pod \"openstack-galera-0\" (UID: \"3fdde055-1569-4b2a-bc9f-893b93ee63b1\") " pod="openstack/openstack-galera-0" Feb 26 16:02:03 crc kubenswrapper[4907]: I0226 16:02:03.994851 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/3fdde055-1569-4b2a-bc9f-893b93ee63b1-config-data-default\") pod \"openstack-galera-0\" (UID: \"3fdde055-1569-4b2a-bc9f-893b93ee63b1\") " pod="openstack/openstack-galera-0" Feb 26 16:02:04 crc kubenswrapper[4907]: I0226 16:02:04.024388 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/3fdde055-1569-4b2a-bc9f-893b93ee63b1-galera-tls-certs\") pod \"openstack-galera-0\" (UID: \"3fdde055-1569-4b2a-bc9f-893b93ee63b1\") " pod="openstack/openstack-galera-0" Feb 26 16:02:04 crc kubenswrapper[4907]: I0226 16:02:04.031243 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3fdde055-1569-4b2a-bc9f-893b93ee63b1-combined-ca-bundle\") pod \"openstack-galera-0\" (UID: \"3fdde055-1569-4b2a-bc9f-893b93ee63b1\") " pod="openstack/openstack-galera-0" Feb 26 16:02:04 crc kubenswrapper[4907]: I0226 
16:02:04.050157 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-g5qs6\" (UniqueName: \"kubernetes.io/projected/3fdde055-1569-4b2a-bc9f-893b93ee63b1-kube-api-access-g5qs6\") pod \"openstack-galera-0\" (UID: \"3fdde055-1569-4b2a-bc9f-893b93ee63b1\") " pod="openstack/openstack-galera-0" Feb 26 16:02:04 crc kubenswrapper[4907]: I0226 16:02:04.084313 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage12-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage12-crc\") pod \"openstack-galera-0\" (UID: \"3fdde055-1569-4b2a-bc9f-893b93ee63b1\") " pod="openstack/openstack-galera-0" Feb 26 16:02:04 crc kubenswrapper[4907]: I0226 16:02:04.093474 4907 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/openstack-galera-0" Feb 26 16:02:04 crc kubenswrapper[4907]: I0226 16:02:04.309720 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"96ba881c-449c-4300-b67f-8a1e952af508","Type":"ContainerStarted","Data":"f30b4ab8c4da28e28cac478e136dd20082245886273226ca75977b1b06a3ebe1"} Feb 26 16:02:04 crc kubenswrapper[4907]: I0226 16:02:04.314891 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29535362-gch27" event={"ID":"1a0a1c21-7e2d-4053-b478-d6c6387f88d5","Type":"ContainerStarted","Data":"35b2e94e854ed9ebd2c97ada2a337dec8a20b1c6c7318b2961b0aeccfa450544"} Feb 26 16:02:04 crc kubenswrapper[4907]: I0226 16:02:04.321101 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"cca4ff23-cabb-466c-80a0-dbcc1f005123","Type":"ContainerStarted","Data":"5558e16d18eb38160b895fd9b45060360bf46597d980c8545363832abe43461f"} Feb 26 16:02:04 crc kubenswrapper[4907]: I0226 16:02:04.861554 4907 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstack-galera-0"] Feb 26 16:02:05 crc kubenswrapper[4907]: I0226 16:02:05.120049 4907 
kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/openstack-cell1-galera-0"] Feb 26 16:02:05 crc kubenswrapper[4907]: I0226 16:02:05.121573 4907 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/openstack-cell1-galera-0" Feb 26 16:02:05 crc kubenswrapper[4907]: I0226 16:02:05.126526 4907 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"galera-openstack-cell1-dockercfg-pntb4" Feb 26 16:02:05 crc kubenswrapper[4907]: I0226 16:02:05.126713 4907 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-cell1-config-data" Feb 26 16:02:05 crc kubenswrapper[4907]: I0226 16:02:05.126775 4907 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-galera-openstack-cell1-svc" Feb 26 16:02:05 crc kubenswrapper[4907]: I0226 16:02:05.126852 4907 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-cell1-scripts" Feb 26 16:02:05 crc kubenswrapper[4907]: I0226 16:02:05.146440 4907 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstack-cell1-galera-0"] Feb 26 16:02:05 crc kubenswrapper[4907]: I0226 16:02:05.229085 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7d7af39e-1222-4a40-a2f3-a644e2ef477d-combined-ca-bundle\") pod \"openstack-cell1-galera-0\" (UID: \"7d7af39e-1222-4a40-a2f3-a644e2ef477d\") " pod="openstack/openstack-cell1-galera-0" Feb 26 16:02:05 crc kubenswrapper[4907]: I0226 16:02:05.229147 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/7d7af39e-1222-4a40-a2f3-a644e2ef477d-kolla-config\") pod \"openstack-cell1-galera-0\" (UID: \"7d7af39e-1222-4a40-a2f3-a644e2ef477d\") " pod="openstack/openstack-cell1-galera-0" Feb 26 16:02:05 crc kubenswrapper[4907]: I0226 16:02:05.229174 
4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/7d7af39e-1222-4a40-a2f3-a644e2ef477d-config-data-default\") pod \"openstack-cell1-galera-0\" (UID: \"7d7af39e-1222-4a40-a2f3-a644e2ef477d\") " pod="openstack/openstack-cell1-galera-0" Feb 26 16:02:05 crc kubenswrapper[4907]: I0226 16:02:05.229191 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/7d7af39e-1222-4a40-a2f3-a644e2ef477d-galera-tls-certs\") pod \"openstack-cell1-galera-0\" (UID: \"7d7af39e-1222-4a40-a2f3-a644e2ef477d\") " pod="openstack/openstack-cell1-galera-0" Feb 26 16:02:05 crc kubenswrapper[4907]: I0226 16:02:05.229517 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/7d7af39e-1222-4a40-a2f3-a644e2ef477d-config-data-generated\") pod \"openstack-cell1-galera-0\" (UID: \"7d7af39e-1222-4a40-a2f3-a644e2ef477d\") " pod="openstack/openstack-cell1-galera-0" Feb 26 16:02:05 crc kubenswrapper[4907]: I0226 16:02:05.229559 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-sk98t\" (UniqueName: \"kubernetes.io/projected/7d7af39e-1222-4a40-a2f3-a644e2ef477d-kube-api-access-sk98t\") pod \"openstack-cell1-galera-0\" (UID: \"7d7af39e-1222-4a40-a2f3-a644e2ef477d\") " pod="openstack/openstack-cell1-galera-0" Feb 26 16:02:05 crc kubenswrapper[4907]: I0226 16:02:05.229949 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage10-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage10-crc\") pod \"openstack-cell1-galera-0\" (UID: \"7d7af39e-1222-4a40-a2f3-a644e2ef477d\") " pod="openstack/openstack-cell1-galera-0" Feb 26 16:02:05 crc kubenswrapper[4907]: I0226 
16:02:05.230012 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/7d7af39e-1222-4a40-a2f3-a644e2ef477d-operator-scripts\") pod \"openstack-cell1-galera-0\" (UID: \"7d7af39e-1222-4a40-a2f3-a644e2ef477d\") " pod="openstack/openstack-cell1-galera-0" Feb 26 16:02:05 crc kubenswrapper[4907]: I0226 16:02:05.331582 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/7d7af39e-1222-4a40-a2f3-a644e2ef477d-operator-scripts\") pod \"openstack-cell1-galera-0\" (UID: \"7d7af39e-1222-4a40-a2f3-a644e2ef477d\") " pod="openstack/openstack-cell1-galera-0" Feb 26 16:02:05 crc kubenswrapper[4907]: I0226 16:02:05.331676 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7d7af39e-1222-4a40-a2f3-a644e2ef477d-combined-ca-bundle\") pod \"openstack-cell1-galera-0\" (UID: \"7d7af39e-1222-4a40-a2f3-a644e2ef477d\") " pod="openstack/openstack-cell1-galera-0" Feb 26 16:02:05 crc kubenswrapper[4907]: I0226 16:02:05.331714 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/7d7af39e-1222-4a40-a2f3-a644e2ef477d-kolla-config\") pod \"openstack-cell1-galera-0\" (UID: \"7d7af39e-1222-4a40-a2f3-a644e2ef477d\") " pod="openstack/openstack-cell1-galera-0" Feb 26 16:02:05 crc kubenswrapper[4907]: I0226 16:02:05.331739 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/7d7af39e-1222-4a40-a2f3-a644e2ef477d-config-data-default\") pod \"openstack-cell1-galera-0\" (UID: \"7d7af39e-1222-4a40-a2f3-a644e2ef477d\") " pod="openstack/openstack-cell1-galera-0" Feb 26 16:02:05 crc kubenswrapper[4907]: I0226 16:02:05.331753 4907 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/7d7af39e-1222-4a40-a2f3-a644e2ef477d-galera-tls-certs\") pod \"openstack-cell1-galera-0\" (UID: \"7d7af39e-1222-4a40-a2f3-a644e2ef477d\") " pod="openstack/openstack-cell1-galera-0" Feb 26 16:02:05 crc kubenswrapper[4907]: I0226 16:02:05.331794 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/7d7af39e-1222-4a40-a2f3-a644e2ef477d-config-data-generated\") pod \"openstack-cell1-galera-0\" (UID: \"7d7af39e-1222-4a40-a2f3-a644e2ef477d\") " pod="openstack/openstack-cell1-galera-0" Feb 26 16:02:05 crc kubenswrapper[4907]: I0226 16:02:05.331812 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-sk98t\" (UniqueName: \"kubernetes.io/projected/7d7af39e-1222-4a40-a2f3-a644e2ef477d-kube-api-access-sk98t\") pod \"openstack-cell1-galera-0\" (UID: \"7d7af39e-1222-4a40-a2f3-a644e2ef477d\") " pod="openstack/openstack-cell1-galera-0" Feb 26 16:02:05 crc kubenswrapper[4907]: I0226 16:02:05.331831 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage10-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage10-crc\") pod \"openstack-cell1-galera-0\" (UID: \"7d7af39e-1222-4a40-a2f3-a644e2ef477d\") " pod="openstack/openstack-cell1-galera-0" Feb 26 16:02:05 crc kubenswrapper[4907]: I0226 16:02:05.332103 4907 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage10-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage10-crc\") pod \"openstack-cell1-galera-0\" (UID: \"7d7af39e-1222-4a40-a2f3-a644e2ef477d\") device mount path \"/mnt/openstack/pv10\"" pod="openstack/openstack-cell1-galera-0" Feb 26 16:02:05 crc kubenswrapper[4907]: I0226 16:02:05.334725 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" 
(UniqueName: \"kubernetes.io/configmap/7d7af39e-1222-4a40-a2f3-a644e2ef477d-operator-scripts\") pod \"openstack-cell1-galera-0\" (UID: \"7d7af39e-1222-4a40-a2f3-a644e2ef477d\") " pod="openstack/openstack-cell1-galera-0" Feb 26 16:02:05 crc kubenswrapper[4907]: I0226 16:02:05.335171 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/7d7af39e-1222-4a40-a2f3-a644e2ef477d-config-data-generated\") pod \"openstack-cell1-galera-0\" (UID: \"7d7af39e-1222-4a40-a2f3-a644e2ef477d\") " pod="openstack/openstack-cell1-galera-0" Feb 26 16:02:05 crc kubenswrapper[4907]: I0226 16:02:05.340949 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/7d7af39e-1222-4a40-a2f3-a644e2ef477d-kolla-config\") pod \"openstack-cell1-galera-0\" (UID: \"7d7af39e-1222-4a40-a2f3-a644e2ef477d\") " pod="openstack/openstack-cell1-galera-0" Feb 26 16:02:05 crc kubenswrapper[4907]: I0226 16:02:05.342176 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/7d7af39e-1222-4a40-a2f3-a644e2ef477d-config-data-default\") pod \"openstack-cell1-galera-0\" (UID: \"7d7af39e-1222-4a40-a2f3-a644e2ef477d\") " pod="openstack/openstack-cell1-galera-0" Feb 26 16:02:05 crc kubenswrapper[4907]: I0226 16:02:05.349670 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7d7af39e-1222-4a40-a2f3-a644e2ef477d-combined-ca-bundle\") pod \"openstack-cell1-galera-0\" (UID: \"7d7af39e-1222-4a40-a2f3-a644e2ef477d\") " pod="openstack/openstack-cell1-galera-0" Feb 26 16:02:05 crc kubenswrapper[4907]: I0226 16:02:05.364008 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/7d7af39e-1222-4a40-a2f3-a644e2ef477d-galera-tls-certs\") pod 
\"openstack-cell1-galera-0\" (UID: \"7d7af39e-1222-4a40-a2f3-a644e2ef477d\") " pod="openstack/openstack-cell1-galera-0" Feb 26 16:02:05 crc kubenswrapper[4907]: I0226 16:02:05.368925 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage10-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage10-crc\") pod \"openstack-cell1-galera-0\" (UID: \"7d7af39e-1222-4a40-a2f3-a644e2ef477d\") " pod="openstack/openstack-cell1-galera-0" Feb 26 16:02:05 crc kubenswrapper[4907]: I0226 16:02:05.370349 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-galera-0" event={"ID":"3fdde055-1569-4b2a-bc9f-893b93ee63b1","Type":"ContainerStarted","Data":"0946ce1d03a03b2fbd4cf2beea7864c92b018a553248bfe862b859429fc044ce"} Feb 26 16:02:05 crc kubenswrapper[4907]: I0226 16:02:05.381643 4907 generic.go:334] "Generic (PLEG): container finished" podID="1a0a1c21-7e2d-4053-b478-d6c6387f88d5" containerID="35b2e94e854ed9ebd2c97ada2a337dec8a20b1c6c7318b2961b0aeccfa450544" exitCode=0 Feb 26 16:02:05 crc kubenswrapper[4907]: I0226 16:02:05.381702 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29535362-gch27" event={"ID":"1a0a1c21-7e2d-4053-b478-d6c6387f88d5","Type":"ContainerDied","Data":"35b2e94e854ed9ebd2c97ada2a337dec8a20b1c6c7318b2961b0aeccfa450544"} Feb 26 16:02:05 crc kubenswrapper[4907]: I0226 16:02:05.413017 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-sk98t\" (UniqueName: \"kubernetes.io/projected/7d7af39e-1222-4a40-a2f3-a644e2ef477d-kube-api-access-sk98t\") pod \"openstack-cell1-galera-0\" (UID: \"7d7af39e-1222-4a40-a2f3-a644e2ef477d\") " pod="openstack/openstack-cell1-galera-0" Feb 26 16:02:05 crc kubenswrapper[4907]: I0226 16:02:05.485675 4907 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/openstack-cell1-galera-0" Feb 26 16:02:05 crc kubenswrapper[4907]: I0226 16:02:05.519260 4907 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/memcached-0"] Feb 26 16:02:05 crc kubenswrapper[4907]: I0226 16:02:05.520388 4907 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/memcached-0" Feb 26 16:02:05 crc kubenswrapper[4907]: I0226 16:02:05.526652 4907 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-memcached-svc" Feb 26 16:02:05 crc kubenswrapper[4907]: I0226 16:02:05.526651 4907 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"memcached-config-data" Feb 26 16:02:05 crc kubenswrapper[4907]: I0226 16:02:05.526756 4907 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"memcached-memcached-dockercfg-ghzrp" Feb 26 16:02:05 crc kubenswrapper[4907]: I0226 16:02:05.554438 4907 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/memcached-0"] Feb 26 16:02:05 crc kubenswrapper[4907]: I0226 16:02:05.635353 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/964032de-099d-4e22-95d5-d7acf78c5685-combined-ca-bundle\") pod \"memcached-0\" (UID: \"964032de-099d-4e22-95d5-d7acf78c5685\") " pod="openstack/memcached-0" Feb 26 16:02:05 crc kubenswrapper[4907]: I0226 16:02:05.635436 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dwpdk\" (UniqueName: \"kubernetes.io/projected/964032de-099d-4e22-95d5-d7acf78c5685-kube-api-access-dwpdk\") pod \"memcached-0\" (UID: \"964032de-099d-4e22-95d5-d7acf78c5685\") " pod="openstack/memcached-0" Feb 26 16:02:05 crc kubenswrapper[4907]: I0226 16:02:05.635571 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kolla-config\" (UniqueName: 
\"kubernetes.io/configmap/964032de-099d-4e22-95d5-d7acf78c5685-kolla-config\") pod \"memcached-0\" (UID: \"964032de-099d-4e22-95d5-d7acf78c5685\") " pod="openstack/memcached-0" Feb 26 16:02:05 crc kubenswrapper[4907]: I0226 16:02:05.635688 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/964032de-099d-4e22-95d5-d7acf78c5685-config-data\") pod \"memcached-0\" (UID: \"964032de-099d-4e22-95d5-d7acf78c5685\") " pod="openstack/memcached-0" Feb 26 16:02:05 crc kubenswrapper[4907]: I0226 16:02:05.635783 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"memcached-tls-certs\" (UniqueName: \"kubernetes.io/secret/964032de-099d-4e22-95d5-d7acf78c5685-memcached-tls-certs\") pod \"memcached-0\" (UID: \"964032de-099d-4e22-95d5-d7acf78c5685\") " pod="openstack/memcached-0" Feb 26 16:02:05 crc kubenswrapper[4907]: I0226 16:02:05.738741 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/964032de-099d-4e22-95d5-d7acf78c5685-combined-ca-bundle\") pod \"memcached-0\" (UID: \"964032de-099d-4e22-95d5-d7acf78c5685\") " pod="openstack/memcached-0" Feb 26 16:02:05 crc kubenswrapper[4907]: I0226 16:02:05.738816 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dwpdk\" (UniqueName: \"kubernetes.io/projected/964032de-099d-4e22-95d5-d7acf78c5685-kube-api-access-dwpdk\") pod \"memcached-0\" (UID: \"964032de-099d-4e22-95d5-d7acf78c5685\") " pod="openstack/memcached-0" Feb 26 16:02:05 crc kubenswrapper[4907]: I0226 16:02:05.738853 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/964032de-099d-4e22-95d5-d7acf78c5685-kolla-config\") pod \"memcached-0\" (UID: \"964032de-099d-4e22-95d5-d7acf78c5685\") " 
pod="openstack/memcached-0" Feb 26 16:02:05 crc kubenswrapper[4907]: I0226 16:02:05.738897 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/964032de-099d-4e22-95d5-d7acf78c5685-config-data\") pod \"memcached-0\" (UID: \"964032de-099d-4e22-95d5-d7acf78c5685\") " pod="openstack/memcached-0" Feb 26 16:02:05 crc kubenswrapper[4907]: I0226 16:02:05.738941 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"memcached-tls-certs\" (UniqueName: \"kubernetes.io/secret/964032de-099d-4e22-95d5-d7acf78c5685-memcached-tls-certs\") pod \"memcached-0\" (UID: \"964032de-099d-4e22-95d5-d7acf78c5685\") " pod="openstack/memcached-0" Feb 26 16:02:05 crc kubenswrapper[4907]: I0226 16:02:05.740164 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/964032de-099d-4e22-95d5-d7acf78c5685-config-data\") pod \"memcached-0\" (UID: \"964032de-099d-4e22-95d5-d7acf78c5685\") " pod="openstack/memcached-0" Feb 26 16:02:05 crc kubenswrapper[4907]: I0226 16:02:05.741376 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/964032de-099d-4e22-95d5-d7acf78c5685-kolla-config\") pod \"memcached-0\" (UID: \"964032de-099d-4e22-95d5-d7acf78c5685\") " pod="openstack/memcached-0" Feb 26 16:02:05 crc kubenswrapper[4907]: I0226 16:02:05.757051 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/964032de-099d-4e22-95d5-d7acf78c5685-combined-ca-bundle\") pod \"memcached-0\" (UID: \"964032de-099d-4e22-95d5-d7acf78c5685\") " pod="openstack/memcached-0" Feb 26 16:02:05 crc kubenswrapper[4907]: I0226 16:02:05.764394 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"memcached-tls-certs\" (UniqueName: 
\"kubernetes.io/secret/964032de-099d-4e22-95d5-d7acf78c5685-memcached-tls-certs\") pod \"memcached-0\" (UID: \"964032de-099d-4e22-95d5-d7acf78c5685\") " pod="openstack/memcached-0" Feb 26 16:02:05 crc kubenswrapper[4907]: I0226 16:02:05.772416 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dwpdk\" (UniqueName: \"kubernetes.io/projected/964032de-099d-4e22-95d5-d7acf78c5685-kube-api-access-dwpdk\") pod \"memcached-0\" (UID: \"964032de-099d-4e22-95d5-d7acf78c5685\") " pod="openstack/memcached-0" Feb 26 16:02:05 crc kubenswrapper[4907]: I0226 16:02:05.955733 4907 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/memcached-0" Feb 26 16:02:06 crc kubenswrapper[4907]: I0226 16:02:06.168754 4907 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstack-cell1-galera-0"] Feb 26 16:02:06 crc kubenswrapper[4907]: W0226 16:02:06.192263 4907 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod7d7af39e_1222_4a40_a2f3_a644e2ef477d.slice/crio-8022e48cc12a6f0197904e83b595b337dc80a14bc7642a13ffa22249c1948272 WatchSource:0}: Error finding container 8022e48cc12a6f0197904e83b595b337dc80a14bc7642a13ffa22249c1948272: Status 404 returned error can't find the container with id 8022e48cc12a6f0197904e83b595b337dc80a14bc7642a13ffa22249c1948272 Feb 26 16:02:06 crc kubenswrapper[4907]: I0226 16:02:06.397928 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-cell1-galera-0" event={"ID":"7d7af39e-1222-4a40-a2f3-a644e2ef477d","Type":"ContainerStarted","Data":"8022e48cc12a6f0197904e83b595b337dc80a14bc7642a13ffa22249c1948272"} Feb 26 16:02:06 crc kubenswrapper[4907]: I0226 16:02:06.576132 4907 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/memcached-0"] Feb 26 16:02:06 crc kubenswrapper[4907]: I0226 16:02:06.978248 4907 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29535362-gch27" Feb 26 16:02:07 crc kubenswrapper[4907]: I0226 16:02:07.093728 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wjmlf\" (UniqueName: \"kubernetes.io/projected/1a0a1c21-7e2d-4053-b478-d6c6387f88d5-kube-api-access-wjmlf\") pod \"1a0a1c21-7e2d-4053-b478-d6c6387f88d5\" (UID: \"1a0a1c21-7e2d-4053-b478-d6c6387f88d5\") " Feb 26 16:02:07 crc kubenswrapper[4907]: I0226 16:02:07.100513 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1a0a1c21-7e2d-4053-b478-d6c6387f88d5-kube-api-access-wjmlf" (OuterVolumeSpecName: "kube-api-access-wjmlf") pod "1a0a1c21-7e2d-4053-b478-d6c6387f88d5" (UID: "1a0a1c21-7e2d-4053-b478-d6c6387f88d5"). InnerVolumeSpecName "kube-api-access-wjmlf". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 26 16:02:07 crc kubenswrapper[4907]: I0226 16:02:07.196433 4907 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-wjmlf\" (UniqueName: \"kubernetes.io/projected/1a0a1c21-7e2d-4053-b478-d6c6387f88d5-kube-api-access-wjmlf\") on node \"crc\" DevicePath \"\"" Feb 26 16:02:07 crc kubenswrapper[4907]: I0226 16:02:07.415403 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/memcached-0" event={"ID":"964032de-099d-4e22-95d5-d7acf78c5685","Type":"ContainerStarted","Data":"f778d89aa3d453cb9c3e4fa245c2bb6173a840d2ee3184e8db7171fc15d5394f"} Feb 26 16:02:07 crc kubenswrapper[4907]: I0226 16:02:07.421862 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29535362-gch27" event={"ID":"1a0a1c21-7e2d-4053-b478-d6c6387f88d5","Type":"ContainerDied","Data":"bcd7ebd631d509077ebeac269f4681696c6e2ea03021ee20af3ae8db33ceb410"} Feb 26 16:02:07 crc kubenswrapper[4907]: I0226 16:02:07.421893 4907 pod_container_deletor.go:80] "Container not found in pod's containers" 
containerID="bcd7ebd631d509077ebeac269f4681696c6e2ea03021ee20af3ae8db33ceb410" Feb 26 16:02:07 crc kubenswrapper[4907]: I0226 16:02:07.421944 4907 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29535362-gch27" Feb 26 16:02:07 crc kubenswrapper[4907]: I0226 16:02:07.897062 4907 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/kube-state-metrics-0"] Feb 26 16:02:07 crc kubenswrapper[4907]: E0226 16:02:07.903196 4907 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1a0a1c21-7e2d-4053-b478-d6c6387f88d5" containerName="oc" Feb 26 16:02:07 crc kubenswrapper[4907]: I0226 16:02:07.903227 4907 state_mem.go:107] "Deleted CPUSet assignment" podUID="1a0a1c21-7e2d-4053-b478-d6c6387f88d5" containerName="oc" Feb 26 16:02:07 crc kubenswrapper[4907]: I0226 16:02:07.903690 4907 memory_manager.go:354] "RemoveStaleState removing state" podUID="1a0a1c21-7e2d-4053-b478-d6c6387f88d5" containerName="oc" Feb 26 16:02:07 crc kubenswrapper[4907]: I0226 16:02:07.905689 4907 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/kube-state-metrics-0" Feb 26 16:02:07 crc kubenswrapper[4907]: I0226 16:02:07.935893 4907 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"telemetry-ceilometer-dockercfg-h582z" Feb 26 16:02:07 crc kubenswrapper[4907]: I0226 16:02:07.942786 4907 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/kube-state-metrics-0"] Feb 26 16:02:08 crc kubenswrapper[4907]: I0226 16:02:08.029163 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zp248\" (UniqueName: \"kubernetes.io/projected/3623ea59-40fb-48f6-943d-1ea5fe3ad253-kube-api-access-zp248\") pod \"kube-state-metrics-0\" (UID: \"3623ea59-40fb-48f6-943d-1ea5fe3ad253\") " pod="openstack/kube-state-metrics-0" Feb 26 16:02:08 crc kubenswrapper[4907]: I0226 16:02:08.065693 4907 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29535356-g8895"] Feb 26 16:02:08 crc kubenswrapper[4907]: I0226 16:02:08.074538 4907 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29535356-g8895"] Feb 26 16:02:08 crc kubenswrapper[4907]: I0226 16:02:08.135808 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zp248\" (UniqueName: \"kubernetes.io/projected/3623ea59-40fb-48f6-943d-1ea5fe3ad253-kube-api-access-zp248\") pod \"kube-state-metrics-0\" (UID: \"3623ea59-40fb-48f6-943d-1ea5fe3ad253\") " pod="openstack/kube-state-metrics-0" Feb 26 16:02:08 crc kubenswrapper[4907]: I0226 16:02:08.147001 4907 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6d0a628a-4c00-4b3a-8710-9ac6a6880844" path="/var/lib/kubelet/pods/6d0a628a-4c00-4b3a-8710-9ac6a6880844/volumes" Feb 26 16:02:08 crc kubenswrapper[4907]: I0226 16:02:08.152254 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zp248\" (UniqueName: 
\"kubernetes.io/projected/3623ea59-40fb-48f6-943d-1ea5fe3ad253-kube-api-access-zp248\") pod \"kube-state-metrics-0\" (UID: \"3623ea59-40fb-48f6-943d-1ea5fe3ad253\") " pod="openstack/kube-state-metrics-0" Feb 26 16:02:08 crc kubenswrapper[4907]: I0226 16:02:08.230097 4907 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/kube-state-metrics-0" Feb 26 16:02:09 crc kubenswrapper[4907]: I0226 16:02:09.091698 4907 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/kube-state-metrics-0"] Feb 26 16:02:10 crc kubenswrapper[4907]: I0226 16:02:10.755169 4907 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovn-controller-drng5"] Feb 26 16:02:10 crc kubenswrapper[4907]: I0226 16:02:10.756662 4907 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-drng5" Feb 26 16:02:10 crc kubenswrapper[4907]: I0226 16:02:10.766550 4907 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovncontroller-scripts" Feb 26 16:02:10 crc kubenswrapper[4907]: I0226 16:02:10.766610 4907 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ovncontroller-ovndbs" Feb 26 16:02:10 crc kubenswrapper[4907]: I0226 16:02:10.766967 4907 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ovncontroller-ovncontroller-dockercfg-nkr6q" Feb 26 16:02:10 crc kubenswrapper[4907]: I0226 16:02:10.792054 4907 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovn-controller-ovs-9qr64"] Feb 26 16:02:10 crc kubenswrapper[4907]: I0226 16:02:10.796522 4907 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-controller-ovs-9qr64" Feb 26 16:02:10 crc kubenswrapper[4907]: I0226 16:02:10.856254 4907 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-drng5"] Feb 26 16:02:10 crc kubenswrapper[4907]: I0226 16:02:10.922875 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/66d3c733-f440-4877-9e7b-af62f5dc7857-var-run\") pod \"ovn-controller-drng5\" (UID: \"66d3c733-f440-4877-9e7b-af62f5dc7857\") " pod="openstack/ovn-controller-drng5" Feb 26 16:02:10 crc kubenswrapper[4907]: I0226 16:02:10.922925 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/66d3c733-f440-4877-9e7b-af62f5dc7857-var-log-ovn\") pod \"ovn-controller-drng5\" (UID: \"66d3c733-f440-4877-9e7b-af62f5dc7857\") " pod="openstack/ovn-controller-drng5" Feb 26 16:02:10 crc kubenswrapper[4907]: I0226 16:02:10.923003 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/ce0f1161-6251-4318-b364-7db1779f93bd-var-run\") pod \"ovn-controller-ovs-9qr64\" (UID: \"ce0f1161-6251-4318-b364-7db1779f93bd\") " pod="openstack/ovn-controller-ovs-9qr64" Feb 26 16:02:10 crc kubenswrapper[4907]: I0226 16:02:10.923046 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/66d3c733-f440-4877-9e7b-af62f5dc7857-var-run-ovn\") pod \"ovn-controller-drng5\" (UID: \"66d3c733-f440-4877-9e7b-af62f5dc7857\") " pod="openstack/ovn-controller-drng5" Feb 26 16:02:10 crc kubenswrapper[4907]: I0226 16:02:10.923097 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kwcvf\" (UniqueName: 
\"kubernetes.io/projected/ce0f1161-6251-4318-b364-7db1779f93bd-kube-api-access-kwcvf\") pod \"ovn-controller-ovs-9qr64\" (UID: \"ce0f1161-6251-4318-b364-7db1779f93bd\") " pod="openstack/ovn-controller-ovs-9qr64" Feb 26 16:02:10 crc kubenswrapper[4907]: I0226 16:02:10.923133 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-ovs\" (UniqueName: \"kubernetes.io/host-path/ce0f1161-6251-4318-b364-7db1779f93bd-etc-ovs\") pod \"ovn-controller-ovs-9qr64\" (UID: \"ce0f1161-6251-4318-b364-7db1779f93bd\") " pod="openstack/ovn-controller-ovs-9qr64" Feb 26 16:02:10 crc kubenswrapper[4907]: I0226 16:02:10.923208 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-controller-tls-certs\" (UniqueName: \"kubernetes.io/secret/66d3c733-f440-4877-9e7b-af62f5dc7857-ovn-controller-tls-certs\") pod \"ovn-controller-drng5\" (UID: \"66d3c733-f440-4877-9e7b-af62f5dc7857\") " pod="openstack/ovn-controller-drng5" Feb 26 16:02:10 crc kubenswrapper[4907]: I0226 16:02:10.923242 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib\" (UniqueName: \"kubernetes.io/host-path/ce0f1161-6251-4318-b364-7db1779f93bd-var-lib\") pod \"ovn-controller-ovs-9qr64\" (UID: \"ce0f1161-6251-4318-b364-7db1779f93bd\") " pod="openstack/ovn-controller-ovs-9qr64" Feb 26 16:02:10 crc kubenswrapper[4907]: I0226 16:02:10.923290 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/66d3c733-f440-4877-9e7b-af62f5dc7857-scripts\") pod \"ovn-controller-drng5\" (UID: \"66d3c733-f440-4877-9e7b-af62f5dc7857\") " pod="openstack/ovn-controller-drng5" Feb 26 16:02:10 crc kubenswrapper[4907]: I0226 16:02:10.923325 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-log\" (UniqueName: 
\"kubernetes.io/host-path/ce0f1161-6251-4318-b364-7db1779f93bd-var-log\") pod \"ovn-controller-ovs-9qr64\" (UID: \"ce0f1161-6251-4318-b364-7db1779f93bd\") " pod="openstack/ovn-controller-ovs-9qr64" Feb 26 16:02:10 crc kubenswrapper[4907]: I0226 16:02:10.923347 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/ce0f1161-6251-4318-b364-7db1779f93bd-scripts\") pod \"ovn-controller-ovs-9qr64\" (UID: \"ce0f1161-6251-4318-b364-7db1779f93bd\") " pod="openstack/ovn-controller-ovs-9qr64" Feb 26 16:02:10 crc kubenswrapper[4907]: I0226 16:02:10.923363 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-t66tx\" (UniqueName: \"kubernetes.io/projected/66d3c733-f440-4877-9e7b-af62f5dc7857-kube-api-access-t66tx\") pod \"ovn-controller-drng5\" (UID: \"66d3c733-f440-4877-9e7b-af62f5dc7857\") " pod="openstack/ovn-controller-drng5" Feb 26 16:02:10 crc kubenswrapper[4907]: I0226 16:02:10.923435 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/66d3c733-f440-4877-9e7b-af62f5dc7857-combined-ca-bundle\") pod \"ovn-controller-drng5\" (UID: \"66d3c733-f440-4877-9e7b-af62f5dc7857\") " pod="openstack/ovn-controller-drng5" Feb 26 16:02:10 crc kubenswrapper[4907]: I0226 16:02:10.924625 4907 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-ovs-9qr64"] Feb 26 16:02:11 crc kubenswrapper[4907]: I0226 16:02:11.024714 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-controller-tls-certs\" (UniqueName: \"kubernetes.io/secret/66d3c733-f440-4877-9e7b-af62f5dc7857-ovn-controller-tls-certs\") pod \"ovn-controller-drng5\" (UID: \"66d3c733-f440-4877-9e7b-af62f5dc7857\") " pod="openstack/ovn-controller-drng5" Feb 26 16:02:11 crc kubenswrapper[4907]: I0226 16:02:11.024769 
4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lib\" (UniqueName: \"kubernetes.io/host-path/ce0f1161-6251-4318-b364-7db1779f93bd-var-lib\") pod \"ovn-controller-ovs-9qr64\" (UID: \"ce0f1161-6251-4318-b364-7db1779f93bd\") " pod="openstack/ovn-controller-ovs-9qr64" Feb 26 16:02:11 crc kubenswrapper[4907]: I0226 16:02:11.024818 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/66d3c733-f440-4877-9e7b-af62f5dc7857-scripts\") pod \"ovn-controller-drng5\" (UID: \"66d3c733-f440-4877-9e7b-af62f5dc7857\") " pod="openstack/ovn-controller-drng5" Feb 26 16:02:11 crc kubenswrapper[4907]: I0226 16:02:11.024841 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/ce0f1161-6251-4318-b364-7db1779f93bd-var-log\") pod \"ovn-controller-ovs-9qr64\" (UID: \"ce0f1161-6251-4318-b364-7db1779f93bd\") " pod="openstack/ovn-controller-ovs-9qr64" Feb 26 16:02:11 crc kubenswrapper[4907]: I0226 16:02:11.024859 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/ce0f1161-6251-4318-b364-7db1779f93bd-scripts\") pod \"ovn-controller-ovs-9qr64\" (UID: \"ce0f1161-6251-4318-b364-7db1779f93bd\") " pod="openstack/ovn-controller-ovs-9qr64" Feb 26 16:02:11 crc kubenswrapper[4907]: I0226 16:02:11.024873 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-t66tx\" (UniqueName: \"kubernetes.io/projected/66d3c733-f440-4877-9e7b-af62f5dc7857-kube-api-access-t66tx\") pod \"ovn-controller-drng5\" (UID: \"66d3c733-f440-4877-9e7b-af62f5dc7857\") " pod="openstack/ovn-controller-drng5" Feb 26 16:02:11 crc kubenswrapper[4907]: I0226 16:02:11.024904 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/66d3c733-f440-4877-9e7b-af62f5dc7857-combined-ca-bundle\") pod \"ovn-controller-drng5\" (UID: \"66d3c733-f440-4877-9e7b-af62f5dc7857\") " pod="openstack/ovn-controller-drng5" Feb 26 16:02:11 crc kubenswrapper[4907]: I0226 16:02:11.024924 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/66d3c733-f440-4877-9e7b-af62f5dc7857-var-run\") pod \"ovn-controller-drng5\" (UID: \"66d3c733-f440-4877-9e7b-af62f5dc7857\") " pod="openstack/ovn-controller-drng5" Feb 26 16:02:11 crc kubenswrapper[4907]: I0226 16:02:11.024940 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/66d3c733-f440-4877-9e7b-af62f5dc7857-var-log-ovn\") pod \"ovn-controller-drng5\" (UID: \"66d3c733-f440-4877-9e7b-af62f5dc7857\") " pod="openstack/ovn-controller-drng5" Feb 26 16:02:11 crc kubenswrapper[4907]: I0226 16:02:11.024970 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/ce0f1161-6251-4318-b364-7db1779f93bd-var-run\") pod \"ovn-controller-ovs-9qr64\" (UID: \"ce0f1161-6251-4318-b364-7db1779f93bd\") " pod="openstack/ovn-controller-ovs-9qr64" Feb 26 16:02:11 crc kubenswrapper[4907]: I0226 16:02:11.024989 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/66d3c733-f440-4877-9e7b-af62f5dc7857-var-run-ovn\") pod \"ovn-controller-drng5\" (UID: \"66d3c733-f440-4877-9e7b-af62f5dc7857\") " pod="openstack/ovn-controller-drng5" Feb 26 16:02:11 crc kubenswrapper[4907]: I0226 16:02:11.025009 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kwcvf\" (UniqueName: \"kubernetes.io/projected/ce0f1161-6251-4318-b364-7db1779f93bd-kube-api-access-kwcvf\") pod \"ovn-controller-ovs-9qr64\" (UID: 
\"ce0f1161-6251-4318-b364-7db1779f93bd\") " pod="openstack/ovn-controller-ovs-9qr64" Feb 26 16:02:11 crc kubenswrapper[4907]: I0226 16:02:11.025029 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-ovs\" (UniqueName: \"kubernetes.io/host-path/ce0f1161-6251-4318-b364-7db1779f93bd-etc-ovs\") pod \"ovn-controller-ovs-9qr64\" (UID: \"ce0f1161-6251-4318-b364-7db1779f93bd\") " pod="openstack/ovn-controller-ovs-9qr64" Feb 26 16:02:11 crc kubenswrapper[4907]: I0226 16:02:11.025521 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-ovs\" (UniqueName: \"kubernetes.io/host-path/ce0f1161-6251-4318-b364-7db1779f93bd-etc-ovs\") pod \"ovn-controller-ovs-9qr64\" (UID: \"ce0f1161-6251-4318-b364-7db1779f93bd\") " pod="openstack/ovn-controller-ovs-9qr64" Feb 26 16:02:11 crc kubenswrapper[4907]: I0226 16:02:11.030696 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-controller-tls-certs\" (UniqueName: \"kubernetes.io/secret/66d3c733-f440-4877-9e7b-af62f5dc7857-ovn-controller-tls-certs\") pod \"ovn-controller-drng5\" (UID: \"66d3c733-f440-4877-9e7b-af62f5dc7857\") " pod="openstack/ovn-controller-drng5" Feb 26 16:02:11 crc kubenswrapper[4907]: I0226 16:02:11.030874 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/66d3c733-f440-4877-9e7b-af62f5dc7857-var-run\") pod \"ovn-controller-drng5\" (UID: \"66d3c733-f440-4877-9e7b-af62f5dc7857\") " pod="openstack/ovn-controller-drng5" Feb 26 16:02:11 crc kubenswrapper[4907]: I0226 16:02:11.030975 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/66d3c733-f440-4877-9e7b-af62f5dc7857-var-log-ovn\") pod \"ovn-controller-drng5\" (UID: \"66d3c733-f440-4877-9e7b-af62f5dc7857\") " pod="openstack/ovn-controller-drng5" Feb 26 16:02:11 crc kubenswrapper[4907]: I0226 16:02:11.031016 4907 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/ce0f1161-6251-4318-b364-7db1779f93bd-var-run\") pod \"ovn-controller-ovs-9qr64\" (UID: \"ce0f1161-6251-4318-b364-7db1779f93bd\") " pod="openstack/ovn-controller-ovs-9qr64" Feb 26 16:02:11 crc kubenswrapper[4907]: I0226 16:02:11.031080 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/66d3c733-f440-4877-9e7b-af62f5dc7857-var-run-ovn\") pod \"ovn-controller-drng5\" (UID: \"66d3c733-f440-4877-9e7b-af62f5dc7857\") " pod="openstack/ovn-controller-drng5" Feb 26 16:02:11 crc kubenswrapper[4907]: I0226 16:02:11.031471 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/ce0f1161-6251-4318-b364-7db1779f93bd-var-log\") pod \"ovn-controller-ovs-9qr64\" (UID: \"ce0f1161-6251-4318-b364-7db1779f93bd\") " pod="openstack/ovn-controller-ovs-9qr64" Feb 26 16:02:11 crc kubenswrapper[4907]: I0226 16:02:11.031577 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lib\" (UniqueName: \"kubernetes.io/host-path/ce0f1161-6251-4318-b364-7db1779f93bd-var-lib\") pod \"ovn-controller-ovs-9qr64\" (UID: \"ce0f1161-6251-4318-b364-7db1779f93bd\") " pod="openstack/ovn-controller-ovs-9qr64" Feb 26 16:02:11 crc kubenswrapper[4907]: I0226 16:02:11.033529 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/66d3c733-f440-4877-9e7b-af62f5dc7857-scripts\") pod \"ovn-controller-drng5\" (UID: \"66d3c733-f440-4877-9e7b-af62f5dc7857\") " pod="openstack/ovn-controller-drng5" Feb 26 16:02:11 crc kubenswrapper[4907]: I0226 16:02:11.035514 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/ce0f1161-6251-4318-b364-7db1779f93bd-scripts\") pod \"ovn-controller-ovs-9qr64\" (UID: 
\"ce0f1161-6251-4318-b364-7db1779f93bd\") " pod="openstack/ovn-controller-ovs-9qr64" Feb 26 16:02:11 crc kubenswrapper[4907]: I0226 16:02:11.057254 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/66d3c733-f440-4877-9e7b-af62f5dc7857-combined-ca-bundle\") pod \"ovn-controller-drng5\" (UID: \"66d3c733-f440-4877-9e7b-af62f5dc7857\") " pod="openstack/ovn-controller-drng5" Feb 26 16:02:11 crc kubenswrapper[4907]: I0226 16:02:11.069484 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kwcvf\" (UniqueName: \"kubernetes.io/projected/ce0f1161-6251-4318-b364-7db1779f93bd-kube-api-access-kwcvf\") pod \"ovn-controller-ovs-9qr64\" (UID: \"ce0f1161-6251-4318-b364-7db1779f93bd\") " pod="openstack/ovn-controller-ovs-9qr64" Feb 26 16:02:11 crc kubenswrapper[4907]: I0226 16:02:11.087981 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-t66tx\" (UniqueName: \"kubernetes.io/projected/66d3c733-f440-4877-9e7b-af62f5dc7857-kube-api-access-t66tx\") pod \"ovn-controller-drng5\" (UID: \"66d3c733-f440-4877-9e7b-af62f5dc7857\") " pod="openstack/ovn-controller-drng5" Feb 26 16:02:11 crc kubenswrapper[4907]: I0226 16:02:11.093121 4907 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-drng5" Feb 26 16:02:11 crc kubenswrapper[4907]: I0226 16:02:11.137267 4907 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-ovs-9qr64" Feb 26 16:02:11 crc kubenswrapper[4907]: I0226 16:02:11.965687 4907 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovsdbserver-nb-0"] Feb 26 16:02:11 crc kubenswrapper[4907]: I0226 16:02:11.966973 4907 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovsdbserver-nb-0" Feb 26 16:02:11 crc kubenswrapper[4907]: I0226 16:02:11.969334 4907 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ovncluster-ovndbcluster-nb-dockercfg-rqdjz" Feb 26 16:02:11 crc kubenswrapper[4907]: I0226 16:02:11.969362 4907 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ovn-metrics" Feb 26 16:02:11 crc kubenswrapper[4907]: I0226 16:02:11.969543 4907 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovndbcluster-nb-scripts" Feb 26 16:02:11 crc kubenswrapper[4907]: I0226 16:02:11.970824 4907 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovndbcluster-nb-config" Feb 26 16:02:11 crc kubenswrapper[4907]: I0226 16:02:11.971121 4907 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ovndbcluster-nb-ovndbs" Feb 26 16:02:11 crc kubenswrapper[4907]: I0226 16:02:11.975301 4907 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovsdbserver-nb-0"] Feb 26 16:02:12 crc kubenswrapper[4907]: I0226 16:02:12.144781 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage11-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage11-crc\") pod \"ovsdbserver-nb-0\" (UID: \"a7d66633-e694-4e7e-ba21-70dc18b93cfb\") " pod="openstack/ovsdbserver-nb-0" Feb 26 16:02:12 crc kubenswrapper[4907]: I0226 16:02:12.144854 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a7d66633-e694-4e7e-ba21-70dc18b93cfb-combined-ca-bundle\") pod \"ovsdbserver-nb-0\" (UID: \"a7d66633-e694-4e7e-ba21-70dc18b93cfb\") " pod="openstack/ovsdbserver-nb-0" Feb 26 16:02:12 crc kubenswrapper[4907]: I0226 16:02:12.144884 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/a7d66633-e694-4e7e-ba21-70dc18b93cfb-metrics-certs-tls-certs\") pod \"ovsdbserver-nb-0\" (UID: \"a7d66633-e694-4e7e-ba21-70dc18b93cfb\") " pod="openstack/ovsdbserver-nb-0" Feb 26 16:02:12 crc kubenswrapper[4907]: I0226 16:02:12.145052 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/a7d66633-e694-4e7e-ba21-70dc18b93cfb-ovsdb-rundir\") pod \"ovsdbserver-nb-0\" (UID: \"a7d66633-e694-4e7e-ba21-70dc18b93cfb\") " pod="openstack/ovsdbserver-nb-0" Feb 26 16:02:12 crc kubenswrapper[4907]: I0226 16:02:12.145166 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xp8vv\" (UniqueName: \"kubernetes.io/projected/a7d66633-e694-4e7e-ba21-70dc18b93cfb-kube-api-access-xp8vv\") pod \"ovsdbserver-nb-0\" (UID: \"a7d66633-e694-4e7e-ba21-70dc18b93cfb\") " pod="openstack/ovsdbserver-nb-0" Feb 26 16:02:12 crc kubenswrapper[4907]: I0226 16:02:12.145200 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/a7d66633-e694-4e7e-ba21-70dc18b93cfb-scripts\") pod \"ovsdbserver-nb-0\" (UID: \"a7d66633-e694-4e7e-ba21-70dc18b93cfb\") " pod="openstack/ovsdbserver-nb-0" Feb 26 16:02:12 crc kubenswrapper[4907]: I0226 16:02:12.145222 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb-tls-certs\" (UniqueName: \"kubernetes.io/secret/a7d66633-e694-4e7e-ba21-70dc18b93cfb-ovsdbserver-nb-tls-certs\") pod \"ovsdbserver-nb-0\" (UID: \"a7d66633-e694-4e7e-ba21-70dc18b93cfb\") " pod="openstack/ovsdbserver-nb-0" Feb 26 16:02:12 crc kubenswrapper[4907]: I0226 16:02:12.145307 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: 
\"kubernetes.io/configmap/a7d66633-e694-4e7e-ba21-70dc18b93cfb-config\") pod \"ovsdbserver-nb-0\" (UID: \"a7d66633-e694-4e7e-ba21-70dc18b93cfb\") " pod="openstack/ovsdbserver-nb-0" Feb 26 16:02:12 crc kubenswrapper[4907]: I0226 16:02:12.249557 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/a7d66633-e694-4e7e-ba21-70dc18b93cfb-ovsdb-rundir\") pod \"ovsdbserver-nb-0\" (UID: \"a7d66633-e694-4e7e-ba21-70dc18b93cfb\") " pod="openstack/ovsdbserver-nb-0" Feb 26 16:02:12 crc kubenswrapper[4907]: I0226 16:02:12.249738 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xp8vv\" (UniqueName: \"kubernetes.io/projected/a7d66633-e694-4e7e-ba21-70dc18b93cfb-kube-api-access-xp8vv\") pod \"ovsdbserver-nb-0\" (UID: \"a7d66633-e694-4e7e-ba21-70dc18b93cfb\") " pod="openstack/ovsdbserver-nb-0" Feb 26 16:02:12 crc kubenswrapper[4907]: I0226 16:02:12.249777 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/a7d66633-e694-4e7e-ba21-70dc18b93cfb-scripts\") pod \"ovsdbserver-nb-0\" (UID: \"a7d66633-e694-4e7e-ba21-70dc18b93cfb\") " pod="openstack/ovsdbserver-nb-0" Feb 26 16:02:12 crc kubenswrapper[4907]: I0226 16:02:12.249802 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb-tls-certs\" (UniqueName: \"kubernetes.io/secret/a7d66633-e694-4e7e-ba21-70dc18b93cfb-ovsdbserver-nb-tls-certs\") pod \"ovsdbserver-nb-0\" (UID: \"a7d66633-e694-4e7e-ba21-70dc18b93cfb\") " pod="openstack/ovsdbserver-nb-0" Feb 26 16:02:12 crc kubenswrapper[4907]: I0226 16:02:12.249831 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a7d66633-e694-4e7e-ba21-70dc18b93cfb-config\") pod \"ovsdbserver-nb-0\" (UID: \"a7d66633-e694-4e7e-ba21-70dc18b93cfb\") " 
pod="openstack/ovsdbserver-nb-0" Feb 26 16:02:12 crc kubenswrapper[4907]: I0226 16:02:12.249853 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage11-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage11-crc\") pod \"ovsdbserver-nb-0\" (UID: \"a7d66633-e694-4e7e-ba21-70dc18b93cfb\") " pod="openstack/ovsdbserver-nb-0" Feb 26 16:02:12 crc kubenswrapper[4907]: I0226 16:02:12.249868 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a7d66633-e694-4e7e-ba21-70dc18b93cfb-combined-ca-bundle\") pod \"ovsdbserver-nb-0\" (UID: \"a7d66633-e694-4e7e-ba21-70dc18b93cfb\") " pod="openstack/ovsdbserver-nb-0" Feb 26 16:02:12 crc kubenswrapper[4907]: I0226 16:02:12.249899 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/a7d66633-e694-4e7e-ba21-70dc18b93cfb-metrics-certs-tls-certs\") pod \"ovsdbserver-nb-0\" (UID: \"a7d66633-e694-4e7e-ba21-70dc18b93cfb\") " pod="openstack/ovsdbserver-nb-0" Feb 26 16:02:12 crc kubenswrapper[4907]: I0226 16:02:12.255564 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/a7d66633-e694-4e7e-ba21-70dc18b93cfb-scripts\") pod \"ovsdbserver-nb-0\" (UID: \"a7d66633-e694-4e7e-ba21-70dc18b93cfb\") " pod="openstack/ovsdbserver-nb-0" Feb 26 16:02:12 crc kubenswrapper[4907]: I0226 16:02:12.256412 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/a7d66633-e694-4e7e-ba21-70dc18b93cfb-ovsdb-rundir\") pod \"ovsdbserver-nb-0\" (UID: \"a7d66633-e694-4e7e-ba21-70dc18b93cfb\") " pod="openstack/ovsdbserver-nb-0" Feb 26 16:02:12 crc kubenswrapper[4907]: I0226 16:02:12.257786 4907 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage11-crc\" (UniqueName: 
\"kubernetes.io/local-volume/local-storage11-crc\") pod \"ovsdbserver-nb-0\" (UID: \"a7d66633-e694-4e7e-ba21-70dc18b93cfb\") device mount path \"/mnt/openstack/pv11\"" pod="openstack/ovsdbserver-nb-0" Feb 26 16:02:12 crc kubenswrapper[4907]: I0226 16:02:12.258030 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/a7d66633-e694-4e7e-ba21-70dc18b93cfb-metrics-certs-tls-certs\") pod \"ovsdbserver-nb-0\" (UID: \"a7d66633-e694-4e7e-ba21-70dc18b93cfb\") " pod="openstack/ovsdbserver-nb-0" Feb 26 16:02:12 crc kubenswrapper[4907]: I0226 16:02:12.271097 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb-tls-certs\" (UniqueName: \"kubernetes.io/secret/a7d66633-e694-4e7e-ba21-70dc18b93cfb-ovsdbserver-nb-tls-certs\") pod \"ovsdbserver-nb-0\" (UID: \"a7d66633-e694-4e7e-ba21-70dc18b93cfb\") " pod="openstack/ovsdbserver-nb-0" Feb 26 16:02:12 crc kubenswrapper[4907]: I0226 16:02:12.288940 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a7d66633-e694-4e7e-ba21-70dc18b93cfb-config\") pod \"ovsdbserver-nb-0\" (UID: \"a7d66633-e694-4e7e-ba21-70dc18b93cfb\") " pod="openstack/ovsdbserver-nb-0" Feb 26 16:02:12 crc kubenswrapper[4907]: I0226 16:02:12.292529 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a7d66633-e694-4e7e-ba21-70dc18b93cfb-combined-ca-bundle\") pod \"ovsdbserver-nb-0\" (UID: \"a7d66633-e694-4e7e-ba21-70dc18b93cfb\") " pod="openstack/ovsdbserver-nb-0" Feb 26 16:02:12 crc kubenswrapper[4907]: I0226 16:02:12.319817 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xp8vv\" (UniqueName: \"kubernetes.io/projected/a7d66633-e694-4e7e-ba21-70dc18b93cfb-kube-api-access-xp8vv\") pod \"ovsdbserver-nb-0\" (UID: \"a7d66633-e694-4e7e-ba21-70dc18b93cfb\") " 
pod="openstack/ovsdbserver-nb-0" Feb 26 16:02:12 crc kubenswrapper[4907]: I0226 16:02:12.340390 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage11-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage11-crc\") pod \"ovsdbserver-nb-0\" (UID: \"a7d66633-e694-4e7e-ba21-70dc18b93cfb\") " pod="openstack/ovsdbserver-nb-0" Feb 26 16:02:12 crc kubenswrapper[4907]: I0226 16:02:12.607454 4907 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovsdbserver-nb-0" Feb 26 16:02:14 crc kubenswrapper[4907]: I0226 16:02:14.531797 4907 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovsdbserver-sb-0"] Feb 26 16:02:14 crc kubenswrapper[4907]: I0226 16:02:14.533770 4907 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovsdbserver-sb-0" Feb 26 16:02:14 crc kubenswrapper[4907]: I0226 16:02:14.538281 4907 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovsdbserver-sb-0"] Feb 26 16:02:14 crc kubenswrapper[4907]: I0226 16:02:14.543049 4907 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovndbcluster-sb-config" Feb 26 16:02:14 crc kubenswrapper[4907]: I0226 16:02:14.543076 4907 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ovncluster-ovndbcluster-sb-dockercfg-4ppb6" Feb 26 16:02:14 crc kubenswrapper[4907]: I0226 16:02:14.543279 4907 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovndbcluster-sb-scripts" Feb 26 16:02:14 crc kubenswrapper[4907]: I0226 16:02:14.543892 4907 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ovndbcluster-sb-ovndbs" Feb 26 16:02:14 crc kubenswrapper[4907]: I0226 16:02:14.596984 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/6c669352-7f94-4a3c-bf6c-a84f7bf2e5e2-ovsdb-rundir\") pod 
\"ovsdbserver-sb-0\" (UID: \"6c669352-7f94-4a3c-bf6c-a84f7bf2e5e2\") " pod="openstack/ovsdbserver-sb-0" Feb 26 16:02:14 crc kubenswrapper[4907]: I0226 16:02:14.597069 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/6c669352-7f94-4a3c-bf6c-a84f7bf2e5e2-metrics-certs-tls-certs\") pod \"ovsdbserver-sb-0\" (UID: \"6c669352-7f94-4a3c-bf6c-a84f7bf2e5e2\") " pod="openstack/ovsdbserver-sb-0" Feb 26 16:02:14 crc kubenswrapper[4907]: I0226 16:02:14.597162 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/6c669352-7f94-4a3c-bf6c-a84f7bf2e5e2-scripts\") pod \"ovsdbserver-sb-0\" (UID: \"6c669352-7f94-4a3c-bf6c-a84f7bf2e5e2\") " pod="openstack/ovsdbserver-sb-0" Feb 26 16:02:14 crc kubenswrapper[4907]: I0226 16:02:14.597322 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb-tls-certs\" (UniqueName: \"kubernetes.io/secret/6c669352-7f94-4a3c-bf6c-a84f7bf2e5e2-ovsdbserver-sb-tls-certs\") pod \"ovsdbserver-sb-0\" (UID: \"6c669352-7f94-4a3c-bf6c-a84f7bf2e5e2\") " pod="openstack/ovsdbserver-sb-0" Feb 26 16:02:14 crc kubenswrapper[4907]: I0226 16:02:14.597360 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-p5xnr\" (UniqueName: \"kubernetes.io/projected/6c669352-7f94-4a3c-bf6c-a84f7bf2e5e2-kube-api-access-p5xnr\") pod \"ovsdbserver-sb-0\" (UID: \"6c669352-7f94-4a3c-bf6c-a84f7bf2e5e2\") " pod="openstack/ovsdbserver-sb-0" Feb 26 16:02:14 crc kubenswrapper[4907]: I0226 16:02:14.597467 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6c669352-7f94-4a3c-bf6c-a84f7bf2e5e2-config\") pod \"ovsdbserver-sb-0\" (UID: 
\"6c669352-7f94-4a3c-bf6c-a84f7bf2e5e2\") " pod="openstack/ovsdbserver-sb-0" Feb 26 16:02:14 crc kubenswrapper[4907]: I0226 16:02:14.597496 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6c669352-7f94-4a3c-bf6c-a84f7bf2e5e2-combined-ca-bundle\") pod \"ovsdbserver-sb-0\" (UID: \"6c669352-7f94-4a3c-bf6c-a84f7bf2e5e2\") " pod="openstack/ovsdbserver-sb-0" Feb 26 16:02:14 crc kubenswrapper[4907]: I0226 16:02:14.597636 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage05-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") pod \"ovsdbserver-sb-0\" (UID: \"6c669352-7f94-4a3c-bf6c-a84f7bf2e5e2\") " pod="openstack/ovsdbserver-sb-0" Feb 26 16:02:14 crc kubenswrapper[4907]: I0226 16:02:14.699941 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6c669352-7f94-4a3c-bf6c-a84f7bf2e5e2-config\") pod \"ovsdbserver-sb-0\" (UID: \"6c669352-7f94-4a3c-bf6c-a84f7bf2e5e2\") " pod="openstack/ovsdbserver-sb-0" Feb 26 16:02:14 crc kubenswrapper[4907]: I0226 16:02:14.699989 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6c669352-7f94-4a3c-bf6c-a84f7bf2e5e2-combined-ca-bundle\") pod \"ovsdbserver-sb-0\" (UID: \"6c669352-7f94-4a3c-bf6c-a84f7bf2e5e2\") " pod="openstack/ovsdbserver-sb-0" Feb 26 16:02:14 crc kubenswrapper[4907]: I0226 16:02:14.700041 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage05-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") pod \"ovsdbserver-sb-0\" (UID: \"6c669352-7f94-4a3c-bf6c-a84f7bf2e5e2\") " pod="openstack/ovsdbserver-sb-0" Feb 26 16:02:14 crc kubenswrapper[4907]: I0226 16:02:14.700087 4907 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/6c669352-7f94-4a3c-bf6c-a84f7bf2e5e2-ovsdb-rundir\") pod \"ovsdbserver-sb-0\" (UID: \"6c669352-7f94-4a3c-bf6c-a84f7bf2e5e2\") " pod="openstack/ovsdbserver-sb-0" Feb 26 16:02:14 crc kubenswrapper[4907]: I0226 16:02:14.700123 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/6c669352-7f94-4a3c-bf6c-a84f7bf2e5e2-metrics-certs-tls-certs\") pod \"ovsdbserver-sb-0\" (UID: \"6c669352-7f94-4a3c-bf6c-a84f7bf2e5e2\") " pod="openstack/ovsdbserver-sb-0" Feb 26 16:02:14 crc kubenswrapper[4907]: I0226 16:02:14.700139 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/6c669352-7f94-4a3c-bf6c-a84f7bf2e5e2-scripts\") pod \"ovsdbserver-sb-0\" (UID: \"6c669352-7f94-4a3c-bf6c-a84f7bf2e5e2\") " pod="openstack/ovsdbserver-sb-0" Feb 26 16:02:14 crc kubenswrapper[4907]: I0226 16:02:14.700166 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb-tls-certs\" (UniqueName: \"kubernetes.io/secret/6c669352-7f94-4a3c-bf6c-a84f7bf2e5e2-ovsdbserver-sb-tls-certs\") pod \"ovsdbserver-sb-0\" (UID: \"6c669352-7f94-4a3c-bf6c-a84f7bf2e5e2\") " pod="openstack/ovsdbserver-sb-0" Feb 26 16:02:14 crc kubenswrapper[4907]: I0226 16:02:14.700182 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-p5xnr\" (UniqueName: \"kubernetes.io/projected/6c669352-7f94-4a3c-bf6c-a84f7bf2e5e2-kube-api-access-p5xnr\") pod \"ovsdbserver-sb-0\" (UID: \"6c669352-7f94-4a3c-bf6c-a84f7bf2e5e2\") " pod="openstack/ovsdbserver-sb-0" Feb 26 16:02:14 crc kubenswrapper[4907]: I0226 16:02:14.700344 4907 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage05-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") pod 
\"ovsdbserver-sb-0\" (UID: \"6c669352-7f94-4a3c-bf6c-a84f7bf2e5e2\") device mount path \"/mnt/openstack/pv05\"" pod="openstack/ovsdbserver-sb-0" Feb 26 16:02:14 crc kubenswrapper[4907]: I0226 16:02:14.700821 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6c669352-7f94-4a3c-bf6c-a84f7bf2e5e2-config\") pod \"ovsdbserver-sb-0\" (UID: \"6c669352-7f94-4a3c-bf6c-a84f7bf2e5e2\") " pod="openstack/ovsdbserver-sb-0" Feb 26 16:02:14 crc kubenswrapper[4907]: I0226 16:02:14.700832 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/6c669352-7f94-4a3c-bf6c-a84f7bf2e5e2-ovsdb-rundir\") pod \"ovsdbserver-sb-0\" (UID: \"6c669352-7f94-4a3c-bf6c-a84f7bf2e5e2\") " pod="openstack/ovsdbserver-sb-0" Feb 26 16:02:14 crc kubenswrapper[4907]: I0226 16:02:14.702192 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/6c669352-7f94-4a3c-bf6c-a84f7bf2e5e2-scripts\") pod \"ovsdbserver-sb-0\" (UID: \"6c669352-7f94-4a3c-bf6c-a84f7bf2e5e2\") " pod="openstack/ovsdbserver-sb-0" Feb 26 16:02:14 crc kubenswrapper[4907]: I0226 16:02:14.707848 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb-tls-certs\" (UniqueName: \"kubernetes.io/secret/6c669352-7f94-4a3c-bf6c-a84f7bf2e5e2-ovsdbserver-sb-tls-certs\") pod \"ovsdbserver-sb-0\" (UID: \"6c669352-7f94-4a3c-bf6c-a84f7bf2e5e2\") " pod="openstack/ovsdbserver-sb-0" Feb 26 16:02:14 crc kubenswrapper[4907]: I0226 16:02:14.708668 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6c669352-7f94-4a3c-bf6c-a84f7bf2e5e2-combined-ca-bundle\") pod \"ovsdbserver-sb-0\" (UID: \"6c669352-7f94-4a3c-bf6c-a84f7bf2e5e2\") " pod="openstack/ovsdbserver-sb-0" Feb 26 16:02:14 crc kubenswrapper[4907]: I0226 16:02:14.719567 4907 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/6c669352-7f94-4a3c-bf6c-a84f7bf2e5e2-metrics-certs-tls-certs\") pod \"ovsdbserver-sb-0\" (UID: \"6c669352-7f94-4a3c-bf6c-a84f7bf2e5e2\") " pod="openstack/ovsdbserver-sb-0" Feb 26 16:02:14 crc kubenswrapper[4907]: I0226 16:02:14.724697 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage05-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") pod \"ovsdbserver-sb-0\" (UID: \"6c669352-7f94-4a3c-bf6c-a84f7bf2e5e2\") " pod="openstack/ovsdbserver-sb-0" Feb 26 16:02:14 crc kubenswrapper[4907]: I0226 16:02:14.725224 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-p5xnr\" (UniqueName: \"kubernetes.io/projected/6c669352-7f94-4a3c-bf6c-a84f7bf2e5e2-kube-api-access-p5xnr\") pod \"ovsdbserver-sb-0\" (UID: \"6c669352-7f94-4a3c-bf6c-a84f7bf2e5e2\") " pod="openstack/ovsdbserver-sb-0" Feb 26 16:02:14 crc kubenswrapper[4907]: I0226 16:02:14.868517 4907 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovsdbserver-sb-0" Feb 26 16:02:18 crc kubenswrapper[4907]: I0226 16:02:18.530179 4907 patch_prober.go:28] interesting pod/machine-config-daemon-v5ng6 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 26 16:02:18 crc kubenswrapper[4907]: I0226 16:02:18.530768 4907 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-v5ng6" podUID="917eebf3-db36-47b8-af0a-b80d042fddab" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 26 16:02:20 crc kubenswrapper[4907]: I0226 16:02:20.495380 4907 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Feb 26 16:02:20 crc kubenswrapper[4907]: I0226 16:02:20.624540 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/kube-state-metrics-0" event={"ID":"3623ea59-40fb-48f6-943d-1ea5fe3ad253","Type":"ContainerStarted","Data":"6be5f9675dde560a3293f4798ed38ecd5c23502d491fb1ae059567b705653f3c"} Feb 26 16:02:32 crc kubenswrapper[4907]: E0226 16:02:32.446788 4907 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/podified-antelope-centos9/openstack-mariadb:current-podified" Feb 26 16:02:32 crc kubenswrapper[4907]: E0226 16:02:32.447378 4907 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:mysql-bootstrap,Image:quay.io/podified-antelope-centos9/openstack-mariadb:current-podified,Command:[bash 
/var/lib/operator-scripts/mysql_bootstrap.sh],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:KOLLA_BOOTSTRAP,Value:True,ValueFrom:nil,},EnvVar{Name:KOLLA_CONFIG_STRATEGY,Value:COPY_ALWAYS,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:mysql-db,ReadOnly:false,MountPath:/var/lib/mysql,SubPath:mysql,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:config-data-default,ReadOnly:true,MountPath:/var/lib/config-data/default,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:config-data-generated,ReadOnly:false,MountPath:/var/lib/config-data/generated,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:operator-scripts,ReadOnly:true,MountPath:/var/lib/operator-scripts,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kolla-config,ReadOnly:true,MountPath:/var/lib/kolla/config_files,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-sk98t,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:nil,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod 
openstack-cell1-galera-0_openstack(7d7af39e-1222-4a40-a2f3-a644e2ef477d): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Feb 26 16:02:32 crc kubenswrapper[4907]: E0226 16:02:32.448569 4907 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"mysql-bootstrap\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack/openstack-cell1-galera-0" podUID="7d7af39e-1222-4a40-a2f3-a644e2ef477d" Feb 26 16:02:34 crc kubenswrapper[4907]: E0226 16:02:34.276683 4907 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"mysql-bootstrap\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/podified-antelope-centos9/openstack-mariadb:current-podified\\\"\"" pod="openstack/openstack-cell1-galera-0" podUID="7d7af39e-1222-4a40-a2f3-a644e2ef477d" Feb 26 16:02:34 crc kubenswrapper[4907]: E0226 16:02:34.356488 4907 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/podified-antelope-centos9/openstack-mariadb:current-podified" Feb 26 16:02:34 crc kubenswrapper[4907]: E0226 16:02:34.356733 4907 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:mysql-bootstrap,Image:quay.io/podified-antelope-centos9/openstack-mariadb:current-podified,Command:[bash 
/var/lib/operator-scripts/mysql_bootstrap.sh],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:KOLLA_BOOTSTRAP,Value:True,ValueFrom:nil,},EnvVar{Name:KOLLA_CONFIG_STRATEGY,Value:COPY_ALWAYS,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:mysql-db,ReadOnly:false,MountPath:/var/lib/mysql,SubPath:mysql,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:config-data-default,ReadOnly:true,MountPath:/var/lib/config-data/default,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:config-data-generated,ReadOnly:false,MountPath:/var/lib/config-data/generated,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:operator-scripts,ReadOnly:true,MountPath:/var/lib/operator-scripts,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kolla-config,ReadOnly:true,MountPath:/var/lib/kolla/config_files,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-g5qs6,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:nil,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod openstack-galera-0_openstack(3fdde055-1569-4b2a-bc9f-893b93ee63b1): 
ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Feb 26 16:02:34 crc kubenswrapper[4907]: E0226 16:02:34.358806 4907 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"mysql-bootstrap\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack/openstack-galera-0" podUID="3fdde055-1569-4b2a-bc9f-893b93ee63b1" Feb 26 16:02:34 crc kubenswrapper[4907]: E0226 16:02:34.390186 4907 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/podified-antelope-centos9/openstack-neutron-server:current-podified" Feb 26 16:02:34 crc kubenswrapper[4907]: E0226 16:02:34.390531 4907 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:init,Image:quay.io/podified-antelope-centos9/openstack-neutron-server:current-podified,Command:[/bin/bash],Args:[-c dnsmasq --interface=* --conf-dir=/etc/dnsmasq.d --hostsdir=/etc/dnsmasq.d/hosts --keep-in-foreground --log-debug --bind-interfaces --listen-address=$(POD_IP) --port 5353 --log-facility=- --no-hosts --domain-needed --no-resolv --bogus-priv --log-queries 
--test],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:CONFIG_HASH,Value:nffh5bdhf4h5f8h79h55h77h58fh56dh7bh6fh578hbch55dh68h56bhd9h65dh57ch658hc9h566h666h688h58h65dh684h5d7h6ch575h5d6h88q,ValueFrom:nil,},EnvVar{Name:POD_IP,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:status.podIP,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:config,ReadOnly:true,MountPath:/etc/dnsmasq.d/config.cfg,SubPath:dns,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-l84pn,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000650000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod dnsmasq-dns-675f4bcbfc-b2427_openstack(5e2f3aaf-955c-42d3-aa43-da773d92499a): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Feb 26 16:02:34 crc kubenswrapper[4907]: E0226 16:02:34.391787 4907 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"init\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" 
pod="openstack/dnsmasq-dns-675f4bcbfc-b2427" podUID="5e2f3aaf-955c-42d3-aa43-da773d92499a" Feb 26 16:02:34 crc kubenswrapper[4907]: E0226 16:02:34.729643 4907 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"mysql-bootstrap\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/podified-antelope-centos9/openstack-mariadb:current-podified\\\"\"" pod="openstack/openstack-galera-0" podUID="3fdde055-1569-4b2a-bc9f-893b93ee63b1" Feb 26 16:02:35 crc kubenswrapper[4907]: E0226 16:02:35.426726 4907 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/podified-antelope-centos9/openstack-rabbitmq:current-podified" Feb 26 16:02:35 crc kubenswrapper[4907]: E0226 16:02:35.427269 4907 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:setup-container,Image:quay.io/podified-antelope-centos9/openstack-rabbitmq:current-podified,Command:[sh -c cp /tmp/erlang-cookie-secret/.erlang.cookie /var/lib/rabbitmq/.erlang.cookie && chmod 600 /var/lib/rabbitmq/.erlang.cookie ; cp /tmp/rabbitmq-plugins/enabled_plugins /operator/enabled_plugins ; echo '[default]' > /var/lib/rabbitmq/.rabbitmqadmin.conf && sed -e 's/default_user/username/' -e 's/default_pass/password/' /tmp/default_user.conf >> /var/lib/rabbitmq/.rabbitmqadmin.conf && chmod 600 /var/lib/rabbitmq/.rabbitmqadmin.conf ; sleep 30],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{20 -3} {} 20m DecimalSI},memory: {{67108864 0} {} BinarySI},},Requests:ResourceList{cpu: {{20 -3} {} 20m DecimalSI},memory: {{67108864 0} {} 
BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:plugins-conf,ReadOnly:false,MountPath:/tmp/rabbitmq-plugins/,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:rabbitmq-erlang-cookie,ReadOnly:false,MountPath:/var/lib/rabbitmq/,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:erlang-cookie-secret,ReadOnly:false,MountPath:/tmp/erlang-cookie-secret/,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:rabbitmq-plugins,ReadOnly:false,MountPath:/operator,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:persistence,ReadOnly:false,MountPath:/var/lib/rabbitmq/mnesia/,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:rabbitmq-confd,ReadOnly:false,MountPath:/tmp/default_user.conf,SubPath:default_user.conf,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-98k2n,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000650000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod rabbitmq-server-0_openstack(96ba881c-449c-4300-b67f-8a1e952af508): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Feb 26 16:02:35 crc 
kubenswrapper[4907]: E0226 16:02:35.428579 4907 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"setup-container\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack/rabbitmq-server-0" podUID="96ba881c-449c-4300-b67f-8a1e952af508" Feb 26 16:02:35 crc kubenswrapper[4907]: E0226 16:02:35.439177 4907 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/podified-antelope-centos9/openstack-rabbitmq:current-podified" Feb 26 16:02:35 crc kubenswrapper[4907]: E0226 16:02:35.439435 4907 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:setup-container,Image:quay.io/podified-antelope-centos9/openstack-rabbitmq:current-podified,Command:[sh -c cp /tmp/erlang-cookie-secret/.erlang.cookie /var/lib/rabbitmq/.erlang.cookie && chmod 600 /var/lib/rabbitmq/.erlang.cookie ; cp /tmp/rabbitmq-plugins/enabled_plugins /operator/enabled_plugins ; echo '[default]' > /var/lib/rabbitmq/.rabbitmqadmin.conf && sed -e 's/default_user/username/' -e 's/default_pass/password/' /tmp/default_user.conf >> /var/lib/rabbitmq/.rabbitmqadmin.conf && chmod 600 /var/lib/rabbitmq/.rabbitmqadmin.conf ; sleep 30],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{20 -3} {} 20m DecimalSI},memory: {{67108864 0} {} BinarySI},},Requests:ResourceList{cpu: {{20 -3} {} 20m DecimalSI},memory: {{67108864 0} {} 
BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:plugins-conf,ReadOnly:false,MountPath:/tmp/rabbitmq-plugins/,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:rabbitmq-erlang-cookie,ReadOnly:false,MountPath:/var/lib/rabbitmq/,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:erlang-cookie-secret,ReadOnly:false,MountPath:/tmp/erlang-cookie-secret/,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:rabbitmq-plugins,ReadOnly:false,MountPath:/operator,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:persistence,ReadOnly:false,MountPath:/var/lib/rabbitmq/mnesia/,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:rabbitmq-confd,ReadOnly:false,MountPath:/tmp/default_user.conf,SubPath:default_user.conf,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-k6lgm,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000650000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod rabbitmq-cell1-server-0_openstack(cca4ff23-cabb-466c-80a0-dbcc1f005123): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Feb 26 16:02:35 crc 
kubenswrapper[4907]: E0226 16:02:35.441297 4907 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"setup-container\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack/rabbitmq-cell1-server-0" podUID="cca4ff23-cabb-466c-80a0-dbcc1f005123" Feb 26 16:02:35 crc kubenswrapper[4907]: E0226 16:02:35.472951 4907 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/podified-antelope-centos9/openstack-neutron-server:current-podified" Feb 26 16:02:35 crc kubenswrapper[4907]: E0226 16:02:35.473168 4907 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:init,Image:quay.io/podified-antelope-centos9/openstack-neutron-server:current-podified,Command:[/bin/bash],Args:[-c dnsmasq --interface=* --conf-dir=/etc/dnsmasq.d --hostsdir=/etc/dnsmasq.d/hosts --keep-in-foreground --log-debug --bind-interfaces --listen-address=$(POD_IP) --port 5353 --log-facility=- --no-hosts --domain-needed --no-resolv --bogus-priv --log-queries 
--test],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:CONFIG_HASH,Value:ndfhb5h667h568h584h5f9h58dh565h664h587h597h577h64bh5c4h66fh647hbdh68ch5c5h68dh686h5f7h64hd7hc6h55fh57bh98h57fh87h5fh57fq,ValueFrom:nil,},EnvVar{Name:POD_IP,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:status.podIP,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:config,ReadOnly:true,MountPath:/etc/dnsmasq.d/config.cfg,SubPath:dns,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:dns-svc,ReadOnly:true,MountPath:/etc/dnsmasq.d/hosts/dns-svc,SubPath:dns-svc,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-qg58l,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000650000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod dnsmasq-dns-78dd6ddcc-49q42_openstack(d6a59f65-50b1-41f9-99df-2813f1c439bf): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Feb 26 16:02:35 crc kubenswrapper[4907]: E0226 16:02:35.474362 4907 pod_workers.go:1301] "Error 
syncing pod, skipping" err="failed to \"StartContainer\" for \"init\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack/dnsmasq-dns-78dd6ddcc-49q42" podUID="d6a59f65-50b1-41f9-99df-2813f1c439bf" Feb 26 16:02:35 crc kubenswrapper[4907]: E0226 16:02:35.730131 4907 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"setup-container\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/podified-antelope-centos9/openstack-rabbitmq:current-podified\\\"\"" pod="openstack/rabbitmq-cell1-server-0" podUID="cca4ff23-cabb-466c-80a0-dbcc1f005123" Feb 26 16:02:35 crc kubenswrapper[4907]: E0226 16:02:35.732111 4907 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"setup-container\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/podified-antelope-centos9/openstack-rabbitmq:current-podified\\\"\"" pod="openstack/rabbitmq-server-0" podUID="96ba881c-449c-4300-b67f-8a1e952af508" Feb 26 16:02:36 crc kubenswrapper[4907]: E0226 16:02:36.615122 4907 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/podified-antelope-centos9/openstack-memcached:current-podified" Feb 26 16:02:36 crc kubenswrapper[4907]: E0226 16:02:36.615555 4907 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:memcached,Image:quay.io/podified-antelope-centos9/openstack-memcached:current-podified,Command:[/usr/bin/dumb-init -- 
/usr/local/bin/kolla_start],Args:[],WorkingDir:,Ports:[]ContainerPort{ContainerPort{Name:memcached,HostPort:0,ContainerPort:11211,Protocol:TCP,HostIP:,},ContainerPort{Name:memcached-tls,HostPort:0,ContainerPort:11212,Protocol:TCP,HostIP:,},},Env:[]EnvVar{EnvVar{Name:KOLLA_CONFIG_STRATEGY,Value:COPY_ALWAYS,ValueFrom:nil,},EnvVar{Name:POD_IPS,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:status.podIPs,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},EnvVar{Name:CONFIG_HASH,Value:n97hfch656h658h699hf9h6dh574h664h575h587h568hf6h99h696h646h9h98h5bchffh5d6hddh68ch557hc7h646h55fh648h549h54bh567h578q,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:config-data,ReadOnly:true,MountPath:/var/lib/kolla/config_files/src,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kolla-config,ReadOnly:true,MountPath:/var/lib/kolla/config_files,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:memcached-tls-certs,ReadOnly:true,MountPath:/var/lib/config-data/tls/certs/memcached.crt,SubPath:tls.crt,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:memcached-tls-certs,ReadOnly:true,MountPath:/var/lib/config-data/tls/private/memcached.key,SubPath:tls.key,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:combined-ca-bundle,ReadOnly:true,MountPath:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem,SubPath:tls-ca-bundle.pem,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-dwpdk,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:nil,TCPSocket:&TCPSocketAction{Port:{0 11211 
},Host:,},GRPC:nil,},InitialDelaySeconds:3,TimeoutSeconds:5,PeriodSeconds:3,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:nil,TCPSocket:&TCPSocketAction{Port:{0 11211 },Host:,},GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:5,PeriodSeconds:5,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*42457,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:nil,RunAsGroup:*42457,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod memcached-0_openstack(964032de-099d-4e22-95d5-d7acf78c5685): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Feb 26 16:02:36 crc kubenswrapper[4907]: E0226 16:02:36.617110 4907 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"memcached\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack/memcached-0" podUID="964032de-099d-4e22-95d5-d7acf78c5685" Feb 26 16:02:36 crc kubenswrapper[4907]: I0226 16:02:36.732243 4907 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-675f4bcbfc-b2427" Feb 26 16:02:36 crc kubenswrapper[4907]: I0226 16:02:36.745767 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-78dd6ddcc-49q42" event={"ID":"d6a59f65-50b1-41f9-99df-2813f1c439bf","Type":"ContainerDied","Data":"e8855212fbe7887e2ace10fbdcf150ff74139fb3805f91e8cee4f0f7a4c5e741"} Feb 26 16:02:36 crc kubenswrapper[4907]: I0226 16:02:36.745813 4907 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="e8855212fbe7887e2ace10fbdcf150ff74139fb3805f91e8cee4f0f7a4c5e741" Feb 26 16:02:36 crc kubenswrapper[4907]: I0226 16:02:36.751853 4907 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-675f4bcbfc-b2427" Feb 26 16:02:36 crc kubenswrapper[4907]: I0226 16:02:36.751955 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-675f4bcbfc-b2427" event={"ID":"5e2f3aaf-955c-42d3-aa43-da773d92499a","Type":"ContainerDied","Data":"c864f5ab359ed36cea19b39a5a891ae293124e5e6c5b3fb1719b33f6e2b58504"} Feb 26 16:02:36 crc kubenswrapper[4907]: I0226 16:02:36.753654 4907 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-78dd6ddcc-49q42" Feb 26 16:02:36 crc kubenswrapper[4907]: E0226 16:02:36.754704 4907 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"memcached\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/podified-antelope-centos9/openstack-memcached:current-podified\\\"\"" pod="openstack/memcached-0" podUID="964032de-099d-4e22-95d5-d7acf78c5685" Feb 26 16:02:36 crc kubenswrapper[4907]: I0226 16:02:36.820846 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-l84pn\" (UniqueName: \"kubernetes.io/projected/5e2f3aaf-955c-42d3-aa43-da773d92499a-kube-api-access-l84pn\") pod \"5e2f3aaf-955c-42d3-aa43-da773d92499a\" (UID: \"5e2f3aaf-955c-42d3-aa43-da773d92499a\") " Feb 26 16:02:36 crc kubenswrapper[4907]: I0226 16:02:36.820951 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5e2f3aaf-955c-42d3-aa43-da773d92499a-config\") pod \"5e2f3aaf-955c-42d3-aa43-da773d92499a\" (UID: \"5e2f3aaf-955c-42d3-aa43-da773d92499a\") " Feb 26 16:02:36 crc kubenswrapper[4907]: I0226 16:02:36.821951 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5e2f3aaf-955c-42d3-aa43-da773d92499a-config" (OuterVolumeSpecName: "config") pod "5e2f3aaf-955c-42d3-aa43-da773d92499a" (UID: "5e2f3aaf-955c-42d3-aa43-da773d92499a"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 26 16:02:36 crc kubenswrapper[4907]: I0226 16:02:36.826063 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5e2f3aaf-955c-42d3-aa43-da773d92499a-kube-api-access-l84pn" (OuterVolumeSpecName: "kube-api-access-l84pn") pod "5e2f3aaf-955c-42d3-aa43-da773d92499a" (UID: "5e2f3aaf-955c-42d3-aa43-da773d92499a"). InnerVolumeSpecName "kube-api-access-l84pn". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 26 16:02:36 crc kubenswrapper[4907]: I0226 16:02:36.922509 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/d6a59f65-50b1-41f9-99df-2813f1c439bf-dns-svc\") pod \"d6a59f65-50b1-41f9-99df-2813f1c439bf\" (UID: \"d6a59f65-50b1-41f9-99df-2813f1c439bf\") " Feb 26 16:02:36 crc kubenswrapper[4907]: I0226 16:02:36.922671 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d6a59f65-50b1-41f9-99df-2813f1c439bf-config\") pod \"d6a59f65-50b1-41f9-99df-2813f1c439bf\" (UID: \"d6a59f65-50b1-41f9-99df-2813f1c439bf\") " Feb 26 16:02:36 crc kubenswrapper[4907]: I0226 16:02:36.922732 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qg58l\" (UniqueName: \"kubernetes.io/projected/d6a59f65-50b1-41f9-99df-2813f1c439bf-kube-api-access-qg58l\") pod \"d6a59f65-50b1-41f9-99df-2813f1c439bf\" (UID: \"d6a59f65-50b1-41f9-99df-2813f1c439bf\") " Feb 26 16:02:36 crc kubenswrapper[4907]: I0226 16:02:36.923109 4907 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5e2f3aaf-955c-42d3-aa43-da773d92499a-config\") on node \"crc\" DevicePath \"\"" Feb 26 16:02:36 crc kubenswrapper[4907]: I0226 16:02:36.923121 4907 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-l84pn\" (UniqueName: \"kubernetes.io/projected/5e2f3aaf-955c-42d3-aa43-da773d92499a-kube-api-access-l84pn\") on node \"crc\" DevicePath \"\"" Feb 26 16:02:36 crc kubenswrapper[4907]: I0226 16:02:36.923205 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d6a59f65-50b1-41f9-99df-2813f1c439bf-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "d6a59f65-50b1-41f9-99df-2813f1c439bf" (UID: "d6a59f65-50b1-41f9-99df-2813f1c439bf"). 
InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 26 16:02:36 crc kubenswrapper[4907]: I0226 16:02:36.923220 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d6a59f65-50b1-41f9-99df-2813f1c439bf-config" (OuterVolumeSpecName: "config") pod "d6a59f65-50b1-41f9-99df-2813f1c439bf" (UID: "d6a59f65-50b1-41f9-99df-2813f1c439bf"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 26 16:02:36 crc kubenswrapper[4907]: I0226 16:02:36.927903 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d6a59f65-50b1-41f9-99df-2813f1c439bf-kube-api-access-qg58l" (OuterVolumeSpecName: "kube-api-access-qg58l") pod "d6a59f65-50b1-41f9-99df-2813f1c439bf" (UID: "d6a59f65-50b1-41f9-99df-2813f1c439bf"). InnerVolumeSpecName "kube-api-access-qg58l". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 26 16:02:37 crc kubenswrapper[4907]: I0226 16:02:37.025155 4907 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d6a59f65-50b1-41f9-99df-2813f1c439bf-config\") on node \"crc\" DevicePath \"\"" Feb 26 16:02:37 crc kubenswrapper[4907]: I0226 16:02:37.025198 4907 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qg58l\" (UniqueName: \"kubernetes.io/projected/d6a59f65-50b1-41f9-99df-2813f1c439bf-kube-api-access-qg58l\") on node \"crc\" DevicePath \"\"" Feb 26 16:02:37 crc kubenswrapper[4907]: I0226 16:02:37.025213 4907 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/d6a59f65-50b1-41f9-99df-2813f1c439bf-dns-svc\") on node \"crc\" DevicePath \"\"" Feb 26 16:02:37 crc kubenswrapper[4907]: I0226 16:02:37.231434 4907 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-675f4bcbfc-b2427"] Feb 26 16:02:37 crc kubenswrapper[4907]: I0226 16:02:37.239953 4907 
kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-675f4bcbfc-b2427"] Feb 26 16:02:37 crc kubenswrapper[4907]: I0226 16:02:37.273237 4907 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-drng5"] Feb 26 16:02:37 crc kubenswrapper[4907]: W0226 16:02:37.590767 4907 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod66d3c733_f440_4877_9e7b_af62f5dc7857.slice/crio-8bc3b58ee1bd9d3fe29f82ab325aab11b37aa649dcc7c55dabe59a6577ac5137 WatchSource:0}: Error finding container 8bc3b58ee1bd9d3fe29f82ab325aab11b37aa649dcc7c55dabe59a6577ac5137: Status 404 returned error can't find the container with id 8bc3b58ee1bd9d3fe29f82ab325aab11b37aa649dcc7c55dabe59a6577ac5137 Feb 26 16:02:37 crc kubenswrapper[4907]: I0226 16:02:37.787279 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-drng5" event={"ID":"66d3c733-f440-4877-9e7b-af62f5dc7857","Type":"ContainerStarted","Data":"8bc3b58ee1bd9d3fe29f82ab325aab11b37aa649dcc7c55dabe59a6577ac5137"} Feb 26 16:02:37 crc kubenswrapper[4907]: I0226 16:02:37.787306 4907 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-78dd6ddcc-49q42" Feb 26 16:02:37 crc kubenswrapper[4907]: I0226 16:02:37.868491 4907 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-78dd6ddcc-49q42"] Feb 26 16:02:37 crc kubenswrapper[4907]: I0226 16:02:37.871979 4907 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-78dd6ddcc-49q42"] Feb 26 16:02:37 crc kubenswrapper[4907]: I0226 16:02:37.957964 4907 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovsdbserver-sb-0"] Feb 26 16:02:38 crc kubenswrapper[4907]: W0226 16:02:38.093606 4907 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod6c669352_7f94_4a3c_bf6c_a84f7bf2e5e2.slice/crio-461c915e0725ba830dfae0c334ef9236a01941d256000a6c23191956e8227800 WatchSource:0}: Error finding container 461c915e0725ba830dfae0c334ef9236a01941d256000a6c23191956e8227800: Status 404 returned error can't find the container with id 461c915e0725ba830dfae0c334ef9236a01941d256000a6c23191956e8227800 Feb 26 16:02:38 crc kubenswrapper[4907]: I0226 16:02:38.137943 4907 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5e2f3aaf-955c-42d3-aa43-da773d92499a" path="/var/lib/kubelet/pods/5e2f3aaf-955c-42d3-aa43-da773d92499a/volumes" Feb 26 16:02:38 crc kubenswrapper[4907]: I0226 16:02:38.138797 4907 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d6a59f65-50b1-41f9-99df-2813f1c439bf" path="/var/lib/kubelet/pods/d6a59f65-50b1-41f9-99df-2813f1c439bf/volumes" Feb 26 16:02:38 crc kubenswrapper[4907]: I0226 16:02:38.515895 4907 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-ovs-9qr64"] Feb 26 16:02:38 crc kubenswrapper[4907]: W0226 16:02:38.657791 4907 manager.go:1169] Failed to process watch event {EventType:0 
Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podce0f1161_6251_4318_b364_7db1779f93bd.slice/crio-0e3aad64f5bcba959896aa4f54ab0b72f84c8c1ffa6049f5b36ee3f764688efe WatchSource:0}: Error finding container 0e3aad64f5bcba959896aa4f54ab0b72f84c8c1ffa6049f5b36ee3f764688efe: Status 404 returned error can't find the container with id 0e3aad64f5bcba959896aa4f54ab0b72f84c8c1ffa6049f5b36ee3f764688efe Feb 26 16:02:38 crc kubenswrapper[4907]: I0226 16:02:38.795921 4907 generic.go:334] "Generic (PLEG): container finished" podID="b95ff69f-8d73-415b-8dbb-84ed7e1f3a91" containerID="0782c1caa3491e81dddf7bc25686e4fb970c2544dfc76bcef42168b9a81c538e" exitCode=0 Feb 26 16:02:38 crc kubenswrapper[4907]: I0226 16:02:38.795957 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-666b6646f7-glzzt" event={"ID":"b95ff69f-8d73-415b-8dbb-84ed7e1f3a91","Type":"ContainerDied","Data":"0782c1caa3491e81dddf7bc25686e4fb970c2544dfc76bcef42168b9a81c538e"} Feb 26 16:02:38 crc kubenswrapper[4907]: I0226 16:02:38.797251 4907 generic.go:334] "Generic (PLEG): container finished" podID="0d202a81-23b7-45d1-847c-81375db1f908" containerID="1a9cebb07c9a4d60574748280b5f88eb69a7b6737a9238ed122fdc8b41714e3b" exitCode=0 Feb 26 16:02:38 crc kubenswrapper[4907]: I0226 16:02:38.797320 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-57d769cc4f-4pkdt" event={"ID":"0d202a81-23b7-45d1-847c-81375db1f908","Type":"ContainerDied","Data":"1a9cebb07c9a4d60574748280b5f88eb69a7b6737a9238ed122fdc8b41714e3b"} Feb 26 16:02:38 crc kubenswrapper[4907]: I0226 16:02:38.800380 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-sb-0" event={"ID":"6c669352-7f94-4a3c-bf6c-a84f7bf2e5e2","Type":"ContainerStarted","Data":"461c915e0725ba830dfae0c334ef9236a01941d256000a6c23191956e8227800"} Feb 26 16:02:38 crc kubenswrapper[4907]: I0226 16:02:38.801540 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/ovn-controller-ovs-9qr64" event={"ID":"ce0f1161-6251-4318-b364-7db1779f93bd","Type":"ContainerStarted","Data":"0e3aad64f5bcba959896aa4f54ab0b72f84c8c1ffa6049f5b36ee3f764688efe"} Feb 26 16:02:39 crc kubenswrapper[4907]: I0226 16:02:39.081428 4907 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovsdbserver-nb-0"] Feb 26 16:02:39 crc kubenswrapper[4907]: I0226 16:02:39.809872 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-666b6646f7-glzzt" event={"ID":"b95ff69f-8d73-415b-8dbb-84ed7e1f3a91","Type":"ContainerStarted","Data":"78937a2c86a2c79b920ea5f6abecd017f33ba2329f492814890716840a53e66e"} Feb 26 16:02:39 crc kubenswrapper[4907]: I0226 16:02:39.811298 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-nb-0" event={"ID":"a7d66633-e694-4e7e-ba21-70dc18b93cfb","Type":"ContainerStarted","Data":"2723a3d019c65f4f115b6847ae2ad3eab8abce689222b1122cee65dd8ff25ede"} Feb 26 16:02:39 crc kubenswrapper[4907]: I0226 16:02:39.813144 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-57d769cc4f-4pkdt" event={"ID":"0d202a81-23b7-45d1-847c-81375db1f908","Type":"ContainerStarted","Data":"9f74a8d5886f350cbaa6a330fd61893fcdcf39698289689b0c991483b0ea12bd"} Feb 26 16:02:40 crc kubenswrapper[4907]: I0226 16:02:40.820342 4907 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-666b6646f7-glzzt" Feb 26 16:02:40 crc kubenswrapper[4907]: I0226 16:02:40.847470 4907 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-666b6646f7-glzzt" podStartSLOduration=6.042768691 podStartE2EDuration="40.847450018s" podCreationTimestamp="2026-02-26 16:02:00 +0000 UTC" firstStartedPulling="2026-02-26 16:02:02.09236895 +0000 UTC m=+1184.610930799" lastFinishedPulling="2026-02-26 16:02:36.897050277 +0000 UTC m=+1219.415612126" observedRunningTime="2026-02-26 16:02:40.84175154 +0000 UTC m=+1223.360313419" 
watchObservedRunningTime="2026-02-26 16:02:40.847450018 +0000 UTC m=+1223.366011867" Feb 26 16:02:40 crc kubenswrapper[4907]: I0226 16:02:40.865356 4907 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-57d769cc4f-4pkdt" podStartSLOduration=5.526329136 podStartE2EDuration="39.865335313s" podCreationTimestamp="2026-02-26 16:02:01 +0000 UTC" firstStartedPulling="2026-02-26 16:02:02.289397386 +0000 UTC m=+1184.807959235" lastFinishedPulling="2026-02-26 16:02:36.628403563 +0000 UTC m=+1219.146965412" observedRunningTime="2026-02-26 16:02:40.858445305 +0000 UTC m=+1223.377007154" watchObservedRunningTime="2026-02-26 16:02:40.865335313 +0000 UTC m=+1223.383897162" Feb 26 16:02:41 crc kubenswrapper[4907]: I0226 16:02:41.748345 4907 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-57d769cc4f-4pkdt" Feb 26 16:02:43 crc kubenswrapper[4907]: I0226 16:02:43.839673 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-drng5" event={"ID":"66d3c733-f440-4877-9e7b-af62f5dc7857","Type":"ContainerStarted","Data":"69975faa7f261de8b4ccee67b5d05d4947c720f5269b1f9a6cc35d51a2317bd0"} Feb 26 16:02:43 crc kubenswrapper[4907]: I0226 16:02:43.840212 4907 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ovn-controller-drng5" Feb 26 16:02:43 crc kubenswrapper[4907]: I0226 16:02:43.842583 4907 generic.go:334] "Generic (PLEG): container finished" podID="ce0f1161-6251-4318-b364-7db1779f93bd" containerID="cc63cc486b78bcd934f652420dde18049652679c4b71a9c9f73ac810527a9c41" exitCode=0 Feb 26 16:02:43 crc kubenswrapper[4907]: I0226 16:02:43.843061 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-ovs-9qr64" event={"ID":"ce0f1161-6251-4318-b364-7db1779f93bd","Type":"ContainerDied","Data":"cc63cc486b78bcd934f652420dde18049652679c4b71a9c9f73ac810527a9c41"} Feb 26 16:02:43 crc kubenswrapper[4907]: I0226 16:02:43.846165 4907 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-nb-0" event={"ID":"a7d66633-e694-4e7e-ba21-70dc18b93cfb","Type":"ContainerStarted","Data":"7f7939115f5f53adf07a51737d992e1c16b85491aa2e02539c68d76693ef62ac"} Feb 26 16:02:43 crc kubenswrapper[4907]: I0226 16:02:43.850024 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/kube-state-metrics-0" event={"ID":"3623ea59-40fb-48f6-943d-1ea5fe3ad253","Type":"ContainerStarted","Data":"c2645f300acb4d621d5efc3ab5b644ad14a8657511b9aad9f6b2ce977f97baef"} Feb 26 16:02:43 crc kubenswrapper[4907]: I0226 16:02:43.850146 4907 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/kube-state-metrics-0" Feb 26 16:02:43 crc kubenswrapper[4907]: I0226 16:02:43.860560 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-sb-0" event={"ID":"6c669352-7f94-4a3c-bf6c-a84f7bf2e5e2","Type":"ContainerStarted","Data":"5ae5534e7b5caebb5d4cd9343711f509a82ff7d1a176c23ae86e18ef3f56d1ab"} Feb 26 16:02:43 crc kubenswrapper[4907]: I0226 16:02:43.865571 4907 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovn-controller-drng5" podStartSLOduration=28.590374031 podStartE2EDuration="33.865553916s" podCreationTimestamp="2026-02-26 16:02:10 +0000 UTC" firstStartedPulling="2026-02-26 16:02:37.592721362 +0000 UTC m=+1220.111283211" lastFinishedPulling="2026-02-26 16:02:42.867901247 +0000 UTC m=+1225.386463096" observedRunningTime="2026-02-26 16:02:43.857231735 +0000 UTC m=+1226.375793604" watchObservedRunningTime="2026-02-26 16:02:43.865553916 +0000 UTC m=+1226.384115765" Feb 26 16:02:43 crc kubenswrapper[4907]: I0226 16:02:43.908431 4907 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/kube-state-metrics-0" podStartSLOduration=14.762184088 podStartE2EDuration="36.908407797s" podCreationTimestamp="2026-02-26 16:02:07 +0000 UTC" firstStartedPulling="2026-02-26 16:02:20.495026754 +0000 UTC 
m=+1203.013588613" lastFinishedPulling="2026-02-26 16:02:42.641250473 +0000 UTC m=+1225.159812322" observedRunningTime="2026-02-26 16:02:43.872490964 +0000 UTC m=+1226.391052833" watchObservedRunningTime="2026-02-26 16:02:43.908407797 +0000 UTC m=+1226.426969646" Feb 26 16:02:44 crc kubenswrapper[4907]: I0226 16:02:44.879568 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-ovs-9qr64" event={"ID":"ce0f1161-6251-4318-b364-7db1779f93bd","Type":"ContainerStarted","Data":"8b54674221d974edc3d08a6fbc94aeeaf1b351eaebba3125b9cacb13dd69dc62"} Feb 26 16:02:44 crc kubenswrapper[4907]: I0226 16:02:44.880194 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-ovs-9qr64" event={"ID":"ce0f1161-6251-4318-b364-7db1779f93bd","Type":"ContainerStarted","Data":"6dc3b90f1db572896aa77232fffa4bd4fd263293f9a686cf946d4e0205cfdbe3"} Feb 26 16:02:44 crc kubenswrapper[4907]: I0226 16:02:44.880250 4907 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ovn-controller-ovs-9qr64" Feb 26 16:02:44 crc kubenswrapper[4907]: I0226 16:02:44.880278 4907 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ovn-controller-ovs-9qr64" Feb 26 16:02:44 crc kubenswrapper[4907]: I0226 16:02:44.908478 4907 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovn-controller-ovs-9qr64" podStartSLOduration=30.699619087 podStartE2EDuration="34.908457634s" podCreationTimestamp="2026-02-26 16:02:10 +0000 UTC" firstStartedPulling="2026-02-26 16:02:38.659979213 +0000 UTC m=+1221.178541062" lastFinishedPulling="2026-02-26 16:02:42.86881776 +0000 UTC m=+1225.387379609" observedRunningTime="2026-02-26 16:02:44.903062473 +0000 UTC m=+1227.421624322" watchObservedRunningTime="2026-02-26 16:02:44.908457634 +0000 UTC m=+1227.427019483" Feb 26 16:02:45 crc kubenswrapper[4907]: I0226 16:02:45.886961 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/ovsdbserver-nb-0" event={"ID":"a7d66633-e694-4e7e-ba21-70dc18b93cfb","Type":"ContainerStarted","Data":"c86791887f35565f641b1af172b915f62bbbe8d83487afa0ecdde87b7e6a39b2"} Feb 26 16:02:45 crc kubenswrapper[4907]: I0226 16:02:45.889278 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-sb-0" event={"ID":"6c669352-7f94-4a3c-bf6c-a84f7bf2e5e2","Type":"ContainerStarted","Data":"9ff2c932dfcbbad37fe6ea9ce2c3f96952aac82a0711bf7833518958e9cf3765"} Feb 26 16:02:45 crc kubenswrapper[4907]: I0226 16:02:45.908172 4907 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovsdbserver-nb-0" podStartSLOduration=30.345017135 podStartE2EDuration="35.908153033s" podCreationTimestamp="2026-02-26 16:02:10 +0000 UTC" firstStartedPulling="2026-02-26 16:02:39.502992106 +0000 UTC m=+1222.021553955" lastFinishedPulling="2026-02-26 16:02:45.066128004 +0000 UTC m=+1227.584689853" observedRunningTime="2026-02-26 16:02:45.903329487 +0000 UTC m=+1228.421891336" watchObservedRunningTime="2026-02-26 16:02:45.908153033 +0000 UTC m=+1228.426714882" Feb 26 16:02:45 crc kubenswrapper[4907]: I0226 16:02:45.923342 4907 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovsdbserver-sb-0" podStartSLOduration=25.962027908 podStartE2EDuration="32.923320962s" podCreationTimestamp="2026-02-26 16:02:13 +0000 UTC" firstStartedPulling="2026-02-26 16:02:38.099746386 +0000 UTC m=+1220.618308235" lastFinishedPulling="2026-02-26 16:02:45.06103944 +0000 UTC m=+1227.579601289" observedRunningTime="2026-02-26 16:02:45.922118493 +0000 UTC m=+1228.440680352" watchObservedRunningTime="2026-02-26 16:02:45.923320962 +0000 UTC m=+1228.441882811" Feb 26 16:02:46 crc kubenswrapper[4907]: I0226 16:02:46.343890 4907 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-666b6646f7-glzzt" Feb 26 16:02:46 crc kubenswrapper[4907]: I0226 16:02:46.749766 4907 kubelet.go:2542] "SyncLoop 
(probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-57d769cc4f-4pkdt" Feb 26 16:02:46 crc kubenswrapper[4907]: I0226 16:02:46.823058 4907 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-666b6646f7-glzzt"] Feb 26 16:02:46 crc kubenswrapper[4907]: I0226 16:02:46.896770 4907 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-666b6646f7-glzzt" podUID="b95ff69f-8d73-415b-8dbb-84ed7e1f3a91" containerName="dnsmasq-dns" containerID="cri-o://78937a2c86a2c79b920ea5f6abecd017f33ba2329f492814890716840a53e66e" gracePeriod=10 Feb 26 16:02:47 crc kubenswrapper[4907]: I0226 16:02:47.556836 4907 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-666b6646f7-glzzt" Feb 26 16:02:47 crc kubenswrapper[4907]: I0226 16:02:47.608043 4907 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ovsdbserver-nb-0" Feb 26 16:02:47 crc kubenswrapper[4907]: I0226 16:02:47.728811 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/b95ff69f-8d73-415b-8dbb-84ed7e1f3a91-dns-svc\") pod \"b95ff69f-8d73-415b-8dbb-84ed7e1f3a91\" (UID: \"b95ff69f-8d73-415b-8dbb-84ed7e1f3a91\") " Feb 26 16:02:47 crc kubenswrapper[4907]: I0226 16:02:47.728929 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b95ff69f-8d73-415b-8dbb-84ed7e1f3a91-config\") pod \"b95ff69f-8d73-415b-8dbb-84ed7e1f3a91\" (UID: \"b95ff69f-8d73-415b-8dbb-84ed7e1f3a91\") " Feb 26 16:02:47 crc kubenswrapper[4907]: I0226 16:02:47.729027 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-p6hdx\" (UniqueName: \"kubernetes.io/projected/b95ff69f-8d73-415b-8dbb-84ed7e1f3a91-kube-api-access-p6hdx\") pod \"b95ff69f-8d73-415b-8dbb-84ed7e1f3a91\" (UID: 
\"b95ff69f-8d73-415b-8dbb-84ed7e1f3a91\") " Feb 26 16:02:47 crc kubenswrapper[4907]: I0226 16:02:47.737419 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b95ff69f-8d73-415b-8dbb-84ed7e1f3a91-kube-api-access-p6hdx" (OuterVolumeSpecName: "kube-api-access-p6hdx") pod "b95ff69f-8d73-415b-8dbb-84ed7e1f3a91" (UID: "b95ff69f-8d73-415b-8dbb-84ed7e1f3a91"). InnerVolumeSpecName "kube-api-access-p6hdx". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 26 16:02:47 crc kubenswrapper[4907]: I0226 16:02:47.768795 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b95ff69f-8d73-415b-8dbb-84ed7e1f3a91-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "b95ff69f-8d73-415b-8dbb-84ed7e1f3a91" (UID: "b95ff69f-8d73-415b-8dbb-84ed7e1f3a91"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 26 16:02:47 crc kubenswrapper[4907]: I0226 16:02:47.776336 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b95ff69f-8d73-415b-8dbb-84ed7e1f3a91-config" (OuterVolumeSpecName: "config") pod "b95ff69f-8d73-415b-8dbb-84ed7e1f3a91" (UID: "b95ff69f-8d73-415b-8dbb-84ed7e1f3a91"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 26 16:02:47 crc kubenswrapper[4907]: I0226 16:02:47.831396 4907 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/b95ff69f-8d73-415b-8dbb-84ed7e1f3a91-dns-svc\") on node \"crc\" DevicePath \"\"" Feb 26 16:02:47 crc kubenswrapper[4907]: I0226 16:02:47.831473 4907 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b95ff69f-8d73-415b-8dbb-84ed7e1f3a91-config\") on node \"crc\" DevicePath \"\"" Feb 26 16:02:47 crc kubenswrapper[4907]: I0226 16:02:47.831485 4907 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-p6hdx\" (UniqueName: \"kubernetes.io/projected/b95ff69f-8d73-415b-8dbb-84ed7e1f3a91-kube-api-access-p6hdx\") on node \"crc\" DevicePath \"\"" Feb 26 16:02:47 crc kubenswrapper[4907]: I0226 16:02:47.869506 4907 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/ovsdbserver-sb-0" Feb 26 16:02:47 crc kubenswrapper[4907]: I0226 16:02:47.907971 4907 generic.go:334] "Generic (PLEG): container finished" podID="b95ff69f-8d73-415b-8dbb-84ed7e1f3a91" containerID="78937a2c86a2c79b920ea5f6abecd017f33ba2329f492814890716840a53e66e" exitCode=0 Feb 26 16:02:47 crc kubenswrapper[4907]: I0226 16:02:47.908927 4907 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-666b6646f7-glzzt" Feb 26 16:02:47 crc kubenswrapper[4907]: I0226 16:02:47.909539 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-666b6646f7-glzzt" event={"ID":"b95ff69f-8d73-415b-8dbb-84ed7e1f3a91","Type":"ContainerDied","Data":"78937a2c86a2c79b920ea5f6abecd017f33ba2329f492814890716840a53e66e"} Feb 26 16:02:47 crc kubenswrapper[4907]: I0226 16:02:47.909686 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-666b6646f7-glzzt" event={"ID":"b95ff69f-8d73-415b-8dbb-84ed7e1f3a91","Type":"ContainerDied","Data":"311d8973bfb95480ded0481cec6452568e7c60c1a186c233fc6f4b743fb84252"} Feb 26 16:02:47 crc kubenswrapper[4907]: I0226 16:02:47.909709 4907 scope.go:117] "RemoveContainer" containerID="78937a2c86a2c79b920ea5f6abecd017f33ba2329f492814890716840a53e66e" Feb 26 16:02:47 crc kubenswrapper[4907]: I0226 16:02:47.914869 4907 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/ovsdbserver-sb-0" Feb 26 16:02:47 crc kubenswrapper[4907]: I0226 16:02:47.938128 4907 scope.go:117] "RemoveContainer" containerID="0782c1caa3491e81dddf7bc25686e4fb970c2544dfc76bcef42168b9a81c538e" Feb 26 16:02:47 crc kubenswrapper[4907]: I0226 16:02:47.954316 4907 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-666b6646f7-glzzt"] Feb 26 16:02:47 crc kubenswrapper[4907]: I0226 16:02:47.961516 4907 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-666b6646f7-glzzt"] Feb 26 16:02:47 crc kubenswrapper[4907]: I0226 16:02:47.962685 4907 scope.go:117] "RemoveContainer" containerID="78937a2c86a2c79b920ea5f6abecd017f33ba2329f492814890716840a53e66e" Feb 26 16:02:47 crc kubenswrapper[4907]: E0226 16:02:47.963246 4907 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"78937a2c86a2c79b920ea5f6abecd017f33ba2329f492814890716840a53e66e\": container 
with ID starting with 78937a2c86a2c79b920ea5f6abecd017f33ba2329f492814890716840a53e66e not found: ID does not exist" containerID="78937a2c86a2c79b920ea5f6abecd017f33ba2329f492814890716840a53e66e" Feb 26 16:02:47 crc kubenswrapper[4907]: I0226 16:02:47.963295 4907 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"78937a2c86a2c79b920ea5f6abecd017f33ba2329f492814890716840a53e66e"} err="failed to get container status \"78937a2c86a2c79b920ea5f6abecd017f33ba2329f492814890716840a53e66e\": rpc error: code = NotFound desc = could not find container \"78937a2c86a2c79b920ea5f6abecd017f33ba2329f492814890716840a53e66e\": container with ID starting with 78937a2c86a2c79b920ea5f6abecd017f33ba2329f492814890716840a53e66e not found: ID does not exist" Feb 26 16:02:47 crc kubenswrapper[4907]: I0226 16:02:47.963323 4907 scope.go:117] "RemoveContainer" containerID="0782c1caa3491e81dddf7bc25686e4fb970c2544dfc76bcef42168b9a81c538e" Feb 26 16:02:47 crc kubenswrapper[4907]: E0226 16:02:47.963895 4907 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"0782c1caa3491e81dddf7bc25686e4fb970c2544dfc76bcef42168b9a81c538e\": container with ID starting with 0782c1caa3491e81dddf7bc25686e4fb970c2544dfc76bcef42168b9a81c538e not found: ID does not exist" containerID="0782c1caa3491e81dddf7bc25686e4fb970c2544dfc76bcef42168b9a81c538e" Feb 26 16:02:47 crc kubenswrapper[4907]: I0226 16:02:47.963934 4907 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0782c1caa3491e81dddf7bc25686e4fb970c2544dfc76bcef42168b9a81c538e"} err="failed to get container status \"0782c1caa3491e81dddf7bc25686e4fb970c2544dfc76bcef42168b9a81c538e\": rpc error: code = NotFound desc = could not find container \"0782c1caa3491e81dddf7bc25686e4fb970c2544dfc76bcef42168b9a81c538e\": container with ID starting with 0782c1caa3491e81dddf7bc25686e4fb970c2544dfc76bcef42168b9a81c538e not 
found: ID does not exist" Feb 26 16:02:48 crc kubenswrapper[4907]: I0226 16:02:48.139388 4907 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b95ff69f-8d73-415b-8dbb-84ed7e1f3a91" path="/var/lib/kubelet/pods/b95ff69f-8d73-415b-8dbb-84ed7e1f3a91/volumes" Feb 26 16:02:48 crc kubenswrapper[4907]: I0226 16:02:48.238663 4907 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/kube-state-metrics-0" Feb 26 16:02:48 crc kubenswrapper[4907]: I0226 16:02:48.529972 4907 patch_prober.go:28] interesting pod/machine-config-daemon-v5ng6 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 26 16:02:48 crc kubenswrapper[4907]: I0226 16:02:48.530314 4907 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-v5ng6" podUID="917eebf3-db36-47b8-af0a-b80d042fddab" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 26 16:02:48 crc kubenswrapper[4907]: I0226 16:02:48.530376 4907 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-v5ng6" Feb 26 16:02:48 crc kubenswrapper[4907]: I0226 16:02:48.531391 4907 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"2db300a26f9a65971b75abb9b1132aae00d9a358285f4cb580b858c6563b8062"} pod="openshift-machine-config-operator/machine-config-daemon-v5ng6" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Feb 26 16:02:48 crc kubenswrapper[4907]: I0226 16:02:48.531471 4907 kuberuntime_container.go:808] "Killing container with a grace period" 
pod="openshift-machine-config-operator/machine-config-daemon-v5ng6" podUID="917eebf3-db36-47b8-af0a-b80d042fddab" containerName="machine-config-daemon" containerID="cri-o://2db300a26f9a65971b75abb9b1132aae00d9a358285f4cb580b858c6563b8062" gracePeriod=600 Feb 26 16:02:48 crc kubenswrapper[4907]: I0226 16:02:48.608630 4907 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/ovsdbserver-nb-0" Feb 26 16:02:48 crc kubenswrapper[4907]: I0226 16:02:48.648630 4907 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/ovsdbserver-nb-0" Feb 26 16:02:48 crc kubenswrapper[4907]: I0226 16:02:48.918213 4907 generic.go:334] "Generic (PLEG): container finished" podID="917eebf3-db36-47b8-af0a-b80d042fddab" containerID="2db300a26f9a65971b75abb9b1132aae00d9a358285f4cb580b858c6563b8062" exitCode=0 Feb 26 16:02:48 crc kubenswrapper[4907]: I0226 16:02:48.918291 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-v5ng6" event={"ID":"917eebf3-db36-47b8-af0a-b80d042fddab","Type":"ContainerDied","Data":"2db300a26f9a65971b75abb9b1132aae00d9a358285f4cb580b858c6563b8062"} Feb 26 16:02:48 crc kubenswrapper[4907]: I0226 16:02:48.918352 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-v5ng6" event={"ID":"917eebf3-db36-47b8-af0a-b80d042fddab","Type":"ContainerStarted","Data":"39faa61e9e899f01de0dcddf00d83aac761ca87f8fd53bc6d256f2980199847a"} Feb 26 16:02:48 crc kubenswrapper[4907]: I0226 16:02:48.918371 4907 scope.go:117] "RemoveContainer" containerID="9e579d2506f44ad3d5c29d72a7fa0d983bb32b89f28c090014c2276378479cce" Feb 26 16:02:48 crc kubenswrapper[4907]: I0226 16:02:48.918547 4907 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ovsdbserver-sb-0" Feb 26 16:02:48 crc kubenswrapper[4907]: I0226 16:02:48.964627 4907 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" 
status="ready" pod="openstack/ovsdbserver-nb-0" Feb 26 16:02:48 crc kubenswrapper[4907]: I0226 16:02:48.977050 4907 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ovsdbserver-sb-0" Feb 26 16:02:49 crc kubenswrapper[4907]: I0226 16:02:49.247146 4907 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-5bf47b49b7-sfq9z"] Feb 26 16:02:49 crc kubenswrapper[4907]: E0226 16:02:49.247841 4907 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b95ff69f-8d73-415b-8dbb-84ed7e1f3a91" containerName="dnsmasq-dns" Feb 26 16:02:49 crc kubenswrapper[4907]: I0226 16:02:49.247866 4907 state_mem.go:107] "Deleted CPUSet assignment" podUID="b95ff69f-8d73-415b-8dbb-84ed7e1f3a91" containerName="dnsmasq-dns" Feb 26 16:02:49 crc kubenswrapper[4907]: E0226 16:02:49.247894 4907 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b95ff69f-8d73-415b-8dbb-84ed7e1f3a91" containerName="init" Feb 26 16:02:49 crc kubenswrapper[4907]: I0226 16:02:49.247903 4907 state_mem.go:107] "Deleted CPUSet assignment" podUID="b95ff69f-8d73-415b-8dbb-84ed7e1f3a91" containerName="init" Feb 26 16:02:49 crc kubenswrapper[4907]: I0226 16:02:49.248079 4907 memory_manager.go:354] "RemoveStaleState removing state" podUID="b95ff69f-8d73-415b-8dbb-84ed7e1f3a91" containerName="dnsmasq-dns" Feb 26 16:02:49 crc kubenswrapper[4907]: I0226 16:02:49.248988 4907 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-5bf47b49b7-sfq9z" Feb 26 16:02:49 crc kubenswrapper[4907]: I0226 16:02:49.254844 4907 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovsdbserver-nb" Feb 26 16:02:49 crc kubenswrapper[4907]: I0226 16:02:49.270491 4907 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-5bf47b49b7-sfq9z"] Feb 26 16:02:49 crc kubenswrapper[4907]: I0226 16:02:49.374039 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/ff2c92c6-ced4-4d3e-91c3-7745376793eb-dns-svc\") pod \"dnsmasq-dns-5bf47b49b7-sfq9z\" (UID: \"ff2c92c6-ced4-4d3e-91c3-7745376793eb\") " pod="openstack/dnsmasq-dns-5bf47b49b7-sfq9z" Feb 26 16:02:49 crc kubenswrapper[4907]: I0226 16:02:49.374125 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/ff2c92c6-ced4-4d3e-91c3-7745376793eb-ovsdbserver-nb\") pod \"dnsmasq-dns-5bf47b49b7-sfq9z\" (UID: \"ff2c92c6-ced4-4d3e-91c3-7745376793eb\") " pod="openstack/dnsmasq-dns-5bf47b49b7-sfq9z" Feb 26 16:02:49 crc kubenswrapper[4907]: I0226 16:02:49.374158 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ff2c92c6-ced4-4d3e-91c3-7745376793eb-config\") pod \"dnsmasq-dns-5bf47b49b7-sfq9z\" (UID: \"ff2c92c6-ced4-4d3e-91c3-7745376793eb\") " pod="openstack/dnsmasq-dns-5bf47b49b7-sfq9z" Feb 26 16:02:49 crc kubenswrapper[4907]: I0226 16:02:49.374330 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2sxqg\" (UniqueName: \"kubernetes.io/projected/ff2c92c6-ced4-4d3e-91c3-7745376793eb-kube-api-access-2sxqg\") pod \"dnsmasq-dns-5bf47b49b7-sfq9z\" (UID: \"ff2c92c6-ced4-4d3e-91c3-7745376793eb\") " 
pod="openstack/dnsmasq-dns-5bf47b49b7-sfq9z" Feb 26 16:02:49 crc kubenswrapper[4907]: I0226 16:02:49.449951 4907 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovn-controller-metrics-j84zw"] Feb 26 16:02:49 crc kubenswrapper[4907]: I0226 16:02:49.451058 4907 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-metrics-j84zw" Feb 26 16:02:49 crc kubenswrapper[4907]: I0226 16:02:49.453038 4907 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovncontroller-metrics-config" Feb 26 16:02:49 crc kubenswrapper[4907]: I0226 16:02:49.477534 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2sxqg\" (UniqueName: \"kubernetes.io/projected/ff2c92c6-ced4-4d3e-91c3-7745376793eb-kube-api-access-2sxqg\") pod \"dnsmasq-dns-5bf47b49b7-sfq9z\" (UID: \"ff2c92c6-ced4-4d3e-91c3-7745376793eb\") " pod="openstack/dnsmasq-dns-5bf47b49b7-sfq9z" Feb 26 16:02:49 crc kubenswrapper[4907]: I0226 16:02:49.477676 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/ff2c92c6-ced4-4d3e-91c3-7745376793eb-dns-svc\") pod \"dnsmasq-dns-5bf47b49b7-sfq9z\" (UID: \"ff2c92c6-ced4-4d3e-91c3-7745376793eb\") " pod="openstack/dnsmasq-dns-5bf47b49b7-sfq9z" Feb 26 16:02:49 crc kubenswrapper[4907]: I0226 16:02:49.477729 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/ff2c92c6-ced4-4d3e-91c3-7745376793eb-ovsdbserver-nb\") pod \"dnsmasq-dns-5bf47b49b7-sfq9z\" (UID: \"ff2c92c6-ced4-4d3e-91c3-7745376793eb\") " pod="openstack/dnsmasq-dns-5bf47b49b7-sfq9z" Feb 26 16:02:49 crc kubenswrapper[4907]: I0226 16:02:49.477755 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ff2c92c6-ced4-4d3e-91c3-7745376793eb-config\") pod 
\"dnsmasq-dns-5bf47b49b7-sfq9z\" (UID: \"ff2c92c6-ced4-4d3e-91c3-7745376793eb\") " pod="openstack/dnsmasq-dns-5bf47b49b7-sfq9z" Feb 26 16:02:49 crc kubenswrapper[4907]: I0226 16:02:49.478810 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ff2c92c6-ced4-4d3e-91c3-7745376793eb-config\") pod \"dnsmasq-dns-5bf47b49b7-sfq9z\" (UID: \"ff2c92c6-ced4-4d3e-91c3-7745376793eb\") " pod="openstack/dnsmasq-dns-5bf47b49b7-sfq9z" Feb 26 16:02:49 crc kubenswrapper[4907]: I0226 16:02:49.479061 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/ff2c92c6-ced4-4d3e-91c3-7745376793eb-dns-svc\") pod \"dnsmasq-dns-5bf47b49b7-sfq9z\" (UID: \"ff2c92c6-ced4-4d3e-91c3-7745376793eb\") " pod="openstack/dnsmasq-dns-5bf47b49b7-sfq9z" Feb 26 16:02:49 crc kubenswrapper[4907]: I0226 16:02:49.479295 4907 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-metrics-j84zw"] Feb 26 16:02:49 crc kubenswrapper[4907]: I0226 16:02:49.479792 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/ff2c92c6-ced4-4d3e-91c3-7745376793eb-ovsdbserver-nb\") pod \"dnsmasq-dns-5bf47b49b7-sfq9z\" (UID: \"ff2c92c6-ced4-4d3e-91c3-7745376793eb\") " pod="openstack/dnsmasq-dns-5bf47b49b7-sfq9z" Feb 26 16:02:49 crc kubenswrapper[4907]: I0226 16:02:49.509356 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2sxqg\" (UniqueName: \"kubernetes.io/projected/ff2c92c6-ced4-4d3e-91c3-7745376793eb-kube-api-access-2sxqg\") pod \"dnsmasq-dns-5bf47b49b7-sfq9z\" (UID: \"ff2c92c6-ced4-4d3e-91c3-7745376793eb\") " pod="openstack/dnsmasq-dns-5bf47b49b7-sfq9z" Feb 26 16:02:49 crc kubenswrapper[4907]: I0226 16:02:49.568649 4907 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-5bf47b49b7-sfq9z"] Feb 26 16:02:49 crc 
kubenswrapper[4907]: I0226 16:02:49.569173 4907 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-5bf47b49b7-sfq9z" Feb 26 16:02:49 crc kubenswrapper[4907]: I0226 16:02:49.592168 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovs-rundir\" (UniqueName: \"kubernetes.io/host-path/928405a5-2e89-44dd-ab55-8d82ba1db8c3-ovs-rundir\") pod \"ovn-controller-metrics-j84zw\" (UID: \"928405a5-2e89-44dd-ab55-8d82ba1db8c3\") " pod="openstack/ovn-controller-metrics-j84zw" Feb 26 16:02:49 crc kubenswrapper[4907]: I0226 16:02:49.592208 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/928405a5-2e89-44dd-ab55-8d82ba1db8c3-combined-ca-bundle\") pod \"ovn-controller-metrics-j84zw\" (UID: \"928405a5-2e89-44dd-ab55-8d82ba1db8c3\") " pod="openstack/ovn-controller-metrics-j84zw" Feb 26 16:02:49 crc kubenswrapper[4907]: I0226 16:02:49.592231 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/928405a5-2e89-44dd-ab55-8d82ba1db8c3-metrics-certs-tls-certs\") pod \"ovn-controller-metrics-j84zw\" (UID: \"928405a5-2e89-44dd-ab55-8d82ba1db8c3\") " pod="openstack/ovn-controller-metrics-j84zw" Feb 26 16:02:49 crc kubenswrapper[4907]: I0226 16:02:49.592248 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/host-path/928405a5-2e89-44dd-ab55-8d82ba1db8c3-ovn-rundir\") pod \"ovn-controller-metrics-j84zw\" (UID: \"928405a5-2e89-44dd-ab55-8d82ba1db8c3\") " pod="openstack/ovn-controller-metrics-j84zw" Feb 26 16:02:49 crc kubenswrapper[4907]: I0226 16:02:49.592349 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jf2rs\" 
(UniqueName: \"kubernetes.io/projected/928405a5-2e89-44dd-ab55-8d82ba1db8c3-kube-api-access-jf2rs\") pod \"ovn-controller-metrics-j84zw\" (UID: \"928405a5-2e89-44dd-ab55-8d82ba1db8c3\") " pod="openstack/ovn-controller-metrics-j84zw" Feb 26 16:02:49 crc kubenswrapper[4907]: I0226 16:02:49.592466 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/928405a5-2e89-44dd-ab55-8d82ba1db8c3-config\") pod \"ovn-controller-metrics-j84zw\" (UID: \"928405a5-2e89-44dd-ab55-8d82ba1db8c3\") " pod="openstack/ovn-controller-metrics-j84zw" Feb 26 16:02:49 crc kubenswrapper[4907]: I0226 16:02:49.626926 4907 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-8554648995-8xg6m"] Feb 26 16:02:49 crc kubenswrapper[4907]: I0226 16:02:49.628127 4907 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-8554648995-8xg6m" Feb 26 16:02:49 crc kubenswrapper[4907]: I0226 16:02:49.630016 4907 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovsdbserver-sb" Feb 26 16:02:49 crc kubenswrapper[4907]: I0226 16:02:49.657489 4907 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-8554648995-8xg6m"] Feb 26 16:02:49 crc kubenswrapper[4907]: I0226 16:02:49.678756 4907 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovn-northd-0"] Feb 26 16:02:49 crc kubenswrapper[4907]: I0226 16:02:49.680544 4907 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-northd-0" Feb 26 16:02:49 crc kubenswrapper[4907]: I0226 16:02:49.694327 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jf2rs\" (UniqueName: \"kubernetes.io/projected/928405a5-2e89-44dd-ab55-8d82ba1db8c3-kube-api-access-jf2rs\") pod \"ovn-controller-metrics-j84zw\" (UID: \"928405a5-2e89-44dd-ab55-8d82ba1db8c3\") " pod="openstack/ovn-controller-metrics-j84zw" Feb 26 16:02:49 crc kubenswrapper[4907]: I0226 16:02:49.694380 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-knzkk\" (UniqueName: \"kubernetes.io/projected/369d3e0e-75cc-440c-95d3-cfb112914d57-kube-api-access-knzkk\") pod \"dnsmasq-dns-8554648995-8xg6m\" (UID: \"369d3e0e-75cc-440c-95d3-cfb112914d57\") " pod="openstack/dnsmasq-dns-8554648995-8xg6m" Feb 26 16:02:49 crc kubenswrapper[4907]: I0226 16:02:49.694411 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/369d3e0e-75cc-440c-95d3-cfb112914d57-config\") pod \"dnsmasq-dns-8554648995-8xg6m\" (UID: \"369d3e0e-75cc-440c-95d3-cfb112914d57\") " pod="openstack/dnsmasq-dns-8554648995-8xg6m" Feb 26 16:02:49 crc kubenswrapper[4907]: I0226 16:02:49.694438 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/369d3e0e-75cc-440c-95d3-cfb112914d57-ovsdbserver-nb\") pod \"dnsmasq-dns-8554648995-8xg6m\" (UID: \"369d3e0e-75cc-440c-95d3-cfb112914d57\") " pod="openstack/dnsmasq-dns-8554648995-8xg6m" Feb 26 16:02:49 crc kubenswrapper[4907]: I0226 16:02:49.694521 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/928405a5-2e89-44dd-ab55-8d82ba1db8c3-config\") pod \"ovn-controller-metrics-j84zw\" (UID: 
\"928405a5-2e89-44dd-ab55-8d82ba1db8c3\") " pod="openstack/ovn-controller-metrics-j84zw" Feb 26 16:02:49 crc kubenswrapper[4907]: I0226 16:02:49.694558 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/369d3e0e-75cc-440c-95d3-cfb112914d57-dns-svc\") pod \"dnsmasq-dns-8554648995-8xg6m\" (UID: \"369d3e0e-75cc-440c-95d3-cfb112914d57\") " pod="openstack/dnsmasq-dns-8554648995-8xg6m" Feb 26 16:02:49 crc kubenswrapper[4907]: I0226 16:02:49.694621 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovs-rundir\" (UniqueName: \"kubernetes.io/host-path/928405a5-2e89-44dd-ab55-8d82ba1db8c3-ovs-rundir\") pod \"ovn-controller-metrics-j84zw\" (UID: \"928405a5-2e89-44dd-ab55-8d82ba1db8c3\") " pod="openstack/ovn-controller-metrics-j84zw" Feb 26 16:02:49 crc kubenswrapper[4907]: I0226 16:02:49.694648 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/928405a5-2e89-44dd-ab55-8d82ba1db8c3-combined-ca-bundle\") pod \"ovn-controller-metrics-j84zw\" (UID: \"928405a5-2e89-44dd-ab55-8d82ba1db8c3\") " pod="openstack/ovn-controller-metrics-j84zw" Feb 26 16:02:49 crc kubenswrapper[4907]: I0226 16:02:49.694672 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/928405a5-2e89-44dd-ab55-8d82ba1db8c3-metrics-certs-tls-certs\") pod \"ovn-controller-metrics-j84zw\" (UID: \"928405a5-2e89-44dd-ab55-8d82ba1db8c3\") " pod="openstack/ovn-controller-metrics-j84zw" Feb 26 16:02:49 crc kubenswrapper[4907]: I0226 16:02:49.694696 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/host-path/928405a5-2e89-44dd-ab55-8d82ba1db8c3-ovn-rundir\") pod \"ovn-controller-metrics-j84zw\" (UID: \"928405a5-2e89-44dd-ab55-8d82ba1db8c3\") " 
pod="openstack/ovn-controller-metrics-j84zw" Feb 26 16:02:49 crc kubenswrapper[4907]: I0226 16:02:49.694741 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/369d3e0e-75cc-440c-95d3-cfb112914d57-ovsdbserver-sb\") pod \"dnsmasq-dns-8554648995-8xg6m\" (UID: \"369d3e0e-75cc-440c-95d3-cfb112914d57\") " pod="openstack/dnsmasq-dns-8554648995-8xg6m" Feb 26 16:02:49 crc kubenswrapper[4907]: I0226 16:02:49.695370 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovs-rundir\" (UniqueName: \"kubernetes.io/host-path/928405a5-2e89-44dd-ab55-8d82ba1db8c3-ovs-rundir\") pod \"ovn-controller-metrics-j84zw\" (UID: \"928405a5-2e89-44dd-ab55-8d82ba1db8c3\") " pod="openstack/ovn-controller-metrics-j84zw" Feb 26 16:02:49 crc kubenswrapper[4907]: I0226 16:02:49.696124 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/928405a5-2e89-44dd-ab55-8d82ba1db8c3-config\") pod \"ovn-controller-metrics-j84zw\" (UID: \"928405a5-2e89-44dd-ab55-8d82ba1db8c3\") " pod="openstack/ovn-controller-metrics-j84zw" Feb 26 16:02:49 crc kubenswrapper[4907]: I0226 16:02:49.696607 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/host-path/928405a5-2e89-44dd-ab55-8d82ba1db8c3-ovn-rundir\") pod \"ovn-controller-metrics-j84zw\" (UID: \"928405a5-2e89-44dd-ab55-8d82ba1db8c3\") " pod="openstack/ovn-controller-metrics-j84zw" Feb 26 16:02:49 crc kubenswrapper[4907]: I0226 16:02:49.704791 4907 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovnnorthd-scripts" Feb 26 16:02:49 crc kubenswrapper[4907]: I0226 16:02:49.704943 4907 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ovnnorthd-ovndbs" Feb 26 16:02:49 crc kubenswrapper[4907]: I0226 16:02:49.705357 4907 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/928405a5-2e89-44dd-ab55-8d82ba1db8c3-combined-ca-bundle\") pod \"ovn-controller-metrics-j84zw\" (UID: \"928405a5-2e89-44dd-ab55-8d82ba1db8c3\") " pod="openstack/ovn-controller-metrics-j84zw" Feb 26 16:02:49 crc kubenswrapper[4907]: I0226 16:02:49.705986 4907 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ovnnorthd-ovnnorthd-dockercfg-qz66x" Feb 26 16:02:49 crc kubenswrapper[4907]: I0226 16:02:49.708048 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/928405a5-2e89-44dd-ab55-8d82ba1db8c3-metrics-certs-tls-certs\") pod \"ovn-controller-metrics-j84zw\" (UID: \"928405a5-2e89-44dd-ab55-8d82ba1db8c3\") " pod="openstack/ovn-controller-metrics-j84zw" Feb 26 16:02:49 crc kubenswrapper[4907]: I0226 16:02:49.724943 4907 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovnnorthd-config" Feb 26 16:02:49 crc kubenswrapper[4907]: I0226 16:02:49.739578 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jf2rs\" (UniqueName: \"kubernetes.io/projected/928405a5-2e89-44dd-ab55-8d82ba1db8c3-kube-api-access-jf2rs\") pod \"ovn-controller-metrics-j84zw\" (UID: \"928405a5-2e89-44dd-ab55-8d82ba1db8c3\") " pod="openstack/ovn-controller-metrics-j84zw" Feb 26 16:02:49 crc kubenswrapper[4907]: I0226 16:02:49.747657 4907 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-northd-0"] Feb 26 16:02:49 crc kubenswrapper[4907]: I0226 16:02:49.769464 4907 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-controller-metrics-j84zw" Feb 26 16:02:49 crc kubenswrapper[4907]: I0226 16:02:49.796273 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-knzkk\" (UniqueName: \"kubernetes.io/projected/369d3e0e-75cc-440c-95d3-cfb112914d57-kube-api-access-knzkk\") pod \"dnsmasq-dns-8554648995-8xg6m\" (UID: \"369d3e0e-75cc-440c-95d3-cfb112914d57\") " pod="openstack/dnsmasq-dns-8554648995-8xg6m" Feb 26 16:02:49 crc kubenswrapper[4907]: I0226 16:02:49.796552 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/0ac0b04a-5f93-4033-b52a-46a47b9f3364-metrics-certs-tls-certs\") pod \"ovn-northd-0\" (UID: \"0ac0b04a-5f93-4033-b52a-46a47b9f3364\") " pod="openstack/ovn-northd-0" Feb 26 16:02:49 crc kubenswrapper[4907]: I0226 16:02:49.796775 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/369d3e0e-75cc-440c-95d3-cfb112914d57-config\") pod \"dnsmasq-dns-8554648995-8xg6m\" (UID: \"369d3e0e-75cc-440c-95d3-cfb112914d57\") " pod="openstack/dnsmasq-dns-8554648995-8xg6m" Feb 26 16:02:49 crc kubenswrapper[4907]: I0226 16:02:49.796864 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/369d3e0e-75cc-440c-95d3-cfb112914d57-ovsdbserver-nb\") pod \"dnsmasq-dns-8554648995-8xg6m\" (UID: \"369d3e0e-75cc-440c-95d3-cfb112914d57\") " pod="openstack/dnsmasq-dns-8554648995-8xg6m" Feb 26 16:02:49 crc kubenswrapper[4907]: I0226 16:02:49.796955 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/0ac0b04a-5f93-4033-b52a-46a47b9f3364-config\") pod \"ovn-northd-0\" (UID: \"0ac0b04a-5f93-4033-b52a-46a47b9f3364\") " pod="openstack/ovn-northd-0" Feb 26 
16:02:49 crc kubenswrapper[4907]: I0226 16:02:49.797081 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-northd-tls-certs\" (UniqueName: \"kubernetes.io/secret/0ac0b04a-5f93-4033-b52a-46a47b9f3364-ovn-northd-tls-certs\") pod \"ovn-northd-0\" (UID: \"0ac0b04a-5f93-4033-b52a-46a47b9f3364\") " pod="openstack/ovn-northd-0" Feb 26 16:02:49 crc kubenswrapper[4907]: I0226 16:02:49.797176 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/369d3e0e-75cc-440c-95d3-cfb112914d57-dns-svc\") pod \"dnsmasq-dns-8554648995-8xg6m\" (UID: \"369d3e0e-75cc-440c-95d3-cfb112914d57\") " pod="openstack/dnsmasq-dns-8554648995-8xg6m" Feb 26 16:02:49 crc kubenswrapper[4907]: I0226 16:02:49.797266 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/empty-dir/0ac0b04a-5f93-4033-b52a-46a47b9f3364-ovn-rundir\") pod \"ovn-northd-0\" (UID: \"0ac0b04a-5f93-4033-b52a-46a47b9f3364\") " pod="openstack/ovn-northd-0" Feb 26 16:02:49 crc kubenswrapper[4907]: I0226 16:02:49.797359 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/0ac0b04a-5f93-4033-b52a-46a47b9f3364-scripts\") pod \"ovn-northd-0\" (UID: \"0ac0b04a-5f93-4033-b52a-46a47b9f3364\") " pod="openstack/ovn-northd-0" Feb 26 16:02:49 crc kubenswrapper[4907]: I0226 16:02:49.797451 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/369d3e0e-75cc-440c-95d3-cfb112914d57-ovsdbserver-sb\") pod \"dnsmasq-dns-8554648995-8xg6m\" (UID: \"369d3e0e-75cc-440c-95d3-cfb112914d57\") " pod="openstack/dnsmasq-dns-8554648995-8xg6m" Feb 26 16:02:49 crc kubenswrapper[4907]: I0226 16:02:49.797529 4907 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0ac0b04a-5f93-4033-b52a-46a47b9f3364-combined-ca-bundle\") pod \"ovn-northd-0\" (UID: \"0ac0b04a-5f93-4033-b52a-46a47b9f3364\") " pod="openstack/ovn-northd-0" Feb 26 16:02:49 crc kubenswrapper[4907]: I0226 16:02:49.797691 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lm4j4\" (UniqueName: \"kubernetes.io/projected/0ac0b04a-5f93-4033-b52a-46a47b9f3364-kube-api-access-lm4j4\") pod \"ovn-northd-0\" (UID: \"0ac0b04a-5f93-4033-b52a-46a47b9f3364\") " pod="openstack/ovn-northd-0" Feb 26 16:02:49 crc kubenswrapper[4907]: I0226 16:02:49.798106 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/369d3e0e-75cc-440c-95d3-cfb112914d57-config\") pod \"dnsmasq-dns-8554648995-8xg6m\" (UID: \"369d3e0e-75cc-440c-95d3-cfb112914d57\") " pod="openstack/dnsmasq-dns-8554648995-8xg6m" Feb 26 16:02:49 crc kubenswrapper[4907]: I0226 16:02:49.798205 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/369d3e0e-75cc-440c-95d3-cfb112914d57-dns-svc\") pod \"dnsmasq-dns-8554648995-8xg6m\" (UID: \"369d3e0e-75cc-440c-95d3-cfb112914d57\") " pod="openstack/dnsmasq-dns-8554648995-8xg6m" Feb 26 16:02:49 crc kubenswrapper[4907]: I0226 16:02:49.799621 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/369d3e0e-75cc-440c-95d3-cfb112914d57-ovsdbserver-sb\") pod \"dnsmasq-dns-8554648995-8xg6m\" (UID: \"369d3e0e-75cc-440c-95d3-cfb112914d57\") " pod="openstack/dnsmasq-dns-8554648995-8xg6m" Feb 26 16:02:49 crc kubenswrapper[4907]: I0226 16:02:49.799804 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: 
\"kubernetes.io/configmap/369d3e0e-75cc-440c-95d3-cfb112914d57-ovsdbserver-nb\") pod \"dnsmasq-dns-8554648995-8xg6m\" (UID: \"369d3e0e-75cc-440c-95d3-cfb112914d57\") " pod="openstack/dnsmasq-dns-8554648995-8xg6m" Feb 26 16:02:49 crc kubenswrapper[4907]: I0226 16:02:49.822773 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-knzkk\" (UniqueName: \"kubernetes.io/projected/369d3e0e-75cc-440c-95d3-cfb112914d57-kube-api-access-knzkk\") pod \"dnsmasq-dns-8554648995-8xg6m\" (UID: \"369d3e0e-75cc-440c-95d3-cfb112914d57\") " pod="openstack/dnsmasq-dns-8554648995-8xg6m" Feb 26 16:02:49 crc kubenswrapper[4907]: I0226 16:02:49.900179 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/0ac0b04a-5f93-4033-b52a-46a47b9f3364-metrics-certs-tls-certs\") pod \"ovn-northd-0\" (UID: \"0ac0b04a-5f93-4033-b52a-46a47b9f3364\") " pod="openstack/ovn-northd-0" Feb 26 16:02:49 crc kubenswrapper[4907]: I0226 16:02:49.900430 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/0ac0b04a-5f93-4033-b52a-46a47b9f3364-config\") pod \"ovn-northd-0\" (UID: \"0ac0b04a-5f93-4033-b52a-46a47b9f3364\") " pod="openstack/ovn-northd-0" Feb 26 16:02:49 crc kubenswrapper[4907]: I0226 16:02:49.900519 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-northd-tls-certs\" (UniqueName: \"kubernetes.io/secret/0ac0b04a-5f93-4033-b52a-46a47b9f3364-ovn-northd-tls-certs\") pod \"ovn-northd-0\" (UID: \"0ac0b04a-5f93-4033-b52a-46a47b9f3364\") " pod="openstack/ovn-northd-0" Feb 26 16:02:49 crc kubenswrapper[4907]: I0226 16:02:49.900562 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/empty-dir/0ac0b04a-5f93-4033-b52a-46a47b9f3364-ovn-rundir\") pod \"ovn-northd-0\" (UID: 
\"0ac0b04a-5f93-4033-b52a-46a47b9f3364\") " pod="openstack/ovn-northd-0" Feb 26 16:02:49 crc kubenswrapper[4907]: I0226 16:02:49.900633 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/0ac0b04a-5f93-4033-b52a-46a47b9f3364-scripts\") pod \"ovn-northd-0\" (UID: \"0ac0b04a-5f93-4033-b52a-46a47b9f3364\") " pod="openstack/ovn-northd-0" Feb 26 16:02:49 crc kubenswrapper[4907]: I0226 16:02:49.900682 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0ac0b04a-5f93-4033-b52a-46a47b9f3364-combined-ca-bundle\") pod \"ovn-northd-0\" (UID: \"0ac0b04a-5f93-4033-b52a-46a47b9f3364\") " pod="openstack/ovn-northd-0" Feb 26 16:02:49 crc kubenswrapper[4907]: I0226 16:02:49.900707 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lm4j4\" (UniqueName: \"kubernetes.io/projected/0ac0b04a-5f93-4033-b52a-46a47b9f3364-kube-api-access-lm4j4\") pod \"ovn-northd-0\" (UID: \"0ac0b04a-5f93-4033-b52a-46a47b9f3364\") " pod="openstack/ovn-northd-0" Feb 26 16:02:49 crc kubenswrapper[4907]: I0226 16:02:49.902302 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/0ac0b04a-5f93-4033-b52a-46a47b9f3364-config\") pod \"ovn-northd-0\" (UID: \"0ac0b04a-5f93-4033-b52a-46a47b9f3364\") " pod="openstack/ovn-northd-0" Feb 26 16:02:49 crc kubenswrapper[4907]: I0226 16:02:49.904573 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/0ac0b04a-5f93-4033-b52a-46a47b9f3364-scripts\") pod \"ovn-northd-0\" (UID: \"0ac0b04a-5f93-4033-b52a-46a47b9f3364\") " pod="openstack/ovn-northd-0" Feb 26 16:02:49 crc kubenswrapper[4907]: I0226 16:02:49.904908 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-rundir\" (UniqueName: 
\"kubernetes.io/empty-dir/0ac0b04a-5f93-4033-b52a-46a47b9f3364-ovn-rundir\") pod \"ovn-northd-0\" (UID: \"0ac0b04a-5f93-4033-b52a-46a47b9f3364\") " pod="openstack/ovn-northd-0" Feb 26 16:02:49 crc kubenswrapper[4907]: I0226 16:02:49.911071 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0ac0b04a-5f93-4033-b52a-46a47b9f3364-combined-ca-bundle\") pod \"ovn-northd-0\" (UID: \"0ac0b04a-5f93-4033-b52a-46a47b9f3364\") " pod="openstack/ovn-northd-0" Feb 26 16:02:49 crc kubenswrapper[4907]: I0226 16:02:49.915755 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-northd-tls-certs\" (UniqueName: \"kubernetes.io/secret/0ac0b04a-5f93-4033-b52a-46a47b9f3364-ovn-northd-tls-certs\") pod \"ovn-northd-0\" (UID: \"0ac0b04a-5f93-4033-b52a-46a47b9f3364\") " pod="openstack/ovn-northd-0" Feb 26 16:02:49 crc kubenswrapper[4907]: I0226 16:02:49.917395 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/0ac0b04a-5f93-4033-b52a-46a47b9f3364-metrics-certs-tls-certs\") pod \"ovn-northd-0\" (UID: \"0ac0b04a-5f93-4033-b52a-46a47b9f3364\") " pod="openstack/ovn-northd-0" Feb 26 16:02:49 crc kubenswrapper[4907]: I0226 16:02:49.926412 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lm4j4\" (UniqueName: \"kubernetes.io/projected/0ac0b04a-5f93-4033-b52a-46a47b9f3364-kube-api-access-lm4j4\") pod \"ovn-northd-0\" (UID: \"0ac0b04a-5f93-4033-b52a-46a47b9f3364\") " pod="openstack/ovn-northd-0" Feb 26 16:02:49 crc kubenswrapper[4907]: I0226 16:02:49.969677 4907 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-8554648995-8xg6m" Feb 26 16:02:49 crc kubenswrapper[4907]: I0226 16:02:49.970910 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"96ba881c-449c-4300-b67f-8a1e952af508","Type":"ContainerStarted","Data":"e28a3b8c761243a769d04d190d2ae365641bcbb802321434379486662fc95053"} Feb 26 16:02:49 crc kubenswrapper[4907]: I0226 16:02:49.986768 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"cca4ff23-cabb-466c-80a0-dbcc1f005123","Type":"ContainerStarted","Data":"ebbb6bfe7182e9cd90b17c87be7c4962d1f1d25ab0a2722c9407b04029ac9d77"} Feb 26 16:02:50 crc kubenswrapper[4907]: I0226 16:02:50.163092 4907 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-northd-0" Feb 26 16:02:50 crc kubenswrapper[4907]: I0226 16:02:50.338747 4907 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-5bf47b49b7-sfq9z"] Feb 26 16:02:50 crc kubenswrapper[4907]: W0226 16:02:50.360182 4907 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podff2c92c6_ced4_4d3e_91c3_7745376793eb.slice/crio-c9931e44255b693a3640f475f0a84d29643ceb527408aa64997b106e48368701 WatchSource:0}: Error finding container c9931e44255b693a3640f475f0a84d29643ceb527408aa64997b106e48368701: Status 404 returned error can't find the container with id c9931e44255b693a3640f475f0a84d29643ceb527408aa64997b106e48368701 Feb 26 16:02:50 crc kubenswrapper[4907]: W0226 16:02:50.516753 4907 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod928405a5_2e89_44dd_ab55_8d82ba1db8c3.slice/crio-cb76d4651b576e0bfd9624302f806ae70bc635824429232cd6d34afb4ae5a7a9 WatchSource:0}: Error finding container cb76d4651b576e0bfd9624302f806ae70bc635824429232cd6d34afb4ae5a7a9: Status 404 returned error 
can't find the container with id cb76d4651b576e0bfd9624302f806ae70bc635824429232cd6d34afb4ae5a7a9 Feb 26 16:02:50 crc kubenswrapper[4907]: I0226 16:02:50.518717 4907 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-metrics-j84zw"] Feb 26 16:02:50 crc kubenswrapper[4907]: I0226 16:02:50.603087 4907 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-8554648995-8xg6m"] Feb 26 16:02:50 crc kubenswrapper[4907]: W0226 16:02:50.835117 4907 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod0ac0b04a_5f93_4033_b52a_46a47b9f3364.slice/crio-8601678d049f292f747e469bdd9a896db7a007649e99d789c03acbfe52e38e91 WatchSource:0}: Error finding container 8601678d049f292f747e469bdd9a896db7a007649e99d789c03acbfe52e38e91: Status 404 returned error can't find the container with id 8601678d049f292f747e469bdd9a896db7a007649e99d789c03acbfe52e38e91 Feb 26 16:02:50 crc kubenswrapper[4907]: I0226 16:02:50.837455 4907 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-northd-0"] Feb 26 16:02:51 crc kubenswrapper[4907]: I0226 16:02:51.008011 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/memcached-0" event={"ID":"964032de-099d-4e22-95d5-d7acf78c5685","Type":"ContainerStarted","Data":"5b76663c5e1b0967380f197483a0a2fa6f9265ac1bc64db416e72d1f7fa33696"} Feb 26 16:02:51 crc kubenswrapper[4907]: I0226 16:02:51.009521 4907 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/memcached-0" Feb 26 16:02:51 crc kubenswrapper[4907]: I0226 16:02:51.011380 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-cell1-galera-0" event={"ID":"7d7af39e-1222-4a40-a2f3-a644e2ef477d","Type":"ContainerStarted","Data":"a3d9d679d09f53d20626ac8ebe747b8989b756a7520569ee123470cc6816adb5"} Feb 26 16:02:51 crc kubenswrapper[4907]: I0226 16:02:51.014358 4907 kubelet.go:2453] "SyncLoop (PLEG): event for 
pod" pod="openstack/ovn-northd-0" event={"ID":"0ac0b04a-5f93-4033-b52a-46a47b9f3364","Type":"ContainerStarted","Data":"8601678d049f292f747e469bdd9a896db7a007649e99d789c03acbfe52e38e91"} Feb 26 16:02:51 crc kubenswrapper[4907]: I0226 16:02:51.015933 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-metrics-j84zw" event={"ID":"928405a5-2e89-44dd-ab55-8d82ba1db8c3","Type":"ContainerStarted","Data":"a1589f76472bff760e4299ba240e3765fe5ee634e86c1aa11e49c01b80ba6ae0"} Feb 26 16:02:51 crc kubenswrapper[4907]: I0226 16:02:51.015964 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-metrics-j84zw" event={"ID":"928405a5-2e89-44dd-ab55-8d82ba1db8c3","Type":"ContainerStarted","Data":"cb76d4651b576e0bfd9624302f806ae70bc635824429232cd6d34afb4ae5a7a9"} Feb 26 16:02:51 crc kubenswrapper[4907]: I0226 16:02:51.018875 4907 generic.go:334] "Generic (PLEG): container finished" podID="369d3e0e-75cc-440c-95d3-cfb112914d57" containerID="b401f5b2ca3080c2e6a8df3d4d57922a32551edf50500f89570876c20d6a847d" exitCode=0 Feb 26 16:02:51 crc kubenswrapper[4907]: I0226 16:02:51.018975 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-8554648995-8xg6m" event={"ID":"369d3e0e-75cc-440c-95d3-cfb112914d57","Type":"ContainerDied","Data":"b401f5b2ca3080c2e6a8df3d4d57922a32551edf50500f89570876c20d6a847d"} Feb 26 16:02:51 crc kubenswrapper[4907]: I0226 16:02:51.019000 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-8554648995-8xg6m" event={"ID":"369d3e0e-75cc-440c-95d3-cfb112914d57","Type":"ContainerStarted","Data":"647873f6b82381976b4627fc4faf153f1b06362e3aa4ae17244988d096ab8b5d"} Feb 26 16:02:51 crc kubenswrapper[4907]: I0226 16:02:51.023864 4907 generic.go:334] "Generic (PLEG): container finished" podID="ff2c92c6-ced4-4d3e-91c3-7745376793eb" containerID="5c32ecae00adac0ac70a6e5602da542957365fa9d87cee0992318807313b657f" exitCode=0 Feb 26 16:02:51 crc kubenswrapper[4907]: 
I0226 16:02:51.023946 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5bf47b49b7-sfq9z" event={"ID":"ff2c92c6-ced4-4d3e-91c3-7745376793eb","Type":"ContainerDied","Data":"5c32ecae00adac0ac70a6e5602da542957365fa9d87cee0992318807313b657f"} Feb 26 16:02:51 crc kubenswrapper[4907]: I0226 16:02:51.023978 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5bf47b49b7-sfq9z" event={"ID":"ff2c92c6-ced4-4d3e-91c3-7745376793eb","Type":"ContainerStarted","Data":"c9931e44255b693a3640f475f0a84d29643ceb527408aa64997b106e48368701"} Feb 26 16:02:51 crc kubenswrapper[4907]: I0226 16:02:51.043500 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-galera-0" event={"ID":"3fdde055-1569-4b2a-bc9f-893b93ee63b1","Type":"ContainerStarted","Data":"45bee135d6ee0fdc4beb334f0edd8b9e0e218bf5cf06539cb3baee61c8fd270b"} Feb 26 16:02:51 crc kubenswrapper[4907]: I0226 16:02:51.061292 4907 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/memcached-0" podStartSLOduration=2.6976056010000002 podStartE2EDuration="46.061276604s" podCreationTimestamp="2026-02-26 16:02:05 +0000 UTC" firstStartedPulling="2026-02-26 16:02:06.604700598 +0000 UTC m=+1189.123262447" lastFinishedPulling="2026-02-26 16:02:49.968371601 +0000 UTC m=+1232.486933450" observedRunningTime="2026-02-26 16:02:51.05905494 +0000 UTC m=+1233.577616789" watchObservedRunningTime="2026-02-26 16:02:51.061276604 +0000 UTC m=+1233.579838453" Feb 26 16:02:51 crc kubenswrapper[4907]: I0226 16:02:51.264576 4907 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovn-controller-metrics-j84zw" podStartSLOduration=2.264551611 podStartE2EDuration="2.264551611s" podCreationTimestamp="2026-02-26 16:02:49 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-26 16:02:51.244144525 +0000 UTC m=+1233.762706374" 
watchObservedRunningTime="2026-02-26 16:02:51.264551611 +0000 UTC m=+1233.783113460" Feb 26 16:02:51 crc kubenswrapper[4907]: I0226 16:02:51.606619 4907 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-5bf47b49b7-sfq9z" Feb 26 16:02:51 crc kubenswrapper[4907]: I0226 16:02:51.666617 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2sxqg\" (UniqueName: \"kubernetes.io/projected/ff2c92c6-ced4-4d3e-91c3-7745376793eb-kube-api-access-2sxqg\") pod \"ff2c92c6-ced4-4d3e-91c3-7745376793eb\" (UID: \"ff2c92c6-ced4-4d3e-91c3-7745376793eb\") " Feb 26 16:02:51 crc kubenswrapper[4907]: I0226 16:02:51.666763 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/ff2c92c6-ced4-4d3e-91c3-7745376793eb-ovsdbserver-nb\") pod \"ff2c92c6-ced4-4d3e-91c3-7745376793eb\" (UID: \"ff2c92c6-ced4-4d3e-91c3-7745376793eb\") " Feb 26 16:02:51 crc kubenswrapper[4907]: I0226 16:02:51.666847 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/ff2c92c6-ced4-4d3e-91c3-7745376793eb-dns-svc\") pod \"ff2c92c6-ced4-4d3e-91c3-7745376793eb\" (UID: \"ff2c92c6-ced4-4d3e-91c3-7745376793eb\") " Feb 26 16:02:51 crc kubenswrapper[4907]: I0226 16:02:51.667128 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ff2c92c6-ced4-4d3e-91c3-7745376793eb-config\") pod \"ff2c92c6-ced4-4d3e-91c3-7745376793eb\" (UID: \"ff2c92c6-ced4-4d3e-91c3-7745376793eb\") " Feb 26 16:02:51 crc kubenswrapper[4907]: I0226 16:02:51.673377 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ff2c92c6-ced4-4d3e-91c3-7745376793eb-kube-api-access-2sxqg" (OuterVolumeSpecName: "kube-api-access-2sxqg") pod "ff2c92c6-ced4-4d3e-91c3-7745376793eb" (UID: 
"ff2c92c6-ced4-4d3e-91c3-7745376793eb"). InnerVolumeSpecName "kube-api-access-2sxqg". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 26 16:02:51 crc kubenswrapper[4907]: I0226 16:02:51.686789 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ff2c92c6-ced4-4d3e-91c3-7745376793eb-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "ff2c92c6-ced4-4d3e-91c3-7745376793eb" (UID: "ff2c92c6-ced4-4d3e-91c3-7745376793eb"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 26 16:02:51 crc kubenswrapper[4907]: I0226 16:02:51.693275 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ff2c92c6-ced4-4d3e-91c3-7745376793eb-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "ff2c92c6-ced4-4d3e-91c3-7745376793eb" (UID: "ff2c92c6-ced4-4d3e-91c3-7745376793eb"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 26 16:02:51 crc kubenswrapper[4907]: I0226 16:02:51.697129 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ff2c92c6-ced4-4d3e-91c3-7745376793eb-config" (OuterVolumeSpecName: "config") pod "ff2c92c6-ced4-4d3e-91c3-7745376793eb" (UID: "ff2c92c6-ced4-4d3e-91c3-7745376793eb"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 26 16:02:51 crc kubenswrapper[4907]: I0226 16:02:51.768133 4907 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/ff2c92c6-ced4-4d3e-91c3-7745376793eb-dns-svc\") on node \"crc\" DevicePath \"\"" Feb 26 16:02:51 crc kubenswrapper[4907]: I0226 16:02:51.768174 4907 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ff2c92c6-ced4-4d3e-91c3-7745376793eb-config\") on node \"crc\" DevicePath \"\"" Feb 26 16:02:51 crc kubenswrapper[4907]: I0226 16:02:51.768188 4907 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2sxqg\" (UniqueName: \"kubernetes.io/projected/ff2c92c6-ced4-4d3e-91c3-7745376793eb-kube-api-access-2sxqg\") on node \"crc\" DevicePath \"\"" Feb 26 16:02:51 crc kubenswrapper[4907]: I0226 16:02:51.768206 4907 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/ff2c92c6-ced4-4d3e-91c3-7745376793eb-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Feb 26 16:02:52 crc kubenswrapper[4907]: I0226 16:02:52.062779 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-8554648995-8xg6m" event={"ID":"369d3e0e-75cc-440c-95d3-cfb112914d57","Type":"ContainerStarted","Data":"a0861688274c2e7a673dbcb0fa35657aa3b3f110c69f6b6d180d2326b262cf6c"} Feb 26 16:02:52 crc kubenswrapper[4907]: I0226 16:02:52.063436 4907 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-8554648995-8xg6m" Feb 26 16:02:52 crc kubenswrapper[4907]: I0226 16:02:52.066709 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5bf47b49b7-sfq9z" event={"ID":"ff2c92c6-ced4-4d3e-91c3-7745376793eb","Type":"ContainerDied","Data":"c9931e44255b693a3640f475f0a84d29643ceb527408aa64997b106e48368701"} Feb 26 16:02:52 crc kubenswrapper[4907]: I0226 16:02:52.066782 4907 scope.go:117] 
"RemoveContainer" containerID="5c32ecae00adac0ac70a6e5602da542957365fa9d87cee0992318807313b657f" Feb 26 16:02:52 crc kubenswrapper[4907]: I0226 16:02:52.067501 4907 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-5bf47b49b7-sfq9z" Feb 26 16:02:52 crc kubenswrapper[4907]: I0226 16:02:52.088411 4907 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-8554648995-8xg6m" podStartSLOduration=3.088396628 podStartE2EDuration="3.088396628s" podCreationTimestamp="2026-02-26 16:02:49 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-26 16:02:52.087087387 +0000 UTC m=+1234.605649236" watchObservedRunningTime="2026-02-26 16:02:52.088396628 +0000 UTC m=+1234.606958477" Feb 26 16:02:53 crc kubenswrapper[4907]: I0226 16:02:53.079161 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-northd-0" event={"ID":"0ac0b04a-5f93-4033-b52a-46a47b9f3364","Type":"ContainerStarted","Data":"bf525ccd3fe807090270d4caa5bac5837fd93c531b6bb240d1288616aa9d84d2"} Feb 26 16:02:53 crc kubenswrapper[4907]: I0226 16:02:53.079477 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-northd-0" event={"ID":"0ac0b04a-5f93-4033-b52a-46a47b9f3364","Type":"ContainerStarted","Data":"e02724a0c6741a782e9e023d1a82fda287ef43fe56fd5471054a4a650de0c0b5"} Feb 26 16:02:53 crc kubenswrapper[4907]: I0226 16:02:53.079807 4907 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ovn-northd-0" Feb 26 16:02:53 crc kubenswrapper[4907]: I0226 16:02:53.106859 4907 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovn-northd-0" podStartSLOduration=2.934203443 podStartE2EDuration="4.106841423s" podCreationTimestamp="2026-02-26 16:02:49 +0000 UTC" firstStartedPulling="2026-02-26 16:02:50.836448693 +0000 UTC m=+1233.355010552" 
lastFinishedPulling="2026-02-26 16:02:52.009086683 +0000 UTC m=+1234.527648532" observedRunningTime="2026-02-26 16:02:53.104053426 +0000 UTC m=+1235.622615275" watchObservedRunningTime="2026-02-26 16:02:53.106841423 +0000 UTC m=+1235.625403262" Feb 26 16:02:55 crc kubenswrapper[4907]: I0226 16:02:55.114096 4907 generic.go:334] "Generic (PLEG): container finished" podID="3fdde055-1569-4b2a-bc9f-893b93ee63b1" containerID="45bee135d6ee0fdc4beb334f0edd8b9e0e218bf5cf06539cb3baee61c8fd270b" exitCode=0 Feb 26 16:02:55 crc kubenswrapper[4907]: I0226 16:02:55.114203 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-galera-0" event={"ID":"3fdde055-1569-4b2a-bc9f-893b93ee63b1","Type":"ContainerDied","Data":"45bee135d6ee0fdc4beb334f0edd8b9e0e218bf5cf06539cb3baee61c8fd270b"} Feb 26 16:02:55 crc kubenswrapper[4907]: I0226 16:02:55.118136 4907 generic.go:334] "Generic (PLEG): container finished" podID="7d7af39e-1222-4a40-a2f3-a644e2ef477d" containerID="a3d9d679d09f53d20626ac8ebe747b8989b756a7520569ee123470cc6816adb5" exitCode=0 Feb 26 16:02:55 crc kubenswrapper[4907]: I0226 16:02:55.118177 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-cell1-galera-0" event={"ID":"7d7af39e-1222-4a40-a2f3-a644e2ef477d","Type":"ContainerDied","Data":"a3d9d679d09f53d20626ac8ebe747b8989b756a7520569ee123470cc6816adb5"} Feb 26 16:02:55 crc kubenswrapper[4907]: I0226 16:02:55.720683 4907 scope.go:117] "RemoveContainer" containerID="fe9ed9aadb41ed6f130e39ddcdfc955305afcaf1edfdb286a96007db514190d7" Feb 26 16:02:55 crc kubenswrapper[4907]: I0226 16:02:55.960093 4907 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/memcached-0" Feb 26 16:02:56 crc kubenswrapper[4907]: I0226 16:02:56.142241 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-cell1-galera-0" 
event={"ID":"7d7af39e-1222-4a40-a2f3-a644e2ef477d","Type":"ContainerStarted","Data":"5adfc1b64f7d651a5018c95a0b332e8df98a340bd3d9d6cc1c6eb7a5bf78a889"} Feb 26 16:02:56 crc kubenswrapper[4907]: I0226 16:02:56.143000 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-galera-0" event={"ID":"3fdde055-1569-4b2a-bc9f-893b93ee63b1","Type":"ContainerStarted","Data":"ba1a5a80c9bbeb0b6552009f9fd2850f05b0b4935a69753a6f5cfa63e0f0d56f"} Feb 26 16:02:56 crc kubenswrapper[4907]: I0226 16:02:56.166129 4907 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/openstack-cell1-galera-0" podStartSLOduration=8.434943995 podStartE2EDuration="52.166113282s" podCreationTimestamp="2026-02-26 16:02:04 +0000 UTC" firstStartedPulling="2026-02-26 16:02:06.239671494 +0000 UTC m=+1188.758233343" lastFinishedPulling="2026-02-26 16:02:49.970840781 +0000 UTC m=+1232.489402630" observedRunningTime="2026-02-26 16:02:56.163852686 +0000 UTC m=+1238.682414575" watchObservedRunningTime="2026-02-26 16:02:56.166113282 +0000 UTC m=+1238.684675131" Feb 26 16:02:56 crc kubenswrapper[4907]: I0226 16:02:56.190915 4907 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/openstack-galera-0" podStartSLOduration=9.318111858 podStartE2EDuration="54.190892943s" podCreationTimestamp="2026-02-26 16:02:02 +0000 UTC" firstStartedPulling="2026-02-26 16:02:04.881072207 +0000 UTC m=+1187.399634056" lastFinishedPulling="2026-02-26 16:02:49.753853292 +0000 UTC m=+1232.272415141" observedRunningTime="2026-02-26 16:02:56.190005282 +0000 UTC m=+1238.708567151" watchObservedRunningTime="2026-02-26 16:02:56.190892943 +0000 UTC m=+1238.709454792" Feb 26 16:02:58 crc kubenswrapper[4907]: I0226 16:02:58.299448 4907 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-8554648995-8xg6m"] Feb 26 16:02:58 crc kubenswrapper[4907]: I0226 16:02:58.299988 4907 kuberuntime_container.go:808] "Killing container with a grace period" 
pod="openstack/dnsmasq-dns-8554648995-8xg6m" podUID="369d3e0e-75cc-440c-95d3-cfb112914d57" containerName="dnsmasq-dns" containerID="cri-o://a0861688274c2e7a673dbcb0fa35657aa3b3f110c69f6b6d180d2326b262cf6c" gracePeriod=10 Feb 26 16:02:58 crc kubenswrapper[4907]: I0226 16:02:58.305728 4907 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-8554648995-8xg6m" Feb 26 16:02:58 crc kubenswrapper[4907]: I0226 16:02:58.333323 4907 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-b8fbc5445-xx2fj"] Feb 26 16:02:58 crc kubenswrapper[4907]: E0226 16:02:58.333648 4907 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ff2c92c6-ced4-4d3e-91c3-7745376793eb" containerName="init" Feb 26 16:02:58 crc kubenswrapper[4907]: I0226 16:02:58.333662 4907 state_mem.go:107] "Deleted CPUSet assignment" podUID="ff2c92c6-ced4-4d3e-91c3-7745376793eb" containerName="init" Feb 26 16:02:58 crc kubenswrapper[4907]: I0226 16:02:58.333837 4907 memory_manager.go:354] "RemoveStaleState removing state" podUID="ff2c92c6-ced4-4d3e-91c3-7745376793eb" containerName="init" Feb 26 16:02:58 crc kubenswrapper[4907]: I0226 16:02:58.346022 4907 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-b8fbc5445-xx2fj" Feb 26 16:02:58 crc kubenswrapper[4907]: I0226 16:02:58.350327 4907 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-b8fbc5445-xx2fj"] Feb 26 16:02:58 crc kubenswrapper[4907]: I0226 16:02:58.473419 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/62d9c258-3e92-48cc-a4b2-7207c93a6346-dns-svc\") pod \"dnsmasq-dns-b8fbc5445-xx2fj\" (UID: \"62d9c258-3e92-48cc-a4b2-7207c93a6346\") " pod="openstack/dnsmasq-dns-b8fbc5445-xx2fj" Feb 26 16:02:58 crc kubenswrapper[4907]: I0226 16:02:58.473840 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/62d9c258-3e92-48cc-a4b2-7207c93a6346-config\") pod \"dnsmasq-dns-b8fbc5445-xx2fj\" (UID: \"62d9c258-3e92-48cc-a4b2-7207c93a6346\") " pod="openstack/dnsmasq-dns-b8fbc5445-xx2fj" Feb 26 16:02:58 crc kubenswrapper[4907]: I0226 16:02:58.473942 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rjbh4\" (UniqueName: \"kubernetes.io/projected/62d9c258-3e92-48cc-a4b2-7207c93a6346-kube-api-access-rjbh4\") pod \"dnsmasq-dns-b8fbc5445-xx2fj\" (UID: \"62d9c258-3e92-48cc-a4b2-7207c93a6346\") " pod="openstack/dnsmasq-dns-b8fbc5445-xx2fj" Feb 26 16:02:58 crc kubenswrapper[4907]: I0226 16:02:58.473995 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/62d9c258-3e92-48cc-a4b2-7207c93a6346-ovsdbserver-nb\") pod \"dnsmasq-dns-b8fbc5445-xx2fj\" (UID: \"62d9c258-3e92-48cc-a4b2-7207c93a6346\") " pod="openstack/dnsmasq-dns-b8fbc5445-xx2fj" Feb 26 16:02:58 crc kubenswrapper[4907]: I0226 16:02:58.474032 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for 
volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/62d9c258-3e92-48cc-a4b2-7207c93a6346-ovsdbserver-sb\") pod \"dnsmasq-dns-b8fbc5445-xx2fj\" (UID: \"62d9c258-3e92-48cc-a4b2-7207c93a6346\") " pod="openstack/dnsmasq-dns-b8fbc5445-xx2fj" Feb 26 16:02:58 crc kubenswrapper[4907]: I0226 16:02:58.574972 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/62d9c258-3e92-48cc-a4b2-7207c93a6346-ovsdbserver-nb\") pod \"dnsmasq-dns-b8fbc5445-xx2fj\" (UID: \"62d9c258-3e92-48cc-a4b2-7207c93a6346\") " pod="openstack/dnsmasq-dns-b8fbc5445-xx2fj" Feb 26 16:02:58 crc kubenswrapper[4907]: I0226 16:02:58.575330 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/62d9c258-3e92-48cc-a4b2-7207c93a6346-ovsdbserver-sb\") pod \"dnsmasq-dns-b8fbc5445-xx2fj\" (UID: \"62d9c258-3e92-48cc-a4b2-7207c93a6346\") " pod="openstack/dnsmasq-dns-b8fbc5445-xx2fj" Feb 26 16:02:58 crc kubenswrapper[4907]: I0226 16:02:58.575377 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/62d9c258-3e92-48cc-a4b2-7207c93a6346-dns-svc\") pod \"dnsmasq-dns-b8fbc5445-xx2fj\" (UID: \"62d9c258-3e92-48cc-a4b2-7207c93a6346\") " pod="openstack/dnsmasq-dns-b8fbc5445-xx2fj" Feb 26 16:02:58 crc kubenswrapper[4907]: I0226 16:02:58.575420 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/62d9c258-3e92-48cc-a4b2-7207c93a6346-config\") pod \"dnsmasq-dns-b8fbc5445-xx2fj\" (UID: \"62d9c258-3e92-48cc-a4b2-7207c93a6346\") " pod="openstack/dnsmasq-dns-b8fbc5445-xx2fj" Feb 26 16:02:58 crc kubenswrapper[4907]: I0226 16:02:58.575495 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rjbh4\" (UniqueName: 
\"kubernetes.io/projected/62d9c258-3e92-48cc-a4b2-7207c93a6346-kube-api-access-rjbh4\") pod \"dnsmasq-dns-b8fbc5445-xx2fj\" (UID: \"62d9c258-3e92-48cc-a4b2-7207c93a6346\") " pod="openstack/dnsmasq-dns-b8fbc5445-xx2fj" Feb 26 16:02:58 crc kubenswrapper[4907]: I0226 16:02:58.576646 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/62d9c258-3e92-48cc-a4b2-7207c93a6346-ovsdbserver-nb\") pod \"dnsmasq-dns-b8fbc5445-xx2fj\" (UID: \"62d9c258-3e92-48cc-a4b2-7207c93a6346\") " pod="openstack/dnsmasq-dns-b8fbc5445-xx2fj" Feb 26 16:02:58 crc kubenswrapper[4907]: I0226 16:02:58.577891 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/62d9c258-3e92-48cc-a4b2-7207c93a6346-ovsdbserver-sb\") pod \"dnsmasq-dns-b8fbc5445-xx2fj\" (UID: \"62d9c258-3e92-48cc-a4b2-7207c93a6346\") " pod="openstack/dnsmasq-dns-b8fbc5445-xx2fj" Feb 26 16:02:58 crc kubenswrapper[4907]: I0226 16:02:58.578366 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/62d9c258-3e92-48cc-a4b2-7207c93a6346-config\") pod \"dnsmasq-dns-b8fbc5445-xx2fj\" (UID: \"62d9c258-3e92-48cc-a4b2-7207c93a6346\") " pod="openstack/dnsmasq-dns-b8fbc5445-xx2fj" Feb 26 16:02:58 crc kubenswrapper[4907]: I0226 16:02:58.579115 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/62d9c258-3e92-48cc-a4b2-7207c93a6346-dns-svc\") pod \"dnsmasq-dns-b8fbc5445-xx2fj\" (UID: \"62d9c258-3e92-48cc-a4b2-7207c93a6346\") " pod="openstack/dnsmasq-dns-b8fbc5445-xx2fj" Feb 26 16:02:58 crc kubenswrapper[4907]: I0226 16:02:58.617742 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rjbh4\" (UniqueName: \"kubernetes.io/projected/62d9c258-3e92-48cc-a4b2-7207c93a6346-kube-api-access-rjbh4\") pod 
\"dnsmasq-dns-b8fbc5445-xx2fj\" (UID: \"62d9c258-3e92-48cc-a4b2-7207c93a6346\") " pod="openstack/dnsmasq-dns-b8fbc5445-xx2fj" Feb 26 16:02:58 crc kubenswrapper[4907]: I0226 16:02:58.662942 4907 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-b8fbc5445-xx2fj" Feb 26 16:02:58 crc kubenswrapper[4907]: I0226 16:02:58.864808 4907 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-8554648995-8xg6m" Feb 26 16:02:58 crc kubenswrapper[4907]: I0226 16:02:58.984695 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/369d3e0e-75cc-440c-95d3-cfb112914d57-dns-svc\") pod \"369d3e0e-75cc-440c-95d3-cfb112914d57\" (UID: \"369d3e0e-75cc-440c-95d3-cfb112914d57\") " Feb 26 16:02:58 crc kubenswrapper[4907]: I0226 16:02:58.984786 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/369d3e0e-75cc-440c-95d3-cfb112914d57-ovsdbserver-sb\") pod \"369d3e0e-75cc-440c-95d3-cfb112914d57\" (UID: \"369d3e0e-75cc-440c-95d3-cfb112914d57\") " Feb 26 16:02:58 crc kubenswrapper[4907]: I0226 16:02:58.984829 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/369d3e0e-75cc-440c-95d3-cfb112914d57-config\") pod \"369d3e0e-75cc-440c-95d3-cfb112914d57\" (UID: \"369d3e0e-75cc-440c-95d3-cfb112914d57\") " Feb 26 16:02:58 crc kubenswrapper[4907]: I0226 16:02:58.984874 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-knzkk\" (UniqueName: \"kubernetes.io/projected/369d3e0e-75cc-440c-95d3-cfb112914d57-kube-api-access-knzkk\") pod \"369d3e0e-75cc-440c-95d3-cfb112914d57\" (UID: \"369d3e0e-75cc-440c-95d3-cfb112914d57\") " Feb 26 16:02:58 crc kubenswrapper[4907]: I0226 16:02:58.984981 4907 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/369d3e0e-75cc-440c-95d3-cfb112914d57-ovsdbserver-nb\") pod \"369d3e0e-75cc-440c-95d3-cfb112914d57\" (UID: \"369d3e0e-75cc-440c-95d3-cfb112914d57\") " Feb 26 16:02:58 crc kubenswrapper[4907]: I0226 16:02:58.998749 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/369d3e0e-75cc-440c-95d3-cfb112914d57-kube-api-access-knzkk" (OuterVolumeSpecName: "kube-api-access-knzkk") pod "369d3e0e-75cc-440c-95d3-cfb112914d57" (UID: "369d3e0e-75cc-440c-95d3-cfb112914d57"). InnerVolumeSpecName "kube-api-access-knzkk". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 26 16:03:00 crc kubenswrapper[4907]: I0226 16:02:59.044045 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/369d3e0e-75cc-440c-95d3-cfb112914d57-config" (OuterVolumeSpecName: "config") pod "369d3e0e-75cc-440c-95d3-cfb112914d57" (UID: "369d3e0e-75cc-440c-95d3-cfb112914d57"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 26 16:03:00 crc kubenswrapper[4907]: I0226 16:02:59.045516 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/369d3e0e-75cc-440c-95d3-cfb112914d57-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "369d3e0e-75cc-440c-95d3-cfb112914d57" (UID: "369d3e0e-75cc-440c-95d3-cfb112914d57"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 26 16:03:00 crc kubenswrapper[4907]: I0226 16:02:59.050699 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/369d3e0e-75cc-440c-95d3-cfb112914d57-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "369d3e0e-75cc-440c-95d3-cfb112914d57" (UID: "369d3e0e-75cc-440c-95d3-cfb112914d57"). InnerVolumeSpecName "dns-svc". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 26 16:03:00 crc kubenswrapper[4907]: I0226 16:02:59.056521 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/369d3e0e-75cc-440c-95d3-cfb112914d57-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "369d3e0e-75cc-440c-95d3-cfb112914d57" (UID: "369d3e0e-75cc-440c-95d3-cfb112914d57"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 26 16:03:00 crc kubenswrapper[4907]: I0226 16:02:59.086657 4907 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/369d3e0e-75cc-440c-95d3-cfb112914d57-dns-svc\") on node \"crc\" DevicePath \"\"" Feb 26 16:03:00 crc kubenswrapper[4907]: I0226 16:02:59.086683 4907 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/369d3e0e-75cc-440c-95d3-cfb112914d57-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Feb 26 16:03:00 crc kubenswrapper[4907]: I0226 16:02:59.086694 4907 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/369d3e0e-75cc-440c-95d3-cfb112914d57-config\") on node \"crc\" DevicePath \"\"" Feb 26 16:03:00 crc kubenswrapper[4907]: I0226 16:02:59.086704 4907 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-knzkk\" (UniqueName: \"kubernetes.io/projected/369d3e0e-75cc-440c-95d3-cfb112914d57-kube-api-access-knzkk\") on node \"crc\" DevicePath \"\"" Feb 26 16:03:00 crc kubenswrapper[4907]: I0226 16:02:59.086712 4907 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/369d3e0e-75cc-440c-95d3-cfb112914d57-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Feb 26 16:03:00 crc kubenswrapper[4907]: I0226 16:02:59.158646 4907 generic.go:334] "Generic (PLEG): container finished" podID="369d3e0e-75cc-440c-95d3-cfb112914d57" 
containerID="a0861688274c2e7a673dbcb0fa35657aa3b3f110c69f6b6d180d2326b262cf6c" exitCode=0 Feb 26 16:03:00 crc kubenswrapper[4907]: I0226 16:02:59.158697 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-8554648995-8xg6m" event={"ID":"369d3e0e-75cc-440c-95d3-cfb112914d57","Type":"ContainerDied","Data":"a0861688274c2e7a673dbcb0fa35657aa3b3f110c69f6b6d180d2326b262cf6c"} Feb 26 16:03:00 crc kubenswrapper[4907]: I0226 16:02:59.158708 4907 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-8554648995-8xg6m" Feb 26 16:03:00 crc kubenswrapper[4907]: I0226 16:02:59.158726 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-8554648995-8xg6m" event={"ID":"369d3e0e-75cc-440c-95d3-cfb112914d57","Type":"ContainerDied","Data":"647873f6b82381976b4627fc4faf153f1b06362e3aa4ae17244988d096ab8b5d"} Feb 26 16:03:00 crc kubenswrapper[4907]: I0226 16:02:59.158748 4907 scope.go:117] "RemoveContainer" containerID="a0861688274c2e7a673dbcb0fa35657aa3b3f110c69f6b6d180d2326b262cf6c" Feb 26 16:03:00 crc kubenswrapper[4907]: I0226 16:02:59.184386 4907 scope.go:117] "RemoveContainer" containerID="b401f5b2ca3080c2e6a8df3d4d57922a32551edf50500f89570876c20d6a847d" Feb 26 16:03:00 crc kubenswrapper[4907]: I0226 16:02:59.204207 4907 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-8554648995-8xg6m"] Feb 26 16:03:00 crc kubenswrapper[4907]: I0226 16:02:59.214665 4907 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-8554648995-8xg6m"] Feb 26 16:03:00 crc kubenswrapper[4907]: I0226 16:02:59.219220 4907 scope.go:117] "RemoveContainer" containerID="a0861688274c2e7a673dbcb0fa35657aa3b3f110c69f6b6d180d2326b262cf6c" Feb 26 16:03:00 crc kubenswrapper[4907]: E0226 16:02:59.219908 4907 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container 
\"a0861688274c2e7a673dbcb0fa35657aa3b3f110c69f6b6d180d2326b262cf6c\": container with ID starting with a0861688274c2e7a673dbcb0fa35657aa3b3f110c69f6b6d180d2326b262cf6c not found: ID does not exist" containerID="a0861688274c2e7a673dbcb0fa35657aa3b3f110c69f6b6d180d2326b262cf6c" Feb 26 16:03:00 crc kubenswrapper[4907]: I0226 16:02:59.219938 4907 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a0861688274c2e7a673dbcb0fa35657aa3b3f110c69f6b6d180d2326b262cf6c"} err="failed to get container status \"a0861688274c2e7a673dbcb0fa35657aa3b3f110c69f6b6d180d2326b262cf6c\": rpc error: code = NotFound desc = could not find container \"a0861688274c2e7a673dbcb0fa35657aa3b3f110c69f6b6d180d2326b262cf6c\": container with ID starting with a0861688274c2e7a673dbcb0fa35657aa3b3f110c69f6b6d180d2326b262cf6c not found: ID does not exist" Feb 26 16:03:00 crc kubenswrapper[4907]: I0226 16:02:59.219963 4907 scope.go:117] "RemoveContainer" containerID="b401f5b2ca3080c2e6a8df3d4d57922a32551edf50500f89570876c20d6a847d" Feb 26 16:03:00 crc kubenswrapper[4907]: E0226 16:02:59.220302 4907 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b401f5b2ca3080c2e6a8df3d4d57922a32551edf50500f89570876c20d6a847d\": container with ID starting with b401f5b2ca3080c2e6a8df3d4d57922a32551edf50500f89570876c20d6a847d not found: ID does not exist" containerID="b401f5b2ca3080c2e6a8df3d4d57922a32551edf50500f89570876c20d6a847d" Feb 26 16:03:00 crc kubenswrapper[4907]: I0226 16:02:59.220347 4907 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b401f5b2ca3080c2e6a8df3d4d57922a32551edf50500f89570876c20d6a847d"} err="failed to get container status \"b401f5b2ca3080c2e6a8df3d4d57922a32551edf50500f89570876c20d6a847d\": rpc error: code = NotFound desc = could not find container \"b401f5b2ca3080c2e6a8df3d4d57922a32551edf50500f89570876c20d6a847d\": container with ID 
starting with b401f5b2ca3080c2e6a8df3d4d57922a32551edf50500f89570876c20d6a847d not found: ID does not exist" Feb 26 16:03:00 crc kubenswrapper[4907]: W0226 16:02:59.229229 4907 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod62d9c258_3e92_48cc_a4b2_7207c93a6346.slice/crio-a48aed336a0f131903b43291a333c026068552b800492ce4535ef8aee8254245 WatchSource:0}: Error finding container a48aed336a0f131903b43291a333c026068552b800492ce4535ef8aee8254245: Status 404 returned error can't find the container with id a48aed336a0f131903b43291a333c026068552b800492ce4535ef8aee8254245 Feb 26 16:03:00 crc kubenswrapper[4907]: I0226 16:02:59.232799 4907 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-b8fbc5445-xx2fj"] Feb 26 16:03:00 crc kubenswrapper[4907]: I0226 16:02:59.503977 4907 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/swift-storage-0"] Feb 26 16:03:00 crc kubenswrapper[4907]: E0226 16:02:59.504857 4907 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="369d3e0e-75cc-440c-95d3-cfb112914d57" containerName="dnsmasq-dns" Feb 26 16:03:00 crc kubenswrapper[4907]: I0226 16:02:59.504874 4907 state_mem.go:107] "Deleted CPUSet assignment" podUID="369d3e0e-75cc-440c-95d3-cfb112914d57" containerName="dnsmasq-dns" Feb 26 16:03:00 crc kubenswrapper[4907]: E0226 16:02:59.504914 4907 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="369d3e0e-75cc-440c-95d3-cfb112914d57" containerName="init" Feb 26 16:03:00 crc kubenswrapper[4907]: I0226 16:02:59.504921 4907 state_mem.go:107] "Deleted CPUSet assignment" podUID="369d3e0e-75cc-440c-95d3-cfb112914d57" containerName="init" Feb 26 16:03:00 crc kubenswrapper[4907]: I0226 16:02:59.505276 4907 memory_manager.go:354] "RemoveStaleState removing state" podUID="369d3e0e-75cc-440c-95d3-cfb112914d57" containerName="dnsmasq-dns" Feb 26 16:03:00 crc kubenswrapper[4907]: I0226 16:02:59.519501 4907 
util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/swift-storage-0" Feb 26 16:03:00 crc kubenswrapper[4907]: I0226 16:02:59.524228 4907 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"swift-storage-config-data" Feb 26 16:03:00 crc kubenswrapper[4907]: I0226 16:02:59.524538 4907 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"swift-conf" Feb 26 16:03:00 crc kubenswrapper[4907]: I0226 16:02:59.524711 4907 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"swift-swift-dockercfg-xh2p4" Feb 26 16:03:00 crc kubenswrapper[4907]: I0226 16:02:59.524727 4907 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"swift-ring-files" Feb 26 16:03:00 crc kubenswrapper[4907]: I0226 16:02:59.562647 4907 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/swift-storage-0"] Feb 26 16:03:00 crc kubenswrapper[4907]: I0226 16:02:59.598953 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-t4mwn\" (UniqueName: \"kubernetes.io/projected/819c7fec-fd22-478a-bf6c-f4cb5aeccc59-kube-api-access-t4mwn\") pod \"swift-storage-0\" (UID: \"819c7fec-fd22-478a-bf6c-f4cb5aeccc59\") " pod="openstack/swift-storage-0" Feb 26 16:03:00 crc kubenswrapper[4907]: I0226 16:02:59.599042 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/819c7fec-fd22-478a-bf6c-f4cb5aeccc59-combined-ca-bundle\") pod \"swift-storage-0\" (UID: \"819c7fec-fd22-478a-bf6c-f4cb5aeccc59\") " pod="openstack/swift-storage-0" Feb 26 16:03:00 crc kubenswrapper[4907]: I0226 16:02:59.599100 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cache\" (UniqueName: \"kubernetes.io/empty-dir/819c7fec-fd22-478a-bf6c-f4cb5aeccc59-cache\") pod \"swift-storage-0\" (UID: 
\"819c7fec-fd22-478a-bf6c-f4cb5aeccc59\") " pod="openstack/swift-storage-0" Feb 26 16:03:00 crc kubenswrapper[4907]: I0226 16:02:59.599131 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lock\" (UniqueName: \"kubernetes.io/empty-dir/819c7fec-fd22-478a-bf6c-f4cb5aeccc59-lock\") pod \"swift-storage-0\" (UID: \"819c7fec-fd22-478a-bf6c-f4cb5aeccc59\") " pod="openstack/swift-storage-0" Feb 26 16:03:00 crc kubenswrapper[4907]: I0226 16:02:59.599166 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage06-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") pod \"swift-storage-0\" (UID: \"819c7fec-fd22-478a-bf6c-f4cb5aeccc59\") " pod="openstack/swift-storage-0" Feb 26 16:03:00 crc kubenswrapper[4907]: I0226 16:02:59.599219 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/819c7fec-fd22-478a-bf6c-f4cb5aeccc59-etc-swift\") pod \"swift-storage-0\" (UID: \"819c7fec-fd22-478a-bf6c-f4cb5aeccc59\") " pod="openstack/swift-storage-0" Feb 26 16:03:00 crc kubenswrapper[4907]: I0226 16:02:59.700618 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/819c7fec-fd22-478a-bf6c-f4cb5aeccc59-combined-ca-bundle\") pod \"swift-storage-0\" (UID: \"819c7fec-fd22-478a-bf6c-f4cb5aeccc59\") " pod="openstack/swift-storage-0" Feb 26 16:03:00 crc kubenswrapper[4907]: I0226 16:02:59.700732 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cache\" (UniqueName: \"kubernetes.io/empty-dir/819c7fec-fd22-478a-bf6c-f4cb5aeccc59-cache\") pod \"swift-storage-0\" (UID: \"819c7fec-fd22-478a-bf6c-f4cb5aeccc59\") " pod="openstack/swift-storage-0" Feb 26 16:03:00 crc kubenswrapper[4907]: I0226 16:02:59.700806 4907 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"lock\" (UniqueName: \"kubernetes.io/empty-dir/819c7fec-fd22-478a-bf6c-f4cb5aeccc59-lock\") pod \"swift-storage-0\" (UID: \"819c7fec-fd22-478a-bf6c-f4cb5aeccc59\") " pod="openstack/swift-storage-0" Feb 26 16:03:00 crc kubenswrapper[4907]: I0226 16:02:59.701181 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cache\" (UniqueName: \"kubernetes.io/empty-dir/819c7fec-fd22-478a-bf6c-f4cb5aeccc59-cache\") pod \"swift-storage-0\" (UID: \"819c7fec-fd22-478a-bf6c-f4cb5aeccc59\") " pod="openstack/swift-storage-0" Feb 26 16:03:00 crc kubenswrapper[4907]: I0226 16:02:59.701221 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"lock\" (UniqueName: \"kubernetes.io/empty-dir/819c7fec-fd22-478a-bf6c-f4cb5aeccc59-lock\") pod \"swift-storage-0\" (UID: \"819c7fec-fd22-478a-bf6c-f4cb5aeccc59\") " pod="openstack/swift-storage-0" Feb 26 16:03:00 crc kubenswrapper[4907]: I0226 16:02:59.701344 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage06-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") pod \"swift-storage-0\" (UID: \"819c7fec-fd22-478a-bf6c-f4cb5aeccc59\") " pod="openstack/swift-storage-0" Feb 26 16:03:00 crc kubenswrapper[4907]: I0226 16:02:59.701411 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/819c7fec-fd22-478a-bf6c-f4cb5aeccc59-etc-swift\") pod \"swift-storage-0\" (UID: \"819c7fec-fd22-478a-bf6c-f4cb5aeccc59\") " pod="openstack/swift-storage-0" Feb 26 16:03:00 crc kubenswrapper[4907]: I0226 16:02:59.701474 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-t4mwn\" (UniqueName: \"kubernetes.io/projected/819c7fec-fd22-478a-bf6c-f4cb5aeccc59-kube-api-access-t4mwn\") pod \"swift-storage-0\" (UID: \"819c7fec-fd22-478a-bf6c-f4cb5aeccc59\") " pod="openstack/swift-storage-0" Feb 26 
16:03:00 crc kubenswrapper[4907]: I0226 16:02:59.701599 4907 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage06-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") pod \"swift-storage-0\" (UID: \"819c7fec-fd22-478a-bf6c-f4cb5aeccc59\") device mount path \"/mnt/openstack/pv06\"" pod="openstack/swift-storage-0" Feb 26 16:03:00 crc kubenswrapper[4907]: E0226 16:02:59.702135 4907 projected.go:288] Couldn't get configMap openstack/swift-ring-files: configmap "swift-ring-files" not found Feb 26 16:03:00 crc kubenswrapper[4907]: E0226 16:02:59.702160 4907 projected.go:194] Error preparing data for projected volume etc-swift for pod openstack/swift-storage-0: configmap "swift-ring-files" not found Feb 26 16:03:00 crc kubenswrapper[4907]: E0226 16:02:59.702214 4907 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/819c7fec-fd22-478a-bf6c-f4cb5aeccc59-etc-swift podName:819c7fec-fd22-478a-bf6c-f4cb5aeccc59 nodeName:}" failed. No retries permitted until 2026-02-26 16:03:00.20219734 +0000 UTC m=+1242.720759179 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "etc-swift" (UniqueName: "kubernetes.io/projected/819c7fec-fd22-478a-bf6c-f4cb5aeccc59-etc-swift") pod "swift-storage-0" (UID: "819c7fec-fd22-478a-bf6c-f4cb5aeccc59") : configmap "swift-ring-files" not found Feb 26 16:03:00 crc kubenswrapper[4907]: I0226 16:02:59.705541 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/819c7fec-fd22-478a-bf6c-f4cb5aeccc59-combined-ca-bundle\") pod \"swift-storage-0\" (UID: \"819c7fec-fd22-478a-bf6c-f4cb5aeccc59\") " pod="openstack/swift-storage-0" Feb 26 16:03:00 crc kubenswrapper[4907]: I0226 16:02:59.724894 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage06-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") pod \"swift-storage-0\" (UID: \"819c7fec-fd22-478a-bf6c-f4cb5aeccc59\") " pod="openstack/swift-storage-0" Feb 26 16:03:00 crc kubenswrapper[4907]: I0226 16:02:59.726296 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-t4mwn\" (UniqueName: \"kubernetes.io/projected/819c7fec-fd22-478a-bf6c-f4cb5aeccc59-kube-api-access-t4mwn\") pod \"swift-storage-0\" (UID: \"819c7fec-fd22-478a-bf6c-f4cb5aeccc59\") " pod="openstack/swift-storage-0" Feb 26 16:03:00 crc kubenswrapper[4907]: I0226 16:02:59.996139 4907 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/swift-ring-rebalance-ndhsw"] Feb 26 16:03:00 crc kubenswrapper[4907]: I0226 16:02:59.998627 4907 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/swift-ring-rebalance-ndhsw" Feb 26 16:03:00 crc kubenswrapper[4907]: I0226 16:03:00.001364 4907 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"swift-ring-scripts" Feb 26 16:03:00 crc kubenswrapper[4907]: I0226 16:03:00.003194 4907 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"swift-proxy-config-data" Feb 26 16:03:00 crc kubenswrapper[4907]: I0226 16:03:00.005099 4907 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"swift-ring-config-data" Feb 26 16:03:00 crc kubenswrapper[4907]: I0226 16:03:00.030504 4907 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/swift-ring-rebalance-ndhsw"] Feb 26 16:03:00 crc kubenswrapper[4907]: E0226 16:03:00.031192 4907 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[combined-ca-bundle dispersionconf etc-swift kube-api-access-pt8mc ring-data-devices scripts swiftconf], unattached volumes=[], failed to process volumes=[combined-ca-bundle dispersionconf etc-swift kube-api-access-pt8mc ring-data-devices scripts swiftconf]: context canceled" pod="openstack/swift-ring-rebalance-ndhsw" podUID="95e78aa7-fc29-4bfc-a65e-3f8a738631c7" Feb 26 16:03:00 crc kubenswrapper[4907]: I0226 16:03:00.037511 4907 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/swift-ring-rebalance-zj4xn"] Feb 26 16:03:00 crc kubenswrapper[4907]: I0226 16:03:00.038420 4907 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/swift-ring-rebalance-zj4xn" Feb 26 16:03:00 crc kubenswrapper[4907]: I0226 16:03:00.067123 4907 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/swift-ring-rebalance-ndhsw"] Feb 26 16:03:00 crc kubenswrapper[4907]: I0226 16:03:00.097606 4907 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/swift-ring-rebalance-zj4xn"] Feb 26 16:03:00 crc kubenswrapper[4907]: I0226 16:03:00.107692 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/95e78aa7-fc29-4bfc-a65e-3f8a738631c7-ring-data-devices\") pod \"swift-ring-rebalance-ndhsw\" (UID: \"95e78aa7-fc29-4bfc-a65e-3f8a738631c7\") " pod="openstack/swift-ring-rebalance-ndhsw" Feb 26 16:03:00 crc kubenswrapper[4907]: I0226 16:03:00.107743 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/c5f9c74c-c90c-40ba-9548-dc79f90592a4-swiftconf\") pod \"swift-ring-rebalance-zj4xn\" (UID: \"c5f9c74c-c90c-40ba-9548-dc79f90592a4\") " pod="openstack/swift-ring-rebalance-zj4xn" Feb 26 16:03:00 crc kubenswrapper[4907]: I0226 16:03:00.107783 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/c5f9c74c-c90c-40ba-9548-dc79f90592a4-ring-data-devices\") pod \"swift-ring-rebalance-zj4xn\" (UID: \"c5f9c74c-c90c-40ba-9548-dc79f90592a4\") " pod="openstack/swift-ring-rebalance-zj4xn" Feb 26 16:03:00 crc kubenswrapper[4907]: I0226 16:03:00.107831 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/95e78aa7-fc29-4bfc-a65e-3f8a738631c7-dispersionconf\") pod \"swift-ring-rebalance-ndhsw\" (UID: \"95e78aa7-fc29-4bfc-a65e-3f8a738631c7\") " 
pod="openstack/swift-ring-rebalance-ndhsw" Feb 26 16:03:00 crc kubenswrapper[4907]: I0226 16:03:00.107846 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/95e78aa7-fc29-4bfc-a65e-3f8a738631c7-swiftconf\") pod \"swift-ring-rebalance-ndhsw\" (UID: \"95e78aa7-fc29-4bfc-a65e-3f8a738631c7\") " pod="openstack/swift-ring-rebalance-ndhsw" Feb 26 16:03:00 crc kubenswrapper[4907]: I0226 16:03:00.107914 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/95e78aa7-fc29-4bfc-a65e-3f8a738631c7-combined-ca-bundle\") pod \"swift-ring-rebalance-ndhsw\" (UID: \"95e78aa7-fc29-4bfc-a65e-3f8a738631c7\") " pod="openstack/swift-ring-rebalance-ndhsw" Feb 26 16:03:00 crc kubenswrapper[4907]: I0226 16:03:00.107935 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/95e78aa7-fc29-4bfc-a65e-3f8a738631c7-scripts\") pod \"swift-ring-rebalance-ndhsw\" (UID: \"95e78aa7-fc29-4bfc-a65e-3f8a738631c7\") " pod="openstack/swift-ring-rebalance-ndhsw" Feb 26 16:03:00 crc kubenswrapper[4907]: I0226 16:03:00.107986 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/c5f9c74c-c90c-40ba-9548-dc79f90592a4-scripts\") pod \"swift-ring-rebalance-zj4xn\" (UID: \"c5f9c74c-c90c-40ba-9548-dc79f90592a4\") " pod="openstack/swift-ring-rebalance-zj4xn" Feb 26 16:03:00 crc kubenswrapper[4907]: I0226 16:03:00.108007 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c5f9c74c-c90c-40ba-9548-dc79f90592a4-combined-ca-bundle\") pod \"swift-ring-rebalance-zj4xn\" (UID: \"c5f9c74c-c90c-40ba-9548-dc79f90592a4\") " 
pod="openstack/swift-ring-rebalance-zj4xn" Feb 26 16:03:00 crc kubenswrapper[4907]: I0226 16:03:00.108046 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/c5f9c74c-c90c-40ba-9548-dc79f90592a4-dispersionconf\") pod \"swift-ring-rebalance-zj4xn\" (UID: \"c5f9c74c-c90c-40ba-9548-dc79f90592a4\") " pod="openstack/swift-ring-rebalance-zj4xn" Feb 26 16:03:00 crc kubenswrapper[4907]: I0226 16:03:00.108072 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/95e78aa7-fc29-4bfc-a65e-3f8a738631c7-etc-swift\") pod \"swift-ring-rebalance-ndhsw\" (UID: \"95e78aa7-fc29-4bfc-a65e-3f8a738631c7\") " pod="openstack/swift-ring-rebalance-ndhsw" Feb 26 16:03:00 crc kubenswrapper[4907]: I0226 16:03:00.108119 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/c5f9c74c-c90c-40ba-9548-dc79f90592a4-etc-swift\") pod \"swift-ring-rebalance-zj4xn\" (UID: \"c5f9c74c-c90c-40ba-9548-dc79f90592a4\") " pod="openstack/swift-ring-rebalance-zj4xn" Feb 26 16:03:00 crc kubenswrapper[4907]: I0226 16:03:00.108138 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pt8mc\" (UniqueName: \"kubernetes.io/projected/95e78aa7-fc29-4bfc-a65e-3f8a738631c7-kube-api-access-pt8mc\") pod \"swift-ring-rebalance-ndhsw\" (UID: \"95e78aa7-fc29-4bfc-a65e-3f8a738631c7\") " pod="openstack/swift-ring-rebalance-ndhsw" Feb 26 16:03:00 crc kubenswrapper[4907]: I0226 16:03:00.108202 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lfdt8\" (UniqueName: \"kubernetes.io/projected/c5f9c74c-c90c-40ba-9548-dc79f90592a4-kube-api-access-lfdt8\") pod \"swift-ring-rebalance-zj4xn\" (UID: 
\"c5f9c74c-c90c-40ba-9548-dc79f90592a4\") " pod="openstack/swift-ring-rebalance-zj4xn" Feb 26 16:03:00 crc kubenswrapper[4907]: I0226 16:03:00.137567 4907 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="369d3e0e-75cc-440c-95d3-cfb112914d57" path="/var/lib/kubelet/pods/369d3e0e-75cc-440c-95d3-cfb112914d57/volumes" Feb 26 16:03:00 crc kubenswrapper[4907]: I0226 16:03:00.167454 4907 generic.go:334] "Generic (PLEG): container finished" podID="62d9c258-3e92-48cc-a4b2-7207c93a6346" containerID="f2f7a47e0fc218334665a617bdfc44a9c0081ee1fa2a37cc44ccaab1cd2c78b1" exitCode=0 Feb 26 16:03:00 crc kubenswrapper[4907]: I0226 16:03:00.167510 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-b8fbc5445-xx2fj" event={"ID":"62d9c258-3e92-48cc-a4b2-7207c93a6346","Type":"ContainerDied","Data":"f2f7a47e0fc218334665a617bdfc44a9c0081ee1fa2a37cc44ccaab1cd2c78b1"} Feb 26 16:03:00 crc kubenswrapper[4907]: I0226 16:03:00.167534 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-b8fbc5445-xx2fj" event={"ID":"62d9c258-3e92-48cc-a4b2-7207c93a6346","Type":"ContainerStarted","Data":"a48aed336a0f131903b43291a333c026068552b800492ce4535ef8aee8254245"} Feb 26 16:03:00 crc kubenswrapper[4907]: I0226 16:03:00.170253 4907 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/swift-ring-rebalance-ndhsw" Feb 26 16:03:00 crc kubenswrapper[4907]: I0226 16:03:00.204625 4907 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/swift-ring-rebalance-ndhsw" Feb 26 16:03:00 crc kubenswrapper[4907]: I0226 16:03:00.209284 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lfdt8\" (UniqueName: \"kubernetes.io/projected/c5f9c74c-c90c-40ba-9548-dc79f90592a4-kube-api-access-lfdt8\") pod \"swift-ring-rebalance-zj4xn\" (UID: \"c5f9c74c-c90c-40ba-9548-dc79f90592a4\") " pod="openstack/swift-ring-rebalance-zj4xn" Feb 26 16:03:00 crc kubenswrapper[4907]: I0226 16:03:00.209336 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/95e78aa7-fc29-4bfc-a65e-3f8a738631c7-ring-data-devices\") pod \"swift-ring-rebalance-ndhsw\" (UID: \"95e78aa7-fc29-4bfc-a65e-3f8a738631c7\") " pod="openstack/swift-ring-rebalance-ndhsw" Feb 26 16:03:00 crc kubenswrapper[4907]: I0226 16:03:00.209386 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/c5f9c74c-c90c-40ba-9548-dc79f90592a4-swiftconf\") pod \"swift-ring-rebalance-zj4xn\" (UID: \"c5f9c74c-c90c-40ba-9548-dc79f90592a4\") " pod="openstack/swift-ring-rebalance-zj4xn" Feb 26 16:03:00 crc kubenswrapper[4907]: I0226 16:03:00.209443 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/c5f9c74c-c90c-40ba-9548-dc79f90592a4-ring-data-devices\") pod \"swift-ring-rebalance-zj4xn\" (UID: \"c5f9c74c-c90c-40ba-9548-dc79f90592a4\") " pod="openstack/swift-ring-rebalance-zj4xn" Feb 26 16:03:00 crc kubenswrapper[4907]: I0226 16:03:00.209477 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/95e78aa7-fc29-4bfc-a65e-3f8a738631c7-dispersionconf\") pod \"swift-ring-rebalance-ndhsw\" (UID: \"95e78aa7-fc29-4bfc-a65e-3f8a738631c7\") " 
pod="openstack/swift-ring-rebalance-ndhsw" Feb 26 16:03:00 crc kubenswrapper[4907]: I0226 16:03:00.209494 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/95e78aa7-fc29-4bfc-a65e-3f8a738631c7-swiftconf\") pod \"swift-ring-rebalance-ndhsw\" (UID: \"95e78aa7-fc29-4bfc-a65e-3f8a738631c7\") " pod="openstack/swift-ring-rebalance-ndhsw" Feb 26 16:03:00 crc kubenswrapper[4907]: I0226 16:03:00.209513 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/819c7fec-fd22-478a-bf6c-f4cb5aeccc59-etc-swift\") pod \"swift-storage-0\" (UID: \"819c7fec-fd22-478a-bf6c-f4cb5aeccc59\") " pod="openstack/swift-storage-0" Feb 26 16:03:00 crc kubenswrapper[4907]: I0226 16:03:00.209538 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/95e78aa7-fc29-4bfc-a65e-3f8a738631c7-combined-ca-bundle\") pod \"swift-ring-rebalance-ndhsw\" (UID: \"95e78aa7-fc29-4bfc-a65e-3f8a738631c7\") " pod="openstack/swift-ring-rebalance-ndhsw" Feb 26 16:03:00 crc kubenswrapper[4907]: I0226 16:03:00.209571 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/95e78aa7-fc29-4bfc-a65e-3f8a738631c7-scripts\") pod \"swift-ring-rebalance-ndhsw\" (UID: \"95e78aa7-fc29-4bfc-a65e-3f8a738631c7\") " pod="openstack/swift-ring-rebalance-ndhsw" Feb 26 16:03:00 crc kubenswrapper[4907]: I0226 16:03:00.209635 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/c5f9c74c-c90c-40ba-9548-dc79f90592a4-scripts\") pod \"swift-ring-rebalance-zj4xn\" (UID: \"c5f9c74c-c90c-40ba-9548-dc79f90592a4\") " pod="openstack/swift-ring-rebalance-zj4xn" Feb 26 16:03:00 crc kubenswrapper[4907]: I0226 16:03:00.209654 4907 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c5f9c74c-c90c-40ba-9548-dc79f90592a4-combined-ca-bundle\") pod \"swift-ring-rebalance-zj4xn\" (UID: \"c5f9c74c-c90c-40ba-9548-dc79f90592a4\") " pod="openstack/swift-ring-rebalance-zj4xn" Feb 26 16:03:00 crc kubenswrapper[4907]: I0226 16:03:00.209670 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/c5f9c74c-c90c-40ba-9548-dc79f90592a4-dispersionconf\") pod \"swift-ring-rebalance-zj4xn\" (UID: \"c5f9c74c-c90c-40ba-9548-dc79f90592a4\") " pod="openstack/swift-ring-rebalance-zj4xn" Feb 26 16:03:00 crc kubenswrapper[4907]: I0226 16:03:00.209703 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/95e78aa7-fc29-4bfc-a65e-3f8a738631c7-etc-swift\") pod \"swift-ring-rebalance-ndhsw\" (UID: \"95e78aa7-fc29-4bfc-a65e-3f8a738631c7\") " pod="openstack/swift-ring-rebalance-ndhsw" Feb 26 16:03:00 crc kubenswrapper[4907]: I0226 16:03:00.209739 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/c5f9c74c-c90c-40ba-9548-dc79f90592a4-etc-swift\") pod \"swift-ring-rebalance-zj4xn\" (UID: \"c5f9c74c-c90c-40ba-9548-dc79f90592a4\") " pod="openstack/swift-ring-rebalance-zj4xn" Feb 26 16:03:00 crc kubenswrapper[4907]: I0226 16:03:00.209768 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pt8mc\" (UniqueName: \"kubernetes.io/projected/95e78aa7-fc29-4bfc-a65e-3f8a738631c7-kube-api-access-pt8mc\") pod \"swift-ring-rebalance-ndhsw\" (UID: \"95e78aa7-fc29-4bfc-a65e-3f8a738631c7\") " pod="openstack/swift-ring-rebalance-ndhsw" Feb 26 16:03:00 crc kubenswrapper[4907]: I0226 16:03:00.210841 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: 
\"kubernetes.io/configmap/95e78aa7-fc29-4bfc-a65e-3f8a738631c7-scripts\") pod \"swift-ring-rebalance-ndhsw\" (UID: \"95e78aa7-fc29-4bfc-a65e-3f8a738631c7\") " pod="openstack/swift-ring-rebalance-ndhsw" Feb 26 16:03:00 crc kubenswrapper[4907]: I0226 16:03:00.211320 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/95e78aa7-fc29-4bfc-a65e-3f8a738631c7-etc-swift\") pod \"swift-ring-rebalance-ndhsw\" (UID: \"95e78aa7-fc29-4bfc-a65e-3f8a738631c7\") " pod="openstack/swift-ring-rebalance-ndhsw" Feb 26 16:03:00 crc kubenswrapper[4907]: I0226 16:03:00.211477 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/95e78aa7-fc29-4bfc-a65e-3f8a738631c7-ring-data-devices\") pod \"swift-ring-rebalance-ndhsw\" (UID: \"95e78aa7-fc29-4bfc-a65e-3f8a738631c7\") " pod="openstack/swift-ring-rebalance-ndhsw" Feb 26 16:03:00 crc kubenswrapper[4907]: I0226 16:03:00.211954 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/c5f9c74c-c90c-40ba-9548-dc79f90592a4-etc-swift\") pod \"swift-ring-rebalance-zj4xn\" (UID: \"c5f9c74c-c90c-40ba-9548-dc79f90592a4\") " pod="openstack/swift-ring-rebalance-zj4xn" Feb 26 16:03:00 crc kubenswrapper[4907]: I0226 16:03:00.214890 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/c5f9c74c-c90c-40ba-9548-dc79f90592a4-scripts\") pod \"swift-ring-rebalance-zj4xn\" (UID: \"c5f9c74c-c90c-40ba-9548-dc79f90592a4\") " pod="openstack/swift-ring-rebalance-zj4xn" Feb 26 16:03:00 crc kubenswrapper[4907]: I0226 16:03:00.218686 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c5f9c74c-c90c-40ba-9548-dc79f90592a4-combined-ca-bundle\") pod \"swift-ring-rebalance-zj4xn\" (UID: 
\"c5f9c74c-c90c-40ba-9548-dc79f90592a4\") " pod="openstack/swift-ring-rebalance-zj4xn" Feb 26 16:03:00 crc kubenswrapper[4907]: I0226 16:03:00.220874 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/c5f9c74c-c90c-40ba-9548-dc79f90592a4-dispersionconf\") pod \"swift-ring-rebalance-zj4xn\" (UID: \"c5f9c74c-c90c-40ba-9548-dc79f90592a4\") " pod="openstack/swift-ring-rebalance-zj4xn" Feb 26 16:03:00 crc kubenswrapper[4907]: I0226 16:03:00.223684 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/c5f9c74c-c90c-40ba-9548-dc79f90592a4-ring-data-devices\") pod \"swift-ring-rebalance-zj4xn\" (UID: \"c5f9c74c-c90c-40ba-9548-dc79f90592a4\") " pod="openstack/swift-ring-rebalance-zj4xn" Feb 26 16:03:00 crc kubenswrapper[4907]: E0226 16:03:00.224610 4907 projected.go:288] Couldn't get configMap openstack/swift-ring-files: configmap "swift-ring-files" not found Feb 26 16:03:00 crc kubenswrapper[4907]: E0226 16:03:00.224630 4907 projected.go:194] Error preparing data for projected volume etc-swift for pod openstack/swift-storage-0: configmap "swift-ring-files" not found Feb 26 16:03:00 crc kubenswrapper[4907]: E0226 16:03:00.224666 4907 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/819c7fec-fd22-478a-bf6c-f4cb5aeccc59-etc-swift podName:819c7fec-fd22-478a-bf6c-f4cb5aeccc59 nodeName:}" failed. No retries permitted until 2026-02-26 16:03:01.224652218 +0000 UTC m=+1243.743214067 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "etc-swift" (UniqueName: "kubernetes.io/projected/819c7fec-fd22-478a-bf6c-f4cb5aeccc59-etc-swift") pod "swift-storage-0" (UID: "819c7fec-fd22-478a-bf6c-f4cb5aeccc59") : configmap "swift-ring-files" not found Feb 26 16:03:00 crc kubenswrapper[4907]: I0226 16:03:00.229901 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lfdt8\" (UniqueName: \"kubernetes.io/projected/c5f9c74c-c90c-40ba-9548-dc79f90592a4-kube-api-access-lfdt8\") pod \"swift-ring-rebalance-zj4xn\" (UID: \"c5f9c74c-c90c-40ba-9548-dc79f90592a4\") " pod="openstack/swift-ring-rebalance-zj4xn" Feb 26 16:03:00 crc kubenswrapper[4907]: I0226 16:03:00.230850 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/95e78aa7-fc29-4bfc-a65e-3f8a738631c7-swiftconf\") pod \"swift-ring-rebalance-ndhsw\" (UID: \"95e78aa7-fc29-4bfc-a65e-3f8a738631c7\") " pod="openstack/swift-ring-rebalance-ndhsw" Feb 26 16:03:00 crc kubenswrapper[4907]: I0226 16:03:00.231715 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pt8mc\" (UniqueName: \"kubernetes.io/projected/95e78aa7-fc29-4bfc-a65e-3f8a738631c7-kube-api-access-pt8mc\") pod \"swift-ring-rebalance-ndhsw\" (UID: \"95e78aa7-fc29-4bfc-a65e-3f8a738631c7\") " pod="openstack/swift-ring-rebalance-ndhsw" Feb 26 16:03:00 crc kubenswrapper[4907]: I0226 16:03:00.251258 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/c5f9c74c-c90c-40ba-9548-dc79f90592a4-swiftconf\") pod \"swift-ring-rebalance-zj4xn\" (UID: \"c5f9c74c-c90c-40ba-9548-dc79f90592a4\") " pod="openstack/swift-ring-rebalance-zj4xn" Feb 26 16:03:00 crc kubenswrapper[4907]: I0226 16:03:00.252031 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dispersionconf\" (UniqueName: 
\"kubernetes.io/secret/95e78aa7-fc29-4bfc-a65e-3f8a738631c7-dispersionconf\") pod \"swift-ring-rebalance-ndhsw\" (UID: \"95e78aa7-fc29-4bfc-a65e-3f8a738631c7\") " pod="openstack/swift-ring-rebalance-ndhsw" Feb 26 16:03:00 crc kubenswrapper[4907]: I0226 16:03:00.252907 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/95e78aa7-fc29-4bfc-a65e-3f8a738631c7-combined-ca-bundle\") pod \"swift-ring-rebalance-ndhsw\" (UID: \"95e78aa7-fc29-4bfc-a65e-3f8a738631c7\") " pod="openstack/swift-ring-rebalance-ndhsw" Feb 26 16:03:00 crc kubenswrapper[4907]: I0226 16:03:00.338880 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/95e78aa7-fc29-4bfc-a65e-3f8a738631c7-swiftconf\") pod \"95e78aa7-fc29-4bfc-a65e-3f8a738631c7\" (UID: \"95e78aa7-fc29-4bfc-a65e-3f8a738631c7\") " Feb 26 16:03:00 crc kubenswrapper[4907]: I0226 16:03:00.338936 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/95e78aa7-fc29-4bfc-a65e-3f8a738631c7-ring-data-devices\") pod \"95e78aa7-fc29-4bfc-a65e-3f8a738631c7\" (UID: \"95e78aa7-fc29-4bfc-a65e-3f8a738631c7\") " Feb 26 16:03:00 crc kubenswrapper[4907]: I0226 16:03:00.338987 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/95e78aa7-fc29-4bfc-a65e-3f8a738631c7-combined-ca-bundle\") pod \"95e78aa7-fc29-4bfc-a65e-3f8a738631c7\" (UID: \"95e78aa7-fc29-4bfc-a65e-3f8a738631c7\") " Feb 26 16:03:00 crc kubenswrapper[4907]: I0226 16:03:00.339010 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pt8mc\" (UniqueName: \"kubernetes.io/projected/95e78aa7-fc29-4bfc-a65e-3f8a738631c7-kube-api-access-pt8mc\") pod \"95e78aa7-fc29-4bfc-a65e-3f8a738631c7\" (UID: 
\"95e78aa7-fc29-4bfc-a65e-3f8a738631c7\") " Feb 26 16:03:00 crc kubenswrapper[4907]: I0226 16:03:00.339052 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/95e78aa7-fc29-4bfc-a65e-3f8a738631c7-etc-swift\") pod \"95e78aa7-fc29-4bfc-a65e-3f8a738631c7\" (UID: \"95e78aa7-fc29-4bfc-a65e-3f8a738631c7\") " Feb 26 16:03:00 crc kubenswrapper[4907]: I0226 16:03:00.339089 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/95e78aa7-fc29-4bfc-a65e-3f8a738631c7-dispersionconf\") pod \"95e78aa7-fc29-4bfc-a65e-3f8a738631c7\" (UID: \"95e78aa7-fc29-4bfc-a65e-3f8a738631c7\") " Feb 26 16:03:00 crc kubenswrapper[4907]: I0226 16:03:00.339235 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/95e78aa7-fc29-4bfc-a65e-3f8a738631c7-scripts\") pod \"95e78aa7-fc29-4bfc-a65e-3f8a738631c7\" (UID: \"95e78aa7-fc29-4bfc-a65e-3f8a738631c7\") " Feb 26 16:03:00 crc kubenswrapper[4907]: I0226 16:03:00.340754 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/95e78aa7-fc29-4bfc-a65e-3f8a738631c7-ring-data-devices" (OuterVolumeSpecName: "ring-data-devices") pod "95e78aa7-fc29-4bfc-a65e-3f8a738631c7" (UID: "95e78aa7-fc29-4bfc-a65e-3f8a738631c7"). InnerVolumeSpecName "ring-data-devices". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 26 16:03:00 crc kubenswrapper[4907]: I0226 16:03:00.341290 4907 reconciler_common.go:293] "Volume detached for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/95e78aa7-fc29-4bfc-a65e-3f8a738631c7-ring-data-devices\") on node \"crc\" DevicePath \"\"" Feb 26 16:03:00 crc kubenswrapper[4907]: I0226 16:03:00.342004 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/95e78aa7-fc29-4bfc-a65e-3f8a738631c7-scripts" (OuterVolumeSpecName: "scripts") pod "95e78aa7-fc29-4bfc-a65e-3f8a738631c7" (UID: "95e78aa7-fc29-4bfc-a65e-3f8a738631c7"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 26 16:03:00 crc kubenswrapper[4907]: I0226 16:03:00.345795 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/95e78aa7-fc29-4bfc-a65e-3f8a738631c7-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "95e78aa7-fc29-4bfc-a65e-3f8a738631c7" (UID: "95e78aa7-fc29-4bfc-a65e-3f8a738631c7"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 26 16:03:00 crc kubenswrapper[4907]: I0226 16:03:00.346123 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/95e78aa7-fc29-4bfc-a65e-3f8a738631c7-etc-swift" (OuterVolumeSpecName: "etc-swift") pod "95e78aa7-fc29-4bfc-a65e-3f8a738631c7" (UID: "95e78aa7-fc29-4bfc-a65e-3f8a738631c7"). InnerVolumeSpecName "etc-swift". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 26 16:03:00 crc kubenswrapper[4907]: I0226 16:03:00.347431 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/95e78aa7-fc29-4bfc-a65e-3f8a738631c7-dispersionconf" (OuterVolumeSpecName: "dispersionconf") pod "95e78aa7-fc29-4bfc-a65e-3f8a738631c7" (UID: "95e78aa7-fc29-4bfc-a65e-3f8a738631c7"). 
InnerVolumeSpecName "dispersionconf". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 26 16:03:00 crc kubenswrapper[4907]: I0226 16:03:00.348707 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/95e78aa7-fc29-4bfc-a65e-3f8a738631c7-kube-api-access-pt8mc" (OuterVolumeSpecName: "kube-api-access-pt8mc") pod "95e78aa7-fc29-4bfc-a65e-3f8a738631c7" (UID: "95e78aa7-fc29-4bfc-a65e-3f8a738631c7"). InnerVolumeSpecName "kube-api-access-pt8mc". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 26 16:03:00 crc kubenswrapper[4907]: I0226 16:03:00.349778 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/95e78aa7-fc29-4bfc-a65e-3f8a738631c7-swiftconf" (OuterVolumeSpecName: "swiftconf") pod "95e78aa7-fc29-4bfc-a65e-3f8a738631c7" (UID: "95e78aa7-fc29-4bfc-a65e-3f8a738631c7"). InnerVolumeSpecName "swiftconf". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 26 16:03:00 crc kubenswrapper[4907]: I0226 16:03:00.375444 4907 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/swift-ring-rebalance-zj4xn" Feb 26 16:03:00 crc kubenswrapper[4907]: I0226 16:03:00.442660 4907 reconciler_common.go:293] "Volume detached for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/95e78aa7-fc29-4bfc-a65e-3f8a738631c7-swiftconf\") on node \"crc\" DevicePath \"\"" Feb 26 16:03:00 crc kubenswrapper[4907]: I0226 16:03:00.442954 4907 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/95e78aa7-fc29-4bfc-a65e-3f8a738631c7-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 26 16:03:00 crc kubenswrapper[4907]: I0226 16:03:00.442974 4907 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pt8mc\" (UniqueName: \"kubernetes.io/projected/95e78aa7-fc29-4bfc-a65e-3f8a738631c7-kube-api-access-pt8mc\") on node \"crc\" DevicePath \"\"" Feb 26 16:03:00 crc kubenswrapper[4907]: I0226 16:03:00.442992 4907 reconciler_common.go:293] "Volume detached for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/95e78aa7-fc29-4bfc-a65e-3f8a738631c7-etc-swift\") on node \"crc\" DevicePath \"\"" Feb 26 16:03:00 crc kubenswrapper[4907]: I0226 16:03:00.443011 4907 reconciler_common.go:293] "Volume detached for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/95e78aa7-fc29-4bfc-a65e-3f8a738631c7-dispersionconf\") on node \"crc\" DevicePath \"\"" Feb 26 16:03:00 crc kubenswrapper[4907]: I0226 16:03:00.443030 4907 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/95e78aa7-fc29-4bfc-a65e-3f8a738631c7-scripts\") on node \"crc\" DevicePath \"\"" Feb 26 16:03:00 crc kubenswrapper[4907]: I0226 16:03:00.842523 4907 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/swift-ring-rebalance-zj4xn"] Feb 26 16:03:01 crc kubenswrapper[4907]: I0226 16:03:01.179779 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-b8fbc5445-xx2fj" 
event={"ID":"62d9c258-3e92-48cc-a4b2-7207c93a6346","Type":"ContainerStarted","Data":"c7b7197fc16c5531ccf4f45093fd0c8f8d3d99749cd680b025e8890044887cce"} Feb 26 16:03:01 crc kubenswrapper[4907]: I0226 16:03:01.179947 4907 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-b8fbc5445-xx2fj" Feb 26 16:03:01 crc kubenswrapper[4907]: I0226 16:03:01.180935 4907 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/swift-ring-rebalance-ndhsw" Feb 26 16:03:01 crc kubenswrapper[4907]: I0226 16:03:01.180940 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-ring-rebalance-zj4xn" event={"ID":"c5f9c74c-c90c-40ba-9548-dc79f90592a4","Type":"ContainerStarted","Data":"22e11871b81b84490819a0fa5a32ddce14c902572365fe8cce2c4b5a9204aead"} Feb 26 16:03:01 crc kubenswrapper[4907]: I0226 16:03:01.206144 4907 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-b8fbc5445-xx2fj" podStartSLOduration=3.206119234 podStartE2EDuration="3.206119234s" podCreationTimestamp="2026-02-26 16:02:58 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-26 16:03:01.204153087 +0000 UTC m=+1243.722714946" watchObservedRunningTime="2026-02-26 16:03:01.206119234 +0000 UTC m=+1243.724681083" Feb 26 16:03:01 crc kubenswrapper[4907]: I0226 16:03:01.249007 4907 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/swift-ring-rebalance-ndhsw"] Feb 26 16:03:01 crc kubenswrapper[4907]: I0226 16:03:01.263696 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/819c7fec-fd22-478a-bf6c-f4cb5aeccc59-etc-swift\") pod \"swift-storage-0\" (UID: \"819c7fec-fd22-478a-bf6c-f4cb5aeccc59\") " pod="openstack/swift-storage-0" Feb 26 16:03:01 crc kubenswrapper[4907]: E0226 16:03:01.264205 4907 projected.go:288] 
Couldn't get configMap openstack/swift-ring-files: configmap "swift-ring-files" not found Feb 26 16:03:01 crc kubenswrapper[4907]: E0226 16:03:01.264237 4907 projected.go:194] Error preparing data for projected volume etc-swift for pod openstack/swift-storage-0: configmap "swift-ring-files" not found Feb 26 16:03:01 crc kubenswrapper[4907]: E0226 16:03:01.264282 4907 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/819c7fec-fd22-478a-bf6c-f4cb5aeccc59-etc-swift podName:819c7fec-fd22-478a-bf6c-f4cb5aeccc59 nodeName:}" failed. No retries permitted until 2026-02-26 16:03:03.264264007 +0000 UTC m=+1245.782825946 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "etc-swift" (UniqueName: "kubernetes.io/projected/819c7fec-fd22-478a-bf6c-f4cb5aeccc59-etc-swift") pod "swift-storage-0" (UID: "819c7fec-fd22-478a-bf6c-f4cb5aeccc59") : configmap "swift-ring-files" not found Feb 26 16:03:01 crc kubenswrapper[4907]: I0226 16:03:01.272159 4907 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/swift-ring-rebalance-ndhsw"] Feb 26 16:03:02 crc kubenswrapper[4907]: I0226 16:03:02.137626 4907 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="95e78aa7-fc29-4bfc-a65e-3f8a738631c7" path="/var/lib/kubelet/pods/95e78aa7-fc29-4bfc-a65e-3f8a738631c7/volumes" Feb 26 16:03:03 crc kubenswrapper[4907]: I0226 16:03:03.305265 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/819c7fec-fd22-478a-bf6c-f4cb5aeccc59-etc-swift\") pod \"swift-storage-0\" (UID: \"819c7fec-fd22-478a-bf6c-f4cb5aeccc59\") " pod="openstack/swift-storage-0" Feb 26 16:03:03 crc kubenswrapper[4907]: E0226 16:03:03.305465 4907 projected.go:288] Couldn't get configMap openstack/swift-ring-files: configmap "swift-ring-files" not found Feb 26 16:03:03 crc kubenswrapper[4907]: E0226 16:03:03.305486 4907 projected.go:194] Error preparing data for projected volume 
etc-swift for pod openstack/swift-storage-0: configmap "swift-ring-files" not found Feb 26 16:03:03 crc kubenswrapper[4907]: E0226 16:03:03.305537 4907 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/819c7fec-fd22-478a-bf6c-f4cb5aeccc59-etc-swift podName:819c7fec-fd22-478a-bf6c-f4cb5aeccc59 nodeName:}" failed. No retries permitted until 2026-02-26 16:03:07.305519071 +0000 UTC m=+1249.824080920 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "etc-swift" (UniqueName: "kubernetes.io/projected/819c7fec-fd22-478a-bf6c-f4cb5aeccc59-etc-swift") pod "swift-storage-0" (UID: "819c7fec-fd22-478a-bf6c-f4cb5aeccc59") : configmap "swift-ring-files" not found Feb 26 16:03:04 crc kubenswrapper[4907]: I0226 16:03:04.094197 4907 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/openstack-galera-0" Feb 26 16:03:04 crc kubenswrapper[4907]: I0226 16:03:04.095078 4907 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/openstack-galera-0" Feb 26 16:03:04 crc kubenswrapper[4907]: I0226 16:03:04.186815 4907 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/openstack-galera-0" Feb 26 16:03:04 crc kubenswrapper[4907]: I0226 16:03:04.281774 4907 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/openstack-galera-0" Feb 26 16:03:05 crc kubenswrapper[4907]: I0226 16:03:05.486011 4907 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/openstack-cell1-galera-0" Feb 26 16:03:05 crc kubenswrapper[4907]: I0226 16:03:05.486103 4907 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/openstack-cell1-galera-0" Feb 26 16:03:06 crc kubenswrapper[4907]: I0226 16:03:06.411040 4907 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-db-create-8plbn"] Feb 26 16:03:06 crc kubenswrapper[4907]: I0226 16:03:06.413061 4907 util.go:30] "No sandbox for 
pod can be found. Need to start a new one" pod="openstack/glance-db-create-8plbn" Feb 26 16:03:06 crc kubenswrapper[4907]: I0226 16:03:06.432201 4907 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-db-create-8plbn"] Feb 26 16:03:06 crc kubenswrapper[4907]: I0226 16:03:06.464844 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6xb6t\" (UniqueName: \"kubernetes.io/projected/56d62e78-4aa2-4ae7-84dd-99e58e0deb68-kube-api-access-6xb6t\") pod \"glance-db-create-8plbn\" (UID: \"56d62e78-4aa2-4ae7-84dd-99e58e0deb68\") " pod="openstack/glance-db-create-8plbn" Feb 26 16:03:06 crc kubenswrapper[4907]: I0226 16:03:06.465139 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/56d62e78-4aa2-4ae7-84dd-99e58e0deb68-operator-scripts\") pod \"glance-db-create-8plbn\" (UID: \"56d62e78-4aa2-4ae7-84dd-99e58e0deb68\") " pod="openstack/glance-db-create-8plbn" Feb 26 16:03:06 crc kubenswrapper[4907]: I0226 16:03:06.487749 4907 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-0813-account-create-update-rz8gs"] Feb 26 16:03:06 crc kubenswrapper[4907]: I0226 16:03:06.488638 4907 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-0813-account-create-update-rz8gs" Feb 26 16:03:06 crc kubenswrapper[4907]: I0226 16:03:06.490806 4907 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-db-secret" Feb 26 16:03:06 crc kubenswrapper[4907]: I0226 16:03:06.527924 4907 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-0813-account-create-update-rz8gs"] Feb 26 16:03:06 crc kubenswrapper[4907]: I0226 16:03:06.566268 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6xb6t\" (UniqueName: \"kubernetes.io/projected/56d62e78-4aa2-4ae7-84dd-99e58e0deb68-kube-api-access-6xb6t\") pod \"glance-db-create-8plbn\" (UID: \"56d62e78-4aa2-4ae7-84dd-99e58e0deb68\") " pod="openstack/glance-db-create-8plbn" Feb 26 16:03:06 crc kubenswrapper[4907]: I0226 16:03:06.566489 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/8aea8ea2-97bd-4315-9335-8fbe73ab8ec2-operator-scripts\") pod \"glance-0813-account-create-update-rz8gs\" (UID: \"8aea8ea2-97bd-4315-9335-8fbe73ab8ec2\") " pod="openstack/glance-0813-account-create-update-rz8gs" Feb 26 16:03:06 crc kubenswrapper[4907]: I0226 16:03:06.566651 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2bcpr\" (UniqueName: \"kubernetes.io/projected/8aea8ea2-97bd-4315-9335-8fbe73ab8ec2-kube-api-access-2bcpr\") pod \"glance-0813-account-create-update-rz8gs\" (UID: \"8aea8ea2-97bd-4315-9335-8fbe73ab8ec2\") " pod="openstack/glance-0813-account-create-update-rz8gs" Feb 26 16:03:06 crc kubenswrapper[4907]: I0226 16:03:06.566732 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/56d62e78-4aa2-4ae7-84dd-99e58e0deb68-operator-scripts\") pod \"glance-db-create-8plbn\" (UID: 
\"56d62e78-4aa2-4ae7-84dd-99e58e0deb68\") " pod="openstack/glance-db-create-8plbn" Feb 26 16:03:06 crc kubenswrapper[4907]: I0226 16:03:06.567400 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/56d62e78-4aa2-4ae7-84dd-99e58e0deb68-operator-scripts\") pod \"glance-db-create-8plbn\" (UID: \"56d62e78-4aa2-4ae7-84dd-99e58e0deb68\") " pod="openstack/glance-db-create-8plbn" Feb 26 16:03:06 crc kubenswrapper[4907]: I0226 16:03:06.613209 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6xb6t\" (UniqueName: \"kubernetes.io/projected/56d62e78-4aa2-4ae7-84dd-99e58e0deb68-kube-api-access-6xb6t\") pod \"glance-db-create-8plbn\" (UID: \"56d62e78-4aa2-4ae7-84dd-99e58e0deb68\") " pod="openstack/glance-db-create-8plbn" Feb 26 16:03:06 crc kubenswrapper[4907]: I0226 16:03:06.667977 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2bcpr\" (UniqueName: \"kubernetes.io/projected/8aea8ea2-97bd-4315-9335-8fbe73ab8ec2-kube-api-access-2bcpr\") pod \"glance-0813-account-create-update-rz8gs\" (UID: \"8aea8ea2-97bd-4315-9335-8fbe73ab8ec2\") " pod="openstack/glance-0813-account-create-update-rz8gs" Feb 26 16:03:06 crc kubenswrapper[4907]: I0226 16:03:06.668142 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/8aea8ea2-97bd-4315-9335-8fbe73ab8ec2-operator-scripts\") pod \"glance-0813-account-create-update-rz8gs\" (UID: \"8aea8ea2-97bd-4315-9335-8fbe73ab8ec2\") " pod="openstack/glance-0813-account-create-update-rz8gs" Feb 26 16:03:06 crc kubenswrapper[4907]: I0226 16:03:06.670853 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/8aea8ea2-97bd-4315-9335-8fbe73ab8ec2-operator-scripts\") pod \"glance-0813-account-create-update-rz8gs\" (UID: 
\"8aea8ea2-97bd-4315-9335-8fbe73ab8ec2\") " pod="openstack/glance-0813-account-create-update-rz8gs" Feb 26 16:03:06 crc kubenswrapper[4907]: I0226 16:03:06.687748 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2bcpr\" (UniqueName: \"kubernetes.io/projected/8aea8ea2-97bd-4315-9335-8fbe73ab8ec2-kube-api-access-2bcpr\") pod \"glance-0813-account-create-update-rz8gs\" (UID: \"8aea8ea2-97bd-4315-9335-8fbe73ab8ec2\") " pod="openstack/glance-0813-account-create-update-rz8gs" Feb 26 16:03:06 crc kubenswrapper[4907]: I0226 16:03:06.742344 4907 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-db-create-8plbn" Feb 26 16:03:06 crc kubenswrapper[4907]: I0226 16:03:06.797545 4907 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-db-create-8hgqp"] Feb 26 16:03:06 crc kubenswrapper[4907]: I0226 16:03:06.798865 4907 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-db-create-8hgqp" Feb 26 16:03:06 crc kubenswrapper[4907]: I0226 16:03:06.805082 4907 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-0813-account-create-update-rz8gs" Feb 26 16:03:06 crc kubenswrapper[4907]: I0226 16:03:06.805279 4907 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-db-create-8hgqp"] Feb 26 16:03:06 crc kubenswrapper[4907]: I0226 16:03:06.872514 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-t5wp2\" (UniqueName: \"kubernetes.io/projected/3ad824a7-419c-443b-8278-a4e806370720-kube-api-access-t5wp2\") pod \"keystone-db-create-8hgqp\" (UID: \"3ad824a7-419c-443b-8278-a4e806370720\") " pod="openstack/keystone-db-create-8hgqp" Feb 26 16:03:06 crc kubenswrapper[4907]: I0226 16:03:06.872580 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/3ad824a7-419c-443b-8278-a4e806370720-operator-scripts\") pod \"keystone-db-create-8hgqp\" (UID: \"3ad824a7-419c-443b-8278-a4e806370720\") " pod="openstack/keystone-db-create-8hgqp" Feb 26 16:03:06 crc kubenswrapper[4907]: I0226 16:03:06.902542 4907 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-f375-account-create-update-rgltk"] Feb 26 16:03:06 crc kubenswrapper[4907]: I0226 16:03:06.903470 4907 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-f375-account-create-update-rgltk" Feb 26 16:03:06 crc kubenswrapper[4907]: I0226 16:03:06.916166 4907 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-f375-account-create-update-rgltk"] Feb 26 16:03:06 crc kubenswrapper[4907]: I0226 16:03:06.916339 4907 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-db-secret" Feb 26 16:03:06 crc kubenswrapper[4907]: I0226 16:03:06.975026 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/841c55f4-98a9-44dd-bfc7-018ad4a44528-operator-scripts\") pod \"keystone-f375-account-create-update-rgltk\" (UID: \"841c55f4-98a9-44dd-bfc7-018ad4a44528\") " pod="openstack/keystone-f375-account-create-update-rgltk" Feb 26 16:03:06 crc kubenswrapper[4907]: I0226 16:03:06.975143 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-w68qb\" (UniqueName: \"kubernetes.io/projected/841c55f4-98a9-44dd-bfc7-018ad4a44528-kube-api-access-w68qb\") pod \"keystone-f375-account-create-update-rgltk\" (UID: \"841c55f4-98a9-44dd-bfc7-018ad4a44528\") " pod="openstack/keystone-f375-account-create-update-rgltk" Feb 26 16:03:06 crc kubenswrapper[4907]: I0226 16:03:06.975193 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-t5wp2\" (UniqueName: \"kubernetes.io/projected/3ad824a7-419c-443b-8278-a4e806370720-kube-api-access-t5wp2\") pod \"keystone-db-create-8hgqp\" (UID: \"3ad824a7-419c-443b-8278-a4e806370720\") " pod="openstack/keystone-db-create-8hgqp" Feb 26 16:03:06 crc kubenswrapper[4907]: I0226 16:03:06.975282 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/3ad824a7-419c-443b-8278-a4e806370720-operator-scripts\") pod \"keystone-db-create-8hgqp\" 
(UID: \"3ad824a7-419c-443b-8278-a4e806370720\") " pod="openstack/keystone-db-create-8hgqp" Feb 26 16:03:06 crc kubenswrapper[4907]: I0226 16:03:06.976082 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/3ad824a7-419c-443b-8278-a4e806370720-operator-scripts\") pod \"keystone-db-create-8hgqp\" (UID: \"3ad824a7-419c-443b-8278-a4e806370720\") " pod="openstack/keystone-db-create-8hgqp" Feb 26 16:03:06 crc kubenswrapper[4907]: I0226 16:03:06.996102 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-t5wp2\" (UniqueName: \"kubernetes.io/projected/3ad824a7-419c-443b-8278-a4e806370720-kube-api-access-t5wp2\") pod \"keystone-db-create-8hgqp\" (UID: \"3ad824a7-419c-443b-8278-a4e806370720\") " pod="openstack/keystone-db-create-8hgqp" Feb 26 16:03:07 crc kubenswrapper[4907]: I0226 16:03:07.028196 4907 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/placement-db-create-4cbgw"] Feb 26 16:03:07 crc kubenswrapper[4907]: I0226 16:03:07.029347 4907 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/placement-db-create-4cbgw" Feb 26 16:03:07 crc kubenswrapper[4907]: I0226 16:03:07.046264 4907 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-db-create-4cbgw"] Feb 26 16:03:07 crc kubenswrapper[4907]: I0226 16:03:07.076404 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/841c55f4-98a9-44dd-bfc7-018ad4a44528-operator-scripts\") pod \"keystone-f375-account-create-update-rgltk\" (UID: \"841c55f4-98a9-44dd-bfc7-018ad4a44528\") " pod="openstack/keystone-f375-account-create-update-rgltk" Feb 26 16:03:07 crc kubenswrapper[4907]: I0226 16:03:07.076483 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-w68qb\" (UniqueName: \"kubernetes.io/projected/841c55f4-98a9-44dd-bfc7-018ad4a44528-kube-api-access-w68qb\") pod \"keystone-f375-account-create-update-rgltk\" (UID: \"841c55f4-98a9-44dd-bfc7-018ad4a44528\") " pod="openstack/keystone-f375-account-create-update-rgltk" Feb 26 16:03:07 crc kubenswrapper[4907]: I0226 16:03:07.076539 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/3df80d3d-7d86-44dd-a35a-3e9d9d435435-operator-scripts\") pod \"placement-db-create-4cbgw\" (UID: \"3df80d3d-7d86-44dd-a35a-3e9d9d435435\") " pod="openstack/placement-db-create-4cbgw" Feb 26 16:03:07 crc kubenswrapper[4907]: I0226 16:03:07.076578 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-g72pw\" (UniqueName: \"kubernetes.io/projected/3df80d3d-7d86-44dd-a35a-3e9d9d435435-kube-api-access-g72pw\") pod \"placement-db-create-4cbgw\" (UID: \"3df80d3d-7d86-44dd-a35a-3e9d9d435435\") " pod="openstack/placement-db-create-4cbgw" Feb 26 16:03:07 crc kubenswrapper[4907]: I0226 16:03:07.077037 4907 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/841c55f4-98a9-44dd-bfc7-018ad4a44528-operator-scripts\") pod \"keystone-f375-account-create-update-rgltk\" (UID: \"841c55f4-98a9-44dd-bfc7-018ad4a44528\") " pod="openstack/keystone-f375-account-create-update-rgltk" Feb 26 16:03:07 crc kubenswrapper[4907]: I0226 16:03:07.109152 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-w68qb\" (UniqueName: \"kubernetes.io/projected/841c55f4-98a9-44dd-bfc7-018ad4a44528-kube-api-access-w68qb\") pod \"keystone-f375-account-create-update-rgltk\" (UID: \"841c55f4-98a9-44dd-bfc7-018ad4a44528\") " pod="openstack/keystone-f375-account-create-update-rgltk" Feb 26 16:03:07 crc kubenswrapper[4907]: I0226 16:03:07.110389 4907 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/placement-d824-account-create-update-x9gtq"] Feb 26 16:03:07 crc kubenswrapper[4907]: I0226 16:03:07.114730 4907 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/placement-d824-account-create-update-x9gtq" Feb 26 16:03:07 crc kubenswrapper[4907]: I0226 16:03:07.117336 4907 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-db-secret" Feb 26 16:03:07 crc kubenswrapper[4907]: I0226 16:03:07.120300 4907 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-d824-account-create-update-x9gtq"] Feb 26 16:03:07 crc kubenswrapper[4907]: I0226 16:03:07.121038 4907 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-db-create-8hgqp" Feb 26 16:03:07 crc kubenswrapper[4907]: I0226 16:03:07.178175 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/3204495d-0bdb-45bd-b2df-af20221366fd-operator-scripts\") pod \"placement-d824-account-create-update-x9gtq\" (UID: \"3204495d-0bdb-45bd-b2df-af20221366fd\") " pod="openstack/placement-d824-account-create-update-x9gtq" Feb 26 16:03:07 crc kubenswrapper[4907]: I0226 16:03:07.178408 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6bjnh\" (UniqueName: \"kubernetes.io/projected/3204495d-0bdb-45bd-b2df-af20221366fd-kube-api-access-6bjnh\") pod \"placement-d824-account-create-update-x9gtq\" (UID: \"3204495d-0bdb-45bd-b2df-af20221366fd\") " pod="openstack/placement-d824-account-create-update-x9gtq" Feb 26 16:03:07 crc kubenswrapper[4907]: I0226 16:03:07.178654 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/3df80d3d-7d86-44dd-a35a-3e9d9d435435-operator-scripts\") pod \"placement-db-create-4cbgw\" (UID: \"3df80d3d-7d86-44dd-a35a-3e9d9d435435\") " pod="openstack/placement-db-create-4cbgw" Feb 26 16:03:07 crc kubenswrapper[4907]: I0226 16:03:07.178763 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-g72pw\" (UniqueName: \"kubernetes.io/projected/3df80d3d-7d86-44dd-a35a-3e9d9d435435-kube-api-access-g72pw\") pod \"placement-db-create-4cbgw\" (UID: \"3df80d3d-7d86-44dd-a35a-3e9d9d435435\") " pod="openstack/placement-db-create-4cbgw" Feb 26 16:03:07 crc kubenswrapper[4907]: I0226 16:03:07.181383 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/3df80d3d-7d86-44dd-a35a-3e9d9d435435-operator-scripts\") pod 
\"placement-db-create-4cbgw\" (UID: \"3df80d3d-7d86-44dd-a35a-3e9d9d435435\") " pod="openstack/placement-db-create-4cbgw" Feb 26 16:03:07 crc kubenswrapper[4907]: I0226 16:03:07.198273 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-g72pw\" (UniqueName: \"kubernetes.io/projected/3df80d3d-7d86-44dd-a35a-3e9d9d435435-kube-api-access-g72pw\") pod \"placement-db-create-4cbgw\" (UID: \"3df80d3d-7d86-44dd-a35a-3e9d9d435435\") " pod="openstack/placement-db-create-4cbgw" Feb 26 16:03:07 crc kubenswrapper[4907]: I0226 16:03:07.218826 4907 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-f375-account-create-update-rgltk" Feb 26 16:03:07 crc kubenswrapper[4907]: I0226 16:03:07.280652 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/3204495d-0bdb-45bd-b2df-af20221366fd-operator-scripts\") pod \"placement-d824-account-create-update-x9gtq\" (UID: \"3204495d-0bdb-45bd-b2df-af20221366fd\") " pod="openstack/placement-d824-account-create-update-x9gtq" Feb 26 16:03:07 crc kubenswrapper[4907]: I0226 16:03:07.280702 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6bjnh\" (UniqueName: \"kubernetes.io/projected/3204495d-0bdb-45bd-b2df-af20221366fd-kube-api-access-6bjnh\") pod \"placement-d824-account-create-update-x9gtq\" (UID: \"3204495d-0bdb-45bd-b2df-af20221366fd\") " pod="openstack/placement-d824-account-create-update-x9gtq" Feb 26 16:03:07 crc kubenswrapper[4907]: I0226 16:03:07.281343 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/3204495d-0bdb-45bd-b2df-af20221366fd-operator-scripts\") pod \"placement-d824-account-create-update-x9gtq\" (UID: \"3204495d-0bdb-45bd-b2df-af20221366fd\") " pod="openstack/placement-d824-account-create-update-x9gtq" Feb 26 16:03:07 crc 
kubenswrapper[4907]: I0226 16:03:07.295478 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6bjnh\" (UniqueName: \"kubernetes.io/projected/3204495d-0bdb-45bd-b2df-af20221366fd-kube-api-access-6bjnh\") pod \"placement-d824-account-create-update-x9gtq\" (UID: \"3204495d-0bdb-45bd-b2df-af20221366fd\") " pod="openstack/placement-d824-account-create-update-x9gtq" Feb 26 16:03:07 crc kubenswrapper[4907]: I0226 16:03:07.349860 4907 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/placement-db-create-4cbgw" Feb 26 16:03:07 crc kubenswrapper[4907]: I0226 16:03:07.382465 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/819c7fec-fd22-478a-bf6c-f4cb5aeccc59-etc-swift\") pod \"swift-storage-0\" (UID: \"819c7fec-fd22-478a-bf6c-f4cb5aeccc59\") " pod="openstack/swift-storage-0" Feb 26 16:03:07 crc kubenswrapper[4907]: E0226 16:03:07.382676 4907 projected.go:288] Couldn't get configMap openstack/swift-ring-files: configmap "swift-ring-files" not found Feb 26 16:03:07 crc kubenswrapper[4907]: E0226 16:03:07.382716 4907 projected.go:194] Error preparing data for projected volume etc-swift for pod openstack/swift-storage-0: configmap "swift-ring-files" not found Feb 26 16:03:07 crc kubenswrapper[4907]: E0226 16:03:07.382782 4907 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/819c7fec-fd22-478a-bf6c-f4cb5aeccc59-etc-swift podName:819c7fec-fd22-478a-bf6c-f4cb5aeccc59 nodeName:}" failed. No retries permitted until 2026-02-26 16:03:15.382760272 +0000 UTC m=+1257.901322121 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "etc-swift" (UniqueName: "kubernetes.io/projected/819c7fec-fd22-478a-bf6c-f4cb5aeccc59-etc-swift") pod "swift-storage-0" (UID: "819c7fec-fd22-478a-bf6c-f4cb5aeccc59") : configmap "swift-ring-files" not found Feb 26 16:03:07 crc kubenswrapper[4907]: I0226 16:03:07.456839 4907 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/placement-d824-account-create-update-x9gtq" Feb 26 16:03:08 crc kubenswrapper[4907]: I0226 16:03:08.665810 4907 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-b8fbc5445-xx2fj" Feb 26 16:03:08 crc kubenswrapper[4907]: I0226 16:03:08.749837 4907 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-57d769cc4f-4pkdt"] Feb 26 16:03:08 crc kubenswrapper[4907]: I0226 16:03:08.750288 4907 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-57d769cc4f-4pkdt" podUID="0d202a81-23b7-45d1-847c-81375db1f908" containerName="dnsmasq-dns" containerID="cri-o://9f74a8d5886f350cbaa6a330fd61893fcdcf39698289689b0c991483b0ea12bd" gracePeriod=10 Feb 26 16:03:09 crc kubenswrapper[4907]: I0226 16:03:09.263408 4907 generic.go:334] "Generic (PLEG): container finished" podID="0d202a81-23b7-45d1-847c-81375db1f908" containerID="9f74a8d5886f350cbaa6a330fd61893fcdcf39698289689b0c991483b0ea12bd" exitCode=0 Feb 26 16:03:09 crc kubenswrapper[4907]: I0226 16:03:09.263758 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-57d769cc4f-4pkdt" event={"ID":"0d202a81-23b7-45d1-847c-81375db1f908","Type":"ContainerDied","Data":"9f74a8d5886f350cbaa6a330fd61893fcdcf39698289689b0c991483b0ea12bd"} Feb 26 16:03:09 crc kubenswrapper[4907]: I0226 16:03:09.286118 4907 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-db-create-4cbgw"] Feb 26 16:03:09 crc kubenswrapper[4907]: I0226 16:03:09.307775 4907 kubelet.go:2428] "SyncLoop UPDATE" source="api" 
pods=["openstack/keystone-f375-account-create-update-rgltk"] Feb 26 16:03:09 crc kubenswrapper[4907]: I0226 16:03:09.416835 4907 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-0813-account-create-update-rz8gs"] Feb 26 16:03:09 crc kubenswrapper[4907]: W0226 16:03:09.423334 4907 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod3ad824a7_419c_443b_8278_a4e806370720.slice/crio-d3e71850bba21abd205c7803f5d5f7eb20a7eeec77604ecfda8c9882fb8b2b9c WatchSource:0}: Error finding container d3e71850bba21abd205c7803f5d5f7eb20a7eeec77604ecfda8c9882fb8b2b9c: Status 404 returned error can't find the container with id d3e71850bba21abd205c7803f5d5f7eb20a7eeec77604ecfda8c9882fb8b2b9c Feb 26 16:03:09 crc kubenswrapper[4907]: W0226 16:03:09.423525 4907 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod8aea8ea2_97bd_4315_9335_8fbe73ab8ec2.slice/crio-862f24c33984768a364fa27989f3124d0eb6356670b144e8f3b058c93b66d84e WatchSource:0}: Error finding container 862f24c33984768a364fa27989f3124d0eb6356670b144e8f3b058c93b66d84e: Status 404 returned error can't find the container with id 862f24c33984768a364fa27989f3124d0eb6356670b144e8f3b058c93b66d84e Feb 26 16:03:09 crc kubenswrapper[4907]: I0226 16:03:09.425639 4907 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-db-create-8hgqp"] Feb 26 16:03:09 crc kubenswrapper[4907]: I0226 16:03:09.522571 4907 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-db-create-8plbn"] Feb 26 16:03:09 crc kubenswrapper[4907]: W0226 16:03:09.541273 4907 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod56d62e78_4aa2_4ae7_84dd_99e58e0deb68.slice/crio-e70b9a42f673a44d8f2f5d1ce960c3e3fc912b91b08916f74bd5df8896b749ca WatchSource:0}: Error finding container 
e70b9a42f673a44d8f2f5d1ce960c3e3fc912b91b08916f74bd5df8896b749ca: Status 404 returned error can't find the container with id e70b9a42f673a44d8f2f5d1ce960c3e3fc912b91b08916f74bd5df8896b749ca Feb 26 16:03:09 crc kubenswrapper[4907]: I0226 16:03:09.702795 4907 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-d824-account-create-update-x9gtq"] Feb 26 16:03:09 crc kubenswrapper[4907]: W0226 16:03:09.706733 4907 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod3204495d_0bdb_45bd_b2df_af20221366fd.slice/crio-0985c406710137650fd2fa88f373e33133d88bc98620eb6b157686c4b2388621 WatchSource:0}: Error finding container 0985c406710137650fd2fa88f373e33133d88bc98620eb6b157686c4b2388621: Status 404 returned error can't find the container with id 0985c406710137650fd2fa88f373e33133d88bc98620eb6b157686c4b2388621 Feb 26 16:03:10 crc kubenswrapper[4907]: I0226 16:03:10.266235 4907 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ovn-northd-0" Feb 26 16:03:10 crc kubenswrapper[4907]: I0226 16:03:10.278723 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-create-8hgqp" event={"ID":"3ad824a7-419c-443b-8278-a4e806370720","Type":"ContainerStarted","Data":"7f20ff2fec931032c61472fe98aee531571286b0527d9e808df5612acce2a74f"} Feb 26 16:03:10 crc kubenswrapper[4907]: I0226 16:03:10.278778 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-create-8hgqp" event={"ID":"3ad824a7-419c-443b-8278-a4e806370720","Type":"ContainerStarted","Data":"d3e71850bba21abd205c7803f5d5f7eb20a7eeec77604ecfda8c9882fb8b2b9c"} Feb 26 16:03:10 crc kubenswrapper[4907]: I0226 16:03:10.282262 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-create-4cbgw" 
event={"ID":"3df80d3d-7d86-44dd-a35a-3e9d9d435435","Type":"ContainerStarted","Data":"e27511ed608bc4ca90dd647d2b05bd19aa57020996a4b02c6fff9a84c4b76f67"} Feb 26 16:03:10 crc kubenswrapper[4907]: I0226 16:03:10.282305 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-create-4cbgw" event={"ID":"3df80d3d-7d86-44dd-a35a-3e9d9d435435","Type":"ContainerStarted","Data":"14a34563357e04f0033ddc22eca8c76eb501b92c36303d695e1f6573f1eedb62"} Feb 26 16:03:10 crc kubenswrapper[4907]: I0226 16:03:10.324428 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-create-8plbn" event={"ID":"56d62e78-4aa2-4ae7-84dd-99e58e0deb68","Type":"ContainerStarted","Data":"a1a1dd5695278af93e861a3ea7588d623b20b9e11cf6f02ef672ebd6fca0c916"} Feb 26 16:03:10 crc kubenswrapper[4907]: I0226 16:03:10.324500 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-create-8plbn" event={"ID":"56d62e78-4aa2-4ae7-84dd-99e58e0deb68","Type":"ContainerStarted","Data":"e70b9a42f673a44d8f2f5d1ce960c3e3fc912b91b08916f74bd5df8896b749ca"} Feb 26 16:03:10 crc kubenswrapper[4907]: I0226 16:03:10.329471 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-d824-account-create-update-x9gtq" event={"ID":"3204495d-0bdb-45bd-b2df-af20221366fd","Type":"ContainerStarted","Data":"27a7050b18821bdebe8a05d687a319a520957b7cf8557a85bcfcf81d86e840f9"} Feb 26 16:03:10 crc kubenswrapper[4907]: I0226 16:03:10.329520 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-d824-account-create-update-x9gtq" event={"ID":"3204495d-0bdb-45bd-b2df-af20221366fd","Type":"ContainerStarted","Data":"0985c406710137650fd2fa88f373e33133d88bc98620eb6b157686c4b2388621"} Feb 26 16:03:10 crc kubenswrapper[4907]: I0226 16:03:10.336644 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-f375-account-create-update-rgltk" 
event={"ID":"841c55f4-98a9-44dd-bfc7-018ad4a44528","Type":"ContainerStarted","Data":"1f63ad42f27882a910cb8d3eb08bdc45289834d31a44f751e36a0e633437d378"} Feb 26 16:03:10 crc kubenswrapper[4907]: I0226 16:03:10.336690 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-f375-account-create-update-rgltk" event={"ID":"841c55f4-98a9-44dd-bfc7-018ad4a44528","Type":"ContainerStarted","Data":"fd3cfb6a8f1f30d04fcfbe8807352f9a671e710ce34a88b2726b723c3d00ba92"} Feb 26 16:03:10 crc kubenswrapper[4907]: I0226 16:03:10.341442 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-0813-account-create-update-rz8gs" event={"ID":"8aea8ea2-97bd-4315-9335-8fbe73ab8ec2","Type":"ContainerStarted","Data":"babb49a6c58bcde03d6ff5c34e9363d88091d1b64393f27bd898d3a62c24de05"} Feb 26 16:03:10 crc kubenswrapper[4907]: I0226 16:03:10.341484 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-0813-account-create-update-rz8gs" event={"ID":"8aea8ea2-97bd-4315-9335-8fbe73ab8ec2","Type":"ContainerStarted","Data":"862f24c33984768a364fa27989f3124d0eb6356670b144e8f3b058c93b66d84e"} Feb 26 16:03:10 crc kubenswrapper[4907]: I0226 16:03:10.354460 4907 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/placement-db-create-4cbgw" podStartSLOduration=3.354433373 podStartE2EDuration="3.354433373s" podCreationTimestamp="2026-02-26 16:03:07 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-26 16:03:10.319540575 +0000 UTC m=+1252.838102424" watchObservedRunningTime="2026-02-26 16:03:10.354433373 +0000 UTC m=+1252.872995222" Feb 26 16:03:10 crc kubenswrapper[4907]: I0226 16:03:10.372103 4907 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/keystone-db-create-8hgqp" podStartSLOduration=4.372086552 podStartE2EDuration="4.372086552s" podCreationTimestamp="2026-02-26 16:03:06 +0000 UTC" 
firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-26 16:03:10.344100642 +0000 UTC m=+1252.862662501" watchObservedRunningTime="2026-02-26 16:03:10.372086552 +0000 UTC m=+1252.890648401" Feb 26 16:03:10 crc kubenswrapper[4907]: I0226 16:03:10.375782 4907 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/placement-d824-account-create-update-x9gtq" podStartSLOduration=3.37576344 podStartE2EDuration="3.37576344s" podCreationTimestamp="2026-02-26 16:03:07 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-26 16:03:10.363217946 +0000 UTC m=+1252.881779795" watchObservedRunningTime="2026-02-26 16:03:10.37576344 +0000 UTC m=+1252.894325319" Feb 26 16:03:10 crc kubenswrapper[4907]: I0226 16:03:10.395436 4907 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-db-create-8plbn" podStartSLOduration=4.395397897 podStartE2EDuration="4.395397897s" podCreationTimestamp="2026-02-26 16:03:06 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-26 16:03:10.377609066 +0000 UTC m=+1252.896170915" watchObservedRunningTime="2026-02-26 16:03:10.395397897 +0000 UTC m=+1252.913959746" Feb 26 16:03:10 crc kubenswrapper[4907]: I0226 16:03:10.398941 4907 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/openstack-cell1-galera-0" Feb 26 16:03:10 crc kubenswrapper[4907]: I0226 16:03:10.433128 4907 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-0813-account-create-update-rz8gs" podStartSLOduration=4.433107533 podStartE2EDuration="4.433107533s" podCreationTimestamp="2026-02-26 16:03:06 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 
+0000 UTC" observedRunningTime="2026-02-26 16:03:10.393964302 +0000 UTC m=+1252.912526151" watchObservedRunningTime="2026-02-26 16:03:10.433107533 +0000 UTC m=+1252.951669382" Feb 26 16:03:10 crc kubenswrapper[4907]: I0226 16:03:10.448881 4907 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/keystone-f375-account-create-update-rgltk" podStartSLOduration=4.448865417 podStartE2EDuration="4.448865417s" podCreationTimestamp="2026-02-26 16:03:06 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-26 16:03:10.413274772 +0000 UTC m=+1252.931836621" watchObservedRunningTime="2026-02-26 16:03:10.448865417 +0000 UTC m=+1252.967427266" Feb 26 16:03:10 crc kubenswrapper[4907]: I0226 16:03:10.544045 4907 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/openstack-cell1-galera-0" Feb 26 16:03:10 crc kubenswrapper[4907]: I0226 16:03:10.601488 4907 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-57d769cc4f-4pkdt" Feb 26 16:03:10 crc kubenswrapper[4907]: I0226 16:03:10.648508 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/0d202a81-23b7-45d1-847c-81375db1f908-dns-svc\") pod \"0d202a81-23b7-45d1-847c-81375db1f908\" (UID: \"0d202a81-23b7-45d1-847c-81375db1f908\") " Feb 26 16:03:10 crc kubenswrapper[4907]: I0226 16:03:10.648680 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/0d202a81-23b7-45d1-847c-81375db1f908-config\") pod \"0d202a81-23b7-45d1-847c-81375db1f908\" (UID: \"0d202a81-23b7-45d1-847c-81375db1f908\") " Feb 26 16:03:10 crc kubenswrapper[4907]: I0226 16:03:10.648760 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7bjjt\" (UniqueName: \"kubernetes.io/projected/0d202a81-23b7-45d1-847c-81375db1f908-kube-api-access-7bjjt\") pod \"0d202a81-23b7-45d1-847c-81375db1f908\" (UID: \"0d202a81-23b7-45d1-847c-81375db1f908\") " Feb 26 16:03:10 crc kubenswrapper[4907]: I0226 16:03:10.684797 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0d202a81-23b7-45d1-847c-81375db1f908-kube-api-access-7bjjt" (OuterVolumeSpecName: "kube-api-access-7bjjt") pod "0d202a81-23b7-45d1-847c-81375db1f908" (UID: "0d202a81-23b7-45d1-847c-81375db1f908"). InnerVolumeSpecName "kube-api-access-7bjjt". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 26 16:03:10 crc kubenswrapper[4907]: I0226 16:03:10.760853 4907 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7bjjt\" (UniqueName: \"kubernetes.io/projected/0d202a81-23b7-45d1-847c-81375db1f908-kube-api-access-7bjjt\") on node \"crc\" DevicePath \"\"" Feb 26 16:03:10 crc kubenswrapper[4907]: I0226 16:03:10.807834 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0d202a81-23b7-45d1-847c-81375db1f908-config" (OuterVolumeSpecName: "config") pod "0d202a81-23b7-45d1-847c-81375db1f908" (UID: "0d202a81-23b7-45d1-847c-81375db1f908"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 26 16:03:10 crc kubenswrapper[4907]: I0226 16:03:10.854861 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0d202a81-23b7-45d1-847c-81375db1f908-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "0d202a81-23b7-45d1-847c-81375db1f908" (UID: "0d202a81-23b7-45d1-847c-81375db1f908"). InnerVolumeSpecName "dns-svc". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 26 16:03:10 crc kubenswrapper[4907]: I0226 16:03:10.862523 4907 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/0d202a81-23b7-45d1-847c-81375db1f908-dns-svc\") on node \"crc\" DevicePath \"\"" Feb 26 16:03:10 crc kubenswrapper[4907]: I0226 16:03:10.862563 4907 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/0d202a81-23b7-45d1-847c-81375db1f908-config\") on node \"crc\" DevicePath \"\"" Feb 26 16:03:11 crc kubenswrapper[4907]: I0226 16:03:11.351877 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-57d769cc4f-4pkdt" event={"ID":"0d202a81-23b7-45d1-847c-81375db1f908","Type":"ContainerDied","Data":"d2686a32fcdc47bdfa958b34ce2024f28a32c81ef797c262f375e32a1a24b1eb"} Feb 26 16:03:11 crc kubenswrapper[4907]: I0226 16:03:11.352403 4907 scope.go:117] "RemoveContainer" containerID="9f74a8d5886f350cbaa6a330fd61893fcdcf39698289689b0c991483b0ea12bd" Feb 26 16:03:11 crc kubenswrapper[4907]: I0226 16:03:11.352105 4907 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-57d769cc4f-4pkdt" Feb 26 16:03:11 crc kubenswrapper[4907]: I0226 16:03:11.354328 4907 generic.go:334] "Generic (PLEG): container finished" podID="56d62e78-4aa2-4ae7-84dd-99e58e0deb68" containerID="a1a1dd5695278af93e861a3ea7588d623b20b9e11cf6f02ef672ebd6fca0c916" exitCode=0 Feb 26 16:03:11 crc kubenswrapper[4907]: I0226 16:03:11.354379 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-create-8plbn" event={"ID":"56d62e78-4aa2-4ae7-84dd-99e58e0deb68","Type":"ContainerDied","Data":"a1a1dd5695278af93e861a3ea7588d623b20b9e11cf6f02ef672ebd6fca0c916"} Feb 26 16:03:11 crc kubenswrapper[4907]: I0226 16:03:11.366635 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-ring-rebalance-zj4xn" event={"ID":"c5f9c74c-c90c-40ba-9548-dc79f90592a4","Type":"ContainerStarted","Data":"5b6065f7d0a91b56a02d3ad85d8cc3fd84864465349926f69fad86a45ff01f07"} Feb 26 16:03:11 crc kubenswrapper[4907]: I0226 16:03:11.378125 4907 scope.go:117] "RemoveContainer" containerID="1a9cebb07c9a4d60574748280b5f88eb69a7b6737a9238ed122fdc8b41714e3b" Feb 26 16:03:11 crc kubenswrapper[4907]: I0226 16:03:11.434106 4907 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/swift-ring-rebalance-zj4xn" podStartSLOduration=1.815520434 podStartE2EDuration="11.434084223s" podCreationTimestamp="2026-02-26 16:03:00 +0000 UTC" firstStartedPulling="2026-02-26 16:03:00.855688174 +0000 UTC m=+1243.374250023" lastFinishedPulling="2026-02-26 16:03:10.474251963 +0000 UTC m=+1252.992813812" observedRunningTime="2026-02-26 16:03:11.42652319 +0000 UTC m=+1253.945085069" watchObservedRunningTime="2026-02-26 16:03:11.434084223 +0000 UTC m=+1253.952646072" Feb 26 16:03:11 crc kubenswrapper[4907]: I0226 16:03:11.463938 4907 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-57d769cc4f-4pkdt"] Feb 26 16:03:11 crc kubenswrapper[4907]: I0226 16:03:11.471350 4907 kubelet.go:2431] 
"SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-57d769cc4f-4pkdt"] Feb 26 16:03:12 crc kubenswrapper[4907]: I0226 16:03:12.136984 4907 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0d202a81-23b7-45d1-847c-81375db1f908" path="/var/lib/kubelet/pods/0d202a81-23b7-45d1-847c-81375db1f908/volumes" Feb 26 16:03:12 crc kubenswrapper[4907]: I0226 16:03:12.372021 4907 generic.go:334] "Generic (PLEG): container finished" podID="3204495d-0bdb-45bd-b2df-af20221366fd" containerID="27a7050b18821bdebe8a05d687a319a520957b7cf8557a85bcfcf81d86e840f9" exitCode=0 Feb 26 16:03:12 crc kubenswrapper[4907]: I0226 16:03:12.372079 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-d824-account-create-update-x9gtq" event={"ID":"3204495d-0bdb-45bd-b2df-af20221366fd","Type":"ContainerDied","Data":"27a7050b18821bdebe8a05d687a319a520957b7cf8557a85bcfcf81d86e840f9"} Feb 26 16:03:12 crc kubenswrapper[4907]: I0226 16:03:12.374476 4907 generic.go:334] "Generic (PLEG): container finished" podID="841c55f4-98a9-44dd-bfc7-018ad4a44528" containerID="1f63ad42f27882a910cb8d3eb08bdc45289834d31a44f751e36a0e633437d378" exitCode=0 Feb 26 16:03:12 crc kubenswrapper[4907]: I0226 16:03:12.374524 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-f375-account-create-update-rgltk" event={"ID":"841c55f4-98a9-44dd-bfc7-018ad4a44528","Type":"ContainerDied","Data":"1f63ad42f27882a910cb8d3eb08bdc45289834d31a44f751e36a0e633437d378"} Feb 26 16:03:12 crc kubenswrapper[4907]: I0226 16:03:12.375963 4907 generic.go:334] "Generic (PLEG): container finished" podID="8aea8ea2-97bd-4315-9335-8fbe73ab8ec2" containerID="babb49a6c58bcde03d6ff5c34e9363d88091d1b64393f27bd898d3a62c24de05" exitCode=0 Feb 26 16:03:12 crc kubenswrapper[4907]: I0226 16:03:12.376005 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-0813-account-create-update-rz8gs" 
event={"ID":"8aea8ea2-97bd-4315-9335-8fbe73ab8ec2","Type":"ContainerDied","Data":"babb49a6c58bcde03d6ff5c34e9363d88091d1b64393f27bd898d3a62c24de05"} Feb 26 16:03:12 crc kubenswrapper[4907]: I0226 16:03:12.380447 4907 generic.go:334] "Generic (PLEG): container finished" podID="3ad824a7-419c-443b-8278-a4e806370720" containerID="7f20ff2fec931032c61472fe98aee531571286b0527d9e808df5612acce2a74f" exitCode=0 Feb 26 16:03:12 crc kubenswrapper[4907]: I0226 16:03:12.380515 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-create-8hgqp" event={"ID":"3ad824a7-419c-443b-8278-a4e806370720","Type":"ContainerDied","Data":"7f20ff2fec931032c61472fe98aee531571286b0527d9e808df5612acce2a74f"} Feb 26 16:03:12 crc kubenswrapper[4907]: I0226 16:03:12.386898 4907 generic.go:334] "Generic (PLEG): container finished" podID="3df80d3d-7d86-44dd-a35a-3e9d9d435435" containerID="e27511ed608bc4ca90dd647d2b05bd19aa57020996a4b02c6fff9a84c4b76f67" exitCode=0 Feb 26 16:03:12 crc kubenswrapper[4907]: I0226 16:03:12.387838 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-create-4cbgw" event={"ID":"3df80d3d-7d86-44dd-a35a-3e9d9d435435","Type":"ContainerDied","Data":"e27511ed608bc4ca90dd647d2b05bd19aa57020996a4b02c6fff9a84c4b76f67"} Feb 26 16:03:12 crc kubenswrapper[4907]: I0226 16:03:12.717864 4907 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/root-account-create-update-r9td4"] Feb 26 16:03:12 crc kubenswrapper[4907]: E0226 16:03:12.718468 4907 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0d202a81-23b7-45d1-847c-81375db1f908" containerName="init" Feb 26 16:03:12 crc kubenswrapper[4907]: I0226 16:03:12.718485 4907 state_mem.go:107] "Deleted CPUSet assignment" podUID="0d202a81-23b7-45d1-847c-81375db1f908" containerName="init" Feb 26 16:03:12 crc kubenswrapper[4907]: E0226 16:03:12.718509 4907 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0d202a81-23b7-45d1-847c-81375db1f908" 
containerName="dnsmasq-dns" Feb 26 16:03:12 crc kubenswrapper[4907]: I0226 16:03:12.718515 4907 state_mem.go:107] "Deleted CPUSet assignment" podUID="0d202a81-23b7-45d1-847c-81375db1f908" containerName="dnsmasq-dns" Feb 26 16:03:12 crc kubenswrapper[4907]: I0226 16:03:12.718686 4907 memory_manager.go:354] "RemoveStaleState removing state" podUID="0d202a81-23b7-45d1-847c-81375db1f908" containerName="dnsmasq-dns" Feb 26 16:03:12 crc kubenswrapper[4907]: I0226 16:03:12.719180 4907 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/root-account-create-update-r9td4" Feb 26 16:03:12 crc kubenswrapper[4907]: I0226 16:03:12.721085 4907 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-mariadb-root-db-secret" Feb 26 16:03:12 crc kubenswrapper[4907]: I0226 16:03:12.729064 4907 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/root-account-create-update-r9td4"] Feb 26 16:03:12 crc kubenswrapper[4907]: I0226 16:03:12.748761 4907 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-db-create-8plbn" Feb 26 16:03:12 crc kubenswrapper[4907]: I0226 16:03:12.801707 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6xb6t\" (UniqueName: \"kubernetes.io/projected/56d62e78-4aa2-4ae7-84dd-99e58e0deb68-kube-api-access-6xb6t\") pod \"56d62e78-4aa2-4ae7-84dd-99e58e0deb68\" (UID: \"56d62e78-4aa2-4ae7-84dd-99e58e0deb68\") " Feb 26 16:03:12 crc kubenswrapper[4907]: I0226 16:03:12.801836 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/56d62e78-4aa2-4ae7-84dd-99e58e0deb68-operator-scripts\") pod \"56d62e78-4aa2-4ae7-84dd-99e58e0deb68\" (UID: \"56d62e78-4aa2-4ae7-84dd-99e58e0deb68\") " Feb 26 16:03:12 crc kubenswrapper[4907]: I0226 16:03:12.802150 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/2569603f-9b29-4342-a289-8484025a2250-operator-scripts\") pod \"root-account-create-update-r9td4\" (UID: \"2569603f-9b29-4342-a289-8484025a2250\") " pod="openstack/root-account-create-update-r9td4" Feb 26 16:03:12 crc kubenswrapper[4907]: I0226 16:03:12.802220 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tz869\" (UniqueName: \"kubernetes.io/projected/2569603f-9b29-4342-a289-8484025a2250-kube-api-access-tz869\") pod \"root-account-create-update-r9td4\" (UID: \"2569603f-9b29-4342-a289-8484025a2250\") " pod="openstack/root-account-create-update-r9td4" Feb 26 16:03:12 crc kubenswrapper[4907]: I0226 16:03:12.802322 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/56d62e78-4aa2-4ae7-84dd-99e58e0deb68-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "56d62e78-4aa2-4ae7-84dd-99e58e0deb68" (UID: "56d62e78-4aa2-4ae7-84dd-99e58e0deb68"). 
InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 26 16:03:12 crc kubenswrapper[4907]: I0226 16:03:12.820863 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/56d62e78-4aa2-4ae7-84dd-99e58e0deb68-kube-api-access-6xb6t" (OuterVolumeSpecName: "kube-api-access-6xb6t") pod "56d62e78-4aa2-4ae7-84dd-99e58e0deb68" (UID: "56d62e78-4aa2-4ae7-84dd-99e58e0deb68"). InnerVolumeSpecName "kube-api-access-6xb6t". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 26 16:03:12 crc kubenswrapper[4907]: I0226 16:03:12.904227 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/2569603f-9b29-4342-a289-8484025a2250-operator-scripts\") pod \"root-account-create-update-r9td4\" (UID: \"2569603f-9b29-4342-a289-8484025a2250\") " pod="openstack/root-account-create-update-r9td4" Feb 26 16:03:12 crc kubenswrapper[4907]: I0226 16:03:12.904983 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tz869\" (UniqueName: \"kubernetes.io/projected/2569603f-9b29-4342-a289-8484025a2250-kube-api-access-tz869\") pod \"root-account-create-update-r9td4\" (UID: \"2569603f-9b29-4342-a289-8484025a2250\") " pod="openstack/root-account-create-update-r9td4" Feb 26 16:03:12 crc kubenswrapper[4907]: I0226 16:03:12.905159 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/2569603f-9b29-4342-a289-8484025a2250-operator-scripts\") pod \"root-account-create-update-r9td4\" (UID: \"2569603f-9b29-4342-a289-8484025a2250\") " pod="openstack/root-account-create-update-r9td4" Feb 26 16:03:12 crc kubenswrapper[4907]: I0226 16:03:12.905609 4907 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6xb6t\" (UniqueName: 
\"kubernetes.io/projected/56d62e78-4aa2-4ae7-84dd-99e58e0deb68-kube-api-access-6xb6t\") on node \"crc\" DevicePath \"\"" Feb 26 16:03:12 crc kubenswrapper[4907]: I0226 16:03:12.905640 4907 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/56d62e78-4aa2-4ae7-84dd-99e58e0deb68-operator-scripts\") on node \"crc\" DevicePath \"\"" Feb 26 16:03:12 crc kubenswrapper[4907]: I0226 16:03:12.924963 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tz869\" (UniqueName: \"kubernetes.io/projected/2569603f-9b29-4342-a289-8484025a2250-kube-api-access-tz869\") pod \"root-account-create-update-r9td4\" (UID: \"2569603f-9b29-4342-a289-8484025a2250\") " pod="openstack/root-account-create-update-r9td4" Feb 26 16:03:13 crc kubenswrapper[4907]: I0226 16:03:13.073066 4907 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/root-account-create-update-r9td4" Feb 26 16:03:13 crc kubenswrapper[4907]: I0226 16:03:13.396495 4907 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-db-create-8plbn" Feb 26 16:03:13 crc kubenswrapper[4907]: I0226 16:03:13.396500 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-create-8plbn" event={"ID":"56d62e78-4aa2-4ae7-84dd-99e58e0deb68","Type":"ContainerDied","Data":"e70b9a42f673a44d8f2f5d1ce960c3e3fc912b91b08916f74bd5df8896b749ca"} Feb 26 16:03:13 crc kubenswrapper[4907]: I0226 16:03:13.397201 4907 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="e70b9a42f673a44d8f2f5d1ce960c3e3fc912b91b08916f74bd5df8896b749ca" Feb 26 16:03:13 crc kubenswrapper[4907]: I0226 16:03:13.564381 4907 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/root-account-create-update-r9td4"] Feb 26 16:03:13 crc kubenswrapper[4907]: W0226 16:03:13.566043 4907 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod2569603f_9b29_4342_a289_8484025a2250.slice/crio-a20b0bc9f8b5ac050fdeb40166bec4bf5a4628ea50512eedcb3ced4050818910 WatchSource:0}: Error finding container a20b0bc9f8b5ac050fdeb40166bec4bf5a4628ea50512eedcb3ced4050818910: Status 404 returned error can't find the container with id a20b0bc9f8b5ac050fdeb40166bec4bf5a4628ea50512eedcb3ced4050818910 Feb 26 16:03:13 crc kubenswrapper[4907]: I0226 16:03:13.833428 4907 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-db-create-8hgqp" Feb 26 16:03:13 crc kubenswrapper[4907]: I0226 16:03:13.931613 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-t5wp2\" (UniqueName: \"kubernetes.io/projected/3ad824a7-419c-443b-8278-a4e806370720-kube-api-access-t5wp2\") pod \"3ad824a7-419c-443b-8278-a4e806370720\" (UID: \"3ad824a7-419c-443b-8278-a4e806370720\") " Feb 26 16:03:13 crc kubenswrapper[4907]: I0226 16:03:13.932008 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/3ad824a7-419c-443b-8278-a4e806370720-operator-scripts\") pod \"3ad824a7-419c-443b-8278-a4e806370720\" (UID: \"3ad824a7-419c-443b-8278-a4e806370720\") " Feb 26 16:03:13 crc kubenswrapper[4907]: I0226 16:03:13.932823 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/3ad824a7-419c-443b-8278-a4e806370720-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "3ad824a7-419c-443b-8278-a4e806370720" (UID: "3ad824a7-419c-443b-8278-a4e806370720"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 26 16:03:13 crc kubenswrapper[4907]: I0226 16:03:13.933460 4907 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/3ad824a7-419c-443b-8278-a4e806370720-operator-scripts\") on node \"crc\" DevicePath \"\"" Feb 26 16:03:13 crc kubenswrapper[4907]: I0226 16:03:13.949173 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3ad824a7-419c-443b-8278-a4e806370720-kube-api-access-t5wp2" (OuterVolumeSpecName: "kube-api-access-t5wp2") pod "3ad824a7-419c-443b-8278-a4e806370720" (UID: "3ad824a7-419c-443b-8278-a4e806370720"). InnerVolumeSpecName "kube-api-access-t5wp2". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 26 16:03:14 crc kubenswrapper[4907]: I0226 16:03:14.036487 4907 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-t5wp2\" (UniqueName: \"kubernetes.io/projected/3ad824a7-419c-443b-8278-a4e806370720-kube-api-access-t5wp2\") on node \"crc\" DevicePath \"\"" Feb 26 16:03:14 crc kubenswrapper[4907]: I0226 16:03:14.118876 4907 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-0813-account-create-update-rz8gs" Feb 26 16:03:14 crc kubenswrapper[4907]: I0226 16:03:14.123840 4907 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-f375-account-create-update-rgltk" Feb 26 16:03:14 crc kubenswrapper[4907]: I0226 16:03:14.137308 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/8aea8ea2-97bd-4315-9335-8fbe73ab8ec2-operator-scripts\") pod \"8aea8ea2-97bd-4315-9335-8fbe73ab8ec2\" (UID: \"8aea8ea2-97bd-4315-9335-8fbe73ab8ec2\") " Feb 26 16:03:14 crc kubenswrapper[4907]: I0226 16:03:14.137540 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/841c55f4-98a9-44dd-bfc7-018ad4a44528-operator-scripts\") pod \"841c55f4-98a9-44dd-bfc7-018ad4a44528\" (UID: \"841c55f4-98a9-44dd-bfc7-018ad4a44528\") " Feb 26 16:03:14 crc kubenswrapper[4907]: I0226 16:03:14.137575 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-w68qb\" (UniqueName: \"kubernetes.io/projected/841c55f4-98a9-44dd-bfc7-018ad4a44528-kube-api-access-w68qb\") pod \"841c55f4-98a9-44dd-bfc7-018ad4a44528\" (UID: \"841c55f4-98a9-44dd-bfc7-018ad4a44528\") " Feb 26 16:03:14 crc kubenswrapper[4907]: I0226 16:03:14.138225 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2bcpr\" 
(UniqueName: \"kubernetes.io/projected/8aea8ea2-97bd-4315-9335-8fbe73ab8ec2-kube-api-access-2bcpr\") pod \"8aea8ea2-97bd-4315-9335-8fbe73ab8ec2\" (UID: \"8aea8ea2-97bd-4315-9335-8fbe73ab8ec2\") " Feb 26 16:03:14 crc kubenswrapper[4907]: I0226 16:03:14.140803 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8aea8ea2-97bd-4315-9335-8fbe73ab8ec2-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "8aea8ea2-97bd-4315-9335-8fbe73ab8ec2" (UID: "8aea8ea2-97bd-4315-9335-8fbe73ab8ec2"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 26 16:03:14 crc kubenswrapper[4907]: I0226 16:03:14.141888 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/841c55f4-98a9-44dd-bfc7-018ad4a44528-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "841c55f4-98a9-44dd-bfc7-018ad4a44528" (UID: "841c55f4-98a9-44dd-bfc7-018ad4a44528"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 26 16:03:14 crc kubenswrapper[4907]: I0226 16:03:14.148807 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/841c55f4-98a9-44dd-bfc7-018ad4a44528-kube-api-access-w68qb" (OuterVolumeSpecName: "kube-api-access-w68qb") pod "841c55f4-98a9-44dd-bfc7-018ad4a44528" (UID: "841c55f4-98a9-44dd-bfc7-018ad4a44528"). InnerVolumeSpecName "kube-api-access-w68qb". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 26 16:03:14 crc kubenswrapper[4907]: I0226 16:03:14.159400 4907 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/placement-db-create-4cbgw" Feb 26 16:03:14 crc kubenswrapper[4907]: I0226 16:03:14.170244 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8aea8ea2-97bd-4315-9335-8fbe73ab8ec2-kube-api-access-2bcpr" (OuterVolumeSpecName: "kube-api-access-2bcpr") pod "8aea8ea2-97bd-4315-9335-8fbe73ab8ec2" (UID: "8aea8ea2-97bd-4315-9335-8fbe73ab8ec2"). InnerVolumeSpecName "kube-api-access-2bcpr". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 26 16:03:14 crc kubenswrapper[4907]: I0226 16:03:14.182343 4907 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/placement-d824-account-create-update-x9gtq" Feb 26 16:03:14 crc kubenswrapper[4907]: I0226 16:03:14.240418 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6bjnh\" (UniqueName: \"kubernetes.io/projected/3204495d-0bdb-45bd-b2df-af20221366fd-kube-api-access-6bjnh\") pod \"3204495d-0bdb-45bd-b2df-af20221366fd\" (UID: \"3204495d-0bdb-45bd-b2df-af20221366fd\") " Feb 26 16:03:14 crc kubenswrapper[4907]: I0226 16:03:14.240472 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/3204495d-0bdb-45bd-b2df-af20221366fd-operator-scripts\") pod \"3204495d-0bdb-45bd-b2df-af20221366fd\" (UID: \"3204495d-0bdb-45bd-b2df-af20221366fd\") " Feb 26 16:03:14 crc kubenswrapper[4907]: I0226 16:03:14.240493 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/3df80d3d-7d86-44dd-a35a-3e9d9d435435-operator-scripts\") pod \"3df80d3d-7d86-44dd-a35a-3e9d9d435435\" (UID: \"3df80d3d-7d86-44dd-a35a-3e9d9d435435\") " Feb 26 16:03:14 crc kubenswrapper[4907]: I0226 16:03:14.240535 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-g72pw\" 
(UniqueName: \"kubernetes.io/projected/3df80d3d-7d86-44dd-a35a-3e9d9d435435-kube-api-access-g72pw\") pod \"3df80d3d-7d86-44dd-a35a-3e9d9d435435\" (UID: \"3df80d3d-7d86-44dd-a35a-3e9d9d435435\") " Feb 26 16:03:14 crc kubenswrapper[4907]: I0226 16:03:14.241082 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/3204495d-0bdb-45bd-b2df-af20221366fd-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "3204495d-0bdb-45bd-b2df-af20221366fd" (UID: "3204495d-0bdb-45bd-b2df-af20221366fd"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 26 16:03:14 crc kubenswrapper[4907]: I0226 16:03:14.241218 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/3df80d3d-7d86-44dd-a35a-3e9d9d435435-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "3df80d3d-7d86-44dd-a35a-3e9d9d435435" (UID: "3df80d3d-7d86-44dd-a35a-3e9d9d435435"). InnerVolumeSpecName "operator-scripts". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 26 16:03:14 crc kubenswrapper[4907]: I0226 16:03:14.241685 4907 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2bcpr\" (UniqueName: \"kubernetes.io/projected/8aea8ea2-97bd-4315-9335-8fbe73ab8ec2-kube-api-access-2bcpr\") on node \"crc\" DevicePath \"\"" Feb 26 16:03:14 crc kubenswrapper[4907]: I0226 16:03:14.241705 4907 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/8aea8ea2-97bd-4315-9335-8fbe73ab8ec2-operator-scripts\") on node \"crc\" DevicePath \"\"" Feb 26 16:03:14 crc kubenswrapper[4907]: I0226 16:03:14.241716 4907 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/841c55f4-98a9-44dd-bfc7-018ad4a44528-operator-scripts\") on node \"crc\" DevicePath \"\"" Feb 26 16:03:14 crc kubenswrapper[4907]: I0226 16:03:14.241725 4907 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-w68qb\" (UniqueName: \"kubernetes.io/projected/841c55f4-98a9-44dd-bfc7-018ad4a44528-kube-api-access-w68qb\") on node \"crc\" DevicePath \"\"" Feb 26 16:03:14 crc kubenswrapper[4907]: I0226 16:03:14.241734 4907 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/3204495d-0bdb-45bd-b2df-af20221366fd-operator-scripts\") on node \"crc\" DevicePath \"\"" Feb 26 16:03:14 crc kubenswrapper[4907]: I0226 16:03:14.241742 4907 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/3df80d3d-7d86-44dd-a35a-3e9d9d435435-operator-scripts\") on node \"crc\" DevicePath \"\"" Feb 26 16:03:14 crc kubenswrapper[4907]: I0226 16:03:14.243268 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3204495d-0bdb-45bd-b2df-af20221366fd-kube-api-access-6bjnh" (OuterVolumeSpecName: "kube-api-access-6bjnh") pod 
"3204495d-0bdb-45bd-b2df-af20221366fd" (UID: "3204495d-0bdb-45bd-b2df-af20221366fd"). InnerVolumeSpecName "kube-api-access-6bjnh". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 26 16:03:14 crc kubenswrapper[4907]: I0226 16:03:14.244807 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3df80d3d-7d86-44dd-a35a-3e9d9d435435-kube-api-access-g72pw" (OuterVolumeSpecName: "kube-api-access-g72pw") pod "3df80d3d-7d86-44dd-a35a-3e9d9d435435" (UID: "3df80d3d-7d86-44dd-a35a-3e9d9d435435"). InnerVolumeSpecName "kube-api-access-g72pw". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 26 16:03:14 crc kubenswrapper[4907]: I0226 16:03:14.343348 4907 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-g72pw\" (UniqueName: \"kubernetes.io/projected/3df80d3d-7d86-44dd-a35a-3e9d9d435435-kube-api-access-g72pw\") on node \"crc\" DevicePath \"\"" Feb 26 16:03:14 crc kubenswrapper[4907]: I0226 16:03:14.343398 4907 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6bjnh\" (UniqueName: \"kubernetes.io/projected/3204495d-0bdb-45bd-b2df-af20221366fd-kube-api-access-6bjnh\") on node \"crc\" DevicePath \"\"" Feb 26 16:03:14 crc kubenswrapper[4907]: I0226 16:03:14.412977 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-create-8hgqp" event={"ID":"3ad824a7-419c-443b-8278-a4e806370720","Type":"ContainerDied","Data":"d3e71850bba21abd205c7803f5d5f7eb20a7eeec77604ecfda8c9882fb8b2b9c"} Feb 26 16:03:14 crc kubenswrapper[4907]: I0226 16:03:14.413042 4907 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="d3e71850bba21abd205c7803f5d5f7eb20a7eeec77604ecfda8c9882fb8b2b9c" Feb 26 16:03:14 crc kubenswrapper[4907]: I0226 16:03:14.413285 4907 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-db-create-8hgqp" Feb 26 16:03:14 crc kubenswrapper[4907]: I0226 16:03:14.427489 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/root-account-create-update-r9td4" event={"ID":"2569603f-9b29-4342-a289-8484025a2250","Type":"ContainerStarted","Data":"9cc7bb5362346ca08fbf3ac2c9729be8d3c6896f7dc05d217c7d67f0078f3f7c"} Feb 26 16:03:14 crc kubenswrapper[4907]: I0226 16:03:14.427803 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/root-account-create-update-r9td4" event={"ID":"2569603f-9b29-4342-a289-8484025a2250","Type":"ContainerStarted","Data":"a20b0bc9f8b5ac050fdeb40166bec4bf5a4628ea50512eedcb3ced4050818910"} Feb 26 16:03:14 crc kubenswrapper[4907]: I0226 16:03:14.436759 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-create-4cbgw" event={"ID":"3df80d3d-7d86-44dd-a35a-3e9d9d435435","Type":"ContainerDied","Data":"14a34563357e04f0033ddc22eca8c76eb501b92c36303d695e1f6573f1eedb62"} Feb 26 16:03:14 crc kubenswrapper[4907]: I0226 16:03:14.436805 4907 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="14a34563357e04f0033ddc22eca8c76eb501b92c36303d695e1f6573f1eedb62" Feb 26 16:03:14 crc kubenswrapper[4907]: I0226 16:03:14.436810 4907 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/placement-db-create-4cbgw" Feb 26 16:03:14 crc kubenswrapper[4907]: I0226 16:03:14.439025 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-d824-account-create-update-x9gtq" event={"ID":"3204495d-0bdb-45bd-b2df-af20221366fd","Type":"ContainerDied","Data":"0985c406710137650fd2fa88f373e33133d88bc98620eb6b157686c4b2388621"} Feb 26 16:03:14 crc kubenswrapper[4907]: I0226 16:03:14.439038 4907 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/placement-d824-account-create-update-x9gtq" Feb 26 16:03:14 crc kubenswrapper[4907]: I0226 16:03:14.439055 4907 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="0985c406710137650fd2fa88f373e33133d88bc98620eb6b157686c4b2388621" Feb 26 16:03:14 crc kubenswrapper[4907]: I0226 16:03:14.440543 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-f375-account-create-update-rgltk" event={"ID":"841c55f4-98a9-44dd-bfc7-018ad4a44528","Type":"ContainerDied","Data":"fd3cfb6a8f1f30d04fcfbe8807352f9a671e710ce34a88b2726b723c3d00ba92"} Feb 26 16:03:14 crc kubenswrapper[4907]: I0226 16:03:14.440563 4907 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="fd3cfb6a8f1f30d04fcfbe8807352f9a671e710ce34a88b2726b723c3d00ba92" Feb 26 16:03:14 crc kubenswrapper[4907]: I0226 16:03:14.440632 4907 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-f375-account-create-update-rgltk" Feb 26 16:03:14 crc kubenswrapper[4907]: I0226 16:03:14.442472 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-0813-account-create-update-rz8gs" event={"ID":"8aea8ea2-97bd-4315-9335-8fbe73ab8ec2","Type":"ContainerDied","Data":"862f24c33984768a364fa27989f3124d0eb6356670b144e8f3b058c93b66d84e"} Feb 26 16:03:14 crc kubenswrapper[4907]: I0226 16:03:14.442492 4907 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="862f24c33984768a364fa27989f3124d0eb6356670b144e8f3b058c93b66d84e" Feb 26 16:03:14 crc kubenswrapper[4907]: I0226 16:03:14.442523 4907 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-0813-account-create-update-rz8gs" Feb 26 16:03:14 crc kubenswrapper[4907]: I0226 16:03:14.489093 4907 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/root-account-create-update-r9td4" podStartSLOduration=2.489074128 podStartE2EDuration="2.489074128s" podCreationTimestamp="2026-02-26 16:03:12 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-26 16:03:14.46484087 +0000 UTC m=+1256.983402719" watchObservedRunningTime="2026-02-26 16:03:14.489074128 +0000 UTC m=+1257.007635977" Feb 26 16:03:15 crc kubenswrapper[4907]: I0226 16:03:15.451052 4907 generic.go:334] "Generic (PLEG): container finished" podID="2569603f-9b29-4342-a289-8484025a2250" containerID="9cc7bb5362346ca08fbf3ac2c9729be8d3c6896f7dc05d217c7d67f0078f3f7c" exitCode=0 Feb 26 16:03:15 crc kubenswrapper[4907]: I0226 16:03:15.451135 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/root-account-create-update-r9td4" event={"ID":"2569603f-9b29-4342-a289-8484025a2250","Type":"ContainerDied","Data":"9cc7bb5362346ca08fbf3ac2c9729be8d3c6896f7dc05d217c7d67f0078f3f7c"} Feb 26 16:03:15 crc kubenswrapper[4907]: I0226 16:03:15.473839 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/819c7fec-fd22-478a-bf6c-f4cb5aeccc59-etc-swift\") pod \"swift-storage-0\" (UID: \"819c7fec-fd22-478a-bf6c-f4cb5aeccc59\") " pod="openstack/swift-storage-0" Feb 26 16:03:15 crc kubenswrapper[4907]: E0226 16:03:15.474044 4907 projected.go:288] Couldn't get configMap openstack/swift-ring-files: configmap "swift-ring-files" not found Feb 26 16:03:15 crc kubenswrapper[4907]: E0226 16:03:15.474066 4907 projected.go:194] Error preparing data for projected volume etc-swift for pod openstack/swift-storage-0: configmap "swift-ring-files" not found Feb 26 16:03:15 crc 
kubenswrapper[4907]: E0226 16:03:15.474127 4907 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/819c7fec-fd22-478a-bf6c-f4cb5aeccc59-etc-swift podName:819c7fec-fd22-478a-bf6c-f4cb5aeccc59 nodeName:}" failed. No retries permitted until 2026-02-26 16:03:31.474107921 +0000 UTC m=+1273.992669770 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "etc-swift" (UniqueName: "kubernetes.io/projected/819c7fec-fd22-478a-bf6c-f4cb5aeccc59-etc-swift") pod "swift-storage-0" (UID: "819c7fec-fd22-478a-bf6c-f4cb5aeccc59") : configmap "swift-ring-files" not found Feb 26 16:03:16 crc kubenswrapper[4907]: I0226 16:03:16.128888 4907 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/ovn-controller-drng5" podUID="66d3c733-f440-4877-9e7b-af62f5dc7857" containerName="ovn-controller" probeResult="failure" output=< Feb 26 16:03:16 crc kubenswrapper[4907]: ERROR - ovn-controller connection status is 'not connected', expecting 'connected' status Feb 26 16:03:16 crc kubenswrapper[4907]: > Feb 26 16:03:16 crc kubenswrapper[4907]: I0226 16:03:16.169889 4907 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ovn-controller-ovs-9qr64" Feb 26 16:03:16 crc kubenswrapper[4907]: I0226 16:03:16.172452 4907 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ovn-controller-ovs-9qr64" Feb 26 16:03:16 crc kubenswrapper[4907]: I0226 16:03:16.398084 4907 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovn-controller-drng5-config-cbbrt"] Feb 26 16:03:16 crc kubenswrapper[4907]: E0226 16:03:16.398493 4907 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3ad824a7-419c-443b-8278-a4e806370720" containerName="mariadb-database-create" Feb 26 16:03:16 crc kubenswrapper[4907]: I0226 16:03:16.398514 4907 state_mem.go:107] "Deleted CPUSet assignment" podUID="3ad824a7-419c-443b-8278-a4e806370720" containerName="mariadb-database-create" Feb 26 16:03:16 crc 
kubenswrapper[4907]: E0226 16:03:16.398540 4907 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3df80d3d-7d86-44dd-a35a-3e9d9d435435" containerName="mariadb-database-create" Feb 26 16:03:16 crc kubenswrapper[4907]: I0226 16:03:16.398549 4907 state_mem.go:107] "Deleted CPUSet assignment" podUID="3df80d3d-7d86-44dd-a35a-3e9d9d435435" containerName="mariadb-database-create" Feb 26 16:03:16 crc kubenswrapper[4907]: E0226 16:03:16.398562 4907 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="841c55f4-98a9-44dd-bfc7-018ad4a44528" containerName="mariadb-account-create-update" Feb 26 16:03:16 crc kubenswrapper[4907]: I0226 16:03:16.398570 4907 state_mem.go:107] "Deleted CPUSet assignment" podUID="841c55f4-98a9-44dd-bfc7-018ad4a44528" containerName="mariadb-account-create-update" Feb 26 16:03:16 crc kubenswrapper[4907]: E0226 16:03:16.398601 4907 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8aea8ea2-97bd-4315-9335-8fbe73ab8ec2" containerName="mariadb-account-create-update" Feb 26 16:03:16 crc kubenswrapper[4907]: I0226 16:03:16.398610 4907 state_mem.go:107] "Deleted CPUSet assignment" podUID="8aea8ea2-97bd-4315-9335-8fbe73ab8ec2" containerName="mariadb-account-create-update" Feb 26 16:03:16 crc kubenswrapper[4907]: E0226 16:03:16.398623 4907 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="56d62e78-4aa2-4ae7-84dd-99e58e0deb68" containerName="mariadb-database-create" Feb 26 16:03:16 crc kubenswrapper[4907]: I0226 16:03:16.398630 4907 state_mem.go:107] "Deleted CPUSet assignment" podUID="56d62e78-4aa2-4ae7-84dd-99e58e0deb68" containerName="mariadb-database-create" Feb 26 16:03:16 crc kubenswrapper[4907]: E0226 16:03:16.398649 4907 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3204495d-0bdb-45bd-b2df-af20221366fd" containerName="mariadb-account-create-update" Feb 26 16:03:16 crc kubenswrapper[4907]: I0226 16:03:16.398658 4907 state_mem.go:107] "Deleted CPUSet assignment" 
podUID="3204495d-0bdb-45bd-b2df-af20221366fd" containerName="mariadb-account-create-update" Feb 26 16:03:16 crc kubenswrapper[4907]: I0226 16:03:16.398871 4907 memory_manager.go:354] "RemoveStaleState removing state" podUID="3ad824a7-419c-443b-8278-a4e806370720" containerName="mariadb-database-create" Feb 26 16:03:16 crc kubenswrapper[4907]: I0226 16:03:16.398888 4907 memory_manager.go:354] "RemoveStaleState removing state" podUID="8aea8ea2-97bd-4315-9335-8fbe73ab8ec2" containerName="mariadb-account-create-update" Feb 26 16:03:16 crc kubenswrapper[4907]: I0226 16:03:16.398897 4907 memory_manager.go:354] "RemoveStaleState removing state" podUID="3df80d3d-7d86-44dd-a35a-3e9d9d435435" containerName="mariadb-database-create" Feb 26 16:03:16 crc kubenswrapper[4907]: I0226 16:03:16.398908 4907 memory_manager.go:354] "RemoveStaleState removing state" podUID="3204495d-0bdb-45bd-b2df-af20221366fd" containerName="mariadb-account-create-update" Feb 26 16:03:16 crc kubenswrapper[4907]: I0226 16:03:16.398920 4907 memory_manager.go:354] "RemoveStaleState removing state" podUID="841c55f4-98a9-44dd-bfc7-018ad4a44528" containerName="mariadb-account-create-update" Feb 26 16:03:16 crc kubenswrapper[4907]: I0226 16:03:16.398934 4907 memory_manager.go:354] "RemoveStaleState removing state" podUID="56d62e78-4aa2-4ae7-84dd-99e58e0deb68" containerName="mariadb-database-create" Feb 26 16:03:16 crc kubenswrapper[4907]: I0226 16:03:16.399581 4907 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-controller-drng5-config-cbbrt" Feb 26 16:03:16 crc kubenswrapper[4907]: I0226 16:03:16.402320 4907 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovncontroller-extra-scripts" Feb 26 16:03:16 crc kubenswrapper[4907]: I0226 16:03:16.406311 4907 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-drng5-config-cbbrt"] Feb 26 16:03:16 crc kubenswrapper[4907]: I0226 16:03:16.489262 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/74e637b6-f732-4034-9f5b-27c56756e70d-var-log-ovn\") pod \"ovn-controller-drng5-config-cbbrt\" (UID: \"74e637b6-f732-4034-9f5b-27c56756e70d\") " pod="openstack/ovn-controller-drng5-config-cbbrt" Feb 26 16:03:16 crc kubenswrapper[4907]: I0226 16:03:16.489335 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/74e637b6-f732-4034-9f5b-27c56756e70d-additional-scripts\") pod \"ovn-controller-drng5-config-cbbrt\" (UID: \"74e637b6-f732-4034-9f5b-27c56756e70d\") " pod="openstack/ovn-controller-drng5-config-cbbrt" Feb 26 16:03:16 crc kubenswrapper[4907]: I0226 16:03:16.489362 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/74e637b6-f732-4034-9f5b-27c56756e70d-var-run\") pod \"ovn-controller-drng5-config-cbbrt\" (UID: \"74e637b6-f732-4034-9f5b-27c56756e70d\") " pod="openstack/ovn-controller-drng5-config-cbbrt" Feb 26 16:03:16 crc kubenswrapper[4907]: I0226 16:03:16.489417 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/74e637b6-f732-4034-9f5b-27c56756e70d-scripts\") pod \"ovn-controller-drng5-config-cbbrt\" (UID: 
\"74e637b6-f732-4034-9f5b-27c56756e70d\") " pod="openstack/ovn-controller-drng5-config-cbbrt" Feb 26 16:03:16 crc kubenswrapper[4907]: I0226 16:03:16.489486 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/74e637b6-f732-4034-9f5b-27c56756e70d-var-run-ovn\") pod \"ovn-controller-drng5-config-cbbrt\" (UID: \"74e637b6-f732-4034-9f5b-27c56756e70d\") " pod="openstack/ovn-controller-drng5-config-cbbrt" Feb 26 16:03:16 crc kubenswrapper[4907]: I0226 16:03:16.489508 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lr2sh\" (UniqueName: \"kubernetes.io/projected/74e637b6-f732-4034-9f5b-27c56756e70d-kube-api-access-lr2sh\") pod \"ovn-controller-drng5-config-cbbrt\" (UID: \"74e637b6-f732-4034-9f5b-27c56756e70d\") " pod="openstack/ovn-controller-drng5-config-cbbrt" Feb 26 16:03:16 crc kubenswrapper[4907]: I0226 16:03:16.591233 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/74e637b6-f732-4034-9f5b-27c56756e70d-additional-scripts\") pod \"ovn-controller-drng5-config-cbbrt\" (UID: \"74e637b6-f732-4034-9f5b-27c56756e70d\") " pod="openstack/ovn-controller-drng5-config-cbbrt" Feb 26 16:03:16 crc kubenswrapper[4907]: I0226 16:03:16.591292 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/74e637b6-f732-4034-9f5b-27c56756e70d-var-run\") pod \"ovn-controller-drng5-config-cbbrt\" (UID: \"74e637b6-f732-4034-9f5b-27c56756e70d\") " pod="openstack/ovn-controller-drng5-config-cbbrt" Feb 26 16:03:16 crc kubenswrapper[4907]: I0226 16:03:16.591353 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/74e637b6-f732-4034-9f5b-27c56756e70d-scripts\") pod 
\"ovn-controller-drng5-config-cbbrt\" (UID: \"74e637b6-f732-4034-9f5b-27c56756e70d\") " pod="openstack/ovn-controller-drng5-config-cbbrt" Feb 26 16:03:16 crc kubenswrapper[4907]: I0226 16:03:16.591426 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/74e637b6-f732-4034-9f5b-27c56756e70d-var-run-ovn\") pod \"ovn-controller-drng5-config-cbbrt\" (UID: \"74e637b6-f732-4034-9f5b-27c56756e70d\") " pod="openstack/ovn-controller-drng5-config-cbbrt" Feb 26 16:03:16 crc kubenswrapper[4907]: I0226 16:03:16.591460 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lr2sh\" (UniqueName: \"kubernetes.io/projected/74e637b6-f732-4034-9f5b-27c56756e70d-kube-api-access-lr2sh\") pod \"ovn-controller-drng5-config-cbbrt\" (UID: \"74e637b6-f732-4034-9f5b-27c56756e70d\") " pod="openstack/ovn-controller-drng5-config-cbbrt" Feb 26 16:03:16 crc kubenswrapper[4907]: I0226 16:03:16.591511 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/74e637b6-f732-4034-9f5b-27c56756e70d-var-log-ovn\") pod \"ovn-controller-drng5-config-cbbrt\" (UID: \"74e637b6-f732-4034-9f5b-27c56756e70d\") " pod="openstack/ovn-controller-drng5-config-cbbrt" Feb 26 16:03:16 crc kubenswrapper[4907]: I0226 16:03:16.591837 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/74e637b6-f732-4034-9f5b-27c56756e70d-var-log-ovn\") pod \"ovn-controller-drng5-config-cbbrt\" (UID: \"74e637b6-f732-4034-9f5b-27c56756e70d\") " pod="openstack/ovn-controller-drng5-config-cbbrt" Feb 26 16:03:16 crc kubenswrapper[4907]: I0226 16:03:16.592484 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/74e637b6-f732-4034-9f5b-27c56756e70d-additional-scripts\") pod 
\"ovn-controller-drng5-config-cbbrt\" (UID: \"74e637b6-f732-4034-9f5b-27c56756e70d\") " pod="openstack/ovn-controller-drng5-config-cbbrt" Feb 26 16:03:16 crc kubenswrapper[4907]: I0226 16:03:16.592540 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/74e637b6-f732-4034-9f5b-27c56756e70d-var-run\") pod \"ovn-controller-drng5-config-cbbrt\" (UID: \"74e637b6-f732-4034-9f5b-27c56756e70d\") " pod="openstack/ovn-controller-drng5-config-cbbrt" Feb 26 16:03:16 crc kubenswrapper[4907]: I0226 16:03:16.594552 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/74e637b6-f732-4034-9f5b-27c56756e70d-scripts\") pod \"ovn-controller-drng5-config-cbbrt\" (UID: \"74e637b6-f732-4034-9f5b-27c56756e70d\") " pod="openstack/ovn-controller-drng5-config-cbbrt" Feb 26 16:03:16 crc kubenswrapper[4907]: I0226 16:03:16.594624 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/74e637b6-f732-4034-9f5b-27c56756e70d-var-run-ovn\") pod \"ovn-controller-drng5-config-cbbrt\" (UID: \"74e637b6-f732-4034-9f5b-27c56756e70d\") " pod="openstack/ovn-controller-drng5-config-cbbrt" Feb 26 16:03:16 crc kubenswrapper[4907]: I0226 16:03:16.621806 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lr2sh\" (UniqueName: \"kubernetes.io/projected/74e637b6-f732-4034-9f5b-27c56756e70d-kube-api-access-lr2sh\") pod \"ovn-controller-drng5-config-cbbrt\" (UID: \"74e637b6-f732-4034-9f5b-27c56756e70d\") " pod="openstack/ovn-controller-drng5-config-cbbrt" Feb 26 16:03:16 crc kubenswrapper[4907]: I0226 16:03:16.709281 4907 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-db-sync-hdzvj"] Feb 26 16:03:16 crc kubenswrapper[4907]: I0226 16:03:16.710243 4907 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-db-sync-hdzvj" Feb 26 16:03:16 crc kubenswrapper[4907]: I0226 16:03:16.712812 4907 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-config-data" Feb 26 16:03:16 crc kubenswrapper[4907]: I0226 16:03:16.726075 4907 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-drng5-config-cbbrt" Feb 26 16:03:16 crc kubenswrapper[4907]: I0226 16:03:16.726217 4907 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-glance-dockercfg-2tmh7" Feb 26 16:03:16 crc kubenswrapper[4907]: I0226 16:03:16.735180 4907 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-db-sync-hdzvj"] Feb 26 16:03:16 crc kubenswrapper[4907]: I0226 16:03:16.797746 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6b55r\" (UniqueName: \"kubernetes.io/projected/2395dfd1-7840-4703-a1c9-37c6eff664bd-kube-api-access-6b55r\") pod \"glance-db-sync-hdzvj\" (UID: \"2395dfd1-7840-4703-a1c9-37c6eff664bd\") " pod="openstack/glance-db-sync-hdzvj" Feb 26 16:03:16 crc kubenswrapper[4907]: I0226 16:03:16.797905 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2395dfd1-7840-4703-a1c9-37c6eff664bd-combined-ca-bundle\") pod \"glance-db-sync-hdzvj\" (UID: \"2395dfd1-7840-4703-a1c9-37c6eff664bd\") " pod="openstack/glance-db-sync-hdzvj" Feb 26 16:03:16 crc kubenswrapper[4907]: I0226 16:03:16.797997 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/2395dfd1-7840-4703-a1c9-37c6eff664bd-db-sync-config-data\") pod \"glance-db-sync-hdzvj\" (UID: \"2395dfd1-7840-4703-a1c9-37c6eff664bd\") " pod="openstack/glance-db-sync-hdzvj" Feb 26 16:03:16 crc kubenswrapper[4907]: I0226 
16:03:16.798020 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2395dfd1-7840-4703-a1c9-37c6eff664bd-config-data\") pod \"glance-db-sync-hdzvj\" (UID: \"2395dfd1-7840-4703-a1c9-37c6eff664bd\") " pod="openstack/glance-db-sync-hdzvj"
Feb 26 16:03:16 crc kubenswrapper[4907]: I0226 16:03:16.889395 4907 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/root-account-create-update-r9td4"
Feb 26 16:03:16 crc kubenswrapper[4907]: I0226 16:03:16.901943 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/2395dfd1-7840-4703-a1c9-37c6eff664bd-db-sync-config-data\") pod \"glance-db-sync-hdzvj\" (UID: \"2395dfd1-7840-4703-a1c9-37c6eff664bd\") " pod="openstack/glance-db-sync-hdzvj"
Feb 26 16:03:16 crc kubenswrapper[4907]: I0226 16:03:16.901981 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2395dfd1-7840-4703-a1c9-37c6eff664bd-config-data\") pod \"glance-db-sync-hdzvj\" (UID: \"2395dfd1-7840-4703-a1c9-37c6eff664bd\") " pod="openstack/glance-db-sync-hdzvj"
Feb 26 16:03:16 crc kubenswrapper[4907]: I0226 16:03:16.902173 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6b55r\" (UniqueName: \"kubernetes.io/projected/2395dfd1-7840-4703-a1c9-37c6eff664bd-kube-api-access-6b55r\") pod \"glance-db-sync-hdzvj\" (UID: \"2395dfd1-7840-4703-a1c9-37c6eff664bd\") " pod="openstack/glance-db-sync-hdzvj"
Feb 26 16:03:16 crc kubenswrapper[4907]: I0226 16:03:16.902401 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2395dfd1-7840-4703-a1c9-37c6eff664bd-combined-ca-bundle\") pod \"glance-db-sync-hdzvj\" (UID: \"2395dfd1-7840-4703-a1c9-37c6eff664bd\") " pod="openstack/glance-db-sync-hdzvj"
Feb 26 16:03:16 crc kubenswrapper[4907]: I0226 16:03:16.911046 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2395dfd1-7840-4703-a1c9-37c6eff664bd-combined-ca-bundle\") pod \"glance-db-sync-hdzvj\" (UID: \"2395dfd1-7840-4703-a1c9-37c6eff664bd\") " pod="openstack/glance-db-sync-hdzvj"
Feb 26 16:03:16 crc kubenswrapper[4907]: I0226 16:03:16.911650 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2395dfd1-7840-4703-a1c9-37c6eff664bd-config-data\") pod \"glance-db-sync-hdzvj\" (UID: \"2395dfd1-7840-4703-a1c9-37c6eff664bd\") " pod="openstack/glance-db-sync-hdzvj"
Feb 26 16:03:16 crc kubenswrapper[4907]: I0226 16:03:16.912116 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/2395dfd1-7840-4703-a1c9-37c6eff664bd-db-sync-config-data\") pod \"glance-db-sync-hdzvj\" (UID: \"2395dfd1-7840-4703-a1c9-37c6eff664bd\") " pod="openstack/glance-db-sync-hdzvj"
Feb 26 16:03:16 crc kubenswrapper[4907]: I0226 16:03:16.934075 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6b55r\" (UniqueName: \"kubernetes.io/projected/2395dfd1-7840-4703-a1c9-37c6eff664bd-kube-api-access-6b55r\") pod \"glance-db-sync-hdzvj\" (UID: \"2395dfd1-7840-4703-a1c9-37c6eff664bd\") " pod="openstack/glance-db-sync-hdzvj"
Feb 26 16:03:17 crc kubenswrapper[4907]: I0226 16:03:17.003954 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/2569603f-9b29-4342-a289-8484025a2250-operator-scripts\") pod \"2569603f-9b29-4342-a289-8484025a2250\" (UID: \"2569603f-9b29-4342-a289-8484025a2250\") "
Feb 26 16:03:17 crc kubenswrapper[4907]: I0226 16:03:17.004060 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-tz869\" (UniqueName: \"kubernetes.io/projected/2569603f-9b29-4342-a289-8484025a2250-kube-api-access-tz869\") pod \"2569603f-9b29-4342-a289-8484025a2250\" (UID: \"2569603f-9b29-4342-a289-8484025a2250\") "
Feb 26 16:03:17 crc kubenswrapper[4907]: I0226 16:03:17.006213 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/2569603f-9b29-4342-a289-8484025a2250-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "2569603f-9b29-4342-a289-8484025a2250" (UID: "2569603f-9b29-4342-a289-8484025a2250"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Feb 26 16:03:17 crc kubenswrapper[4907]: I0226 16:03:17.007721 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2569603f-9b29-4342-a289-8484025a2250-kube-api-access-tz869" (OuterVolumeSpecName: "kube-api-access-tz869") pod "2569603f-9b29-4342-a289-8484025a2250" (UID: "2569603f-9b29-4342-a289-8484025a2250"). InnerVolumeSpecName "kube-api-access-tz869". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 26 16:03:17 crc kubenswrapper[4907]: I0226 16:03:17.057045 4907 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-db-sync-hdzvj"
Feb 26 16:03:17 crc kubenswrapper[4907]: I0226 16:03:17.105911 4907 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-tz869\" (UniqueName: \"kubernetes.io/projected/2569603f-9b29-4342-a289-8484025a2250-kube-api-access-tz869\") on node \"crc\" DevicePath \"\""
Feb 26 16:03:17 crc kubenswrapper[4907]: I0226 16:03:17.105938 4907 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/2569603f-9b29-4342-a289-8484025a2250-operator-scripts\") on node \"crc\" DevicePath \"\""
Feb 26 16:03:17 crc kubenswrapper[4907]: I0226 16:03:17.213253 4907 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-drng5-config-cbbrt"]
Feb 26 16:03:17 crc kubenswrapper[4907]: I0226 16:03:17.466431 4907 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/root-account-create-update-r9td4"
Feb 26 16:03:17 crc kubenswrapper[4907]: I0226 16:03:17.466466 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/root-account-create-update-r9td4" event={"ID":"2569603f-9b29-4342-a289-8484025a2250","Type":"ContainerDied","Data":"a20b0bc9f8b5ac050fdeb40166bec4bf5a4628ea50512eedcb3ced4050818910"}
Feb 26 16:03:17 crc kubenswrapper[4907]: I0226 16:03:17.466857 4907 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="a20b0bc9f8b5ac050fdeb40166bec4bf5a4628ea50512eedcb3ced4050818910"
Feb 26 16:03:17 crc kubenswrapper[4907]: I0226 16:03:17.467394 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-drng5-config-cbbrt" event={"ID":"74e637b6-f732-4034-9f5b-27c56756e70d","Type":"ContainerStarted","Data":"37378e778f1ca59d98a5c116a3b0b9a5005805e86bb2238e79c6e791520b3dfb"}
Feb 26 16:03:17 crc kubenswrapper[4907]: I0226 16:03:17.617704 4907 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-db-sync-hdzvj"]
Feb 26 16:03:17 crc kubenswrapper[4907]: W0226 16:03:17.621895 4907 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod2395dfd1_7840_4703_a1c9_37c6eff664bd.slice/crio-9a8a8a40d69c06551530c88809ac79085a257ca760af2d356b4ea3d8665b6a13 WatchSource:0}: Error finding container 9a8a8a40d69c06551530c88809ac79085a257ca760af2d356b4ea3d8665b6a13: Status 404 returned error can't find the container with id 9a8a8a40d69c06551530c88809ac79085a257ca760af2d356b4ea3d8665b6a13
Feb 26 16:03:18 crc kubenswrapper[4907]: I0226 16:03:18.477748 4907 generic.go:334] "Generic (PLEG): container finished" podID="74e637b6-f732-4034-9f5b-27c56756e70d" containerID="60555399c60d59b9505adf79bd8540d91978a0ca1c4ac2cf50c79fea6ee3e31d" exitCode=0
Feb 26 16:03:18 crc kubenswrapper[4907]: I0226 16:03:18.478237 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-drng5-config-cbbrt" event={"ID":"74e637b6-f732-4034-9f5b-27c56756e70d","Type":"ContainerDied","Data":"60555399c60d59b9505adf79bd8540d91978a0ca1c4ac2cf50c79fea6ee3e31d"}
Feb 26 16:03:18 crc kubenswrapper[4907]: I0226 16:03:18.482682 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-sync-hdzvj" event={"ID":"2395dfd1-7840-4703-a1c9-37c6eff664bd","Type":"ContainerStarted","Data":"9a8a8a40d69c06551530c88809ac79085a257ca760af2d356b4ea3d8665b6a13"}
Feb 26 16:03:19 crc kubenswrapper[4907]: I0226 16:03:19.127331 4907 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/root-account-create-update-r9td4"]
Feb 26 16:03:19 crc kubenswrapper[4907]: I0226 16:03:19.135516 4907 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/root-account-create-update-r9td4"]
Feb 26 16:03:19 crc kubenswrapper[4907]: I0226 16:03:19.490175 4907 generic.go:334] "Generic (PLEG): container finished" podID="c5f9c74c-c90c-40ba-9548-dc79f90592a4" containerID="5b6065f7d0a91b56a02d3ad85d8cc3fd84864465349926f69fad86a45ff01f07" exitCode=0
Feb 26 16:03:19 crc kubenswrapper[4907]: I0226 16:03:19.490369 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-ring-rebalance-zj4xn" event={"ID":"c5f9c74c-c90c-40ba-9548-dc79f90592a4","Type":"ContainerDied","Data":"5b6065f7d0a91b56a02d3ad85d8cc3fd84864465349926f69fad86a45ff01f07"}
Feb 26 16:03:19 crc kubenswrapper[4907]: I0226 16:03:19.792397 4907 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-drng5-config-cbbrt"
Feb 26 16:03:19 crc kubenswrapper[4907]: I0226 16:03:19.856474 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/74e637b6-f732-4034-9f5b-27c56756e70d-var-log-ovn\") pod \"74e637b6-f732-4034-9f5b-27c56756e70d\" (UID: \"74e637b6-f732-4034-9f5b-27c56756e70d\") "
Feb 26 16:03:19 crc kubenswrapper[4907]: I0226 16:03:19.856572 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/74e637b6-f732-4034-9f5b-27c56756e70d-scripts\") pod \"74e637b6-f732-4034-9f5b-27c56756e70d\" (UID: \"74e637b6-f732-4034-9f5b-27c56756e70d\") "
Feb 26 16:03:19 crc kubenswrapper[4907]: I0226 16:03:19.856629 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/74e637b6-f732-4034-9f5b-27c56756e70d-additional-scripts\") pod \"74e637b6-f732-4034-9f5b-27c56756e70d\" (UID: \"74e637b6-f732-4034-9f5b-27c56756e70d\") "
Feb 26 16:03:19 crc kubenswrapper[4907]: I0226 16:03:19.856684 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/74e637b6-f732-4034-9f5b-27c56756e70d-var-run-ovn\") pod \"74e637b6-f732-4034-9f5b-27c56756e70d\" (UID: \"74e637b6-f732-4034-9f5b-27c56756e70d\") "
Feb 26 16:03:19 crc kubenswrapper[4907]: I0226 16:03:19.856701 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/74e637b6-f732-4034-9f5b-27c56756e70d-var-run\") pod \"74e637b6-f732-4034-9f5b-27c56756e70d\" (UID: \"74e637b6-f732-4034-9f5b-27c56756e70d\") "
Feb 26 16:03:19 crc kubenswrapper[4907]: I0226 16:03:19.856740 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lr2sh\" (UniqueName: \"kubernetes.io/projected/74e637b6-f732-4034-9f5b-27c56756e70d-kube-api-access-lr2sh\") pod \"74e637b6-f732-4034-9f5b-27c56756e70d\" (UID: \"74e637b6-f732-4034-9f5b-27c56756e70d\") "
Feb 26 16:03:19 crc kubenswrapper[4907]: I0226 16:03:19.857771 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/74e637b6-f732-4034-9f5b-27c56756e70d-var-run-ovn" (OuterVolumeSpecName: "var-run-ovn") pod "74e637b6-f732-4034-9f5b-27c56756e70d" (UID: "74e637b6-f732-4034-9f5b-27c56756e70d"). InnerVolumeSpecName "var-run-ovn". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Feb 26 16:03:19 crc kubenswrapper[4907]: I0226 16:03:19.857805 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/74e637b6-f732-4034-9f5b-27c56756e70d-var-run" (OuterVolumeSpecName: "var-run") pod "74e637b6-f732-4034-9f5b-27c56756e70d" (UID: "74e637b6-f732-4034-9f5b-27c56756e70d"). InnerVolumeSpecName "var-run". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Feb 26 16:03:19 crc kubenswrapper[4907]: I0226 16:03:19.857821 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/74e637b6-f732-4034-9f5b-27c56756e70d-var-log-ovn" (OuterVolumeSpecName: "var-log-ovn") pod "74e637b6-f732-4034-9f5b-27c56756e70d" (UID: "74e637b6-f732-4034-9f5b-27c56756e70d"). InnerVolumeSpecName "var-log-ovn". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Feb 26 16:03:19 crc kubenswrapper[4907]: I0226 16:03:19.857912 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/74e637b6-f732-4034-9f5b-27c56756e70d-additional-scripts" (OuterVolumeSpecName: "additional-scripts") pod "74e637b6-f732-4034-9f5b-27c56756e70d" (UID: "74e637b6-f732-4034-9f5b-27c56756e70d"). InnerVolumeSpecName "additional-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Feb 26 16:03:19 crc kubenswrapper[4907]: I0226 16:03:19.858508 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/74e637b6-f732-4034-9f5b-27c56756e70d-scripts" (OuterVolumeSpecName: "scripts") pod "74e637b6-f732-4034-9f5b-27c56756e70d" (UID: "74e637b6-f732-4034-9f5b-27c56756e70d"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Feb 26 16:03:19 crc kubenswrapper[4907]: I0226 16:03:19.862776 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/74e637b6-f732-4034-9f5b-27c56756e70d-kube-api-access-lr2sh" (OuterVolumeSpecName: "kube-api-access-lr2sh") pod "74e637b6-f732-4034-9f5b-27c56756e70d" (UID: "74e637b6-f732-4034-9f5b-27c56756e70d"). InnerVolumeSpecName "kube-api-access-lr2sh". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 26 16:03:19 crc kubenswrapper[4907]: I0226 16:03:19.958179 4907 reconciler_common.go:293] "Volume detached for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/74e637b6-f732-4034-9f5b-27c56756e70d-var-log-ovn\") on node \"crc\" DevicePath \"\""
Feb 26 16:03:19 crc kubenswrapper[4907]: I0226 16:03:19.958525 4907 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/74e637b6-f732-4034-9f5b-27c56756e70d-scripts\") on node \"crc\" DevicePath \"\""
Feb 26 16:03:19 crc kubenswrapper[4907]: I0226 16:03:19.958539 4907 reconciler_common.go:293] "Volume detached for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/74e637b6-f732-4034-9f5b-27c56756e70d-additional-scripts\") on node \"crc\" DevicePath \"\""
Feb 26 16:03:19 crc kubenswrapper[4907]: I0226 16:03:19.958554 4907 reconciler_common.go:293] "Volume detached for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/74e637b6-f732-4034-9f5b-27c56756e70d-var-run-ovn\") on node \"crc\" DevicePath \"\""
Feb 26 16:03:19 crc kubenswrapper[4907]: I0226 16:03:19.958565 4907 reconciler_common.go:293] "Volume detached for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/74e637b6-f732-4034-9f5b-27c56756e70d-var-run\") on node \"crc\" DevicePath \"\""
Feb 26 16:03:19 crc kubenswrapper[4907]: I0226 16:03:19.958573 4907 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-lr2sh\" (UniqueName: \"kubernetes.io/projected/74e637b6-f732-4034-9f5b-27c56756e70d-kube-api-access-lr2sh\") on node \"crc\" DevicePath \"\""
Feb 26 16:03:20 crc kubenswrapper[4907]: I0226 16:03:20.139580 4907 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="2569603f-9b29-4342-a289-8484025a2250" path="/var/lib/kubelet/pods/2569603f-9b29-4342-a289-8484025a2250/volumes"
Feb 26 16:03:20 crc kubenswrapper[4907]: I0226 16:03:20.499264 4907 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-drng5-config-cbbrt"
Feb 26 16:03:20 crc kubenswrapper[4907]: I0226 16:03:20.499326 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-drng5-config-cbbrt" event={"ID":"74e637b6-f732-4034-9f5b-27c56756e70d","Type":"ContainerDied","Data":"37378e778f1ca59d98a5c116a3b0b9a5005805e86bb2238e79c6e791520b3dfb"}
Feb 26 16:03:20 crc kubenswrapper[4907]: I0226 16:03:20.499387 4907 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="37378e778f1ca59d98a5c116a3b0b9a5005805e86bb2238e79c6e791520b3dfb"
Feb 26 16:03:20 crc kubenswrapper[4907]: I0226 16:03:20.796064 4907 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/swift-ring-rebalance-zj4xn"
Feb 26 16:03:20 crc kubenswrapper[4907]: I0226 16:03:20.907970 4907 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ovn-controller-drng5-config-cbbrt"]
Feb 26 16:03:20 crc kubenswrapper[4907]: I0226 16:03:20.917861 4907 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ovn-controller-drng5-config-cbbrt"]
Feb 26 16:03:20 crc kubenswrapper[4907]: I0226 16:03:20.975383 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/c5f9c74c-c90c-40ba-9548-dc79f90592a4-scripts\") pod \"c5f9c74c-c90c-40ba-9548-dc79f90592a4\" (UID: \"c5f9c74c-c90c-40ba-9548-dc79f90592a4\") "
Feb 26 16:03:20 crc kubenswrapper[4907]: I0226 16:03:20.975446 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lfdt8\" (UniqueName: \"kubernetes.io/projected/c5f9c74c-c90c-40ba-9548-dc79f90592a4-kube-api-access-lfdt8\") pod \"c5f9c74c-c90c-40ba-9548-dc79f90592a4\" (UID: \"c5f9c74c-c90c-40ba-9548-dc79f90592a4\") "
Feb 26 16:03:20 crc kubenswrapper[4907]: I0226 16:03:20.975547 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c5f9c74c-c90c-40ba-9548-dc79f90592a4-combined-ca-bundle\") pod \"c5f9c74c-c90c-40ba-9548-dc79f90592a4\" (UID: \"c5f9c74c-c90c-40ba-9548-dc79f90592a4\") "
Feb 26 16:03:20 crc kubenswrapper[4907]: I0226 16:03:20.975637 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/c5f9c74c-c90c-40ba-9548-dc79f90592a4-ring-data-devices\") pod \"c5f9c74c-c90c-40ba-9548-dc79f90592a4\" (UID: \"c5f9c74c-c90c-40ba-9548-dc79f90592a4\") "
Feb 26 16:03:20 crc kubenswrapper[4907]: I0226 16:03:20.975667 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/c5f9c74c-c90c-40ba-9548-dc79f90592a4-swiftconf\") pod \"c5f9c74c-c90c-40ba-9548-dc79f90592a4\" (UID: \"c5f9c74c-c90c-40ba-9548-dc79f90592a4\") "
Feb 26 16:03:20 crc kubenswrapper[4907]: I0226 16:03:20.975691 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/c5f9c74c-c90c-40ba-9548-dc79f90592a4-etc-swift\") pod \"c5f9c74c-c90c-40ba-9548-dc79f90592a4\" (UID: \"c5f9c74c-c90c-40ba-9548-dc79f90592a4\") "
Feb 26 16:03:20 crc kubenswrapper[4907]: I0226 16:03:20.975752 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/c5f9c74c-c90c-40ba-9548-dc79f90592a4-dispersionconf\") pod \"c5f9c74c-c90c-40ba-9548-dc79f90592a4\" (UID: \"c5f9c74c-c90c-40ba-9548-dc79f90592a4\") "
Feb 26 16:03:20 crc kubenswrapper[4907]: I0226 16:03:20.978386 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c5f9c74c-c90c-40ba-9548-dc79f90592a4-etc-swift" (OuterVolumeSpecName: "etc-swift") pod "c5f9c74c-c90c-40ba-9548-dc79f90592a4" (UID: "c5f9c74c-c90c-40ba-9548-dc79f90592a4"). InnerVolumeSpecName "etc-swift". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Feb 26 16:03:20 crc kubenswrapper[4907]: I0226 16:03:20.979933 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c5f9c74c-c90c-40ba-9548-dc79f90592a4-ring-data-devices" (OuterVolumeSpecName: "ring-data-devices") pod "c5f9c74c-c90c-40ba-9548-dc79f90592a4" (UID: "c5f9c74c-c90c-40ba-9548-dc79f90592a4"). InnerVolumeSpecName "ring-data-devices". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Feb 26 16:03:20 crc kubenswrapper[4907]: I0226 16:03:20.983646 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c5f9c74c-c90c-40ba-9548-dc79f90592a4-kube-api-access-lfdt8" (OuterVolumeSpecName: "kube-api-access-lfdt8") pod "c5f9c74c-c90c-40ba-9548-dc79f90592a4" (UID: "c5f9c74c-c90c-40ba-9548-dc79f90592a4"). InnerVolumeSpecName "kube-api-access-lfdt8". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 26 16:03:20 crc kubenswrapper[4907]: I0226 16:03:20.992082 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c5f9c74c-c90c-40ba-9548-dc79f90592a4-dispersionconf" (OuterVolumeSpecName: "dispersionconf") pod "c5f9c74c-c90c-40ba-9548-dc79f90592a4" (UID: "c5f9c74c-c90c-40ba-9548-dc79f90592a4"). InnerVolumeSpecName "dispersionconf". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 26 16:03:21 crc kubenswrapper[4907]: I0226 16:03:21.007966 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c5f9c74c-c90c-40ba-9548-dc79f90592a4-scripts" (OuterVolumeSpecName: "scripts") pod "c5f9c74c-c90c-40ba-9548-dc79f90592a4" (UID: "c5f9c74c-c90c-40ba-9548-dc79f90592a4"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Feb 26 16:03:21 crc kubenswrapper[4907]: I0226 16:03:21.008254 4907 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovn-controller-drng5-config-m5qbn"]
Feb 26 16:03:21 crc kubenswrapper[4907]: E0226 16:03:21.013744 4907 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="74e637b6-f732-4034-9f5b-27c56756e70d" containerName="ovn-config"
Feb 26 16:03:21 crc kubenswrapper[4907]: I0226 16:03:21.013908 4907 state_mem.go:107] "Deleted CPUSet assignment" podUID="74e637b6-f732-4034-9f5b-27c56756e70d" containerName="ovn-config"
Feb 26 16:03:21 crc kubenswrapper[4907]: E0226 16:03:21.014025 4907 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2569603f-9b29-4342-a289-8484025a2250" containerName="mariadb-account-create-update"
Feb 26 16:03:21 crc kubenswrapper[4907]: I0226 16:03:21.014104 4907 state_mem.go:107] "Deleted CPUSet assignment" podUID="2569603f-9b29-4342-a289-8484025a2250" containerName="mariadb-account-create-update"
Feb 26 16:03:21 crc kubenswrapper[4907]: E0226 16:03:21.014175 4907 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c5f9c74c-c90c-40ba-9548-dc79f90592a4" containerName="swift-ring-rebalance"
Feb 26 16:03:21 crc kubenswrapper[4907]: I0226 16:03:21.014231 4907 state_mem.go:107] "Deleted CPUSet assignment" podUID="c5f9c74c-c90c-40ba-9548-dc79f90592a4" containerName="swift-ring-rebalance"
Feb 26 16:03:21 crc kubenswrapper[4907]: I0226 16:03:21.014540 4907 memory_manager.go:354] "RemoveStaleState removing state" podUID="2569603f-9b29-4342-a289-8484025a2250" containerName="mariadb-account-create-update"
Feb 26 16:03:21 crc kubenswrapper[4907]: I0226 16:03:21.014643 4907 memory_manager.go:354] "RemoveStaleState removing state" podUID="74e637b6-f732-4034-9f5b-27c56756e70d" containerName="ovn-config"
Feb 26 16:03:21 crc kubenswrapper[4907]: I0226 16:03:21.014727 4907 memory_manager.go:354] "RemoveStaleState removing state" podUID="c5f9c74c-c90c-40ba-9548-dc79f90592a4" containerName="swift-ring-rebalance"
Feb 26 16:03:21 crc kubenswrapper[4907]: I0226 16:03:21.015535 4907 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-drng5-config-m5qbn"
Feb 26 16:03:21 crc kubenswrapper[4907]: I0226 16:03:21.022511 4907 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovncontroller-extra-scripts"
Feb 26 16:03:21 crc kubenswrapper[4907]: I0226 16:03:21.023162 4907 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-drng5-config-m5qbn"]
Feb 26 16:03:21 crc kubenswrapper[4907]: I0226 16:03:21.038253 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c5f9c74c-c90c-40ba-9548-dc79f90592a4-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "c5f9c74c-c90c-40ba-9548-dc79f90592a4" (UID: "c5f9c74c-c90c-40ba-9548-dc79f90592a4"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 26 16:03:21 crc kubenswrapper[4907]: I0226 16:03:21.039104 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c5f9c74c-c90c-40ba-9548-dc79f90592a4-swiftconf" (OuterVolumeSpecName: "swiftconf") pod "c5f9c74c-c90c-40ba-9548-dc79f90592a4" (UID: "c5f9c74c-c90c-40ba-9548-dc79f90592a4"). InnerVolumeSpecName "swiftconf". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 26 16:03:21 crc kubenswrapper[4907]: I0226 16:03:21.077631 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/4e15ab83-ad2e-4ea4-a393-0ded9fd3dfe2-var-run-ovn\") pod \"ovn-controller-drng5-config-m5qbn\" (UID: \"4e15ab83-ad2e-4ea4-a393-0ded9fd3dfe2\") " pod="openstack/ovn-controller-drng5-config-m5qbn"
Feb 26 16:03:21 crc kubenswrapper[4907]: I0226 16:03:21.077696 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/4e15ab83-ad2e-4ea4-a393-0ded9fd3dfe2-additional-scripts\") pod \"ovn-controller-drng5-config-m5qbn\" (UID: \"4e15ab83-ad2e-4ea4-a393-0ded9fd3dfe2\") " pod="openstack/ovn-controller-drng5-config-m5qbn"
Feb 26 16:03:21 crc kubenswrapper[4907]: I0226 16:03:21.077719 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/4e15ab83-ad2e-4ea4-a393-0ded9fd3dfe2-var-run\") pod \"ovn-controller-drng5-config-m5qbn\" (UID: \"4e15ab83-ad2e-4ea4-a393-0ded9fd3dfe2\") " pod="openstack/ovn-controller-drng5-config-m5qbn"
Feb 26 16:03:21 crc kubenswrapper[4907]: I0226 16:03:21.077737 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/4e15ab83-ad2e-4ea4-a393-0ded9fd3dfe2-var-log-ovn\") pod \"ovn-controller-drng5-config-m5qbn\" (UID: \"4e15ab83-ad2e-4ea4-a393-0ded9fd3dfe2\") " pod="openstack/ovn-controller-drng5-config-m5qbn"
Feb 26 16:03:21 crc kubenswrapper[4907]: I0226 16:03:21.077765 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/4e15ab83-ad2e-4ea4-a393-0ded9fd3dfe2-scripts\") pod \"ovn-controller-drng5-config-m5qbn\" (UID: \"4e15ab83-ad2e-4ea4-a393-0ded9fd3dfe2\") " pod="openstack/ovn-controller-drng5-config-m5qbn"
Feb 26 16:03:21 crc kubenswrapper[4907]: I0226 16:03:21.077835 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-x97lp\" (UniqueName: \"kubernetes.io/projected/4e15ab83-ad2e-4ea4-a393-0ded9fd3dfe2-kube-api-access-x97lp\") pod \"ovn-controller-drng5-config-m5qbn\" (UID: \"4e15ab83-ad2e-4ea4-a393-0ded9fd3dfe2\") " pod="openstack/ovn-controller-drng5-config-m5qbn"
Feb 26 16:03:21 crc kubenswrapper[4907]: I0226 16:03:21.077882 4907 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/c5f9c74c-c90c-40ba-9548-dc79f90592a4-scripts\") on node \"crc\" DevicePath \"\""
Feb 26 16:03:21 crc kubenswrapper[4907]: I0226 16:03:21.077892 4907 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-lfdt8\" (UniqueName: \"kubernetes.io/projected/c5f9c74c-c90c-40ba-9548-dc79f90592a4-kube-api-access-lfdt8\") on node \"crc\" DevicePath \"\""
Feb 26 16:03:21 crc kubenswrapper[4907]: I0226 16:03:21.077905 4907 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c5f9c74c-c90c-40ba-9548-dc79f90592a4-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Feb 26 16:03:21 crc kubenswrapper[4907]: I0226 16:03:21.077914 4907 reconciler_common.go:293] "Volume detached for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/c5f9c74c-c90c-40ba-9548-dc79f90592a4-ring-data-devices\") on node \"crc\" DevicePath \"\""
Feb 26 16:03:21 crc kubenswrapper[4907]: I0226 16:03:21.077921 4907 reconciler_common.go:293] "Volume detached for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/c5f9c74c-c90c-40ba-9548-dc79f90592a4-swiftconf\") on node \"crc\" DevicePath \"\""
Feb 26 16:03:21 crc kubenswrapper[4907]: I0226 16:03:21.077930 4907 reconciler_common.go:293] "Volume detached for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/c5f9c74c-c90c-40ba-9548-dc79f90592a4-etc-swift\") on node \"crc\" DevicePath \"\""
Feb 26 16:03:21 crc kubenswrapper[4907]: I0226 16:03:21.077938 4907 reconciler_common.go:293] "Volume detached for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/c5f9c74c-c90c-40ba-9548-dc79f90592a4-dispersionconf\") on node \"crc\" DevicePath \"\""
Feb 26 16:03:21 crc kubenswrapper[4907]: I0226 16:03:21.140494 4907 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ovn-controller-drng5"
Feb 26 16:03:21 crc kubenswrapper[4907]: I0226 16:03:21.198346 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/4e15ab83-ad2e-4ea4-a393-0ded9fd3dfe2-var-run-ovn\") pod \"ovn-controller-drng5-config-m5qbn\" (UID: \"4e15ab83-ad2e-4ea4-a393-0ded9fd3dfe2\") " pod="openstack/ovn-controller-drng5-config-m5qbn"
Feb 26 16:03:21 crc kubenswrapper[4907]: I0226 16:03:21.198674 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/4e15ab83-ad2e-4ea4-a393-0ded9fd3dfe2-additional-scripts\") pod \"ovn-controller-drng5-config-m5qbn\" (UID: \"4e15ab83-ad2e-4ea4-a393-0ded9fd3dfe2\") " pod="openstack/ovn-controller-drng5-config-m5qbn"
Feb 26 16:03:21 crc kubenswrapper[4907]: I0226 16:03:21.200979 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/4e15ab83-ad2e-4ea4-a393-0ded9fd3dfe2-var-run-ovn\") pod \"ovn-controller-drng5-config-m5qbn\" (UID: \"4e15ab83-ad2e-4ea4-a393-0ded9fd3dfe2\") " pod="openstack/ovn-controller-drng5-config-m5qbn"
Feb 26 16:03:21 crc kubenswrapper[4907]: I0226 16:03:21.201206 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/4e15ab83-ad2e-4ea4-a393-0ded9fd3dfe2-var-run\") pod \"ovn-controller-drng5-config-m5qbn\" (UID: \"4e15ab83-ad2e-4ea4-a393-0ded9fd3dfe2\") " pod="openstack/ovn-controller-drng5-config-m5qbn"
Feb 26 16:03:21 crc kubenswrapper[4907]: I0226 16:03:21.201364 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/4e15ab83-ad2e-4ea4-a393-0ded9fd3dfe2-var-log-ovn\") pod \"ovn-controller-drng5-config-m5qbn\" (UID: \"4e15ab83-ad2e-4ea4-a393-0ded9fd3dfe2\") " pod="openstack/ovn-controller-drng5-config-m5qbn"
Feb 26 16:03:21 crc kubenswrapper[4907]: I0226 16:03:21.201541 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/4e15ab83-ad2e-4ea4-a393-0ded9fd3dfe2-scripts\") pod \"ovn-controller-drng5-config-m5qbn\" (UID: \"4e15ab83-ad2e-4ea4-a393-0ded9fd3dfe2\") " pod="openstack/ovn-controller-drng5-config-m5qbn"
Feb 26 16:03:21 crc kubenswrapper[4907]: I0226 16:03:21.201869 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-x97lp\" (UniqueName: \"kubernetes.io/projected/4e15ab83-ad2e-4ea4-a393-0ded9fd3dfe2-kube-api-access-x97lp\") pod \"ovn-controller-drng5-config-m5qbn\" (UID: \"4e15ab83-ad2e-4ea4-a393-0ded9fd3dfe2\") " pod="openstack/ovn-controller-drng5-config-m5qbn"
Feb 26 16:03:21 crc kubenswrapper[4907]: I0226 16:03:21.203359 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/4e15ab83-ad2e-4ea4-a393-0ded9fd3dfe2-additional-scripts\") pod \"ovn-controller-drng5-config-m5qbn\" (UID: \"4e15ab83-ad2e-4ea4-a393-0ded9fd3dfe2\") " pod="openstack/ovn-controller-drng5-config-m5qbn"
Feb 26 16:03:21 crc kubenswrapper[4907]: I0226 16:03:21.203425 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/4e15ab83-ad2e-4ea4-a393-0ded9fd3dfe2-var-run\") pod \"ovn-controller-drng5-config-m5qbn\" (UID: \"4e15ab83-ad2e-4ea4-a393-0ded9fd3dfe2\") " pod="openstack/ovn-controller-drng5-config-m5qbn"
Feb 26 16:03:21 crc kubenswrapper[4907]: I0226 16:03:21.203938 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/4e15ab83-ad2e-4ea4-a393-0ded9fd3dfe2-var-log-ovn\") pod \"ovn-controller-drng5-config-m5qbn\" (UID: \"4e15ab83-ad2e-4ea4-a393-0ded9fd3dfe2\") " pod="openstack/ovn-controller-drng5-config-m5qbn"
Feb 26 16:03:21 crc kubenswrapper[4907]: I0226 16:03:21.214162 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/4e15ab83-ad2e-4ea4-a393-0ded9fd3dfe2-scripts\") pod \"ovn-controller-drng5-config-m5qbn\" (UID: \"4e15ab83-ad2e-4ea4-a393-0ded9fd3dfe2\") " pod="openstack/ovn-controller-drng5-config-m5qbn"
Feb 26 16:03:21 crc kubenswrapper[4907]: I0226 16:03:21.246713 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-x97lp\" (UniqueName: \"kubernetes.io/projected/4e15ab83-ad2e-4ea4-a393-0ded9fd3dfe2-kube-api-access-x97lp\") pod \"ovn-controller-drng5-config-m5qbn\" (UID: \"4e15ab83-ad2e-4ea4-a393-0ded9fd3dfe2\") " pod="openstack/ovn-controller-drng5-config-m5qbn"
Feb 26 16:03:21 crc kubenswrapper[4907]: I0226 16:03:21.434990 4907 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-drng5-config-m5qbn"
Feb 26 16:03:21 crc kubenswrapper[4907]: I0226 16:03:21.507858 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-ring-rebalance-zj4xn" event={"ID":"c5f9c74c-c90c-40ba-9548-dc79f90592a4","Type":"ContainerDied","Data":"22e11871b81b84490819a0fa5a32ddce14c902572365fe8cce2c4b5a9204aead"}
Feb 26 16:03:21 crc kubenswrapper[4907]: I0226 16:03:21.508241 4907 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="22e11871b81b84490819a0fa5a32ddce14c902572365fe8cce2c4b5a9204aead"
Feb 26 16:03:21 crc kubenswrapper[4907]: I0226 16:03:21.507950 4907 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/swift-ring-rebalance-zj4xn"
Feb 26 16:03:21 crc kubenswrapper[4907]: I0226 16:03:21.718186 4907 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-drng5-config-m5qbn"]
Feb 26 16:03:21 crc kubenswrapper[4907]: W0226 16:03:21.719477 4907 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod4e15ab83_ad2e_4ea4_a393_0ded9fd3dfe2.slice/crio-c9b90c0cde1fc88964ae356de886d7a02df68dcaa6fd71a12a84301a0f619c0c WatchSource:0}: Error finding container c9b90c0cde1fc88964ae356de886d7a02df68dcaa6fd71a12a84301a0f619c0c: Status 404 returned error can't find the container with id c9b90c0cde1fc88964ae356de886d7a02df68dcaa6fd71a12a84301a0f619c0c
Feb 26 16:03:22 crc kubenswrapper[4907]: I0226 16:03:22.135951 4907 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="74e637b6-f732-4034-9f5b-27c56756e70d" path="/var/lib/kubelet/pods/74e637b6-f732-4034-9f5b-27c56756e70d/volumes"
Feb 26 16:03:22 crc kubenswrapper[4907]: I0226 16:03:22.158430 4907 pod_container_manager_linux.go:210] "Failed to delete cgroup paths" cgroupName=["kubepods","besteffort","podff2c92c6-ced4-4d3e-91c3-7745376793eb"] err="unable to destroy cgroup paths for cgroup [kubepods besteffort podff2c92c6-ced4-4d3e-91c3-7745376793eb] : Timed out while waiting for systemd to remove kubepods-besteffort-podff2c92c6_ced4_4d3e_91c3_7745376793eb.slice"
Feb 26 16:03:22 crc kubenswrapper[4907]: E0226 16:03:22.158487 4907 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to delete cgroup paths for [kubepods besteffort podff2c92c6-ced4-4d3e-91c3-7745376793eb] : unable to destroy cgroup paths for cgroup [kubepods besteffort podff2c92c6-ced4-4d3e-91c3-7745376793eb] : Timed out while waiting for systemd to remove kubepods-besteffort-podff2c92c6_ced4_4d3e_91c3_7745376793eb.slice" pod="openstack/dnsmasq-dns-5bf47b49b7-sfq9z" podUID="ff2c92c6-ced4-4d3e-91c3-7745376793eb"
Feb 26 16:03:22 crc kubenswrapper[4907]: I0226 16:03:22.531204 4907 generic.go:334] "Generic (PLEG): container finished" podID="96ba881c-449c-4300-b67f-8a1e952af508" containerID="e28a3b8c761243a769d04d190d2ae365641bcbb802321434379486662fc95053" exitCode=0
Feb 26 16:03:22 crc kubenswrapper[4907]: I0226 16:03:22.531298 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"96ba881c-449c-4300-b67f-8a1e952af508","Type":"ContainerDied","Data":"e28a3b8c761243a769d04d190d2ae365641bcbb802321434379486662fc95053"}
Feb 26 16:03:22 crc kubenswrapper[4907]: I0226 16:03:22.535978 4907 generic.go:334] "Generic (PLEG): container finished" podID="cca4ff23-cabb-466c-80a0-dbcc1f005123" containerID="ebbb6bfe7182e9cd90b17c87be7c4962d1f1d25ab0a2722c9407b04029ac9d77" exitCode=0
Feb 26 16:03:22 crc kubenswrapper[4907]: I0226 16:03:22.536107 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"cca4ff23-cabb-466c-80a0-dbcc1f005123","Type":"ContainerDied","Data":"ebbb6bfe7182e9cd90b17c87be7c4962d1f1d25ab0a2722c9407b04029ac9d77"}
Feb 26 16:03:22 crc kubenswrapper[4907]: I0226 16:03:22.539867 4907 generic.go:334] "Generic (PLEG): container finished"
podID="4e15ab83-ad2e-4ea4-a393-0ded9fd3dfe2" containerID="0a5d6f60f71c3e5324ad26a0af022cce4cf805448af7ede8f99bf4081a825aaa" exitCode=0 Feb 26 16:03:22 crc kubenswrapper[4907]: I0226 16:03:22.539944 4907 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-5bf47b49b7-sfq9z" Feb 26 16:03:22 crc kubenswrapper[4907]: I0226 16:03:22.539975 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-drng5-config-m5qbn" event={"ID":"4e15ab83-ad2e-4ea4-a393-0ded9fd3dfe2","Type":"ContainerDied","Data":"0a5d6f60f71c3e5324ad26a0af022cce4cf805448af7ede8f99bf4081a825aaa"} Feb 26 16:03:22 crc kubenswrapper[4907]: I0226 16:03:22.540001 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-drng5-config-m5qbn" event={"ID":"4e15ab83-ad2e-4ea4-a393-0ded9fd3dfe2","Type":"ContainerStarted","Data":"c9b90c0cde1fc88964ae356de886d7a02df68dcaa6fd71a12a84301a0f619c0c"} Feb 26 16:03:22 crc kubenswrapper[4907]: I0226 16:03:22.700963 4907 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-5bf47b49b7-sfq9z"] Feb 26 16:03:22 crc kubenswrapper[4907]: I0226 16:03:22.711419 4907 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-5bf47b49b7-sfq9z"] Feb 26 16:03:23 crc kubenswrapper[4907]: I0226 16:03:23.551648 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"cca4ff23-cabb-466c-80a0-dbcc1f005123","Type":"ContainerStarted","Data":"e0d0ba16cf9a0991250fbe1f4375f1fae0b7663840e5aacdc81be3fdd3afd8e3"} Feb 26 16:03:23 crc kubenswrapper[4907]: I0226 16:03:23.551899 4907 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/rabbitmq-cell1-server-0" Feb 26 16:03:23 crc kubenswrapper[4907]: I0226 16:03:23.560002 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" 
event={"ID":"96ba881c-449c-4300-b67f-8a1e952af508","Type":"ContainerStarted","Data":"1acc9ffabff45e7a23fbd242599d1693e106acd3557c6aa619db091bb41fc243"} Feb 26 16:03:23 crc kubenswrapper[4907]: I0226 16:03:23.560233 4907 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/rabbitmq-server-0" Feb 26 16:03:23 crc kubenswrapper[4907]: I0226 16:03:23.598491 4907 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/rabbitmq-cell1-server-0" podStartSLOduration=37.632652319 podStartE2EDuration="1m22.598467201s" podCreationTimestamp="2026-02-26 16:02:01 +0000 UTC" firstStartedPulling="2026-02-26 16:02:03.562555236 +0000 UTC m=+1186.081117086" lastFinishedPulling="2026-02-26 16:02:48.528370119 +0000 UTC m=+1231.046931968" observedRunningTime="2026-02-26 16:03:23.591939763 +0000 UTC m=+1266.110501612" watchObservedRunningTime="2026-02-26 16:03:23.598467201 +0000 UTC m=+1266.117029060" Feb 26 16:03:23 crc kubenswrapper[4907]: I0226 16:03:23.638209 4907 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/rabbitmq-server-0" podStartSLOduration=37.381823626 podStartE2EDuration="1m22.638186746s" podCreationTimestamp="2026-02-26 16:02:01 +0000 UTC" firstStartedPulling="2026-02-26 16:02:03.383255251 +0000 UTC m=+1185.901817100" lastFinishedPulling="2026-02-26 16:02:48.639618371 +0000 UTC m=+1231.158180220" observedRunningTime="2026-02-26 16:03:23.633886052 +0000 UTC m=+1266.152447921" watchObservedRunningTime="2026-02-26 16:03:23.638186746 +0000 UTC m=+1266.156748595" Feb 26 16:03:24 crc kubenswrapper[4907]: I0226 16:03:24.137953 4907 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ff2c92c6-ced4-4d3e-91c3-7745376793eb" path="/var/lib/kubelet/pods/ff2c92c6-ced4-4d3e-91c3-7745376793eb/volumes" Feb 26 16:03:24 crc kubenswrapper[4907]: I0226 16:03:24.138486 4907 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/root-account-create-update-ck6c5"] Feb 26 16:03:24 crc 
kubenswrapper[4907]: I0226 16:03:24.139434 4907 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/root-account-create-update-ck6c5" Feb 26 16:03:24 crc kubenswrapper[4907]: I0226 16:03:24.140880 4907 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-cell1-mariadb-root-db-secret" Feb 26 16:03:24 crc kubenswrapper[4907]: I0226 16:03:24.155769 4907 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/root-account-create-update-ck6c5"] Feb 26 16:03:24 crc kubenswrapper[4907]: I0226 16:03:24.262565 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-f69ph\" (UniqueName: \"kubernetes.io/projected/014002bc-6d56-41a9-969b-b6607aa0aa71-kube-api-access-f69ph\") pod \"root-account-create-update-ck6c5\" (UID: \"014002bc-6d56-41a9-969b-b6607aa0aa71\") " pod="openstack/root-account-create-update-ck6c5" Feb 26 16:03:24 crc kubenswrapper[4907]: I0226 16:03:24.262721 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/014002bc-6d56-41a9-969b-b6607aa0aa71-operator-scripts\") pod \"root-account-create-update-ck6c5\" (UID: \"014002bc-6d56-41a9-969b-b6607aa0aa71\") " pod="openstack/root-account-create-update-ck6c5" Feb 26 16:03:24 crc kubenswrapper[4907]: I0226 16:03:24.364005 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-f69ph\" (UniqueName: \"kubernetes.io/projected/014002bc-6d56-41a9-969b-b6607aa0aa71-kube-api-access-f69ph\") pod \"root-account-create-update-ck6c5\" (UID: \"014002bc-6d56-41a9-969b-b6607aa0aa71\") " pod="openstack/root-account-create-update-ck6c5" Feb 26 16:03:24 crc kubenswrapper[4907]: I0226 16:03:24.364173 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: 
\"kubernetes.io/configmap/014002bc-6d56-41a9-969b-b6607aa0aa71-operator-scripts\") pod \"root-account-create-update-ck6c5\" (UID: \"014002bc-6d56-41a9-969b-b6607aa0aa71\") " pod="openstack/root-account-create-update-ck6c5" Feb 26 16:03:24 crc kubenswrapper[4907]: I0226 16:03:24.364874 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/014002bc-6d56-41a9-969b-b6607aa0aa71-operator-scripts\") pod \"root-account-create-update-ck6c5\" (UID: \"014002bc-6d56-41a9-969b-b6607aa0aa71\") " pod="openstack/root-account-create-update-ck6c5" Feb 26 16:03:24 crc kubenswrapper[4907]: I0226 16:03:24.389818 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-f69ph\" (UniqueName: \"kubernetes.io/projected/014002bc-6d56-41a9-969b-b6607aa0aa71-kube-api-access-f69ph\") pod \"root-account-create-update-ck6c5\" (UID: \"014002bc-6d56-41a9-969b-b6607aa0aa71\") " pod="openstack/root-account-create-update-ck6c5" Feb 26 16:03:24 crc kubenswrapper[4907]: I0226 16:03:24.456915 4907 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/root-account-create-update-ck6c5" Feb 26 16:03:31 crc kubenswrapper[4907]: I0226 16:03:31.533812 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/819c7fec-fd22-478a-bf6c-f4cb5aeccc59-etc-swift\") pod \"swift-storage-0\" (UID: \"819c7fec-fd22-478a-bf6c-f4cb5aeccc59\") " pod="openstack/swift-storage-0" Feb 26 16:03:31 crc kubenswrapper[4907]: I0226 16:03:31.540136 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/819c7fec-fd22-478a-bf6c-f4cb5aeccc59-etc-swift\") pod \"swift-storage-0\" (UID: \"819c7fec-fd22-478a-bf6c-f4cb5aeccc59\") " pod="openstack/swift-storage-0" Feb 26 16:03:31 crc kubenswrapper[4907]: I0226 16:03:31.658383 4907 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/swift-storage-0" Feb 26 16:03:32 crc kubenswrapper[4907]: I0226 16:03:32.639091 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-drng5-config-m5qbn" event={"ID":"4e15ab83-ad2e-4ea4-a393-0ded9fd3dfe2","Type":"ContainerDied","Data":"c9b90c0cde1fc88964ae356de886d7a02df68dcaa6fd71a12a84301a0f619c0c"} Feb 26 16:03:32 crc kubenswrapper[4907]: I0226 16:03:32.639134 4907 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="c9b90c0cde1fc88964ae356de886d7a02df68dcaa6fd71a12a84301a0f619c0c" Feb 26 16:03:32 crc kubenswrapper[4907]: I0226 16:03:32.720629 4907 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-controller-drng5-config-m5qbn" Feb 26 16:03:32 crc kubenswrapper[4907]: I0226 16:03:32.862484 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/4e15ab83-ad2e-4ea4-a393-0ded9fd3dfe2-var-run-ovn\") pod \"4e15ab83-ad2e-4ea4-a393-0ded9fd3dfe2\" (UID: \"4e15ab83-ad2e-4ea4-a393-0ded9fd3dfe2\") " Feb 26 16:03:32 crc kubenswrapper[4907]: I0226 16:03:32.862669 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/4e15ab83-ad2e-4ea4-a393-0ded9fd3dfe2-var-run-ovn" (OuterVolumeSpecName: "var-run-ovn") pod "4e15ab83-ad2e-4ea4-a393-0ded9fd3dfe2" (UID: "4e15ab83-ad2e-4ea4-a393-0ded9fd3dfe2"). InnerVolumeSpecName "var-run-ovn". PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 26 16:03:32 crc kubenswrapper[4907]: I0226 16:03:32.862773 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/4e15ab83-ad2e-4ea4-a393-0ded9fd3dfe2-var-run" (OuterVolumeSpecName: "var-run") pod "4e15ab83-ad2e-4ea4-a393-0ded9fd3dfe2" (UID: "4e15ab83-ad2e-4ea4-a393-0ded9fd3dfe2"). InnerVolumeSpecName "var-run". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 26 16:03:32 crc kubenswrapper[4907]: I0226 16:03:32.862599 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/4e15ab83-ad2e-4ea4-a393-0ded9fd3dfe2-var-run\") pod \"4e15ab83-ad2e-4ea4-a393-0ded9fd3dfe2\" (UID: \"4e15ab83-ad2e-4ea4-a393-0ded9fd3dfe2\") " Feb 26 16:03:32 crc kubenswrapper[4907]: I0226 16:03:32.862844 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/4e15ab83-ad2e-4ea4-a393-0ded9fd3dfe2-var-log-ovn\") pod \"4e15ab83-ad2e-4ea4-a393-0ded9fd3dfe2\" (UID: \"4e15ab83-ad2e-4ea4-a393-0ded9fd3dfe2\") " Feb 26 16:03:32 crc kubenswrapper[4907]: I0226 16:03:32.862982 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-x97lp\" (UniqueName: \"kubernetes.io/projected/4e15ab83-ad2e-4ea4-a393-0ded9fd3dfe2-kube-api-access-x97lp\") pod \"4e15ab83-ad2e-4ea4-a393-0ded9fd3dfe2\" (UID: \"4e15ab83-ad2e-4ea4-a393-0ded9fd3dfe2\") " Feb 26 16:03:32 crc kubenswrapper[4907]: I0226 16:03:32.863023 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/4e15ab83-ad2e-4ea4-a393-0ded9fd3dfe2-additional-scripts\") pod \"4e15ab83-ad2e-4ea4-a393-0ded9fd3dfe2\" (UID: \"4e15ab83-ad2e-4ea4-a393-0ded9fd3dfe2\") " Feb 26 16:03:32 crc kubenswrapper[4907]: I0226 16:03:32.863271 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/4e15ab83-ad2e-4ea4-a393-0ded9fd3dfe2-scripts\") pod \"4e15ab83-ad2e-4ea4-a393-0ded9fd3dfe2\" (UID: \"4e15ab83-ad2e-4ea4-a393-0ded9fd3dfe2\") " Feb 26 16:03:32 crc kubenswrapper[4907]: I0226 16:03:32.863800 4907 reconciler_common.go:293] "Volume detached for volume \"var-run-ovn\" (UniqueName: 
\"kubernetes.io/host-path/4e15ab83-ad2e-4ea4-a393-0ded9fd3dfe2-var-run-ovn\") on node \"crc\" DevicePath \"\"" Feb 26 16:03:32 crc kubenswrapper[4907]: I0226 16:03:32.863820 4907 reconciler_common.go:293] "Volume detached for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/4e15ab83-ad2e-4ea4-a393-0ded9fd3dfe2-var-run\") on node \"crc\" DevicePath \"\"" Feb 26 16:03:32 crc kubenswrapper[4907]: I0226 16:03:32.864964 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4e15ab83-ad2e-4ea4-a393-0ded9fd3dfe2-scripts" (OuterVolumeSpecName: "scripts") pod "4e15ab83-ad2e-4ea4-a393-0ded9fd3dfe2" (UID: "4e15ab83-ad2e-4ea4-a393-0ded9fd3dfe2"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 26 16:03:32 crc kubenswrapper[4907]: I0226 16:03:32.864997 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/4e15ab83-ad2e-4ea4-a393-0ded9fd3dfe2-var-log-ovn" (OuterVolumeSpecName: "var-log-ovn") pod "4e15ab83-ad2e-4ea4-a393-0ded9fd3dfe2" (UID: "4e15ab83-ad2e-4ea4-a393-0ded9fd3dfe2"). InnerVolumeSpecName "var-log-ovn". PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 26 16:03:32 crc kubenswrapper[4907]: I0226 16:03:32.866022 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4e15ab83-ad2e-4ea4-a393-0ded9fd3dfe2-additional-scripts" (OuterVolumeSpecName: "additional-scripts") pod "4e15ab83-ad2e-4ea4-a393-0ded9fd3dfe2" (UID: "4e15ab83-ad2e-4ea4-a393-0ded9fd3dfe2"). InnerVolumeSpecName "additional-scripts". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 26 16:03:32 crc kubenswrapper[4907]: I0226 16:03:32.879394 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4e15ab83-ad2e-4ea4-a393-0ded9fd3dfe2-kube-api-access-x97lp" (OuterVolumeSpecName: "kube-api-access-x97lp") pod "4e15ab83-ad2e-4ea4-a393-0ded9fd3dfe2" (UID: "4e15ab83-ad2e-4ea4-a393-0ded9fd3dfe2"). InnerVolumeSpecName "kube-api-access-x97lp". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 26 16:03:32 crc kubenswrapper[4907]: I0226 16:03:32.966075 4907 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-x97lp\" (UniqueName: \"kubernetes.io/projected/4e15ab83-ad2e-4ea4-a393-0ded9fd3dfe2-kube-api-access-x97lp\") on node \"crc\" DevicePath \"\"" Feb 26 16:03:32 crc kubenswrapper[4907]: I0226 16:03:32.966115 4907 reconciler_common.go:293] "Volume detached for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/4e15ab83-ad2e-4ea4-a393-0ded9fd3dfe2-additional-scripts\") on node \"crc\" DevicePath \"\"" Feb 26 16:03:32 crc kubenswrapper[4907]: I0226 16:03:32.966126 4907 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/4e15ab83-ad2e-4ea4-a393-0ded9fd3dfe2-scripts\") on node \"crc\" DevicePath \"\"" Feb 26 16:03:32 crc kubenswrapper[4907]: I0226 16:03:32.966147 4907 reconciler_common.go:293] "Volume detached for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/4e15ab83-ad2e-4ea4-a393-0ded9fd3dfe2-var-log-ovn\") on node \"crc\" DevicePath \"\"" Feb 26 16:03:33 crc kubenswrapper[4907]: I0226 16:03:33.232669 4907 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/swift-storage-0"] Feb 26 16:03:33 crc kubenswrapper[4907]: I0226 16:03:33.267536 4907 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/root-account-create-update-ck6c5"] Feb 26 16:03:33 crc kubenswrapper[4907]: W0226 16:03:33.277879 4907 manager.go:1169] 
Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod014002bc_6d56_41a9_969b_b6607aa0aa71.slice/crio-ed8be51cc1e0bb385b4253bc1e11c90fa72eb99de813239ca8fa38cc2d9d842b WatchSource:0}: Error finding container ed8be51cc1e0bb385b4253bc1e11c90fa72eb99de813239ca8fa38cc2d9d842b: Status 404 returned error can't find the container with id ed8be51cc1e0bb385b4253bc1e11c90fa72eb99de813239ca8fa38cc2d9d842b Feb 26 16:03:33 crc kubenswrapper[4907]: I0226 16:03:33.647261 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/root-account-create-update-ck6c5" event={"ID":"014002bc-6d56-41a9-969b-b6607aa0aa71","Type":"ContainerStarted","Data":"08cabaedf40630d23f28757a7e22a68606ac3ac825a025914a7d4c3be249cba6"} Feb 26 16:03:33 crc kubenswrapper[4907]: I0226 16:03:33.647307 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/root-account-create-update-ck6c5" event={"ID":"014002bc-6d56-41a9-969b-b6607aa0aa71","Type":"ContainerStarted","Data":"ed8be51cc1e0bb385b4253bc1e11c90fa72eb99de813239ca8fa38cc2d9d842b"} Feb 26 16:03:33 crc kubenswrapper[4907]: I0226 16:03:33.650922 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-sync-hdzvj" event={"ID":"2395dfd1-7840-4703-a1c9-37c6eff664bd","Type":"ContainerStarted","Data":"63496136caf0de20beb55d60c8d05550ef8b6597390d822d2b0105cdf73169db"} Feb 26 16:03:33 crc kubenswrapper[4907]: I0226 16:03:33.651949 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"819c7fec-fd22-478a-bf6c-f4cb5aeccc59","Type":"ContainerStarted","Data":"9e63bd63f9a1d871e75ee2c2868e5b84152cbde4c4a0df84bed2b6da16b56e00"} Feb 26 16:03:33 crc kubenswrapper[4907]: I0226 16:03:33.651987 4907 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-controller-drng5-config-m5qbn" Feb 26 16:03:33 crc kubenswrapper[4907]: I0226 16:03:33.686131 4907 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/root-account-create-update-ck6c5" podStartSLOduration=9.686115643 podStartE2EDuration="9.686115643s" podCreationTimestamp="2026-02-26 16:03:24 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-26 16:03:33.668999107 +0000 UTC m=+1276.187560966" watchObservedRunningTime="2026-02-26 16:03:33.686115643 +0000 UTC m=+1276.204677492" Feb 26 16:03:33 crc kubenswrapper[4907]: I0226 16:03:33.688385 4907 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-db-sync-hdzvj" podStartSLOduration=2.622358251 podStartE2EDuration="17.688377448s" podCreationTimestamp="2026-02-26 16:03:16 +0000 UTC" firstStartedPulling="2026-02-26 16:03:17.624942437 +0000 UTC m=+1260.143504286" lastFinishedPulling="2026-02-26 16:03:32.690961634 +0000 UTC m=+1275.209523483" observedRunningTime="2026-02-26 16:03:33.685136199 +0000 UTC m=+1276.203698058" watchObservedRunningTime="2026-02-26 16:03:33.688377448 +0000 UTC m=+1276.206939297" Feb 26 16:03:33 crc kubenswrapper[4907]: I0226 16:03:33.808284 4907 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ovn-controller-drng5-config-m5qbn"] Feb 26 16:03:33 crc kubenswrapper[4907]: I0226 16:03:33.815748 4907 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ovn-controller-drng5-config-m5qbn"] Feb 26 16:03:34 crc kubenswrapper[4907]: I0226 16:03:34.139166 4907 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4e15ab83-ad2e-4ea4-a393-0ded9fd3dfe2" path="/var/lib/kubelet/pods/4e15ab83-ad2e-4ea4-a393-0ded9fd3dfe2/volumes" Feb 26 16:03:34 crc kubenswrapper[4907]: I0226 16:03:34.661253 4907 generic.go:334] "Generic (PLEG): container finished" 
podID="014002bc-6d56-41a9-969b-b6607aa0aa71" containerID="08cabaedf40630d23f28757a7e22a68606ac3ac825a025914a7d4c3be249cba6" exitCode=0 Feb 26 16:03:34 crc kubenswrapper[4907]: I0226 16:03:34.661385 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/root-account-create-update-ck6c5" event={"ID":"014002bc-6d56-41a9-969b-b6607aa0aa71","Type":"ContainerDied","Data":"08cabaedf40630d23f28757a7e22a68606ac3ac825a025914a7d4c3be249cba6"} Feb 26 16:03:36 crc kubenswrapper[4907]: I0226 16:03:36.219879 4907 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/root-account-create-update-ck6c5" Feb 26 16:03:36 crc kubenswrapper[4907]: I0226 16:03:36.323247 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-f69ph\" (UniqueName: \"kubernetes.io/projected/014002bc-6d56-41a9-969b-b6607aa0aa71-kube-api-access-f69ph\") pod \"014002bc-6d56-41a9-969b-b6607aa0aa71\" (UID: \"014002bc-6d56-41a9-969b-b6607aa0aa71\") " Feb 26 16:03:36 crc kubenswrapper[4907]: I0226 16:03:36.323490 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/014002bc-6d56-41a9-969b-b6607aa0aa71-operator-scripts\") pod \"014002bc-6d56-41a9-969b-b6607aa0aa71\" (UID: \"014002bc-6d56-41a9-969b-b6607aa0aa71\") " Feb 26 16:03:36 crc kubenswrapper[4907]: I0226 16:03:36.324614 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/014002bc-6d56-41a9-969b-b6607aa0aa71-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "014002bc-6d56-41a9-969b-b6607aa0aa71" (UID: "014002bc-6d56-41a9-969b-b6607aa0aa71"). InnerVolumeSpecName "operator-scripts". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 26 16:03:36 crc kubenswrapper[4907]: I0226 16:03:36.329841 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/014002bc-6d56-41a9-969b-b6607aa0aa71-kube-api-access-f69ph" (OuterVolumeSpecName: "kube-api-access-f69ph") pod "014002bc-6d56-41a9-969b-b6607aa0aa71" (UID: "014002bc-6d56-41a9-969b-b6607aa0aa71"). InnerVolumeSpecName "kube-api-access-f69ph". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 26 16:03:36 crc kubenswrapper[4907]: I0226 16:03:36.425508 4907 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/014002bc-6d56-41a9-969b-b6607aa0aa71-operator-scripts\") on node \"crc\" DevicePath \"\"" Feb 26 16:03:36 crc kubenswrapper[4907]: I0226 16:03:36.425570 4907 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-f69ph\" (UniqueName: \"kubernetes.io/projected/014002bc-6d56-41a9-969b-b6607aa0aa71-kube-api-access-f69ph\") on node \"crc\" DevicePath \"\"" Feb 26 16:03:36 crc kubenswrapper[4907]: I0226 16:03:36.698582 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/root-account-create-update-ck6c5" event={"ID":"014002bc-6d56-41a9-969b-b6607aa0aa71","Type":"ContainerDied","Data":"ed8be51cc1e0bb385b4253bc1e11c90fa72eb99de813239ca8fa38cc2d9d842b"} Feb 26 16:03:36 crc kubenswrapper[4907]: I0226 16:03:36.699072 4907 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="ed8be51cc1e0bb385b4253bc1e11c90fa72eb99de813239ca8fa38cc2d9d842b" Feb 26 16:03:36 crc kubenswrapper[4907]: I0226 16:03:36.698582 4907 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/root-account-create-update-ck6c5" Feb 26 16:03:36 crc kubenswrapper[4907]: I0226 16:03:36.702577 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"819c7fec-fd22-478a-bf6c-f4cb5aeccc59","Type":"ContainerStarted","Data":"8176a87aad2c181f5c976381c220ed01bf9c0a48ab16c10494df3dc235b00804"} Feb 26 16:03:36 crc kubenswrapper[4907]: I0226 16:03:36.702630 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"819c7fec-fd22-478a-bf6c-f4cb5aeccc59","Type":"ContainerStarted","Data":"9e95f0c91690dfd157cf8b58eab925cf818c9b6d7b0df2c06b3a94b632dda0ee"} Feb 26 16:03:37 crc kubenswrapper[4907]: I0226 16:03:37.712437 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"819c7fec-fd22-478a-bf6c-f4cb5aeccc59","Type":"ContainerStarted","Data":"8fb01484b9e08bdb67608f9e086ce296fe862a6763c31c4c6ef2087818bb44fb"} Feb 26 16:03:38 crc kubenswrapper[4907]: I0226 16:03:38.725038 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"819c7fec-fd22-478a-bf6c-f4cb5aeccc59","Type":"ContainerStarted","Data":"54bbddffbc2cc7850342cb28ae480ef1c9c6d04ebae9297b46e10d77ffe52a70"} Feb 26 16:03:40 crc kubenswrapper[4907]: I0226 16:03:40.745059 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"819c7fec-fd22-478a-bf6c-f4cb5aeccc59","Type":"ContainerStarted","Data":"efcc3940626e4efa5b8949eb429e04388980188fbdc7e0baadab19780261d159"} Feb 26 16:03:40 crc kubenswrapper[4907]: I0226 16:03:40.745374 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"819c7fec-fd22-478a-bf6c-f4cb5aeccc59","Type":"ContainerStarted","Data":"2b81e16776f2f838416ad5bb382cb24f734b6fb7c30eeb547e8e360733c4d5e1"} Feb 26 16:03:40 crc kubenswrapper[4907]: I0226 16:03:40.745388 4907 kubelet.go:2453] "SyncLoop (PLEG): event for 
pod" pod="openstack/swift-storage-0" event={"ID":"819c7fec-fd22-478a-bf6c-f4cb5aeccc59","Type":"ContainerStarted","Data":"3e2a4ec55f34988cc2a72954dbeb8177ea155b738a78f9602111e2f4739063f0"} Feb 26 16:03:40 crc kubenswrapper[4907]: I0226 16:03:40.745398 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"819c7fec-fd22-478a-bf6c-f4cb5aeccc59","Type":"ContainerStarted","Data":"a64e81d86f61d708667e683d9e79b11f55126abc35f33e190923d49a55e8b293"} Feb 26 16:03:42 crc kubenswrapper[4907]: I0226 16:03:42.538844 4907 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/rabbitmq-server-0" Feb 26 16:03:42 crc kubenswrapper[4907]: I0226 16:03:42.778410 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"819c7fec-fd22-478a-bf6c-f4cb5aeccc59","Type":"ContainerStarted","Data":"d9906ddf79248c208191b302111d354fb132c721a4dd3c3cbd05353b50e8e8b7"} Feb 26 16:03:42 crc kubenswrapper[4907]: I0226 16:03:42.778461 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"819c7fec-fd22-478a-bf6c-f4cb5aeccc59","Type":"ContainerStarted","Data":"b91fba2c43a65f4ddace12b159b86e97ede1b13616ffb26eb3a0e441a16ca5c6"} Feb 26 16:03:42 crc kubenswrapper[4907]: I0226 16:03:42.891365 4907 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/rabbitmq-cell1-server-0" Feb 26 16:03:42 crc kubenswrapper[4907]: I0226 16:03:42.945794 4907 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-db-create-gcc4z"] Feb 26 16:03:42 crc kubenswrapper[4907]: E0226 16:03:42.946148 4907 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4e15ab83-ad2e-4ea4-a393-0ded9fd3dfe2" containerName="ovn-config" Feb 26 16:03:42 crc kubenswrapper[4907]: I0226 16:03:42.946167 4907 state_mem.go:107] "Deleted CPUSet assignment" podUID="4e15ab83-ad2e-4ea4-a393-0ded9fd3dfe2" containerName="ovn-config" Feb 26 
16:03:42 crc kubenswrapper[4907]: E0226 16:03:42.946177 4907 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="014002bc-6d56-41a9-969b-b6607aa0aa71" containerName="mariadb-account-create-update" Feb 26 16:03:42 crc kubenswrapper[4907]: I0226 16:03:42.946184 4907 state_mem.go:107] "Deleted CPUSet assignment" podUID="014002bc-6d56-41a9-969b-b6607aa0aa71" containerName="mariadb-account-create-update" Feb 26 16:03:42 crc kubenswrapper[4907]: I0226 16:03:42.946318 4907 memory_manager.go:354] "RemoveStaleState removing state" podUID="4e15ab83-ad2e-4ea4-a393-0ded9fd3dfe2" containerName="ovn-config" Feb 26 16:03:42 crc kubenswrapper[4907]: I0226 16:03:42.946336 4907 memory_manager.go:354] "RemoveStaleState removing state" podUID="014002bc-6d56-41a9-969b-b6607aa0aa71" containerName="mariadb-account-create-update" Feb 26 16:03:42 crc kubenswrapper[4907]: I0226 16:03:42.946847 4907 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-db-create-gcc4z" Feb 26 16:03:42 crc kubenswrapper[4907]: I0226 16:03:42.983015 4907 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-db-create-gcc4z"] Feb 26 16:03:43 crc kubenswrapper[4907]: I0226 16:03:43.016721 4907 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-0936-account-create-update-zlgpv"] Feb 26 16:03:43 crc kubenswrapper[4907]: I0226 16:03:43.018707 4907 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-0936-account-create-update-zlgpv" Feb 26 16:03:43 crc kubenswrapper[4907]: I0226 16:03:43.037638 4907 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-db-secret" Feb 26 16:03:43 crc kubenswrapper[4907]: I0226 16:03:43.047032 4907 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-0936-account-create-update-zlgpv"] Feb 26 16:03:43 crc kubenswrapper[4907]: I0226 16:03:43.072394 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8mkf4\" (UniqueName: \"kubernetes.io/projected/7102918b-1c33-4b66-9767-fcf854b0f666-kube-api-access-8mkf4\") pod \"cinder-db-create-gcc4z\" (UID: \"7102918b-1c33-4b66-9767-fcf854b0f666\") " pod="openstack/cinder-db-create-gcc4z" Feb 26 16:03:43 crc kubenswrapper[4907]: I0226 16:03:43.073015 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/7102918b-1c33-4b66-9767-fcf854b0f666-operator-scripts\") pod \"cinder-db-create-gcc4z\" (UID: \"7102918b-1c33-4b66-9767-fcf854b0f666\") " pod="openstack/cinder-db-create-gcc4z" Feb 26 16:03:43 crc kubenswrapper[4907]: I0226 16:03:43.174923 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/f15ac55d-0e4b-46d0-9f5d-4e0e9b86e8fd-operator-scripts\") pod \"cinder-0936-account-create-update-zlgpv\" (UID: \"f15ac55d-0e4b-46d0-9f5d-4e0e9b86e8fd\") " pod="openstack/cinder-0936-account-create-update-zlgpv" Feb 26 16:03:43 crc kubenswrapper[4907]: I0226 16:03:43.175001 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8mkf4\" (UniqueName: \"kubernetes.io/projected/7102918b-1c33-4b66-9767-fcf854b0f666-kube-api-access-8mkf4\") pod \"cinder-db-create-gcc4z\" (UID: 
\"7102918b-1c33-4b66-9767-fcf854b0f666\") " pod="openstack/cinder-db-create-gcc4z" Feb 26 16:03:43 crc kubenswrapper[4907]: I0226 16:03:43.175044 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/7102918b-1c33-4b66-9767-fcf854b0f666-operator-scripts\") pod \"cinder-db-create-gcc4z\" (UID: \"7102918b-1c33-4b66-9767-fcf854b0f666\") " pod="openstack/cinder-db-create-gcc4z" Feb 26 16:03:43 crc kubenswrapper[4907]: I0226 16:03:43.175078 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4fmjh\" (UniqueName: \"kubernetes.io/projected/f15ac55d-0e4b-46d0-9f5d-4e0e9b86e8fd-kube-api-access-4fmjh\") pod \"cinder-0936-account-create-update-zlgpv\" (UID: \"f15ac55d-0e4b-46d0-9f5d-4e0e9b86e8fd\") " pod="openstack/cinder-0936-account-create-update-zlgpv" Feb 26 16:03:43 crc kubenswrapper[4907]: I0226 16:03:43.175712 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/7102918b-1c33-4b66-9767-fcf854b0f666-operator-scripts\") pod \"cinder-db-create-gcc4z\" (UID: \"7102918b-1c33-4b66-9767-fcf854b0f666\") " pod="openstack/cinder-db-create-gcc4z" Feb 26 16:03:43 crc kubenswrapper[4907]: I0226 16:03:43.193805 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8mkf4\" (UniqueName: \"kubernetes.io/projected/7102918b-1c33-4b66-9767-fcf854b0f666-kube-api-access-8mkf4\") pod \"cinder-db-create-gcc4z\" (UID: \"7102918b-1c33-4b66-9767-fcf854b0f666\") " pod="openstack/cinder-db-create-gcc4z" Feb 26 16:03:43 crc kubenswrapper[4907]: I0226 16:03:43.265895 4907 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-db-create-gcc4z" Feb 26 16:03:43 crc kubenswrapper[4907]: I0226 16:03:43.276545 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4fmjh\" (UniqueName: \"kubernetes.io/projected/f15ac55d-0e4b-46d0-9f5d-4e0e9b86e8fd-kube-api-access-4fmjh\") pod \"cinder-0936-account-create-update-zlgpv\" (UID: \"f15ac55d-0e4b-46d0-9f5d-4e0e9b86e8fd\") " pod="openstack/cinder-0936-account-create-update-zlgpv" Feb 26 16:03:43 crc kubenswrapper[4907]: I0226 16:03:43.276712 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/f15ac55d-0e4b-46d0-9f5d-4e0e9b86e8fd-operator-scripts\") pod \"cinder-0936-account-create-update-zlgpv\" (UID: \"f15ac55d-0e4b-46d0-9f5d-4e0e9b86e8fd\") " pod="openstack/cinder-0936-account-create-update-zlgpv" Feb 26 16:03:43 crc kubenswrapper[4907]: I0226 16:03:43.277384 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/f15ac55d-0e4b-46d0-9f5d-4e0e9b86e8fd-operator-scripts\") pod \"cinder-0936-account-create-update-zlgpv\" (UID: \"f15ac55d-0e4b-46d0-9f5d-4e0e9b86e8fd\") " pod="openstack/cinder-0936-account-create-update-zlgpv" Feb 26 16:03:43 crc kubenswrapper[4907]: I0226 16:03:43.304050 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4fmjh\" (UniqueName: \"kubernetes.io/projected/f15ac55d-0e4b-46d0-9f5d-4e0e9b86e8fd-kube-api-access-4fmjh\") pod \"cinder-0936-account-create-update-zlgpv\" (UID: \"f15ac55d-0e4b-46d0-9f5d-4e0e9b86e8fd\") " pod="openstack/cinder-0936-account-create-update-zlgpv" Feb 26 16:03:43 crc kubenswrapper[4907]: I0226 16:03:43.305146 4907 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-db-create-76hml"] Feb 26 16:03:43 crc kubenswrapper[4907]: I0226 16:03:43.307463 4907 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-db-create-76hml" Feb 26 16:03:43 crc kubenswrapper[4907]: I0226 16:03:43.378779 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-t72zl\" (UniqueName: \"kubernetes.io/projected/b004f31d-2432-403e-a862-a640cb1fe5ad-kube-api-access-t72zl\") pod \"barbican-db-create-76hml\" (UID: \"b004f31d-2432-403e-a862-a640cb1fe5ad\") " pod="openstack/barbican-db-create-76hml" Feb 26 16:03:43 crc kubenswrapper[4907]: I0226 16:03:43.379099 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/b004f31d-2432-403e-a862-a640cb1fe5ad-operator-scripts\") pod \"barbican-db-create-76hml\" (UID: \"b004f31d-2432-403e-a862-a640cb1fe5ad\") " pod="openstack/barbican-db-create-76hml" Feb 26 16:03:43 crc kubenswrapper[4907]: I0226 16:03:43.383152 4907 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-db-create-76hml"] Feb 26 16:03:43 crc kubenswrapper[4907]: I0226 16:03:43.388387 4907 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-0936-account-create-update-zlgpv" Feb 26 16:03:43 crc kubenswrapper[4907]: I0226 16:03:43.448547 4907 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-158b-account-create-update-7ht2q"] Feb 26 16:03:43 crc kubenswrapper[4907]: I0226 16:03:43.449771 4907 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-158b-account-create-update-7ht2q" Feb 26 16:03:43 crc kubenswrapper[4907]: I0226 16:03:43.461886 4907 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-db-secret" Feb 26 16:03:43 crc kubenswrapper[4907]: I0226 16:03:43.482520 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-t72zl\" (UniqueName: \"kubernetes.io/projected/b004f31d-2432-403e-a862-a640cb1fe5ad-kube-api-access-t72zl\") pod \"barbican-db-create-76hml\" (UID: \"b004f31d-2432-403e-a862-a640cb1fe5ad\") " pod="openstack/barbican-db-create-76hml" Feb 26 16:03:43 crc kubenswrapper[4907]: I0226 16:03:43.482562 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/5be03d75-755c-40f4-a2f2-db8f9e99b082-operator-scripts\") pod \"barbican-158b-account-create-update-7ht2q\" (UID: \"5be03d75-755c-40f4-a2f2-db8f9e99b082\") " pod="openstack/barbican-158b-account-create-update-7ht2q" Feb 26 16:03:43 crc kubenswrapper[4907]: I0226 16:03:43.488032 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gvt4c\" (UniqueName: \"kubernetes.io/projected/5be03d75-755c-40f4-a2f2-db8f9e99b082-kube-api-access-gvt4c\") pod \"barbican-158b-account-create-update-7ht2q\" (UID: \"5be03d75-755c-40f4-a2f2-db8f9e99b082\") " pod="openstack/barbican-158b-account-create-update-7ht2q" Feb 26 16:03:43 crc kubenswrapper[4907]: I0226 16:03:43.488110 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/b004f31d-2432-403e-a862-a640cb1fe5ad-operator-scripts\") pod \"barbican-db-create-76hml\" (UID: \"b004f31d-2432-403e-a862-a640cb1fe5ad\") " pod="openstack/barbican-db-create-76hml" Feb 26 16:03:43 crc kubenswrapper[4907]: I0226 16:03:43.495693 4907 kubelet.go:2428] 
"SyncLoop UPDATE" source="api" pods=["openstack/barbican-158b-account-create-update-7ht2q"] Feb 26 16:03:43 crc kubenswrapper[4907]: I0226 16:03:43.508487 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/b004f31d-2432-403e-a862-a640cb1fe5ad-operator-scripts\") pod \"barbican-db-create-76hml\" (UID: \"b004f31d-2432-403e-a862-a640cb1fe5ad\") " pod="openstack/barbican-db-create-76hml" Feb 26 16:03:43 crc kubenswrapper[4907]: I0226 16:03:43.561789 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-t72zl\" (UniqueName: \"kubernetes.io/projected/b004f31d-2432-403e-a862-a640cb1fe5ad-kube-api-access-t72zl\") pod \"barbican-db-create-76hml\" (UID: \"b004f31d-2432-403e-a862-a640cb1fe5ad\") " pod="openstack/barbican-db-create-76hml" Feb 26 16:03:43 crc kubenswrapper[4907]: I0226 16:03:43.590055 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/5be03d75-755c-40f4-a2f2-db8f9e99b082-operator-scripts\") pod \"barbican-158b-account-create-update-7ht2q\" (UID: \"5be03d75-755c-40f4-a2f2-db8f9e99b082\") " pod="openstack/barbican-158b-account-create-update-7ht2q" Feb 26 16:03:43 crc kubenswrapper[4907]: I0226 16:03:43.590111 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gvt4c\" (UniqueName: \"kubernetes.io/projected/5be03d75-755c-40f4-a2f2-db8f9e99b082-kube-api-access-gvt4c\") pod \"barbican-158b-account-create-update-7ht2q\" (UID: \"5be03d75-755c-40f4-a2f2-db8f9e99b082\") " pod="openstack/barbican-158b-account-create-update-7ht2q" Feb 26 16:03:43 crc kubenswrapper[4907]: I0226 16:03:43.591507 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/5be03d75-755c-40f4-a2f2-db8f9e99b082-operator-scripts\") pod 
\"barbican-158b-account-create-update-7ht2q\" (UID: \"5be03d75-755c-40f4-a2f2-db8f9e99b082\") " pod="openstack/barbican-158b-account-create-update-7ht2q" Feb 26 16:03:43 crc kubenswrapper[4907]: I0226 16:03:43.613216 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gvt4c\" (UniqueName: \"kubernetes.io/projected/5be03d75-755c-40f4-a2f2-db8f9e99b082-kube-api-access-gvt4c\") pod \"barbican-158b-account-create-update-7ht2q\" (UID: \"5be03d75-755c-40f4-a2f2-db8f9e99b082\") " pod="openstack/barbican-158b-account-create-update-7ht2q" Feb 26 16:03:43 crc kubenswrapper[4907]: I0226 16:03:43.676377 4907 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/neutron-db-create-l68xw"] Feb 26 16:03:43 crc kubenswrapper[4907]: I0226 16:03:43.677680 4907 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-db-create-l68xw" Feb 26 16:03:43 crc kubenswrapper[4907]: I0226 16:03:43.688381 4907 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-db-create-l68xw"] Feb 26 16:03:43 crc kubenswrapper[4907]: I0226 16:03:43.691966 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bwqjf\" (UniqueName: \"kubernetes.io/projected/b73e0ebd-2208-4fb9-9b3a-215c75b5529d-kube-api-access-bwqjf\") pod \"neutron-db-create-l68xw\" (UID: \"b73e0ebd-2208-4fb9-9b3a-215c75b5529d\") " pod="openstack/neutron-db-create-l68xw" Feb 26 16:03:43 crc kubenswrapper[4907]: I0226 16:03:43.692044 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/b73e0ebd-2208-4fb9-9b3a-215c75b5529d-operator-scripts\") pod \"neutron-db-create-l68xw\" (UID: \"b73e0ebd-2208-4fb9-9b3a-215c75b5529d\") " pod="openstack/neutron-db-create-l68xw" Feb 26 16:03:43 crc kubenswrapper[4907]: I0226 16:03:43.774513 4907 kubelet.go:2421] "SyncLoop ADD" source="api" 
pods=["openstack/keystone-db-sync-ts667"] Feb 26 16:03:43 crc kubenswrapper[4907]: I0226 16:03:43.775451 4907 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-db-sync-ts667" Feb 26 16:03:43 crc kubenswrapper[4907]: I0226 16:03:43.779689 4907 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-scripts" Feb 26 16:03:43 crc kubenswrapper[4907]: I0226 16:03:43.779987 4907 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone" Feb 26 16:03:43 crc kubenswrapper[4907]: I0226 16:03:43.780143 4907 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-keystone-dockercfg-vv59s" Feb 26 16:03:43 crc kubenswrapper[4907]: I0226 16:03:43.780276 4907 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-config-data" Feb 26 16:03:43 crc kubenswrapper[4907]: I0226 16:03:43.793364 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a94cb55c-878d-432f-ab95-4d0012359b2f-config-data\") pod \"keystone-db-sync-ts667\" (UID: \"a94cb55c-878d-432f-ab95-4d0012359b2f\") " pod="openstack/keystone-db-sync-ts667" Feb 26 16:03:43 crc kubenswrapper[4907]: I0226 16:03:43.793431 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bwqjf\" (UniqueName: \"kubernetes.io/projected/b73e0ebd-2208-4fb9-9b3a-215c75b5529d-kube-api-access-bwqjf\") pod \"neutron-db-create-l68xw\" (UID: \"b73e0ebd-2208-4fb9-9b3a-215c75b5529d\") " pod="openstack/neutron-db-create-l68xw" Feb 26 16:03:43 crc kubenswrapper[4907]: I0226 16:03:43.793453 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2wcgk\" (UniqueName: \"kubernetes.io/projected/a94cb55c-878d-432f-ab95-4d0012359b2f-kube-api-access-2wcgk\") pod \"keystone-db-sync-ts667\" (UID: 
\"a94cb55c-878d-432f-ab95-4d0012359b2f\") " pod="openstack/keystone-db-sync-ts667" Feb 26 16:03:43 crc kubenswrapper[4907]: I0226 16:03:43.793471 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a94cb55c-878d-432f-ab95-4d0012359b2f-combined-ca-bundle\") pod \"keystone-db-sync-ts667\" (UID: \"a94cb55c-878d-432f-ab95-4d0012359b2f\") " pod="openstack/keystone-db-sync-ts667" Feb 26 16:03:43 crc kubenswrapper[4907]: I0226 16:03:43.793532 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/b73e0ebd-2208-4fb9-9b3a-215c75b5529d-operator-scripts\") pod \"neutron-db-create-l68xw\" (UID: \"b73e0ebd-2208-4fb9-9b3a-215c75b5529d\") " pod="openstack/neutron-db-create-l68xw" Feb 26 16:03:43 crc kubenswrapper[4907]: I0226 16:03:43.794420 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/b73e0ebd-2208-4fb9-9b3a-215c75b5529d-operator-scripts\") pod \"neutron-db-create-l68xw\" (UID: \"b73e0ebd-2208-4fb9-9b3a-215c75b5529d\") " pod="openstack/neutron-db-create-l68xw" Feb 26 16:03:43 crc kubenswrapper[4907]: I0226 16:03:43.812814 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"819c7fec-fd22-478a-bf6c-f4cb5aeccc59","Type":"ContainerStarted","Data":"eb45c8293018bd359128835928281f75572317a0d5f9e8993f454f9967a913ea"} Feb 26 16:03:43 crc kubenswrapper[4907]: I0226 16:03:43.812853 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"819c7fec-fd22-478a-bf6c-f4cb5aeccc59","Type":"ContainerStarted","Data":"db8879c1b733dd54268052df5777412e4e046d65345a043e48ed5c41ab208739"} Feb 26 16:03:43 crc kubenswrapper[4907]: I0226 16:03:43.813041 4907 kubelet.go:2428] "SyncLoop UPDATE" source="api" 
pods=["openstack/keystone-db-sync-ts667"] Feb 26 16:03:43 crc kubenswrapper[4907]: I0226 16:03:43.813415 4907 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-158b-account-create-update-7ht2q" Feb 26 16:03:43 crc kubenswrapper[4907]: I0226 16:03:43.813431 4907 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-db-create-76hml" Feb 26 16:03:43 crc kubenswrapper[4907]: I0226 16:03:43.845818 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bwqjf\" (UniqueName: \"kubernetes.io/projected/b73e0ebd-2208-4fb9-9b3a-215c75b5529d-kube-api-access-bwqjf\") pod \"neutron-db-create-l68xw\" (UID: \"b73e0ebd-2208-4fb9-9b3a-215c75b5529d\") " pod="openstack/neutron-db-create-l68xw" Feb 26 16:03:43 crc kubenswrapper[4907]: I0226 16:03:43.848531 4907 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/neutron-99c4-account-create-update-6jrmv"] Feb 26 16:03:43 crc kubenswrapper[4907]: I0226 16:03:43.851637 4907 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-99c4-account-create-update-6jrmv" Feb 26 16:03:43 crc kubenswrapper[4907]: I0226 16:03:43.859063 4907 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-db-secret" Feb 26 16:03:43 crc kubenswrapper[4907]: I0226 16:03:43.879047 4907 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-99c4-account-create-update-6jrmv"] Feb 26 16:03:43 crc kubenswrapper[4907]: I0226 16:03:43.897553 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tsvx8\" (UniqueName: \"kubernetes.io/projected/c2ad5709-5849-49e7-840d-8af9abef7abd-kube-api-access-tsvx8\") pod \"neutron-99c4-account-create-update-6jrmv\" (UID: \"c2ad5709-5849-49e7-840d-8af9abef7abd\") " pod="openstack/neutron-99c4-account-create-update-6jrmv" Feb 26 16:03:43 crc kubenswrapper[4907]: I0226 16:03:43.897613 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2wcgk\" (UniqueName: \"kubernetes.io/projected/a94cb55c-878d-432f-ab95-4d0012359b2f-kube-api-access-2wcgk\") pod \"keystone-db-sync-ts667\" (UID: \"a94cb55c-878d-432f-ab95-4d0012359b2f\") " pod="openstack/keystone-db-sync-ts667" Feb 26 16:03:43 crc kubenswrapper[4907]: I0226 16:03:43.897641 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a94cb55c-878d-432f-ab95-4d0012359b2f-combined-ca-bundle\") pod \"keystone-db-sync-ts667\" (UID: \"a94cb55c-878d-432f-ab95-4d0012359b2f\") " pod="openstack/keystone-db-sync-ts667" Feb 26 16:03:43 crc kubenswrapper[4907]: I0226 16:03:43.897712 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/c2ad5709-5849-49e7-840d-8af9abef7abd-operator-scripts\") pod \"neutron-99c4-account-create-update-6jrmv\" (UID: 
\"c2ad5709-5849-49e7-840d-8af9abef7abd\") " pod="openstack/neutron-99c4-account-create-update-6jrmv" Feb 26 16:03:43 crc kubenswrapper[4907]: I0226 16:03:43.897782 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a94cb55c-878d-432f-ab95-4d0012359b2f-config-data\") pod \"keystone-db-sync-ts667\" (UID: \"a94cb55c-878d-432f-ab95-4d0012359b2f\") " pod="openstack/keystone-db-sync-ts667" Feb 26 16:03:43 crc kubenswrapper[4907]: I0226 16:03:43.903453 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a94cb55c-878d-432f-ab95-4d0012359b2f-combined-ca-bundle\") pod \"keystone-db-sync-ts667\" (UID: \"a94cb55c-878d-432f-ab95-4d0012359b2f\") " pod="openstack/keystone-db-sync-ts667" Feb 26 16:03:43 crc kubenswrapper[4907]: I0226 16:03:43.916252 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a94cb55c-878d-432f-ab95-4d0012359b2f-config-data\") pod \"keystone-db-sync-ts667\" (UID: \"a94cb55c-878d-432f-ab95-4d0012359b2f\") " pod="openstack/keystone-db-sync-ts667" Feb 26 16:03:43 crc kubenswrapper[4907]: I0226 16:03:43.922566 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2wcgk\" (UniqueName: \"kubernetes.io/projected/a94cb55c-878d-432f-ab95-4d0012359b2f-kube-api-access-2wcgk\") pod \"keystone-db-sync-ts667\" (UID: \"a94cb55c-878d-432f-ab95-4d0012359b2f\") " pod="openstack/keystone-db-sync-ts667" Feb 26 16:03:44 crc kubenswrapper[4907]: I0226 16:03:44.000666 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tsvx8\" (UniqueName: \"kubernetes.io/projected/c2ad5709-5849-49e7-840d-8af9abef7abd-kube-api-access-tsvx8\") pod \"neutron-99c4-account-create-update-6jrmv\" (UID: \"c2ad5709-5849-49e7-840d-8af9abef7abd\") " 
pod="openstack/neutron-99c4-account-create-update-6jrmv" Feb 26 16:03:44 crc kubenswrapper[4907]: I0226 16:03:44.000955 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/c2ad5709-5849-49e7-840d-8af9abef7abd-operator-scripts\") pod \"neutron-99c4-account-create-update-6jrmv\" (UID: \"c2ad5709-5849-49e7-840d-8af9abef7abd\") " pod="openstack/neutron-99c4-account-create-update-6jrmv" Feb 26 16:03:44 crc kubenswrapper[4907]: I0226 16:03:44.001645 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/c2ad5709-5849-49e7-840d-8af9abef7abd-operator-scripts\") pod \"neutron-99c4-account-create-update-6jrmv\" (UID: \"c2ad5709-5849-49e7-840d-8af9abef7abd\") " pod="openstack/neutron-99c4-account-create-update-6jrmv" Feb 26 16:03:44 crc kubenswrapper[4907]: I0226 16:03:44.008092 4907 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-db-create-l68xw" Feb 26 16:03:44 crc kubenswrapper[4907]: I0226 16:03:44.020069 4907 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-0936-account-create-update-zlgpv"] Feb 26 16:03:44 crc kubenswrapper[4907]: I0226 16:03:44.028241 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tsvx8\" (UniqueName: \"kubernetes.io/projected/c2ad5709-5849-49e7-840d-8af9abef7abd-kube-api-access-tsvx8\") pod \"neutron-99c4-account-create-update-6jrmv\" (UID: \"c2ad5709-5849-49e7-840d-8af9abef7abd\") " pod="openstack/neutron-99c4-account-create-update-6jrmv" Feb 26 16:03:44 crc kubenswrapper[4907]: W0226 16:03:44.055144 4907 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podf15ac55d_0e4b_46d0_9f5d_4e0e9b86e8fd.slice/crio-8713f41a6818f6602721134f4a7e3fbb4ae54a87187e771506d3556fffbe1d85 WatchSource:0}: Error finding 
container 8713f41a6818f6602721134f4a7e3fbb4ae54a87187e771506d3556fffbe1d85: Status 404 returned error can't find the container with id 8713f41a6818f6602721134f4a7e3fbb4ae54a87187e771506d3556fffbe1d85 Feb 26 16:03:44 crc kubenswrapper[4907]: I0226 16:03:44.110346 4907 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-db-sync-ts667" Feb 26 16:03:44 crc kubenswrapper[4907]: I0226 16:03:44.173630 4907 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-db-create-gcc4z"] Feb 26 16:03:44 crc kubenswrapper[4907]: I0226 16:03:44.207102 4907 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-99c4-account-create-update-6jrmv" Feb 26 16:03:44 crc kubenswrapper[4907]: I0226 16:03:44.342943 4907 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-158b-account-create-update-7ht2q"] Feb 26 16:03:44 crc kubenswrapper[4907]: I0226 16:03:44.652644 4907 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-db-create-76hml"] Feb 26 16:03:44 crc kubenswrapper[4907]: W0226 16:03:44.709873 4907 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podb004f31d_2432_403e_a862_a640cb1fe5ad.slice/crio-c5d8b250113acca4c509950507389b7c23b96e8b1332b5888798b4a3d6182a9b WatchSource:0}: Error finding container c5d8b250113acca4c509950507389b7c23b96e8b1332b5888798b4a3d6182a9b: Status 404 returned error can't find the container with id c5d8b250113acca4c509950507389b7c23b96e8b1332b5888798b4a3d6182a9b Feb 26 16:03:44 crc kubenswrapper[4907]: I0226 16:03:44.841702 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-create-gcc4z" event={"ID":"7102918b-1c33-4b66-9767-fcf854b0f666","Type":"ContainerStarted","Data":"2e0bdd95e7e82fcda77a054f31aca912cdc6ddd8c16dcde8fb981df930f683df"} Feb 26 16:03:44 crc kubenswrapper[4907]: I0226 16:03:44.849761 4907 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-create-76hml" event={"ID":"b004f31d-2432-403e-a862-a640cb1fe5ad","Type":"ContainerStarted","Data":"c5d8b250113acca4c509950507389b7c23b96e8b1332b5888798b4a3d6182a9b"} Feb 26 16:03:44 crc kubenswrapper[4907]: I0226 16:03:44.861786 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-158b-account-create-update-7ht2q" event={"ID":"5be03d75-755c-40f4-a2f2-db8f9e99b082","Type":"ContainerStarted","Data":"9bb3193e9ac5e284a55de6f8bbb94299b787f0acc68d42eb7ad2ae25fb8c963f"} Feb 26 16:03:44 crc kubenswrapper[4907]: I0226 16:03:44.866397 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-0936-account-create-update-zlgpv" event={"ID":"f15ac55d-0e4b-46d0-9f5d-4e0e9b86e8fd","Type":"ContainerStarted","Data":"8713f41a6818f6602721134f4a7e3fbb4ae54a87187e771506d3556fffbe1d85"} Feb 26 16:03:44 crc kubenswrapper[4907]: I0226 16:03:44.877987 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"819c7fec-fd22-478a-bf6c-f4cb5aeccc59","Type":"ContainerStarted","Data":"758e276831798c6bf8a1c818282fc4a324aa0b45f67dfdf1fd84f5e10260edf7"} Feb 26 16:03:45 crc kubenswrapper[4907]: I0226 16:03:45.080075 4907 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-99c4-account-create-update-6jrmv"] Feb 26 16:03:45 crc kubenswrapper[4907]: I0226 16:03:45.119492 4907 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-db-create-l68xw"] Feb 26 16:03:45 crc kubenswrapper[4907]: W0226 16:03:45.127866 4907 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podc2ad5709_5849_49e7_840d_8af9abef7abd.slice/crio-b1bee0c31f6078d3bac5953b1bab04571120a71f150a07b24d8ffa6ed2cb7d7d WatchSource:0}: Error finding container b1bee0c31f6078d3bac5953b1bab04571120a71f150a07b24d8ffa6ed2cb7d7d: Status 404 returned error can't find the 
container with id b1bee0c31f6078d3bac5953b1bab04571120a71f150a07b24d8ffa6ed2cb7d7d Feb 26 16:03:45 crc kubenswrapper[4907]: W0226 16:03:45.134953 4907 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podb73e0ebd_2208_4fb9_9b3a_215c75b5529d.slice/crio-af9de239e19166ba912ab3e91f4af34f66741e1969e4e99a91f3def74781679b WatchSource:0}: Error finding container af9de239e19166ba912ab3e91f4af34f66741e1969e4e99a91f3def74781679b: Status 404 returned error can't find the container with id af9de239e19166ba912ab3e91f4af34f66741e1969e4e99a91f3def74781679b Feb 26 16:03:45 crc kubenswrapper[4907]: I0226 16:03:45.343908 4907 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-db-sync-ts667"] Feb 26 16:03:45 crc kubenswrapper[4907]: I0226 16:03:45.889152 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-create-l68xw" event={"ID":"b73e0ebd-2208-4fb9-9b3a-215c75b5529d","Type":"ContainerStarted","Data":"ee695ccda4a1b1f3bc05584ebde571beb2df00482ac669326b469f4ada49196a"} Feb 26 16:03:45 crc kubenswrapper[4907]: I0226 16:03:45.889482 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-create-l68xw" event={"ID":"b73e0ebd-2208-4fb9-9b3a-215c75b5529d","Type":"ContainerStarted","Data":"af9de239e19166ba912ab3e91f4af34f66741e1969e4e99a91f3def74781679b"} Feb 26 16:03:45 crc kubenswrapper[4907]: I0226 16:03:45.892116 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-create-76hml" event={"ID":"b004f31d-2432-403e-a862-a640cb1fe5ad","Type":"ContainerStarted","Data":"276138739de3bc24bafc20c64d390be58e812ae9c834c069e28fd935a61d34f1"} Feb 26 16:03:45 crc kubenswrapper[4907]: I0226 16:03:45.894544 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-99c4-account-create-update-6jrmv" 
event={"ID":"c2ad5709-5849-49e7-840d-8af9abef7abd","Type":"ContainerStarted","Data":"5f933d2fb049836b0e3d0ab4080dcbda877f37d4f2ad6b0b4517870c399eabd7"}
Feb 26 16:03:45 crc kubenswrapper[4907]: I0226 16:03:45.894645 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-99c4-account-create-update-6jrmv" event={"ID":"c2ad5709-5849-49e7-840d-8af9abef7abd","Type":"ContainerStarted","Data":"b1bee0c31f6078d3bac5953b1bab04571120a71f150a07b24d8ffa6ed2cb7d7d"}
Feb 26 16:03:45 crc kubenswrapper[4907]: I0226 16:03:45.899724 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-158b-account-create-update-7ht2q" event={"ID":"5be03d75-755c-40f4-a2f2-db8f9e99b082","Type":"ContainerStarted","Data":"6766438333efc1a4d8c6775d8147c22732f1b866f44239ba70742d36c98778fd"}
Feb 26 16:03:45 crc kubenswrapper[4907]: I0226 16:03:45.902076 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-0936-account-create-update-zlgpv" event={"ID":"f15ac55d-0e4b-46d0-9f5d-4e0e9b86e8fd","Type":"ContainerStarted","Data":"bbf1de5304c8f42ea1279b391b8a34c5bbd6a2869bf612714dc862741f69423b"}
Feb 26 16:03:45 crc kubenswrapper[4907]: I0226 16:03:45.911969 4907 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/neutron-db-create-l68xw" podStartSLOduration=2.911938823 podStartE2EDuration="2.911938823s" podCreationTimestamp="2026-02-26 16:03:43 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-26 16:03:45.907444885 +0000 UTC m=+1288.426006754" watchObservedRunningTime="2026-02-26 16:03:45.911938823 +0000 UTC m=+1288.430500672"
Feb 26 16:03:45 crc kubenswrapper[4907]: I0226 16:03:45.917472 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"819c7fec-fd22-478a-bf6c-f4cb5aeccc59","Type":"ContainerStarted","Data":"c83747a6dfe74c27448d29476b785dfa28fc34738469adbfff9efea6f8e5cb76"}
Feb 26 16:03:45 crc kubenswrapper[4907]: I0226 16:03:45.917521 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"819c7fec-fd22-478a-bf6c-f4cb5aeccc59","Type":"ContainerStarted","Data":"965949b00606c0887a4a2ce60950a4bf7296082b5d9e9878912dd184665beec7"}
Feb 26 16:03:45 crc kubenswrapper[4907]: I0226 16:03:45.919880 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-sync-ts667" event={"ID":"a94cb55c-878d-432f-ab95-4d0012359b2f","Type":"ContainerStarted","Data":"e12b82d4982c27dfe410abf0b8326daca1b3e947dfd469bb290624e32f522b8f"}
Feb 26 16:03:45 crc kubenswrapper[4907]: I0226 16:03:45.921134 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-create-gcc4z" event={"ID":"7102918b-1c33-4b66-9767-fcf854b0f666","Type":"ContainerStarted","Data":"dbd2eb33a1d2116ef8899a4f58d3fbd0214225f67626da56f54e85f5097397f7"}
Feb 26 16:03:45 crc kubenswrapper[4907]: I0226 16:03:45.937132 4907 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/barbican-158b-account-create-update-7ht2q" podStartSLOduration=2.937112335 podStartE2EDuration="2.937112335s" podCreationTimestamp="2026-02-26 16:03:43 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-26 16:03:45.933256501 +0000 UTC m=+1288.451818340" watchObservedRunningTime="2026-02-26 16:03:45.937112335 +0000 UTC m=+1288.455674174"
Feb 26 16:03:45 crc kubenswrapper[4907]: I0226 16:03:45.955873 4907 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/barbican-db-create-76hml" podStartSLOduration=2.955847069 podStartE2EDuration="2.955847069s" podCreationTimestamp="2026-02-26 16:03:43 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-26 16:03:45.94801049 +0000 UTC m=+1288.466572339" watchObservedRunningTime="2026-02-26 16:03:45.955847069 +0000 UTC m=+1288.474408918"
Feb 26 16:03:45 crc kubenswrapper[4907]: I0226 16:03:45.981061 4907 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cinder-0936-account-create-update-zlgpv" podStartSLOduration=3.981039241 podStartE2EDuration="3.981039241s" podCreationTimestamp="2026-02-26 16:03:42 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-26 16:03:45.969946502 +0000 UTC m=+1288.488508371" watchObservedRunningTime="2026-02-26 16:03:45.981039241 +0000 UTC m=+1288.499601090"
Feb 26 16:03:46 crc kubenswrapper[4907]: I0226 16:03:46.077500 4907 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/neutron-99c4-account-create-update-6jrmv" podStartSLOduration=3.077478044 podStartE2EDuration="3.077478044s" podCreationTimestamp="2026-02-26 16:03:43 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-26 16:03:45.991926186 +0000 UTC m=+1288.510488045" watchObservedRunningTime="2026-02-26 16:03:46.077478044 +0000 UTC m=+1288.596039893"
Feb 26 16:03:46 crc kubenswrapper[4907]: I0226 16:03:46.109173 4907 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/swift-storage-0" podStartSLOduration=39.180300784 podStartE2EDuration="48.109148303s" podCreationTimestamp="2026-02-26 16:02:58 +0000 UTC" firstStartedPulling="2026-02-26 16:03:33.24502784 +0000 UTC m=+1275.763589689" lastFinishedPulling="2026-02-26 16:03:42.173875359 +0000 UTC m=+1284.692437208" observedRunningTime="2026-02-26 16:03:46.081287836 +0000 UTC m=+1288.599849705" watchObservedRunningTime="2026-02-26 16:03:46.109148303 +0000 UTC m=+1288.627710152"
Feb 26 16:03:46 crc kubenswrapper[4907]: I0226 16:03:46.391009 4907 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cinder-db-create-gcc4z" podStartSLOduration=4.390985298 podStartE2EDuration="4.390985298s" podCreationTimestamp="2026-02-26 16:03:42 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-26 16:03:46.114550204 +0000 UTC m=+1288.633112053" watchObservedRunningTime="2026-02-26 16:03:46.390985298 +0000 UTC m=+1288.909547147"
Feb 26 16:03:46 crc kubenswrapper[4907]: I0226 16:03:46.394610 4907 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-5c79d794d7-sjpsh"]
Feb 26 16:03:46 crc kubenswrapper[4907]: I0226 16:03:46.395810 4907 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-5c79d794d7-sjpsh"
Feb 26 16:03:46 crc kubenswrapper[4907]: I0226 16:03:46.400105 4907 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"dns-swift-storage-0"
Feb 26 16:03:46 crc kubenswrapper[4907]: I0226 16:03:46.420708 4907 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-5c79d794d7-sjpsh"]
Feb 26 16:03:46 crc kubenswrapper[4907]: I0226 16:03:46.472365 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/42b623df-5cd7-43f8-bcb6-f17bc08e46de-dns-swift-storage-0\") pod \"dnsmasq-dns-5c79d794d7-sjpsh\" (UID: \"42b623df-5cd7-43f8-bcb6-f17bc08e46de\") " pod="openstack/dnsmasq-dns-5c79d794d7-sjpsh"
Feb 26 16:03:46 crc kubenswrapper[4907]: I0226 16:03:46.472739 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/42b623df-5cd7-43f8-bcb6-f17bc08e46de-ovsdbserver-sb\") pod \"dnsmasq-dns-5c79d794d7-sjpsh\" (UID: \"42b623df-5cd7-43f8-bcb6-f17bc08e46de\") " pod="openstack/dnsmasq-dns-5c79d794d7-sjpsh"
Feb 26 16:03:46 crc kubenswrapper[4907]: I0226 16:03:46.472758 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/42b623df-5cd7-43f8-bcb6-f17bc08e46de-dns-svc\") pod \"dnsmasq-dns-5c79d794d7-sjpsh\" (UID: \"42b623df-5cd7-43f8-bcb6-f17bc08e46de\") " pod="openstack/dnsmasq-dns-5c79d794d7-sjpsh"
Feb 26 16:03:46 crc kubenswrapper[4907]: I0226 16:03:46.472782 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/42b623df-5cd7-43f8-bcb6-f17bc08e46de-config\") pod \"dnsmasq-dns-5c79d794d7-sjpsh\" (UID: \"42b623df-5cd7-43f8-bcb6-f17bc08e46de\") " pod="openstack/dnsmasq-dns-5c79d794d7-sjpsh"
Feb 26 16:03:46 crc kubenswrapper[4907]: I0226 16:03:46.472842 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wdgrp\" (UniqueName: \"kubernetes.io/projected/42b623df-5cd7-43f8-bcb6-f17bc08e46de-kube-api-access-wdgrp\") pod \"dnsmasq-dns-5c79d794d7-sjpsh\" (UID: \"42b623df-5cd7-43f8-bcb6-f17bc08e46de\") " pod="openstack/dnsmasq-dns-5c79d794d7-sjpsh"
Feb 26 16:03:46 crc kubenswrapper[4907]: I0226 16:03:46.472871 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/42b623df-5cd7-43f8-bcb6-f17bc08e46de-ovsdbserver-nb\") pod \"dnsmasq-dns-5c79d794d7-sjpsh\" (UID: \"42b623df-5cd7-43f8-bcb6-f17bc08e46de\") " pod="openstack/dnsmasq-dns-5c79d794d7-sjpsh"
Feb 26 16:03:46 crc kubenswrapper[4907]: I0226 16:03:46.574171 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/42b623df-5cd7-43f8-bcb6-f17bc08e46de-ovsdbserver-sb\") pod \"dnsmasq-dns-5c79d794d7-sjpsh\" (UID: \"42b623df-5cd7-43f8-bcb6-f17bc08e46de\") " pod="openstack/dnsmasq-dns-5c79d794d7-sjpsh"
Feb 26 16:03:46 crc kubenswrapper[4907]: I0226 16:03:46.574237 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/42b623df-5cd7-43f8-bcb6-f17bc08e46de-dns-svc\") pod \"dnsmasq-dns-5c79d794d7-sjpsh\" (UID: \"42b623df-5cd7-43f8-bcb6-f17bc08e46de\") " pod="openstack/dnsmasq-dns-5c79d794d7-sjpsh"
Feb 26 16:03:46 crc kubenswrapper[4907]: I0226 16:03:46.574268 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/42b623df-5cd7-43f8-bcb6-f17bc08e46de-config\") pod \"dnsmasq-dns-5c79d794d7-sjpsh\" (UID: \"42b623df-5cd7-43f8-bcb6-f17bc08e46de\") " pod="openstack/dnsmasq-dns-5c79d794d7-sjpsh"
Feb 26 16:03:46 crc kubenswrapper[4907]: I0226 16:03:46.574357 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wdgrp\" (UniqueName: \"kubernetes.io/projected/42b623df-5cd7-43f8-bcb6-f17bc08e46de-kube-api-access-wdgrp\") pod \"dnsmasq-dns-5c79d794d7-sjpsh\" (UID: \"42b623df-5cd7-43f8-bcb6-f17bc08e46de\") " pod="openstack/dnsmasq-dns-5c79d794d7-sjpsh"
Feb 26 16:03:46 crc kubenswrapper[4907]: I0226 16:03:46.574390 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/42b623df-5cd7-43f8-bcb6-f17bc08e46de-ovsdbserver-nb\") pod \"dnsmasq-dns-5c79d794d7-sjpsh\" (UID: \"42b623df-5cd7-43f8-bcb6-f17bc08e46de\") " pod="openstack/dnsmasq-dns-5c79d794d7-sjpsh"
Feb 26 16:03:46 crc kubenswrapper[4907]: I0226 16:03:46.574415 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/42b623df-5cd7-43f8-bcb6-f17bc08e46de-dns-swift-storage-0\") pod \"dnsmasq-dns-5c79d794d7-sjpsh\" (UID: \"42b623df-5cd7-43f8-bcb6-f17bc08e46de\") " pod="openstack/dnsmasq-dns-5c79d794d7-sjpsh"
Feb 26 16:03:46 crc kubenswrapper[4907]: I0226 16:03:46.575745 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/42b623df-5cd7-43f8-bcb6-f17bc08e46de-ovsdbserver-sb\") pod \"dnsmasq-dns-5c79d794d7-sjpsh\" (UID: \"42b623df-5cd7-43f8-bcb6-f17bc08e46de\") " pod="openstack/dnsmasq-dns-5c79d794d7-sjpsh"
Feb 26 16:03:46 crc kubenswrapper[4907]: I0226 16:03:46.577852 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/42b623df-5cd7-43f8-bcb6-f17bc08e46de-dns-swift-storage-0\") pod \"dnsmasq-dns-5c79d794d7-sjpsh\" (UID: \"42b623df-5cd7-43f8-bcb6-f17bc08e46de\") " pod="openstack/dnsmasq-dns-5c79d794d7-sjpsh"
Feb 26 16:03:46 crc kubenswrapper[4907]: I0226 16:03:46.577871 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/42b623df-5cd7-43f8-bcb6-f17bc08e46de-dns-svc\") pod \"dnsmasq-dns-5c79d794d7-sjpsh\" (UID: \"42b623df-5cd7-43f8-bcb6-f17bc08e46de\") " pod="openstack/dnsmasq-dns-5c79d794d7-sjpsh"
Feb 26 16:03:46 crc kubenswrapper[4907]: I0226 16:03:46.577967 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/42b623df-5cd7-43f8-bcb6-f17bc08e46de-ovsdbserver-nb\") pod \"dnsmasq-dns-5c79d794d7-sjpsh\" (UID: \"42b623df-5cd7-43f8-bcb6-f17bc08e46de\") " pod="openstack/dnsmasq-dns-5c79d794d7-sjpsh"
Feb 26 16:03:46 crc kubenswrapper[4907]: I0226 16:03:46.578154 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/42b623df-5cd7-43f8-bcb6-f17bc08e46de-config\") pod \"dnsmasq-dns-5c79d794d7-sjpsh\" (UID: \"42b623df-5cd7-43f8-bcb6-f17bc08e46de\") " pod="openstack/dnsmasq-dns-5c79d794d7-sjpsh"
Feb 26 16:03:46 crc kubenswrapper[4907]: I0226 16:03:46.594417 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wdgrp\" (UniqueName: \"kubernetes.io/projected/42b623df-5cd7-43f8-bcb6-f17bc08e46de-kube-api-access-wdgrp\") pod \"dnsmasq-dns-5c79d794d7-sjpsh\" (UID: \"42b623df-5cd7-43f8-bcb6-f17bc08e46de\") " pod="openstack/dnsmasq-dns-5c79d794d7-sjpsh"
Feb 26 16:03:46 crc kubenswrapper[4907]: I0226 16:03:46.713927 4907 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-5c79d794d7-sjpsh"
Feb 26 16:03:46 crc kubenswrapper[4907]: I0226 16:03:46.942915 4907 generic.go:334] "Generic (PLEG): container finished" podID="7102918b-1c33-4b66-9767-fcf854b0f666" containerID="dbd2eb33a1d2116ef8899a4f58d3fbd0214225f67626da56f54e85f5097397f7" exitCode=0
Feb 26 16:03:46 crc kubenswrapper[4907]: I0226 16:03:46.943238 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-create-gcc4z" event={"ID":"7102918b-1c33-4b66-9767-fcf854b0f666","Type":"ContainerDied","Data":"dbd2eb33a1d2116ef8899a4f58d3fbd0214225f67626da56f54e85f5097397f7"}
Feb 26 16:03:46 crc kubenswrapper[4907]: I0226 16:03:46.946878 4907 generic.go:334] "Generic (PLEG): container finished" podID="b73e0ebd-2208-4fb9-9b3a-215c75b5529d" containerID="ee695ccda4a1b1f3bc05584ebde571beb2df00482ac669326b469f4ada49196a" exitCode=0
Feb 26 16:03:46 crc kubenswrapper[4907]: I0226 16:03:46.946924 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-create-l68xw" event={"ID":"b73e0ebd-2208-4fb9-9b3a-215c75b5529d","Type":"ContainerDied","Data":"ee695ccda4a1b1f3bc05584ebde571beb2df00482ac669326b469f4ada49196a"}
Feb 26 16:03:46 crc kubenswrapper[4907]: I0226 16:03:46.952536 4907 generic.go:334] "Generic (PLEG): container finished" podID="b004f31d-2432-403e-a862-a640cb1fe5ad" containerID="276138739de3bc24bafc20c64d390be58e812ae9c834c069e28fd935a61d34f1" exitCode=0
Feb 26 16:03:46 crc kubenswrapper[4907]: I0226 16:03:46.952630 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-create-76hml" event={"ID":"b004f31d-2432-403e-a862-a640cb1fe5ad","Type":"ContainerDied","Data":"276138739de3bc24bafc20c64d390be58e812ae9c834c069e28fd935a61d34f1"}
Feb 26 16:03:46 crc kubenswrapper[4907]: I0226 16:03:46.956697 4907 generic.go:334] "Generic (PLEG): container finished" podID="c2ad5709-5849-49e7-840d-8af9abef7abd" containerID="5f933d2fb049836b0e3d0ab4080dcbda877f37d4f2ad6b0b4517870c399eabd7" exitCode=0
Feb 26 16:03:46 crc kubenswrapper[4907]: I0226 16:03:46.956757 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-99c4-account-create-update-6jrmv" event={"ID":"c2ad5709-5849-49e7-840d-8af9abef7abd","Type":"ContainerDied","Data":"5f933d2fb049836b0e3d0ab4080dcbda877f37d4f2ad6b0b4517870c399eabd7"}
Feb 26 16:03:46 crc kubenswrapper[4907]: I0226 16:03:46.968402 4907 generic.go:334] "Generic (PLEG): container finished" podID="5be03d75-755c-40f4-a2f2-db8f9e99b082" containerID="6766438333efc1a4d8c6775d8147c22732f1b866f44239ba70742d36c98778fd" exitCode=0
Feb 26 16:03:46 crc kubenswrapper[4907]: I0226 16:03:46.968470 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-158b-account-create-update-7ht2q" event={"ID":"5be03d75-755c-40f4-a2f2-db8f9e99b082","Type":"ContainerDied","Data":"6766438333efc1a4d8c6775d8147c22732f1b866f44239ba70742d36c98778fd"}
Feb 26 16:03:46 crc kubenswrapper[4907]: I0226 16:03:46.996119 4907 generic.go:334] "Generic (PLEG): container finished" podID="f15ac55d-0e4b-46d0-9f5d-4e0e9b86e8fd" containerID="bbf1de5304c8f42ea1279b391b8a34c5bbd6a2869bf612714dc862741f69423b" exitCode=0
Feb 26 16:03:46 crc kubenswrapper[4907]: I0226 16:03:46.996557 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-0936-account-create-update-zlgpv" event={"ID":"f15ac55d-0e4b-46d0-9f5d-4e0e9b86e8fd","Type":"ContainerDied","Data":"bbf1de5304c8f42ea1279b391b8a34c5bbd6a2869bf612714dc862741f69423b"}
Feb 26 16:03:47 crc kubenswrapper[4907]: I0226 16:03:47.346964 4907 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-5c79d794d7-sjpsh"]
Feb 26 16:03:48 crc kubenswrapper[4907]: I0226 16:03:48.006115 4907 generic.go:334] "Generic (PLEG): container finished" podID="42b623df-5cd7-43f8-bcb6-f17bc08e46de" containerID="b5d78a991778db9fe9e4e2a64b2377b55a4e8f56e368d76b750c0d5162098c96" exitCode=0
Feb 26 16:03:48 crc kubenswrapper[4907]: I0226 16:03:48.006175 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5c79d794d7-sjpsh" event={"ID":"42b623df-5cd7-43f8-bcb6-f17bc08e46de","Type":"ContainerDied","Data":"b5d78a991778db9fe9e4e2a64b2377b55a4e8f56e368d76b750c0d5162098c96"}
Feb 26 16:03:48 crc kubenswrapper[4907]: I0226 16:03:48.006377 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5c79d794d7-sjpsh" event={"ID":"42b623df-5cd7-43f8-bcb6-f17bc08e46de","Type":"ContainerStarted","Data":"85004dad6c1829e1c7dcaaa92accfca0ed260ab04e9c49c53f346589db424b41"}
Feb 26 16:03:50 crc kubenswrapper[4907]: I0226 16:03:50.964711 4907 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-0936-account-create-update-zlgpv"
Feb 26 16:03:50 crc kubenswrapper[4907]: I0226 16:03:50.997478 4907 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-db-create-76hml"
Feb 26 16:03:51 crc kubenswrapper[4907]: I0226 16:03:51.003705 4907 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-db-create-gcc4z"
Feb 26 16:03:51 crc kubenswrapper[4907]: I0226 16:03:51.027293 4907 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-99c4-account-create-update-6jrmv"
Feb 26 16:03:51 crc kubenswrapper[4907]: I0226 16:03:51.052347 4907 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-db-create-l68xw"
Feb 26 16:03:51 crc kubenswrapper[4907]: I0226 16:03:51.053500 4907 generic.go:334] "Generic (PLEG): container finished" podID="2395dfd1-7840-4703-a1c9-37c6eff664bd" containerID="63496136caf0de20beb55d60c8d05550ef8b6597390d822d2b0105cdf73169db" exitCode=0
Feb 26 16:03:51 crc kubenswrapper[4907]: I0226 16:03:51.053628 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-sync-hdzvj" event={"ID":"2395dfd1-7840-4703-a1c9-37c6eff664bd","Type":"ContainerDied","Data":"63496136caf0de20beb55d60c8d05550ef8b6597390d822d2b0105cdf73169db"}
Feb 26 16:03:51 crc kubenswrapper[4907]: I0226 16:03:51.056020 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-0936-account-create-update-zlgpv" event={"ID":"f15ac55d-0e4b-46d0-9f5d-4e0e9b86e8fd","Type":"ContainerDied","Data":"8713f41a6818f6602721134f4a7e3fbb4ae54a87187e771506d3556fffbe1d85"}
Feb 26 16:03:51 crc kubenswrapper[4907]: I0226 16:03:51.056083 4907 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="8713f41a6818f6602721134f4a7e3fbb4ae54a87187e771506d3556fffbe1d85"
Feb 26 16:03:51 crc kubenswrapper[4907]: I0226 16:03:51.056155 4907 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-0936-account-create-update-zlgpv"
Feb 26 16:03:51 crc kubenswrapper[4907]: I0226 16:03:51.061357 4907 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-158b-account-create-update-7ht2q"
Feb 26 16:03:51 crc kubenswrapper[4907]: I0226 16:03:51.068000 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4fmjh\" (UniqueName: \"kubernetes.io/projected/f15ac55d-0e4b-46d0-9f5d-4e0e9b86e8fd-kube-api-access-4fmjh\") pod \"f15ac55d-0e4b-46d0-9f5d-4e0e9b86e8fd\" (UID: \"f15ac55d-0e4b-46d0-9f5d-4e0e9b86e8fd\") "
Feb 26 16:03:51 crc kubenswrapper[4907]: I0226 16:03:51.068068 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/f15ac55d-0e4b-46d0-9f5d-4e0e9b86e8fd-operator-scripts\") pod \"f15ac55d-0e4b-46d0-9f5d-4e0e9b86e8fd\" (UID: \"f15ac55d-0e4b-46d0-9f5d-4e0e9b86e8fd\") "
Feb 26 16:03:51 crc kubenswrapper[4907]: I0226 16:03:51.069089 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f15ac55d-0e4b-46d0-9f5d-4e0e9b86e8fd-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "f15ac55d-0e4b-46d0-9f5d-4e0e9b86e8fd" (UID: "f15ac55d-0e4b-46d0-9f5d-4e0e9b86e8fd"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Feb 26 16:03:51 crc kubenswrapper[4907]: I0226 16:03:51.071125 4907 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-db-create-gcc4z"
Feb 26 16:03:51 crc kubenswrapper[4907]: I0226 16:03:51.071460 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-create-gcc4z" event={"ID":"7102918b-1c33-4b66-9767-fcf854b0f666","Type":"ContainerDied","Data":"2e0bdd95e7e82fcda77a054f31aca912cdc6ddd8c16dcde8fb981df930f683df"}
Feb 26 16:03:51 crc kubenswrapper[4907]: I0226 16:03:51.071490 4907 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="2e0bdd95e7e82fcda77a054f31aca912cdc6ddd8c16dcde8fb981df930f683df"
Feb 26 16:03:51 crc kubenswrapper[4907]: I0226 16:03:51.083581 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f15ac55d-0e4b-46d0-9f5d-4e0e9b86e8fd-kube-api-access-4fmjh" (OuterVolumeSpecName: "kube-api-access-4fmjh") pod "f15ac55d-0e4b-46d0-9f5d-4e0e9b86e8fd" (UID: "f15ac55d-0e4b-46d0-9f5d-4e0e9b86e8fd"). InnerVolumeSpecName "kube-api-access-4fmjh". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 26 16:03:51 crc kubenswrapper[4907]: I0226 16:03:51.083983 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-create-l68xw" event={"ID":"b73e0ebd-2208-4fb9-9b3a-215c75b5529d","Type":"ContainerDied","Data":"af9de239e19166ba912ab3e91f4af34f66741e1969e4e99a91f3def74781679b"}
Feb 26 16:03:51 crc kubenswrapper[4907]: I0226 16:03:51.084017 4907 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="af9de239e19166ba912ab3e91f4af34f66741e1969e4e99a91f3def74781679b"
Feb 26 16:03:51 crc kubenswrapper[4907]: I0226 16:03:51.084095 4907 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-db-create-l68xw"
Feb 26 16:03:51 crc kubenswrapper[4907]: I0226 16:03:51.089416 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-create-76hml" event={"ID":"b004f31d-2432-403e-a862-a640cb1fe5ad","Type":"ContainerDied","Data":"c5d8b250113acca4c509950507389b7c23b96e8b1332b5888798b4a3d6182a9b"}
Feb 26 16:03:51 crc kubenswrapper[4907]: I0226 16:03:51.089461 4907 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="c5d8b250113acca4c509950507389b7c23b96e8b1332b5888798b4a3d6182a9b"
Feb 26 16:03:51 crc kubenswrapper[4907]: I0226 16:03:51.089544 4907 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-db-create-76hml"
Feb 26 16:03:51 crc kubenswrapper[4907]: I0226 16:03:51.092421 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-99c4-account-create-update-6jrmv" event={"ID":"c2ad5709-5849-49e7-840d-8af9abef7abd","Type":"ContainerDied","Data":"b1bee0c31f6078d3bac5953b1bab04571120a71f150a07b24d8ffa6ed2cb7d7d"}
Feb 26 16:03:51 crc kubenswrapper[4907]: I0226 16:03:51.092455 4907 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="b1bee0c31f6078d3bac5953b1bab04571120a71f150a07b24d8ffa6ed2cb7d7d"
Feb 26 16:03:51 crc kubenswrapper[4907]: I0226 16:03:51.092505 4907 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-99c4-account-create-update-6jrmv"
Feb 26 16:03:51 crc kubenswrapper[4907]: I0226 16:03:51.117353 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-158b-account-create-update-7ht2q" event={"ID":"5be03d75-755c-40f4-a2f2-db8f9e99b082","Type":"ContainerDied","Data":"9bb3193e9ac5e284a55de6f8bbb94299b787f0acc68d42eb7ad2ae25fb8c963f"}
Feb 26 16:03:51 crc kubenswrapper[4907]: I0226 16:03:51.117406 4907 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="9bb3193e9ac5e284a55de6f8bbb94299b787f0acc68d42eb7ad2ae25fb8c963f"
Feb 26 16:03:51 crc kubenswrapper[4907]: I0226 16:03:51.119509 4907 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-158b-account-create-update-7ht2q"
Feb 26 16:03:51 crc kubenswrapper[4907]: I0226 16:03:51.174472 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-t72zl\" (UniqueName: \"kubernetes.io/projected/b004f31d-2432-403e-a862-a640cb1fe5ad-kube-api-access-t72zl\") pod \"b004f31d-2432-403e-a862-a640cb1fe5ad\" (UID: \"b004f31d-2432-403e-a862-a640cb1fe5ad\") "
Feb 26 16:03:51 crc kubenswrapper[4907]: I0226 16:03:51.174552 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/b73e0ebd-2208-4fb9-9b3a-215c75b5529d-operator-scripts\") pod \"b73e0ebd-2208-4fb9-9b3a-215c75b5529d\" (UID: \"b73e0ebd-2208-4fb9-9b3a-215c75b5529d\") "
Feb 26 16:03:51 crc kubenswrapper[4907]: I0226 16:03:51.174610 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-tsvx8\" (UniqueName: \"kubernetes.io/projected/c2ad5709-5849-49e7-840d-8af9abef7abd-kube-api-access-tsvx8\") pod \"c2ad5709-5849-49e7-840d-8af9abef7abd\" (UID: \"c2ad5709-5849-49e7-840d-8af9abef7abd\") "
Feb 26 16:03:51 crc kubenswrapper[4907]: I0226 16:03:51.174667 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/7102918b-1c33-4b66-9767-fcf854b0f666-operator-scripts\") pod \"7102918b-1c33-4b66-9767-fcf854b0f666\" (UID: \"7102918b-1c33-4b66-9767-fcf854b0f666\") "
Feb 26 16:03:51 crc kubenswrapper[4907]: I0226 16:03:51.174723 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-gvt4c\" (UniqueName: \"kubernetes.io/projected/5be03d75-755c-40f4-a2f2-db8f9e99b082-kube-api-access-gvt4c\") pod \"5be03d75-755c-40f4-a2f2-db8f9e99b082\" (UID: \"5be03d75-755c-40f4-a2f2-db8f9e99b082\") "
Feb 26 16:03:51 crc kubenswrapper[4907]: I0226 16:03:51.174763 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/b004f31d-2432-403e-a862-a640cb1fe5ad-operator-scripts\") pod \"b004f31d-2432-403e-a862-a640cb1fe5ad\" (UID: \"b004f31d-2432-403e-a862-a640cb1fe5ad\") "
Feb 26 16:03:51 crc kubenswrapper[4907]: I0226 16:03:51.174843 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/5be03d75-755c-40f4-a2f2-db8f9e99b082-operator-scripts\") pod \"5be03d75-755c-40f4-a2f2-db8f9e99b082\" (UID: \"5be03d75-755c-40f4-a2f2-db8f9e99b082\") "
Feb 26 16:03:51 crc kubenswrapper[4907]: I0226 16:03:51.174887 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/c2ad5709-5849-49e7-840d-8af9abef7abd-operator-scripts\") pod \"c2ad5709-5849-49e7-840d-8af9abef7abd\" (UID: \"c2ad5709-5849-49e7-840d-8af9abef7abd\") "
Feb 26 16:03:51 crc kubenswrapper[4907]: I0226 16:03:51.174919 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8mkf4\" (UniqueName: \"kubernetes.io/projected/7102918b-1c33-4b66-9767-fcf854b0f666-kube-api-access-8mkf4\") pod \"7102918b-1c33-4b66-9767-fcf854b0f666\" (UID: \"7102918b-1c33-4b66-9767-fcf854b0f666\") "
Feb 26 16:03:51 crc kubenswrapper[4907]: I0226 16:03:51.174966 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-bwqjf\" (UniqueName: \"kubernetes.io/projected/b73e0ebd-2208-4fb9-9b3a-215c75b5529d-kube-api-access-bwqjf\") pod \"b73e0ebd-2208-4fb9-9b3a-215c75b5529d\" (UID: \"b73e0ebd-2208-4fb9-9b3a-215c75b5529d\") "
Feb 26 16:03:51 crc kubenswrapper[4907]: I0226 16:03:51.175028 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b73e0ebd-2208-4fb9-9b3a-215c75b5529d-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "b73e0ebd-2208-4fb9-9b3a-215c75b5529d" (UID: "b73e0ebd-2208-4fb9-9b3a-215c75b5529d"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Feb 26 16:03:51 crc kubenswrapper[4907]: I0226 16:03:51.175396 4907 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/b73e0ebd-2208-4fb9-9b3a-215c75b5529d-operator-scripts\") on node \"crc\" DevicePath \"\""
Feb 26 16:03:51 crc kubenswrapper[4907]: I0226 16:03:51.175433 4907 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4fmjh\" (UniqueName: \"kubernetes.io/projected/f15ac55d-0e4b-46d0-9f5d-4e0e9b86e8fd-kube-api-access-4fmjh\") on node \"crc\" DevicePath \"\""
Feb 26 16:03:51 crc kubenswrapper[4907]: I0226 16:03:51.175448 4907 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/f15ac55d-0e4b-46d0-9f5d-4e0e9b86e8fd-operator-scripts\") on node \"crc\" DevicePath \"\""
Feb 26 16:03:51 crc kubenswrapper[4907]: I0226 16:03:51.175670 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7102918b-1c33-4b66-9767-fcf854b0f666-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "7102918b-1c33-4b66-9767-fcf854b0f666" (UID: "7102918b-1c33-4b66-9767-fcf854b0f666"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Feb 26 16:03:51 crc kubenswrapper[4907]: I0226 16:03:51.176000 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c2ad5709-5849-49e7-840d-8af9abef7abd-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "c2ad5709-5849-49e7-840d-8af9abef7abd" (UID: "c2ad5709-5849-49e7-840d-8af9abef7abd"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Feb 26 16:03:51 crc kubenswrapper[4907]: I0226 16:03:51.176345 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5be03d75-755c-40f4-a2f2-db8f9e99b082-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "5be03d75-755c-40f4-a2f2-db8f9e99b082" (UID: "5be03d75-755c-40f4-a2f2-db8f9e99b082"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Feb 26 16:03:51 crc kubenswrapper[4907]: I0226 16:03:51.176525 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b004f31d-2432-403e-a862-a640cb1fe5ad-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "b004f31d-2432-403e-a862-a640cb1fe5ad" (UID: "b004f31d-2432-403e-a862-a640cb1fe5ad"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Feb 26 16:03:51 crc kubenswrapper[4907]: I0226 16:03:51.179389 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b73e0ebd-2208-4fb9-9b3a-215c75b5529d-kube-api-access-bwqjf" (OuterVolumeSpecName: "kube-api-access-bwqjf") pod "b73e0ebd-2208-4fb9-9b3a-215c75b5529d" (UID: "b73e0ebd-2208-4fb9-9b3a-215c75b5529d"). InnerVolumeSpecName "kube-api-access-bwqjf". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 26 16:03:51 crc kubenswrapper[4907]: I0226 16:03:51.179777 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c2ad5709-5849-49e7-840d-8af9abef7abd-kube-api-access-tsvx8" (OuterVolumeSpecName: "kube-api-access-tsvx8") pod "c2ad5709-5849-49e7-840d-8af9abef7abd" (UID: "c2ad5709-5849-49e7-840d-8af9abef7abd"). InnerVolumeSpecName "kube-api-access-tsvx8". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 26 16:03:51 crc kubenswrapper[4907]: I0226 16:03:51.179793 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b004f31d-2432-403e-a862-a640cb1fe5ad-kube-api-access-t72zl" (OuterVolumeSpecName: "kube-api-access-t72zl") pod "b004f31d-2432-403e-a862-a640cb1fe5ad" (UID: "b004f31d-2432-403e-a862-a640cb1fe5ad"). InnerVolumeSpecName "kube-api-access-t72zl". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 26 16:03:51 crc kubenswrapper[4907]: I0226 16:03:51.180901 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7102918b-1c33-4b66-9767-fcf854b0f666-kube-api-access-8mkf4" (OuterVolumeSpecName: "kube-api-access-8mkf4") pod "7102918b-1c33-4b66-9767-fcf854b0f666" (UID: "7102918b-1c33-4b66-9767-fcf854b0f666"). InnerVolumeSpecName "kube-api-access-8mkf4". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 26 16:03:51 crc kubenswrapper[4907]: I0226 16:03:51.181474 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5be03d75-755c-40f4-a2f2-db8f9e99b082-kube-api-access-gvt4c" (OuterVolumeSpecName: "kube-api-access-gvt4c") pod "5be03d75-755c-40f4-a2f2-db8f9e99b082" (UID: "5be03d75-755c-40f4-a2f2-db8f9e99b082"). InnerVolumeSpecName "kube-api-access-gvt4c". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 26 16:03:51 crc kubenswrapper[4907]: I0226 16:03:51.276972 4907 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-tsvx8\" (UniqueName: \"kubernetes.io/projected/c2ad5709-5849-49e7-840d-8af9abef7abd-kube-api-access-tsvx8\") on node \"crc\" DevicePath \"\""
Feb 26 16:03:51 crc kubenswrapper[4907]: I0226 16:03:51.277008 4907 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/7102918b-1c33-4b66-9767-fcf854b0f666-operator-scripts\") on node \"crc\" DevicePath \"\""
Feb 26 16:03:51 crc kubenswrapper[4907]: I0226 16:03:51.277107 4907 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-gvt4c\" (UniqueName: \"kubernetes.io/projected/5be03d75-755c-40f4-a2f2-db8f9e99b082-kube-api-access-gvt4c\") on node \"crc\" DevicePath \"\""
Feb 26 16:03:51 crc kubenswrapper[4907]: I0226 16:03:51.277121 4907 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/b004f31d-2432-403e-a862-a640cb1fe5ad-operator-scripts\") on node \"crc\" DevicePath \"\""
Feb 26 16:03:51 crc kubenswrapper[4907]: I0226 16:03:51.277133 4907 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/5be03d75-755c-40f4-a2f2-db8f9e99b082-operator-scripts\") on node \"crc\" DevicePath \"\""
Feb 26 16:03:51 crc kubenswrapper[4907]: I0226 16:03:51.277142 4907 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/c2ad5709-5849-49e7-840d-8af9abef7abd-operator-scripts\") on node \"crc\" DevicePath \"\""
Feb 26 16:03:51 crc kubenswrapper[4907]: I0226 16:03:51.277152 4907 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8mkf4\" (UniqueName: \"kubernetes.io/projected/7102918b-1c33-4b66-9767-fcf854b0f666-kube-api-access-8mkf4\") on node \"crc\" DevicePath \"\""
Feb 26 16:03:51 crc kubenswrapper[4907]: I0226 16:03:51.277164 4907 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-bwqjf\" (UniqueName: \"kubernetes.io/projected/b73e0ebd-2208-4fb9-9b3a-215c75b5529d-kube-api-access-bwqjf\") on node \"crc\" DevicePath \"\""
Feb 26 16:03:51 crc kubenswrapper[4907]: I0226 16:03:51.277176 4907 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-t72zl\" (UniqueName: \"kubernetes.io/projected/b004f31d-2432-403e-a862-a640cb1fe5ad-kube-api-access-t72zl\") on node \"crc\" DevicePath \"\""
Feb 26 16:03:52 crc kubenswrapper[4907]: I0226 16:03:52.148301 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-sync-ts667" event={"ID":"a94cb55c-878d-432f-ab95-4d0012359b2f","Type":"ContainerStarted","Data":"2fd168ac0f799338bd20e6d5e4ed5ac9c085478239e19b247e17ab967f5d4bf7"}
Feb 26 16:03:52 crc kubenswrapper[4907]: I0226 16:03:52.153496 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5c79d794d7-sjpsh" event={"ID":"42b623df-5cd7-43f8-bcb6-f17bc08e46de","Type":"ContainerStarted","Data":"e072b415fa40de7ce96dfdcdab88ff1bc92ad3bb583be8452ee619b536dd800f"}
Feb 26 16:03:52 crc kubenswrapper[4907]: I0226 16:03:52.151528 4907 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/keystone-db-sync-ts667" podStartSLOduration=3.6933877219999998 podStartE2EDuration="9.151507539s" podCreationTimestamp="2026-02-26 16:03:43 +0000 UTC" firstStartedPulling="2026-02-26
16:03:45.388750377 +0000 UTC m=+1287.907312216" lastFinishedPulling="2026-02-26 16:03:50.846870174 +0000 UTC m=+1293.365432033" observedRunningTime="2026-02-26 16:03:52.14619458 +0000 UTC m=+1294.664756429" watchObservedRunningTime="2026-02-26 16:03:52.151507539 +0000 UTC m=+1294.670069388" Feb 26 16:03:52 crc kubenswrapper[4907]: I0226 16:03:52.191280 4907 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-5c79d794d7-sjpsh" podStartSLOduration=6.191260595 podStartE2EDuration="6.191260595s" podCreationTimestamp="2026-02-26 16:03:46 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-26 16:03:52.176860545 +0000 UTC m=+1294.695422404" watchObservedRunningTime="2026-02-26 16:03:52.191260595 +0000 UTC m=+1294.709822444" Feb 26 16:03:52 crc kubenswrapper[4907]: I0226 16:03:52.554116 4907 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-db-sync-hdzvj" Feb 26 16:03:52 crc kubenswrapper[4907]: I0226 16:03:52.705898 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6b55r\" (UniqueName: \"kubernetes.io/projected/2395dfd1-7840-4703-a1c9-37c6eff664bd-kube-api-access-6b55r\") pod \"2395dfd1-7840-4703-a1c9-37c6eff664bd\" (UID: \"2395dfd1-7840-4703-a1c9-37c6eff664bd\") " Feb 26 16:03:52 crc kubenswrapper[4907]: I0226 16:03:52.706269 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2395dfd1-7840-4703-a1c9-37c6eff664bd-combined-ca-bundle\") pod \"2395dfd1-7840-4703-a1c9-37c6eff664bd\" (UID: \"2395dfd1-7840-4703-a1c9-37c6eff664bd\") " Feb 26 16:03:52 crc kubenswrapper[4907]: I0226 16:03:52.706327 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"db-sync-config-data\" (UniqueName: 
\"kubernetes.io/secret/2395dfd1-7840-4703-a1c9-37c6eff664bd-db-sync-config-data\") pod \"2395dfd1-7840-4703-a1c9-37c6eff664bd\" (UID: \"2395dfd1-7840-4703-a1c9-37c6eff664bd\") " Feb 26 16:03:52 crc kubenswrapper[4907]: I0226 16:03:52.706405 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2395dfd1-7840-4703-a1c9-37c6eff664bd-config-data\") pod \"2395dfd1-7840-4703-a1c9-37c6eff664bd\" (UID: \"2395dfd1-7840-4703-a1c9-37c6eff664bd\") " Feb 26 16:03:52 crc kubenswrapper[4907]: I0226 16:03:52.713281 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2395dfd1-7840-4703-a1c9-37c6eff664bd-kube-api-access-6b55r" (OuterVolumeSpecName: "kube-api-access-6b55r") pod "2395dfd1-7840-4703-a1c9-37c6eff664bd" (UID: "2395dfd1-7840-4703-a1c9-37c6eff664bd"). InnerVolumeSpecName "kube-api-access-6b55r". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 26 16:03:52 crc kubenswrapper[4907]: I0226 16:03:52.713441 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2395dfd1-7840-4703-a1c9-37c6eff664bd-db-sync-config-data" (OuterVolumeSpecName: "db-sync-config-data") pod "2395dfd1-7840-4703-a1c9-37c6eff664bd" (UID: "2395dfd1-7840-4703-a1c9-37c6eff664bd"). InnerVolumeSpecName "db-sync-config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 26 16:03:52 crc kubenswrapper[4907]: I0226 16:03:52.730081 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2395dfd1-7840-4703-a1c9-37c6eff664bd-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "2395dfd1-7840-4703-a1c9-37c6eff664bd" (UID: "2395dfd1-7840-4703-a1c9-37c6eff664bd"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 26 16:03:52 crc kubenswrapper[4907]: I0226 16:03:52.754125 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2395dfd1-7840-4703-a1c9-37c6eff664bd-config-data" (OuterVolumeSpecName: "config-data") pod "2395dfd1-7840-4703-a1c9-37c6eff664bd" (UID: "2395dfd1-7840-4703-a1c9-37c6eff664bd"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 26 16:03:52 crc kubenswrapper[4907]: I0226 16:03:52.808212 4907 reconciler_common.go:293] "Volume detached for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/2395dfd1-7840-4703-a1c9-37c6eff664bd-db-sync-config-data\") on node \"crc\" DevicePath \"\"" Feb 26 16:03:52 crc kubenswrapper[4907]: I0226 16:03:52.808257 4907 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2395dfd1-7840-4703-a1c9-37c6eff664bd-config-data\") on node \"crc\" DevicePath \"\"" Feb 26 16:03:52 crc kubenswrapper[4907]: I0226 16:03:52.808270 4907 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6b55r\" (UniqueName: \"kubernetes.io/projected/2395dfd1-7840-4703-a1c9-37c6eff664bd-kube-api-access-6b55r\") on node \"crc\" DevicePath \"\"" Feb 26 16:03:52 crc kubenswrapper[4907]: I0226 16:03:52.808287 4907 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2395dfd1-7840-4703-a1c9-37c6eff664bd-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 26 16:03:53 crc kubenswrapper[4907]: I0226 16:03:53.139526 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-sync-hdzvj" event={"ID":"2395dfd1-7840-4703-a1c9-37c6eff664bd","Type":"ContainerDied","Data":"9a8a8a40d69c06551530c88809ac79085a257ca760af2d356b4ea3d8665b6a13"} Feb 26 16:03:53 crc kubenswrapper[4907]: I0226 16:03:53.139616 4907 pod_container_deletor.go:80] "Container not found in pod's 
containers" containerID="9a8a8a40d69c06551530c88809ac79085a257ca760af2d356b4ea3d8665b6a13" Feb 26 16:03:53 crc kubenswrapper[4907]: I0226 16:03:53.139657 4907 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-db-sync-hdzvj" Feb 26 16:03:53 crc kubenswrapper[4907]: I0226 16:03:53.139987 4907 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-5c79d794d7-sjpsh" Feb 26 16:03:53 crc kubenswrapper[4907]: I0226 16:03:53.442997 4907 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-5c79d794d7-sjpsh"] Feb 26 16:03:53 crc kubenswrapper[4907]: I0226 16:03:53.478323 4907 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-5f59b8f679-vv9x4"] Feb 26 16:03:53 crc kubenswrapper[4907]: E0226 16:03:53.478696 4907 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2395dfd1-7840-4703-a1c9-37c6eff664bd" containerName="glance-db-sync" Feb 26 16:03:53 crc kubenswrapper[4907]: I0226 16:03:53.478717 4907 state_mem.go:107] "Deleted CPUSet assignment" podUID="2395dfd1-7840-4703-a1c9-37c6eff664bd" containerName="glance-db-sync" Feb 26 16:03:53 crc kubenswrapper[4907]: E0226 16:03:53.478728 4907 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c2ad5709-5849-49e7-840d-8af9abef7abd" containerName="mariadb-account-create-update" Feb 26 16:03:53 crc kubenswrapper[4907]: I0226 16:03:53.478734 4907 state_mem.go:107] "Deleted CPUSet assignment" podUID="c2ad5709-5849-49e7-840d-8af9abef7abd" containerName="mariadb-account-create-update" Feb 26 16:03:53 crc kubenswrapper[4907]: E0226 16:03:53.478754 4907 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b73e0ebd-2208-4fb9-9b3a-215c75b5529d" containerName="mariadb-database-create" Feb 26 16:03:53 crc kubenswrapper[4907]: I0226 16:03:53.478762 4907 state_mem.go:107] "Deleted CPUSet assignment" podUID="b73e0ebd-2208-4fb9-9b3a-215c75b5529d" 
containerName="mariadb-database-create" Feb 26 16:03:53 crc kubenswrapper[4907]: E0226 16:03:53.478776 4907 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b004f31d-2432-403e-a862-a640cb1fe5ad" containerName="mariadb-database-create" Feb 26 16:03:53 crc kubenswrapper[4907]: I0226 16:03:53.478781 4907 state_mem.go:107] "Deleted CPUSet assignment" podUID="b004f31d-2432-403e-a862-a640cb1fe5ad" containerName="mariadb-database-create" Feb 26 16:03:53 crc kubenswrapper[4907]: E0226 16:03:53.478790 4907 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5be03d75-755c-40f4-a2f2-db8f9e99b082" containerName="mariadb-account-create-update" Feb 26 16:03:53 crc kubenswrapper[4907]: I0226 16:03:53.478795 4907 state_mem.go:107] "Deleted CPUSet assignment" podUID="5be03d75-755c-40f4-a2f2-db8f9e99b082" containerName="mariadb-account-create-update" Feb 26 16:03:53 crc kubenswrapper[4907]: E0226 16:03:53.478806 4907 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7102918b-1c33-4b66-9767-fcf854b0f666" containerName="mariadb-database-create" Feb 26 16:03:53 crc kubenswrapper[4907]: I0226 16:03:53.478812 4907 state_mem.go:107] "Deleted CPUSet assignment" podUID="7102918b-1c33-4b66-9767-fcf854b0f666" containerName="mariadb-database-create" Feb 26 16:03:53 crc kubenswrapper[4907]: E0226 16:03:53.478821 4907 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f15ac55d-0e4b-46d0-9f5d-4e0e9b86e8fd" containerName="mariadb-account-create-update" Feb 26 16:03:53 crc kubenswrapper[4907]: I0226 16:03:53.478826 4907 state_mem.go:107] "Deleted CPUSet assignment" podUID="f15ac55d-0e4b-46d0-9f5d-4e0e9b86e8fd" containerName="mariadb-account-create-update" Feb 26 16:03:53 crc kubenswrapper[4907]: I0226 16:03:53.479014 4907 memory_manager.go:354] "RemoveStaleState removing state" podUID="5be03d75-755c-40f4-a2f2-db8f9e99b082" containerName="mariadb-account-create-update" Feb 26 16:03:53 crc kubenswrapper[4907]: I0226 16:03:53.479033 
4907 memory_manager.go:354] "RemoveStaleState removing state" podUID="f15ac55d-0e4b-46d0-9f5d-4e0e9b86e8fd" containerName="mariadb-account-create-update" Feb 26 16:03:53 crc kubenswrapper[4907]: I0226 16:03:53.479041 4907 memory_manager.go:354] "RemoveStaleState removing state" podUID="c2ad5709-5849-49e7-840d-8af9abef7abd" containerName="mariadb-account-create-update" Feb 26 16:03:53 crc kubenswrapper[4907]: I0226 16:03:53.479049 4907 memory_manager.go:354] "RemoveStaleState removing state" podUID="b004f31d-2432-403e-a862-a640cb1fe5ad" containerName="mariadb-database-create" Feb 26 16:03:53 crc kubenswrapper[4907]: I0226 16:03:53.479058 4907 memory_manager.go:354] "RemoveStaleState removing state" podUID="2395dfd1-7840-4703-a1c9-37c6eff664bd" containerName="glance-db-sync" Feb 26 16:03:53 crc kubenswrapper[4907]: I0226 16:03:53.479064 4907 memory_manager.go:354] "RemoveStaleState removing state" podUID="7102918b-1c33-4b66-9767-fcf854b0f666" containerName="mariadb-database-create" Feb 26 16:03:53 crc kubenswrapper[4907]: I0226 16:03:53.479075 4907 memory_manager.go:354] "RemoveStaleState removing state" podUID="b73e0ebd-2208-4fb9-9b3a-215c75b5529d" containerName="mariadb-database-create" Feb 26 16:03:53 crc kubenswrapper[4907]: I0226 16:03:53.479952 4907 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-5f59b8f679-vv9x4" Feb 26 16:03:53 crc kubenswrapper[4907]: I0226 16:03:53.503998 4907 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-5f59b8f679-vv9x4"] Feb 26 16:03:53 crc kubenswrapper[4907]: I0226 16:03:53.622960 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-sbksh\" (UniqueName: \"kubernetes.io/projected/6ee71975-a322-4ae3-99a4-7bd42e1d3761-kube-api-access-sbksh\") pod \"dnsmasq-dns-5f59b8f679-vv9x4\" (UID: \"6ee71975-a322-4ae3-99a4-7bd42e1d3761\") " pod="openstack/dnsmasq-dns-5f59b8f679-vv9x4" Feb 26 16:03:53 crc kubenswrapper[4907]: I0226 16:03:53.623035 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/6ee71975-a322-4ae3-99a4-7bd42e1d3761-dns-swift-storage-0\") pod \"dnsmasq-dns-5f59b8f679-vv9x4\" (UID: \"6ee71975-a322-4ae3-99a4-7bd42e1d3761\") " pod="openstack/dnsmasq-dns-5f59b8f679-vv9x4" Feb 26 16:03:53 crc kubenswrapper[4907]: I0226 16:03:53.623182 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/6ee71975-a322-4ae3-99a4-7bd42e1d3761-dns-svc\") pod \"dnsmasq-dns-5f59b8f679-vv9x4\" (UID: \"6ee71975-a322-4ae3-99a4-7bd42e1d3761\") " pod="openstack/dnsmasq-dns-5f59b8f679-vv9x4" Feb 26 16:03:53 crc kubenswrapper[4907]: I0226 16:03:53.623257 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/6ee71975-a322-4ae3-99a4-7bd42e1d3761-ovsdbserver-sb\") pod \"dnsmasq-dns-5f59b8f679-vv9x4\" (UID: \"6ee71975-a322-4ae3-99a4-7bd42e1d3761\") " pod="openstack/dnsmasq-dns-5f59b8f679-vv9x4" Feb 26 16:03:53 crc kubenswrapper[4907]: I0226 16:03:53.623359 4907 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6ee71975-a322-4ae3-99a4-7bd42e1d3761-config\") pod \"dnsmasq-dns-5f59b8f679-vv9x4\" (UID: \"6ee71975-a322-4ae3-99a4-7bd42e1d3761\") " pod="openstack/dnsmasq-dns-5f59b8f679-vv9x4" Feb 26 16:03:53 crc kubenswrapper[4907]: I0226 16:03:53.623434 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/6ee71975-a322-4ae3-99a4-7bd42e1d3761-ovsdbserver-nb\") pod \"dnsmasq-dns-5f59b8f679-vv9x4\" (UID: \"6ee71975-a322-4ae3-99a4-7bd42e1d3761\") " pod="openstack/dnsmasq-dns-5f59b8f679-vv9x4" Feb 26 16:03:53 crc kubenswrapper[4907]: I0226 16:03:53.724904 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/6ee71975-a322-4ae3-99a4-7bd42e1d3761-ovsdbserver-nb\") pod \"dnsmasq-dns-5f59b8f679-vv9x4\" (UID: \"6ee71975-a322-4ae3-99a4-7bd42e1d3761\") " pod="openstack/dnsmasq-dns-5f59b8f679-vv9x4" Feb 26 16:03:53 crc kubenswrapper[4907]: I0226 16:03:53.725206 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-sbksh\" (UniqueName: \"kubernetes.io/projected/6ee71975-a322-4ae3-99a4-7bd42e1d3761-kube-api-access-sbksh\") pod \"dnsmasq-dns-5f59b8f679-vv9x4\" (UID: \"6ee71975-a322-4ae3-99a4-7bd42e1d3761\") " pod="openstack/dnsmasq-dns-5f59b8f679-vv9x4" Feb 26 16:03:53 crc kubenswrapper[4907]: I0226 16:03:53.725514 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/6ee71975-a322-4ae3-99a4-7bd42e1d3761-dns-swift-storage-0\") pod \"dnsmasq-dns-5f59b8f679-vv9x4\" (UID: \"6ee71975-a322-4ae3-99a4-7bd42e1d3761\") " pod="openstack/dnsmasq-dns-5f59b8f679-vv9x4" Feb 26 16:03:53 crc kubenswrapper[4907]: I0226 16:03:53.725674 4907 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/6ee71975-a322-4ae3-99a4-7bd42e1d3761-dns-svc\") pod \"dnsmasq-dns-5f59b8f679-vv9x4\" (UID: \"6ee71975-a322-4ae3-99a4-7bd42e1d3761\") " pod="openstack/dnsmasq-dns-5f59b8f679-vv9x4" Feb 26 16:03:53 crc kubenswrapper[4907]: I0226 16:03:53.725871 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/6ee71975-a322-4ae3-99a4-7bd42e1d3761-ovsdbserver-sb\") pod \"dnsmasq-dns-5f59b8f679-vv9x4\" (UID: \"6ee71975-a322-4ae3-99a4-7bd42e1d3761\") " pod="openstack/dnsmasq-dns-5f59b8f679-vv9x4" Feb 26 16:03:53 crc kubenswrapper[4907]: I0226 16:03:53.726858 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/6ee71975-a322-4ae3-99a4-7bd42e1d3761-ovsdbserver-sb\") pod \"dnsmasq-dns-5f59b8f679-vv9x4\" (UID: \"6ee71975-a322-4ae3-99a4-7bd42e1d3761\") " pod="openstack/dnsmasq-dns-5f59b8f679-vv9x4" Feb 26 16:03:53 crc kubenswrapper[4907]: I0226 16:03:53.726757 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/6ee71975-a322-4ae3-99a4-7bd42e1d3761-dns-swift-storage-0\") pod \"dnsmasq-dns-5f59b8f679-vv9x4\" (UID: \"6ee71975-a322-4ae3-99a4-7bd42e1d3761\") " pod="openstack/dnsmasq-dns-5f59b8f679-vv9x4" Feb 26 16:03:53 crc kubenswrapper[4907]: I0226 16:03:53.726843 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/6ee71975-a322-4ae3-99a4-7bd42e1d3761-dns-svc\") pod \"dnsmasq-dns-5f59b8f679-vv9x4\" (UID: \"6ee71975-a322-4ae3-99a4-7bd42e1d3761\") " pod="openstack/dnsmasq-dns-5f59b8f679-vv9x4" Feb 26 16:03:53 crc kubenswrapper[4907]: I0226 16:03:53.725921 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: 
\"kubernetes.io/configmap/6ee71975-a322-4ae3-99a4-7bd42e1d3761-ovsdbserver-nb\") pod \"dnsmasq-dns-5f59b8f679-vv9x4\" (UID: \"6ee71975-a322-4ae3-99a4-7bd42e1d3761\") " pod="openstack/dnsmasq-dns-5f59b8f679-vv9x4" Feb 26 16:03:53 crc kubenswrapper[4907]: I0226 16:03:53.727090 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6ee71975-a322-4ae3-99a4-7bd42e1d3761-config\") pod \"dnsmasq-dns-5f59b8f679-vv9x4\" (UID: \"6ee71975-a322-4ae3-99a4-7bd42e1d3761\") " pod="openstack/dnsmasq-dns-5f59b8f679-vv9x4" Feb 26 16:03:53 crc kubenswrapper[4907]: I0226 16:03:53.727828 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6ee71975-a322-4ae3-99a4-7bd42e1d3761-config\") pod \"dnsmasq-dns-5f59b8f679-vv9x4\" (UID: \"6ee71975-a322-4ae3-99a4-7bd42e1d3761\") " pod="openstack/dnsmasq-dns-5f59b8f679-vv9x4" Feb 26 16:03:53 crc kubenswrapper[4907]: I0226 16:03:53.741915 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-sbksh\" (UniqueName: \"kubernetes.io/projected/6ee71975-a322-4ae3-99a4-7bd42e1d3761-kube-api-access-sbksh\") pod \"dnsmasq-dns-5f59b8f679-vv9x4\" (UID: \"6ee71975-a322-4ae3-99a4-7bd42e1d3761\") " pod="openstack/dnsmasq-dns-5f59b8f679-vv9x4" Feb 26 16:03:53 crc kubenswrapper[4907]: I0226 16:03:53.798902 4907 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-5f59b8f679-vv9x4" Feb 26 16:03:54 crc kubenswrapper[4907]: I0226 16:03:54.337084 4907 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-5f59b8f679-vv9x4"] Feb 26 16:03:55 crc kubenswrapper[4907]: I0226 16:03:55.155141 4907 generic.go:334] "Generic (PLEG): container finished" podID="6ee71975-a322-4ae3-99a4-7bd42e1d3761" containerID="f004d1e9d1138c0ca971eb97cd4e506da42868a90cd20fdadcf95c8984027cee" exitCode=0 Feb 26 16:03:55 crc kubenswrapper[4907]: I0226 16:03:55.155178 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5f59b8f679-vv9x4" event={"ID":"6ee71975-a322-4ae3-99a4-7bd42e1d3761","Type":"ContainerDied","Data":"f004d1e9d1138c0ca971eb97cd4e506da42868a90cd20fdadcf95c8984027cee"} Feb 26 16:03:55 crc kubenswrapper[4907]: I0226 16:03:55.155724 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5f59b8f679-vv9x4" event={"ID":"6ee71975-a322-4ae3-99a4-7bd42e1d3761","Type":"ContainerStarted","Data":"15d32426b88544a13ec1f417f8b0b13bf9a19aa7b8b55e57216f2a314016f994"} Feb 26 16:03:55 crc kubenswrapper[4907]: I0226 16:03:55.156909 4907 generic.go:334] "Generic (PLEG): container finished" podID="a94cb55c-878d-432f-ab95-4d0012359b2f" containerID="2fd168ac0f799338bd20e6d5e4ed5ac9c085478239e19b247e17ab967f5d4bf7" exitCode=0 Feb 26 16:03:55 crc kubenswrapper[4907]: I0226 16:03:55.156938 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-sync-ts667" event={"ID":"a94cb55c-878d-432f-ab95-4d0012359b2f","Type":"ContainerDied","Data":"2fd168ac0f799338bd20e6d5e4ed5ac9c085478239e19b247e17ab967f5d4bf7"} Feb 26 16:03:55 crc kubenswrapper[4907]: I0226 16:03:55.157123 4907 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-5c79d794d7-sjpsh" podUID="42b623df-5cd7-43f8-bcb6-f17bc08e46de" containerName="dnsmasq-dns" 
containerID="cri-o://e072b415fa40de7ce96dfdcdab88ff1bc92ad3bb583be8452ee619b536dd800f" gracePeriod=10 Feb 26 16:03:55 crc kubenswrapper[4907]: I0226 16:03:55.605969 4907 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-5c79d794d7-sjpsh" Feb 26 16:03:55 crc kubenswrapper[4907]: I0226 16:03:55.763069 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/42b623df-5cd7-43f8-bcb6-f17bc08e46de-dns-swift-storage-0\") pod \"42b623df-5cd7-43f8-bcb6-f17bc08e46de\" (UID: \"42b623df-5cd7-43f8-bcb6-f17bc08e46de\") " Feb 26 16:03:55 crc kubenswrapper[4907]: I0226 16:03:55.763151 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/42b623df-5cd7-43f8-bcb6-f17bc08e46de-dns-svc\") pod \"42b623df-5cd7-43f8-bcb6-f17bc08e46de\" (UID: \"42b623df-5cd7-43f8-bcb6-f17bc08e46de\") " Feb 26 16:03:55 crc kubenswrapper[4907]: I0226 16:03:55.763198 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/42b623df-5cd7-43f8-bcb6-f17bc08e46de-config\") pod \"42b623df-5cd7-43f8-bcb6-f17bc08e46de\" (UID: \"42b623df-5cd7-43f8-bcb6-f17bc08e46de\") " Feb 26 16:03:55 crc kubenswrapper[4907]: I0226 16:03:55.763239 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wdgrp\" (UniqueName: \"kubernetes.io/projected/42b623df-5cd7-43f8-bcb6-f17bc08e46de-kube-api-access-wdgrp\") pod \"42b623df-5cd7-43f8-bcb6-f17bc08e46de\" (UID: \"42b623df-5cd7-43f8-bcb6-f17bc08e46de\") " Feb 26 16:03:55 crc kubenswrapper[4907]: I0226 16:03:55.763342 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/42b623df-5cd7-43f8-bcb6-f17bc08e46de-ovsdbserver-nb\") pod 
\"42b623df-5cd7-43f8-bcb6-f17bc08e46de\" (UID: \"42b623df-5cd7-43f8-bcb6-f17bc08e46de\") " Feb 26 16:03:55 crc kubenswrapper[4907]: I0226 16:03:55.763421 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/42b623df-5cd7-43f8-bcb6-f17bc08e46de-ovsdbserver-sb\") pod \"42b623df-5cd7-43f8-bcb6-f17bc08e46de\" (UID: \"42b623df-5cd7-43f8-bcb6-f17bc08e46de\") " Feb 26 16:03:55 crc kubenswrapper[4907]: I0226 16:03:55.771537 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/42b623df-5cd7-43f8-bcb6-f17bc08e46de-kube-api-access-wdgrp" (OuterVolumeSpecName: "kube-api-access-wdgrp") pod "42b623df-5cd7-43f8-bcb6-f17bc08e46de" (UID: "42b623df-5cd7-43f8-bcb6-f17bc08e46de"). InnerVolumeSpecName "kube-api-access-wdgrp". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 26 16:03:55 crc kubenswrapper[4907]: I0226 16:03:55.807783 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/42b623df-5cd7-43f8-bcb6-f17bc08e46de-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "42b623df-5cd7-43f8-bcb6-f17bc08e46de" (UID: "42b623df-5cd7-43f8-bcb6-f17bc08e46de"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 26 16:03:55 crc kubenswrapper[4907]: I0226 16:03:55.810097 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/42b623df-5cd7-43f8-bcb6-f17bc08e46de-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "42b623df-5cd7-43f8-bcb6-f17bc08e46de" (UID: "42b623df-5cd7-43f8-bcb6-f17bc08e46de"). InnerVolumeSpecName "ovsdbserver-sb". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 26 16:03:55 crc kubenswrapper[4907]: I0226 16:03:55.815443 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/42b623df-5cd7-43f8-bcb6-f17bc08e46de-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "42b623df-5cd7-43f8-bcb6-f17bc08e46de" (UID: "42b623df-5cd7-43f8-bcb6-f17bc08e46de"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 26 16:03:55 crc kubenswrapper[4907]: I0226 16:03:55.817155 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/42b623df-5cd7-43f8-bcb6-f17bc08e46de-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "42b623df-5cd7-43f8-bcb6-f17bc08e46de" (UID: "42b623df-5cd7-43f8-bcb6-f17bc08e46de"). InnerVolumeSpecName "dns-swift-storage-0". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 26 16:03:55 crc kubenswrapper[4907]: I0226 16:03:55.818225 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/42b623df-5cd7-43f8-bcb6-f17bc08e46de-config" (OuterVolumeSpecName: "config") pod "42b623df-5cd7-43f8-bcb6-f17bc08e46de" (UID: "42b623df-5cd7-43f8-bcb6-f17bc08e46de"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 26 16:03:55 crc kubenswrapper[4907]: I0226 16:03:55.865232 4907 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/42b623df-5cd7-43f8-bcb6-f17bc08e46de-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Feb 26 16:03:55 crc kubenswrapper[4907]: I0226 16:03:55.865272 4907 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/42b623df-5cd7-43f8-bcb6-f17bc08e46de-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Feb 26 16:03:55 crc kubenswrapper[4907]: I0226 16:03:55.865285 4907 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/42b623df-5cd7-43f8-bcb6-f17bc08e46de-dns-swift-storage-0\") on node \"crc\" DevicePath \"\"" Feb 26 16:03:55 crc kubenswrapper[4907]: I0226 16:03:55.865299 4907 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/42b623df-5cd7-43f8-bcb6-f17bc08e46de-dns-svc\") on node \"crc\" DevicePath \"\"" Feb 26 16:03:55 crc kubenswrapper[4907]: I0226 16:03:55.865311 4907 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/42b623df-5cd7-43f8-bcb6-f17bc08e46de-config\") on node \"crc\" DevicePath \"\"" Feb 26 16:03:55 crc kubenswrapper[4907]: I0226 16:03:55.865322 4907 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-wdgrp\" (UniqueName: \"kubernetes.io/projected/42b623df-5cd7-43f8-bcb6-f17bc08e46de-kube-api-access-wdgrp\") on node \"crc\" DevicePath \"\"" Feb 26 16:03:56 crc kubenswrapper[4907]: I0226 16:03:56.182831 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5f59b8f679-vv9x4" event={"ID":"6ee71975-a322-4ae3-99a4-7bd42e1d3761","Type":"ContainerStarted","Data":"aff3a9972a48788f25620bbcdf6cad2a75ed26bac2426071012e912c80cff3ab"} Feb 26 16:03:56 crc 
kubenswrapper[4907]: I0226 16:03:56.183137 4907 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-5f59b8f679-vv9x4" Feb 26 16:03:56 crc kubenswrapper[4907]: I0226 16:03:56.187635 4907 generic.go:334] "Generic (PLEG): container finished" podID="42b623df-5cd7-43f8-bcb6-f17bc08e46de" containerID="e072b415fa40de7ce96dfdcdab88ff1bc92ad3bb583be8452ee619b536dd800f" exitCode=0 Feb 26 16:03:56 crc kubenswrapper[4907]: I0226 16:03:56.187677 4907 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-5c79d794d7-sjpsh" Feb 26 16:03:56 crc kubenswrapper[4907]: I0226 16:03:56.187757 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5c79d794d7-sjpsh" event={"ID":"42b623df-5cd7-43f8-bcb6-f17bc08e46de","Type":"ContainerDied","Data":"e072b415fa40de7ce96dfdcdab88ff1bc92ad3bb583be8452ee619b536dd800f"} Feb 26 16:03:56 crc kubenswrapper[4907]: I0226 16:03:56.187853 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5c79d794d7-sjpsh" event={"ID":"42b623df-5cd7-43f8-bcb6-f17bc08e46de","Type":"ContainerDied","Data":"85004dad6c1829e1c7dcaaa92accfca0ed260ab04e9c49c53f346589db424b41"} Feb 26 16:03:56 crc kubenswrapper[4907]: I0226 16:03:56.187879 4907 scope.go:117] "RemoveContainer" containerID="e072b415fa40de7ce96dfdcdab88ff1bc92ad3bb583be8452ee619b536dd800f" Feb 26 16:03:56 crc kubenswrapper[4907]: I0226 16:03:56.205069 4907 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-5f59b8f679-vv9x4" podStartSLOduration=3.205048014 podStartE2EDuration="3.205048014s" podCreationTimestamp="2026-02-26 16:03:53 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-26 16:03:56.198361375 +0000 UTC m=+1298.716923224" watchObservedRunningTime="2026-02-26 16:03:56.205048014 +0000 UTC m=+1298.723609863" Feb 26 16:03:56 crc 
kubenswrapper[4907]: I0226 16:03:56.242268 4907 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-5c79d794d7-sjpsh"] Feb 26 16:03:56 crc kubenswrapper[4907]: I0226 16:03:56.244500 4907 scope.go:117] "RemoveContainer" containerID="b5d78a991778db9fe9e4e2a64b2377b55a4e8f56e368d76b750c0d5162098c96" Feb 26 16:03:56 crc kubenswrapper[4907]: I0226 16:03:56.248633 4907 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-5c79d794d7-sjpsh"] Feb 26 16:03:56 crc kubenswrapper[4907]: I0226 16:03:56.268662 4907 scope.go:117] "RemoveContainer" containerID="e072b415fa40de7ce96dfdcdab88ff1bc92ad3bb583be8452ee619b536dd800f" Feb 26 16:03:56 crc kubenswrapper[4907]: E0226 16:03:56.269226 4907 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e072b415fa40de7ce96dfdcdab88ff1bc92ad3bb583be8452ee619b536dd800f\": container with ID starting with e072b415fa40de7ce96dfdcdab88ff1bc92ad3bb583be8452ee619b536dd800f not found: ID does not exist" containerID="e072b415fa40de7ce96dfdcdab88ff1bc92ad3bb583be8452ee619b536dd800f" Feb 26 16:03:56 crc kubenswrapper[4907]: I0226 16:03:56.269387 4907 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e072b415fa40de7ce96dfdcdab88ff1bc92ad3bb583be8452ee619b536dd800f"} err="failed to get container status \"e072b415fa40de7ce96dfdcdab88ff1bc92ad3bb583be8452ee619b536dd800f\": rpc error: code = NotFound desc = could not find container \"e072b415fa40de7ce96dfdcdab88ff1bc92ad3bb583be8452ee619b536dd800f\": container with ID starting with e072b415fa40de7ce96dfdcdab88ff1bc92ad3bb583be8452ee619b536dd800f not found: ID does not exist" Feb 26 16:03:56 crc kubenswrapper[4907]: I0226 16:03:56.269419 4907 scope.go:117] "RemoveContainer" containerID="b5d78a991778db9fe9e4e2a64b2377b55a4e8f56e368d76b750c0d5162098c96" Feb 26 16:03:56 crc kubenswrapper[4907]: E0226 16:03:56.269841 4907 log.go:32] "ContainerStatus 
from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b5d78a991778db9fe9e4e2a64b2377b55a4e8f56e368d76b750c0d5162098c96\": container with ID starting with b5d78a991778db9fe9e4e2a64b2377b55a4e8f56e368d76b750c0d5162098c96 not found: ID does not exist" containerID="b5d78a991778db9fe9e4e2a64b2377b55a4e8f56e368d76b750c0d5162098c96" Feb 26 16:03:56 crc kubenswrapper[4907]: I0226 16:03:56.269866 4907 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b5d78a991778db9fe9e4e2a64b2377b55a4e8f56e368d76b750c0d5162098c96"} err="failed to get container status \"b5d78a991778db9fe9e4e2a64b2377b55a4e8f56e368d76b750c0d5162098c96\": rpc error: code = NotFound desc = could not find container \"b5d78a991778db9fe9e4e2a64b2377b55a4e8f56e368d76b750c0d5162098c96\": container with ID starting with b5d78a991778db9fe9e4e2a64b2377b55a4e8f56e368d76b750c0d5162098c96 not found: ID does not exist" Feb 26 16:03:56 crc kubenswrapper[4907]: I0226 16:03:56.497996 4907 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-db-sync-ts667" Feb 26 16:03:56 crc kubenswrapper[4907]: I0226 16:03:56.676839 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a94cb55c-878d-432f-ab95-4d0012359b2f-config-data\") pod \"a94cb55c-878d-432f-ab95-4d0012359b2f\" (UID: \"a94cb55c-878d-432f-ab95-4d0012359b2f\") " Feb 26 16:03:56 crc kubenswrapper[4907]: I0226 16:03:56.677309 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2wcgk\" (UniqueName: \"kubernetes.io/projected/a94cb55c-878d-432f-ab95-4d0012359b2f-kube-api-access-2wcgk\") pod \"a94cb55c-878d-432f-ab95-4d0012359b2f\" (UID: \"a94cb55c-878d-432f-ab95-4d0012359b2f\") " Feb 26 16:03:56 crc kubenswrapper[4907]: I0226 16:03:56.678153 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a94cb55c-878d-432f-ab95-4d0012359b2f-combined-ca-bundle\") pod \"a94cb55c-878d-432f-ab95-4d0012359b2f\" (UID: \"a94cb55c-878d-432f-ab95-4d0012359b2f\") " Feb 26 16:03:56 crc kubenswrapper[4907]: I0226 16:03:56.682340 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a94cb55c-878d-432f-ab95-4d0012359b2f-kube-api-access-2wcgk" (OuterVolumeSpecName: "kube-api-access-2wcgk") pod "a94cb55c-878d-432f-ab95-4d0012359b2f" (UID: "a94cb55c-878d-432f-ab95-4d0012359b2f"). InnerVolumeSpecName "kube-api-access-2wcgk". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 26 16:03:56 crc kubenswrapper[4907]: I0226 16:03:56.706322 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a94cb55c-878d-432f-ab95-4d0012359b2f-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "a94cb55c-878d-432f-ab95-4d0012359b2f" (UID: "a94cb55c-878d-432f-ab95-4d0012359b2f"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 26 16:03:56 crc kubenswrapper[4907]: I0226 16:03:56.735820 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a94cb55c-878d-432f-ab95-4d0012359b2f-config-data" (OuterVolumeSpecName: "config-data") pod "a94cb55c-878d-432f-ab95-4d0012359b2f" (UID: "a94cb55c-878d-432f-ab95-4d0012359b2f"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 26 16:03:56 crc kubenswrapper[4907]: I0226 16:03:56.779865 4907 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a94cb55c-878d-432f-ab95-4d0012359b2f-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 26 16:03:56 crc kubenswrapper[4907]: I0226 16:03:56.780080 4907 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a94cb55c-878d-432f-ab95-4d0012359b2f-config-data\") on node \"crc\" DevicePath \"\"" Feb 26 16:03:56 crc kubenswrapper[4907]: I0226 16:03:56.780150 4907 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2wcgk\" (UniqueName: \"kubernetes.io/projected/a94cb55c-878d-432f-ab95-4d0012359b2f-kube-api-access-2wcgk\") on node \"crc\" DevicePath \"\"" Feb 26 16:03:57 crc kubenswrapper[4907]: I0226 16:03:57.202868 4907 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-db-sync-ts667" Feb 26 16:03:57 crc kubenswrapper[4907]: I0226 16:03:57.206034 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-sync-ts667" event={"ID":"a94cb55c-878d-432f-ab95-4d0012359b2f","Type":"ContainerDied","Data":"e12b82d4982c27dfe410abf0b8326daca1b3e947dfd469bb290624e32f522b8f"} Feb 26 16:03:57 crc kubenswrapper[4907]: I0226 16:03:57.206169 4907 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="e12b82d4982c27dfe410abf0b8326daca1b3e947dfd469bb290624e32f522b8f" Feb 26 16:03:57 crc kubenswrapper[4907]: I0226 16:03:57.500012 4907 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-bootstrap-xvxcj"] Feb 26 16:03:57 crc kubenswrapper[4907]: E0226 16:03:57.500418 4907 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="42b623df-5cd7-43f8-bcb6-f17bc08e46de" containerName="init" Feb 26 16:03:57 crc kubenswrapper[4907]: I0226 16:03:57.500438 4907 state_mem.go:107] "Deleted CPUSet assignment" podUID="42b623df-5cd7-43f8-bcb6-f17bc08e46de" containerName="init" Feb 26 16:03:57 crc kubenswrapper[4907]: E0226 16:03:57.500477 4907 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a94cb55c-878d-432f-ab95-4d0012359b2f" containerName="keystone-db-sync" Feb 26 16:03:57 crc kubenswrapper[4907]: I0226 16:03:57.500486 4907 state_mem.go:107] "Deleted CPUSet assignment" podUID="a94cb55c-878d-432f-ab95-4d0012359b2f" containerName="keystone-db-sync" Feb 26 16:03:57 crc kubenswrapper[4907]: E0226 16:03:57.500498 4907 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="42b623df-5cd7-43f8-bcb6-f17bc08e46de" containerName="dnsmasq-dns" Feb 26 16:03:57 crc kubenswrapper[4907]: I0226 16:03:57.500507 4907 state_mem.go:107] "Deleted CPUSet assignment" podUID="42b623df-5cd7-43f8-bcb6-f17bc08e46de" containerName="dnsmasq-dns" Feb 26 16:03:57 crc kubenswrapper[4907]: I0226 16:03:57.500730 4907 memory_manager.go:354] 
"RemoveStaleState removing state" podUID="42b623df-5cd7-43f8-bcb6-f17bc08e46de" containerName="dnsmasq-dns" Feb 26 16:03:57 crc kubenswrapper[4907]: I0226 16:03:57.500755 4907 memory_manager.go:354] "RemoveStaleState removing state" podUID="a94cb55c-878d-432f-ab95-4d0012359b2f" containerName="keystone-db-sync" Feb 26 16:03:57 crc kubenswrapper[4907]: I0226 16:03:57.501397 4907 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-bootstrap-xvxcj" Feb 26 16:03:57 crc kubenswrapper[4907]: I0226 16:03:57.504216 4907 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone" Feb 26 16:03:57 crc kubenswrapper[4907]: I0226 16:03:57.504434 4907 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-scripts" Feb 26 16:03:57 crc kubenswrapper[4907]: I0226 16:03:57.504671 4907 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"osp-secret" Feb 26 16:03:57 crc kubenswrapper[4907]: I0226 16:03:57.504780 4907 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-keystone-dockercfg-vv59s" Feb 26 16:03:57 crc kubenswrapper[4907]: I0226 16:03:57.505861 4907 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-config-data" Feb 26 16:03:57 crc kubenswrapper[4907]: I0226 16:03:57.509649 4907 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-5f59b8f679-vv9x4"] Feb 26 16:03:57 crc kubenswrapper[4907]: I0226 16:03:57.569472 4907 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-bootstrap-xvxcj"] Feb 26 16:03:57 crc kubenswrapper[4907]: I0226 16:03:57.585761 4907 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-bbf5cc879-8lgxh"] Feb 26 16:03:57 crc kubenswrapper[4907]: I0226 16:03:57.587180 4907 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-bbf5cc879-8lgxh" Feb 26 16:03:57 crc kubenswrapper[4907]: I0226 16:03:57.593543 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/b3fd641f-23d3-4d70-af64-66c3507eff49-scripts\") pod \"keystone-bootstrap-xvxcj\" (UID: \"b3fd641f-23d3-4d70-af64-66c3507eff49\") " pod="openstack/keystone-bootstrap-xvxcj" Feb 26 16:03:57 crc kubenswrapper[4907]: I0226 16:03:57.593695 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/b3fd641f-23d3-4d70-af64-66c3507eff49-credential-keys\") pod \"keystone-bootstrap-xvxcj\" (UID: \"b3fd641f-23d3-4d70-af64-66c3507eff49\") " pod="openstack/keystone-bootstrap-xvxcj" Feb 26 16:03:57 crc kubenswrapper[4907]: I0226 16:03:57.593727 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/b3fd641f-23d3-4d70-af64-66c3507eff49-fernet-keys\") pod \"keystone-bootstrap-xvxcj\" (UID: \"b3fd641f-23d3-4d70-af64-66c3507eff49\") " pod="openstack/keystone-bootstrap-xvxcj" Feb 26 16:03:57 crc kubenswrapper[4907]: I0226 16:03:57.593852 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b3fd641f-23d3-4d70-af64-66c3507eff49-combined-ca-bundle\") pod \"keystone-bootstrap-xvxcj\" (UID: \"b3fd641f-23d3-4d70-af64-66c3507eff49\") " pod="openstack/keystone-bootstrap-xvxcj" Feb 26 16:03:57 crc kubenswrapper[4907]: I0226 16:03:57.593964 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b3fd641f-23d3-4d70-af64-66c3507eff49-config-data\") pod \"keystone-bootstrap-xvxcj\" (UID: \"b3fd641f-23d3-4d70-af64-66c3507eff49\") " 
pod="openstack/keystone-bootstrap-xvxcj" Feb 26 16:03:57 crc kubenswrapper[4907]: I0226 16:03:57.594080 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kt224\" (UniqueName: \"kubernetes.io/projected/b3fd641f-23d3-4d70-af64-66c3507eff49-kube-api-access-kt224\") pod \"keystone-bootstrap-xvxcj\" (UID: \"b3fd641f-23d3-4d70-af64-66c3507eff49\") " pod="openstack/keystone-bootstrap-xvxcj" Feb 26 16:03:57 crc kubenswrapper[4907]: I0226 16:03:57.696765 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/bb7df1d1-6bd7-4b9b-a1ef-95722b7fc811-dns-swift-storage-0\") pod \"dnsmasq-dns-bbf5cc879-8lgxh\" (UID: \"bb7df1d1-6bd7-4b9b-a1ef-95722b7fc811\") " pod="openstack/dnsmasq-dns-bbf5cc879-8lgxh" Feb 26 16:03:57 crc kubenswrapper[4907]: I0226 16:03:57.696815 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/b3fd641f-23d3-4d70-af64-66c3507eff49-scripts\") pod \"keystone-bootstrap-xvxcj\" (UID: \"b3fd641f-23d3-4d70-af64-66c3507eff49\") " pod="openstack/keystone-bootstrap-xvxcj" Feb 26 16:03:57 crc kubenswrapper[4907]: I0226 16:03:57.696843 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/bb7df1d1-6bd7-4b9b-a1ef-95722b7fc811-ovsdbserver-nb\") pod \"dnsmasq-dns-bbf5cc879-8lgxh\" (UID: \"bb7df1d1-6bd7-4b9b-a1ef-95722b7fc811\") " pod="openstack/dnsmasq-dns-bbf5cc879-8lgxh" Feb 26 16:03:57 crc kubenswrapper[4907]: I0226 16:03:57.696879 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/b3fd641f-23d3-4d70-af64-66c3507eff49-credential-keys\") pod \"keystone-bootstrap-xvxcj\" (UID: \"b3fd641f-23d3-4d70-af64-66c3507eff49\") " 
pod="openstack/keystone-bootstrap-xvxcj" Feb 26 16:03:57 crc kubenswrapper[4907]: I0226 16:03:57.696912 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/b3fd641f-23d3-4d70-af64-66c3507eff49-fernet-keys\") pod \"keystone-bootstrap-xvxcj\" (UID: \"b3fd641f-23d3-4d70-af64-66c3507eff49\") " pod="openstack/keystone-bootstrap-xvxcj" Feb 26 16:03:57 crc kubenswrapper[4907]: I0226 16:03:57.697002 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b3fd641f-23d3-4d70-af64-66c3507eff49-combined-ca-bundle\") pod \"keystone-bootstrap-xvxcj\" (UID: \"b3fd641f-23d3-4d70-af64-66c3507eff49\") " pod="openstack/keystone-bootstrap-xvxcj" Feb 26 16:03:57 crc kubenswrapper[4907]: I0226 16:03:57.697032 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/bb7df1d1-6bd7-4b9b-a1ef-95722b7fc811-dns-svc\") pod \"dnsmasq-dns-bbf5cc879-8lgxh\" (UID: \"bb7df1d1-6bd7-4b9b-a1ef-95722b7fc811\") " pod="openstack/dnsmasq-dns-bbf5cc879-8lgxh" Feb 26 16:03:57 crc kubenswrapper[4907]: I0226 16:03:57.697075 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b3fd641f-23d3-4d70-af64-66c3507eff49-config-data\") pod \"keystone-bootstrap-xvxcj\" (UID: \"b3fd641f-23d3-4d70-af64-66c3507eff49\") " pod="openstack/keystone-bootstrap-xvxcj" Feb 26 16:03:57 crc kubenswrapper[4907]: I0226 16:03:57.697110 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/bb7df1d1-6bd7-4b9b-a1ef-95722b7fc811-config\") pod \"dnsmasq-dns-bbf5cc879-8lgxh\" (UID: \"bb7df1d1-6bd7-4b9b-a1ef-95722b7fc811\") " pod="openstack/dnsmasq-dns-bbf5cc879-8lgxh" Feb 26 16:03:57 crc kubenswrapper[4907]: I0226 
16:03:57.697131 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-r28gk\" (UniqueName: \"kubernetes.io/projected/bb7df1d1-6bd7-4b9b-a1ef-95722b7fc811-kube-api-access-r28gk\") pod \"dnsmasq-dns-bbf5cc879-8lgxh\" (UID: \"bb7df1d1-6bd7-4b9b-a1ef-95722b7fc811\") " pod="openstack/dnsmasq-dns-bbf5cc879-8lgxh" Feb 26 16:03:57 crc kubenswrapper[4907]: I0226 16:03:57.697154 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/bb7df1d1-6bd7-4b9b-a1ef-95722b7fc811-ovsdbserver-sb\") pod \"dnsmasq-dns-bbf5cc879-8lgxh\" (UID: \"bb7df1d1-6bd7-4b9b-a1ef-95722b7fc811\") " pod="openstack/dnsmasq-dns-bbf5cc879-8lgxh" Feb 26 16:03:57 crc kubenswrapper[4907]: I0226 16:03:57.697206 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kt224\" (UniqueName: \"kubernetes.io/projected/b3fd641f-23d3-4d70-af64-66c3507eff49-kube-api-access-kt224\") pod \"keystone-bootstrap-xvxcj\" (UID: \"b3fd641f-23d3-4d70-af64-66c3507eff49\") " pod="openstack/keystone-bootstrap-xvxcj" Feb 26 16:03:57 crc kubenswrapper[4907]: I0226 16:03:57.701775 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b3fd641f-23d3-4d70-af64-66c3507eff49-config-data\") pod \"keystone-bootstrap-xvxcj\" (UID: \"b3fd641f-23d3-4d70-af64-66c3507eff49\") " pod="openstack/keystone-bootstrap-xvxcj" Feb 26 16:03:57 crc kubenswrapper[4907]: I0226 16:03:57.702486 4907 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-bbf5cc879-8lgxh"] Feb 26 16:03:57 crc kubenswrapper[4907]: I0226 16:03:57.717923 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/b3fd641f-23d3-4d70-af64-66c3507eff49-fernet-keys\") pod \"keystone-bootstrap-xvxcj\" (UID: 
\"b3fd641f-23d3-4d70-af64-66c3507eff49\") " pod="openstack/keystone-bootstrap-xvxcj" Feb 26 16:03:57 crc kubenswrapper[4907]: I0226 16:03:57.723412 4907 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-db-sync-xvvbl"] Feb 26 16:03:57 crc kubenswrapper[4907]: I0226 16:03:57.724819 4907 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-db-sync-xvvbl" Feb 26 16:03:57 crc kubenswrapper[4907]: I0226 16:03:57.726183 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/b3fd641f-23d3-4d70-af64-66c3507eff49-credential-keys\") pod \"keystone-bootstrap-xvxcj\" (UID: \"b3fd641f-23d3-4d70-af64-66c3507eff49\") " pod="openstack/keystone-bootstrap-xvxcj" Feb 26 16:03:57 crc kubenswrapper[4907]: I0226 16:03:57.741344 4907 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-db-sync-xvvbl"] Feb 26 16:03:57 crc kubenswrapper[4907]: I0226 16:03:57.742065 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/b3fd641f-23d3-4d70-af64-66c3507eff49-scripts\") pod \"keystone-bootstrap-xvxcj\" (UID: \"b3fd641f-23d3-4d70-af64-66c3507eff49\") " pod="openstack/keystone-bootstrap-xvxcj" Feb 26 16:03:57 crc kubenswrapper[4907]: I0226 16:03:57.742262 4907 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-cinder-dockercfg-rm5vl" Feb 26 16:03:57 crc kubenswrapper[4907]: I0226 16:03:57.742683 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b3fd641f-23d3-4d70-af64-66c3507eff49-combined-ca-bundle\") pod \"keystone-bootstrap-xvxcj\" (UID: \"b3fd641f-23d3-4d70-af64-66c3507eff49\") " pod="openstack/keystone-bootstrap-xvxcj" Feb 26 16:03:57 crc kubenswrapper[4907]: I0226 16:03:57.742267 4907 reflector.go:368] Caches populated for *v1.Secret from 
object-"openstack"/"cinder-scripts" Feb 26 16:03:57 crc kubenswrapper[4907]: I0226 16:03:57.742703 4907 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-config-data" Feb 26 16:03:57 crc kubenswrapper[4907]: I0226 16:03:57.772707 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kt224\" (UniqueName: \"kubernetes.io/projected/b3fd641f-23d3-4d70-af64-66c3507eff49-kube-api-access-kt224\") pod \"keystone-bootstrap-xvxcj\" (UID: \"b3fd641f-23d3-4d70-af64-66c3507eff49\") " pod="openstack/keystone-bootstrap-xvxcj" Feb 26 16:03:57 crc kubenswrapper[4907]: I0226 16:03:57.802914 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/bb7df1d1-6bd7-4b9b-a1ef-95722b7fc811-dns-swift-storage-0\") pod \"dnsmasq-dns-bbf5cc879-8lgxh\" (UID: \"bb7df1d1-6bd7-4b9b-a1ef-95722b7fc811\") " pod="openstack/dnsmasq-dns-bbf5cc879-8lgxh" Feb 26 16:03:57 crc kubenswrapper[4907]: I0226 16:03:57.803185 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/bb7df1d1-6bd7-4b9b-a1ef-95722b7fc811-ovsdbserver-nb\") pod \"dnsmasq-dns-bbf5cc879-8lgxh\" (UID: \"bb7df1d1-6bd7-4b9b-a1ef-95722b7fc811\") " pod="openstack/dnsmasq-dns-bbf5cc879-8lgxh" Feb 26 16:03:57 crc kubenswrapper[4907]: I0226 16:03:57.803371 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/bb7df1d1-6bd7-4b9b-a1ef-95722b7fc811-dns-svc\") pod \"dnsmasq-dns-bbf5cc879-8lgxh\" (UID: \"bb7df1d1-6bd7-4b9b-a1ef-95722b7fc811\") " pod="openstack/dnsmasq-dns-bbf5cc879-8lgxh" Feb 26 16:03:57 crc kubenswrapper[4907]: I0226 16:03:57.803509 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/bb7df1d1-6bd7-4b9b-a1ef-95722b7fc811-config\") pod 
\"dnsmasq-dns-bbf5cc879-8lgxh\" (UID: \"bb7df1d1-6bd7-4b9b-a1ef-95722b7fc811\") " pod="openstack/dnsmasq-dns-bbf5cc879-8lgxh" Feb 26 16:03:57 crc kubenswrapper[4907]: I0226 16:03:57.803624 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-r28gk\" (UniqueName: \"kubernetes.io/projected/bb7df1d1-6bd7-4b9b-a1ef-95722b7fc811-kube-api-access-r28gk\") pod \"dnsmasq-dns-bbf5cc879-8lgxh\" (UID: \"bb7df1d1-6bd7-4b9b-a1ef-95722b7fc811\") " pod="openstack/dnsmasq-dns-bbf5cc879-8lgxh" Feb 26 16:03:57 crc kubenswrapper[4907]: I0226 16:03:57.803735 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/bb7df1d1-6bd7-4b9b-a1ef-95722b7fc811-ovsdbserver-sb\") pod \"dnsmasq-dns-bbf5cc879-8lgxh\" (UID: \"bb7df1d1-6bd7-4b9b-a1ef-95722b7fc811\") " pod="openstack/dnsmasq-dns-bbf5cc879-8lgxh" Feb 26 16:03:57 crc kubenswrapper[4907]: I0226 16:03:57.804713 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/bb7df1d1-6bd7-4b9b-a1ef-95722b7fc811-ovsdbserver-nb\") pod \"dnsmasq-dns-bbf5cc879-8lgxh\" (UID: \"bb7df1d1-6bd7-4b9b-a1ef-95722b7fc811\") " pod="openstack/dnsmasq-dns-bbf5cc879-8lgxh" Feb 26 16:03:57 crc kubenswrapper[4907]: I0226 16:03:57.805371 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/bb7df1d1-6bd7-4b9b-a1ef-95722b7fc811-dns-swift-storage-0\") pod \"dnsmasq-dns-bbf5cc879-8lgxh\" (UID: \"bb7df1d1-6bd7-4b9b-a1ef-95722b7fc811\") " pod="openstack/dnsmasq-dns-bbf5cc879-8lgxh" Feb 26 16:03:57 crc kubenswrapper[4907]: I0226 16:03:57.805706 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/bb7df1d1-6bd7-4b9b-a1ef-95722b7fc811-ovsdbserver-sb\") pod \"dnsmasq-dns-bbf5cc879-8lgxh\" (UID: 
\"bb7df1d1-6bd7-4b9b-a1ef-95722b7fc811\") " pod="openstack/dnsmasq-dns-bbf5cc879-8lgxh" Feb 26 16:03:57 crc kubenswrapper[4907]: I0226 16:03:57.806115 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/bb7df1d1-6bd7-4b9b-a1ef-95722b7fc811-config\") pod \"dnsmasq-dns-bbf5cc879-8lgxh\" (UID: \"bb7df1d1-6bd7-4b9b-a1ef-95722b7fc811\") " pod="openstack/dnsmasq-dns-bbf5cc879-8lgxh" Feb 26 16:03:57 crc kubenswrapper[4907]: I0226 16:03:57.806914 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/bb7df1d1-6bd7-4b9b-a1ef-95722b7fc811-dns-svc\") pod \"dnsmasq-dns-bbf5cc879-8lgxh\" (UID: \"bb7df1d1-6bd7-4b9b-a1ef-95722b7fc811\") " pod="openstack/dnsmasq-dns-bbf5cc879-8lgxh" Feb 26 16:03:57 crc kubenswrapper[4907]: I0226 16:03:57.817025 4907 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-bootstrap-xvxcj" Feb 26 16:03:57 crc kubenswrapper[4907]: I0226 16:03:57.884921 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-r28gk\" (UniqueName: \"kubernetes.io/projected/bb7df1d1-6bd7-4b9b-a1ef-95722b7fc811-kube-api-access-r28gk\") pod \"dnsmasq-dns-bbf5cc879-8lgxh\" (UID: \"bb7df1d1-6bd7-4b9b-a1ef-95722b7fc811\") " pod="openstack/dnsmasq-dns-bbf5cc879-8lgxh" Feb 26 16:03:57 crc kubenswrapper[4907]: I0226 16:03:57.907386 4907 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-bbf5cc879-8lgxh" Feb 26 16:03:57 crc kubenswrapper[4907]: I0226 16:03:57.923380 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-sfcpk\" (UniqueName: \"kubernetes.io/projected/c98fd629-273b-4c87-a07c-4a482064a5a3-kube-api-access-sfcpk\") pod \"cinder-db-sync-xvvbl\" (UID: \"c98fd629-273b-4c87-a07c-4a482064a5a3\") " pod="openstack/cinder-db-sync-xvvbl" Feb 26 16:03:57 crc kubenswrapper[4907]: I0226 16:03:57.923439 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/c98fd629-273b-4c87-a07c-4a482064a5a3-db-sync-config-data\") pod \"cinder-db-sync-xvvbl\" (UID: \"c98fd629-273b-4c87-a07c-4a482064a5a3\") " pod="openstack/cinder-db-sync-xvvbl" Feb 26 16:03:57 crc kubenswrapper[4907]: I0226 16:03:57.923505 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/c98fd629-273b-4c87-a07c-4a482064a5a3-etc-machine-id\") pod \"cinder-db-sync-xvvbl\" (UID: \"c98fd629-273b-4c87-a07c-4a482064a5a3\") " pod="openstack/cinder-db-sync-xvvbl" Feb 26 16:03:57 crc kubenswrapper[4907]: I0226 16:03:57.923541 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/c98fd629-273b-4c87-a07c-4a482064a5a3-scripts\") pod \"cinder-db-sync-xvvbl\" (UID: \"c98fd629-273b-4c87-a07c-4a482064a5a3\") " pod="openstack/cinder-db-sync-xvvbl" Feb 26 16:03:57 crc kubenswrapper[4907]: I0226 16:03:57.923575 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c98fd629-273b-4c87-a07c-4a482064a5a3-combined-ca-bundle\") pod \"cinder-db-sync-xvvbl\" (UID: \"c98fd629-273b-4c87-a07c-4a482064a5a3\") 
" pod="openstack/cinder-db-sync-xvvbl" Feb 26 16:03:57 crc kubenswrapper[4907]: I0226 16:03:57.923657 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c98fd629-273b-4c87-a07c-4a482064a5a3-config-data\") pod \"cinder-db-sync-xvvbl\" (UID: \"c98fd629-273b-4c87-a07c-4a482064a5a3\") " pod="openstack/cinder-db-sync-xvvbl" Feb 26 16:03:57 crc kubenswrapper[4907]: I0226 16:03:57.971022 4907 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/neutron-db-sync-sg95t"] Feb 26 16:03:57 crc kubenswrapper[4907]: I0226 16:03:57.972569 4907 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-db-sync-sg95t" Feb 26 16:03:57 crc kubenswrapper[4907]: I0226 16:03:57.979323 4907 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-httpd-config" Feb 26 16:03:57 crc kubenswrapper[4907]: I0226 16:03:57.979560 4907 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-config" Feb 26 16:03:57 crc kubenswrapper[4907]: I0226 16:03:57.986015 4907 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-db-sync-sg95t"] Feb 26 16:03:57 crc kubenswrapper[4907]: I0226 16:03:57.989243 4907 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-neutron-dockercfg-4ppm5" Feb 26 16:03:57 crc kubenswrapper[4907]: I0226 16:03:57.995000 4907 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/horizon-785d56fd9c-lc7sg"] Feb 26 16:03:58 crc kubenswrapper[4907]: I0226 16:03:58.000113 4907 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/horizon-785d56fd9c-lc7sg" Feb 26 16:03:58 crc kubenswrapper[4907]: I0226 16:03:58.032688 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/c98fd629-273b-4c87-a07c-4a482064a5a3-etc-machine-id\") pod \"cinder-db-sync-xvvbl\" (UID: \"c98fd629-273b-4c87-a07c-4a482064a5a3\") " pod="openstack/cinder-db-sync-xvvbl" Feb 26 16:03:58 crc kubenswrapper[4907]: I0226 16:03:58.032842 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/c98fd629-273b-4c87-a07c-4a482064a5a3-scripts\") pod \"cinder-db-sync-xvvbl\" (UID: \"c98fd629-273b-4c87-a07c-4a482064a5a3\") " pod="openstack/cinder-db-sync-xvvbl" Feb 26 16:03:58 crc kubenswrapper[4907]: I0226 16:03:58.032946 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c98fd629-273b-4c87-a07c-4a482064a5a3-combined-ca-bundle\") pod \"cinder-db-sync-xvvbl\" (UID: \"c98fd629-273b-4c87-a07c-4a482064a5a3\") " pod="openstack/cinder-db-sync-xvvbl" Feb 26 16:03:58 crc kubenswrapper[4907]: I0226 16:03:58.033158 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c98fd629-273b-4c87-a07c-4a482064a5a3-config-data\") pod \"cinder-db-sync-xvvbl\" (UID: \"c98fd629-273b-4c87-a07c-4a482064a5a3\") " pod="openstack/cinder-db-sync-xvvbl" Feb 26 16:03:58 crc kubenswrapper[4907]: I0226 16:03:58.033472 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-sfcpk\" (UniqueName: \"kubernetes.io/projected/c98fd629-273b-4c87-a07c-4a482064a5a3-kube-api-access-sfcpk\") pod \"cinder-db-sync-xvvbl\" (UID: \"c98fd629-273b-4c87-a07c-4a482064a5a3\") " pod="openstack/cinder-db-sync-xvvbl" Feb 26 16:03:58 crc kubenswrapper[4907]: I0226 16:03:58.033534 4907 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/c98fd629-273b-4c87-a07c-4a482064a5a3-db-sync-config-data\") pod \"cinder-db-sync-xvvbl\" (UID: \"c98fd629-273b-4c87-a07c-4a482064a5a3\") " pod="openstack/cinder-db-sync-xvvbl" Feb 26 16:03:58 crc kubenswrapper[4907]: I0226 16:03:58.032858 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/c98fd629-273b-4c87-a07c-4a482064a5a3-etc-machine-id\") pod \"cinder-db-sync-xvvbl\" (UID: \"c98fd629-273b-4c87-a07c-4a482064a5a3\") " pod="openstack/cinder-db-sync-xvvbl" Feb 26 16:03:58 crc kubenswrapper[4907]: I0226 16:03:58.037698 4907 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"horizon" Feb 26 16:03:58 crc kubenswrapper[4907]: I0226 16:03:58.037934 4907 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"horizon-scripts" Feb 26 16:03:58 crc kubenswrapper[4907]: I0226 16:03:58.037969 4907 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"horizon-horizon-dockercfg-5x48r" Feb 26 16:03:58 crc kubenswrapper[4907]: I0226 16:03:58.038097 4907 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"horizon-config-data" Feb 26 16:03:58 crc kubenswrapper[4907]: I0226 16:03:58.046278 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/c98fd629-273b-4c87-a07c-4a482064a5a3-scripts\") pod \"cinder-db-sync-xvvbl\" (UID: \"c98fd629-273b-4c87-a07c-4a482064a5a3\") " pod="openstack/cinder-db-sync-xvvbl" Feb 26 16:03:58 crc kubenswrapper[4907]: I0226 16:03:58.047837 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/c98fd629-273b-4c87-a07c-4a482064a5a3-db-sync-config-data\") pod \"cinder-db-sync-xvvbl\" (UID: 
\"c98fd629-273b-4c87-a07c-4a482064a5a3\") " pod="openstack/cinder-db-sync-xvvbl" Feb 26 16:03:58 crc kubenswrapper[4907]: I0226 16:03:58.048370 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c98fd629-273b-4c87-a07c-4a482064a5a3-config-data\") pod \"cinder-db-sync-xvvbl\" (UID: \"c98fd629-273b-4c87-a07c-4a482064a5a3\") " pod="openstack/cinder-db-sync-xvvbl" Feb 26 16:03:58 crc kubenswrapper[4907]: I0226 16:03:58.112869 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c98fd629-273b-4c87-a07c-4a482064a5a3-combined-ca-bundle\") pod \"cinder-db-sync-xvvbl\" (UID: \"c98fd629-273b-4c87-a07c-4a482064a5a3\") " pod="openstack/cinder-db-sync-xvvbl" Feb 26 16:03:58 crc kubenswrapper[4907]: I0226 16:03:58.134633 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-sfcpk\" (UniqueName: \"kubernetes.io/projected/c98fd629-273b-4c87-a07c-4a482064a5a3-kube-api-access-sfcpk\") pod \"cinder-db-sync-xvvbl\" (UID: \"c98fd629-273b-4c87-a07c-4a482064a5a3\") " pod="openstack/cinder-db-sync-xvvbl" Feb 26 16:03:58 crc kubenswrapper[4907]: I0226 16:03:58.138963 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kwr5c\" (UniqueName: \"kubernetes.io/projected/1ae29e7c-7f4a-492f-b10d-2badd4d606aa-kube-api-access-kwr5c\") pod \"neutron-db-sync-sg95t\" (UID: \"1ae29e7c-7f4a-492f-b10d-2badd4d606aa\") " pod="openstack/neutron-db-sync-sg95t" Feb 26 16:03:58 crc kubenswrapper[4907]: I0226 16:03:58.139018 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/1ae29e7c-7f4a-492f-b10d-2badd4d606aa-config\") pod \"neutron-db-sync-sg95t\" (UID: \"1ae29e7c-7f4a-492f-b10d-2badd4d606aa\") " pod="openstack/neutron-db-sync-sg95t" Feb 26 16:03:58 crc 
kubenswrapper[4907]: I0226 16:03:58.139040 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/b3e0f652-e35c-49b8-abe3-9182b2026d08-horizon-secret-key\") pod \"horizon-785d56fd9c-lc7sg\" (UID: \"b3e0f652-e35c-49b8-abe3-9182b2026d08\") " pod="openstack/horizon-785d56fd9c-lc7sg" Feb 26 16:03:58 crc kubenswrapper[4907]: I0226 16:03:58.139060 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/b3e0f652-e35c-49b8-abe3-9182b2026d08-logs\") pod \"horizon-785d56fd9c-lc7sg\" (UID: \"b3e0f652-e35c-49b8-abe3-9182b2026d08\") " pod="openstack/horizon-785d56fd9c-lc7sg" Feb 26 16:03:58 crc kubenswrapper[4907]: I0226 16:03:58.139077 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1ae29e7c-7f4a-492f-b10d-2badd4d606aa-combined-ca-bundle\") pod \"neutron-db-sync-sg95t\" (UID: \"1ae29e7c-7f4a-492f-b10d-2badd4d606aa\") " pod="openstack/neutron-db-sync-sg95t" Feb 26 16:03:58 crc kubenswrapper[4907]: I0226 16:03:58.139133 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/b3e0f652-e35c-49b8-abe3-9182b2026d08-scripts\") pod \"horizon-785d56fd9c-lc7sg\" (UID: \"b3e0f652-e35c-49b8-abe3-9182b2026d08\") " pod="openstack/horizon-785d56fd9c-lc7sg" Feb 26 16:03:58 crc kubenswrapper[4907]: I0226 16:03:58.139178 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-x7r8q\" (UniqueName: \"kubernetes.io/projected/b3e0f652-e35c-49b8-abe3-9182b2026d08-kube-api-access-x7r8q\") pod \"horizon-785d56fd9c-lc7sg\" (UID: \"b3e0f652-e35c-49b8-abe3-9182b2026d08\") " pod="openstack/horizon-785d56fd9c-lc7sg" Feb 26 16:03:58 crc 
kubenswrapper[4907]: I0226 16:03:58.139195 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/b3e0f652-e35c-49b8-abe3-9182b2026d08-config-data\") pod \"horizon-785d56fd9c-lc7sg\" (UID: \"b3e0f652-e35c-49b8-abe3-9182b2026d08\") " pod="openstack/horizon-785d56fd9c-lc7sg" Feb 26 16:03:58 crc kubenswrapper[4907]: I0226 16:03:58.209492 4907 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="42b623df-5cd7-43f8-bcb6-f17bc08e46de" path="/var/lib/kubelet/pods/42b623df-5cd7-43f8-bcb6-f17bc08e46de/volumes" Feb 26 16:03:58 crc kubenswrapper[4907]: I0226 16:03:58.214068 4907 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-db-sync-xvvbl" Feb 26 16:03:58 crc kubenswrapper[4907]: I0226 16:03:58.220671 4907 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/horizon-785d56fd9c-lc7sg"] Feb 26 16:03:58 crc kubenswrapper[4907]: I0226 16:03:58.227848 4907 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-5f59b8f679-vv9x4" podUID="6ee71975-a322-4ae3-99a4-7bd42e1d3761" containerName="dnsmasq-dns" containerID="cri-o://aff3a9972a48788f25620bbcdf6cad2a75ed26bac2426071012e912c80cff3ab" gracePeriod=10 Feb 26 16:03:58 crc kubenswrapper[4907]: I0226 16:03:58.245686 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kwr5c\" (UniqueName: \"kubernetes.io/projected/1ae29e7c-7f4a-492f-b10d-2badd4d606aa-kube-api-access-kwr5c\") pod \"neutron-db-sync-sg95t\" (UID: \"1ae29e7c-7f4a-492f-b10d-2badd4d606aa\") " pod="openstack/neutron-db-sync-sg95t" Feb 26 16:03:58 crc kubenswrapper[4907]: I0226 16:03:58.245761 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/1ae29e7c-7f4a-492f-b10d-2badd4d606aa-config\") pod \"neutron-db-sync-sg95t\" (UID: 
\"1ae29e7c-7f4a-492f-b10d-2badd4d606aa\") " pod="openstack/neutron-db-sync-sg95t" Feb 26 16:03:58 crc kubenswrapper[4907]: I0226 16:03:58.245787 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/b3e0f652-e35c-49b8-abe3-9182b2026d08-horizon-secret-key\") pod \"horizon-785d56fd9c-lc7sg\" (UID: \"b3e0f652-e35c-49b8-abe3-9182b2026d08\") " pod="openstack/horizon-785d56fd9c-lc7sg" Feb 26 16:03:58 crc kubenswrapper[4907]: I0226 16:03:58.245807 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/b3e0f652-e35c-49b8-abe3-9182b2026d08-logs\") pod \"horizon-785d56fd9c-lc7sg\" (UID: \"b3e0f652-e35c-49b8-abe3-9182b2026d08\") " pod="openstack/horizon-785d56fd9c-lc7sg" Feb 26 16:03:58 crc kubenswrapper[4907]: I0226 16:03:58.245824 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1ae29e7c-7f4a-492f-b10d-2badd4d606aa-combined-ca-bundle\") pod \"neutron-db-sync-sg95t\" (UID: \"1ae29e7c-7f4a-492f-b10d-2badd4d606aa\") " pod="openstack/neutron-db-sync-sg95t" Feb 26 16:03:58 crc kubenswrapper[4907]: I0226 16:03:58.245877 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/b3e0f652-e35c-49b8-abe3-9182b2026d08-scripts\") pod \"horizon-785d56fd9c-lc7sg\" (UID: \"b3e0f652-e35c-49b8-abe3-9182b2026d08\") " pod="openstack/horizon-785d56fd9c-lc7sg" Feb 26 16:03:58 crc kubenswrapper[4907]: I0226 16:03:58.245961 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-x7r8q\" (UniqueName: \"kubernetes.io/projected/b3e0f652-e35c-49b8-abe3-9182b2026d08-kube-api-access-x7r8q\") pod \"horizon-785d56fd9c-lc7sg\" (UID: \"b3e0f652-e35c-49b8-abe3-9182b2026d08\") " pod="openstack/horizon-785d56fd9c-lc7sg" Feb 26 16:03:58 crc 
kubenswrapper[4907]: I0226 16:03:58.245979 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/b3e0f652-e35c-49b8-abe3-9182b2026d08-config-data\") pod \"horizon-785d56fd9c-lc7sg\" (UID: \"b3e0f652-e35c-49b8-abe3-9182b2026d08\") " pod="openstack/horizon-785d56fd9c-lc7sg" Feb 26 16:03:58 crc kubenswrapper[4907]: I0226 16:03:58.246940 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/b3e0f652-e35c-49b8-abe3-9182b2026d08-logs\") pod \"horizon-785d56fd9c-lc7sg\" (UID: \"b3e0f652-e35c-49b8-abe3-9182b2026d08\") " pod="openstack/horizon-785d56fd9c-lc7sg" Feb 26 16:03:58 crc kubenswrapper[4907]: I0226 16:03:58.247449 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/b3e0f652-e35c-49b8-abe3-9182b2026d08-config-data\") pod \"horizon-785d56fd9c-lc7sg\" (UID: \"b3e0f652-e35c-49b8-abe3-9182b2026d08\") " pod="openstack/horizon-785d56fd9c-lc7sg" Feb 26 16:03:58 crc kubenswrapper[4907]: I0226 16:03:58.252713 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/b3e0f652-e35c-49b8-abe3-9182b2026d08-scripts\") pod \"horizon-785d56fd9c-lc7sg\" (UID: \"b3e0f652-e35c-49b8-abe3-9182b2026d08\") " pod="openstack/horizon-785d56fd9c-lc7sg" Feb 26 16:03:58 crc kubenswrapper[4907]: I0226 16:03:58.257622 4907 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"] Feb 26 16:03:58 crc kubenswrapper[4907]: I0226 16:03:58.287666 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/b3e0f652-e35c-49b8-abe3-9182b2026d08-horizon-secret-key\") pod \"horizon-785d56fd9c-lc7sg\" (UID: \"b3e0f652-e35c-49b8-abe3-9182b2026d08\") " pod="openstack/horizon-785d56fd9c-lc7sg" Feb 26 16:03:58 crc kubenswrapper[4907]: 
I0226 16:03:58.312046 4907 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Feb 26 16:03:58 crc kubenswrapper[4907]: I0226 16:03:58.316296 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/secret/1ae29e7c-7f4a-492f-b10d-2badd4d606aa-config\") pod \"neutron-db-sync-sg95t\" (UID: \"1ae29e7c-7f4a-492f-b10d-2badd4d606aa\") " pod="openstack/neutron-db-sync-sg95t" Feb 26 16:03:58 crc kubenswrapper[4907]: I0226 16:03:58.330841 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1ae29e7c-7f4a-492f-b10d-2badd4d606aa-combined-ca-bundle\") pod \"neutron-db-sync-sg95t\" (UID: \"1ae29e7c-7f4a-492f-b10d-2badd4d606aa\") " pod="openstack/neutron-db-sync-sg95t" Feb 26 16:03:58 crc kubenswrapper[4907]: I0226 16:03:58.331090 4907 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Feb 26 16:03:58 crc kubenswrapper[4907]: I0226 16:03:58.332358 4907 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data" Feb 26 16:03:58 crc kubenswrapper[4907]: I0226 16:03:58.332649 4907 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts" Feb 26 16:03:58 crc kubenswrapper[4907]: I0226 16:03:58.360212 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kwr5c\" (UniqueName: \"kubernetes.io/projected/1ae29e7c-7f4a-492f-b10d-2badd4d606aa-kube-api-access-kwr5c\") pod \"neutron-db-sync-sg95t\" (UID: \"1ae29e7c-7f4a-492f-b10d-2badd4d606aa\") " pod="openstack/neutron-db-sync-sg95t" Feb 26 16:03:58 crc kubenswrapper[4907]: I0226 16:03:58.402515 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-x7r8q\" (UniqueName: \"kubernetes.io/projected/b3e0f652-e35c-49b8-abe3-9182b2026d08-kube-api-access-x7r8q\") pod 
\"horizon-785d56fd9c-lc7sg\" (UID: \"b3e0f652-e35c-49b8-abe3-9182b2026d08\") " pod="openstack/horizon-785d56fd9c-lc7sg" Feb 26 16:03:58 crc kubenswrapper[4907]: I0226 16:03:58.407014 4907 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/placement-db-sync-6t72w"] Feb 26 16:03:58 crc kubenswrapper[4907]: I0226 16:03:58.408465 4907 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/placement-db-sync-6t72w" Feb 26 16:03:58 crc kubenswrapper[4907]: I0226 16:03:58.424349 4907 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-placement-dockercfg-hcc7t" Feb 26 16:03:58 crc kubenswrapper[4907]: I0226 16:03:58.424488 4907 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-config-data" Feb 26 16:03:58 crc kubenswrapper[4907]: I0226 16:03:58.424559 4907 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-scripts" Feb 26 16:03:58 crc kubenswrapper[4907]: I0226 16:03:58.436993 4907 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-db-sync-sg95t" Feb 26 16:03:58 crc kubenswrapper[4907]: I0226 16:03:58.451516 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-487fd\" (UniqueName: \"kubernetes.io/projected/429e4875-18c7-4a0a-bfea-135d7aec6ba0-kube-api-access-487fd\") pod \"ceilometer-0\" (UID: \"429e4875-18c7-4a0a-bfea-135d7aec6ba0\") " pod="openstack/ceilometer-0" Feb 26 16:03:58 crc kubenswrapper[4907]: I0226 16:03:58.451572 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/429e4875-18c7-4a0a-bfea-135d7aec6ba0-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"429e4875-18c7-4a0a-bfea-135d7aec6ba0\") " pod="openstack/ceilometer-0" Feb 26 16:03:58 crc kubenswrapper[4907]: I0226 16:03:58.451725 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/429e4875-18c7-4a0a-bfea-135d7aec6ba0-scripts\") pod \"ceilometer-0\" (UID: \"429e4875-18c7-4a0a-bfea-135d7aec6ba0\") " pod="openstack/ceilometer-0" Feb 26 16:03:58 crc kubenswrapper[4907]: I0226 16:03:58.451835 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/429e4875-18c7-4a0a-bfea-135d7aec6ba0-log-httpd\") pod \"ceilometer-0\" (UID: \"429e4875-18c7-4a0a-bfea-135d7aec6ba0\") " pod="openstack/ceilometer-0" Feb 26 16:03:58 crc kubenswrapper[4907]: I0226 16:03:58.451850 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/429e4875-18c7-4a0a-bfea-135d7aec6ba0-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"429e4875-18c7-4a0a-bfea-135d7aec6ba0\") " pod="openstack/ceilometer-0" Feb 26 16:03:58 crc kubenswrapper[4907]: I0226 
16:03:58.451873 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/429e4875-18c7-4a0a-bfea-135d7aec6ba0-run-httpd\") pod \"ceilometer-0\" (UID: \"429e4875-18c7-4a0a-bfea-135d7aec6ba0\") " pod="openstack/ceilometer-0" Feb 26 16:03:58 crc kubenswrapper[4907]: I0226 16:03:58.451913 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/429e4875-18c7-4a0a-bfea-135d7aec6ba0-config-data\") pod \"ceilometer-0\" (UID: \"429e4875-18c7-4a0a-bfea-135d7aec6ba0\") " pod="openstack/ceilometer-0" Feb 26 16:03:58 crc kubenswrapper[4907]: I0226 16:03:58.452006 4907 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-db-sync-6t72w"] Feb 26 16:03:58 crc kubenswrapper[4907]: I0226 16:03:58.526326 4907 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/horizon-785d56fd9c-lc7sg" Feb 26 16:03:58 crc kubenswrapper[4907]: I0226 16:03:58.527089 4907 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/horizon-766d888d6c-8sqt7"] Feb 26 16:03:58 crc kubenswrapper[4907]: I0226 16:03:58.531010 4907 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/horizon-766d888d6c-8sqt7" Feb 26 16:03:58 crc kubenswrapper[4907]: I0226 16:03:58.557349 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/4cbb7c75-3f73-4181-b214-cdfb8d9ffd9a-logs\") pod \"placement-db-sync-6t72w\" (UID: \"4cbb7c75-3f73-4181-b214-cdfb8d9ffd9a\") " pod="openstack/placement-db-sync-6t72w" Feb 26 16:03:58 crc kubenswrapper[4907]: I0226 16:03:58.557387 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/429e4875-18c7-4a0a-bfea-135d7aec6ba0-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"429e4875-18c7-4a0a-bfea-135d7aec6ba0\") " pod="openstack/ceilometer-0" Feb 26 16:03:58 crc kubenswrapper[4907]: I0226 16:03:58.557420 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/429e4875-18c7-4a0a-bfea-135d7aec6ba0-scripts\") pod \"ceilometer-0\" (UID: \"429e4875-18c7-4a0a-bfea-135d7aec6ba0\") " pod="openstack/ceilometer-0" Feb 26 16:03:58 crc kubenswrapper[4907]: I0226 16:03:58.557445 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/4cbb7c75-3f73-4181-b214-cdfb8d9ffd9a-scripts\") pod \"placement-db-sync-6t72w\" (UID: \"4cbb7c75-3f73-4181-b214-cdfb8d9ffd9a\") " pod="openstack/placement-db-sync-6t72w" Feb 26 16:03:58 crc kubenswrapper[4907]: I0226 16:03:58.557488 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ln7sb\" (UniqueName: \"kubernetes.io/projected/4cbb7c75-3f73-4181-b214-cdfb8d9ffd9a-kube-api-access-ln7sb\") pod \"placement-db-sync-6t72w\" (UID: \"4cbb7c75-3f73-4181-b214-cdfb8d9ffd9a\") " pod="openstack/placement-db-sync-6t72w" Feb 26 16:03:58 crc kubenswrapper[4907]: I0226 16:03:58.557528 
4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/429e4875-18c7-4a0a-bfea-135d7aec6ba0-log-httpd\") pod \"ceilometer-0\" (UID: \"429e4875-18c7-4a0a-bfea-135d7aec6ba0\") " pod="openstack/ceilometer-0" Feb 26 16:03:58 crc kubenswrapper[4907]: I0226 16:03:58.557543 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/429e4875-18c7-4a0a-bfea-135d7aec6ba0-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"429e4875-18c7-4a0a-bfea-135d7aec6ba0\") " pod="openstack/ceilometer-0" Feb 26 16:03:58 crc kubenswrapper[4907]: I0226 16:03:58.557567 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/429e4875-18c7-4a0a-bfea-135d7aec6ba0-run-httpd\") pod \"ceilometer-0\" (UID: \"429e4875-18c7-4a0a-bfea-135d7aec6ba0\") " pod="openstack/ceilometer-0" Feb 26 16:03:58 crc kubenswrapper[4907]: I0226 16:03:58.557610 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4cbb7c75-3f73-4181-b214-cdfb8d9ffd9a-combined-ca-bundle\") pod \"placement-db-sync-6t72w\" (UID: \"4cbb7c75-3f73-4181-b214-cdfb8d9ffd9a\") " pod="openstack/placement-db-sync-6t72w" Feb 26 16:03:58 crc kubenswrapper[4907]: I0226 16:03:58.557635 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4cbb7c75-3f73-4181-b214-cdfb8d9ffd9a-config-data\") pod \"placement-db-sync-6t72w\" (UID: \"4cbb7c75-3f73-4181-b214-cdfb8d9ffd9a\") " pod="openstack/placement-db-sync-6t72w" Feb 26 16:03:58 crc kubenswrapper[4907]: I0226 16:03:58.557655 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: 
\"kubernetes.io/secret/429e4875-18c7-4a0a-bfea-135d7aec6ba0-config-data\") pod \"ceilometer-0\" (UID: \"429e4875-18c7-4a0a-bfea-135d7aec6ba0\") " pod="openstack/ceilometer-0" Feb 26 16:03:58 crc kubenswrapper[4907]: I0226 16:03:58.557691 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-487fd\" (UniqueName: \"kubernetes.io/projected/429e4875-18c7-4a0a-bfea-135d7aec6ba0-kube-api-access-487fd\") pod \"ceilometer-0\" (UID: \"429e4875-18c7-4a0a-bfea-135d7aec6ba0\") " pod="openstack/ceilometer-0" Feb 26 16:03:58 crc kubenswrapper[4907]: I0226 16:03:58.561183 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/429e4875-18c7-4a0a-bfea-135d7aec6ba0-run-httpd\") pod \"ceilometer-0\" (UID: \"429e4875-18c7-4a0a-bfea-135d7aec6ba0\") " pod="openstack/ceilometer-0" Feb 26 16:03:58 crc kubenswrapper[4907]: I0226 16:03:58.562345 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/429e4875-18c7-4a0a-bfea-135d7aec6ba0-log-httpd\") pod \"ceilometer-0\" (UID: \"429e4875-18c7-4a0a-bfea-135d7aec6ba0\") " pod="openstack/ceilometer-0" Feb 26 16:03:58 crc kubenswrapper[4907]: I0226 16:03:58.563239 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/429e4875-18c7-4a0a-bfea-135d7aec6ba0-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"429e4875-18c7-4a0a-bfea-135d7aec6ba0\") " pod="openstack/ceilometer-0" Feb 26 16:03:58 crc kubenswrapper[4907]: I0226 16:03:58.564405 4907 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-db-sync-slrvx"] Feb 26 16:03:58 crc kubenswrapper[4907]: I0226 16:03:58.565450 4907 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-db-sync-slrvx" Feb 26 16:03:58 crc kubenswrapper[4907]: I0226 16:03:58.578759 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/429e4875-18c7-4a0a-bfea-135d7aec6ba0-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"429e4875-18c7-4a0a-bfea-135d7aec6ba0\") " pod="openstack/ceilometer-0" Feb 26 16:03:58 crc kubenswrapper[4907]: I0226 16:03:58.581875 4907 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-barbican-dockercfg-fhpq9" Feb 26 16:03:58 crc kubenswrapper[4907]: I0226 16:03:58.583624 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/429e4875-18c7-4a0a-bfea-135d7aec6ba0-config-data\") pod \"ceilometer-0\" (UID: \"429e4875-18c7-4a0a-bfea-135d7aec6ba0\") " pod="openstack/ceilometer-0" Feb 26 16:03:58 crc kubenswrapper[4907]: I0226 16:03:58.583801 4907 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-config-data" Feb 26 16:03:58 crc kubenswrapper[4907]: I0226 16:03:58.591277 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/429e4875-18c7-4a0a-bfea-135d7aec6ba0-scripts\") pod \"ceilometer-0\" (UID: \"429e4875-18c7-4a0a-bfea-135d7aec6ba0\") " pod="openstack/ceilometer-0" Feb 26 16:03:58 crc kubenswrapper[4907]: I0226 16:03:58.616643 4907 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-db-sync-slrvx"] Feb 26 16:03:58 crc kubenswrapper[4907]: I0226 16:03:58.659417 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9vdjh\" (UniqueName: \"kubernetes.io/projected/a02d2622-77ed-4949-95b5-4f5ae5f1c47d-kube-api-access-9vdjh\") pod \"barbican-db-sync-slrvx\" (UID: \"a02d2622-77ed-4949-95b5-4f5ae5f1c47d\") " pod="openstack/barbican-db-sync-slrvx" 
Feb 26 16:03:58 crc kubenswrapper[4907]: I0226 16:03:58.659472 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ln7sb\" (UniqueName: \"kubernetes.io/projected/4cbb7c75-3f73-4181-b214-cdfb8d9ffd9a-kube-api-access-ln7sb\") pod \"placement-db-sync-6t72w\" (UID: \"4cbb7c75-3f73-4181-b214-cdfb8d9ffd9a\") " pod="openstack/placement-db-sync-6t72w" Feb 26 16:03:58 crc kubenswrapper[4907]: I0226 16:03:58.659495 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/90cdbc73-317b-4479-9908-3712b34ce77d-config-data\") pod \"horizon-766d888d6c-8sqt7\" (UID: \"90cdbc73-317b-4479-9908-3712b34ce77d\") " pod="openstack/horizon-766d888d6c-8sqt7" Feb 26 16:03:58 crc kubenswrapper[4907]: I0226 16:03:58.659516 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/90cdbc73-317b-4479-9908-3712b34ce77d-horizon-secret-key\") pod \"horizon-766d888d6c-8sqt7\" (UID: \"90cdbc73-317b-4479-9908-3712b34ce77d\") " pod="openstack/horizon-766d888d6c-8sqt7" Feb 26 16:03:58 crc kubenswrapper[4907]: I0226 16:03:58.659532 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/90cdbc73-317b-4479-9908-3712b34ce77d-logs\") pod \"horizon-766d888d6c-8sqt7\" (UID: \"90cdbc73-317b-4479-9908-3712b34ce77d\") " pod="openstack/horizon-766d888d6c-8sqt7" Feb 26 16:03:58 crc kubenswrapper[4907]: I0226 16:03:58.661810 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/90cdbc73-317b-4479-9908-3712b34ce77d-scripts\") pod \"horizon-766d888d6c-8sqt7\" (UID: \"90cdbc73-317b-4479-9908-3712b34ce77d\") " pod="openstack/horizon-766d888d6c-8sqt7" Feb 26 16:03:58 crc kubenswrapper[4907]: 
I0226 16:03:58.661846 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jwlhz\" (UniqueName: \"kubernetes.io/projected/90cdbc73-317b-4479-9908-3712b34ce77d-kube-api-access-jwlhz\") pod \"horizon-766d888d6c-8sqt7\" (UID: \"90cdbc73-317b-4479-9908-3712b34ce77d\") " pod="openstack/horizon-766d888d6c-8sqt7" Feb 26 16:03:58 crc kubenswrapper[4907]: I0226 16:03:58.661917 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4cbb7c75-3f73-4181-b214-cdfb8d9ffd9a-combined-ca-bundle\") pod \"placement-db-sync-6t72w\" (UID: \"4cbb7c75-3f73-4181-b214-cdfb8d9ffd9a\") " pod="openstack/placement-db-sync-6t72w" Feb 26 16:03:58 crc kubenswrapper[4907]: I0226 16:03:58.661949 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4cbb7c75-3f73-4181-b214-cdfb8d9ffd9a-config-data\") pod \"placement-db-sync-6t72w\" (UID: \"4cbb7c75-3f73-4181-b214-cdfb8d9ffd9a\") " pod="openstack/placement-db-sync-6t72w" Feb 26 16:03:58 crc kubenswrapper[4907]: I0226 16:03:58.661979 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a02d2622-77ed-4949-95b5-4f5ae5f1c47d-combined-ca-bundle\") pod \"barbican-db-sync-slrvx\" (UID: \"a02d2622-77ed-4949-95b5-4f5ae5f1c47d\") " pod="openstack/barbican-db-sync-slrvx" Feb 26 16:03:58 crc kubenswrapper[4907]: I0226 16:03:58.662000 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/a02d2622-77ed-4949-95b5-4f5ae5f1c47d-db-sync-config-data\") pod \"barbican-db-sync-slrvx\" (UID: \"a02d2622-77ed-4949-95b5-4f5ae5f1c47d\") " pod="openstack/barbican-db-sync-slrvx" Feb 26 16:03:58 crc kubenswrapper[4907]: I0226 16:03:58.662044 4907 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/4cbb7c75-3f73-4181-b214-cdfb8d9ffd9a-logs\") pod \"placement-db-sync-6t72w\" (UID: \"4cbb7c75-3f73-4181-b214-cdfb8d9ffd9a\") " pod="openstack/placement-db-sync-6t72w" Feb 26 16:03:58 crc kubenswrapper[4907]: I0226 16:03:58.662092 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/4cbb7c75-3f73-4181-b214-cdfb8d9ffd9a-scripts\") pod \"placement-db-sync-6t72w\" (UID: \"4cbb7c75-3f73-4181-b214-cdfb8d9ffd9a\") " pod="openstack/placement-db-sync-6t72w" Feb 26 16:03:58 crc kubenswrapper[4907]: I0226 16:03:58.671082 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/4cbb7c75-3f73-4181-b214-cdfb8d9ffd9a-logs\") pod \"placement-db-sync-6t72w\" (UID: \"4cbb7c75-3f73-4181-b214-cdfb8d9ffd9a\") " pod="openstack/placement-db-sync-6t72w" Feb 26 16:03:58 crc kubenswrapper[4907]: I0226 16:03:58.673540 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4cbb7c75-3f73-4181-b214-cdfb8d9ffd9a-config-data\") pod \"placement-db-sync-6t72w\" (UID: \"4cbb7c75-3f73-4181-b214-cdfb8d9ffd9a\") " pod="openstack/placement-db-sync-6t72w" Feb 26 16:03:58 crc kubenswrapper[4907]: I0226 16:03:58.678013 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/4cbb7c75-3f73-4181-b214-cdfb8d9ffd9a-scripts\") pod \"placement-db-sync-6t72w\" (UID: \"4cbb7c75-3f73-4181-b214-cdfb8d9ffd9a\") " pod="openstack/placement-db-sync-6t72w" Feb 26 16:03:58 crc kubenswrapper[4907]: I0226 16:03:58.688232 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4cbb7c75-3f73-4181-b214-cdfb8d9ffd9a-combined-ca-bundle\") pod 
\"placement-db-sync-6t72w\" (UID: \"4cbb7c75-3f73-4181-b214-cdfb8d9ffd9a\") " pod="openstack/placement-db-sync-6t72w" Feb 26 16:03:58 crc kubenswrapper[4907]: I0226 16:03:58.694332 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-487fd\" (UniqueName: \"kubernetes.io/projected/429e4875-18c7-4a0a-bfea-135d7aec6ba0-kube-api-access-487fd\") pod \"ceilometer-0\" (UID: \"429e4875-18c7-4a0a-bfea-135d7aec6ba0\") " pod="openstack/ceilometer-0" Feb 26 16:03:58 crc kubenswrapper[4907]: I0226 16:03:58.699703 4907 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/horizon-766d888d6c-8sqt7"] Feb 26 16:03:58 crc kubenswrapper[4907]: I0226 16:03:58.713172 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ln7sb\" (UniqueName: \"kubernetes.io/projected/4cbb7c75-3f73-4181-b214-cdfb8d9ffd9a-kube-api-access-ln7sb\") pod \"placement-db-sync-6t72w\" (UID: \"4cbb7c75-3f73-4181-b214-cdfb8d9ffd9a\") " pod="openstack/placement-db-sync-6t72w" Feb 26 16:03:58 crc kubenswrapper[4907]: I0226 16:03:58.765848 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a02d2622-77ed-4949-95b5-4f5ae5f1c47d-combined-ca-bundle\") pod \"barbican-db-sync-slrvx\" (UID: \"a02d2622-77ed-4949-95b5-4f5ae5f1c47d\") " pod="openstack/barbican-db-sync-slrvx" Feb 26 16:03:58 crc kubenswrapper[4907]: I0226 16:03:58.765888 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/a02d2622-77ed-4949-95b5-4f5ae5f1c47d-db-sync-config-data\") pod \"barbican-db-sync-slrvx\" (UID: \"a02d2622-77ed-4949-95b5-4f5ae5f1c47d\") " pod="openstack/barbican-db-sync-slrvx" Feb 26 16:03:58 crc kubenswrapper[4907]: I0226 16:03:58.769428 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9vdjh\" (UniqueName: 
\"kubernetes.io/projected/a02d2622-77ed-4949-95b5-4f5ae5f1c47d-kube-api-access-9vdjh\") pod \"barbican-db-sync-slrvx\" (UID: \"a02d2622-77ed-4949-95b5-4f5ae5f1c47d\") " pod="openstack/barbican-db-sync-slrvx" Feb 26 16:03:58 crc kubenswrapper[4907]: I0226 16:03:58.769496 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/90cdbc73-317b-4479-9908-3712b34ce77d-config-data\") pod \"horizon-766d888d6c-8sqt7\" (UID: \"90cdbc73-317b-4479-9908-3712b34ce77d\") " pod="openstack/horizon-766d888d6c-8sqt7" Feb 26 16:03:58 crc kubenswrapper[4907]: I0226 16:03:58.769530 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/90cdbc73-317b-4479-9908-3712b34ce77d-horizon-secret-key\") pod \"horizon-766d888d6c-8sqt7\" (UID: \"90cdbc73-317b-4479-9908-3712b34ce77d\") " pod="openstack/horizon-766d888d6c-8sqt7" Feb 26 16:03:58 crc kubenswrapper[4907]: I0226 16:03:58.769560 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/90cdbc73-317b-4479-9908-3712b34ce77d-logs\") pod \"horizon-766d888d6c-8sqt7\" (UID: \"90cdbc73-317b-4479-9908-3712b34ce77d\") " pod="openstack/horizon-766d888d6c-8sqt7" Feb 26 16:03:58 crc kubenswrapper[4907]: I0226 16:03:58.769639 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/90cdbc73-317b-4479-9908-3712b34ce77d-scripts\") pod \"horizon-766d888d6c-8sqt7\" (UID: \"90cdbc73-317b-4479-9908-3712b34ce77d\") " pod="openstack/horizon-766d888d6c-8sqt7" Feb 26 16:03:58 crc kubenswrapper[4907]: I0226 16:03:58.769661 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jwlhz\" (UniqueName: \"kubernetes.io/projected/90cdbc73-317b-4479-9908-3712b34ce77d-kube-api-access-jwlhz\") pod 
\"horizon-766d888d6c-8sqt7\" (UID: \"90cdbc73-317b-4479-9908-3712b34ce77d\") " pod="openstack/horizon-766d888d6c-8sqt7" Feb 26 16:03:58 crc kubenswrapper[4907]: I0226 16:03:58.770870 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/90cdbc73-317b-4479-9908-3712b34ce77d-logs\") pod \"horizon-766d888d6c-8sqt7\" (UID: \"90cdbc73-317b-4479-9908-3712b34ce77d\") " pod="openstack/horizon-766d888d6c-8sqt7" Feb 26 16:03:58 crc kubenswrapper[4907]: I0226 16:03:58.772743 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a02d2622-77ed-4949-95b5-4f5ae5f1c47d-combined-ca-bundle\") pod \"barbican-db-sync-slrvx\" (UID: \"a02d2622-77ed-4949-95b5-4f5ae5f1c47d\") " pod="openstack/barbican-db-sync-slrvx" Feb 26 16:03:58 crc kubenswrapper[4907]: I0226 16:03:58.773446 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/90cdbc73-317b-4479-9908-3712b34ce77d-scripts\") pod \"horizon-766d888d6c-8sqt7\" (UID: \"90cdbc73-317b-4479-9908-3712b34ce77d\") " pod="openstack/horizon-766d888d6c-8sqt7" Feb 26 16:03:58 crc kubenswrapper[4907]: I0226 16:03:58.774100 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/90cdbc73-317b-4479-9908-3712b34ce77d-horizon-secret-key\") pod \"horizon-766d888d6c-8sqt7\" (UID: \"90cdbc73-317b-4479-9908-3712b34ce77d\") " pod="openstack/horizon-766d888d6c-8sqt7" Feb 26 16:03:58 crc kubenswrapper[4907]: I0226 16:03:58.775864 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/90cdbc73-317b-4479-9908-3712b34ce77d-config-data\") pod \"horizon-766d888d6c-8sqt7\" (UID: \"90cdbc73-317b-4479-9908-3712b34ce77d\") " pod="openstack/horizon-766d888d6c-8sqt7" Feb 26 16:03:58 crc kubenswrapper[4907]: I0226 
16:03:58.786731 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/a02d2622-77ed-4949-95b5-4f5ae5f1c47d-db-sync-config-data\") pod \"barbican-db-sync-slrvx\" (UID: \"a02d2622-77ed-4949-95b5-4f5ae5f1c47d\") " pod="openstack/barbican-db-sync-slrvx" Feb 26 16:03:58 crc kubenswrapper[4907]: I0226 16:03:58.833524 4907 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-bbf5cc879-8lgxh"] Feb 26 16:03:58 crc kubenswrapper[4907]: I0226 16:03:58.834197 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jwlhz\" (UniqueName: \"kubernetes.io/projected/90cdbc73-317b-4479-9908-3712b34ce77d-kube-api-access-jwlhz\") pod \"horizon-766d888d6c-8sqt7\" (UID: \"90cdbc73-317b-4479-9908-3712b34ce77d\") " pod="openstack/horizon-766d888d6c-8sqt7" Feb 26 16:03:58 crc kubenswrapper[4907]: I0226 16:03:58.836145 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9vdjh\" (UniqueName: \"kubernetes.io/projected/a02d2622-77ed-4949-95b5-4f5ae5f1c47d-kube-api-access-9vdjh\") pod \"barbican-db-sync-slrvx\" (UID: \"a02d2622-77ed-4949-95b5-4f5ae5f1c47d\") " pod="openstack/barbican-db-sync-slrvx" Feb 26 16:03:58 crc kubenswrapper[4907]: I0226 16:03:58.859013 4907 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Feb 26 16:03:58 crc kubenswrapper[4907]: I0226 16:03:58.883569 4907 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/placement-db-sync-6t72w" Feb 26 16:03:58 crc kubenswrapper[4907]: I0226 16:03:58.939573 4907 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/horizon-766d888d6c-8sqt7" Feb 26 16:03:58 crc kubenswrapper[4907]: I0226 16:03:58.962798 4907 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-db-sync-slrvx" Feb 26 16:03:58 crc kubenswrapper[4907]: I0226 16:03:58.984876 4907 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-default-external-api-0"] Feb 26 16:03:58 crc kubenswrapper[4907]: I0226 16:03:58.986091 4907 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-external-api-0" Feb 26 16:03:58 crc kubenswrapper[4907]: I0226 16:03:58.997044 4907 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-default-external-config-data" Feb 26 16:03:58 crc kubenswrapper[4907]: I0226 16:03:58.997264 4907 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-glance-default-public-svc" Feb 26 16:03:58 crc kubenswrapper[4907]: I0226 16:03:58.997405 4907 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-glance-dockercfg-2tmh7" Feb 26 16:03:59 crc kubenswrapper[4907]: I0226 16:03:59.006498 4907 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-scripts" Feb 26 16:03:59 crc kubenswrapper[4907]: I0226 16:03:59.017683 4907 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-external-api-0"] Feb 26 16:03:59 crc kubenswrapper[4907]: W0226 16:03:59.035924 4907 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podbb7df1d1_6bd7_4b9b_a1ef_95722b7fc811.slice/crio-cd956dbc3cdc3b70dcbc9a3d9759fec1b5e03b4699a440e71e62d373724e760b WatchSource:0}: Error finding container cd956dbc3cdc3b70dcbc9a3d9759fec1b5e03b4699a440e71e62d373724e760b: Status 404 returned error can't find the container with id cd956dbc3cdc3b70dcbc9a3d9759fec1b5e03b4699a440e71e62d373724e760b Feb 26 16:03:59 crc kubenswrapper[4907]: I0226 16:03:59.076681 4907 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-56df8fb6b7-ssd6q"] Feb 26 16:03:59 crc 
kubenswrapper[4907]: I0226 16:03:59.123456 4907 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-56df8fb6b7-ssd6q" Feb 26 16:03:59 crc kubenswrapper[4907]: I0226 16:03:59.130072 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/1ca54bc7-66c3-4e7a-a4a5-7eea6f3d1fec-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"1ca54bc7-66c3-4e7a-a4a5-7eea6f3d1fec\") " pod="openstack/glance-default-external-api-0" Feb 26 16:03:59 crc kubenswrapper[4907]: I0226 16:03:59.130131 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/1ca54bc7-66c3-4e7a-a4a5-7eea6f3d1fec-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"1ca54bc7-66c3-4e7a-a4a5-7eea6f3d1fec\") " pod="openstack/glance-default-external-api-0" Feb 26 16:03:59 crc kubenswrapper[4907]: I0226 16:03:59.130229 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1ca54bc7-66c3-4e7a-a4a5-7eea6f3d1fec-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"1ca54bc7-66c3-4e7a-a4a5-7eea6f3d1fec\") " pod="openstack/glance-default-external-api-0" Feb 26 16:03:59 crc kubenswrapper[4907]: I0226 16:03:59.130263 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/1ca54bc7-66c3-4e7a-a4a5-7eea6f3d1fec-logs\") pod \"glance-default-external-api-0\" (UID: \"1ca54bc7-66c3-4e7a-a4a5-7eea6f3d1fec\") " pod="openstack/glance-default-external-api-0" Feb 26 16:03:59 crc kubenswrapper[4907]: I0226 16:03:59.130299 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: 
\"kubernetes.io/secret/1ca54bc7-66c3-4e7a-a4a5-7eea6f3d1fec-scripts\") pod \"glance-default-external-api-0\" (UID: \"1ca54bc7-66c3-4e7a-a4a5-7eea6f3d1fec\") " pod="openstack/glance-default-external-api-0" Feb 26 16:03:59 crc kubenswrapper[4907]: I0226 16:03:59.130338 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-q8c7p\" (UniqueName: \"kubernetes.io/projected/1ca54bc7-66c3-4e7a-a4a5-7eea6f3d1fec-kube-api-access-q8c7p\") pod \"glance-default-external-api-0\" (UID: \"1ca54bc7-66c3-4e7a-a4a5-7eea6f3d1fec\") " pod="openstack/glance-default-external-api-0" Feb 26 16:03:59 crc kubenswrapper[4907]: I0226 16:03:59.130460 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1ca54bc7-66c3-4e7a-a4a5-7eea6f3d1fec-config-data\") pod \"glance-default-external-api-0\" (UID: \"1ca54bc7-66c3-4e7a-a4a5-7eea6f3d1fec\") " pod="openstack/glance-default-external-api-0" Feb 26 16:03:59 crc kubenswrapper[4907]: I0226 16:03:59.130510 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage04-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage04-crc\") pod \"glance-default-external-api-0\" (UID: \"1ca54bc7-66c3-4e7a-a4a5-7eea6f3d1fec\") " pod="openstack/glance-default-external-api-0" Feb 26 16:03:59 crc kubenswrapper[4907]: I0226 16:03:59.235568 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/16fe2d8c-c6a9-49dc-96a8-fa766cdb98d2-dns-swift-storage-0\") pod \"dnsmasq-dns-56df8fb6b7-ssd6q\" (UID: \"16fe2d8c-c6a9-49dc-96a8-fa766cdb98d2\") " pod="openstack/dnsmasq-dns-56df8fb6b7-ssd6q" Feb 26 16:03:59 crc kubenswrapper[4907]: I0226 16:03:59.235989 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"config\" (UniqueName: \"kubernetes.io/configmap/16fe2d8c-c6a9-49dc-96a8-fa766cdb98d2-config\") pod \"dnsmasq-dns-56df8fb6b7-ssd6q\" (UID: \"16fe2d8c-c6a9-49dc-96a8-fa766cdb98d2\") " pod="openstack/dnsmasq-dns-56df8fb6b7-ssd6q" Feb 26 16:03:59 crc kubenswrapper[4907]: I0226 16:03:59.236069 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1ca54bc7-66c3-4e7a-a4a5-7eea6f3d1fec-config-data\") pod \"glance-default-external-api-0\" (UID: \"1ca54bc7-66c3-4e7a-a4a5-7eea6f3d1fec\") " pod="openstack/glance-default-external-api-0" Feb 26 16:03:59 crc kubenswrapper[4907]: I0226 16:03:59.236110 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage04-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage04-crc\") pod \"glance-default-external-api-0\" (UID: \"1ca54bc7-66c3-4e7a-a4a5-7eea6f3d1fec\") " pod="openstack/glance-default-external-api-0" Feb 26 16:03:59 crc kubenswrapper[4907]: I0226 16:03:59.236153 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/16fe2d8c-c6a9-49dc-96a8-fa766cdb98d2-ovsdbserver-nb\") pod \"dnsmasq-dns-56df8fb6b7-ssd6q\" (UID: \"16fe2d8c-c6a9-49dc-96a8-fa766cdb98d2\") " pod="openstack/dnsmasq-dns-56df8fb6b7-ssd6q" Feb 26 16:03:59 crc kubenswrapper[4907]: I0226 16:03:59.236192 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/16fe2d8c-c6a9-49dc-96a8-fa766cdb98d2-ovsdbserver-sb\") pod \"dnsmasq-dns-56df8fb6b7-ssd6q\" (UID: \"16fe2d8c-c6a9-49dc-96a8-fa766cdb98d2\") " pod="openstack/dnsmasq-dns-56df8fb6b7-ssd6q" Feb 26 16:03:59 crc kubenswrapper[4907]: I0226 16:03:59.236323 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-72x8p\" (UniqueName: 
\"kubernetes.io/projected/16fe2d8c-c6a9-49dc-96a8-fa766cdb98d2-kube-api-access-72x8p\") pod \"dnsmasq-dns-56df8fb6b7-ssd6q\" (UID: \"16fe2d8c-c6a9-49dc-96a8-fa766cdb98d2\") " pod="openstack/dnsmasq-dns-56df8fb6b7-ssd6q" Feb 26 16:03:59 crc kubenswrapper[4907]: I0226 16:03:59.236415 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/16fe2d8c-c6a9-49dc-96a8-fa766cdb98d2-dns-svc\") pod \"dnsmasq-dns-56df8fb6b7-ssd6q\" (UID: \"16fe2d8c-c6a9-49dc-96a8-fa766cdb98d2\") " pod="openstack/dnsmasq-dns-56df8fb6b7-ssd6q" Feb 26 16:03:59 crc kubenswrapper[4907]: I0226 16:03:59.236434 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/1ca54bc7-66c3-4e7a-a4a5-7eea6f3d1fec-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"1ca54bc7-66c3-4e7a-a4a5-7eea6f3d1fec\") " pod="openstack/glance-default-external-api-0" Feb 26 16:03:59 crc kubenswrapper[4907]: I0226 16:03:59.236455 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/1ca54bc7-66c3-4e7a-a4a5-7eea6f3d1fec-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"1ca54bc7-66c3-4e7a-a4a5-7eea6f3d1fec\") " pod="openstack/glance-default-external-api-0" Feb 26 16:03:59 crc kubenswrapper[4907]: I0226 16:03:59.236505 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1ca54bc7-66c3-4e7a-a4a5-7eea6f3d1fec-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"1ca54bc7-66c3-4e7a-a4a5-7eea6f3d1fec\") " pod="openstack/glance-default-external-api-0" Feb 26 16:03:59 crc kubenswrapper[4907]: I0226 16:03:59.236529 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: 
\"kubernetes.io/empty-dir/1ca54bc7-66c3-4e7a-a4a5-7eea6f3d1fec-logs\") pod \"glance-default-external-api-0\" (UID: \"1ca54bc7-66c3-4e7a-a4a5-7eea6f3d1fec\") " pod="openstack/glance-default-external-api-0" Feb 26 16:03:59 crc kubenswrapper[4907]: I0226 16:03:59.236572 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/1ca54bc7-66c3-4e7a-a4a5-7eea6f3d1fec-scripts\") pod \"glance-default-external-api-0\" (UID: \"1ca54bc7-66c3-4e7a-a4a5-7eea6f3d1fec\") " pod="openstack/glance-default-external-api-0" Feb 26 16:03:59 crc kubenswrapper[4907]: I0226 16:03:59.239332 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-q8c7p\" (UniqueName: \"kubernetes.io/projected/1ca54bc7-66c3-4e7a-a4a5-7eea6f3d1fec-kube-api-access-q8c7p\") pod \"glance-default-external-api-0\" (UID: \"1ca54bc7-66c3-4e7a-a4a5-7eea6f3d1fec\") " pod="openstack/glance-default-external-api-0" Feb 26 16:03:59 crc kubenswrapper[4907]: I0226 16:03:59.261260 4907 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage04-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage04-crc\") pod \"glance-default-external-api-0\" (UID: \"1ca54bc7-66c3-4e7a-a4a5-7eea6f3d1fec\") device mount path \"/mnt/openstack/pv04\"" pod="openstack/glance-default-external-api-0" Feb 26 16:03:59 crc kubenswrapper[4907]: I0226 16:03:59.263387 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1ca54bc7-66c3-4e7a-a4a5-7eea6f3d1fec-config-data\") pod \"glance-default-external-api-0\" (UID: \"1ca54bc7-66c3-4e7a-a4a5-7eea6f3d1fec\") " pod="openstack/glance-default-external-api-0" Feb 26 16:03:59 crc kubenswrapper[4907]: I0226 16:03:59.264517 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/1ca54bc7-66c3-4e7a-a4a5-7eea6f3d1fec-logs\") pod 
\"glance-default-external-api-0\" (UID: \"1ca54bc7-66c3-4e7a-a4a5-7eea6f3d1fec\") " pod="openstack/glance-default-external-api-0" Feb 26 16:03:59 crc kubenswrapper[4907]: I0226 16:03:59.300071 4907 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-bbf5cc879-8lgxh"] Feb 26 16:03:59 crc kubenswrapper[4907]: I0226 16:03:59.342849 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/16fe2d8c-c6a9-49dc-96a8-fa766cdb98d2-ovsdbserver-nb\") pod \"dnsmasq-dns-56df8fb6b7-ssd6q\" (UID: \"16fe2d8c-c6a9-49dc-96a8-fa766cdb98d2\") " pod="openstack/dnsmasq-dns-56df8fb6b7-ssd6q" Feb 26 16:03:59 crc kubenswrapper[4907]: I0226 16:03:59.342902 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/16fe2d8c-c6a9-49dc-96a8-fa766cdb98d2-ovsdbserver-sb\") pod \"dnsmasq-dns-56df8fb6b7-ssd6q\" (UID: \"16fe2d8c-c6a9-49dc-96a8-fa766cdb98d2\") " pod="openstack/dnsmasq-dns-56df8fb6b7-ssd6q" Feb 26 16:03:59 crc kubenswrapper[4907]: I0226 16:03:59.342965 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-72x8p\" (UniqueName: \"kubernetes.io/projected/16fe2d8c-c6a9-49dc-96a8-fa766cdb98d2-kube-api-access-72x8p\") pod \"dnsmasq-dns-56df8fb6b7-ssd6q\" (UID: \"16fe2d8c-c6a9-49dc-96a8-fa766cdb98d2\") " pod="openstack/dnsmasq-dns-56df8fb6b7-ssd6q" Feb 26 16:03:59 crc kubenswrapper[4907]: I0226 16:03:59.346629 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/16fe2d8c-c6a9-49dc-96a8-fa766cdb98d2-dns-svc\") pod \"dnsmasq-dns-56df8fb6b7-ssd6q\" (UID: \"16fe2d8c-c6a9-49dc-96a8-fa766cdb98d2\") " pod="openstack/dnsmasq-dns-56df8fb6b7-ssd6q" Feb 26 16:03:59 crc kubenswrapper[4907]: I0226 16:03:59.351026 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1ca54bc7-66c3-4e7a-a4a5-7eea6f3d1fec-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"1ca54bc7-66c3-4e7a-a4a5-7eea6f3d1fec\") " pod="openstack/glance-default-external-api-0" Feb 26 16:03:59 crc kubenswrapper[4907]: I0226 16:03:59.354233 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/1ca54bc7-66c3-4e7a-a4a5-7eea6f3d1fec-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"1ca54bc7-66c3-4e7a-a4a5-7eea6f3d1fec\") " pod="openstack/glance-default-external-api-0" Feb 26 16:03:59 crc kubenswrapper[4907]: I0226 16:03:59.370445 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-bbf5cc879-8lgxh" event={"ID":"bb7df1d1-6bd7-4b9b-a1ef-95722b7fc811","Type":"ContainerStarted","Data":"cd956dbc3cdc3b70dcbc9a3d9759fec1b5e03b4699a440e71e62d373724e760b"} Feb 26 16:03:59 crc kubenswrapper[4907]: I0226 16:03:59.373040 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/16fe2d8c-c6a9-49dc-96a8-fa766cdb98d2-dns-svc\") pod \"dnsmasq-dns-56df8fb6b7-ssd6q\" (UID: \"16fe2d8c-c6a9-49dc-96a8-fa766cdb98d2\") " pod="openstack/dnsmasq-dns-56df8fb6b7-ssd6q" Feb 26 16:03:59 crc kubenswrapper[4907]: I0226 16:03:59.376689 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/1ca54bc7-66c3-4e7a-a4a5-7eea6f3d1fec-scripts\") pod \"glance-default-external-api-0\" (UID: \"1ca54bc7-66c3-4e7a-a4a5-7eea6f3d1fec\") " pod="openstack/glance-default-external-api-0" Feb 26 16:03:59 crc kubenswrapper[4907]: I0226 16:03:59.377806 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/1ca54bc7-66c3-4e7a-a4a5-7eea6f3d1fec-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: 
\"1ca54bc7-66c3-4e7a-a4a5-7eea6f3d1fec\") " pod="openstack/glance-default-external-api-0" Feb 26 16:03:59 crc kubenswrapper[4907]: I0226 16:03:59.378827 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/16fe2d8c-c6a9-49dc-96a8-fa766cdb98d2-ovsdbserver-sb\") pod \"dnsmasq-dns-56df8fb6b7-ssd6q\" (UID: \"16fe2d8c-c6a9-49dc-96a8-fa766cdb98d2\") " pod="openstack/dnsmasq-dns-56df8fb6b7-ssd6q" Feb 26 16:03:59 crc kubenswrapper[4907]: I0226 16:03:59.379437 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/16fe2d8c-c6a9-49dc-96a8-fa766cdb98d2-ovsdbserver-nb\") pod \"dnsmasq-dns-56df8fb6b7-ssd6q\" (UID: \"16fe2d8c-c6a9-49dc-96a8-fa766cdb98d2\") " pod="openstack/dnsmasq-dns-56df8fb6b7-ssd6q" Feb 26 16:03:59 crc kubenswrapper[4907]: I0226 16:03:59.380249 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/16fe2d8c-c6a9-49dc-96a8-fa766cdb98d2-dns-swift-storage-0\") pod \"dnsmasq-dns-56df8fb6b7-ssd6q\" (UID: \"16fe2d8c-c6a9-49dc-96a8-fa766cdb98d2\") " pod="openstack/dnsmasq-dns-56df8fb6b7-ssd6q" Feb 26 16:03:59 crc kubenswrapper[4907]: I0226 16:03:59.380284 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/16fe2d8c-c6a9-49dc-96a8-fa766cdb98d2-config\") pod \"dnsmasq-dns-56df8fb6b7-ssd6q\" (UID: \"16fe2d8c-c6a9-49dc-96a8-fa766cdb98d2\") " pod="openstack/dnsmasq-dns-56df8fb6b7-ssd6q" Feb 26 16:03:59 crc kubenswrapper[4907]: I0226 16:03:59.380941 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/16fe2d8c-c6a9-49dc-96a8-fa766cdb98d2-config\") pod \"dnsmasq-dns-56df8fb6b7-ssd6q\" (UID: \"16fe2d8c-c6a9-49dc-96a8-fa766cdb98d2\") " pod="openstack/dnsmasq-dns-56df8fb6b7-ssd6q" Feb 26 
16:03:59 crc kubenswrapper[4907]: I0226 16:03:59.386700 4907 generic.go:334] "Generic (PLEG): container finished" podID="6ee71975-a322-4ae3-99a4-7bd42e1d3761" containerID="aff3a9972a48788f25620bbcdf6cad2a75ed26bac2426071012e912c80cff3ab" exitCode=0 Feb 26 16:03:59 crc kubenswrapper[4907]: I0226 16:03:59.386767 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5f59b8f679-vv9x4" event={"ID":"6ee71975-a322-4ae3-99a4-7bd42e1d3761","Type":"ContainerDied","Data":"aff3a9972a48788f25620bbcdf6cad2a75ed26bac2426071012e912c80cff3ab"} Feb 26 16:03:59 crc kubenswrapper[4907]: I0226 16:03:59.392424 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/16fe2d8c-c6a9-49dc-96a8-fa766cdb98d2-dns-swift-storage-0\") pod \"dnsmasq-dns-56df8fb6b7-ssd6q\" (UID: \"16fe2d8c-c6a9-49dc-96a8-fa766cdb98d2\") " pod="openstack/dnsmasq-dns-56df8fb6b7-ssd6q" Feb 26 16:03:59 crc kubenswrapper[4907]: I0226 16:03:59.422493 4907 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-56df8fb6b7-ssd6q"] Feb 26 16:03:59 crc kubenswrapper[4907]: I0226 16:03:59.434775 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage04-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage04-crc\") pod \"glance-default-external-api-0\" (UID: \"1ca54bc7-66c3-4e7a-a4a5-7eea6f3d1fec\") " pod="openstack/glance-default-external-api-0" Feb 26 16:03:59 crc kubenswrapper[4907]: I0226 16:03:59.435527 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-q8c7p\" (UniqueName: \"kubernetes.io/projected/1ca54bc7-66c3-4e7a-a4a5-7eea6f3d1fec-kube-api-access-q8c7p\") pod \"glance-default-external-api-0\" (UID: \"1ca54bc7-66c3-4e7a-a4a5-7eea6f3d1fec\") " pod="openstack/glance-default-external-api-0" Feb 26 16:03:59 crc kubenswrapper[4907]: I0226 16:03:59.437882 4907 operation_generator.go:637] "MountVolume.SetUp succeeded 
for volume \"kube-api-access-72x8p\" (UniqueName: \"kubernetes.io/projected/16fe2d8c-c6a9-49dc-96a8-fa766cdb98d2-kube-api-access-72x8p\") pod \"dnsmasq-dns-56df8fb6b7-ssd6q\" (UID: \"16fe2d8c-c6a9-49dc-96a8-fa766cdb98d2\") " pod="openstack/dnsmasq-dns-56df8fb6b7-ssd6q" Feb 26 16:03:59 crc kubenswrapper[4907]: I0226 16:03:59.462514 4907 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-56df8fb6b7-ssd6q" Feb 26 16:03:59 crc kubenswrapper[4907]: I0226 16:03:59.494135 4907 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-default-internal-api-0"] Feb 26 16:03:59 crc kubenswrapper[4907]: I0226 16:03:59.495662 4907 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-internal-api-0" Feb 26 16:03:59 crc kubenswrapper[4907]: I0226 16:03:59.504177 4907 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-default-internal-config-data" Feb 26 16:03:59 crc kubenswrapper[4907]: I0226 16:03:59.504715 4907 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-glance-default-internal-svc" Feb 26 16:03:59 crc kubenswrapper[4907]: I0226 16:03:59.556985 4907 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-internal-api-0"] Feb 26 16:03:59 crc kubenswrapper[4907]: I0226 16:03:59.583925 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/e77cb0a0-f383-4bb5-b29e-e000a56a7a1f-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"e77cb0a0-f383-4bb5-b29e-e000a56a7a1f\") " pod="openstack/glance-default-internal-api-0" Feb 26 16:03:59 crc kubenswrapper[4907]: I0226 16:03:59.583976 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/e77cb0a0-f383-4bb5-b29e-e000a56a7a1f-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"e77cb0a0-f383-4bb5-b29e-e000a56a7a1f\") " pod="openstack/glance-default-internal-api-0" Feb 26 16:03:59 crc kubenswrapper[4907]: I0226 16:03:59.584011 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/e77cb0a0-f383-4bb5-b29e-e000a56a7a1f-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"e77cb0a0-f383-4bb5-b29e-e000a56a7a1f\") " pod="openstack/glance-default-internal-api-0" Feb 26 16:03:59 crc kubenswrapper[4907]: I0226 16:03:59.584072 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage07-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage07-crc\") pod \"glance-default-internal-api-0\" (UID: \"e77cb0a0-f383-4bb5-b29e-e000a56a7a1f\") " pod="openstack/glance-default-internal-api-0" Feb 26 16:03:59 crc kubenswrapper[4907]: I0226 16:03:59.584149 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e77cb0a0-f383-4bb5-b29e-e000a56a7a1f-scripts\") pod \"glance-default-internal-api-0\" (UID: \"e77cb0a0-f383-4bb5-b29e-e000a56a7a1f\") " pod="openstack/glance-default-internal-api-0" Feb 26 16:03:59 crc kubenswrapper[4907]: I0226 16:03:59.584248 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/e77cb0a0-f383-4bb5-b29e-e000a56a7a1f-logs\") pod \"glance-default-internal-api-0\" (UID: \"e77cb0a0-f383-4bb5-b29e-e000a56a7a1f\") " pod="openstack/glance-default-internal-api-0" Feb 26 16:03:59 crc kubenswrapper[4907]: I0226 16:03:59.584273 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: 
\"kubernetes.io/secret/e77cb0a0-f383-4bb5-b29e-e000a56a7a1f-config-data\") pod \"glance-default-internal-api-0\" (UID: \"e77cb0a0-f383-4bb5-b29e-e000a56a7a1f\") " pod="openstack/glance-default-internal-api-0" Feb 26 16:03:59 crc kubenswrapper[4907]: I0226 16:03:59.584370 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lzmcs\" (UniqueName: \"kubernetes.io/projected/e77cb0a0-f383-4bb5-b29e-e000a56a7a1f-kube-api-access-lzmcs\") pod \"glance-default-internal-api-0\" (UID: \"e77cb0a0-f383-4bb5-b29e-e000a56a7a1f\") " pod="openstack/glance-default-internal-api-0" Feb 26 16:03:59 crc kubenswrapper[4907]: I0226 16:03:59.624419 4907 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-external-api-0" Feb 26 16:03:59 crc kubenswrapper[4907]: I0226 16:03:59.642195 4907 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-bootstrap-xvxcj"] Feb 26 16:03:59 crc kubenswrapper[4907]: I0226 16:03:59.651055 4907 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-db-sync-xvvbl"] Feb 26 16:03:59 crc kubenswrapper[4907]: I0226 16:03:59.685521 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e77cb0a0-f383-4bb5-b29e-e000a56a7a1f-scripts\") pod \"glance-default-internal-api-0\" (UID: \"e77cb0a0-f383-4bb5-b29e-e000a56a7a1f\") " pod="openstack/glance-default-internal-api-0" Feb 26 16:03:59 crc kubenswrapper[4907]: I0226 16:03:59.685576 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/e77cb0a0-f383-4bb5-b29e-e000a56a7a1f-logs\") pod \"glance-default-internal-api-0\" (UID: \"e77cb0a0-f383-4bb5-b29e-e000a56a7a1f\") " pod="openstack/glance-default-internal-api-0" Feb 26 16:03:59 crc kubenswrapper[4907]: I0226 16:03:59.685613 4907 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e77cb0a0-f383-4bb5-b29e-e000a56a7a1f-config-data\") pod \"glance-default-internal-api-0\" (UID: \"e77cb0a0-f383-4bb5-b29e-e000a56a7a1f\") " pod="openstack/glance-default-internal-api-0" Feb 26 16:03:59 crc kubenswrapper[4907]: I0226 16:03:59.685662 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lzmcs\" (UniqueName: \"kubernetes.io/projected/e77cb0a0-f383-4bb5-b29e-e000a56a7a1f-kube-api-access-lzmcs\") pod \"glance-default-internal-api-0\" (UID: \"e77cb0a0-f383-4bb5-b29e-e000a56a7a1f\") " pod="openstack/glance-default-internal-api-0" Feb 26 16:03:59 crc kubenswrapper[4907]: I0226 16:03:59.685705 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/e77cb0a0-f383-4bb5-b29e-e000a56a7a1f-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"e77cb0a0-f383-4bb5-b29e-e000a56a7a1f\") " pod="openstack/glance-default-internal-api-0" Feb 26 16:03:59 crc kubenswrapper[4907]: I0226 16:03:59.685720 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e77cb0a0-f383-4bb5-b29e-e000a56a7a1f-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"e77cb0a0-f383-4bb5-b29e-e000a56a7a1f\") " pod="openstack/glance-default-internal-api-0" Feb 26 16:03:59 crc kubenswrapper[4907]: I0226 16:03:59.685741 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/e77cb0a0-f383-4bb5-b29e-e000a56a7a1f-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"e77cb0a0-f383-4bb5-b29e-e000a56a7a1f\") " pod="openstack/glance-default-internal-api-0" Feb 26 16:03:59 crc kubenswrapper[4907]: I0226 16:03:59.685773 4907 reconciler_common.go:218] "operationExecutor.MountVolume started 
for volume \"local-storage07-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage07-crc\") pod \"glance-default-internal-api-0\" (UID: \"e77cb0a0-f383-4bb5-b29e-e000a56a7a1f\") " pod="openstack/glance-default-internal-api-0" Feb 26 16:03:59 crc kubenswrapper[4907]: I0226 16:03:59.686020 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/e77cb0a0-f383-4bb5-b29e-e000a56a7a1f-logs\") pod \"glance-default-internal-api-0\" (UID: \"e77cb0a0-f383-4bb5-b29e-e000a56a7a1f\") " pod="openstack/glance-default-internal-api-0" Feb 26 16:03:59 crc kubenswrapper[4907]: I0226 16:03:59.686039 4907 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage07-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage07-crc\") pod \"glance-default-internal-api-0\" (UID: \"e77cb0a0-f383-4bb5-b29e-e000a56a7a1f\") device mount path \"/mnt/openstack/pv07\"" pod="openstack/glance-default-internal-api-0" Feb 26 16:03:59 crc kubenswrapper[4907]: I0226 16:03:59.691884 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e77cb0a0-f383-4bb5-b29e-e000a56a7a1f-scripts\") pod \"glance-default-internal-api-0\" (UID: \"e77cb0a0-f383-4bb5-b29e-e000a56a7a1f\") " pod="openstack/glance-default-internal-api-0" Feb 26 16:03:59 crc kubenswrapper[4907]: I0226 16:03:59.692947 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/e77cb0a0-f383-4bb5-b29e-e000a56a7a1f-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"e77cb0a0-f383-4bb5-b29e-e000a56a7a1f\") " pod="openstack/glance-default-internal-api-0" Feb 26 16:03:59 crc kubenswrapper[4907]: I0226 16:03:59.697957 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e77cb0a0-f383-4bb5-b29e-e000a56a7a1f-config-data\") pod 
\"glance-default-internal-api-0\" (UID: \"e77cb0a0-f383-4bb5-b29e-e000a56a7a1f\") " pod="openstack/glance-default-internal-api-0" Feb 26 16:03:59 crc kubenswrapper[4907]: I0226 16:03:59.711521 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e77cb0a0-f383-4bb5-b29e-e000a56a7a1f-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"e77cb0a0-f383-4bb5-b29e-e000a56a7a1f\") " pod="openstack/glance-default-internal-api-0" Feb 26 16:03:59 crc kubenswrapper[4907]: I0226 16:03:59.713081 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/e77cb0a0-f383-4bb5-b29e-e000a56a7a1f-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"e77cb0a0-f383-4bb5-b29e-e000a56a7a1f\") " pod="openstack/glance-default-internal-api-0" Feb 26 16:03:59 crc kubenswrapper[4907]: I0226 16:03:59.735083 4907 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/horizon-785d56fd9c-lc7sg"] Feb 26 16:03:59 crc kubenswrapper[4907]: I0226 16:03:59.746517 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lzmcs\" (UniqueName: \"kubernetes.io/projected/e77cb0a0-f383-4bb5-b29e-e000a56a7a1f-kube-api-access-lzmcs\") pod \"glance-default-internal-api-0\" (UID: \"e77cb0a0-f383-4bb5-b29e-e000a56a7a1f\") " pod="openstack/glance-default-internal-api-0" Feb 26 16:03:59 crc kubenswrapper[4907]: I0226 16:03:59.808113 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage07-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage07-crc\") pod \"glance-default-internal-api-0\" (UID: \"e77cb0a0-f383-4bb5-b29e-e000a56a7a1f\") " pod="openstack/glance-default-internal-api-0" Feb 26 16:03:59 crc kubenswrapper[4907]: I0226 16:03:59.896000 4907 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-internal-api-0" Feb 26 16:04:00 crc kubenswrapper[4907]: I0226 16:04:00.048334 4907 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-db-sync-sg95t"] Feb 26 16:04:00 crc kubenswrapper[4907]: I0226 16:04:00.094640 4907 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-5f59b8f679-vv9x4" Feb 26 16:04:00 crc kubenswrapper[4907]: I0226 16:04:00.200973 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-sbksh\" (UniqueName: \"kubernetes.io/projected/6ee71975-a322-4ae3-99a4-7bd42e1d3761-kube-api-access-sbksh\") pod \"6ee71975-a322-4ae3-99a4-7bd42e1d3761\" (UID: \"6ee71975-a322-4ae3-99a4-7bd42e1d3761\") " Feb 26 16:04:00 crc kubenswrapper[4907]: I0226 16:04:00.201027 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/6ee71975-a322-4ae3-99a4-7bd42e1d3761-dns-swift-storage-0\") pod \"6ee71975-a322-4ae3-99a4-7bd42e1d3761\" (UID: \"6ee71975-a322-4ae3-99a4-7bd42e1d3761\") " Feb 26 16:04:00 crc kubenswrapper[4907]: I0226 16:04:00.201066 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/6ee71975-a322-4ae3-99a4-7bd42e1d3761-ovsdbserver-nb\") pod \"6ee71975-a322-4ae3-99a4-7bd42e1d3761\" (UID: \"6ee71975-a322-4ae3-99a4-7bd42e1d3761\") " Feb 26 16:04:00 crc kubenswrapper[4907]: I0226 16:04:00.201118 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/6ee71975-a322-4ae3-99a4-7bd42e1d3761-ovsdbserver-sb\") pod \"6ee71975-a322-4ae3-99a4-7bd42e1d3761\" (UID: \"6ee71975-a322-4ae3-99a4-7bd42e1d3761\") " Feb 26 16:04:00 crc kubenswrapper[4907]: I0226 16:04:00.201187 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for 
volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/6ee71975-a322-4ae3-99a4-7bd42e1d3761-dns-svc\") pod \"6ee71975-a322-4ae3-99a4-7bd42e1d3761\" (UID: \"6ee71975-a322-4ae3-99a4-7bd42e1d3761\") " Feb 26 16:04:00 crc kubenswrapper[4907]: I0226 16:04:00.201222 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6ee71975-a322-4ae3-99a4-7bd42e1d3761-config\") pod \"6ee71975-a322-4ae3-99a4-7bd42e1d3761\" (UID: \"6ee71975-a322-4ae3-99a4-7bd42e1d3761\") " Feb 26 16:04:00 crc kubenswrapper[4907]: I0226 16:04:00.242453 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6ee71975-a322-4ae3-99a4-7bd42e1d3761-kube-api-access-sbksh" (OuterVolumeSpecName: "kube-api-access-sbksh") pod "6ee71975-a322-4ae3-99a4-7bd42e1d3761" (UID: "6ee71975-a322-4ae3-99a4-7bd42e1d3761"). InnerVolumeSpecName "kube-api-access-sbksh". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 26 16:04:00 crc kubenswrapper[4907]: I0226 16:04:00.317696 4907 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29535364-t8qd8"] Feb 26 16:04:00 crc kubenswrapper[4907]: E0226 16:04:00.318281 4907 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6ee71975-a322-4ae3-99a4-7bd42e1d3761" containerName="init" Feb 26 16:04:00 crc kubenswrapper[4907]: I0226 16:04:00.318293 4907 state_mem.go:107] "Deleted CPUSet assignment" podUID="6ee71975-a322-4ae3-99a4-7bd42e1d3761" containerName="init" Feb 26 16:04:00 crc kubenswrapper[4907]: E0226 16:04:00.318323 4907 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6ee71975-a322-4ae3-99a4-7bd42e1d3761" containerName="dnsmasq-dns" Feb 26 16:04:00 crc kubenswrapper[4907]: I0226 16:04:00.318330 4907 state_mem.go:107] "Deleted CPUSet assignment" podUID="6ee71975-a322-4ae3-99a4-7bd42e1d3761" containerName="dnsmasq-dns" Feb 26 16:04:00 crc kubenswrapper[4907]: I0226 16:04:00.318483 
4907 memory_manager.go:354] "RemoveStaleState removing state" podUID="6ee71975-a322-4ae3-99a4-7bd42e1d3761" containerName="dnsmasq-dns" Feb 26 16:04:00 crc kubenswrapper[4907]: I0226 16:04:00.318936 4907 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29535364-t8qd8"] Feb 26 16:04:00 crc kubenswrapper[4907]: I0226 16:04:00.319008 4907 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29535364-t8qd8" Feb 26 16:04:00 crc kubenswrapper[4907]: I0226 16:04:00.323932 4907 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-sbksh\" (UniqueName: \"kubernetes.io/projected/6ee71975-a322-4ae3-99a4-7bd42e1d3761-kube-api-access-sbksh\") on node \"crc\" DevicePath \"\"" Feb 26 16:04:00 crc kubenswrapper[4907]: I0226 16:04:00.342757 4907 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Feb 26 16:04:00 crc kubenswrapper[4907]: I0226 16:04:00.344363 4907 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-n2mrp" Feb 26 16:04:00 crc kubenswrapper[4907]: I0226 16:04:00.345062 4907 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Feb 26 16:04:00 crc kubenswrapper[4907]: I0226 16:04:00.378746 4907 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Feb 26 16:04:00 crc kubenswrapper[4907]: I0226 16:04:00.443555 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qq5sf\" (UniqueName: \"kubernetes.io/projected/b2b66b18-ac41-4d84-9ae1-5900c27d0d7d-kube-api-access-qq5sf\") pod \"auto-csr-approver-29535364-t8qd8\" (UID: \"b2b66b18-ac41-4d84-9ae1-5900c27d0d7d\") " pod="openshift-infra/auto-csr-approver-29535364-t8qd8" Feb 26 16:04:00 crc kubenswrapper[4907]: I0226 16:04:00.462675 4907 operation_generator.go:803] 
UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6ee71975-a322-4ae3-99a4-7bd42e1d3761-config" (OuterVolumeSpecName: "config") pod "6ee71975-a322-4ae3-99a4-7bd42e1d3761" (UID: "6ee71975-a322-4ae3-99a4-7bd42e1d3761"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 26 16:04:00 crc kubenswrapper[4907]: I0226 16:04:00.490497 4907 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-db-sync-6t72w"] Feb 26 16:04:00 crc kubenswrapper[4907]: I0226 16:04:00.493334 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-sync-xvvbl" event={"ID":"c98fd629-273b-4c87-a07c-4a482064a5a3","Type":"ContainerStarted","Data":"b308d5b26b10adbd8d37d83e53aa48b2e59f9e627577cd43ab18b13bfa2bc4b7"} Feb 26 16:04:00 crc kubenswrapper[4907]: I0226 16:04:00.510433 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-xvxcj" event={"ID":"b3fd641f-23d3-4d70-af64-66c3507eff49","Type":"ContainerStarted","Data":"30b2bb90b711626ce57caa8880e3ecc1df500c89c700220a73f326eac4fdd679"} Feb 26 16:04:00 crc kubenswrapper[4907]: I0226 16:04:00.510653 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-xvxcj" event={"ID":"b3fd641f-23d3-4d70-af64-66c3507eff49","Type":"ContainerStarted","Data":"84274f44171aedec815da71a440daf133b170f3f06ef92c7ed15048c2be812b3"} Feb 26 16:04:00 crc kubenswrapper[4907]: I0226 16:04:00.513090 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-785d56fd9c-lc7sg" event={"ID":"b3e0f652-e35c-49b8-abe3-9182b2026d08","Type":"ContainerStarted","Data":"24da7506b0b6159b838185bb5fca55fe31b5e5ab8b7a0bc24800b8ac9138d3a8"} Feb 26 16:04:00 crc kubenswrapper[4907]: I0226 16:04:00.515659 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5f59b8f679-vv9x4" 
event={"ID":"6ee71975-a322-4ae3-99a4-7bd42e1d3761","Type":"ContainerDied","Data":"15d32426b88544a13ec1f417f8b0b13bf9a19aa7b8b55e57216f2a314016f994"} Feb 26 16:04:00 crc kubenswrapper[4907]: I0226 16:04:00.515719 4907 scope.go:117] "RemoveContainer" containerID="aff3a9972a48788f25620bbcdf6cad2a75ed26bac2426071012e912c80cff3ab" Feb 26 16:04:00 crc kubenswrapper[4907]: I0226 16:04:00.515835 4907 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-5f59b8f679-vv9x4" Feb 26 16:04:00 crc kubenswrapper[4907]: I0226 16:04:00.542037 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6ee71975-a322-4ae3-99a4-7bd42e1d3761-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "6ee71975-a322-4ae3-99a4-7bd42e1d3761" (UID: "6ee71975-a322-4ae3-99a4-7bd42e1d3761"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 26 16:04:00 crc kubenswrapper[4907]: I0226 16:04:00.542369 4907 generic.go:334] "Generic (PLEG): container finished" podID="bb7df1d1-6bd7-4b9b-a1ef-95722b7fc811" containerID="e9f9457b9ee191495b288e955616f2f4c19de566bd49ca309f6954246ee85e78" exitCode=0 Feb 26 16:04:00 crc kubenswrapper[4907]: I0226 16:04:00.542455 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-bbf5cc879-8lgxh" event={"ID":"bb7df1d1-6bd7-4b9b-a1ef-95722b7fc811","Type":"ContainerDied","Data":"e9f9457b9ee191495b288e955616f2f4c19de566bd49ca309f6954246ee85e78"} Feb 26 16:04:00 crc kubenswrapper[4907]: I0226 16:04:00.545166 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qq5sf\" (UniqueName: \"kubernetes.io/projected/b2b66b18-ac41-4d84-9ae1-5900c27d0d7d-kube-api-access-qq5sf\") pod \"auto-csr-approver-29535364-t8qd8\" (UID: \"b2b66b18-ac41-4d84-9ae1-5900c27d0d7d\") " pod="openshift-infra/auto-csr-approver-29535364-t8qd8" Feb 26 16:04:00 crc kubenswrapper[4907]: I0226 
16:04:00.546197 4907 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6ee71975-a322-4ae3-99a4-7bd42e1d3761-config\") on node \"crc\" DevicePath \"\"" Feb 26 16:04:00 crc kubenswrapper[4907]: I0226 16:04:00.546376 4907 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/6ee71975-a322-4ae3-99a4-7bd42e1d3761-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Feb 26 16:04:00 crc kubenswrapper[4907]: I0226 16:04:00.553151 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-sync-sg95t" event={"ID":"1ae29e7c-7f4a-492f-b10d-2badd4d606aa","Type":"ContainerStarted","Data":"9833f807c8825f42dda8377b7673fa0cbcdf5e3409f919fa6a013360f910b6df"} Feb 26 16:04:00 crc kubenswrapper[4907]: I0226 16:04:00.575747 4907 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/keystone-bootstrap-xvxcj" podStartSLOduration=3.575727234 podStartE2EDuration="3.575727234s" podCreationTimestamp="2026-02-26 16:03:57 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-26 16:04:00.55127125 +0000 UTC m=+1303.069833109" watchObservedRunningTime="2026-02-26 16:04:00.575727234 +0000 UTC m=+1303.094289083" Feb 26 16:04:00 crc kubenswrapper[4907]: I0226 16:04:00.584053 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6ee71975-a322-4ae3-99a4-7bd42e1d3761-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "6ee71975-a322-4ae3-99a4-7bd42e1d3761" (UID: "6ee71975-a322-4ae3-99a4-7bd42e1d3761"). InnerVolumeSpecName "dns-swift-storage-0". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 26 16:04:00 crc kubenswrapper[4907]: I0226 16:04:00.601332 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qq5sf\" (UniqueName: \"kubernetes.io/projected/b2b66b18-ac41-4d84-9ae1-5900c27d0d7d-kube-api-access-qq5sf\") pod \"auto-csr-approver-29535364-t8qd8\" (UID: \"b2b66b18-ac41-4d84-9ae1-5900c27d0d7d\") " pod="openshift-infra/auto-csr-approver-29535364-t8qd8" Feb 26 16:04:00 crc kubenswrapper[4907]: I0226 16:04:00.616955 4907 scope.go:117] "RemoveContainer" containerID="f004d1e9d1138c0ca971eb97cd4e506da42868a90cd20fdadcf95c8984027cee" Feb 26 16:04:00 crc kubenswrapper[4907]: I0226 16:04:00.634686 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6ee71975-a322-4ae3-99a4-7bd42e1d3761-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "6ee71975-a322-4ae3-99a4-7bd42e1d3761" (UID: "6ee71975-a322-4ae3-99a4-7bd42e1d3761"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 26 16:04:00 crc kubenswrapper[4907]: I0226 16:04:00.637383 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6ee71975-a322-4ae3-99a4-7bd42e1d3761-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "6ee71975-a322-4ae3-99a4-7bd42e1d3761" (UID: "6ee71975-a322-4ae3-99a4-7bd42e1d3761"). InnerVolumeSpecName "dns-svc". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 26 16:04:00 crc kubenswrapper[4907]: I0226 16:04:00.647736 4907 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/6ee71975-a322-4ae3-99a4-7bd42e1d3761-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Feb 26 16:04:00 crc kubenswrapper[4907]: I0226 16:04:00.648613 4907 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/6ee71975-a322-4ae3-99a4-7bd42e1d3761-dns-svc\") on node \"crc\" DevicePath \"\"" Feb 26 16:04:00 crc kubenswrapper[4907]: I0226 16:04:00.648631 4907 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/6ee71975-a322-4ae3-99a4-7bd42e1d3761-dns-swift-storage-0\") on node \"crc\" DevicePath \"\"" Feb 26 16:04:00 crc kubenswrapper[4907]: I0226 16:04:00.660522 4907 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-db-sync-slrvx"] Feb 26 16:04:00 crc kubenswrapper[4907]: I0226 16:04:00.716211 4907 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/horizon-766d888d6c-8sqt7"] Feb 26 16:04:00 crc kubenswrapper[4907]: I0226 16:04:00.852060 4907 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29535364-t8qd8" Feb 26 16:04:00 crc kubenswrapper[4907]: I0226 16:04:00.939562 4907 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-5f59b8f679-vv9x4"] Feb 26 16:04:00 crc kubenswrapper[4907]: I0226 16:04:00.986669 4907 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-5f59b8f679-vv9x4"] Feb 26 16:04:01 crc kubenswrapper[4907]: I0226 16:04:01.026130 4907 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-56df8fb6b7-ssd6q"] Feb 26 16:04:01 crc kubenswrapper[4907]: I0226 16:04:01.110541 4907 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-external-api-0"] Feb 26 16:04:01 crc kubenswrapper[4907]: I0226 16:04:01.257324 4907 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-external-api-0"] Feb 26 16:04:01 crc kubenswrapper[4907]: I0226 16:04:01.337208 4907 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/horizon-785d56fd9c-lc7sg"] Feb 26 16:04:01 crc kubenswrapper[4907]: I0226 16:04:01.433987 4907 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-internal-api-0"] Feb 26 16:04:01 crc kubenswrapper[4907]: I0226 16:04:01.524702 4907 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/horizon-594d447db9-7p2nh"] Feb 26 16:04:01 crc kubenswrapper[4907]: I0226 16:04:01.530945 4907 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/horizon-594d447db9-7p2nh" Feb 26 16:04:01 crc kubenswrapper[4907]: I0226 16:04:01.581299 4907 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-bbf5cc879-8lgxh" Feb 26 16:04:01 crc kubenswrapper[4907]: I0226 16:04:01.623514 4907 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/horizon-594d447db9-7p2nh"] Feb 26 16:04:01 crc kubenswrapper[4907]: I0226 16:04:01.689619 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/bb7df1d1-6bd7-4b9b-a1ef-95722b7fc811-dns-svc\") pod \"bb7df1d1-6bd7-4b9b-a1ef-95722b7fc811\" (UID: \"bb7df1d1-6bd7-4b9b-a1ef-95722b7fc811\") " Feb 26 16:04:01 crc kubenswrapper[4907]: I0226 16:04:01.689700 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/bb7df1d1-6bd7-4b9b-a1ef-95722b7fc811-ovsdbserver-sb\") pod \"bb7df1d1-6bd7-4b9b-a1ef-95722b7fc811\" (UID: \"bb7df1d1-6bd7-4b9b-a1ef-95722b7fc811\") " Feb 26 16:04:01 crc kubenswrapper[4907]: I0226 16:04:01.689845 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-r28gk\" (UniqueName: \"kubernetes.io/projected/bb7df1d1-6bd7-4b9b-a1ef-95722b7fc811-kube-api-access-r28gk\") pod \"bb7df1d1-6bd7-4b9b-a1ef-95722b7fc811\" (UID: \"bb7df1d1-6bd7-4b9b-a1ef-95722b7fc811\") " Feb 26 16:04:01 crc kubenswrapper[4907]: I0226 16:04:01.689891 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/bb7df1d1-6bd7-4b9b-a1ef-95722b7fc811-dns-swift-storage-0\") pod \"bb7df1d1-6bd7-4b9b-a1ef-95722b7fc811\" (UID: \"bb7df1d1-6bd7-4b9b-a1ef-95722b7fc811\") " Feb 26 16:04:01 crc kubenswrapper[4907]: I0226 16:04:01.689906 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/bb7df1d1-6bd7-4b9b-a1ef-95722b7fc811-config\") pod \"bb7df1d1-6bd7-4b9b-a1ef-95722b7fc811\" (UID: \"bb7df1d1-6bd7-4b9b-a1ef-95722b7fc811\") " Feb 26 
16:04:01 crc kubenswrapper[4907]: I0226 16:04:01.689976 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/bb7df1d1-6bd7-4b9b-a1ef-95722b7fc811-ovsdbserver-nb\") pod \"bb7df1d1-6bd7-4b9b-a1ef-95722b7fc811\" (UID: \"bb7df1d1-6bd7-4b9b-a1ef-95722b7fc811\") " Feb 26 16:04:01 crc kubenswrapper[4907]: I0226 16:04:01.690205 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pgdsk\" (UniqueName: \"kubernetes.io/projected/b591cc9e-aa47-48dc-9462-a54cd3bbbaa8-kube-api-access-pgdsk\") pod \"horizon-594d447db9-7p2nh\" (UID: \"b591cc9e-aa47-48dc-9462-a54cd3bbbaa8\") " pod="openstack/horizon-594d447db9-7p2nh" Feb 26 16:04:01 crc kubenswrapper[4907]: I0226 16:04:01.690226 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/b591cc9e-aa47-48dc-9462-a54cd3bbbaa8-logs\") pod \"horizon-594d447db9-7p2nh\" (UID: \"b591cc9e-aa47-48dc-9462-a54cd3bbbaa8\") " pod="openstack/horizon-594d447db9-7p2nh" Feb 26 16:04:01 crc kubenswrapper[4907]: I0226 16:04:01.690267 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/b591cc9e-aa47-48dc-9462-a54cd3bbbaa8-scripts\") pod \"horizon-594d447db9-7p2nh\" (UID: \"b591cc9e-aa47-48dc-9462-a54cd3bbbaa8\") " pod="openstack/horizon-594d447db9-7p2nh" Feb 26 16:04:01 crc kubenswrapper[4907]: I0226 16:04:01.690310 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/b591cc9e-aa47-48dc-9462-a54cd3bbbaa8-config-data\") pod \"horizon-594d447db9-7p2nh\" (UID: \"b591cc9e-aa47-48dc-9462-a54cd3bbbaa8\") " pod="openstack/horizon-594d447db9-7p2nh" Feb 26 16:04:01 crc kubenswrapper[4907]: I0226 16:04:01.690353 4907 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/b591cc9e-aa47-48dc-9462-a54cd3bbbaa8-horizon-secret-key\") pod \"horizon-594d447db9-7p2nh\" (UID: \"b591cc9e-aa47-48dc-9462-a54cd3bbbaa8\") " pod="openstack/horizon-594d447db9-7p2nh" Feb 26 16:04:01 crc kubenswrapper[4907]: I0226 16:04:01.696419 4907 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Feb 26 16:04:01 crc kubenswrapper[4907]: I0226 16:04:01.714936 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-sync-sg95t" event={"ID":"1ae29e7c-7f4a-492f-b10d-2badd4d606aa","Type":"ContainerStarted","Data":"7d8384c340f47ec5e939b3ead6a4f8659accead8c6b60732d40de8114e9324a0"} Feb 26 16:04:01 crc kubenswrapper[4907]: I0226 16:04:01.768959 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bb7df1d1-6bd7-4b9b-a1ef-95722b7fc811-kube-api-access-r28gk" (OuterVolumeSpecName: "kube-api-access-r28gk") pod "bb7df1d1-6bd7-4b9b-a1ef-95722b7fc811" (UID: "bb7df1d1-6bd7-4b9b-a1ef-95722b7fc811"). InnerVolumeSpecName "kube-api-access-r28gk". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 26 16:04:01 crc kubenswrapper[4907]: I0226 16:04:01.792179 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/b591cc9e-aa47-48dc-9462-a54cd3bbbaa8-horizon-secret-key\") pod \"horizon-594d447db9-7p2nh\" (UID: \"b591cc9e-aa47-48dc-9462-a54cd3bbbaa8\") " pod="openstack/horizon-594d447db9-7p2nh" Feb 26 16:04:01 crc kubenswrapper[4907]: I0226 16:04:01.804981 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pgdsk\" (UniqueName: \"kubernetes.io/projected/b591cc9e-aa47-48dc-9462-a54cd3bbbaa8-kube-api-access-pgdsk\") pod \"horizon-594d447db9-7p2nh\" (UID: \"b591cc9e-aa47-48dc-9462-a54cd3bbbaa8\") " pod="openstack/horizon-594d447db9-7p2nh" Feb 26 16:04:01 crc kubenswrapper[4907]: I0226 16:04:01.805235 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/b591cc9e-aa47-48dc-9462-a54cd3bbbaa8-logs\") pod \"horizon-594d447db9-7p2nh\" (UID: \"b591cc9e-aa47-48dc-9462-a54cd3bbbaa8\") " pod="openstack/horizon-594d447db9-7p2nh" Feb 26 16:04:01 crc kubenswrapper[4907]: I0226 16:04:01.805420 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/b591cc9e-aa47-48dc-9462-a54cd3bbbaa8-scripts\") pod \"horizon-594d447db9-7p2nh\" (UID: \"b591cc9e-aa47-48dc-9462-a54cd3bbbaa8\") " pod="openstack/horizon-594d447db9-7p2nh" Feb 26 16:04:01 crc kubenswrapper[4907]: I0226 16:04:01.805583 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/b591cc9e-aa47-48dc-9462-a54cd3bbbaa8-config-data\") pod \"horizon-594d447db9-7p2nh\" (UID: \"b591cc9e-aa47-48dc-9462-a54cd3bbbaa8\") " pod="openstack/horizon-594d447db9-7p2nh" Feb 26 16:04:01 crc kubenswrapper[4907]: I0226 
16:04:01.805760 4907 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-r28gk\" (UniqueName: \"kubernetes.io/projected/bb7df1d1-6bd7-4b9b-a1ef-95722b7fc811-kube-api-access-r28gk\") on node \"crc\" DevicePath \"\"" Feb 26 16:04:01 crc kubenswrapper[4907]: I0226 16:04:01.806992 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/b591cc9e-aa47-48dc-9462-a54cd3bbbaa8-config-data\") pod \"horizon-594d447db9-7p2nh\" (UID: \"b591cc9e-aa47-48dc-9462-a54cd3bbbaa8\") " pod="openstack/horizon-594d447db9-7p2nh" Feb 26 16:04:01 crc kubenswrapper[4907]: I0226 16:04:01.807521 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/b591cc9e-aa47-48dc-9462-a54cd3bbbaa8-logs\") pod \"horizon-594d447db9-7p2nh\" (UID: \"b591cc9e-aa47-48dc-9462-a54cd3bbbaa8\") " pod="openstack/horizon-594d447db9-7p2nh" Feb 26 16:04:01 crc kubenswrapper[4907]: I0226 16:04:01.807639 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/b591cc9e-aa47-48dc-9462-a54cd3bbbaa8-horizon-secret-key\") pod \"horizon-594d447db9-7p2nh\" (UID: \"b591cc9e-aa47-48dc-9462-a54cd3bbbaa8\") " pod="openstack/horizon-594d447db9-7p2nh" Feb 26 16:04:01 crc kubenswrapper[4907]: I0226 16:04:01.807547 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"429e4875-18c7-4a0a-bfea-135d7aec6ba0","Type":"ContainerStarted","Data":"0457368ebf2e749dc65e07a1276373b2f070582382c01ef1f135f773ce5e14af"} Feb 26 16:04:01 crc kubenswrapper[4907]: I0226 16:04:01.808165 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/b591cc9e-aa47-48dc-9462-a54cd3bbbaa8-scripts\") pod \"horizon-594d447db9-7p2nh\" (UID: \"b591cc9e-aa47-48dc-9462-a54cd3bbbaa8\") " pod="openstack/horizon-594d447db9-7p2nh" Feb 26 
16:04:01 crc kubenswrapper[4907]: I0226 16:04:01.844212 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-sync-slrvx" event={"ID":"a02d2622-77ed-4949-95b5-4f5ae5f1c47d","Type":"ContainerStarted","Data":"7f174eb188a2d21ce5510fcc0be89fb379aa859301ab35fd121402c3359f91d8"} Feb 26 16:04:01 crc kubenswrapper[4907]: I0226 16:04:01.846892 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/bb7df1d1-6bd7-4b9b-a1ef-95722b7fc811-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "bb7df1d1-6bd7-4b9b-a1ef-95722b7fc811" (UID: "bb7df1d1-6bd7-4b9b-a1ef-95722b7fc811"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 26 16:04:01 crc kubenswrapper[4907]: I0226 16:04:01.865828 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"1ca54bc7-66c3-4e7a-a4a5-7eea6f3d1fec","Type":"ContainerStarted","Data":"078e6ff5dcf430000c862e73512ee8d0b1ced25c0ce25a6433cd23e2f06aba2a"} Feb 26 16:04:01 crc kubenswrapper[4907]: I0226 16:04:01.866754 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pgdsk\" (UniqueName: \"kubernetes.io/projected/b591cc9e-aa47-48dc-9462-a54cd3bbbaa8-kube-api-access-pgdsk\") pod \"horizon-594d447db9-7p2nh\" (UID: \"b591cc9e-aa47-48dc-9462-a54cd3bbbaa8\") " pod="openstack/horizon-594d447db9-7p2nh" Feb 26 16:04:01 crc kubenswrapper[4907]: I0226 16:04:01.871959 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/bb7df1d1-6bd7-4b9b-a1ef-95722b7fc811-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "bb7df1d1-6bd7-4b9b-a1ef-95722b7fc811" (UID: "bb7df1d1-6bd7-4b9b-a1ef-95722b7fc811"). InnerVolumeSpecName "dns-swift-storage-0". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 26 16:04:01 crc kubenswrapper[4907]: I0226 16:04:01.900117 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/bb7df1d1-6bd7-4b9b-a1ef-95722b7fc811-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "bb7df1d1-6bd7-4b9b-a1ef-95722b7fc811" (UID: "bb7df1d1-6bd7-4b9b-a1ef-95722b7fc811"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 26 16:04:01 crc kubenswrapper[4907]: I0226 16:04:01.902610 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/bb7df1d1-6bd7-4b9b-a1ef-95722b7fc811-config" (OuterVolumeSpecName: "config") pod "bb7df1d1-6bd7-4b9b-a1ef-95722b7fc811" (UID: "bb7df1d1-6bd7-4b9b-a1ef-95722b7fc811"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 26 16:04:01 crc kubenswrapper[4907]: I0226 16:04:01.902959 4907 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/horizon-594d447db9-7p2nh" Feb 26 16:04:01 crc kubenswrapper[4907]: I0226 16:04:01.906007 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-56df8fb6b7-ssd6q" event={"ID":"16fe2d8c-c6a9-49dc-96a8-fa766cdb98d2","Type":"ContainerStarted","Data":"2f8ff938b8ca8578b1c2010cd415367806c641cbf907da82b020cb3994b6d420"} Feb 26 16:04:01 crc kubenswrapper[4907]: I0226 16:04:01.922141 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/bb7df1d1-6bd7-4b9b-a1ef-95722b7fc811-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "bb7df1d1-6bd7-4b9b-a1ef-95722b7fc811" (UID: "bb7df1d1-6bd7-4b9b-a1ef-95722b7fc811"). InnerVolumeSpecName "ovsdbserver-nb". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 26 16:04:01 crc kubenswrapper[4907]: I0226 16:04:01.927996 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/bb7df1d1-6bd7-4b9b-a1ef-95722b7fc811-ovsdbserver-nb\") pod \"bb7df1d1-6bd7-4b9b-a1ef-95722b7fc811\" (UID: \"bb7df1d1-6bd7-4b9b-a1ef-95722b7fc811\") " Feb 26 16:04:01 crc kubenswrapper[4907]: W0226 16:04:01.928143 4907 empty_dir.go:500] Warning: Unmount skipped because path does not exist: /var/lib/kubelet/pods/bb7df1d1-6bd7-4b9b-a1ef-95722b7fc811/volumes/kubernetes.io~configmap/ovsdbserver-nb Feb 26 16:04:01 crc kubenswrapper[4907]: I0226 16:04:01.928167 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/bb7df1d1-6bd7-4b9b-a1ef-95722b7fc811-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "bb7df1d1-6bd7-4b9b-a1ef-95722b7fc811" (UID: "bb7df1d1-6bd7-4b9b-a1ef-95722b7fc811"). InnerVolumeSpecName "ovsdbserver-nb". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 26 16:04:01 crc kubenswrapper[4907]: I0226 16:04:01.928859 4907 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/bb7df1d1-6bd7-4b9b-a1ef-95722b7fc811-dns-svc\") on node \"crc\" DevicePath \"\"" Feb 26 16:04:01 crc kubenswrapper[4907]: I0226 16:04:01.928872 4907 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/bb7df1d1-6bd7-4b9b-a1ef-95722b7fc811-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Feb 26 16:04:01 crc kubenswrapper[4907]: I0226 16:04:01.928882 4907 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/bb7df1d1-6bd7-4b9b-a1ef-95722b7fc811-dns-swift-storage-0\") on node \"crc\" DevicePath \"\"" Feb 26 16:04:01 crc kubenswrapper[4907]: I0226 16:04:01.928892 4907 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/bb7df1d1-6bd7-4b9b-a1ef-95722b7fc811-config\") on node \"crc\" DevicePath \"\"" Feb 26 16:04:01 crc kubenswrapper[4907]: I0226 16:04:01.936080 4907 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/neutron-db-sync-sg95t" podStartSLOduration=4.936057367 podStartE2EDuration="4.936057367s" podCreationTimestamp="2026-02-26 16:03:57 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-26 16:04:01.803176307 +0000 UTC m=+1304.321738156" watchObservedRunningTime="2026-02-26 16:04:01.936057367 +0000 UTC m=+1304.454619216" Feb 26 16:04:01 crc kubenswrapper[4907]: I0226 16:04:01.959297 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-bbf5cc879-8lgxh" event={"ID":"bb7df1d1-6bd7-4b9b-a1ef-95722b7fc811","Type":"ContainerDied","Data":"cd956dbc3cdc3b70dcbc9a3d9759fec1b5e03b4699a440e71e62d373724e760b"} Feb 26 16:04:01 
crc kubenswrapper[4907]: I0226 16:04:01.959343 4907 scope.go:117] "RemoveContainer" containerID="e9f9457b9ee191495b288e955616f2f4c19de566bd49ca309f6954246ee85e78" Feb 26 16:04:01 crc kubenswrapper[4907]: I0226 16:04:01.959488 4907 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-bbf5cc879-8lgxh" Feb 26 16:04:01 crc kubenswrapper[4907]: I0226 16:04:01.980261 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-766d888d6c-8sqt7" event={"ID":"90cdbc73-317b-4479-9908-3712b34ce77d","Type":"ContainerStarted","Data":"01aa70d164b3641fb0d01a8d3c7102f4464476aa5a1d213ad74aaf6ade5a2d06"} Feb 26 16:04:01 crc kubenswrapper[4907]: I0226 16:04:01.980576 4907 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-internal-api-0"] Feb 26 16:04:01 crc kubenswrapper[4907]: I0226 16:04:01.997073 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-sync-6t72w" event={"ID":"4cbb7c75-3f73-4181-b214-cdfb8d9ffd9a","Type":"ContainerStarted","Data":"4e5c036af10db7b9bb532574863bd6a6278d5a6d948e407606249b2921e9eb0b"} Feb 26 16:04:02 crc kubenswrapper[4907]: I0226 16:04:02.031923 4907 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/bb7df1d1-6bd7-4b9b-a1ef-95722b7fc811-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Feb 26 16:04:02 crc kubenswrapper[4907]: I0226 16:04:02.105659 4907 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-bbf5cc879-8lgxh"] Feb 26 16:04:02 crc kubenswrapper[4907]: I0226 16:04:02.159503 4907 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6ee71975-a322-4ae3-99a4-7bd42e1d3761" path="/var/lib/kubelet/pods/6ee71975-a322-4ae3-99a4-7bd42e1d3761/volumes" Feb 26 16:04:02 crc kubenswrapper[4907]: I0226 16:04:02.160457 4907 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-bbf5cc879-8lgxh"] Feb 26 16:04:02 
crc kubenswrapper[4907]: I0226 16:04:02.406134 4907 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29535364-t8qd8"] Feb 26 16:04:02 crc kubenswrapper[4907]: I0226 16:04:02.951482 4907 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/horizon-594d447db9-7p2nh"] Feb 26 16:04:02 crc kubenswrapper[4907]: W0226 16:04:02.978416 4907 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podb591cc9e_aa47_48dc_9462_a54cd3bbbaa8.slice/crio-fcaf6df33fe71f9f9fcd282abdc05a69c3e4a21483a529728e0ec777762ac0ce WatchSource:0}: Error finding container fcaf6df33fe71f9f9fcd282abdc05a69c3e4a21483a529728e0ec777762ac0ce: Status 404 returned error can't find the container with id fcaf6df33fe71f9f9fcd282abdc05a69c3e4a21483a529728e0ec777762ac0ce Feb 26 16:04:03 crc kubenswrapper[4907]: I0226 16:04:03.044798 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"e77cb0a0-f383-4bb5-b29e-e000a56a7a1f","Type":"ContainerStarted","Data":"d0f0be26976f6f288e2853be7e4244ef66b7979305a75f5e0b36739868b17bea"} Feb 26 16:04:03 crc kubenswrapper[4907]: I0226 16:04:03.056659 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-594d447db9-7p2nh" event={"ID":"b591cc9e-aa47-48dc-9462-a54cd3bbbaa8","Type":"ContainerStarted","Data":"fcaf6df33fe71f9f9fcd282abdc05a69c3e4a21483a529728e0ec777762ac0ce"} Feb 26 16:04:03 crc kubenswrapper[4907]: I0226 16:04:03.059623 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-56df8fb6b7-ssd6q" event={"ID":"16fe2d8c-c6a9-49dc-96a8-fa766cdb98d2","Type":"ContainerDied","Data":"39335f4f2b14533ae9264b2cac3796ab9c192b8b3084a213715ab7bd87a34764"} Feb 26 16:04:03 crc kubenswrapper[4907]: I0226 16:04:03.059660 4907 generic.go:334] "Generic (PLEG): container finished" podID="16fe2d8c-c6a9-49dc-96a8-fa766cdb98d2" 
containerID="39335f4f2b14533ae9264b2cac3796ab9c192b8b3084a213715ab7bd87a34764" exitCode=0 Feb 26 16:04:03 crc kubenswrapper[4907]: I0226 16:04:03.066027 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29535364-t8qd8" event={"ID":"b2b66b18-ac41-4d84-9ae1-5900c27d0d7d","Type":"ContainerStarted","Data":"04cf445c772d52e3c125c160efb7a092e10254560adce75616a80cbb4e4b2416"} Feb 26 16:04:04 crc kubenswrapper[4907]: I0226 16:04:04.090686 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-56df8fb6b7-ssd6q" event={"ID":"16fe2d8c-c6a9-49dc-96a8-fa766cdb98d2","Type":"ContainerStarted","Data":"2f31d6369311bd67b490188786d5fc486c8f23a5573ac8fb19049224a8024306"} Feb 26 16:04:04 crc kubenswrapper[4907]: I0226 16:04:04.091558 4907 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-56df8fb6b7-ssd6q" Feb 26 16:04:04 crc kubenswrapper[4907]: I0226 16:04:04.104575 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"e77cb0a0-f383-4bb5-b29e-e000a56a7a1f","Type":"ContainerStarted","Data":"bbf40f72132b8463f042e5cbd6f1edf0663e56f8654c532a459a427e7d565513"} Feb 26 16:04:04 crc kubenswrapper[4907]: I0226 16:04:04.107394 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"1ca54bc7-66c3-4e7a-a4a5-7eea6f3d1fec","Type":"ContainerStarted","Data":"f475f7cfb1e058af1119bda095b6f8623f43674bb4901a8294f0bf24d8e55701"} Feb 26 16:04:04 crc kubenswrapper[4907]: I0226 16:04:04.119534 4907 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-56df8fb6b7-ssd6q" podStartSLOduration=6.119518006 podStartE2EDuration="6.119518006s" podCreationTimestamp="2026-02-26 16:03:58 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-26 16:04:04.117732764 
+0000 UTC m=+1306.636294623" watchObservedRunningTime="2026-02-26 16:04:04.119518006 +0000 UTC m=+1306.638079845" Feb 26 16:04:04 crc kubenswrapper[4907]: I0226 16:04:04.143784 4907 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="bb7df1d1-6bd7-4b9b-a1ef-95722b7fc811" path="/var/lib/kubelet/pods/bb7df1d1-6bd7-4b9b-a1ef-95722b7fc811/volumes" Feb 26 16:04:05 crc kubenswrapper[4907]: I0226 16:04:05.187564 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29535364-t8qd8" event={"ID":"b2b66b18-ac41-4d84-9ae1-5900c27d0d7d","Type":"ContainerStarted","Data":"a1c12bb904185e9dd91c784f19d207ab94fd40089de7556c864ea01a85875ce2"} Feb 26 16:04:06 crc kubenswrapper[4907]: I0226 16:04:06.214119 4907 generic.go:334] "Generic (PLEG): container finished" podID="b2b66b18-ac41-4d84-9ae1-5900c27d0d7d" containerID="a1c12bb904185e9dd91c784f19d207ab94fd40089de7556c864ea01a85875ce2" exitCode=0 Feb 26 16:04:06 crc kubenswrapper[4907]: I0226 16:04:06.214457 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29535364-t8qd8" event={"ID":"b2b66b18-ac41-4d84-9ae1-5900c27d0d7d","Type":"ContainerDied","Data":"a1c12bb904185e9dd91c784f19d207ab94fd40089de7556c864ea01a85875ce2"} Feb 26 16:04:06 crc kubenswrapper[4907]: I0226 16:04:06.226546 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"e77cb0a0-f383-4bb5-b29e-e000a56a7a1f","Type":"ContainerStarted","Data":"a7aa80c5dd949f99eb0f35cfc3f25c43599eeac522fc68406f7eda1a501c7bcf"} Feb 26 16:04:06 crc kubenswrapper[4907]: I0226 16:04:06.226743 4907 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-internal-api-0" podUID="e77cb0a0-f383-4bb5-b29e-e000a56a7a1f" containerName="glance-log" containerID="cri-o://bbf40f72132b8463f042e5cbd6f1edf0663e56f8654c532a459a427e7d565513" gracePeriod=30 Feb 26 16:04:06 crc kubenswrapper[4907]: I0226 
16:04:06.227959 4907 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-internal-api-0" podUID="e77cb0a0-f383-4bb5-b29e-e000a56a7a1f" containerName="glance-httpd" containerID="cri-o://a7aa80c5dd949f99eb0f35cfc3f25c43599eeac522fc68406f7eda1a501c7bcf" gracePeriod=30 Feb 26 16:04:06 crc kubenswrapper[4907]: I0226 16:04:06.243216 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"1ca54bc7-66c3-4e7a-a4a5-7eea6f3d1fec","Type":"ContainerStarted","Data":"8ee41a2cda68419d1d2f0335fc3443029fa9e5671b0a1f151d84918008aee590"} Feb 26 16:04:06 crc kubenswrapper[4907]: I0226 16:04:06.243325 4907 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-external-api-0" podUID="1ca54bc7-66c3-4e7a-a4a5-7eea6f3d1fec" containerName="glance-log" containerID="cri-o://f475f7cfb1e058af1119bda095b6f8623f43674bb4901a8294f0bf24d8e55701" gracePeriod=30 Feb 26 16:04:06 crc kubenswrapper[4907]: I0226 16:04:06.243344 4907 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-external-api-0" podUID="1ca54bc7-66c3-4e7a-a4a5-7eea6f3d1fec" containerName="glance-httpd" containerID="cri-o://8ee41a2cda68419d1d2f0335fc3443029fa9e5671b0a1f151d84918008aee590" gracePeriod=30 Feb 26 16:04:06 crc kubenswrapper[4907]: I0226 16:04:06.321833 4907 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-default-internal-api-0" podStartSLOduration=8.321810815 podStartE2EDuration="8.321810815s" podCreationTimestamp="2026-02-26 16:03:58 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-26 16:04:06.278562943 +0000 UTC m=+1308.797124812" watchObservedRunningTime="2026-02-26 16:04:06.321810815 +0000 UTC m=+1308.840372664" Feb 26 16:04:07 crc kubenswrapper[4907]: I0226 16:04:07.309894 4907 
generic.go:334] "Generic (PLEG): container finished" podID="e77cb0a0-f383-4bb5-b29e-e000a56a7a1f" containerID="a7aa80c5dd949f99eb0f35cfc3f25c43599eeac522fc68406f7eda1a501c7bcf" exitCode=0 Feb 26 16:04:07 crc kubenswrapper[4907]: I0226 16:04:07.310215 4907 generic.go:334] "Generic (PLEG): container finished" podID="e77cb0a0-f383-4bb5-b29e-e000a56a7a1f" containerID="bbf40f72132b8463f042e5cbd6f1edf0663e56f8654c532a459a427e7d565513" exitCode=143 Feb 26 16:04:07 crc kubenswrapper[4907]: I0226 16:04:07.310289 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"e77cb0a0-f383-4bb5-b29e-e000a56a7a1f","Type":"ContainerDied","Data":"a7aa80c5dd949f99eb0f35cfc3f25c43599eeac522fc68406f7eda1a501c7bcf"} Feb 26 16:04:07 crc kubenswrapper[4907]: I0226 16:04:07.310319 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"e77cb0a0-f383-4bb5-b29e-e000a56a7a1f","Type":"ContainerDied","Data":"bbf40f72132b8463f042e5cbd6f1edf0663e56f8654c532a459a427e7d565513"} Feb 26 16:04:07 crc kubenswrapper[4907]: I0226 16:04:07.311401 4907 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-default-external-api-0" podStartSLOduration=9.311386252 podStartE2EDuration="9.311386252s" podCreationTimestamp="2026-02-26 16:03:58 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-26 16:04:06.314754306 +0000 UTC m=+1308.833316175" watchObservedRunningTime="2026-02-26 16:04:07.311386252 +0000 UTC m=+1309.829948101" Feb 26 16:04:07 crc kubenswrapper[4907]: I0226 16:04:07.329112 4907 generic.go:334] "Generic (PLEG): container finished" podID="1ca54bc7-66c3-4e7a-a4a5-7eea6f3d1fec" containerID="8ee41a2cda68419d1d2f0335fc3443029fa9e5671b0a1f151d84918008aee590" exitCode=0 Feb 26 16:04:07 crc kubenswrapper[4907]: I0226 16:04:07.329152 4907 generic.go:334] "Generic (PLEG): 
container finished" podID="1ca54bc7-66c3-4e7a-a4a5-7eea6f3d1fec" containerID="f475f7cfb1e058af1119bda095b6f8623f43674bb4901a8294f0bf24d8e55701" exitCode=143 Feb 26 16:04:07 crc kubenswrapper[4907]: I0226 16:04:07.329341 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"1ca54bc7-66c3-4e7a-a4a5-7eea6f3d1fec","Type":"ContainerDied","Data":"8ee41a2cda68419d1d2f0335fc3443029fa9e5671b0a1f151d84918008aee590"} Feb 26 16:04:07 crc kubenswrapper[4907]: I0226 16:04:07.329375 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"1ca54bc7-66c3-4e7a-a4a5-7eea6f3d1fec","Type":"ContainerDied","Data":"f475f7cfb1e058af1119bda095b6f8623f43674bb4901a8294f0bf24d8e55701"} Feb 26 16:04:07 crc kubenswrapper[4907]: I0226 16:04:07.330444 4907 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/horizon-766d888d6c-8sqt7"] Feb 26 16:04:07 crc kubenswrapper[4907]: I0226 16:04:07.379241 4907 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/horizon-6fccfb8496-4tqhr"] Feb 26 16:04:07 crc kubenswrapper[4907]: E0226 16:04:07.380391 4907 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="bb7df1d1-6bd7-4b9b-a1ef-95722b7fc811" containerName="init" Feb 26 16:04:07 crc kubenswrapper[4907]: I0226 16:04:07.383335 4907 state_mem.go:107] "Deleted CPUSet assignment" podUID="bb7df1d1-6bd7-4b9b-a1ef-95722b7fc811" containerName="init" Feb 26 16:04:07 crc kubenswrapper[4907]: I0226 16:04:07.383873 4907 memory_manager.go:354] "RemoveStaleState removing state" podUID="bb7df1d1-6bd7-4b9b-a1ef-95722b7fc811" containerName="init" Feb 26 16:04:07 crc kubenswrapper[4907]: I0226 16:04:07.389741 4907 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/horizon-6fccfb8496-4tqhr" Feb 26 16:04:07 crc kubenswrapper[4907]: I0226 16:04:07.404909 4907 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-horizon-svc" Feb 26 16:04:07 crc kubenswrapper[4907]: I0226 16:04:07.405606 4907 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/horizon-6fccfb8496-4tqhr"] Feb 26 16:04:07 crc kubenswrapper[4907]: I0226 16:04:07.484497 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"horizon-tls-certs\" (UniqueName: \"kubernetes.io/secret/911d5df8-d8e2-4552-9c75-33c5ab72646b-horizon-tls-certs\") pod \"horizon-6fccfb8496-4tqhr\" (UID: \"911d5df8-d8e2-4552-9c75-33c5ab72646b\") " pod="openstack/horizon-6fccfb8496-4tqhr" Feb 26 16:04:07 crc kubenswrapper[4907]: I0226 16:04:07.484601 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/911d5df8-d8e2-4552-9c75-33c5ab72646b-config-data\") pod \"horizon-6fccfb8496-4tqhr\" (UID: \"911d5df8-d8e2-4552-9c75-33c5ab72646b\") " pod="openstack/horizon-6fccfb8496-4tqhr" Feb 26 16:04:07 crc kubenswrapper[4907]: I0226 16:04:07.484631 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/911d5df8-d8e2-4552-9c75-33c5ab72646b-combined-ca-bundle\") pod \"horizon-6fccfb8496-4tqhr\" (UID: \"911d5df8-d8e2-4552-9c75-33c5ab72646b\") " pod="openstack/horizon-6fccfb8496-4tqhr" Feb 26 16:04:07 crc kubenswrapper[4907]: I0226 16:04:07.484662 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/911d5df8-d8e2-4552-9c75-33c5ab72646b-horizon-secret-key\") pod \"horizon-6fccfb8496-4tqhr\" (UID: \"911d5df8-d8e2-4552-9c75-33c5ab72646b\") " pod="openstack/horizon-6fccfb8496-4tqhr" Feb 
26 16:04:07 crc kubenswrapper[4907]: I0226 16:04:07.484687 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/911d5df8-d8e2-4552-9c75-33c5ab72646b-logs\") pod \"horizon-6fccfb8496-4tqhr\" (UID: \"911d5df8-d8e2-4552-9c75-33c5ab72646b\") " pod="openstack/horizon-6fccfb8496-4tqhr" Feb 26 16:04:07 crc kubenswrapper[4907]: I0226 16:04:07.484717 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/911d5df8-d8e2-4552-9c75-33c5ab72646b-scripts\") pod \"horizon-6fccfb8496-4tqhr\" (UID: \"911d5df8-d8e2-4552-9c75-33c5ab72646b\") " pod="openstack/horizon-6fccfb8496-4tqhr" Feb 26 16:04:07 crc kubenswrapper[4907]: I0226 16:04:07.484762 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fgd7r\" (UniqueName: \"kubernetes.io/projected/911d5df8-d8e2-4552-9c75-33c5ab72646b-kube-api-access-fgd7r\") pod \"horizon-6fccfb8496-4tqhr\" (UID: \"911d5df8-d8e2-4552-9c75-33c5ab72646b\") " pod="openstack/horizon-6fccfb8496-4tqhr" Feb 26 16:04:07 crc kubenswrapper[4907]: I0226 16:04:07.538109 4907 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/horizon-594d447db9-7p2nh"] Feb 26 16:04:07 crc kubenswrapper[4907]: I0226 16:04:07.586189 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fgd7r\" (UniqueName: \"kubernetes.io/projected/911d5df8-d8e2-4552-9c75-33c5ab72646b-kube-api-access-fgd7r\") pod \"horizon-6fccfb8496-4tqhr\" (UID: \"911d5df8-d8e2-4552-9c75-33c5ab72646b\") " pod="openstack/horizon-6fccfb8496-4tqhr" Feb 26 16:04:07 crc kubenswrapper[4907]: I0226 16:04:07.586281 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"horizon-tls-certs\" (UniqueName: \"kubernetes.io/secret/911d5df8-d8e2-4552-9c75-33c5ab72646b-horizon-tls-certs\") pod 
\"horizon-6fccfb8496-4tqhr\" (UID: \"911d5df8-d8e2-4552-9c75-33c5ab72646b\") " pod="openstack/horizon-6fccfb8496-4tqhr" Feb 26 16:04:07 crc kubenswrapper[4907]: I0226 16:04:07.586422 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/911d5df8-d8e2-4552-9c75-33c5ab72646b-config-data\") pod \"horizon-6fccfb8496-4tqhr\" (UID: \"911d5df8-d8e2-4552-9c75-33c5ab72646b\") " pod="openstack/horizon-6fccfb8496-4tqhr" Feb 26 16:04:07 crc kubenswrapper[4907]: I0226 16:04:07.586462 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/911d5df8-d8e2-4552-9c75-33c5ab72646b-combined-ca-bundle\") pod \"horizon-6fccfb8496-4tqhr\" (UID: \"911d5df8-d8e2-4552-9c75-33c5ab72646b\") " pod="openstack/horizon-6fccfb8496-4tqhr" Feb 26 16:04:07 crc kubenswrapper[4907]: I0226 16:04:07.586519 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/911d5df8-d8e2-4552-9c75-33c5ab72646b-horizon-secret-key\") pod \"horizon-6fccfb8496-4tqhr\" (UID: \"911d5df8-d8e2-4552-9c75-33c5ab72646b\") " pod="openstack/horizon-6fccfb8496-4tqhr" Feb 26 16:04:07 crc kubenswrapper[4907]: I0226 16:04:07.586553 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/911d5df8-d8e2-4552-9c75-33c5ab72646b-logs\") pod \"horizon-6fccfb8496-4tqhr\" (UID: \"911d5df8-d8e2-4552-9c75-33c5ab72646b\") " pod="openstack/horizon-6fccfb8496-4tqhr" Feb 26 16:04:07 crc kubenswrapper[4907]: I0226 16:04:07.586630 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/911d5df8-d8e2-4552-9c75-33c5ab72646b-scripts\") pod \"horizon-6fccfb8496-4tqhr\" (UID: \"911d5df8-d8e2-4552-9c75-33c5ab72646b\") " pod="openstack/horizon-6fccfb8496-4tqhr" Feb 
26 16:04:07 crc kubenswrapper[4907]: I0226 16:04:07.587547 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/911d5df8-d8e2-4552-9c75-33c5ab72646b-scripts\") pod \"horizon-6fccfb8496-4tqhr\" (UID: \"911d5df8-d8e2-4552-9c75-33c5ab72646b\") " pod="openstack/horizon-6fccfb8496-4tqhr" Feb 26 16:04:07 crc kubenswrapper[4907]: I0226 16:04:07.588121 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/911d5df8-d8e2-4552-9c75-33c5ab72646b-config-data\") pod \"horizon-6fccfb8496-4tqhr\" (UID: \"911d5df8-d8e2-4552-9c75-33c5ab72646b\") " pod="openstack/horizon-6fccfb8496-4tqhr" Feb 26 16:04:07 crc kubenswrapper[4907]: I0226 16:04:07.594701 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/911d5df8-d8e2-4552-9c75-33c5ab72646b-logs\") pod \"horizon-6fccfb8496-4tqhr\" (UID: \"911d5df8-d8e2-4552-9c75-33c5ab72646b\") " pod="openstack/horizon-6fccfb8496-4tqhr" Feb 26 16:04:07 crc kubenswrapper[4907]: I0226 16:04:07.595792 4907 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/horizon-76d88967b8-wmzcw"] Feb 26 16:04:07 crc kubenswrapper[4907]: I0226 16:04:07.603110 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/911d5df8-d8e2-4552-9c75-33c5ab72646b-combined-ca-bundle\") pod \"horizon-6fccfb8496-4tqhr\" (UID: \"911d5df8-d8e2-4552-9c75-33c5ab72646b\") " pod="openstack/horizon-6fccfb8496-4tqhr" Feb 26 16:04:07 crc kubenswrapper[4907]: I0226 16:04:07.603938 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"horizon-tls-certs\" (UniqueName: \"kubernetes.io/secret/911d5df8-d8e2-4552-9c75-33c5ab72646b-horizon-tls-certs\") pod \"horizon-6fccfb8496-4tqhr\" (UID: \"911d5df8-d8e2-4552-9c75-33c5ab72646b\") " pod="openstack/horizon-6fccfb8496-4tqhr" Feb 26 
16:04:07 crc kubenswrapper[4907]: I0226 16:04:07.605506 4907 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/horizon-76d88967b8-wmzcw" Feb 26 16:04:07 crc kubenswrapper[4907]: I0226 16:04:07.645127 4907 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/horizon-76d88967b8-wmzcw"] Feb 26 16:04:07 crc kubenswrapper[4907]: I0226 16:04:07.651235 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/911d5df8-d8e2-4552-9c75-33c5ab72646b-horizon-secret-key\") pod \"horizon-6fccfb8496-4tqhr\" (UID: \"911d5df8-d8e2-4552-9c75-33c5ab72646b\") " pod="openstack/horizon-6fccfb8496-4tqhr" Feb 26 16:04:07 crc kubenswrapper[4907]: I0226 16:04:07.662376 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fgd7r\" (UniqueName: \"kubernetes.io/projected/911d5df8-d8e2-4552-9c75-33c5ab72646b-kube-api-access-fgd7r\") pod \"horizon-6fccfb8496-4tqhr\" (UID: \"911d5df8-d8e2-4552-9c75-33c5ab72646b\") " pod="openstack/horizon-6fccfb8496-4tqhr" Feb 26 16:04:07 crc kubenswrapper[4907]: I0226 16:04:07.747315 4907 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/horizon-6fccfb8496-4tqhr" Feb 26 16:04:07 crc kubenswrapper[4907]: I0226 16:04:07.789269 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/b35f87c4-e535-4901-8814-0b321b201158-horizon-secret-key\") pod \"horizon-76d88967b8-wmzcw\" (UID: \"b35f87c4-e535-4901-8814-0b321b201158\") " pod="openstack/horizon-76d88967b8-wmzcw" Feb 26 16:04:07 crc kubenswrapper[4907]: I0226 16:04:07.789325 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b35f87c4-e535-4901-8814-0b321b201158-combined-ca-bundle\") pod \"horizon-76d88967b8-wmzcw\" (UID: \"b35f87c4-e535-4901-8814-0b321b201158\") " pod="openstack/horizon-76d88967b8-wmzcw" Feb 26 16:04:07 crc kubenswrapper[4907]: I0226 16:04:07.789354 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-c58ph\" (UniqueName: \"kubernetes.io/projected/b35f87c4-e535-4901-8814-0b321b201158-kube-api-access-c58ph\") pod \"horizon-76d88967b8-wmzcw\" (UID: \"b35f87c4-e535-4901-8814-0b321b201158\") " pod="openstack/horizon-76d88967b8-wmzcw" Feb 26 16:04:07 crc kubenswrapper[4907]: I0226 16:04:07.789386 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/b35f87c4-e535-4901-8814-0b321b201158-logs\") pod \"horizon-76d88967b8-wmzcw\" (UID: \"b35f87c4-e535-4901-8814-0b321b201158\") " pod="openstack/horizon-76d88967b8-wmzcw" Feb 26 16:04:07 crc kubenswrapper[4907]: I0226 16:04:07.789409 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"horizon-tls-certs\" (UniqueName: \"kubernetes.io/secret/b35f87c4-e535-4901-8814-0b321b201158-horizon-tls-certs\") pod \"horizon-76d88967b8-wmzcw\" (UID: 
\"b35f87c4-e535-4901-8814-0b321b201158\") " pod="openstack/horizon-76d88967b8-wmzcw" Feb 26 16:04:07 crc kubenswrapper[4907]: I0226 16:04:07.789437 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/b35f87c4-e535-4901-8814-0b321b201158-scripts\") pod \"horizon-76d88967b8-wmzcw\" (UID: \"b35f87c4-e535-4901-8814-0b321b201158\") " pod="openstack/horizon-76d88967b8-wmzcw" Feb 26 16:04:07 crc kubenswrapper[4907]: I0226 16:04:07.789478 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/b35f87c4-e535-4901-8814-0b321b201158-config-data\") pod \"horizon-76d88967b8-wmzcw\" (UID: \"b35f87c4-e535-4901-8814-0b321b201158\") " pod="openstack/horizon-76d88967b8-wmzcw" Feb 26 16:04:07 crc kubenswrapper[4907]: I0226 16:04:07.899479 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/b35f87c4-e535-4901-8814-0b321b201158-config-data\") pod \"horizon-76d88967b8-wmzcw\" (UID: \"b35f87c4-e535-4901-8814-0b321b201158\") " pod="openstack/horizon-76d88967b8-wmzcw" Feb 26 16:04:07 crc kubenswrapper[4907]: I0226 16:04:07.899608 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/b35f87c4-e535-4901-8814-0b321b201158-horizon-secret-key\") pod \"horizon-76d88967b8-wmzcw\" (UID: \"b35f87c4-e535-4901-8814-0b321b201158\") " pod="openstack/horizon-76d88967b8-wmzcw" Feb 26 16:04:07 crc kubenswrapper[4907]: I0226 16:04:07.899654 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b35f87c4-e535-4901-8814-0b321b201158-combined-ca-bundle\") pod \"horizon-76d88967b8-wmzcw\" (UID: \"b35f87c4-e535-4901-8814-0b321b201158\") " 
pod="openstack/horizon-76d88967b8-wmzcw" Feb 26 16:04:07 crc kubenswrapper[4907]: I0226 16:04:07.899697 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-c58ph\" (UniqueName: \"kubernetes.io/projected/b35f87c4-e535-4901-8814-0b321b201158-kube-api-access-c58ph\") pod \"horizon-76d88967b8-wmzcw\" (UID: \"b35f87c4-e535-4901-8814-0b321b201158\") " pod="openstack/horizon-76d88967b8-wmzcw" Feb 26 16:04:07 crc kubenswrapper[4907]: I0226 16:04:07.899749 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/b35f87c4-e535-4901-8814-0b321b201158-logs\") pod \"horizon-76d88967b8-wmzcw\" (UID: \"b35f87c4-e535-4901-8814-0b321b201158\") " pod="openstack/horizon-76d88967b8-wmzcw" Feb 26 16:04:07 crc kubenswrapper[4907]: I0226 16:04:07.899789 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"horizon-tls-certs\" (UniqueName: \"kubernetes.io/secret/b35f87c4-e535-4901-8814-0b321b201158-horizon-tls-certs\") pod \"horizon-76d88967b8-wmzcw\" (UID: \"b35f87c4-e535-4901-8814-0b321b201158\") " pod="openstack/horizon-76d88967b8-wmzcw" Feb 26 16:04:07 crc kubenswrapper[4907]: I0226 16:04:07.899826 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/b35f87c4-e535-4901-8814-0b321b201158-scripts\") pod \"horizon-76d88967b8-wmzcw\" (UID: \"b35f87c4-e535-4901-8814-0b321b201158\") " pod="openstack/horizon-76d88967b8-wmzcw" Feb 26 16:04:07 crc kubenswrapper[4907]: I0226 16:04:07.900681 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/b35f87c4-e535-4901-8814-0b321b201158-scripts\") pod \"horizon-76d88967b8-wmzcw\" (UID: \"b35f87c4-e535-4901-8814-0b321b201158\") " pod="openstack/horizon-76d88967b8-wmzcw" Feb 26 16:04:07 crc kubenswrapper[4907]: I0226 16:04:07.905956 4907 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/b35f87c4-e535-4901-8814-0b321b201158-logs\") pod \"horizon-76d88967b8-wmzcw\" (UID: \"b35f87c4-e535-4901-8814-0b321b201158\") " pod="openstack/horizon-76d88967b8-wmzcw" Feb 26 16:04:07 crc kubenswrapper[4907]: I0226 16:04:07.905650 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/b35f87c4-e535-4901-8814-0b321b201158-config-data\") pod \"horizon-76d88967b8-wmzcw\" (UID: \"b35f87c4-e535-4901-8814-0b321b201158\") " pod="openstack/horizon-76d88967b8-wmzcw" Feb 26 16:04:07 crc kubenswrapper[4907]: I0226 16:04:07.919053 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/b35f87c4-e535-4901-8814-0b321b201158-horizon-secret-key\") pod \"horizon-76d88967b8-wmzcw\" (UID: \"b35f87c4-e535-4901-8814-0b321b201158\") " pod="openstack/horizon-76d88967b8-wmzcw" Feb 26 16:04:07 crc kubenswrapper[4907]: I0226 16:04:07.956514 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-c58ph\" (UniqueName: \"kubernetes.io/projected/b35f87c4-e535-4901-8814-0b321b201158-kube-api-access-c58ph\") pod \"horizon-76d88967b8-wmzcw\" (UID: \"b35f87c4-e535-4901-8814-0b321b201158\") " pod="openstack/horizon-76d88967b8-wmzcw" Feb 26 16:04:07 crc kubenswrapper[4907]: I0226 16:04:07.957278 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b35f87c4-e535-4901-8814-0b321b201158-combined-ca-bundle\") pod \"horizon-76d88967b8-wmzcw\" (UID: \"b35f87c4-e535-4901-8814-0b321b201158\") " pod="openstack/horizon-76d88967b8-wmzcw" Feb 26 16:04:07 crc kubenswrapper[4907]: I0226 16:04:07.963334 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"horizon-tls-certs\" (UniqueName: 
\"kubernetes.io/secret/b35f87c4-e535-4901-8814-0b321b201158-horizon-tls-certs\") pod \"horizon-76d88967b8-wmzcw\" (UID: \"b35f87c4-e535-4901-8814-0b321b201158\") " pod="openstack/horizon-76d88967b8-wmzcw" Feb 26 16:04:08 crc kubenswrapper[4907]: I0226 16:04:08.153526 4907 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/horizon-76d88967b8-wmzcw" Feb 26 16:04:09 crc kubenswrapper[4907]: I0226 16:04:09.464737 4907 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-56df8fb6b7-ssd6q" Feb 26 16:04:09 crc kubenswrapper[4907]: I0226 16:04:09.551498 4907 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-b8fbc5445-xx2fj"] Feb 26 16:04:09 crc kubenswrapper[4907]: I0226 16:04:09.558437 4907 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-b8fbc5445-xx2fj" podUID="62d9c258-3e92-48cc-a4b2-7207c93a6346" containerName="dnsmasq-dns" containerID="cri-o://c7b7197fc16c5531ccf4f45093fd0c8f8d3d99749cd680b025e8890044887cce" gracePeriod=10 Feb 26 16:04:10 crc kubenswrapper[4907]: I0226 16:04:10.436539 4907 generic.go:334] "Generic (PLEG): container finished" podID="62d9c258-3e92-48cc-a4b2-7207c93a6346" containerID="c7b7197fc16c5531ccf4f45093fd0c8f8d3d99749cd680b025e8890044887cce" exitCode=0 Feb 26 16:04:10 crc kubenswrapper[4907]: I0226 16:04:10.436637 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-b8fbc5445-xx2fj" event={"ID":"62d9c258-3e92-48cc-a4b2-7207c93a6346","Type":"ContainerDied","Data":"c7b7197fc16c5531ccf4f45093fd0c8f8d3d99749cd680b025e8890044887cce"} Feb 26 16:04:11 crc kubenswrapper[4907]: I0226 16:04:11.459648 4907 generic.go:334] "Generic (PLEG): container finished" podID="b3fd641f-23d3-4d70-af64-66c3507eff49" containerID="30b2bb90b711626ce57caa8880e3ecc1df500c89c700220a73f326eac4fdd679" exitCode=0 Feb 26 16:04:11 crc kubenswrapper[4907]: I0226 16:04:11.459689 4907 kubelet.go:2453] 
"SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-xvxcj" event={"ID":"b3fd641f-23d3-4d70-af64-66c3507eff49","Type":"ContainerDied","Data":"30b2bb90b711626ce57caa8880e3ecc1df500c89c700220a73f326eac4fdd679"} Feb 26 16:04:13 crc kubenswrapper[4907]: I0226 16:04:13.665163 4907 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/dnsmasq-dns-b8fbc5445-xx2fj" podUID="62d9c258-3e92-48cc-a4b2-7207c93a6346" containerName="dnsmasq-dns" probeResult="failure" output="dial tcp 10.217.0.115:5353: connect: connection refused" Feb 26 16:04:18 crc kubenswrapper[4907]: I0226 16:04:18.349040 4907 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-bootstrap-xvxcj" Feb 26 16:04:18 crc kubenswrapper[4907]: I0226 16:04:18.435204 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/b3fd641f-23d3-4d70-af64-66c3507eff49-fernet-keys\") pod \"b3fd641f-23d3-4d70-af64-66c3507eff49\" (UID: \"b3fd641f-23d3-4d70-af64-66c3507eff49\") " Feb 26 16:04:18 crc kubenswrapper[4907]: I0226 16:04:18.435282 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b3fd641f-23d3-4d70-af64-66c3507eff49-config-data\") pod \"b3fd641f-23d3-4d70-af64-66c3507eff49\" (UID: \"b3fd641f-23d3-4d70-af64-66c3507eff49\") " Feb 26 16:04:18 crc kubenswrapper[4907]: I0226 16:04:18.435357 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b3fd641f-23d3-4d70-af64-66c3507eff49-combined-ca-bundle\") pod \"b3fd641f-23d3-4d70-af64-66c3507eff49\" (UID: \"b3fd641f-23d3-4d70-af64-66c3507eff49\") " Feb 26 16:04:18 crc kubenswrapper[4907]: I0226 16:04:18.435411 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"credential-keys\" (UniqueName: 
\"kubernetes.io/secret/b3fd641f-23d3-4d70-af64-66c3507eff49-credential-keys\") pod \"b3fd641f-23d3-4d70-af64-66c3507eff49\" (UID: \"b3fd641f-23d3-4d70-af64-66c3507eff49\") " Feb 26 16:04:18 crc kubenswrapper[4907]: I0226 16:04:18.435433 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/b3fd641f-23d3-4d70-af64-66c3507eff49-scripts\") pod \"b3fd641f-23d3-4d70-af64-66c3507eff49\" (UID: \"b3fd641f-23d3-4d70-af64-66c3507eff49\") " Feb 26 16:04:18 crc kubenswrapper[4907]: I0226 16:04:18.435465 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-kt224\" (UniqueName: \"kubernetes.io/projected/b3fd641f-23d3-4d70-af64-66c3507eff49-kube-api-access-kt224\") pod \"b3fd641f-23d3-4d70-af64-66c3507eff49\" (UID: \"b3fd641f-23d3-4d70-af64-66c3507eff49\") " Feb 26 16:04:18 crc kubenswrapper[4907]: I0226 16:04:18.443101 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b3fd641f-23d3-4d70-af64-66c3507eff49-scripts" (OuterVolumeSpecName: "scripts") pod "b3fd641f-23d3-4d70-af64-66c3507eff49" (UID: "b3fd641f-23d3-4d70-af64-66c3507eff49"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 26 16:04:18 crc kubenswrapper[4907]: I0226 16:04:18.443283 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b3fd641f-23d3-4d70-af64-66c3507eff49-credential-keys" (OuterVolumeSpecName: "credential-keys") pod "b3fd641f-23d3-4d70-af64-66c3507eff49" (UID: "b3fd641f-23d3-4d70-af64-66c3507eff49"). InnerVolumeSpecName "credential-keys". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 26 16:04:18 crc kubenswrapper[4907]: I0226 16:04:18.447143 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b3fd641f-23d3-4d70-af64-66c3507eff49-kube-api-access-kt224" (OuterVolumeSpecName: "kube-api-access-kt224") pod "b3fd641f-23d3-4d70-af64-66c3507eff49" (UID: "b3fd641f-23d3-4d70-af64-66c3507eff49"). InnerVolumeSpecName "kube-api-access-kt224". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 26 16:04:18 crc kubenswrapper[4907]: I0226 16:04:18.449234 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b3fd641f-23d3-4d70-af64-66c3507eff49-fernet-keys" (OuterVolumeSpecName: "fernet-keys") pod "b3fd641f-23d3-4d70-af64-66c3507eff49" (UID: "b3fd641f-23d3-4d70-af64-66c3507eff49"). InnerVolumeSpecName "fernet-keys". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 26 16:04:18 crc kubenswrapper[4907]: I0226 16:04:18.461326 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b3fd641f-23d3-4d70-af64-66c3507eff49-config-data" (OuterVolumeSpecName: "config-data") pod "b3fd641f-23d3-4d70-af64-66c3507eff49" (UID: "b3fd641f-23d3-4d70-af64-66c3507eff49"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 26 16:04:18 crc kubenswrapper[4907]: I0226 16:04:18.523681 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-xvxcj" event={"ID":"b3fd641f-23d3-4d70-af64-66c3507eff49","Type":"ContainerDied","Data":"84274f44171aedec815da71a440daf133b170f3f06ef92c7ed15048c2be812b3"} Feb 26 16:04:18 crc kubenswrapper[4907]: I0226 16:04:18.523722 4907 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="84274f44171aedec815da71a440daf133b170f3f06ef92c7ed15048c2be812b3" Feb 26 16:04:18 crc kubenswrapper[4907]: I0226 16:04:18.523817 4907 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-bootstrap-xvxcj" Feb 26 16:04:18 crc kubenswrapper[4907]: I0226 16:04:18.537256 4907 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-kt224\" (UniqueName: \"kubernetes.io/projected/b3fd641f-23d3-4d70-af64-66c3507eff49-kube-api-access-kt224\") on node \"crc\" DevicePath \"\"" Feb 26 16:04:18 crc kubenswrapper[4907]: I0226 16:04:18.537302 4907 reconciler_common.go:293] "Volume detached for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/b3fd641f-23d3-4d70-af64-66c3507eff49-fernet-keys\") on node \"crc\" DevicePath \"\"" Feb 26 16:04:18 crc kubenswrapper[4907]: I0226 16:04:18.537312 4907 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b3fd641f-23d3-4d70-af64-66c3507eff49-config-data\") on node \"crc\" DevicePath \"\"" Feb 26 16:04:18 crc kubenswrapper[4907]: I0226 16:04:18.537320 4907 reconciler_common.go:293] "Volume detached for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/b3fd641f-23d3-4d70-af64-66c3507eff49-credential-keys\") on node \"crc\" DevicePath \"\"" Feb 26 16:04:18 crc kubenswrapper[4907]: I0226 16:04:18.537328 4907 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: 
\"kubernetes.io/secret/b3fd641f-23d3-4d70-af64-66c3507eff49-scripts\") on node \"crc\" DevicePath \"\"" Feb 26 16:04:18 crc kubenswrapper[4907]: I0226 16:04:18.579511 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b3fd641f-23d3-4d70-af64-66c3507eff49-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "b3fd641f-23d3-4d70-af64-66c3507eff49" (UID: "b3fd641f-23d3-4d70-af64-66c3507eff49"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 26 16:04:18 crc kubenswrapper[4907]: I0226 16:04:18.639107 4907 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b3fd641f-23d3-4d70-af64-66c3507eff49-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 26 16:04:18 crc kubenswrapper[4907]: I0226 16:04:18.664406 4907 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/dnsmasq-dns-b8fbc5445-xx2fj" podUID="62d9c258-3e92-48cc-a4b2-7207c93a6346" containerName="dnsmasq-dns" probeResult="failure" output="dial tcp 10.217.0.115:5353: connect: connection refused" Feb 26 16:04:19 crc kubenswrapper[4907]: I0226 16:04:19.454945 4907 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/keystone-bootstrap-xvxcj"] Feb 26 16:04:19 crc kubenswrapper[4907]: I0226 16:04:19.461915 4907 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/keystone-bootstrap-xvxcj"] Feb 26 16:04:19 crc kubenswrapper[4907]: I0226 16:04:19.549771 4907 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-bootstrap-dwb5n"] Feb 26 16:04:19 crc kubenswrapper[4907]: E0226 16:04:19.550169 4907 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b3fd641f-23d3-4d70-af64-66c3507eff49" containerName="keystone-bootstrap" Feb 26 16:04:19 crc kubenswrapper[4907]: I0226 16:04:19.550188 4907 state_mem.go:107] "Deleted CPUSet assignment" podUID="b3fd641f-23d3-4d70-af64-66c3507eff49" 
containerName="keystone-bootstrap" Feb 26 16:04:19 crc kubenswrapper[4907]: I0226 16:04:19.550413 4907 memory_manager.go:354] "RemoveStaleState removing state" podUID="b3fd641f-23d3-4d70-af64-66c3507eff49" containerName="keystone-bootstrap" Feb 26 16:04:19 crc kubenswrapper[4907]: I0226 16:04:19.551073 4907 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-bootstrap-dwb5n" Feb 26 16:04:19 crc kubenswrapper[4907]: I0226 16:04:19.553010 4907 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"osp-secret" Feb 26 16:04:19 crc kubenswrapper[4907]: I0226 16:04:19.553825 4907 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-keystone-dockercfg-vv59s" Feb 26 16:04:19 crc kubenswrapper[4907]: I0226 16:04:19.554022 4907 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-scripts" Feb 26 16:04:19 crc kubenswrapper[4907]: I0226 16:04:19.554391 4907 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-config-data" Feb 26 16:04:19 crc kubenswrapper[4907]: I0226 16:04:19.554570 4907 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone" Feb 26 16:04:19 crc kubenswrapper[4907]: I0226 16:04:19.563536 4907 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-bootstrap-dwb5n"] Feb 26 16:04:19 crc kubenswrapper[4907]: I0226 16:04:19.658991 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e0a55626-b305-4e22-aec1-24832bec9a9f-config-data\") pod \"keystone-bootstrap-dwb5n\" (UID: \"e0a55626-b305-4e22-aec1-24832bec9a9f\") " pod="openstack/keystone-bootstrap-dwb5n" Feb 26 16:04:19 crc kubenswrapper[4907]: I0226 16:04:19.659340 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/e0a55626-b305-4e22-aec1-24832bec9a9f-combined-ca-bundle\") pod \"keystone-bootstrap-dwb5n\" (UID: \"e0a55626-b305-4e22-aec1-24832bec9a9f\") " pod="openstack/keystone-bootstrap-dwb5n" Feb 26 16:04:19 crc kubenswrapper[4907]: I0226 16:04:19.659415 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cg2n2\" (UniqueName: \"kubernetes.io/projected/e0a55626-b305-4e22-aec1-24832bec9a9f-kube-api-access-cg2n2\") pod \"keystone-bootstrap-dwb5n\" (UID: \"e0a55626-b305-4e22-aec1-24832bec9a9f\") " pod="openstack/keystone-bootstrap-dwb5n" Feb 26 16:04:19 crc kubenswrapper[4907]: I0226 16:04:19.659454 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e0a55626-b305-4e22-aec1-24832bec9a9f-scripts\") pod \"keystone-bootstrap-dwb5n\" (UID: \"e0a55626-b305-4e22-aec1-24832bec9a9f\") " pod="openstack/keystone-bootstrap-dwb5n" Feb 26 16:04:19 crc kubenswrapper[4907]: I0226 16:04:19.659514 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/e0a55626-b305-4e22-aec1-24832bec9a9f-credential-keys\") pod \"keystone-bootstrap-dwb5n\" (UID: \"e0a55626-b305-4e22-aec1-24832bec9a9f\") " pod="openstack/keystone-bootstrap-dwb5n" Feb 26 16:04:19 crc kubenswrapper[4907]: I0226 16:04:19.659551 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/e0a55626-b305-4e22-aec1-24832bec9a9f-fernet-keys\") pod \"keystone-bootstrap-dwb5n\" (UID: \"e0a55626-b305-4e22-aec1-24832bec9a9f\") " pod="openstack/keystone-bootstrap-dwb5n" Feb 26 16:04:19 crc kubenswrapper[4907]: I0226 16:04:19.761351 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"credential-keys\" (UniqueName: 
\"kubernetes.io/secret/e0a55626-b305-4e22-aec1-24832bec9a9f-credential-keys\") pod \"keystone-bootstrap-dwb5n\" (UID: \"e0a55626-b305-4e22-aec1-24832bec9a9f\") " pod="openstack/keystone-bootstrap-dwb5n" Feb 26 16:04:19 crc kubenswrapper[4907]: I0226 16:04:19.761401 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/e0a55626-b305-4e22-aec1-24832bec9a9f-fernet-keys\") pod \"keystone-bootstrap-dwb5n\" (UID: \"e0a55626-b305-4e22-aec1-24832bec9a9f\") " pod="openstack/keystone-bootstrap-dwb5n" Feb 26 16:04:19 crc kubenswrapper[4907]: I0226 16:04:19.761464 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e0a55626-b305-4e22-aec1-24832bec9a9f-config-data\") pod \"keystone-bootstrap-dwb5n\" (UID: \"e0a55626-b305-4e22-aec1-24832bec9a9f\") " pod="openstack/keystone-bootstrap-dwb5n" Feb 26 16:04:19 crc kubenswrapper[4907]: I0226 16:04:19.761503 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e0a55626-b305-4e22-aec1-24832bec9a9f-combined-ca-bundle\") pod \"keystone-bootstrap-dwb5n\" (UID: \"e0a55626-b305-4e22-aec1-24832bec9a9f\") " pod="openstack/keystone-bootstrap-dwb5n" Feb 26 16:04:19 crc kubenswrapper[4907]: I0226 16:04:19.761563 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cg2n2\" (UniqueName: \"kubernetes.io/projected/e0a55626-b305-4e22-aec1-24832bec9a9f-kube-api-access-cg2n2\") pod \"keystone-bootstrap-dwb5n\" (UID: \"e0a55626-b305-4e22-aec1-24832bec9a9f\") " pod="openstack/keystone-bootstrap-dwb5n" Feb 26 16:04:19 crc kubenswrapper[4907]: I0226 16:04:19.761619 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e0a55626-b305-4e22-aec1-24832bec9a9f-scripts\") pod 
\"keystone-bootstrap-dwb5n\" (UID: \"e0a55626-b305-4e22-aec1-24832bec9a9f\") " pod="openstack/keystone-bootstrap-dwb5n" Feb 26 16:04:19 crc kubenswrapper[4907]: I0226 16:04:19.771463 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e0a55626-b305-4e22-aec1-24832bec9a9f-config-data\") pod \"keystone-bootstrap-dwb5n\" (UID: \"e0a55626-b305-4e22-aec1-24832bec9a9f\") " pod="openstack/keystone-bootstrap-dwb5n" Feb 26 16:04:19 crc kubenswrapper[4907]: I0226 16:04:19.772087 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e0a55626-b305-4e22-aec1-24832bec9a9f-scripts\") pod \"keystone-bootstrap-dwb5n\" (UID: \"e0a55626-b305-4e22-aec1-24832bec9a9f\") " pod="openstack/keystone-bootstrap-dwb5n" Feb 26 16:04:19 crc kubenswrapper[4907]: I0226 16:04:19.955386 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/e0a55626-b305-4e22-aec1-24832bec9a9f-credential-keys\") pod \"keystone-bootstrap-dwb5n\" (UID: \"e0a55626-b305-4e22-aec1-24832bec9a9f\") " pod="openstack/keystone-bootstrap-dwb5n" Feb 26 16:04:19 crc kubenswrapper[4907]: I0226 16:04:19.955401 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e0a55626-b305-4e22-aec1-24832bec9a9f-combined-ca-bundle\") pod \"keystone-bootstrap-dwb5n\" (UID: \"e0a55626-b305-4e22-aec1-24832bec9a9f\") " pod="openstack/keystone-bootstrap-dwb5n" Feb 26 16:04:19 crc kubenswrapper[4907]: I0226 16:04:19.955981 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/e0a55626-b305-4e22-aec1-24832bec9a9f-fernet-keys\") pod \"keystone-bootstrap-dwb5n\" (UID: \"e0a55626-b305-4e22-aec1-24832bec9a9f\") " pod="openstack/keystone-bootstrap-dwb5n" Feb 26 16:04:19 crc kubenswrapper[4907]: 
I0226 16:04:19.996039 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cg2n2\" (UniqueName: \"kubernetes.io/projected/e0a55626-b305-4e22-aec1-24832bec9a9f-kube-api-access-cg2n2\") pod \"keystone-bootstrap-dwb5n\" (UID: \"e0a55626-b305-4e22-aec1-24832bec9a9f\") " pod="openstack/keystone-bootstrap-dwb5n" Feb 26 16:04:20 crc kubenswrapper[4907]: I0226 16:04:20.143949 4907 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b3fd641f-23d3-4d70-af64-66c3507eff49" path="/var/lib/kubelet/pods/b3fd641f-23d3-4d70-af64-66c3507eff49/volumes" Feb 26 16:04:20 crc kubenswrapper[4907]: I0226 16:04:20.187497 4907 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-bootstrap-dwb5n" Feb 26 16:04:22 crc kubenswrapper[4907]: I0226 16:04:22.524533 4907 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-internal-api-0" Feb 26 16:04:22 crc kubenswrapper[4907]: I0226 16:04:22.572623 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"e77cb0a0-f383-4bb5-b29e-e000a56a7a1f","Type":"ContainerDied","Data":"d0f0be26976f6f288e2853be7e4244ef66b7979305a75f5e0b36739868b17bea"} Feb 26 16:04:22 crc kubenswrapper[4907]: I0226 16:04:22.572873 4907 scope.go:117] "RemoveContainer" containerID="a7aa80c5dd949f99eb0f35cfc3f25c43599eeac522fc68406f7eda1a501c7bcf" Feb 26 16:04:22 crc kubenswrapper[4907]: I0226 16:04:22.572986 4907 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-internal-api-0" Feb 26 16:04:22 crc kubenswrapper[4907]: I0226 16:04:22.613134 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/e77cb0a0-f383-4bb5-b29e-e000a56a7a1f-internal-tls-certs\") pod \"e77cb0a0-f383-4bb5-b29e-e000a56a7a1f\" (UID: \"e77cb0a0-f383-4bb5-b29e-e000a56a7a1f\") " Feb 26 16:04:22 crc kubenswrapper[4907]: I0226 16:04:22.613228 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e77cb0a0-f383-4bb5-b29e-e000a56a7a1f-config-data\") pod \"e77cb0a0-f383-4bb5-b29e-e000a56a7a1f\" (UID: \"e77cb0a0-f383-4bb5-b29e-e000a56a7a1f\") " Feb 26 16:04:22 crc kubenswrapper[4907]: I0226 16:04:22.613376 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e77cb0a0-f383-4bb5-b29e-e000a56a7a1f-scripts\") pod \"e77cb0a0-f383-4bb5-b29e-e000a56a7a1f\" (UID: \"e77cb0a0-f383-4bb5-b29e-e000a56a7a1f\") " Feb 26 16:04:22 crc kubenswrapper[4907]: I0226 16:04:22.613424 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/e77cb0a0-f383-4bb5-b29e-e000a56a7a1f-httpd-run\") pod \"e77cb0a0-f383-4bb5-b29e-e000a56a7a1f\" (UID: \"e77cb0a0-f383-4bb5-b29e-e000a56a7a1f\") " Feb 26 16:04:22 crc kubenswrapper[4907]: I0226 16:04:22.613464 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/e77cb0a0-f383-4bb5-b29e-e000a56a7a1f-logs\") pod \"e77cb0a0-f383-4bb5-b29e-e000a56a7a1f\" (UID: \"e77cb0a0-f383-4bb5-b29e-e000a56a7a1f\") " Feb 26 16:04:22 crc kubenswrapper[4907]: I0226 16:04:22.613494 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"glance\" (UniqueName: 
\"kubernetes.io/local-volume/local-storage07-crc\") pod \"e77cb0a0-f383-4bb5-b29e-e000a56a7a1f\" (UID: \"e77cb0a0-f383-4bb5-b29e-e000a56a7a1f\") " Feb 26 16:04:22 crc kubenswrapper[4907]: I0226 16:04:22.613517 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lzmcs\" (UniqueName: \"kubernetes.io/projected/e77cb0a0-f383-4bb5-b29e-e000a56a7a1f-kube-api-access-lzmcs\") pod \"e77cb0a0-f383-4bb5-b29e-e000a56a7a1f\" (UID: \"e77cb0a0-f383-4bb5-b29e-e000a56a7a1f\") " Feb 26 16:04:22 crc kubenswrapper[4907]: I0226 16:04:22.613644 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e77cb0a0-f383-4bb5-b29e-e000a56a7a1f-combined-ca-bundle\") pod \"e77cb0a0-f383-4bb5-b29e-e000a56a7a1f\" (UID: \"e77cb0a0-f383-4bb5-b29e-e000a56a7a1f\") " Feb 26 16:04:22 crc kubenswrapper[4907]: I0226 16:04:22.614057 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e77cb0a0-f383-4bb5-b29e-e000a56a7a1f-httpd-run" (OuterVolumeSpecName: "httpd-run") pod "e77cb0a0-f383-4bb5-b29e-e000a56a7a1f" (UID: "e77cb0a0-f383-4bb5-b29e-e000a56a7a1f"). InnerVolumeSpecName "httpd-run". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 26 16:04:22 crc kubenswrapper[4907]: I0226 16:04:22.614293 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e77cb0a0-f383-4bb5-b29e-e000a56a7a1f-logs" (OuterVolumeSpecName: "logs") pod "e77cb0a0-f383-4bb5-b29e-e000a56a7a1f" (UID: "e77cb0a0-f383-4bb5-b29e-e000a56a7a1f"). InnerVolumeSpecName "logs". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 26 16:04:22 crc kubenswrapper[4907]: I0226 16:04:22.619952 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage07-crc" (OuterVolumeSpecName: "glance") pod "e77cb0a0-f383-4bb5-b29e-e000a56a7a1f" (UID: "e77cb0a0-f383-4bb5-b29e-e000a56a7a1f"). InnerVolumeSpecName "local-storage07-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue "" Feb 26 16:04:22 crc kubenswrapper[4907]: I0226 16:04:22.620875 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e77cb0a0-f383-4bb5-b29e-e000a56a7a1f-scripts" (OuterVolumeSpecName: "scripts") pod "e77cb0a0-f383-4bb5-b29e-e000a56a7a1f" (UID: "e77cb0a0-f383-4bb5-b29e-e000a56a7a1f"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 26 16:04:22 crc kubenswrapper[4907]: I0226 16:04:22.637714 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e77cb0a0-f383-4bb5-b29e-e000a56a7a1f-kube-api-access-lzmcs" (OuterVolumeSpecName: "kube-api-access-lzmcs") pod "e77cb0a0-f383-4bb5-b29e-e000a56a7a1f" (UID: "e77cb0a0-f383-4bb5-b29e-e000a56a7a1f"). InnerVolumeSpecName "kube-api-access-lzmcs". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 26 16:04:22 crc kubenswrapper[4907]: I0226 16:04:22.648522 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e77cb0a0-f383-4bb5-b29e-e000a56a7a1f-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "e77cb0a0-f383-4bb5-b29e-e000a56a7a1f" (UID: "e77cb0a0-f383-4bb5-b29e-e000a56a7a1f"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 26 16:04:22 crc kubenswrapper[4907]: I0226 16:04:22.664967 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e77cb0a0-f383-4bb5-b29e-e000a56a7a1f-config-data" (OuterVolumeSpecName: "config-data") pod "e77cb0a0-f383-4bb5-b29e-e000a56a7a1f" (UID: "e77cb0a0-f383-4bb5-b29e-e000a56a7a1f"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 26 16:04:22 crc kubenswrapper[4907]: I0226 16:04:22.666161 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e77cb0a0-f383-4bb5-b29e-e000a56a7a1f-internal-tls-certs" (OuterVolumeSpecName: "internal-tls-certs") pod "e77cb0a0-f383-4bb5-b29e-e000a56a7a1f" (UID: "e77cb0a0-f383-4bb5-b29e-e000a56a7a1f"). InnerVolumeSpecName "internal-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 26 16:04:22 crc kubenswrapper[4907]: I0226 16:04:22.725012 4907 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e77cb0a0-f383-4bb5-b29e-e000a56a7a1f-scripts\") on node \"crc\" DevicePath \"\"" Feb 26 16:04:22 crc kubenswrapper[4907]: I0226 16:04:22.725060 4907 reconciler_common.go:293] "Volume detached for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/e77cb0a0-f383-4bb5-b29e-e000a56a7a1f-httpd-run\") on node \"crc\" DevicePath \"\"" Feb 26 16:04:22 crc kubenswrapper[4907]: I0226 16:04:22.725074 4907 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/e77cb0a0-f383-4bb5-b29e-e000a56a7a1f-logs\") on node \"crc\" DevicePath \"\"" Feb 26 16:04:22 crc kubenswrapper[4907]: I0226 16:04:22.725114 4907 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage07-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage07-crc\") on node \"crc\" " Feb 26 16:04:22 crc kubenswrapper[4907]: I0226 
16:04:22.725128 4907 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-lzmcs\" (UniqueName: \"kubernetes.io/projected/e77cb0a0-f383-4bb5-b29e-e000a56a7a1f-kube-api-access-lzmcs\") on node \"crc\" DevicePath \"\"" Feb 26 16:04:22 crc kubenswrapper[4907]: I0226 16:04:22.725144 4907 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e77cb0a0-f383-4bb5-b29e-e000a56a7a1f-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 26 16:04:22 crc kubenswrapper[4907]: I0226 16:04:22.725155 4907 reconciler_common.go:293] "Volume detached for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/e77cb0a0-f383-4bb5-b29e-e000a56a7a1f-internal-tls-certs\") on node \"crc\" DevicePath \"\"" Feb 26 16:04:22 crc kubenswrapper[4907]: I0226 16:04:22.725168 4907 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e77cb0a0-f383-4bb5-b29e-e000a56a7a1f-config-data\") on node \"crc\" DevicePath \"\"" Feb 26 16:04:22 crc kubenswrapper[4907]: I0226 16:04:22.743021 4907 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage07-crc" (UniqueName: "kubernetes.io/local-volume/local-storage07-crc") on node "crc" Feb 26 16:04:22 crc kubenswrapper[4907]: E0226 16:04:22.825767 4907 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/podified-antelope-centos9/openstack-horizon:current-podified" Feb 26 16:04:22 crc kubenswrapper[4907]: E0226 16:04:22.825955 4907 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:horizon-log,Image:quay.io/podified-antelope-centos9/openstack-horizon:current-podified,Command:[/bin/bash],Args:[-c tail -n+1 -F 
/var/log/horizon/horizon.log],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:CONFIG_HASH,Value:nb4h656h8hcch587h5cdh8h68h664h55bhf4h7bh546h5d4h56ch4h97h66h64fh79h5cfh7dh58fh6ch66fh6h57bh594h54ch9ch5ch5fcq,ValueFrom:nil,},EnvVar{Name:ENABLE_DESIGNATE,Value:yes,ValueFrom:nil,},EnvVar{Name:ENABLE_HEAT,Value:yes,ValueFrom:nil,},EnvVar{Name:ENABLE_IRONIC,Value:yes,ValueFrom:nil,},EnvVar{Name:ENABLE_MANILA,Value:yes,ValueFrom:nil,},EnvVar{Name:ENABLE_OCTAVIA,Value:yes,ValueFrom:nil,},EnvVar{Name:ENABLE_WATCHER,Value:no,ValueFrom:nil,},EnvVar{Name:KOLLA_CONFIG_STRATEGY,Value:COPY_ALWAYS,ValueFrom:nil,},EnvVar{Name:UNPACK_THEME,Value:true,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:logs,ReadOnly:false,MountPath:/var/log/horizon,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-pgdsk,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*48,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*true,RunAsGroup:*42400,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod horizon-594d447db9-7p2nh_openstack(b591cc9e-aa47-48dc-9462-a54cd3bbbaa8): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Feb 26 16:04:22 crc kubenswrapper[4907]: I0226 16:04:22.828534 
4907 reconciler_common.go:293] "Volume detached for volume \"local-storage07-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage07-crc\") on node \"crc\" DevicePath \"\"" Feb 26 16:04:22 crc kubenswrapper[4907]: E0226 16:04:22.828934 4907 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"horizon-log\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\", failed to \"StartContainer\" for \"horizon\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/podified-antelope-centos9/openstack-horizon:current-podified\\\"\"]" pod="openstack/horizon-594d447db9-7p2nh" podUID="b591cc9e-aa47-48dc-9462-a54cd3bbbaa8" Feb 26 16:04:22 crc kubenswrapper[4907]: I0226 16:04:22.913133 4907 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-internal-api-0"] Feb 26 16:04:22 crc kubenswrapper[4907]: I0226 16:04:22.923547 4907 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-default-internal-api-0"] Feb 26 16:04:22 crc kubenswrapper[4907]: I0226 16:04:22.931019 4907 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-default-internal-api-0"] Feb 26 16:04:22 crc kubenswrapper[4907]: E0226 16:04:22.931383 4907 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e77cb0a0-f383-4bb5-b29e-e000a56a7a1f" containerName="glance-httpd" Feb 26 16:04:22 crc kubenswrapper[4907]: I0226 16:04:22.931397 4907 state_mem.go:107] "Deleted CPUSet assignment" podUID="e77cb0a0-f383-4bb5-b29e-e000a56a7a1f" containerName="glance-httpd" Feb 26 16:04:22 crc kubenswrapper[4907]: E0226 16:04:22.931413 4907 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e77cb0a0-f383-4bb5-b29e-e000a56a7a1f" containerName="glance-log" Feb 26 16:04:22 crc kubenswrapper[4907]: I0226 16:04:22.931418 4907 state_mem.go:107] "Deleted CPUSet assignment" podUID="e77cb0a0-f383-4bb5-b29e-e000a56a7a1f" containerName="glance-log" Feb 26 16:04:22 crc 
kubenswrapper[4907]: I0226 16:04:22.931571 4907 memory_manager.go:354] "RemoveStaleState removing state" podUID="e77cb0a0-f383-4bb5-b29e-e000a56a7a1f" containerName="glance-log" Feb 26 16:04:22 crc kubenswrapper[4907]: I0226 16:04:22.931641 4907 memory_manager.go:354] "RemoveStaleState removing state" podUID="e77cb0a0-f383-4bb5-b29e-e000a56a7a1f" containerName="glance-httpd" Feb 26 16:04:22 crc kubenswrapper[4907]: I0226 16:04:22.932555 4907 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-internal-api-0" Feb 26 16:04:22 crc kubenswrapper[4907]: I0226 16:04:22.936176 4907 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-glance-default-internal-svc" Feb 26 16:04:22 crc kubenswrapper[4907]: I0226 16:04:22.936323 4907 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-default-internal-config-data" Feb 26 16:04:22 crc kubenswrapper[4907]: I0226 16:04:22.968618 4907 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-internal-api-0"] Feb 26 16:04:23 crc kubenswrapper[4907]: I0226 16:04:23.032296 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2b1253ca-7753-4742-afc4-e786e4dcc6e0-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"2b1253ca-7753-4742-afc4-e786e4dcc6e0\") " pod="openstack/glance-default-internal-api-0" Feb 26 16:04:23 crc kubenswrapper[4907]: I0226 16:04:23.032357 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2b1253ca-7753-4742-afc4-e786e4dcc6e0-config-data\") pod \"glance-default-internal-api-0\" (UID: \"2b1253ca-7753-4742-afc4-e786e4dcc6e0\") " pod="openstack/glance-default-internal-api-0" Feb 26 16:04:23 crc kubenswrapper[4907]: I0226 16:04:23.032401 4907 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/2b1253ca-7753-4742-afc4-e786e4dcc6e0-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"2b1253ca-7753-4742-afc4-e786e4dcc6e0\") " pod="openstack/glance-default-internal-api-0" Feb 26 16:04:23 crc kubenswrapper[4907]: I0226 16:04:23.032691 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/2b1253ca-7753-4742-afc4-e786e4dcc6e0-logs\") pod \"glance-default-internal-api-0\" (UID: \"2b1253ca-7753-4742-afc4-e786e4dcc6e0\") " pod="openstack/glance-default-internal-api-0" Feb 26 16:04:23 crc kubenswrapper[4907]: I0226 16:04:23.032902 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/2b1253ca-7753-4742-afc4-e786e4dcc6e0-scripts\") pod \"glance-default-internal-api-0\" (UID: \"2b1253ca-7753-4742-afc4-e786e4dcc6e0\") " pod="openstack/glance-default-internal-api-0" Feb 26 16:04:23 crc kubenswrapper[4907]: I0226 16:04:23.032963 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/2b1253ca-7753-4742-afc4-e786e4dcc6e0-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"2b1253ca-7753-4742-afc4-e786e4dcc6e0\") " pod="openstack/glance-default-internal-api-0" Feb 26 16:04:23 crc kubenswrapper[4907]: I0226 16:04:23.032987 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage07-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage07-crc\") pod \"glance-default-internal-api-0\" (UID: \"2b1253ca-7753-4742-afc4-e786e4dcc6e0\") " pod="openstack/glance-default-internal-api-0" Feb 26 16:04:23 crc kubenswrapper[4907]: I0226 16:04:23.033153 4907 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4zhrb\" (UniqueName: \"kubernetes.io/projected/2b1253ca-7753-4742-afc4-e786e4dcc6e0-kube-api-access-4zhrb\") pod \"glance-default-internal-api-0\" (UID: \"2b1253ca-7753-4742-afc4-e786e4dcc6e0\") " pod="openstack/glance-default-internal-api-0" Feb 26 16:04:23 crc kubenswrapper[4907]: I0226 16:04:23.134680 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2b1253ca-7753-4742-afc4-e786e4dcc6e0-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"2b1253ca-7753-4742-afc4-e786e4dcc6e0\") " pod="openstack/glance-default-internal-api-0" Feb 26 16:04:23 crc kubenswrapper[4907]: I0226 16:04:23.134741 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2b1253ca-7753-4742-afc4-e786e4dcc6e0-config-data\") pod \"glance-default-internal-api-0\" (UID: \"2b1253ca-7753-4742-afc4-e786e4dcc6e0\") " pod="openstack/glance-default-internal-api-0" Feb 26 16:04:23 crc kubenswrapper[4907]: I0226 16:04:23.134779 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/2b1253ca-7753-4742-afc4-e786e4dcc6e0-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"2b1253ca-7753-4742-afc4-e786e4dcc6e0\") " pod="openstack/glance-default-internal-api-0" Feb 26 16:04:23 crc kubenswrapper[4907]: I0226 16:04:23.134845 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/2b1253ca-7753-4742-afc4-e786e4dcc6e0-logs\") pod \"glance-default-internal-api-0\" (UID: \"2b1253ca-7753-4742-afc4-e786e4dcc6e0\") " pod="openstack/glance-default-internal-api-0" Feb 26 16:04:23 crc kubenswrapper[4907]: I0226 16:04:23.134903 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for 
volume \"scripts\" (UniqueName: \"kubernetes.io/secret/2b1253ca-7753-4742-afc4-e786e4dcc6e0-scripts\") pod \"glance-default-internal-api-0\" (UID: \"2b1253ca-7753-4742-afc4-e786e4dcc6e0\") " pod="openstack/glance-default-internal-api-0" Feb 26 16:04:23 crc kubenswrapper[4907]: I0226 16:04:23.134937 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/2b1253ca-7753-4742-afc4-e786e4dcc6e0-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"2b1253ca-7753-4742-afc4-e786e4dcc6e0\") " pod="openstack/glance-default-internal-api-0" Feb 26 16:04:23 crc kubenswrapper[4907]: I0226 16:04:23.134964 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage07-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage07-crc\") pod \"glance-default-internal-api-0\" (UID: \"2b1253ca-7753-4742-afc4-e786e4dcc6e0\") " pod="openstack/glance-default-internal-api-0" Feb 26 16:04:23 crc kubenswrapper[4907]: I0226 16:04:23.135001 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4zhrb\" (UniqueName: \"kubernetes.io/projected/2b1253ca-7753-4742-afc4-e786e4dcc6e0-kube-api-access-4zhrb\") pod \"glance-default-internal-api-0\" (UID: \"2b1253ca-7753-4742-afc4-e786e4dcc6e0\") " pod="openstack/glance-default-internal-api-0" Feb 26 16:04:23 crc kubenswrapper[4907]: I0226 16:04:23.135365 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/2b1253ca-7753-4742-afc4-e786e4dcc6e0-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"2b1253ca-7753-4742-afc4-e786e4dcc6e0\") " pod="openstack/glance-default-internal-api-0" Feb 26 16:04:23 crc kubenswrapper[4907]: I0226 16:04:23.135488 4907 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage07-crc\" (UniqueName: 
\"kubernetes.io/local-volume/local-storage07-crc\") pod \"glance-default-internal-api-0\" (UID: \"2b1253ca-7753-4742-afc4-e786e4dcc6e0\") device mount path \"/mnt/openstack/pv07\"" pod="openstack/glance-default-internal-api-0" Feb 26 16:04:23 crc kubenswrapper[4907]: I0226 16:04:23.135578 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/2b1253ca-7753-4742-afc4-e786e4dcc6e0-logs\") pod \"glance-default-internal-api-0\" (UID: \"2b1253ca-7753-4742-afc4-e786e4dcc6e0\") " pod="openstack/glance-default-internal-api-0" Feb 26 16:04:23 crc kubenswrapper[4907]: I0226 16:04:23.140282 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/2b1253ca-7753-4742-afc4-e786e4dcc6e0-scripts\") pod \"glance-default-internal-api-0\" (UID: \"2b1253ca-7753-4742-afc4-e786e4dcc6e0\") " pod="openstack/glance-default-internal-api-0" Feb 26 16:04:23 crc kubenswrapper[4907]: I0226 16:04:23.145240 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2b1253ca-7753-4742-afc4-e786e4dcc6e0-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"2b1253ca-7753-4742-afc4-e786e4dcc6e0\") " pod="openstack/glance-default-internal-api-0" Feb 26 16:04:23 crc kubenswrapper[4907]: I0226 16:04:23.146307 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2b1253ca-7753-4742-afc4-e786e4dcc6e0-config-data\") pod \"glance-default-internal-api-0\" (UID: \"2b1253ca-7753-4742-afc4-e786e4dcc6e0\") " pod="openstack/glance-default-internal-api-0" Feb 26 16:04:23 crc kubenswrapper[4907]: I0226 16:04:23.152030 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4zhrb\" (UniqueName: \"kubernetes.io/projected/2b1253ca-7753-4742-afc4-e786e4dcc6e0-kube-api-access-4zhrb\") pod 
\"glance-default-internal-api-0\" (UID: \"2b1253ca-7753-4742-afc4-e786e4dcc6e0\") " pod="openstack/glance-default-internal-api-0" Feb 26 16:04:23 crc kubenswrapper[4907]: I0226 16:04:23.153658 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/2b1253ca-7753-4742-afc4-e786e4dcc6e0-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"2b1253ca-7753-4742-afc4-e786e4dcc6e0\") " pod="openstack/glance-default-internal-api-0" Feb 26 16:04:23 crc kubenswrapper[4907]: I0226 16:04:23.177852 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage07-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage07-crc\") pod \"glance-default-internal-api-0\" (UID: \"2b1253ca-7753-4742-afc4-e786e4dcc6e0\") " pod="openstack/glance-default-internal-api-0" Feb 26 16:04:23 crc kubenswrapper[4907]: I0226 16:04:23.254964 4907 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-internal-api-0" Feb 26 16:04:23 crc kubenswrapper[4907]: I0226 16:04:23.665019 4907 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/dnsmasq-dns-b8fbc5445-xx2fj" podUID="62d9c258-3e92-48cc-a4b2-7207c93a6346" containerName="dnsmasq-dns" probeResult="failure" output="dial tcp 10.217.0.115:5353: connect: connection refused" Feb 26 16:04:23 crc kubenswrapper[4907]: I0226 16:04:23.666032 4907 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-b8fbc5445-xx2fj" Feb 26 16:04:24 crc kubenswrapper[4907]: I0226 16:04:24.142126 4907 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e77cb0a0-f383-4bb5-b29e-e000a56a7a1f" path="/var/lib/kubelet/pods/e77cb0a0-f383-4bb5-b29e-e000a56a7a1f/volumes" Feb 26 16:04:24 crc kubenswrapper[4907]: E0226 16:04:24.872698 4907 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context 
canceled" image="quay.io/podified-antelope-centos9/openstack-placement-api:current-podified" Feb 26 16:04:24 crc kubenswrapper[4907]: E0226 16:04:24.873080 4907 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:placement-db-sync,Image:quay.io/podified-antelope-centos9/openstack-placement-api:current-podified,Command:[/bin/bash],Args:[-c /usr/local/bin/kolla_start],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:KOLLA_BOOTSTRAP,Value:true,ValueFrom:nil,},EnvVar{Name:KOLLA_CONFIG_STRATEGY,Value:COPY_ALWAYS,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:scripts,ReadOnly:true,MountPath:/usr/local/bin/container-scripts,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:logs,ReadOnly:false,MountPath:/var/log/placement,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:config-data,ReadOnly:false,MountPath:/var/lib/openstack/config,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:config-data,ReadOnly:true,MountPath:/var/lib/kolla/config_files/config.json,SubPath:placement-dbsync-config.json,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:combined-ca-bundle,ReadOnly:true,MountPath:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem,SubPath:tls-ca-bundle.pem,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-ln7sb,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*42482,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPri
vilegeEscalation:nil,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod placement-db-sync-6t72w_openstack(4cbb7c75-3f73-4181-b214-cdfb8d9ffd9a): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Feb 26 16:04:24 crc kubenswrapper[4907]: E0226 16:04:24.874568 4907 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"placement-db-sync\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack/placement-db-sync-6t72w" podUID="4cbb7c75-3f73-4181-b214-cdfb8d9ffd9a" Feb 26 16:04:25 crc kubenswrapper[4907]: I0226 16:04:25.052934 4907 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-external-api-0" Feb 26 16:04:25 crc kubenswrapper[4907]: I0226 16:04:25.060694 4907 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29535364-t8qd8" Feb 26 16:04:25 crc kubenswrapper[4907]: I0226 16:04:25.188243 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/1ca54bc7-66c3-4e7a-a4a5-7eea6f3d1fec-httpd-run\") pod \"1ca54bc7-66c3-4e7a-a4a5-7eea6f3d1fec\" (UID: \"1ca54bc7-66c3-4e7a-a4a5-7eea6f3d1fec\") " Feb 26 16:04:25 crc kubenswrapper[4907]: I0226 16:04:25.188436 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/1ca54bc7-66c3-4e7a-a4a5-7eea6f3d1fec-scripts\") pod \"1ca54bc7-66c3-4e7a-a4a5-7eea6f3d1fec\" (UID: \"1ca54bc7-66c3-4e7a-a4a5-7eea6f3d1fec\") " Feb 26 16:04:25 crc kubenswrapper[4907]: I0226 16:04:25.188459 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1ca54bc7-66c3-4e7a-a4a5-7eea6f3d1fec-combined-ca-bundle\") pod \"1ca54bc7-66c3-4e7a-a4a5-7eea6f3d1fec\" (UID: \"1ca54bc7-66c3-4e7a-a4a5-7eea6f3d1fec\") " Feb 26 16:04:25 crc kubenswrapper[4907]: I0226 16:04:25.188487 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/1ca54bc7-66c3-4e7a-a4a5-7eea6f3d1fec-logs\") pod \"1ca54bc7-66c3-4e7a-a4a5-7eea6f3d1fec\" (UID: \"1ca54bc7-66c3-4e7a-a4a5-7eea6f3d1fec\") " Feb 26 16:04:25 crc kubenswrapper[4907]: I0226 16:04:25.188535 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1ca54bc7-66c3-4e7a-a4a5-7eea6f3d1fec-config-data\") pod \"1ca54bc7-66c3-4e7a-a4a5-7eea6f3d1fec\" (UID: \"1ca54bc7-66c3-4e7a-a4a5-7eea6f3d1fec\") " Feb 26 16:04:25 crc kubenswrapper[4907]: I0226 16:04:25.188552 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qq5sf\" (UniqueName: 
\"kubernetes.io/projected/b2b66b18-ac41-4d84-9ae1-5900c27d0d7d-kube-api-access-qq5sf\") pod \"b2b66b18-ac41-4d84-9ae1-5900c27d0d7d\" (UID: \"b2b66b18-ac41-4d84-9ae1-5900c27d0d7d\") " Feb 26 16:04:25 crc kubenswrapper[4907]: I0226 16:04:25.188611 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/1ca54bc7-66c3-4e7a-a4a5-7eea6f3d1fec-public-tls-certs\") pod \"1ca54bc7-66c3-4e7a-a4a5-7eea6f3d1fec\" (UID: \"1ca54bc7-66c3-4e7a-a4a5-7eea6f3d1fec\") " Feb 26 16:04:25 crc kubenswrapper[4907]: I0226 16:04:25.188627 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"glance\" (UniqueName: \"kubernetes.io/local-volume/local-storage04-crc\") pod \"1ca54bc7-66c3-4e7a-a4a5-7eea6f3d1fec\" (UID: \"1ca54bc7-66c3-4e7a-a4a5-7eea6f3d1fec\") " Feb 26 16:04:25 crc kubenswrapper[4907]: I0226 16:04:25.188659 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-q8c7p\" (UniqueName: \"kubernetes.io/projected/1ca54bc7-66c3-4e7a-a4a5-7eea6f3d1fec-kube-api-access-q8c7p\") pod \"1ca54bc7-66c3-4e7a-a4a5-7eea6f3d1fec\" (UID: \"1ca54bc7-66c3-4e7a-a4a5-7eea6f3d1fec\") " Feb 26 16:04:25 crc kubenswrapper[4907]: I0226 16:04:25.191946 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1ca54bc7-66c3-4e7a-a4a5-7eea6f3d1fec-logs" (OuterVolumeSpecName: "logs") pod "1ca54bc7-66c3-4e7a-a4a5-7eea6f3d1fec" (UID: "1ca54bc7-66c3-4e7a-a4a5-7eea6f3d1fec"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 26 16:04:25 crc kubenswrapper[4907]: I0226 16:04:25.192136 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1ca54bc7-66c3-4e7a-a4a5-7eea6f3d1fec-httpd-run" (OuterVolumeSpecName: "httpd-run") pod "1ca54bc7-66c3-4e7a-a4a5-7eea6f3d1fec" (UID: "1ca54bc7-66c3-4e7a-a4a5-7eea6f3d1fec"). 
InnerVolumeSpecName "httpd-run". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 26 16:04:25 crc kubenswrapper[4907]: I0226 16:04:25.200216 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1ca54bc7-66c3-4e7a-a4a5-7eea6f3d1fec-kube-api-access-q8c7p" (OuterVolumeSpecName: "kube-api-access-q8c7p") pod "1ca54bc7-66c3-4e7a-a4a5-7eea6f3d1fec" (UID: "1ca54bc7-66c3-4e7a-a4a5-7eea6f3d1fec"). InnerVolumeSpecName "kube-api-access-q8c7p". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 26 16:04:25 crc kubenswrapper[4907]: I0226 16:04:25.209821 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage04-crc" (OuterVolumeSpecName: "glance") pod "1ca54bc7-66c3-4e7a-a4a5-7eea6f3d1fec" (UID: "1ca54bc7-66c3-4e7a-a4a5-7eea6f3d1fec"). InnerVolumeSpecName "local-storage04-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue "" Feb 26 16:04:25 crc kubenswrapper[4907]: I0226 16:04:25.210205 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1ca54bc7-66c3-4e7a-a4a5-7eea6f3d1fec-scripts" (OuterVolumeSpecName: "scripts") pod "1ca54bc7-66c3-4e7a-a4a5-7eea6f3d1fec" (UID: "1ca54bc7-66c3-4e7a-a4a5-7eea6f3d1fec"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 26 16:04:25 crc kubenswrapper[4907]: I0226 16:04:25.225845 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b2b66b18-ac41-4d84-9ae1-5900c27d0d7d-kube-api-access-qq5sf" (OuterVolumeSpecName: "kube-api-access-qq5sf") pod "b2b66b18-ac41-4d84-9ae1-5900c27d0d7d" (UID: "b2b66b18-ac41-4d84-9ae1-5900c27d0d7d"). InnerVolumeSpecName "kube-api-access-qq5sf". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 26 16:04:25 crc kubenswrapper[4907]: I0226 16:04:25.265788 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1ca54bc7-66c3-4e7a-a4a5-7eea6f3d1fec-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "1ca54bc7-66c3-4e7a-a4a5-7eea6f3d1fec" (UID: "1ca54bc7-66c3-4e7a-a4a5-7eea6f3d1fec"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 26 16:04:25 crc kubenswrapper[4907]: I0226 16:04:25.293299 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1ca54bc7-66c3-4e7a-a4a5-7eea6f3d1fec-public-tls-certs" (OuterVolumeSpecName: "public-tls-certs") pod "1ca54bc7-66c3-4e7a-a4a5-7eea6f3d1fec" (UID: "1ca54bc7-66c3-4e7a-a4a5-7eea6f3d1fec"). InnerVolumeSpecName "public-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 26 16:04:25 crc kubenswrapper[4907]: I0226 16:04:25.294774 4907 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qq5sf\" (UniqueName: \"kubernetes.io/projected/b2b66b18-ac41-4d84-9ae1-5900c27d0d7d-kube-api-access-qq5sf\") on node \"crc\" DevicePath \"\"" Feb 26 16:04:25 crc kubenswrapper[4907]: I0226 16:04:25.294940 4907 reconciler_common.go:293] "Volume detached for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/1ca54bc7-66c3-4e7a-a4a5-7eea6f3d1fec-public-tls-certs\") on node \"crc\" DevicePath \"\"" Feb 26 16:04:25 crc kubenswrapper[4907]: I0226 16:04:25.295039 4907 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage04-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage04-crc\") on node \"crc\" " Feb 26 16:04:25 crc kubenswrapper[4907]: I0226 16:04:25.295146 4907 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-q8c7p\" (UniqueName: \"kubernetes.io/projected/1ca54bc7-66c3-4e7a-a4a5-7eea6f3d1fec-kube-api-access-q8c7p\") 
on node \"crc\" DevicePath \"\"" Feb 26 16:04:25 crc kubenswrapper[4907]: I0226 16:04:25.295233 4907 reconciler_common.go:293] "Volume detached for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/1ca54bc7-66c3-4e7a-a4a5-7eea6f3d1fec-httpd-run\") on node \"crc\" DevicePath \"\"" Feb 26 16:04:25 crc kubenswrapper[4907]: I0226 16:04:25.295317 4907 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/1ca54bc7-66c3-4e7a-a4a5-7eea6f3d1fec-scripts\") on node \"crc\" DevicePath \"\"" Feb 26 16:04:25 crc kubenswrapper[4907]: I0226 16:04:25.295388 4907 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1ca54bc7-66c3-4e7a-a4a5-7eea6f3d1fec-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 26 16:04:25 crc kubenswrapper[4907]: I0226 16:04:25.295467 4907 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/1ca54bc7-66c3-4e7a-a4a5-7eea6f3d1fec-logs\") on node \"crc\" DevicePath \"\"" Feb 26 16:04:25 crc kubenswrapper[4907]: I0226 16:04:25.328811 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1ca54bc7-66c3-4e7a-a4a5-7eea6f3d1fec-config-data" (OuterVolumeSpecName: "config-data") pod "1ca54bc7-66c3-4e7a-a4a5-7eea6f3d1fec" (UID: "1ca54bc7-66c3-4e7a-a4a5-7eea6f3d1fec"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 26 16:04:25 crc kubenswrapper[4907]: I0226 16:04:25.378185 4907 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage04-crc" (UniqueName: "kubernetes.io/local-volume/local-storage04-crc") on node "crc" Feb 26 16:04:25 crc kubenswrapper[4907]: I0226 16:04:25.398536 4907 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1ca54bc7-66c3-4e7a-a4a5-7eea6f3d1fec-config-data\") on node \"crc\" DevicePath \"\"" Feb 26 16:04:25 crc kubenswrapper[4907]: I0226 16:04:25.398581 4907 reconciler_common.go:293] "Volume detached for volume \"local-storage04-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage04-crc\") on node \"crc\" DevicePath \"\"" Feb 26 16:04:25 crc kubenswrapper[4907]: I0226 16:04:25.604268 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29535364-t8qd8" event={"ID":"b2b66b18-ac41-4d84-9ae1-5900c27d0d7d","Type":"ContainerDied","Data":"04cf445c772d52e3c125c160efb7a092e10254560adce75616a80cbb4e4b2416"} Feb 26 16:04:25 crc kubenswrapper[4907]: I0226 16:04:25.604315 4907 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29535364-t8qd8" Feb 26 16:04:25 crc kubenswrapper[4907]: I0226 16:04:25.604323 4907 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="04cf445c772d52e3c125c160efb7a092e10254560adce75616a80cbb4e4b2416" Feb 26 16:04:25 crc kubenswrapper[4907]: I0226 16:04:25.609280 4907 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-external-api-0" Feb 26 16:04:25 crc kubenswrapper[4907]: I0226 16:04:25.609525 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"1ca54bc7-66c3-4e7a-a4a5-7eea6f3d1fec","Type":"ContainerDied","Data":"078e6ff5dcf430000c862e73512ee8d0b1ced25c0ce25a6433cd23e2f06aba2a"} Feb 26 16:04:25 crc kubenswrapper[4907]: E0226 16:04:25.610528 4907 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"placement-db-sync\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/podified-antelope-centos9/openstack-placement-api:current-podified\\\"\"" pod="openstack/placement-db-sync-6t72w" podUID="4cbb7c75-3f73-4181-b214-cdfb8d9ffd9a" Feb 26 16:04:25 crc kubenswrapper[4907]: I0226 16:04:25.662675 4907 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-external-api-0"] Feb 26 16:04:25 crc kubenswrapper[4907]: I0226 16:04:25.671923 4907 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-default-external-api-0"] Feb 26 16:04:25 crc kubenswrapper[4907]: I0226 16:04:25.695030 4907 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-default-external-api-0"] Feb 26 16:04:25 crc kubenswrapper[4907]: E0226 16:04:25.695808 4907 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b2b66b18-ac41-4d84-9ae1-5900c27d0d7d" containerName="oc" Feb 26 16:04:25 crc kubenswrapper[4907]: I0226 16:04:25.695830 4907 state_mem.go:107] "Deleted CPUSet assignment" podUID="b2b66b18-ac41-4d84-9ae1-5900c27d0d7d" containerName="oc" Feb 26 16:04:25 crc kubenswrapper[4907]: E0226 16:04:25.695886 4907 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1ca54bc7-66c3-4e7a-a4a5-7eea6f3d1fec" containerName="glance-log" Feb 26 16:04:25 crc kubenswrapper[4907]: I0226 16:04:25.695895 4907 state_mem.go:107] "Deleted CPUSet assignment" 
podUID="1ca54bc7-66c3-4e7a-a4a5-7eea6f3d1fec" containerName="glance-log" Feb 26 16:04:25 crc kubenswrapper[4907]: E0226 16:04:25.695939 4907 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1ca54bc7-66c3-4e7a-a4a5-7eea6f3d1fec" containerName="glance-httpd" Feb 26 16:04:25 crc kubenswrapper[4907]: I0226 16:04:25.695948 4907 state_mem.go:107] "Deleted CPUSet assignment" podUID="1ca54bc7-66c3-4e7a-a4a5-7eea6f3d1fec" containerName="glance-httpd" Feb 26 16:04:25 crc kubenswrapper[4907]: I0226 16:04:25.696224 4907 memory_manager.go:354] "RemoveStaleState removing state" podUID="1ca54bc7-66c3-4e7a-a4a5-7eea6f3d1fec" containerName="glance-log" Feb 26 16:04:25 crc kubenswrapper[4907]: I0226 16:04:25.696248 4907 memory_manager.go:354] "RemoveStaleState removing state" podUID="b2b66b18-ac41-4d84-9ae1-5900c27d0d7d" containerName="oc" Feb 26 16:04:25 crc kubenswrapper[4907]: I0226 16:04:25.696281 4907 memory_manager.go:354] "RemoveStaleState removing state" podUID="1ca54bc7-66c3-4e7a-a4a5-7eea6f3d1fec" containerName="glance-httpd" Feb 26 16:04:25 crc kubenswrapper[4907]: I0226 16:04:25.697831 4907 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-external-api-0" Feb 26 16:04:25 crc kubenswrapper[4907]: I0226 16:04:25.699947 4907 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-glance-default-public-svc" Feb 26 16:04:25 crc kubenswrapper[4907]: I0226 16:04:25.700173 4907 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-default-external-config-data" Feb 26 16:04:25 crc kubenswrapper[4907]: E0226 16:04:25.702740 4907 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podb2b66b18_ac41_4d84_9ae1_5900c27d0d7d.slice\": RecentStats: unable to find data in memory cache]" Feb 26 16:04:25 crc kubenswrapper[4907]: I0226 16:04:25.735634 4907 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-external-api-0"] Feb 26 16:04:25 crc kubenswrapper[4907]: I0226 16:04:25.806359 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/361750c4-3d82-437e-abc0-4e20302d20cf-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"361750c4-3d82-437e-abc0-4e20302d20cf\") " pod="openstack/glance-default-external-api-0" Feb 26 16:04:25 crc kubenswrapper[4907]: I0226 16:04:25.806424 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/361750c4-3d82-437e-abc0-4e20302d20cf-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"361750c4-3d82-437e-abc0-4e20302d20cf\") " pod="openstack/glance-default-external-api-0" Feb 26 16:04:25 crc kubenswrapper[4907]: I0226 16:04:25.806481 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-w7bx2\" (UniqueName: 
\"kubernetes.io/projected/361750c4-3d82-437e-abc0-4e20302d20cf-kube-api-access-w7bx2\") pod \"glance-default-external-api-0\" (UID: \"361750c4-3d82-437e-abc0-4e20302d20cf\") " pod="openstack/glance-default-external-api-0" Feb 26 16:04:25 crc kubenswrapper[4907]: I0226 16:04:25.806532 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/361750c4-3d82-437e-abc0-4e20302d20cf-scripts\") pod \"glance-default-external-api-0\" (UID: \"361750c4-3d82-437e-abc0-4e20302d20cf\") " pod="openstack/glance-default-external-api-0" Feb 26 16:04:25 crc kubenswrapper[4907]: I0226 16:04:25.806613 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/361750c4-3d82-437e-abc0-4e20302d20cf-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"361750c4-3d82-437e-abc0-4e20302d20cf\") " pod="openstack/glance-default-external-api-0" Feb 26 16:04:25 crc kubenswrapper[4907]: I0226 16:04:25.806693 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/361750c4-3d82-437e-abc0-4e20302d20cf-config-data\") pod \"glance-default-external-api-0\" (UID: \"361750c4-3d82-437e-abc0-4e20302d20cf\") " pod="openstack/glance-default-external-api-0" Feb 26 16:04:25 crc kubenswrapper[4907]: I0226 16:04:25.806722 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/361750c4-3d82-437e-abc0-4e20302d20cf-logs\") pod \"glance-default-external-api-0\" (UID: \"361750c4-3d82-437e-abc0-4e20302d20cf\") " pod="openstack/glance-default-external-api-0" Feb 26 16:04:25 crc kubenswrapper[4907]: I0226 16:04:25.806752 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage04-crc\" 
(UniqueName: \"kubernetes.io/local-volume/local-storage04-crc\") pod \"glance-default-external-api-0\" (UID: \"361750c4-3d82-437e-abc0-4e20302d20cf\") " pod="openstack/glance-default-external-api-0" Feb 26 16:04:25 crc kubenswrapper[4907]: I0226 16:04:25.908300 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/361750c4-3d82-437e-abc0-4e20302d20cf-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"361750c4-3d82-437e-abc0-4e20302d20cf\") " pod="openstack/glance-default-external-api-0" Feb 26 16:04:25 crc kubenswrapper[4907]: I0226 16:04:25.909215 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/361750c4-3d82-437e-abc0-4e20302d20cf-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"361750c4-3d82-437e-abc0-4e20302d20cf\") " pod="openstack/glance-default-external-api-0" Feb 26 16:04:25 crc kubenswrapper[4907]: I0226 16:04:25.909266 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-w7bx2\" (UniqueName: \"kubernetes.io/projected/361750c4-3d82-437e-abc0-4e20302d20cf-kube-api-access-w7bx2\") pod \"glance-default-external-api-0\" (UID: \"361750c4-3d82-437e-abc0-4e20302d20cf\") " pod="openstack/glance-default-external-api-0" Feb 26 16:04:25 crc kubenswrapper[4907]: I0226 16:04:25.909311 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/361750c4-3d82-437e-abc0-4e20302d20cf-scripts\") pod \"glance-default-external-api-0\" (UID: \"361750c4-3d82-437e-abc0-4e20302d20cf\") " pod="openstack/glance-default-external-api-0" Feb 26 16:04:25 crc kubenswrapper[4907]: I0226 16:04:25.909361 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: 
\"kubernetes.io/empty-dir/361750c4-3d82-437e-abc0-4e20302d20cf-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"361750c4-3d82-437e-abc0-4e20302d20cf\") " pod="openstack/glance-default-external-api-0" Feb 26 16:04:25 crc kubenswrapper[4907]: I0226 16:04:25.909428 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/361750c4-3d82-437e-abc0-4e20302d20cf-config-data\") pod \"glance-default-external-api-0\" (UID: \"361750c4-3d82-437e-abc0-4e20302d20cf\") " pod="openstack/glance-default-external-api-0" Feb 26 16:04:25 crc kubenswrapper[4907]: I0226 16:04:25.909441 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/361750c4-3d82-437e-abc0-4e20302d20cf-logs\") pod \"glance-default-external-api-0\" (UID: \"361750c4-3d82-437e-abc0-4e20302d20cf\") " pod="openstack/glance-default-external-api-0" Feb 26 16:04:25 crc kubenswrapper[4907]: I0226 16:04:25.909939 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/361750c4-3d82-437e-abc0-4e20302d20cf-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"361750c4-3d82-437e-abc0-4e20302d20cf\") " pod="openstack/glance-default-external-api-0" Feb 26 16:04:25 crc kubenswrapper[4907]: I0226 16:04:25.910110 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/361750c4-3d82-437e-abc0-4e20302d20cf-logs\") pod \"glance-default-external-api-0\" (UID: \"361750c4-3d82-437e-abc0-4e20302d20cf\") " pod="openstack/glance-default-external-api-0" Feb 26 16:04:25 crc kubenswrapper[4907]: I0226 16:04:25.910200 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage04-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage04-crc\") pod \"glance-default-external-api-0\" (UID: 
\"361750c4-3d82-437e-abc0-4e20302d20cf\") " pod="openstack/glance-default-external-api-0" Feb 26 16:04:25 crc kubenswrapper[4907]: I0226 16:04:25.910528 4907 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage04-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage04-crc\") pod \"glance-default-external-api-0\" (UID: \"361750c4-3d82-437e-abc0-4e20302d20cf\") device mount path \"/mnt/openstack/pv04\"" pod="openstack/glance-default-external-api-0" Feb 26 16:04:25 crc kubenswrapper[4907]: I0226 16:04:25.921166 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/361750c4-3d82-437e-abc0-4e20302d20cf-scripts\") pod \"glance-default-external-api-0\" (UID: \"361750c4-3d82-437e-abc0-4e20302d20cf\") " pod="openstack/glance-default-external-api-0" Feb 26 16:04:25 crc kubenswrapper[4907]: I0226 16:04:25.921439 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/361750c4-3d82-437e-abc0-4e20302d20cf-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"361750c4-3d82-437e-abc0-4e20302d20cf\") " pod="openstack/glance-default-external-api-0" Feb 26 16:04:25 crc kubenswrapper[4907]: I0226 16:04:25.929182 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/361750c4-3d82-437e-abc0-4e20302d20cf-config-data\") pod \"glance-default-external-api-0\" (UID: \"361750c4-3d82-437e-abc0-4e20302d20cf\") " pod="openstack/glance-default-external-api-0" Feb 26 16:04:25 crc kubenswrapper[4907]: I0226 16:04:25.930676 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/361750c4-3d82-437e-abc0-4e20302d20cf-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"361750c4-3d82-437e-abc0-4e20302d20cf\") " 
pod="openstack/glance-default-external-api-0" Feb 26 16:04:25 crc kubenswrapper[4907]: I0226 16:04:25.937075 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-w7bx2\" (UniqueName: \"kubernetes.io/projected/361750c4-3d82-437e-abc0-4e20302d20cf-kube-api-access-w7bx2\") pod \"glance-default-external-api-0\" (UID: \"361750c4-3d82-437e-abc0-4e20302d20cf\") " pod="openstack/glance-default-external-api-0" Feb 26 16:04:25 crc kubenswrapper[4907]: I0226 16:04:25.940140 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage04-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage04-crc\") pod \"glance-default-external-api-0\" (UID: \"361750c4-3d82-437e-abc0-4e20302d20cf\") " pod="openstack/glance-default-external-api-0" Feb 26 16:04:26 crc kubenswrapper[4907]: I0226 16:04:26.022789 4907 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-external-api-0" Feb 26 16:04:26 crc kubenswrapper[4907]: I0226 16:04:26.145874 4907 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1ca54bc7-66c3-4e7a-a4a5-7eea6f3d1fec" path="/var/lib/kubelet/pods/1ca54bc7-66c3-4e7a-a4a5-7eea6f3d1fec/volumes" Feb 26 16:04:26 crc kubenswrapper[4907]: I0226 16:04:26.146431 4907 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29535358-mk4kx"] Feb 26 16:04:26 crc kubenswrapper[4907]: I0226 16:04:26.156800 4907 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29535358-mk4kx"] Feb 26 16:04:27 crc kubenswrapper[4907]: E0226 16:04:27.787453 4907 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/podified-antelope-centos9/openstack-horizon:current-podified" Feb 26 16:04:27 crc kubenswrapper[4907]: E0226 16:04:27.787864 4907 kuberuntime_manager.go:1274] "Unhandled Error" err="container 
&Container{Name:horizon-log,Image:quay.io/podified-antelope-centos9/openstack-horizon:current-podified,Command:[/bin/bash],Args:[-c tail -n+1 -F /var/log/horizon/horizon.log],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:CONFIG_HASH,Value:n5bfh5cchf9h5ch9dh67fhddh685h665hb5h594h8bh586h5cch5f7hcbh68h688h56dh6fh5f9h586h666h5f8h6ch8dh57fhf7h644h9bh88h55cq,ValueFrom:nil,},EnvVar{Name:ENABLE_DESIGNATE,Value:yes,ValueFrom:nil,},EnvVar{Name:ENABLE_HEAT,Value:yes,ValueFrom:nil,},EnvVar{Name:ENABLE_IRONIC,Value:yes,ValueFrom:nil,},EnvVar{Name:ENABLE_MANILA,Value:yes,ValueFrom:nil,},EnvVar{Name:ENABLE_OCTAVIA,Value:yes,ValueFrom:nil,},EnvVar{Name:ENABLE_WATCHER,Value:no,ValueFrom:nil,},EnvVar{Name:KOLLA_CONFIG_STRATEGY,Value:COPY_ALWAYS,ValueFrom:nil,},EnvVar{Name:UNPACK_THEME,Value:true,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:logs,ReadOnly:false,MountPath:/var/log/horizon,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-x7r8q,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*48,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*true,RunAsGroup:*42400,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod horizon-785d56fd9c-lc7sg_openstack(b3e0f652-e35c-49b8-abe3-9182b2026d08): ErrImagePull: rpc 
error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Feb 26 16:04:27 crc kubenswrapper[4907]: E0226 16:04:27.805480 4907 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/podified-antelope-centos9/openstack-horizon:current-podified" Feb 26 16:04:27 crc kubenswrapper[4907]: E0226 16:04:27.806101 4907 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:horizon-log,Image:quay.io/podified-antelope-centos9/openstack-horizon:current-podified,Command:[/bin/bash],Args:[-c tail -n+1 -F /var/log/horizon/horizon.log],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:CONFIG_HASH,Value:n5fch668hc9h645h7ch76h64dh557h95h99h5b6h698h59h668h66bh55h67fh6fhf7hcbhd8h66ch599h695h54dh66bh68dh584h54bhb9h59dh88q,ValueFrom:nil,},EnvVar{Name:ENABLE_DESIGNATE,Value:yes,ValueFrom:nil,},EnvVar{Name:ENABLE_HEAT,Value:yes,ValueFrom:nil,},EnvVar{Name:ENABLE_IRONIC,Value:yes,ValueFrom:nil,},EnvVar{Name:ENABLE_MANILA,Value:yes,ValueFrom:nil,},EnvVar{Name:ENABLE_OCTAVIA,Value:yes,ValueFrom:nil,},EnvVar{Name:ENABLE_WATCHER,Value:no,ValueFrom:nil,},EnvVar{Name:KOLLA_CONFIG_STRATEGY,Value:COPY_ALWAYS,ValueFrom:nil,},EnvVar{Name:UNPACK_THEME,Value:true,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:logs,ReadOnly:false,MountPath:/var/log/horizon,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-jwlhz,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*4
8,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*true,RunAsGroup:*42400,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod horizon-766d888d6c-8sqt7_openstack(90cdbc73-317b-4479-9908-3712b34ce77d): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Feb 26 16:04:27 crc kubenswrapper[4907]: E0226 16:04:27.807825 4907 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"horizon-log\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\", failed to \"StartContainer\" for \"horizon\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/podified-antelope-centos9/openstack-horizon:current-podified\\\"\"]" pod="openstack/horizon-785d56fd9c-lc7sg" podUID="b3e0f652-e35c-49b8-abe3-9182b2026d08" Feb 26 16:04:27 crc kubenswrapper[4907]: E0226 16:04:27.810153 4907 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"horizon-log\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\", failed to \"StartContainer\" for \"horizon\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/podified-antelope-centos9/openstack-horizon:current-podified\\\"\"]" pod="openstack/horizon-766d888d6c-8sqt7" podUID="90cdbc73-317b-4479-9908-3712b34ce77d" Feb 26 16:04:27 crc kubenswrapper[4907]: I0226 16:04:27.914792 4907 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/horizon-594d447db9-7p2nh" Feb 26 16:04:28 crc kubenswrapper[4907]: I0226 16:04:28.050569 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pgdsk\" (UniqueName: \"kubernetes.io/projected/b591cc9e-aa47-48dc-9462-a54cd3bbbaa8-kube-api-access-pgdsk\") pod \"b591cc9e-aa47-48dc-9462-a54cd3bbbaa8\" (UID: \"b591cc9e-aa47-48dc-9462-a54cd3bbbaa8\") " Feb 26 16:04:28 crc kubenswrapper[4907]: I0226 16:04:28.050660 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/b591cc9e-aa47-48dc-9462-a54cd3bbbaa8-logs\") pod \"b591cc9e-aa47-48dc-9462-a54cd3bbbaa8\" (UID: \"b591cc9e-aa47-48dc-9462-a54cd3bbbaa8\") " Feb 26 16:04:28 crc kubenswrapper[4907]: I0226 16:04:28.050700 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/b591cc9e-aa47-48dc-9462-a54cd3bbbaa8-horizon-secret-key\") pod \"b591cc9e-aa47-48dc-9462-a54cd3bbbaa8\" (UID: \"b591cc9e-aa47-48dc-9462-a54cd3bbbaa8\") " Feb 26 16:04:28 crc kubenswrapper[4907]: I0226 16:04:28.050728 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/b591cc9e-aa47-48dc-9462-a54cd3bbbaa8-scripts\") pod \"b591cc9e-aa47-48dc-9462-a54cd3bbbaa8\" (UID: \"b591cc9e-aa47-48dc-9462-a54cd3bbbaa8\") " Feb 26 16:04:28 crc kubenswrapper[4907]: I0226 16:04:28.050767 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/b591cc9e-aa47-48dc-9462-a54cd3bbbaa8-config-data\") pod \"b591cc9e-aa47-48dc-9462-a54cd3bbbaa8\" (UID: \"b591cc9e-aa47-48dc-9462-a54cd3bbbaa8\") " Feb 26 16:04:28 crc kubenswrapper[4907]: I0226 16:04:28.052217 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/configmap/b591cc9e-aa47-48dc-9462-a54cd3bbbaa8-config-data" (OuterVolumeSpecName: "config-data") pod "b591cc9e-aa47-48dc-9462-a54cd3bbbaa8" (UID: "b591cc9e-aa47-48dc-9462-a54cd3bbbaa8"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 26 16:04:28 crc kubenswrapper[4907]: I0226 16:04:28.053034 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b591cc9e-aa47-48dc-9462-a54cd3bbbaa8-logs" (OuterVolumeSpecName: "logs") pod "b591cc9e-aa47-48dc-9462-a54cd3bbbaa8" (UID: "b591cc9e-aa47-48dc-9462-a54cd3bbbaa8"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 26 16:04:28 crc kubenswrapper[4907]: I0226 16:04:28.053632 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b591cc9e-aa47-48dc-9462-a54cd3bbbaa8-scripts" (OuterVolumeSpecName: "scripts") pod "b591cc9e-aa47-48dc-9462-a54cd3bbbaa8" (UID: "b591cc9e-aa47-48dc-9462-a54cd3bbbaa8"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 26 16:04:28 crc kubenswrapper[4907]: I0226 16:04:28.065344 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b591cc9e-aa47-48dc-9462-a54cd3bbbaa8-kube-api-access-pgdsk" (OuterVolumeSpecName: "kube-api-access-pgdsk") pod "b591cc9e-aa47-48dc-9462-a54cd3bbbaa8" (UID: "b591cc9e-aa47-48dc-9462-a54cd3bbbaa8"). InnerVolumeSpecName "kube-api-access-pgdsk". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 26 16:04:28 crc kubenswrapper[4907]: I0226 16:04:28.065752 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b591cc9e-aa47-48dc-9462-a54cd3bbbaa8-horizon-secret-key" (OuterVolumeSpecName: "horizon-secret-key") pod "b591cc9e-aa47-48dc-9462-a54cd3bbbaa8" (UID: "b591cc9e-aa47-48dc-9462-a54cd3bbbaa8"). InnerVolumeSpecName "horizon-secret-key". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 26 16:04:28 crc kubenswrapper[4907]: I0226 16:04:28.147732 4907 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ac3d86bc-1eb6-4d67-a762-2000e20fcbd5" path="/var/lib/kubelet/pods/ac3d86bc-1eb6-4d67-a762-2000e20fcbd5/volumes" Feb 26 16:04:28 crc kubenswrapper[4907]: I0226 16:04:28.158120 4907 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pgdsk\" (UniqueName: \"kubernetes.io/projected/b591cc9e-aa47-48dc-9462-a54cd3bbbaa8-kube-api-access-pgdsk\") on node \"crc\" DevicePath \"\"" Feb 26 16:04:28 crc kubenswrapper[4907]: I0226 16:04:28.158163 4907 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/b591cc9e-aa47-48dc-9462-a54cd3bbbaa8-logs\") on node \"crc\" DevicePath \"\"" Feb 26 16:04:28 crc kubenswrapper[4907]: I0226 16:04:28.158174 4907 reconciler_common.go:293] "Volume detached for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/b591cc9e-aa47-48dc-9462-a54cd3bbbaa8-horizon-secret-key\") on node \"crc\" DevicePath \"\"" Feb 26 16:04:28 crc kubenswrapper[4907]: I0226 16:04:28.158187 4907 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/b591cc9e-aa47-48dc-9462-a54cd3bbbaa8-scripts\") on node \"crc\" DevicePath \"\"" Feb 26 16:04:28 crc kubenswrapper[4907]: I0226 16:04:28.158197 4907 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/b591cc9e-aa47-48dc-9462-a54cd3bbbaa8-config-data\") on node \"crc\" DevicePath \"\"" Feb 26 16:04:28 crc kubenswrapper[4907]: I0226 16:04:28.634069 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-594d447db9-7p2nh" event={"ID":"b591cc9e-aa47-48dc-9462-a54cd3bbbaa8","Type":"ContainerDied","Data":"fcaf6df33fe71f9f9fcd282abdc05a69c3e4a21483a529728e0ec777762ac0ce"} Feb 26 16:04:28 crc kubenswrapper[4907]: I0226 16:04:28.634156 4907 
util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/horizon-594d447db9-7p2nh" Feb 26 16:04:28 crc kubenswrapper[4907]: I0226 16:04:28.706813 4907 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/horizon-594d447db9-7p2nh"] Feb 26 16:04:28 crc kubenswrapper[4907]: I0226 16:04:28.717117 4907 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/horizon-594d447db9-7p2nh"] Feb 26 16:04:30 crc kubenswrapper[4907]: I0226 16:04:30.141034 4907 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b591cc9e-aa47-48dc-9462-a54cd3bbbaa8" path="/var/lib/kubelet/pods/b591cc9e-aa47-48dc-9462-a54cd3bbbaa8/volumes" Feb 26 16:04:33 crc kubenswrapper[4907]: I0226 16:04:33.664435 4907 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/dnsmasq-dns-b8fbc5445-xx2fj" podUID="62d9c258-3e92-48cc-a4b2-7207c93a6346" containerName="dnsmasq-dns" probeResult="failure" output="dial tcp 10.217.0.115:5353: i/o timeout" Feb 26 16:04:38 crc kubenswrapper[4907]: I0226 16:04:38.665549 4907 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/dnsmasq-dns-b8fbc5445-xx2fj" podUID="62d9c258-3e92-48cc-a4b2-7207c93a6346" containerName="dnsmasq-dns" probeResult="failure" output="dial tcp 10.217.0.115:5353: i/o timeout" Feb 26 16:04:43 crc kubenswrapper[4907]: I0226 16:04:43.666391 4907 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/dnsmasq-dns-b8fbc5445-xx2fj" podUID="62d9c258-3e92-48cc-a4b2-7207c93a6346" containerName="dnsmasq-dns" probeResult="failure" output="dial tcp 10.217.0.115:5353: i/o timeout" Feb 26 16:04:44 crc kubenswrapper[4907]: I0226 16:04:44.300292 4907 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-b8fbc5445-xx2fj" Feb 26 16:04:44 crc kubenswrapper[4907]: I0226 16:04:44.357349 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/62d9c258-3e92-48cc-a4b2-7207c93a6346-ovsdbserver-nb\") pod \"62d9c258-3e92-48cc-a4b2-7207c93a6346\" (UID: \"62d9c258-3e92-48cc-a4b2-7207c93a6346\") " Feb 26 16:04:44 crc kubenswrapper[4907]: I0226 16:04:44.357429 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/62d9c258-3e92-48cc-a4b2-7207c93a6346-dns-svc\") pod \"62d9c258-3e92-48cc-a4b2-7207c93a6346\" (UID: \"62d9c258-3e92-48cc-a4b2-7207c93a6346\") " Feb 26 16:04:44 crc kubenswrapper[4907]: I0226 16:04:44.357478 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/62d9c258-3e92-48cc-a4b2-7207c93a6346-config\") pod \"62d9c258-3e92-48cc-a4b2-7207c93a6346\" (UID: \"62d9c258-3e92-48cc-a4b2-7207c93a6346\") " Feb 26 16:04:44 crc kubenswrapper[4907]: I0226 16:04:44.357565 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/62d9c258-3e92-48cc-a4b2-7207c93a6346-ovsdbserver-sb\") pod \"62d9c258-3e92-48cc-a4b2-7207c93a6346\" (UID: \"62d9c258-3e92-48cc-a4b2-7207c93a6346\") " Feb 26 16:04:44 crc kubenswrapper[4907]: I0226 16:04:44.357696 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rjbh4\" (UniqueName: \"kubernetes.io/projected/62d9c258-3e92-48cc-a4b2-7207c93a6346-kube-api-access-rjbh4\") pod \"62d9c258-3e92-48cc-a4b2-7207c93a6346\" (UID: \"62d9c258-3e92-48cc-a4b2-7207c93a6346\") " Feb 26 16:04:44 crc kubenswrapper[4907]: I0226 16:04:44.382560 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/projected/62d9c258-3e92-48cc-a4b2-7207c93a6346-kube-api-access-rjbh4" (OuterVolumeSpecName: "kube-api-access-rjbh4") pod "62d9c258-3e92-48cc-a4b2-7207c93a6346" (UID: "62d9c258-3e92-48cc-a4b2-7207c93a6346"). InnerVolumeSpecName "kube-api-access-rjbh4". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 26 16:04:44 crc kubenswrapper[4907]: I0226 16:04:44.410163 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/62d9c258-3e92-48cc-a4b2-7207c93a6346-config" (OuterVolumeSpecName: "config") pod "62d9c258-3e92-48cc-a4b2-7207c93a6346" (UID: "62d9c258-3e92-48cc-a4b2-7207c93a6346"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 26 16:04:44 crc kubenswrapper[4907]: I0226 16:04:44.419169 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/62d9c258-3e92-48cc-a4b2-7207c93a6346-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "62d9c258-3e92-48cc-a4b2-7207c93a6346" (UID: "62d9c258-3e92-48cc-a4b2-7207c93a6346"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 26 16:04:44 crc kubenswrapper[4907]: I0226 16:04:44.427628 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/62d9c258-3e92-48cc-a4b2-7207c93a6346-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "62d9c258-3e92-48cc-a4b2-7207c93a6346" (UID: "62d9c258-3e92-48cc-a4b2-7207c93a6346"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 26 16:04:44 crc kubenswrapper[4907]: I0226 16:04:44.434381 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/62d9c258-3e92-48cc-a4b2-7207c93a6346-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "62d9c258-3e92-48cc-a4b2-7207c93a6346" (UID: "62d9c258-3e92-48cc-a4b2-7207c93a6346"). InnerVolumeSpecName "dns-svc". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 26 16:04:44 crc kubenswrapper[4907]: I0226 16:04:44.460861 4907 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/62d9c258-3e92-48cc-a4b2-7207c93a6346-config\") on node \"crc\" DevicePath \"\"" Feb 26 16:04:44 crc kubenswrapper[4907]: I0226 16:04:44.460891 4907 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/62d9c258-3e92-48cc-a4b2-7207c93a6346-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Feb 26 16:04:44 crc kubenswrapper[4907]: I0226 16:04:44.460903 4907 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rjbh4\" (UniqueName: \"kubernetes.io/projected/62d9c258-3e92-48cc-a4b2-7207c93a6346-kube-api-access-rjbh4\") on node \"crc\" DevicePath \"\"" Feb 26 16:04:44 crc kubenswrapper[4907]: I0226 16:04:44.460911 4907 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/62d9c258-3e92-48cc-a4b2-7207c93a6346-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Feb 26 16:04:44 crc kubenswrapper[4907]: I0226 16:04:44.460919 4907 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/62d9c258-3e92-48cc-a4b2-7207c93a6346-dns-svc\") on node \"crc\" DevicePath \"\"" Feb 26 16:04:44 crc kubenswrapper[4907]: I0226 16:04:44.859808 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-b8fbc5445-xx2fj" event={"ID":"62d9c258-3e92-48cc-a4b2-7207c93a6346","Type":"ContainerDied","Data":"a48aed336a0f131903b43291a333c026068552b800492ce4535ef8aee8254245"} Feb 26 16:04:44 crc kubenswrapper[4907]: I0226 16:04:44.859857 4907 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-b8fbc5445-xx2fj" Feb 26 16:04:44 crc kubenswrapper[4907]: I0226 16:04:44.895521 4907 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-b8fbc5445-xx2fj"] Feb 26 16:04:44 crc kubenswrapper[4907]: I0226 16:04:44.903064 4907 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-b8fbc5445-xx2fj"] Feb 26 16:04:44 crc kubenswrapper[4907]: E0226 16:04:44.999770 4907 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/podified-antelope-centos9/openstack-barbican-api:current-podified" Feb 26 16:04:45 crc kubenswrapper[4907]: E0226 16:04:44.999936 4907 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:barbican-db-sync,Image:quay.io/podified-antelope-centos9/openstack-barbican-api:current-podified,Command:[/bin/bash],Args:[-c barbican-manage db upgrade],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:KOLLA_BOOTSTRAP,Value:TRUE,ValueFrom:nil,},EnvVar{Name:KOLLA_CONFIG_STRATEGY,Value:COPY_ALWAYS,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:db-sync-config-data,ReadOnly:true,MountPath:/etc/barbican/barbican.conf.d,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:combined-ca-bundle,ReadOnly:true,MountPath:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem,SubPath:tls-ca-bundle.pem,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-9vdjh,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[M
KNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*42403,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:nil,RunAsGroup:*42403,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod barbican-db-sync-slrvx_openstack(a02d2622-77ed-4949-95b5-4f5ae5f1c47d): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Feb 26 16:04:45 crc kubenswrapper[4907]: E0226 16:04:45.001087 4907 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"barbican-db-sync\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack/barbican-db-sync-slrvx" podUID="a02d2622-77ed-4949-95b5-4f5ae5f1c47d" Feb 26 16:04:45 crc kubenswrapper[4907]: I0226 16:04:45.018223 4907 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/horizon-766d888d6c-8sqt7" Feb 26 16:04:45 crc kubenswrapper[4907]: I0226 16:04:45.049730 4907 scope.go:117] "RemoveContainer" containerID="bbf40f72132b8463f042e5cbd6f1edf0663e56f8654c532a459a427e7d565513" Feb 26 16:04:45 crc kubenswrapper[4907]: I0226 16:04:45.057359 4907 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/horizon-785d56fd9c-lc7sg" Feb 26 16:04:45 crc kubenswrapper[4907]: I0226 16:04:45.069930 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/90cdbc73-317b-4479-9908-3712b34ce77d-logs\") pod \"90cdbc73-317b-4479-9908-3712b34ce77d\" (UID: \"90cdbc73-317b-4479-9908-3712b34ce77d\") " Feb 26 16:04:45 crc kubenswrapper[4907]: I0226 16:04:45.070019 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/90cdbc73-317b-4479-9908-3712b34ce77d-horizon-secret-key\") pod \"90cdbc73-317b-4479-9908-3712b34ce77d\" (UID: \"90cdbc73-317b-4479-9908-3712b34ce77d\") " Feb 26 16:04:45 crc kubenswrapper[4907]: I0226 16:04:45.070075 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/90cdbc73-317b-4479-9908-3712b34ce77d-scripts\") pod \"90cdbc73-317b-4479-9908-3712b34ce77d\" (UID: \"90cdbc73-317b-4479-9908-3712b34ce77d\") " Feb 26 16:04:45 crc kubenswrapper[4907]: I0226 16:04:45.070245 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/90cdbc73-317b-4479-9908-3712b34ce77d-config-data\") pod \"90cdbc73-317b-4479-9908-3712b34ce77d\" (UID: \"90cdbc73-317b-4479-9908-3712b34ce77d\") " Feb 26 16:04:45 crc kubenswrapper[4907]: I0226 16:04:45.070280 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jwlhz\" (UniqueName: \"kubernetes.io/projected/90cdbc73-317b-4479-9908-3712b34ce77d-kube-api-access-jwlhz\") pod \"90cdbc73-317b-4479-9908-3712b34ce77d\" (UID: \"90cdbc73-317b-4479-9908-3712b34ce77d\") " Feb 26 16:04:45 crc kubenswrapper[4907]: I0226 16:04:45.074118 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/empty-dir/90cdbc73-317b-4479-9908-3712b34ce77d-logs" (OuterVolumeSpecName: "logs") pod "90cdbc73-317b-4479-9908-3712b34ce77d" (UID: "90cdbc73-317b-4479-9908-3712b34ce77d"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 26 16:04:45 crc kubenswrapper[4907]: I0226 16:04:45.074726 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/90cdbc73-317b-4479-9908-3712b34ce77d-scripts" (OuterVolumeSpecName: "scripts") pod "90cdbc73-317b-4479-9908-3712b34ce77d" (UID: "90cdbc73-317b-4479-9908-3712b34ce77d"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 26 16:04:45 crc kubenswrapper[4907]: I0226 16:04:45.074746 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/90cdbc73-317b-4479-9908-3712b34ce77d-config-data" (OuterVolumeSpecName: "config-data") pod "90cdbc73-317b-4479-9908-3712b34ce77d" (UID: "90cdbc73-317b-4479-9908-3712b34ce77d"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 26 16:04:45 crc kubenswrapper[4907]: I0226 16:04:45.085981 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/90cdbc73-317b-4479-9908-3712b34ce77d-horizon-secret-key" (OuterVolumeSpecName: "horizon-secret-key") pod "90cdbc73-317b-4479-9908-3712b34ce77d" (UID: "90cdbc73-317b-4479-9908-3712b34ce77d"). InnerVolumeSpecName "horizon-secret-key". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 26 16:04:45 crc kubenswrapper[4907]: I0226 16:04:45.094748 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/90cdbc73-317b-4479-9908-3712b34ce77d-kube-api-access-jwlhz" (OuterVolumeSpecName: "kube-api-access-jwlhz") pod "90cdbc73-317b-4479-9908-3712b34ce77d" (UID: "90cdbc73-317b-4479-9908-3712b34ce77d"). InnerVolumeSpecName "kube-api-access-jwlhz". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 26 16:04:45 crc kubenswrapper[4907]: I0226 16:04:45.174014 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/b3e0f652-e35c-49b8-abe3-9182b2026d08-logs\") pod \"b3e0f652-e35c-49b8-abe3-9182b2026d08\" (UID: \"b3e0f652-e35c-49b8-abe3-9182b2026d08\") " Feb 26 16:04:45 crc kubenswrapper[4907]: I0226 16:04:45.174092 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/b3e0f652-e35c-49b8-abe3-9182b2026d08-scripts\") pod \"b3e0f652-e35c-49b8-abe3-9182b2026d08\" (UID: \"b3e0f652-e35c-49b8-abe3-9182b2026d08\") " Feb 26 16:04:45 crc kubenswrapper[4907]: I0226 16:04:45.174140 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/b3e0f652-e35c-49b8-abe3-9182b2026d08-config-data\") pod \"b3e0f652-e35c-49b8-abe3-9182b2026d08\" (UID: \"b3e0f652-e35c-49b8-abe3-9182b2026d08\") " Feb 26 16:04:45 crc kubenswrapper[4907]: I0226 16:04:45.174171 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-x7r8q\" (UniqueName: \"kubernetes.io/projected/b3e0f652-e35c-49b8-abe3-9182b2026d08-kube-api-access-x7r8q\") pod \"b3e0f652-e35c-49b8-abe3-9182b2026d08\" (UID: \"b3e0f652-e35c-49b8-abe3-9182b2026d08\") " Feb 26 16:04:45 crc kubenswrapper[4907]: I0226 16:04:45.174386 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/b3e0f652-e35c-49b8-abe3-9182b2026d08-horizon-secret-key\") pod \"b3e0f652-e35c-49b8-abe3-9182b2026d08\" (UID: \"b3e0f652-e35c-49b8-abe3-9182b2026d08\") " Feb 26 16:04:45 crc kubenswrapper[4907]: I0226 16:04:45.175016 4907 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: 
\"kubernetes.io/empty-dir/90cdbc73-317b-4479-9908-3712b34ce77d-logs\") on node \"crc\" DevicePath \"\"" Feb 26 16:04:45 crc kubenswrapper[4907]: I0226 16:04:45.175042 4907 reconciler_common.go:293] "Volume detached for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/90cdbc73-317b-4479-9908-3712b34ce77d-horizon-secret-key\") on node \"crc\" DevicePath \"\"" Feb 26 16:04:45 crc kubenswrapper[4907]: I0226 16:04:45.175056 4907 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/90cdbc73-317b-4479-9908-3712b34ce77d-scripts\") on node \"crc\" DevicePath \"\"" Feb 26 16:04:45 crc kubenswrapper[4907]: I0226 16:04:45.175066 4907 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/90cdbc73-317b-4479-9908-3712b34ce77d-config-data\") on node \"crc\" DevicePath \"\"" Feb 26 16:04:45 crc kubenswrapper[4907]: I0226 16:04:45.175077 4907 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jwlhz\" (UniqueName: \"kubernetes.io/projected/90cdbc73-317b-4479-9908-3712b34ce77d-kube-api-access-jwlhz\") on node \"crc\" DevicePath \"\"" Feb 26 16:04:45 crc kubenswrapper[4907]: I0226 16:04:45.175701 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b3e0f652-e35c-49b8-abe3-9182b2026d08-logs" (OuterVolumeSpecName: "logs") pod "b3e0f652-e35c-49b8-abe3-9182b2026d08" (UID: "b3e0f652-e35c-49b8-abe3-9182b2026d08"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 26 16:04:45 crc kubenswrapper[4907]: I0226 16:04:45.175994 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b3e0f652-e35c-49b8-abe3-9182b2026d08-scripts" (OuterVolumeSpecName: "scripts") pod "b3e0f652-e35c-49b8-abe3-9182b2026d08" (UID: "b3e0f652-e35c-49b8-abe3-9182b2026d08"). InnerVolumeSpecName "scripts". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 26 16:04:45 crc kubenswrapper[4907]: I0226 16:04:45.177121 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b3e0f652-e35c-49b8-abe3-9182b2026d08-config-data" (OuterVolumeSpecName: "config-data") pod "b3e0f652-e35c-49b8-abe3-9182b2026d08" (UID: "b3e0f652-e35c-49b8-abe3-9182b2026d08"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 26 16:04:45 crc kubenswrapper[4907]: I0226 16:04:45.180514 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b3e0f652-e35c-49b8-abe3-9182b2026d08-kube-api-access-x7r8q" (OuterVolumeSpecName: "kube-api-access-x7r8q") pod "b3e0f652-e35c-49b8-abe3-9182b2026d08" (UID: "b3e0f652-e35c-49b8-abe3-9182b2026d08"). InnerVolumeSpecName "kube-api-access-x7r8q". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 26 16:04:45 crc kubenswrapper[4907]: I0226 16:04:45.181835 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b3e0f652-e35c-49b8-abe3-9182b2026d08-horizon-secret-key" (OuterVolumeSpecName: "horizon-secret-key") pod "b3e0f652-e35c-49b8-abe3-9182b2026d08" (UID: "b3e0f652-e35c-49b8-abe3-9182b2026d08"). InnerVolumeSpecName "horizon-secret-key". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 26 16:04:45 crc kubenswrapper[4907]: I0226 16:04:45.276616 4907 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/b3e0f652-e35c-49b8-abe3-9182b2026d08-scripts\") on node \"crc\" DevicePath \"\"" Feb 26 16:04:45 crc kubenswrapper[4907]: I0226 16:04:45.276652 4907 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/b3e0f652-e35c-49b8-abe3-9182b2026d08-config-data\") on node \"crc\" DevicePath \"\"" Feb 26 16:04:45 crc kubenswrapper[4907]: I0226 16:04:45.276663 4907 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-x7r8q\" (UniqueName: \"kubernetes.io/projected/b3e0f652-e35c-49b8-abe3-9182b2026d08-kube-api-access-x7r8q\") on node \"crc\" DevicePath \"\"" Feb 26 16:04:45 crc kubenswrapper[4907]: I0226 16:04:45.276675 4907 reconciler_common.go:293] "Volume detached for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/b3e0f652-e35c-49b8-abe3-9182b2026d08-horizon-secret-key\") on node \"crc\" DevicePath \"\"" Feb 26 16:04:45 crc kubenswrapper[4907]: I0226 16:04:45.276686 4907 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/b3e0f652-e35c-49b8-abe3-9182b2026d08-logs\") on node \"crc\" DevicePath \"\"" Feb 26 16:04:45 crc kubenswrapper[4907]: I0226 16:04:45.869006 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-766d888d6c-8sqt7" event={"ID":"90cdbc73-317b-4479-9908-3712b34ce77d","Type":"ContainerDied","Data":"01aa70d164b3641fb0d01a8d3c7102f4464476aa5a1d213ad74aaf6ade5a2d06"} Feb 26 16:04:45 crc kubenswrapper[4907]: I0226 16:04:45.869096 4907 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/horizon-766d888d6c-8sqt7" Feb 26 16:04:45 crc kubenswrapper[4907]: I0226 16:04:45.884533 4907 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/horizon-785d56fd9c-lc7sg" Feb 26 16:04:45 crc kubenswrapper[4907]: I0226 16:04:45.885141 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-785d56fd9c-lc7sg" event={"ID":"b3e0f652-e35c-49b8-abe3-9182b2026d08","Type":"ContainerDied","Data":"24da7506b0b6159b838185bb5fca55fe31b5e5ab8b7a0bc24800b8ac9138d3a8"} Feb 26 16:04:45 crc kubenswrapper[4907]: E0226 16:04:45.889843 4907 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"barbican-db-sync\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/podified-antelope-centos9/openstack-barbican-api:current-podified\\\"\"" pod="openstack/barbican-db-sync-slrvx" podUID="a02d2622-77ed-4949-95b5-4f5ae5f1c47d" Feb 26 16:04:45 crc kubenswrapper[4907]: I0226 16:04:45.949657 4907 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/horizon-766d888d6c-8sqt7"] Feb 26 16:04:45 crc kubenswrapper[4907]: I0226 16:04:45.959306 4907 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/horizon-766d888d6c-8sqt7"] Feb 26 16:04:46 crc kubenswrapper[4907]: I0226 16:04:46.014369 4907 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/horizon-785d56fd9c-lc7sg"] Feb 26 16:04:46 crc kubenswrapper[4907]: I0226 16:04:46.024121 4907 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/horizon-785d56fd9c-lc7sg"] Feb 26 16:04:46 crc kubenswrapper[4907]: I0226 16:04:46.140850 4907 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="62d9c258-3e92-48cc-a4b2-7207c93a6346" path="/var/lib/kubelet/pods/62d9c258-3e92-48cc-a4b2-7207c93a6346/volumes" Feb 26 16:04:46 crc kubenswrapper[4907]: I0226 16:04:46.142338 4907 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="90cdbc73-317b-4479-9908-3712b34ce77d" path="/var/lib/kubelet/pods/90cdbc73-317b-4479-9908-3712b34ce77d/volumes" Feb 26 16:04:46 crc kubenswrapper[4907]: I0226 16:04:46.143095 4907 
kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b3e0f652-e35c-49b8-abe3-9182b2026d08" path="/var/lib/kubelet/pods/b3e0f652-e35c-49b8-abe3-9182b2026d08/volumes" Feb 26 16:04:46 crc kubenswrapper[4907]: E0226 16:04:46.869228 4907 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/podified-antelope-centos9/openstack-cinder-api:current-podified" Feb 26 16:04:46 crc kubenswrapper[4907]: E0226 16:04:46.869414 4907 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:cinder-db-sync,Image:quay.io/podified-antelope-centos9/openstack-cinder-api:current-podified,Command:[/bin/bash],Args:[-c /usr/local/bin/kolla_set_configs && /usr/local/bin/kolla_start],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:KOLLA_BOOTSTRAP,Value:TRUE,ValueFrom:nil,},EnvVar{Name:KOLLA_CONFIG_STRATEGY,Value:COPY_ALWAYS,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:etc-machine-id,ReadOnly:true,MountPath:/etc/machine-id,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:scripts,ReadOnly:true,MountPath:/usr/local/bin/container-scripts,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:config-data,ReadOnly:true,MountPath:/var/lib/config-data/merged,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:config-data,ReadOnly:true,MountPath:/etc/my.cnf,SubPath:my.cnf,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:db-sync-config-data,ReadOnly:true,MountPath:/etc/cinder/cinder.conf.d,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:config-data,ReadOnly:true,MountPath:/var/lib/kolla/config_files/config.json,SubPath:db-sync-config.json,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},Volu
meMount{Name:combined-ca-bundle,ReadOnly:true,MountPath:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem,SubPath:tls-ca-bundle.pem,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-sfcpk,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:nil,Privileged:nil,SELinuxOptions:nil,RunAsUser:*0,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:nil,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod cinder-db-sync-xvvbl_openstack(c98fd629-273b-4c87-a07c-4a482064a5a3): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Feb 26 16:04:46 crc kubenswrapper[4907]: E0226 16:04:46.870786 4907 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"cinder-db-sync\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack/cinder-db-sync-xvvbl" podUID="c98fd629-273b-4c87-a07c-4a482064a5a3" Feb 26 16:04:46 crc kubenswrapper[4907]: I0226 16:04:46.940166 4907 scope.go:117] "RemoveContainer" containerID="8ee41a2cda68419d1d2f0335fc3443029fa9e5671b0a1f151d84918008aee590" Feb 26 16:04:46 crc kubenswrapper[4907]: E0226 16:04:46.940763 4907 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"cinder-db-sync\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/podified-antelope-centos9/openstack-cinder-api:current-podified\\\"\"" 
pod="openstack/cinder-db-sync-xvvbl" podUID="c98fd629-273b-4c87-a07c-4a482064a5a3" Feb 26 16:04:47 crc kubenswrapper[4907]: I0226 16:04:47.113086 4907 scope.go:117] "RemoveContainer" containerID="f475f7cfb1e058af1119bda095b6f8623f43674bb4901a8294f0bf24d8e55701" Feb 26 16:04:47 crc kubenswrapper[4907]: I0226 16:04:47.158057 4907 scope.go:117] "RemoveContainer" containerID="c7b7197fc16c5531ccf4f45093fd0c8f8d3d99749cd680b025e8890044887cce" Feb 26 16:04:47 crc kubenswrapper[4907]: I0226 16:04:47.231429 4907 scope.go:117] "RemoveContainer" containerID="f2f7a47e0fc218334665a617bdfc44a9c0081ee1fa2a37cc44ccaab1cd2c78b1" Feb 26 16:04:47 crc kubenswrapper[4907]: I0226 16:04:47.425730 4907 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/horizon-6fccfb8496-4tqhr"] Feb 26 16:04:47 crc kubenswrapper[4907]: I0226 16:04:47.613445 4907 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/horizon-76d88967b8-wmzcw"] Feb 26 16:04:47 crc kubenswrapper[4907]: I0226 16:04:47.650506 4907 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-bootstrap-dwb5n"] Feb 26 16:04:47 crc kubenswrapper[4907]: I0226 16:04:47.660468 4907 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-external-api-0"] Feb 26 16:04:47 crc kubenswrapper[4907]: I0226 16:04:47.913367 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-sync-6t72w" event={"ID":"4cbb7c75-3f73-4181-b214-cdfb8d9ffd9a","Type":"ContainerStarted","Data":"dab876c894465becdf3105b0fa5d4916964a7cd76e622ecfca3a9597922b7d13"} Feb 26 16:04:47 crc kubenswrapper[4907]: I0226 16:04:47.916943 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-dwb5n" event={"ID":"e0a55626-b305-4e22-aec1-24832bec9a9f","Type":"ContainerStarted","Data":"a6723ee64be82b655a1b30a04dd34c73206edf2a85f7fc50689b6d1a1d6df10a"} Feb 26 16:04:47 crc kubenswrapper[4907]: I0226 16:04:47.916978 4907 kubelet.go:2453] "SyncLoop (PLEG): event for 
pod" pod="openstack/keystone-bootstrap-dwb5n" event={"ID":"e0a55626-b305-4e22-aec1-24832bec9a9f","Type":"ContainerStarted","Data":"227321391fe768e307f691c67089bb81f55ae394fca74e845c90bd8e067c7d0b"} Feb 26 16:04:47 crc kubenswrapper[4907]: I0226 16:04:47.918247 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"361750c4-3d82-437e-abc0-4e20302d20cf","Type":"ContainerStarted","Data":"2eedba418a9ee29b10b82662a95c406340fe1d301e7879dc616010c9ba3b8792"} Feb 26 16:04:47 crc kubenswrapper[4907]: I0226 16:04:47.925029 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-6fccfb8496-4tqhr" event={"ID":"911d5df8-d8e2-4552-9c75-33c5ab72646b","Type":"ContainerStarted","Data":"a10277d73a2ffb2051463a8d07d910b9357b81428ff49db09862fcbccced53ff"} Feb 26 16:04:47 crc kubenswrapper[4907]: I0226 16:04:47.933033 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-76d88967b8-wmzcw" event={"ID":"b35f87c4-e535-4901-8814-0b321b201158","Type":"ContainerStarted","Data":"27ff02a081abc13eb6e0bbf653a995f4b273b9e2341ba135919bde79e0418fa6"} Feb 26 16:04:47 crc kubenswrapper[4907]: I0226 16:04:47.945524 4907 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/placement-db-sync-6t72w" podStartSLOduration=3.5569938629999998 podStartE2EDuration="49.945505874s" podCreationTimestamp="2026-02-26 16:03:58 +0000 UTC" firstStartedPulling="2026-02-26 16:04:00.551873854 +0000 UTC m=+1303.070435703" lastFinishedPulling="2026-02-26 16:04:46.940385865 +0000 UTC m=+1349.458947714" observedRunningTime="2026-02-26 16:04:47.93320864 +0000 UTC m=+1350.451770489" watchObservedRunningTime="2026-02-26 16:04:47.945505874 +0000 UTC m=+1350.464067723" Feb 26 16:04:47 crc kubenswrapper[4907]: I0226 16:04:47.952422 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" 
event={"ID":"429e4875-18c7-4a0a-bfea-135d7aec6ba0","Type":"ContainerStarted","Data":"6a86b7b8900988216a5e3f196d54892989a14bb69517093b0a2fb2792a439ae8"} Feb 26 16:04:47 crc kubenswrapper[4907]: I0226 16:04:47.958147 4907 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/keystone-bootstrap-dwb5n" podStartSLOduration=28.958127425 podStartE2EDuration="28.958127425s" podCreationTimestamp="2026-02-26 16:04:19 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-26 16:04:47.951884416 +0000 UTC m=+1350.470446265" watchObservedRunningTime="2026-02-26 16:04:47.958127425 +0000 UTC m=+1350.476689274" Feb 26 16:04:48 crc kubenswrapper[4907]: I0226 16:04:48.461950 4907 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-internal-api-0"] Feb 26 16:04:48 crc kubenswrapper[4907]: W0226 16:04:48.464602 4907 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod2b1253ca_7753_4742_afc4_e786e4dcc6e0.slice/crio-28df9e7d43daead14430fe810b52816e3c44f0f7b378b1dd2c357bd86dd166f1 WatchSource:0}: Error finding container 28df9e7d43daead14430fe810b52816e3c44f0f7b378b1dd2c357bd86dd166f1: Status 404 returned error can't find the container with id 28df9e7d43daead14430fe810b52816e3c44f0f7b378b1dd2c357bd86dd166f1 Feb 26 16:04:48 crc kubenswrapper[4907]: I0226 16:04:48.530258 4907 patch_prober.go:28] interesting pod/machine-config-daemon-v5ng6 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 26 16:04:48 crc kubenswrapper[4907]: I0226 16:04:48.530323 4907 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-v5ng6" 
podUID="917eebf3-db36-47b8-af0a-b80d042fddab" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 26 16:04:48 crc kubenswrapper[4907]: I0226 16:04:48.667434 4907 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/dnsmasq-dns-b8fbc5445-xx2fj" podUID="62d9c258-3e92-48cc-a4b2-7207c93a6346" containerName="dnsmasq-dns" probeResult="failure" output="dial tcp 10.217.0.115:5353: i/o timeout" Feb 26 16:04:48 crc kubenswrapper[4907]: I0226 16:04:48.964776 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"2b1253ca-7753-4742-afc4-e786e4dcc6e0","Type":"ContainerStarted","Data":"28df9e7d43daead14430fe810b52816e3c44f0f7b378b1dd2c357bd86dd166f1"} Feb 26 16:04:48 crc kubenswrapper[4907]: I0226 16:04:48.967317 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-6fccfb8496-4tqhr" event={"ID":"911d5df8-d8e2-4552-9c75-33c5ab72646b","Type":"ContainerStarted","Data":"3f95094dd73a53aa831d3c7f002970271a280a470ee37d101788fd1290991f04"} Feb 26 16:04:48 crc kubenswrapper[4907]: I0226 16:04:48.977545 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"361750c4-3d82-437e-abc0-4e20302d20cf","Type":"ContainerStarted","Data":"5b645c4cc55c466b58e79b5f1292c773cf90139e56d5e08d260e34f754fdac57"} Feb 26 16:04:50 crc kubenswrapper[4907]: I0226 16:04:50.005203 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-76d88967b8-wmzcw" event={"ID":"b35f87c4-e535-4901-8814-0b321b201158","Type":"ContainerStarted","Data":"9144e6b639b970283c01c629c241a2a9219d9cb2523695382ab009f93d3cc3eb"} Feb 26 16:04:50 crc kubenswrapper[4907]: I0226 16:04:50.027956 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" 
event={"ID":"429e4875-18c7-4a0a-bfea-135d7aec6ba0","Type":"ContainerStarted","Data":"738b664b1aa529968ea7a0fe87f5d35158f6fc7d127775ad3c58c9db205eeeb8"} Feb 26 16:04:50 crc kubenswrapper[4907]: I0226 16:04:50.033052 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"2b1253ca-7753-4742-afc4-e786e4dcc6e0","Type":"ContainerStarted","Data":"2b1e2a238cf4f1c016f462472860a67cb5f341cf5f7c9b5a6d7a1cc54338beaa"} Feb 26 16:04:50 crc kubenswrapper[4907]: I0226 16:04:50.038873 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-6fccfb8496-4tqhr" event={"ID":"911d5df8-d8e2-4552-9c75-33c5ab72646b","Type":"ContainerStarted","Data":"5f606b9ab89532e105117c7cf76e6d48e275002733a615d726e58c1777c18aad"} Feb 26 16:04:50 crc kubenswrapper[4907]: I0226 16:04:50.066742 4907 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/horizon-6fccfb8496-4tqhr" podStartSLOduration=42.568160796 podStartE2EDuration="43.066725489s" podCreationTimestamp="2026-02-26 16:04:07 +0000 UTC" firstStartedPulling="2026-02-26 16:04:47.435736753 +0000 UTC m=+1349.954298602" lastFinishedPulling="2026-02-26 16:04:47.934301446 +0000 UTC m=+1350.452863295" observedRunningTime="2026-02-26 16:04:50.065629682 +0000 UTC m=+1352.584191531" watchObservedRunningTime="2026-02-26 16:04:50.066725489 +0000 UTC m=+1352.585287338" Feb 26 16:04:51 crc kubenswrapper[4907]: I0226 16:04:51.049989 4907 generic.go:334] "Generic (PLEG): container finished" podID="1ae29e7c-7f4a-492f-b10d-2badd4d606aa" containerID="7d8384c340f47ec5e939b3ead6a4f8659accead8c6b60732d40de8114e9324a0" exitCode=0 Feb 26 16:04:51 crc kubenswrapper[4907]: I0226 16:04:51.050051 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-sync-sg95t" event={"ID":"1ae29e7c-7f4a-492f-b10d-2badd4d606aa","Type":"ContainerDied","Data":"7d8384c340f47ec5e939b3ead6a4f8659accead8c6b60732d40de8114e9324a0"} Feb 26 16:04:51 crc 
kubenswrapper[4907]: I0226 16:04:51.052851 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"361750c4-3d82-437e-abc0-4e20302d20cf","Type":"ContainerStarted","Data":"bcdf7f251072c281b799d39208a89ff8fa1387f0ca8230ec0b2263b3f0d3c06e"} Feb 26 16:04:51 crc kubenswrapper[4907]: I0226 16:04:51.055687 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"2b1253ca-7753-4742-afc4-e786e4dcc6e0","Type":"ContainerStarted","Data":"61eed4a9713c166c961dabc5fe450a1963bba8e9756ddc8b568dd385b092c384"} Feb 26 16:04:51 crc kubenswrapper[4907]: I0226 16:04:51.060159 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-76d88967b8-wmzcw" event={"ID":"b35f87c4-e535-4901-8814-0b321b201158","Type":"ContainerStarted","Data":"c2b6ec3e96a2871e49421792b819e7d8811902b2acc4ebf5cb6213f4794ef38f"} Feb 26 16:04:51 crc kubenswrapper[4907]: I0226 16:04:51.099886 4907 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-default-internal-api-0" podStartSLOduration=29.099867686 podStartE2EDuration="29.099867686s" podCreationTimestamp="2026-02-26 16:04:22 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-26 16:04:51.098688717 +0000 UTC m=+1353.617250586" watchObservedRunningTime="2026-02-26 16:04:51.099867686 +0000 UTC m=+1353.618429535" Feb 26 16:04:51 crc kubenswrapper[4907]: I0226 16:04:51.134605 4907 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/horizon-76d88967b8-wmzcw" podStartSLOduration=42.987233581 podStartE2EDuration="44.134575854s" podCreationTimestamp="2026-02-26 16:04:07 +0000 UTC" firstStartedPulling="2026-02-26 16:04:47.563372577 +0000 UTC m=+1350.081934416" lastFinishedPulling="2026-02-26 16:04:48.71071484 +0000 UTC m=+1351.229276689" observedRunningTime="2026-02-26 
16:04:51.131392958 +0000 UTC m=+1353.649954807" watchObservedRunningTime="2026-02-26 16:04:51.134575854 +0000 UTC m=+1353.653137703" Feb 26 16:04:51 crc kubenswrapper[4907]: I0226 16:04:51.158508 4907 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-default-external-api-0" podStartSLOduration=26.158491074 podStartE2EDuration="26.158491074s" podCreationTimestamp="2026-02-26 16:04:25 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-26 16:04:51.157196283 +0000 UTC m=+1353.675758142" watchObservedRunningTime="2026-02-26 16:04:51.158491074 +0000 UTC m=+1353.677052923" Feb 26 16:04:52 crc kubenswrapper[4907]: I0226 16:04:52.473847 4907 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-db-sync-sg95t" Feb 26 16:04:52 crc kubenswrapper[4907]: I0226 16:04:52.548361 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/1ae29e7c-7f4a-492f-b10d-2badd4d606aa-config\") pod \"1ae29e7c-7f4a-492f-b10d-2badd4d606aa\" (UID: \"1ae29e7c-7f4a-492f-b10d-2badd4d606aa\") " Feb 26 16:04:52 crc kubenswrapper[4907]: I0226 16:04:52.548523 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1ae29e7c-7f4a-492f-b10d-2badd4d606aa-combined-ca-bundle\") pod \"1ae29e7c-7f4a-492f-b10d-2badd4d606aa\" (UID: \"1ae29e7c-7f4a-492f-b10d-2badd4d606aa\") " Feb 26 16:04:52 crc kubenswrapper[4907]: I0226 16:04:52.548575 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-kwr5c\" (UniqueName: \"kubernetes.io/projected/1ae29e7c-7f4a-492f-b10d-2badd4d606aa-kube-api-access-kwr5c\") pod \"1ae29e7c-7f4a-492f-b10d-2badd4d606aa\" (UID: \"1ae29e7c-7f4a-492f-b10d-2badd4d606aa\") " Feb 26 16:04:52 crc kubenswrapper[4907]: I0226 
16:04:52.568201 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1ae29e7c-7f4a-492f-b10d-2badd4d606aa-kube-api-access-kwr5c" (OuterVolumeSpecName: "kube-api-access-kwr5c") pod "1ae29e7c-7f4a-492f-b10d-2badd4d606aa" (UID: "1ae29e7c-7f4a-492f-b10d-2badd4d606aa"). InnerVolumeSpecName "kube-api-access-kwr5c". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 26 16:04:52 crc kubenswrapper[4907]: I0226 16:04:52.594730 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1ae29e7c-7f4a-492f-b10d-2badd4d606aa-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "1ae29e7c-7f4a-492f-b10d-2badd4d606aa" (UID: "1ae29e7c-7f4a-492f-b10d-2badd4d606aa"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 26 16:04:52 crc kubenswrapper[4907]: I0226 16:04:52.604710 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1ae29e7c-7f4a-492f-b10d-2badd4d606aa-config" (OuterVolumeSpecName: "config") pod "1ae29e7c-7f4a-492f-b10d-2badd4d606aa" (UID: "1ae29e7c-7f4a-492f-b10d-2badd4d606aa"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 26 16:04:52 crc kubenswrapper[4907]: I0226 16:04:52.650384 4907 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-kwr5c\" (UniqueName: \"kubernetes.io/projected/1ae29e7c-7f4a-492f-b10d-2badd4d606aa-kube-api-access-kwr5c\") on node \"crc\" DevicePath \"\"" Feb 26 16:04:52 crc kubenswrapper[4907]: I0226 16:04:52.650635 4907 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/secret/1ae29e7c-7f4a-492f-b10d-2badd4d606aa-config\") on node \"crc\" DevicePath \"\"" Feb 26 16:04:52 crc kubenswrapper[4907]: I0226 16:04:52.650762 4907 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1ae29e7c-7f4a-492f-b10d-2badd4d606aa-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 26 16:04:53 crc kubenswrapper[4907]: I0226 16:04:53.083548 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-sync-sg95t" event={"ID":"1ae29e7c-7f4a-492f-b10d-2badd4d606aa","Type":"ContainerDied","Data":"9833f807c8825f42dda8377b7673fa0cbcdf5e3409f919fa6a013360f910b6df"} Feb 26 16:04:53 crc kubenswrapper[4907]: I0226 16:04:53.083613 4907 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="9833f807c8825f42dda8377b7673fa0cbcdf5e3409f919fa6a013360f910b6df" Feb 26 16:04:53 crc kubenswrapper[4907]: I0226 16:04:53.083681 4907 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-db-sync-sg95t" Feb 26 16:04:53 crc kubenswrapper[4907]: I0226 16:04:53.255751 4907 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-internal-api-0" Feb 26 16:04:53 crc kubenswrapper[4907]: I0226 16:04:53.255799 4907 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-internal-api-0" Feb 26 16:04:53 crc kubenswrapper[4907]: I0226 16:04:53.255811 4907 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-internal-api-0" Feb 26 16:04:53 crc kubenswrapper[4907]: I0226 16:04:53.255819 4907 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-internal-api-0" Feb 26 16:04:53 crc kubenswrapper[4907]: I0226 16:04:53.324708 4907 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-6b7b667979-qdsb5"] Feb 26 16:04:53 crc kubenswrapper[4907]: E0226 16:04:53.325029 4907 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="62d9c258-3e92-48cc-a4b2-7207c93a6346" containerName="init" Feb 26 16:04:53 crc kubenswrapper[4907]: I0226 16:04:53.325041 4907 state_mem.go:107] "Deleted CPUSet assignment" podUID="62d9c258-3e92-48cc-a4b2-7207c93a6346" containerName="init" Feb 26 16:04:53 crc kubenswrapper[4907]: E0226 16:04:53.325073 4907 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="62d9c258-3e92-48cc-a4b2-7207c93a6346" containerName="dnsmasq-dns" Feb 26 16:04:53 crc kubenswrapper[4907]: I0226 16:04:53.325079 4907 state_mem.go:107] "Deleted CPUSet assignment" podUID="62d9c258-3e92-48cc-a4b2-7207c93a6346" containerName="dnsmasq-dns" Feb 26 16:04:53 crc kubenswrapper[4907]: E0226 16:04:53.325092 4907 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1ae29e7c-7f4a-492f-b10d-2badd4d606aa" containerName="neutron-db-sync" Feb 26 16:04:53 crc kubenswrapper[4907]: I0226 16:04:53.325099 4907 
state_mem.go:107] "Deleted CPUSet assignment" podUID="1ae29e7c-7f4a-492f-b10d-2badd4d606aa" containerName="neutron-db-sync" Feb 26 16:04:53 crc kubenswrapper[4907]: I0226 16:04:53.325249 4907 memory_manager.go:354] "RemoveStaleState removing state" podUID="1ae29e7c-7f4a-492f-b10d-2badd4d606aa" containerName="neutron-db-sync" Feb 26 16:04:53 crc kubenswrapper[4907]: I0226 16:04:53.325273 4907 memory_manager.go:354] "RemoveStaleState removing state" podUID="62d9c258-3e92-48cc-a4b2-7207c93a6346" containerName="dnsmasq-dns" Feb 26 16:04:53 crc kubenswrapper[4907]: I0226 16:04:53.326111 4907 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-6b7b667979-qdsb5" Feb 26 16:04:53 crc kubenswrapper[4907]: I0226 16:04:53.365367 4907 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-internal-api-0" Feb 26 16:04:53 crc kubenswrapper[4907]: I0226 16:04:53.365573 4907 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-6b7b667979-qdsb5"] Feb 26 16:04:53 crc kubenswrapper[4907]: I0226 16:04:53.374774 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/72c07a62-59c5-47d0-8c74-766322267226-dns-svc\") pod \"dnsmasq-dns-6b7b667979-qdsb5\" (UID: \"72c07a62-59c5-47d0-8c74-766322267226\") " pod="openstack/dnsmasq-dns-6b7b667979-qdsb5" Feb 26 16:04:53 crc kubenswrapper[4907]: I0226 16:04:53.375018 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/72c07a62-59c5-47d0-8c74-766322267226-dns-swift-storage-0\") pod \"dnsmasq-dns-6b7b667979-qdsb5\" (UID: \"72c07a62-59c5-47d0-8c74-766322267226\") " pod="openstack/dnsmasq-dns-6b7b667979-qdsb5" Feb 26 16:04:53 crc kubenswrapper[4907]: I0226 16:04:53.375065 4907 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/72c07a62-59c5-47d0-8c74-766322267226-ovsdbserver-nb\") pod \"dnsmasq-dns-6b7b667979-qdsb5\" (UID: \"72c07a62-59c5-47d0-8c74-766322267226\") " pod="openstack/dnsmasq-dns-6b7b667979-qdsb5" Feb 26 16:04:53 crc kubenswrapper[4907]: I0226 16:04:53.375090 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ns7zc\" (UniqueName: \"kubernetes.io/projected/72c07a62-59c5-47d0-8c74-766322267226-kube-api-access-ns7zc\") pod \"dnsmasq-dns-6b7b667979-qdsb5\" (UID: \"72c07a62-59c5-47d0-8c74-766322267226\") " pod="openstack/dnsmasq-dns-6b7b667979-qdsb5" Feb 26 16:04:53 crc kubenswrapper[4907]: I0226 16:04:53.375116 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/72c07a62-59c5-47d0-8c74-766322267226-ovsdbserver-sb\") pod \"dnsmasq-dns-6b7b667979-qdsb5\" (UID: \"72c07a62-59c5-47d0-8c74-766322267226\") " pod="openstack/dnsmasq-dns-6b7b667979-qdsb5" Feb 26 16:04:53 crc kubenswrapper[4907]: I0226 16:04:53.375221 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/72c07a62-59c5-47d0-8c74-766322267226-config\") pod \"dnsmasq-dns-6b7b667979-qdsb5\" (UID: \"72c07a62-59c5-47d0-8c74-766322267226\") " pod="openstack/dnsmasq-dns-6b7b667979-qdsb5" Feb 26 16:04:53 crc kubenswrapper[4907]: I0226 16:04:53.393681 4907 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-internal-api-0" Feb 26 16:04:53 crc kubenswrapper[4907]: I0226 16:04:53.478679 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/72c07a62-59c5-47d0-8c74-766322267226-config\") pod \"dnsmasq-dns-6b7b667979-qdsb5\" 
(UID: \"72c07a62-59c5-47d0-8c74-766322267226\") " pod="openstack/dnsmasq-dns-6b7b667979-qdsb5" Feb 26 16:04:53 crc kubenswrapper[4907]: I0226 16:04:53.479600 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/72c07a62-59c5-47d0-8c74-766322267226-dns-svc\") pod \"dnsmasq-dns-6b7b667979-qdsb5\" (UID: \"72c07a62-59c5-47d0-8c74-766322267226\") " pod="openstack/dnsmasq-dns-6b7b667979-qdsb5" Feb 26 16:04:53 crc kubenswrapper[4907]: I0226 16:04:53.479454 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/72c07a62-59c5-47d0-8c74-766322267226-config\") pod \"dnsmasq-dns-6b7b667979-qdsb5\" (UID: \"72c07a62-59c5-47d0-8c74-766322267226\") " pod="openstack/dnsmasq-dns-6b7b667979-qdsb5" Feb 26 16:04:53 crc kubenswrapper[4907]: I0226 16:04:53.479776 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/72c07a62-59c5-47d0-8c74-766322267226-dns-swift-storage-0\") pod \"dnsmasq-dns-6b7b667979-qdsb5\" (UID: \"72c07a62-59c5-47d0-8c74-766322267226\") " pod="openstack/dnsmasq-dns-6b7b667979-qdsb5" Feb 26 16:04:53 crc kubenswrapper[4907]: I0226 16:04:53.479821 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/72c07a62-59c5-47d0-8c74-766322267226-ovsdbserver-nb\") pod \"dnsmasq-dns-6b7b667979-qdsb5\" (UID: \"72c07a62-59c5-47d0-8c74-766322267226\") " pod="openstack/dnsmasq-dns-6b7b667979-qdsb5" Feb 26 16:04:53 crc kubenswrapper[4907]: I0226 16:04:53.479838 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ns7zc\" (UniqueName: \"kubernetes.io/projected/72c07a62-59c5-47d0-8c74-766322267226-kube-api-access-ns7zc\") pod \"dnsmasq-dns-6b7b667979-qdsb5\" (UID: \"72c07a62-59c5-47d0-8c74-766322267226\") " 
pod="openstack/dnsmasq-dns-6b7b667979-qdsb5" Feb 26 16:04:53 crc kubenswrapper[4907]: I0226 16:04:53.479861 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/72c07a62-59c5-47d0-8c74-766322267226-ovsdbserver-sb\") pod \"dnsmasq-dns-6b7b667979-qdsb5\" (UID: \"72c07a62-59c5-47d0-8c74-766322267226\") " pod="openstack/dnsmasq-dns-6b7b667979-qdsb5" Feb 26 16:04:53 crc kubenswrapper[4907]: I0226 16:04:53.480170 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/72c07a62-59c5-47d0-8c74-766322267226-dns-svc\") pod \"dnsmasq-dns-6b7b667979-qdsb5\" (UID: \"72c07a62-59c5-47d0-8c74-766322267226\") " pod="openstack/dnsmasq-dns-6b7b667979-qdsb5" Feb 26 16:04:53 crc kubenswrapper[4907]: I0226 16:04:53.480445 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/72c07a62-59c5-47d0-8c74-766322267226-ovsdbserver-sb\") pod \"dnsmasq-dns-6b7b667979-qdsb5\" (UID: \"72c07a62-59c5-47d0-8c74-766322267226\") " pod="openstack/dnsmasq-dns-6b7b667979-qdsb5" Feb 26 16:04:53 crc kubenswrapper[4907]: I0226 16:04:53.481187 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/72c07a62-59c5-47d0-8c74-766322267226-dns-swift-storage-0\") pod \"dnsmasq-dns-6b7b667979-qdsb5\" (UID: \"72c07a62-59c5-47d0-8c74-766322267226\") " pod="openstack/dnsmasq-dns-6b7b667979-qdsb5" Feb 26 16:04:53 crc kubenswrapper[4907]: I0226 16:04:53.481488 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/72c07a62-59c5-47d0-8c74-766322267226-ovsdbserver-nb\") pod \"dnsmasq-dns-6b7b667979-qdsb5\" (UID: \"72c07a62-59c5-47d0-8c74-766322267226\") " pod="openstack/dnsmasq-dns-6b7b667979-qdsb5" Feb 26 16:04:53 crc kubenswrapper[4907]: 
I0226 16:04:53.523578 4907 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/neutron-6dbb49ff7b-8r7kc"] Feb 26 16:04:53 crc kubenswrapper[4907]: I0226 16:04:53.531751 4907 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-6dbb49ff7b-8r7kc" Feb 26 16:04:53 crc kubenswrapper[4907]: I0226 16:04:53.539969 4907 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-httpd-config" Feb 26 16:04:53 crc kubenswrapper[4907]: I0226 16:04:53.540227 4907 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-config" Feb 26 16:04:53 crc kubenswrapper[4907]: I0226 16:04:53.540407 4907 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-neutron-ovndbs" Feb 26 16:04:53 crc kubenswrapper[4907]: I0226 16:04:53.540521 4907 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-neutron-dockercfg-4ppm5" Feb 26 16:04:53 crc kubenswrapper[4907]: I0226 16:04:53.561403 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ns7zc\" (UniqueName: \"kubernetes.io/projected/72c07a62-59c5-47d0-8c74-766322267226-kube-api-access-ns7zc\") pod \"dnsmasq-dns-6b7b667979-qdsb5\" (UID: \"72c07a62-59c5-47d0-8c74-766322267226\") " pod="openstack/dnsmasq-dns-6b7b667979-qdsb5" Feb 26 16:04:53 crc kubenswrapper[4907]: I0226 16:04:53.583304 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/41b49bfa-e783-4c0f-a0f6-f8dfdd5771d3-combined-ca-bundle\") pod \"neutron-6dbb49ff7b-8r7kc\" (UID: \"41b49bfa-e783-4c0f-a0f6-f8dfdd5771d3\") " pod="openstack/neutron-6dbb49ff7b-8r7kc" Feb 26 16:04:53 crc kubenswrapper[4907]: I0226 16:04:53.583451 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qtdmw\" (UniqueName: 
\"kubernetes.io/projected/41b49bfa-e783-4c0f-a0f6-f8dfdd5771d3-kube-api-access-qtdmw\") pod \"neutron-6dbb49ff7b-8r7kc\" (UID: \"41b49bfa-e783-4c0f-a0f6-f8dfdd5771d3\") " pod="openstack/neutron-6dbb49ff7b-8r7kc" Feb 26 16:04:53 crc kubenswrapper[4907]: I0226 16:04:53.583596 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/41b49bfa-e783-4c0f-a0f6-f8dfdd5771d3-ovndb-tls-certs\") pod \"neutron-6dbb49ff7b-8r7kc\" (UID: \"41b49bfa-e783-4c0f-a0f6-f8dfdd5771d3\") " pod="openstack/neutron-6dbb49ff7b-8r7kc" Feb 26 16:04:53 crc kubenswrapper[4907]: I0226 16:04:53.583764 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/41b49bfa-e783-4c0f-a0f6-f8dfdd5771d3-config\") pod \"neutron-6dbb49ff7b-8r7kc\" (UID: \"41b49bfa-e783-4c0f-a0f6-f8dfdd5771d3\") " pod="openstack/neutron-6dbb49ff7b-8r7kc" Feb 26 16:04:53 crc kubenswrapper[4907]: I0226 16:04:53.583792 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/41b49bfa-e783-4c0f-a0f6-f8dfdd5771d3-httpd-config\") pod \"neutron-6dbb49ff7b-8r7kc\" (UID: \"41b49bfa-e783-4c0f-a0f6-f8dfdd5771d3\") " pod="openstack/neutron-6dbb49ff7b-8r7kc" Feb 26 16:04:53 crc kubenswrapper[4907]: I0226 16:04:53.613661 4907 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-6dbb49ff7b-8r7kc"] Feb 26 16:04:53 crc kubenswrapper[4907]: I0226 16:04:53.651302 4907 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-6b7b667979-qdsb5" Feb 26 16:04:53 crc kubenswrapper[4907]: I0226 16:04:53.686218 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/41b49bfa-e783-4c0f-a0f6-f8dfdd5771d3-combined-ca-bundle\") pod \"neutron-6dbb49ff7b-8r7kc\" (UID: \"41b49bfa-e783-4c0f-a0f6-f8dfdd5771d3\") " pod="openstack/neutron-6dbb49ff7b-8r7kc" Feb 26 16:04:53 crc kubenswrapper[4907]: I0226 16:04:53.686272 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qtdmw\" (UniqueName: \"kubernetes.io/projected/41b49bfa-e783-4c0f-a0f6-f8dfdd5771d3-kube-api-access-qtdmw\") pod \"neutron-6dbb49ff7b-8r7kc\" (UID: \"41b49bfa-e783-4c0f-a0f6-f8dfdd5771d3\") " pod="openstack/neutron-6dbb49ff7b-8r7kc" Feb 26 16:04:53 crc kubenswrapper[4907]: I0226 16:04:53.686800 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/41b49bfa-e783-4c0f-a0f6-f8dfdd5771d3-ovndb-tls-certs\") pod \"neutron-6dbb49ff7b-8r7kc\" (UID: \"41b49bfa-e783-4c0f-a0f6-f8dfdd5771d3\") " pod="openstack/neutron-6dbb49ff7b-8r7kc" Feb 26 16:04:53 crc kubenswrapper[4907]: I0226 16:04:53.686895 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/41b49bfa-e783-4c0f-a0f6-f8dfdd5771d3-config\") pod \"neutron-6dbb49ff7b-8r7kc\" (UID: \"41b49bfa-e783-4c0f-a0f6-f8dfdd5771d3\") " pod="openstack/neutron-6dbb49ff7b-8r7kc" Feb 26 16:04:53 crc kubenswrapper[4907]: I0226 16:04:53.686920 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/41b49bfa-e783-4c0f-a0f6-f8dfdd5771d3-httpd-config\") pod \"neutron-6dbb49ff7b-8r7kc\" (UID: \"41b49bfa-e783-4c0f-a0f6-f8dfdd5771d3\") " pod="openstack/neutron-6dbb49ff7b-8r7kc" Feb 26 16:04:53 crc 
kubenswrapper[4907]: I0226 16:04:53.692816 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/41b49bfa-e783-4c0f-a0f6-f8dfdd5771d3-combined-ca-bundle\") pod \"neutron-6dbb49ff7b-8r7kc\" (UID: \"41b49bfa-e783-4c0f-a0f6-f8dfdd5771d3\") " pod="openstack/neutron-6dbb49ff7b-8r7kc" Feb 26 16:04:53 crc kubenswrapper[4907]: I0226 16:04:53.720709 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/secret/41b49bfa-e783-4c0f-a0f6-f8dfdd5771d3-config\") pod \"neutron-6dbb49ff7b-8r7kc\" (UID: \"41b49bfa-e783-4c0f-a0f6-f8dfdd5771d3\") " pod="openstack/neutron-6dbb49ff7b-8r7kc" Feb 26 16:04:53 crc kubenswrapper[4907]: I0226 16:04:53.721271 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/41b49bfa-e783-4c0f-a0f6-f8dfdd5771d3-httpd-config\") pod \"neutron-6dbb49ff7b-8r7kc\" (UID: \"41b49bfa-e783-4c0f-a0f6-f8dfdd5771d3\") " pod="openstack/neutron-6dbb49ff7b-8r7kc" Feb 26 16:04:53 crc kubenswrapper[4907]: I0226 16:04:53.731467 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/41b49bfa-e783-4c0f-a0f6-f8dfdd5771d3-ovndb-tls-certs\") pod \"neutron-6dbb49ff7b-8r7kc\" (UID: \"41b49bfa-e783-4c0f-a0f6-f8dfdd5771d3\") " pod="openstack/neutron-6dbb49ff7b-8r7kc" Feb 26 16:04:53 crc kubenswrapper[4907]: I0226 16:04:53.742621 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qtdmw\" (UniqueName: \"kubernetes.io/projected/41b49bfa-e783-4c0f-a0f6-f8dfdd5771d3-kube-api-access-qtdmw\") pod \"neutron-6dbb49ff7b-8r7kc\" (UID: \"41b49bfa-e783-4c0f-a0f6-f8dfdd5771d3\") " pod="openstack/neutron-6dbb49ff7b-8r7kc" Feb 26 16:04:53 crc kubenswrapper[4907]: I0226 16:04:53.941501 4907 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-6dbb49ff7b-8r7kc" Feb 26 16:04:54 crc kubenswrapper[4907]: I0226 16:04:54.355818 4907 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-6b7b667979-qdsb5"] Feb 26 16:04:54 crc kubenswrapper[4907]: I0226 16:04:54.730500 4907 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-6dbb49ff7b-8r7kc"] Feb 26 16:04:55 crc kubenswrapper[4907]: I0226 16:04:55.112107 4907 generic.go:334] "Generic (PLEG): container finished" podID="72c07a62-59c5-47d0-8c74-766322267226" containerID="f0eb829c22e21a48b9e9adf06599e6d98e845d62ff5475408399a9a5d9f46967" exitCode=0 Feb 26 16:04:55 crc kubenswrapper[4907]: I0226 16:04:55.112367 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6b7b667979-qdsb5" event={"ID":"72c07a62-59c5-47d0-8c74-766322267226","Type":"ContainerDied","Data":"f0eb829c22e21a48b9e9adf06599e6d98e845d62ff5475408399a9a5d9f46967"} Feb 26 16:04:55 crc kubenswrapper[4907]: I0226 16:04:55.112816 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6b7b667979-qdsb5" event={"ID":"72c07a62-59c5-47d0-8c74-766322267226","Type":"ContainerStarted","Data":"88ea84ff9ae452e264570e7cc71bebc39d32b787300de3745d8d1fea1e2ee95e"} Feb 26 16:04:55 crc kubenswrapper[4907]: I0226 16:04:55.127534 4907 generic.go:334] "Generic (PLEG): container finished" podID="4cbb7c75-3f73-4181-b214-cdfb8d9ffd9a" containerID="dab876c894465becdf3105b0fa5d4916964a7cd76e622ecfca3a9597922b7d13" exitCode=0 Feb 26 16:04:55 crc kubenswrapper[4907]: I0226 16:04:55.128194 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-sync-6t72w" event={"ID":"4cbb7c75-3f73-4181-b214-cdfb8d9ffd9a","Type":"ContainerDied","Data":"dab876c894465becdf3105b0fa5d4916964a7cd76e622ecfca3a9597922b7d13"} Feb 26 16:04:55 crc kubenswrapper[4907]: I0226 16:04:55.937480 4907 scope.go:117] "RemoveContainer" 
containerID="4f97dc5b43bb9e39af17c85fe883c1f94bba5cd5baf28e733e55dd9e924078b1" Feb 26 16:04:56 crc kubenswrapper[4907]: I0226 16:04:56.023827 4907 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-external-api-0" Feb 26 16:04:56 crc kubenswrapper[4907]: I0226 16:04:56.024137 4907 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-external-api-0" Feb 26 16:04:56 crc kubenswrapper[4907]: I0226 16:04:56.024148 4907 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-external-api-0" Feb 26 16:04:56 crc kubenswrapper[4907]: I0226 16:04:56.024157 4907 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-external-api-0" Feb 26 16:04:56 crc kubenswrapper[4907]: I0226 16:04:56.054068 4907 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-external-api-0" Feb 26 16:04:56 crc kubenswrapper[4907]: I0226 16:04:56.093016 4907 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-external-api-0" Feb 26 16:04:56 crc kubenswrapper[4907]: I0226 16:04:56.420582 4907 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/neutron-8656797c97-kv5w2"] Feb 26 16:04:56 crc kubenswrapper[4907]: I0226 16:04:56.422272 4907 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-8656797c97-kv5w2" Feb 26 16:04:56 crc kubenswrapper[4907]: I0226 16:04:56.427661 4907 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-neutron-public-svc" Feb 26 16:04:56 crc kubenswrapper[4907]: I0226 16:04:56.427895 4907 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-neutron-internal-svc" Feb 26 16:04:56 crc kubenswrapper[4907]: I0226 16:04:56.434183 4907 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-8656797c97-kv5w2"] Feb 26 16:04:56 crc kubenswrapper[4907]: I0226 16:04:56.543997 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5a680379-891d-45b5-bfac-04c44ab3e5d4-combined-ca-bundle\") pod \"neutron-8656797c97-kv5w2\" (UID: \"5a680379-891d-45b5-bfac-04c44ab3e5d4\") " pod="openstack/neutron-8656797c97-kv5w2" Feb 26 16:04:56 crc kubenswrapper[4907]: I0226 16:04:56.544109 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/5a680379-891d-45b5-bfac-04c44ab3e5d4-httpd-config\") pod \"neutron-8656797c97-kv5w2\" (UID: \"5a680379-891d-45b5-bfac-04c44ab3e5d4\") " pod="openstack/neutron-8656797c97-kv5w2" Feb 26 16:04:56 crc kubenswrapper[4907]: I0226 16:04:56.544133 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/5a680379-891d-45b5-bfac-04c44ab3e5d4-internal-tls-certs\") pod \"neutron-8656797c97-kv5w2\" (UID: \"5a680379-891d-45b5-bfac-04c44ab3e5d4\") " pod="openstack/neutron-8656797c97-kv5w2" Feb 26 16:04:56 crc kubenswrapper[4907]: I0226 16:04:56.544150 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: 
\"kubernetes.io/secret/5a680379-891d-45b5-bfac-04c44ab3e5d4-config\") pod \"neutron-8656797c97-kv5w2\" (UID: \"5a680379-891d-45b5-bfac-04c44ab3e5d4\") " pod="openstack/neutron-8656797c97-kv5w2" Feb 26 16:04:56 crc kubenswrapper[4907]: I0226 16:04:56.544190 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nwbd9\" (UniqueName: \"kubernetes.io/projected/5a680379-891d-45b5-bfac-04c44ab3e5d4-kube-api-access-nwbd9\") pod \"neutron-8656797c97-kv5w2\" (UID: \"5a680379-891d-45b5-bfac-04c44ab3e5d4\") " pod="openstack/neutron-8656797c97-kv5w2" Feb 26 16:04:56 crc kubenswrapper[4907]: I0226 16:04:56.544228 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/5a680379-891d-45b5-bfac-04c44ab3e5d4-public-tls-certs\") pod \"neutron-8656797c97-kv5w2\" (UID: \"5a680379-891d-45b5-bfac-04c44ab3e5d4\") " pod="openstack/neutron-8656797c97-kv5w2" Feb 26 16:04:56 crc kubenswrapper[4907]: I0226 16:04:56.544246 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/5a680379-891d-45b5-bfac-04c44ab3e5d4-ovndb-tls-certs\") pod \"neutron-8656797c97-kv5w2\" (UID: \"5a680379-891d-45b5-bfac-04c44ab3e5d4\") " pod="openstack/neutron-8656797c97-kv5w2" Feb 26 16:04:56 crc kubenswrapper[4907]: I0226 16:04:56.645583 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/5a680379-891d-45b5-bfac-04c44ab3e5d4-config\") pod \"neutron-8656797c97-kv5w2\" (UID: \"5a680379-891d-45b5-bfac-04c44ab3e5d4\") " pod="openstack/neutron-8656797c97-kv5w2" Feb 26 16:04:56 crc kubenswrapper[4907]: I0226 16:04:56.645662 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nwbd9\" (UniqueName: 
\"kubernetes.io/projected/5a680379-891d-45b5-bfac-04c44ab3e5d4-kube-api-access-nwbd9\") pod \"neutron-8656797c97-kv5w2\" (UID: \"5a680379-891d-45b5-bfac-04c44ab3e5d4\") " pod="openstack/neutron-8656797c97-kv5w2" Feb 26 16:04:56 crc kubenswrapper[4907]: I0226 16:04:56.645701 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/5a680379-891d-45b5-bfac-04c44ab3e5d4-public-tls-certs\") pod \"neutron-8656797c97-kv5w2\" (UID: \"5a680379-891d-45b5-bfac-04c44ab3e5d4\") " pod="openstack/neutron-8656797c97-kv5w2" Feb 26 16:04:56 crc kubenswrapper[4907]: I0226 16:04:56.645722 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/5a680379-891d-45b5-bfac-04c44ab3e5d4-ovndb-tls-certs\") pod \"neutron-8656797c97-kv5w2\" (UID: \"5a680379-891d-45b5-bfac-04c44ab3e5d4\") " pod="openstack/neutron-8656797c97-kv5w2" Feb 26 16:04:56 crc kubenswrapper[4907]: I0226 16:04:56.645782 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5a680379-891d-45b5-bfac-04c44ab3e5d4-combined-ca-bundle\") pod \"neutron-8656797c97-kv5w2\" (UID: \"5a680379-891d-45b5-bfac-04c44ab3e5d4\") " pod="openstack/neutron-8656797c97-kv5w2" Feb 26 16:04:56 crc kubenswrapper[4907]: I0226 16:04:56.645828 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/5a680379-891d-45b5-bfac-04c44ab3e5d4-httpd-config\") pod \"neutron-8656797c97-kv5w2\" (UID: \"5a680379-891d-45b5-bfac-04c44ab3e5d4\") " pod="openstack/neutron-8656797c97-kv5w2" Feb 26 16:04:56 crc kubenswrapper[4907]: I0226 16:04:56.645843 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/5a680379-891d-45b5-bfac-04c44ab3e5d4-internal-tls-certs\") pod 
\"neutron-8656797c97-kv5w2\" (UID: \"5a680379-891d-45b5-bfac-04c44ab3e5d4\") " pod="openstack/neutron-8656797c97-kv5w2" Feb 26 16:04:56 crc kubenswrapper[4907]: I0226 16:04:56.653199 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/secret/5a680379-891d-45b5-bfac-04c44ab3e5d4-config\") pod \"neutron-8656797c97-kv5w2\" (UID: \"5a680379-891d-45b5-bfac-04c44ab3e5d4\") " pod="openstack/neutron-8656797c97-kv5w2" Feb 26 16:04:56 crc kubenswrapper[4907]: I0226 16:04:56.653740 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5a680379-891d-45b5-bfac-04c44ab3e5d4-combined-ca-bundle\") pod \"neutron-8656797c97-kv5w2\" (UID: \"5a680379-891d-45b5-bfac-04c44ab3e5d4\") " pod="openstack/neutron-8656797c97-kv5w2" Feb 26 16:04:56 crc kubenswrapper[4907]: I0226 16:04:56.654164 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/5a680379-891d-45b5-bfac-04c44ab3e5d4-public-tls-certs\") pod \"neutron-8656797c97-kv5w2\" (UID: \"5a680379-891d-45b5-bfac-04c44ab3e5d4\") " pod="openstack/neutron-8656797c97-kv5w2" Feb 26 16:04:56 crc kubenswrapper[4907]: I0226 16:04:56.656829 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/5a680379-891d-45b5-bfac-04c44ab3e5d4-ovndb-tls-certs\") pod \"neutron-8656797c97-kv5w2\" (UID: \"5a680379-891d-45b5-bfac-04c44ab3e5d4\") " pod="openstack/neutron-8656797c97-kv5w2" Feb 26 16:04:56 crc kubenswrapper[4907]: I0226 16:04:56.657519 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/5a680379-891d-45b5-bfac-04c44ab3e5d4-httpd-config\") pod \"neutron-8656797c97-kv5w2\" (UID: \"5a680379-891d-45b5-bfac-04c44ab3e5d4\") " pod="openstack/neutron-8656797c97-kv5w2" Feb 26 16:04:56 crc 
kubenswrapper[4907]: I0226 16:04:56.667180 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nwbd9\" (UniqueName: \"kubernetes.io/projected/5a680379-891d-45b5-bfac-04c44ab3e5d4-kube-api-access-nwbd9\") pod \"neutron-8656797c97-kv5w2\" (UID: \"5a680379-891d-45b5-bfac-04c44ab3e5d4\") " pod="openstack/neutron-8656797c97-kv5w2" Feb 26 16:04:56 crc kubenswrapper[4907]: I0226 16:04:56.687487 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/5a680379-891d-45b5-bfac-04c44ab3e5d4-internal-tls-certs\") pod \"neutron-8656797c97-kv5w2\" (UID: \"5a680379-891d-45b5-bfac-04c44ab3e5d4\") " pod="openstack/neutron-8656797c97-kv5w2" Feb 26 16:04:56 crc kubenswrapper[4907]: I0226 16:04:56.825249 4907 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-8656797c97-kv5w2" Feb 26 16:04:57 crc kubenswrapper[4907]: I0226 16:04:57.749042 4907 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/horizon-6fccfb8496-4tqhr" Feb 26 16:04:57 crc kubenswrapper[4907]: I0226 16:04:57.749389 4907 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/horizon-6fccfb8496-4tqhr" Feb 26 16:04:58 crc kubenswrapper[4907]: I0226 16:04:58.154943 4907 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/horizon-76d88967b8-wmzcw" Feb 26 16:04:58 crc kubenswrapper[4907]: I0226 16:04:58.155992 4907 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/horizon-76d88967b8-wmzcw" Feb 26 16:04:59 crc kubenswrapper[4907]: I0226 16:04:59.168214 4907 generic.go:334] "Generic (PLEG): container finished" podID="e0a55626-b305-4e22-aec1-24832bec9a9f" containerID="a6723ee64be82b655a1b30a04dd34c73206edf2a85f7fc50689b6d1a1d6df10a" exitCode=0 Feb 26 16:04:59 crc kubenswrapper[4907]: I0226 16:04:59.168311 4907 kubelet.go:2453] "SyncLoop (PLEG): event for 
pod" pod="openstack/keystone-bootstrap-dwb5n" event={"ID":"e0a55626-b305-4e22-aec1-24832bec9a9f","Type":"ContainerDied","Data":"a6723ee64be82b655a1b30a04dd34c73206edf2a85f7fc50689b6d1a1d6df10a"} Feb 26 16:05:00 crc kubenswrapper[4907]: W0226 16:05:00.470963 4907 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod41b49bfa_e783_4c0f_a0f6_f8dfdd5771d3.slice/crio-63a885ce395c9f796d02aadd511cfc684cb514a02c72dd22708f6eb5b672485b WatchSource:0}: Error finding container 63a885ce395c9f796d02aadd511cfc684cb514a02c72dd22708f6eb5b672485b: Status 404 returned error can't find the container with id 63a885ce395c9f796d02aadd511cfc684cb514a02c72dd22708f6eb5b672485b Feb 26 16:05:00 crc kubenswrapper[4907]: I0226 16:05:00.713354 4907 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-internal-api-0" Feb 26 16:05:00 crc kubenswrapper[4907]: I0226 16:05:00.719140 4907 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-internal-api-0" Feb 26 16:05:00 crc kubenswrapper[4907]: I0226 16:05:00.806049 4907 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/placement-db-sync-6t72w"
Feb 26 16:05:00 crc kubenswrapper[4907]: I0226 16:05:00.829610 4907 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-external-api-0"
Feb 26 16:05:00 crc kubenswrapper[4907]: I0226 16:05:00.829714 4907 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness"
Feb 26 16:05:00 crc kubenswrapper[4907]: I0226 16:05:00.830354 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/4cbb7c75-3f73-4181-b214-cdfb8d9ffd9a-scripts\") pod \"4cbb7c75-3f73-4181-b214-cdfb8d9ffd9a\" (UID: \"4cbb7c75-3f73-4181-b214-cdfb8d9ffd9a\") "
Feb 26 16:05:00 crc kubenswrapper[4907]: I0226 16:05:00.830478 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4cbb7c75-3f73-4181-b214-cdfb8d9ffd9a-combined-ca-bundle\") pod \"4cbb7c75-3f73-4181-b214-cdfb8d9ffd9a\" (UID: \"4cbb7c75-3f73-4181-b214-cdfb8d9ffd9a\") "
Feb 26 16:05:00 crc kubenswrapper[4907]: I0226 16:05:00.830500 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ln7sb\" (UniqueName: \"kubernetes.io/projected/4cbb7c75-3f73-4181-b214-cdfb8d9ffd9a-kube-api-access-ln7sb\") pod \"4cbb7c75-3f73-4181-b214-cdfb8d9ffd9a\" (UID: \"4cbb7c75-3f73-4181-b214-cdfb8d9ffd9a\") "
Feb 26 16:05:00 crc kubenswrapper[4907]: I0226 16:05:00.830624 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/4cbb7c75-3f73-4181-b214-cdfb8d9ffd9a-logs\") pod \"4cbb7c75-3f73-4181-b214-cdfb8d9ffd9a\" (UID: \"4cbb7c75-3f73-4181-b214-cdfb8d9ffd9a\") "
Feb 26 16:05:00 crc kubenswrapper[4907]: I0226 16:05:00.830720 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4cbb7c75-3f73-4181-b214-cdfb8d9ffd9a-config-data\") pod \"4cbb7c75-3f73-4181-b214-cdfb8d9ffd9a\" (UID: \"4cbb7c75-3f73-4181-b214-cdfb8d9ffd9a\") "
Feb 26 16:05:00 crc kubenswrapper[4907]: I0226 16:05:00.843350 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/4cbb7c75-3f73-4181-b214-cdfb8d9ffd9a-logs" (OuterVolumeSpecName: "logs") pod "4cbb7c75-3f73-4181-b214-cdfb8d9ffd9a" (UID: "4cbb7c75-3f73-4181-b214-cdfb8d9ffd9a"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Feb 26 16:05:00 crc kubenswrapper[4907]: I0226 16:05:00.853056 4907 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-external-api-0"
Feb 26 16:05:00 crc kubenswrapper[4907]: I0226 16:05:00.857998 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4cbb7c75-3f73-4181-b214-cdfb8d9ffd9a-kube-api-access-ln7sb" (OuterVolumeSpecName: "kube-api-access-ln7sb") pod "4cbb7c75-3f73-4181-b214-cdfb8d9ffd9a" (UID: "4cbb7c75-3f73-4181-b214-cdfb8d9ffd9a"). InnerVolumeSpecName "kube-api-access-ln7sb". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 26 16:05:00 crc kubenswrapper[4907]: I0226 16:05:00.874981 4907 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-bootstrap-dwb5n"
Feb 26 16:05:00 crc kubenswrapper[4907]: I0226 16:05:00.901828 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4cbb7c75-3f73-4181-b214-cdfb8d9ffd9a-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "4cbb7c75-3f73-4181-b214-cdfb8d9ffd9a" (UID: "4cbb7c75-3f73-4181-b214-cdfb8d9ffd9a"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 26 16:05:00 crc kubenswrapper[4907]: I0226 16:05:00.908822 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4cbb7c75-3f73-4181-b214-cdfb8d9ffd9a-scripts" (OuterVolumeSpecName: "scripts") pod "4cbb7c75-3f73-4181-b214-cdfb8d9ffd9a" (UID: "4cbb7c75-3f73-4181-b214-cdfb8d9ffd9a"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 26 16:05:00 crc kubenswrapper[4907]: I0226 16:05:00.932732 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e0a55626-b305-4e22-aec1-24832bec9a9f-config-data\") pod \"e0a55626-b305-4e22-aec1-24832bec9a9f\" (UID: \"e0a55626-b305-4e22-aec1-24832bec9a9f\") "
Feb 26 16:05:00 crc kubenswrapper[4907]: I0226 16:05:00.932777 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e0a55626-b305-4e22-aec1-24832bec9a9f-combined-ca-bundle\") pod \"e0a55626-b305-4e22-aec1-24832bec9a9f\" (UID: \"e0a55626-b305-4e22-aec1-24832bec9a9f\") "
Feb 26 16:05:00 crc kubenswrapper[4907]: I0226 16:05:00.932824 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/e0a55626-b305-4e22-aec1-24832bec9a9f-fernet-keys\") pod \"e0a55626-b305-4e22-aec1-24832bec9a9f\" (UID: \"e0a55626-b305-4e22-aec1-24832bec9a9f\") "
Feb 26 16:05:00 crc kubenswrapper[4907]: I0226 16:05:00.932932 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-cg2n2\" (UniqueName: \"kubernetes.io/projected/e0a55626-b305-4e22-aec1-24832bec9a9f-kube-api-access-cg2n2\") pod \"e0a55626-b305-4e22-aec1-24832bec9a9f\" (UID: \"e0a55626-b305-4e22-aec1-24832bec9a9f\") "
Feb 26 16:05:00 crc kubenswrapper[4907]: I0226 16:05:00.932971 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/e0a55626-b305-4e22-aec1-24832bec9a9f-credential-keys\") pod \"e0a55626-b305-4e22-aec1-24832bec9a9f\" (UID: \"e0a55626-b305-4e22-aec1-24832bec9a9f\") "
Feb 26 16:05:00 crc kubenswrapper[4907]: I0226 16:05:00.932996 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e0a55626-b305-4e22-aec1-24832bec9a9f-scripts\") pod \"e0a55626-b305-4e22-aec1-24832bec9a9f\" (UID: \"e0a55626-b305-4e22-aec1-24832bec9a9f\") "
Feb 26 16:05:00 crc kubenswrapper[4907]: I0226 16:05:00.933566 4907 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/4cbb7c75-3f73-4181-b214-cdfb8d9ffd9a-scripts\") on node \"crc\" DevicePath \"\""
Feb 26 16:05:00 crc kubenswrapper[4907]: I0226 16:05:00.933602 4907 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4cbb7c75-3f73-4181-b214-cdfb8d9ffd9a-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Feb 26 16:05:00 crc kubenswrapper[4907]: I0226 16:05:00.933616 4907 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ln7sb\" (UniqueName: \"kubernetes.io/projected/4cbb7c75-3f73-4181-b214-cdfb8d9ffd9a-kube-api-access-ln7sb\") on node \"crc\" DevicePath \"\""
Feb 26 16:05:00 crc kubenswrapper[4907]: I0226 16:05:00.933628 4907 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/4cbb7c75-3f73-4181-b214-cdfb8d9ffd9a-logs\") on node \"crc\" DevicePath \"\""
Feb 26 16:05:00 crc kubenswrapper[4907]: I0226 16:05:00.988132 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e0a55626-b305-4e22-aec1-24832bec9a9f-kube-api-access-cg2n2" (OuterVolumeSpecName: "kube-api-access-cg2n2") pod "e0a55626-b305-4e22-aec1-24832bec9a9f" (UID: "e0a55626-b305-4e22-aec1-24832bec9a9f"). InnerVolumeSpecName "kube-api-access-cg2n2". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 26 16:05:00 crc kubenswrapper[4907]: I0226 16:05:00.988218 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e0a55626-b305-4e22-aec1-24832bec9a9f-scripts" (OuterVolumeSpecName: "scripts") pod "e0a55626-b305-4e22-aec1-24832bec9a9f" (UID: "e0a55626-b305-4e22-aec1-24832bec9a9f"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 26 16:05:01 crc kubenswrapper[4907]: I0226 16:05:01.015833 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e0a55626-b305-4e22-aec1-24832bec9a9f-fernet-keys" (OuterVolumeSpecName: "fernet-keys") pod "e0a55626-b305-4e22-aec1-24832bec9a9f" (UID: "e0a55626-b305-4e22-aec1-24832bec9a9f"). InnerVolumeSpecName "fernet-keys". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 26 16:05:01 crc kubenswrapper[4907]: I0226 16:05:01.018318 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e0a55626-b305-4e22-aec1-24832bec9a9f-credential-keys" (OuterVolumeSpecName: "credential-keys") pod "e0a55626-b305-4e22-aec1-24832bec9a9f" (UID: "e0a55626-b305-4e22-aec1-24832bec9a9f"). InnerVolumeSpecName "credential-keys". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 26 16:05:01 crc kubenswrapper[4907]: I0226 16:05:01.044056 4907 reconciler_common.go:293] "Volume detached for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/e0a55626-b305-4e22-aec1-24832bec9a9f-fernet-keys\") on node \"crc\" DevicePath \"\""
Feb 26 16:05:01 crc kubenswrapper[4907]: I0226 16:05:01.044114 4907 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-cg2n2\" (UniqueName: \"kubernetes.io/projected/e0a55626-b305-4e22-aec1-24832bec9a9f-kube-api-access-cg2n2\") on node \"crc\" DevicePath \"\""
Feb 26 16:05:01 crc kubenswrapper[4907]: I0226 16:05:01.044127 4907 reconciler_common.go:293] "Volume detached for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/e0a55626-b305-4e22-aec1-24832bec9a9f-credential-keys\") on node \"crc\" DevicePath \"\""
Feb 26 16:05:01 crc kubenswrapper[4907]: I0226 16:05:01.044135 4907 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e0a55626-b305-4e22-aec1-24832bec9a9f-scripts\") on node \"crc\" DevicePath \"\""
Feb 26 16:05:01 crc kubenswrapper[4907]: I0226 16:05:01.047923 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e0a55626-b305-4e22-aec1-24832bec9a9f-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "e0a55626-b305-4e22-aec1-24832bec9a9f" (UID: "e0a55626-b305-4e22-aec1-24832bec9a9f"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 26 16:05:01 crc kubenswrapper[4907]: I0226 16:05:01.054482 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e0a55626-b305-4e22-aec1-24832bec9a9f-config-data" (OuterVolumeSpecName: "config-data") pod "e0a55626-b305-4e22-aec1-24832bec9a9f" (UID: "e0a55626-b305-4e22-aec1-24832bec9a9f"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 26 16:05:01 crc kubenswrapper[4907]: I0226 16:05:01.067566 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4cbb7c75-3f73-4181-b214-cdfb8d9ffd9a-config-data" (OuterVolumeSpecName: "config-data") pod "4cbb7c75-3f73-4181-b214-cdfb8d9ffd9a" (UID: "4cbb7c75-3f73-4181-b214-cdfb8d9ffd9a"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 26 16:05:01 crc kubenswrapper[4907]: I0226 16:05:01.148931 4907 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e0a55626-b305-4e22-aec1-24832bec9a9f-config-data\") on node \"crc\" DevicePath \"\""
Feb 26 16:05:01 crc kubenswrapper[4907]: I0226 16:05:01.148963 4907 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e0a55626-b305-4e22-aec1-24832bec9a9f-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Feb 26 16:05:01 crc kubenswrapper[4907]: I0226 16:05:01.148973 4907 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4cbb7c75-3f73-4181-b214-cdfb8d9ffd9a-config-data\") on node \"crc\" DevicePath \"\""
Feb 26 16:05:01 crc kubenswrapper[4907]: I0226 16:05:01.212244 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6b7b667979-qdsb5" event={"ID":"72c07a62-59c5-47d0-8c74-766322267226","Type":"ContainerStarted","Data":"d725e999a2b38af5b26a31d0df3f5b6ff6575a18dd3299c2b1a572449d67e5c5"}
Feb 26 16:05:01 crc kubenswrapper[4907]: I0226 16:05:01.212675 4907 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-6b7b667979-qdsb5"
Feb 26 16:05:01 crc kubenswrapper[4907]: I0226 16:05:01.214871 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-sync-6t72w" event={"ID":"4cbb7c75-3f73-4181-b214-cdfb8d9ffd9a","Type":"ContainerDied","Data":"4e5c036af10db7b9bb532574863bd6a6278d5a6d948e407606249b2921e9eb0b"}
Feb 26 16:05:01 crc kubenswrapper[4907]: I0226 16:05:01.214905 4907 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="4e5c036af10db7b9bb532574863bd6a6278d5a6d948e407606249b2921e9eb0b"
Feb 26 16:05:01 crc kubenswrapper[4907]: I0226 16:05:01.214959 4907 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/placement-db-sync-6t72w"
Feb 26 16:05:01 crc kubenswrapper[4907]: I0226 16:05:01.248671 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-6dbb49ff7b-8r7kc" event={"ID":"41b49bfa-e783-4c0f-a0f6-f8dfdd5771d3","Type":"ContainerStarted","Data":"28547d3c9f949eac0b565051b0ecc7b68dbb2b883312a78d6701a837e0cc4239"}
Feb 26 16:05:01 crc kubenswrapper[4907]: I0226 16:05:01.248720 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-6dbb49ff7b-8r7kc" event={"ID":"41b49bfa-e783-4c0f-a0f6-f8dfdd5771d3","Type":"ContainerStarted","Data":"63a885ce395c9f796d02aadd511cfc684cb514a02c72dd22708f6eb5b672485b"}
Feb 26 16:05:01 crc kubenswrapper[4907]: I0226 16:05:01.254412 4907 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-6b7b667979-qdsb5" podStartSLOduration=8.254395935 podStartE2EDuration="8.254395935s" podCreationTimestamp="2026-02-26 16:04:53 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-26 16:05:01.238526176 +0000 UTC m=+1363.757088025" watchObservedRunningTime="2026-02-26 16:05:01.254395935 +0000 UTC m=+1363.772957774"
Feb 26 16:05:01 crc kubenswrapper[4907]: I0226 16:05:01.260451 4907 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-bootstrap-dwb5n"
Feb 26 16:05:01 crc kubenswrapper[4907]: I0226 16:05:01.262659 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-dwb5n" event={"ID":"e0a55626-b305-4e22-aec1-24832bec9a9f","Type":"ContainerDied","Data":"227321391fe768e307f691c67089bb81f55ae394fca74e845c90bd8e067c7d0b"}
Feb 26 16:05:01 crc kubenswrapper[4907]: I0226 16:05:01.262708 4907 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="227321391fe768e307f691c67089bb81f55ae394fca74e845c90bd8e067c7d0b"
Feb 26 16:05:01 crc kubenswrapper[4907]: I0226 16:05:01.402744 4907 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-86f7f47947-xzhlh"]
Feb 26 16:05:01 crc kubenswrapper[4907]: E0226 16:05:01.403985 4907 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e0a55626-b305-4e22-aec1-24832bec9a9f" containerName="keystone-bootstrap"
Feb 26 16:05:01 crc kubenswrapper[4907]: I0226 16:05:01.404018 4907 state_mem.go:107] "Deleted CPUSet assignment" podUID="e0a55626-b305-4e22-aec1-24832bec9a9f" containerName="keystone-bootstrap"
Feb 26 16:05:01 crc kubenswrapper[4907]: E0226 16:05:01.404043 4907 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4cbb7c75-3f73-4181-b214-cdfb8d9ffd9a" containerName="placement-db-sync"
Feb 26 16:05:01 crc kubenswrapper[4907]: I0226 16:05:01.404050 4907 state_mem.go:107] "Deleted CPUSet assignment" podUID="4cbb7c75-3f73-4181-b214-cdfb8d9ffd9a" containerName="placement-db-sync"
Feb 26 16:05:01 crc kubenswrapper[4907]: I0226 16:05:01.404249 4907 memory_manager.go:354] "RemoveStaleState removing state" podUID="4cbb7c75-3f73-4181-b214-cdfb8d9ffd9a" containerName="placement-db-sync"
Feb 26 16:05:01 crc kubenswrapper[4907]: I0226 16:05:01.404289 4907 memory_manager.go:354] "RemoveStaleState removing state" podUID="e0a55626-b305-4e22-aec1-24832bec9a9f" containerName="keystone-bootstrap"
Feb 26 16:05:01 crc kubenswrapper[4907]: I0226 16:05:01.405184 4907 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-86f7f47947-xzhlh"
Feb 26 16:05:01 crc kubenswrapper[4907]: I0226 16:05:01.435895 4907 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone"
Feb 26 16:05:01 crc kubenswrapper[4907]: I0226 16:05:01.436128 4907 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-keystone-internal-svc"
Feb 26 16:05:01 crc kubenswrapper[4907]: I0226 16:05:01.442396 4907 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-scripts"
Feb 26 16:05:01 crc kubenswrapper[4907]: I0226 16:05:01.442616 4907 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-keystone-public-svc"
Feb 26 16:05:01 crc kubenswrapper[4907]: I0226 16:05:01.442775 4907 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-keystone-dockercfg-vv59s"
Feb 26 16:05:01 crc kubenswrapper[4907]: I0226 16:05:01.442960 4907 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-config-data"
Feb 26 16:05:01 crc kubenswrapper[4907]: I0226 16:05:01.447698 4907 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-86f7f47947-xzhlh"]
Feb 26 16:05:01 crc kubenswrapper[4907]: I0226 16:05:01.460193 4907 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-8656797c97-kv5w2"]
Feb 26 16:05:01 crc kubenswrapper[4907]: I0226 16:05:01.464988 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bb4b5b1f-5a7e-4bdd-a013-988c8057f16c-combined-ca-bundle\") pod \"keystone-86f7f47947-xzhlh\" (UID: \"bb4b5b1f-5a7e-4bdd-a013-988c8057f16c\") " pod="openstack/keystone-86f7f47947-xzhlh"
Feb 26 16:05:01 crc kubenswrapper[4907]: I0226 16:05:01.465117 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lgcgj\" (UniqueName: \"kubernetes.io/projected/bb4b5b1f-5a7e-4bdd-a013-988c8057f16c-kube-api-access-lgcgj\") pod \"keystone-86f7f47947-xzhlh\" (UID: \"bb4b5b1f-5a7e-4bdd-a013-988c8057f16c\") " pod="openstack/keystone-86f7f47947-xzhlh"
Feb 26 16:05:01 crc kubenswrapper[4907]: I0226 16:05:01.465141 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/bb4b5b1f-5a7e-4bdd-a013-988c8057f16c-scripts\") pod \"keystone-86f7f47947-xzhlh\" (UID: \"bb4b5b1f-5a7e-4bdd-a013-988c8057f16c\") " pod="openstack/keystone-86f7f47947-xzhlh"
Feb 26 16:05:01 crc kubenswrapper[4907]: I0226 16:05:01.465225 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/bb4b5b1f-5a7e-4bdd-a013-988c8057f16c-internal-tls-certs\") pod \"keystone-86f7f47947-xzhlh\" (UID: \"bb4b5b1f-5a7e-4bdd-a013-988c8057f16c\") " pod="openstack/keystone-86f7f47947-xzhlh"
Feb 26 16:05:01 crc kubenswrapper[4907]: I0226 16:05:01.465267 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/bb4b5b1f-5a7e-4bdd-a013-988c8057f16c-public-tls-certs\") pod \"keystone-86f7f47947-xzhlh\" (UID: \"bb4b5b1f-5a7e-4bdd-a013-988c8057f16c\") " pod="openstack/keystone-86f7f47947-xzhlh"
Feb 26 16:05:01 crc kubenswrapper[4907]: I0226 16:05:01.465316 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/bb4b5b1f-5a7e-4bdd-a013-988c8057f16c-fernet-keys\") pod \"keystone-86f7f47947-xzhlh\" (UID: \"bb4b5b1f-5a7e-4bdd-a013-988c8057f16c\") " pod="openstack/keystone-86f7f47947-xzhlh"
Feb 26 16:05:01 crc kubenswrapper[4907]: I0226 16:05:01.465345 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/bb4b5b1f-5a7e-4bdd-a013-988c8057f16c-config-data\") pod \"keystone-86f7f47947-xzhlh\" (UID: \"bb4b5b1f-5a7e-4bdd-a013-988c8057f16c\") " pod="openstack/keystone-86f7f47947-xzhlh"
Feb 26 16:05:01 crc kubenswrapper[4907]: I0226 16:05:01.465428 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/bb4b5b1f-5a7e-4bdd-a013-988c8057f16c-credential-keys\") pod \"keystone-86f7f47947-xzhlh\" (UID: \"bb4b5b1f-5a7e-4bdd-a013-988c8057f16c\") " pod="openstack/keystone-86f7f47947-xzhlh"
Feb 26 16:05:01 crc kubenswrapper[4907]: W0226 16:05:01.473817 4907 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod5a680379_891d_45b5_bfac_04c44ab3e5d4.slice/crio-a2760191b1a16549429a54c2da36f05b3029d2e4e7b2805ce240cb3e3102f609 WatchSource:0}: Error finding container a2760191b1a16549429a54c2da36f05b3029d2e4e7b2805ce240cb3e3102f609: Status 404 returned error can't find the container with id a2760191b1a16549429a54c2da36f05b3029d2e4e7b2805ce240cb3e3102f609
Feb 26 16:05:01 crc kubenswrapper[4907]: I0226 16:05:01.567109 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/bb4b5b1f-5a7e-4bdd-a013-988c8057f16c-internal-tls-certs\") pod \"keystone-86f7f47947-xzhlh\" (UID: \"bb4b5b1f-5a7e-4bdd-a013-988c8057f16c\") " pod="openstack/keystone-86f7f47947-xzhlh"
Feb 26 16:05:01 crc kubenswrapper[4907]: I0226 16:05:01.567164 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/bb4b5b1f-5a7e-4bdd-a013-988c8057f16c-public-tls-certs\") pod \"keystone-86f7f47947-xzhlh\" (UID: \"bb4b5b1f-5a7e-4bdd-a013-988c8057f16c\") " pod="openstack/keystone-86f7f47947-xzhlh"
Feb 26 16:05:01 crc kubenswrapper[4907]: I0226 16:05:01.567194 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/bb4b5b1f-5a7e-4bdd-a013-988c8057f16c-fernet-keys\") pod \"keystone-86f7f47947-xzhlh\" (UID: \"bb4b5b1f-5a7e-4bdd-a013-988c8057f16c\") " pod="openstack/keystone-86f7f47947-xzhlh"
Feb 26 16:05:01 crc kubenswrapper[4907]: I0226 16:05:01.567217 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/bb4b5b1f-5a7e-4bdd-a013-988c8057f16c-config-data\") pod \"keystone-86f7f47947-xzhlh\" (UID: \"bb4b5b1f-5a7e-4bdd-a013-988c8057f16c\") " pod="openstack/keystone-86f7f47947-xzhlh"
Feb 26 16:05:01 crc kubenswrapper[4907]: I0226 16:05:01.567255 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/bb4b5b1f-5a7e-4bdd-a013-988c8057f16c-credential-keys\") pod \"keystone-86f7f47947-xzhlh\" (UID: \"bb4b5b1f-5a7e-4bdd-a013-988c8057f16c\") " pod="openstack/keystone-86f7f47947-xzhlh"
Feb 26 16:05:01 crc kubenswrapper[4907]: I0226 16:05:01.567306 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bb4b5b1f-5a7e-4bdd-a013-988c8057f16c-combined-ca-bundle\") pod \"keystone-86f7f47947-xzhlh\" (UID: \"bb4b5b1f-5a7e-4bdd-a013-988c8057f16c\") " pod="openstack/keystone-86f7f47947-xzhlh"
Feb 26 16:05:01 crc kubenswrapper[4907]: I0226 16:05:01.567345 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/bb4b5b1f-5a7e-4bdd-a013-988c8057f16c-scripts\") pod \"keystone-86f7f47947-xzhlh\" (UID: \"bb4b5b1f-5a7e-4bdd-a013-988c8057f16c\") " pod="openstack/keystone-86f7f47947-xzhlh"
Feb 26 16:05:01 crc kubenswrapper[4907]: I0226 16:05:01.567365 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lgcgj\" (UniqueName: \"kubernetes.io/projected/bb4b5b1f-5a7e-4bdd-a013-988c8057f16c-kube-api-access-lgcgj\") pod \"keystone-86f7f47947-xzhlh\" (UID: \"bb4b5b1f-5a7e-4bdd-a013-988c8057f16c\") " pod="openstack/keystone-86f7f47947-xzhlh"
Feb 26 16:05:01 crc kubenswrapper[4907]: I0226 16:05:01.577449 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bb4b5b1f-5a7e-4bdd-a013-988c8057f16c-combined-ca-bundle\") pod \"keystone-86f7f47947-xzhlh\" (UID: \"bb4b5b1f-5a7e-4bdd-a013-988c8057f16c\") " pod="openstack/keystone-86f7f47947-xzhlh"
Feb 26 16:05:01 crc kubenswrapper[4907]: I0226 16:05:01.578018 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/bb4b5b1f-5a7e-4bdd-a013-988c8057f16c-public-tls-certs\") pod \"keystone-86f7f47947-xzhlh\" (UID: \"bb4b5b1f-5a7e-4bdd-a013-988c8057f16c\") " pod="openstack/keystone-86f7f47947-xzhlh"
Feb 26 16:05:01 crc kubenswrapper[4907]: I0226 16:05:01.578092 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/bb4b5b1f-5a7e-4bdd-a013-988c8057f16c-credential-keys\") pod \"keystone-86f7f47947-xzhlh\" (UID: \"bb4b5b1f-5a7e-4bdd-a013-988c8057f16c\") " pod="openstack/keystone-86f7f47947-xzhlh"
Feb 26 16:05:01 crc kubenswrapper[4907]: I0226 16:05:01.579184 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/bb4b5b1f-5a7e-4bdd-a013-988c8057f16c-scripts\") pod \"keystone-86f7f47947-xzhlh\" (UID: \"bb4b5b1f-5a7e-4bdd-a013-988c8057f16c\") " pod="openstack/keystone-86f7f47947-xzhlh"
Feb 26 16:05:01 crc kubenswrapper[4907]: I0226 16:05:01.585067 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/bb4b5b1f-5a7e-4bdd-a013-988c8057f16c-internal-tls-certs\") pod \"keystone-86f7f47947-xzhlh\" (UID: \"bb4b5b1f-5a7e-4bdd-a013-988c8057f16c\") " pod="openstack/keystone-86f7f47947-xzhlh"
Feb 26 16:05:01 crc kubenswrapper[4907]: I0226 16:05:01.591158 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lgcgj\" (UniqueName: \"kubernetes.io/projected/bb4b5b1f-5a7e-4bdd-a013-988c8057f16c-kube-api-access-lgcgj\") pod \"keystone-86f7f47947-xzhlh\" (UID: \"bb4b5b1f-5a7e-4bdd-a013-988c8057f16c\") " pod="openstack/keystone-86f7f47947-xzhlh"
Feb 26 16:05:01 crc kubenswrapper[4907]: I0226 16:05:01.592534 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/bb4b5b1f-5a7e-4bdd-a013-988c8057f16c-fernet-keys\") pod \"keystone-86f7f47947-xzhlh\" (UID: \"bb4b5b1f-5a7e-4bdd-a013-988c8057f16c\") " pod="openstack/keystone-86f7f47947-xzhlh"
Feb 26 16:05:01 crc kubenswrapper[4907]: I0226 16:05:01.599149 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/bb4b5b1f-5a7e-4bdd-a013-988c8057f16c-config-data\") pod \"keystone-86f7f47947-xzhlh\" (UID: \"bb4b5b1f-5a7e-4bdd-a013-988c8057f16c\") " pod="openstack/keystone-86f7f47947-xzhlh"
Feb 26 16:05:01 crc kubenswrapper[4907]: I0226 16:05:01.709990 4907 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-86f7f47947-xzhlh"
Feb 26 16:05:02 crc kubenswrapper[4907]: I0226 16:05:02.343878 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-sync-slrvx" event={"ID":"a02d2622-77ed-4949-95b5-4f5ae5f1c47d","Type":"ContainerStarted","Data":"e0735ae758d881be5c47962c25e0c5beb10fc882e59ecc8f9b0a5ceff33abcb6"}
Feb 26 16:05:02 crc kubenswrapper[4907]: I0226 16:05:02.389413 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-8656797c97-kv5w2" event={"ID":"5a680379-891d-45b5-bfac-04c44ab3e5d4","Type":"ContainerStarted","Data":"2530bd04fa0ba0bf4cd0cfd6c481e904801ab981339361f5fcbb6a22f455fa21"}
Feb 26 16:05:02 crc kubenswrapper[4907]: I0226 16:05:02.389455 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-8656797c97-kv5w2" event={"ID":"5a680379-891d-45b5-bfac-04c44ab3e5d4","Type":"ContainerStarted","Data":"a2760191b1a16549429a54c2da36f05b3029d2e4e7b2805ce240cb3e3102f609"}
Feb 26 16:05:02 crc kubenswrapper[4907]: I0226 16:05:02.485724 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-6dbb49ff7b-8r7kc" event={"ID":"41b49bfa-e783-4c0f-a0f6-f8dfdd5771d3","Type":"ContainerStarted","Data":"83485422366b12c8fef61540f7edc4a0ae20864be27c33bd1f6ef3405bd319b0"}
Feb 26 16:05:02 crc kubenswrapper[4907]: I0226 16:05:02.491535 4907 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/neutron-6dbb49ff7b-8r7kc"
Feb 26 16:05:02 crc kubenswrapper[4907]: I0226 16:05:02.499637 4907 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/placement-7c4f4876c6-sk5mm"]
Feb 26 16:05:02 crc kubenswrapper[4907]: I0226 16:05:02.523170 4907 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/placement-7c4f4876c6-sk5mm"
Feb 26 16:05:02 crc kubenswrapper[4907]: I0226 16:05:02.534779 4907 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/barbican-db-sync-slrvx" podStartSLOduration=3.734838156 podStartE2EDuration="1m4.53476432s" podCreationTimestamp="2026-02-26 16:03:58 +0000 UTC" firstStartedPulling="2026-02-26 16:04:00.65983173 +0000 UTC m=+1303.178393569" lastFinishedPulling="2026-02-26 16:05:01.459757884 +0000 UTC m=+1363.978319733" observedRunningTime="2026-02-26 16:05:02.389070204 +0000 UTC m=+1364.907632053" watchObservedRunningTime="2026-02-26 16:05:02.53476432 +0000 UTC m=+1365.053326169"
Feb 26 16:05:02 crc kubenswrapper[4907]: I0226 16:05:02.550787 4907 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-config-data"
Feb 26 16:05:02 crc kubenswrapper[4907]: I0226 16:05:02.551797 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"429e4875-18c7-4a0a-bfea-135d7aec6ba0","Type":"ContainerStarted","Data":"8eafec6f37f7ac496b7287bbb919329db3d0805025d4c66fd6ae06dc3b992a3e"}
Feb 26 16:05:02 crc kubenswrapper[4907]: I0226 16:05:02.551024 4907 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-placement-internal-svc"
Feb 26 16:05:02 crc kubenswrapper[4907]: I0226 16:05:02.551076 4907 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-scripts"
Feb 26 16:05:02 crc kubenswrapper[4907]: I0226 16:05:02.551125 4907 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-placement-public-svc"
Feb 26 16:05:02 crc kubenswrapper[4907]: I0226 16:05:02.575028 4907 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-placement-dockercfg-hcc7t"
Feb 26 16:05:02 crc kubenswrapper[4907]: I0226 16:05:02.584194 4907 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-7c4f4876c6-sk5mm"]
Feb 26 16:05:02 crc kubenswrapper[4907]: I0226 16:05:02.614989 4907 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/neutron-6dbb49ff7b-8r7kc" podStartSLOduration=9.614973523 podStartE2EDuration="9.614973523s" podCreationTimestamp="2026-02-26 16:04:53 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-26 16:05:02.532531216 +0000 UTC m=+1365.051093065" watchObservedRunningTime="2026-02-26 16:05:02.614973523 +0000 UTC m=+1365.133535372"
Feb 26 16:05:02 crc kubenswrapper[4907]: I0226 16:05:02.679767 4907 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-86f7f47947-xzhlh"]
Feb 26 16:05:02 crc kubenswrapper[4907]: I0226 16:05:02.734767 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/85da2141-e440-4d43-8f34-47c130cedfe3-config-data\") pod \"placement-7c4f4876c6-sk5mm\" (UID: \"85da2141-e440-4d43-8f34-47c130cedfe3\") " pod="openstack/placement-7c4f4876c6-sk5mm"
Feb 26 16:05:02 crc kubenswrapper[4907]: I0226 16:05:02.734880 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/85da2141-e440-4d43-8f34-47c130cedfe3-combined-ca-bundle\") pod \"placement-7c4f4876c6-sk5mm\" (UID: \"85da2141-e440-4d43-8f34-47c130cedfe3\") " pod="openstack/placement-7c4f4876c6-sk5mm"
Feb 26 16:05:02 crc kubenswrapper[4907]: I0226 16:05:02.734927 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/85da2141-e440-4d43-8f34-47c130cedfe3-scripts\") pod \"placement-7c4f4876c6-sk5mm\" (UID: \"85da2141-e440-4d43-8f34-47c130cedfe3\") " pod="openstack/placement-7c4f4876c6-sk5mm"
Feb 26 16:05:02 crc kubenswrapper[4907]: I0226 16:05:02.734965 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/85da2141-e440-4d43-8f34-47c130cedfe3-logs\") pod \"placement-7c4f4876c6-sk5mm\" (UID: \"85da2141-e440-4d43-8f34-47c130cedfe3\") " pod="openstack/placement-7c4f4876c6-sk5mm"
Feb 26 16:05:02 crc kubenswrapper[4907]: I0226 16:05:02.734999 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/85da2141-e440-4d43-8f34-47c130cedfe3-internal-tls-certs\") pod \"placement-7c4f4876c6-sk5mm\" (UID: \"85da2141-e440-4d43-8f34-47c130cedfe3\") " pod="openstack/placement-7c4f4876c6-sk5mm"
Feb 26 16:05:02 crc kubenswrapper[4907]: I0226 16:05:02.735019 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9p4vf\" (UniqueName: \"kubernetes.io/projected/85da2141-e440-4d43-8f34-47c130cedfe3-kube-api-access-9p4vf\") pod \"placement-7c4f4876c6-sk5mm\" (UID: \"85da2141-e440-4d43-8f34-47c130cedfe3\") " pod="openstack/placement-7c4f4876c6-sk5mm"
Feb 26 16:05:02 crc kubenswrapper[4907]: I0226 16:05:02.735158 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/85da2141-e440-4d43-8f34-47c130cedfe3-public-tls-certs\") pod \"placement-7c4f4876c6-sk5mm\" (UID: \"85da2141-e440-4d43-8f34-47c130cedfe3\") " pod="openstack/placement-7c4f4876c6-sk5mm"
Feb 26 16:05:02 crc kubenswrapper[4907]: I0226 16:05:02.836695 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/85da2141-e440-4d43-8f34-47c130cedfe3-logs\") pod \"placement-7c4f4876c6-sk5mm\" (UID: \"85da2141-e440-4d43-8f34-47c130cedfe3\") " pod="openstack/placement-7c4f4876c6-sk5mm"
Feb 26 16:05:02 crc kubenswrapper[4907]: I0226 16:05:02.836744 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/85da2141-e440-4d43-8f34-47c130cedfe3-internal-tls-certs\") pod \"placement-7c4f4876c6-sk5mm\" (UID: \"85da2141-e440-4d43-8f34-47c130cedfe3\") " pod="openstack/placement-7c4f4876c6-sk5mm"
Feb 26 16:05:02 crc kubenswrapper[4907]: I0226 16:05:02.836770 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9p4vf\" (UniqueName: \"kubernetes.io/projected/85da2141-e440-4d43-8f34-47c130cedfe3-kube-api-access-9p4vf\") pod \"placement-7c4f4876c6-sk5mm\" (UID: \"85da2141-e440-4d43-8f34-47c130cedfe3\") " pod="openstack/placement-7c4f4876c6-sk5mm"
Feb 26 16:05:02 crc kubenswrapper[4907]: I0226 16:05:02.836863 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/85da2141-e440-4d43-8f34-47c130cedfe3-public-tls-certs\") pod \"placement-7c4f4876c6-sk5mm\" (UID: \"85da2141-e440-4d43-8f34-47c130cedfe3\") " pod="openstack/placement-7c4f4876c6-sk5mm"
Feb 26 16:05:02 crc kubenswrapper[4907]: I0226 16:05:02.836889 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/85da2141-e440-4d43-8f34-47c130cedfe3-config-data\") pod \"placement-7c4f4876c6-sk5mm\" (UID: \"85da2141-e440-4d43-8f34-47c130cedfe3\") " pod="openstack/placement-7c4f4876c6-sk5mm"
Feb 26 16:05:02 crc kubenswrapper[4907]: I0226 16:05:02.836923 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/85da2141-e440-4d43-8f34-47c130cedfe3-combined-ca-bundle\") pod \"placement-7c4f4876c6-sk5mm\" (UID: \"85da2141-e440-4d43-8f34-47c130cedfe3\") " pod="openstack/placement-7c4f4876c6-sk5mm"
Feb 26 16:05:02 crc kubenswrapper[4907]: I0226 16:05:02.836957 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/85da2141-e440-4d43-8f34-47c130cedfe3-scripts\") pod \"placement-7c4f4876c6-sk5mm\" (UID: \"85da2141-e440-4d43-8f34-47c130cedfe3\") " pod="openstack/placement-7c4f4876c6-sk5mm"
Feb 26 16:05:02 crc kubenswrapper[4907]: I0226 16:05:02.837137 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/85da2141-e440-4d43-8f34-47c130cedfe3-logs\") pod \"placement-7c4f4876c6-sk5mm\" (UID: \"85da2141-e440-4d43-8f34-47c130cedfe3\") " pod="openstack/placement-7c4f4876c6-sk5mm"
Feb 26 16:05:02 crc kubenswrapper[4907]: I0226 16:05:02.842872 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/85da2141-e440-4d43-8f34-47c130cedfe3-scripts\") pod \"placement-7c4f4876c6-sk5mm\" (UID: \"85da2141-e440-4d43-8f34-47c130cedfe3\") " pod="openstack/placement-7c4f4876c6-sk5mm"
Feb 26 16:05:02 crc kubenswrapper[4907]: I0226 16:05:02.843366 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/85da2141-e440-4d43-8f34-47c130cedfe3-config-data\") pod \"placement-7c4f4876c6-sk5mm\" (UID: \"85da2141-e440-4d43-8f34-47c130cedfe3\") " pod="openstack/placement-7c4f4876c6-sk5mm"
Feb 26 16:05:02 crc kubenswrapper[4907]: I0226 16:05:02.846116 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/85da2141-e440-4d43-8f34-47c130cedfe3-public-tls-certs\") pod \"placement-7c4f4876c6-sk5mm\" (UID: \"85da2141-e440-4d43-8f34-47c130cedfe3\") " pod="openstack/placement-7c4f4876c6-sk5mm"
Feb 26 16:05:02 crc kubenswrapper[4907]: I0226 16:05:02.849707 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/85da2141-e440-4d43-8f34-47c130cedfe3-internal-tls-certs\") pod \"placement-7c4f4876c6-sk5mm\"
(UID: \"85da2141-e440-4d43-8f34-47c130cedfe3\") " pod="openstack/placement-7c4f4876c6-sk5mm" Feb 26 16:05:02 crc kubenswrapper[4907]: I0226 16:05:02.854436 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/85da2141-e440-4d43-8f34-47c130cedfe3-combined-ca-bundle\") pod \"placement-7c4f4876c6-sk5mm\" (UID: \"85da2141-e440-4d43-8f34-47c130cedfe3\") " pod="openstack/placement-7c4f4876c6-sk5mm" Feb 26 16:05:02 crc kubenswrapper[4907]: I0226 16:05:02.909320 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9p4vf\" (UniqueName: \"kubernetes.io/projected/85da2141-e440-4d43-8f34-47c130cedfe3-kube-api-access-9p4vf\") pod \"placement-7c4f4876c6-sk5mm\" (UID: \"85da2141-e440-4d43-8f34-47c130cedfe3\") " pod="openstack/placement-7c4f4876c6-sk5mm" Feb 26 16:05:02 crc kubenswrapper[4907]: I0226 16:05:02.969147 4907 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/placement-7c4f4876c6-sk5mm" Feb 26 16:05:03 crc kubenswrapper[4907]: W0226 16:05:03.510430 4907 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod85da2141_e440_4d43_8f34_47c130cedfe3.slice/crio-1525b9fab5e57a32948ec3ff4b647a9d09c5d588bfeea2ac92045ad1aab1e985 WatchSource:0}: Error finding container 1525b9fab5e57a32948ec3ff4b647a9d09c5d588bfeea2ac92045ad1aab1e985: Status 404 returned error can't find the container with id 1525b9fab5e57a32948ec3ff4b647a9d09c5d588bfeea2ac92045ad1aab1e985 Feb 26 16:05:03 crc kubenswrapper[4907]: I0226 16:05:03.517351 4907 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-7c4f4876c6-sk5mm"] Feb 26 16:05:03 crc kubenswrapper[4907]: I0226 16:05:03.564170 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-sync-xvvbl" 
event={"ID":"c98fd629-273b-4c87-a07c-4a482064a5a3","Type":"ContainerStarted","Data":"0b6026eb615d38ba839f0ba2755147d5e3528ad58900bc626575611dfdbfdd95"} Feb 26 16:05:03 crc kubenswrapper[4907]: I0226 16:05:03.575963 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-8656797c97-kv5w2" event={"ID":"5a680379-891d-45b5-bfac-04c44ab3e5d4","Type":"ContainerStarted","Data":"5e0e4bf5b7cdb36b844161a8e7adefce2272f720c39f3b8506c61b906f2a736d"} Feb 26 16:05:03 crc kubenswrapper[4907]: I0226 16:05:03.577395 4907 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/neutron-8656797c97-kv5w2" Feb 26 16:05:03 crc kubenswrapper[4907]: I0226 16:05:03.590343 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-86f7f47947-xzhlh" event={"ID":"bb4b5b1f-5a7e-4bdd-a013-988c8057f16c","Type":"ContainerStarted","Data":"26e6ac890bc888c5ba954dc3c369f230900f74ca8c940555e30395272b7d95ef"} Feb 26 16:05:03 crc kubenswrapper[4907]: I0226 16:05:03.590387 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-86f7f47947-xzhlh" event={"ID":"bb4b5b1f-5a7e-4bdd-a013-988c8057f16c","Type":"ContainerStarted","Data":"50b0260e4196598228b6f3c2e7d0676fe965826260769c8dd83aa17f8546d660"} Feb 26 16:05:03 crc kubenswrapper[4907]: I0226 16:05:03.591503 4907 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/keystone-86f7f47947-xzhlh" Feb 26 16:05:03 crc kubenswrapper[4907]: I0226 16:05:03.595487 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-7c4f4876c6-sk5mm" event={"ID":"85da2141-e440-4d43-8f34-47c130cedfe3","Type":"ContainerStarted","Data":"1525b9fab5e57a32948ec3ff4b647a9d09c5d588bfeea2ac92045ad1aab1e985"} Feb 26 16:05:03 crc kubenswrapper[4907]: I0226 16:05:03.648236 4907 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/neutron-8656797c97-kv5w2" podStartSLOduration=7.648218232 podStartE2EDuration="7.648218232s" 
podCreationTimestamp="2026-02-26 16:04:56 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-26 16:05:03.609340485 +0000 UTC m=+1366.127902344" watchObservedRunningTime="2026-02-26 16:05:03.648218232 +0000 UTC m=+1366.166780071" Feb 26 16:05:03 crc kubenswrapper[4907]: I0226 16:05:03.665676 4907 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/keystone-86f7f47947-xzhlh" podStartSLOduration=2.665657259 podStartE2EDuration="2.665657259s" podCreationTimestamp="2026-02-26 16:05:01 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-26 16:05:03.645863767 +0000 UTC m=+1366.164425636" watchObservedRunningTime="2026-02-26 16:05:03.665657259 +0000 UTC m=+1366.184219108" Feb 26 16:05:04 crc kubenswrapper[4907]: I0226 16:05:04.614683 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-7c4f4876c6-sk5mm" event={"ID":"85da2141-e440-4d43-8f34-47c130cedfe3","Type":"ContainerStarted","Data":"00526e2471416ec761b463b8b5b0e09d733dbdfa856e4afaa95a41998c8d9d69"} Feb 26 16:05:04 crc kubenswrapper[4907]: I0226 16:05:04.650193 4907 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cinder-db-sync-xvvbl" podStartSLOduration=5.5961313520000004 podStartE2EDuration="1m7.650148555s" podCreationTimestamp="2026-02-26 16:03:57 +0000 UTC" firstStartedPulling="2026-02-26 16:03:59.670166669 +0000 UTC m=+1302.188728518" lastFinishedPulling="2026-02-26 16:05:01.724183872 +0000 UTC m=+1364.242745721" observedRunningTime="2026-02-26 16:05:04.63401508 +0000 UTC m=+1367.152576919" watchObservedRunningTime="2026-02-26 16:05:04.650148555 +0000 UTC m=+1367.168710404" Feb 26 16:05:06 crc kubenswrapper[4907]: I0226 16:05:06.636189 4907 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack_neutron-6dbb49ff7b-8r7kc_41b49bfa-e783-4c0f-a0f6-f8dfdd5771d3/neutron-httpd/0.log" Feb 26 16:05:06 crc kubenswrapper[4907]: I0226 16:05:06.639304 4907 generic.go:334] "Generic (PLEG): container finished" podID="41b49bfa-e783-4c0f-a0f6-f8dfdd5771d3" containerID="83485422366b12c8fef61540f7edc4a0ae20864be27c33bd1f6ef3405bd319b0" exitCode=1 Feb 26 16:05:06 crc kubenswrapper[4907]: I0226 16:05:06.639373 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-6dbb49ff7b-8r7kc" event={"ID":"41b49bfa-e783-4c0f-a0f6-f8dfdd5771d3","Type":"ContainerDied","Data":"83485422366b12c8fef61540f7edc4a0ae20864be27c33bd1f6ef3405bd319b0"} Feb 26 16:05:06 crc kubenswrapper[4907]: I0226 16:05:06.640068 4907 scope.go:117] "RemoveContainer" containerID="83485422366b12c8fef61540f7edc4a0ae20864be27c33bd1f6ef3405bd319b0" Feb 26 16:05:06 crc kubenswrapper[4907]: I0226 16:05:06.648403 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-7c4f4876c6-sk5mm" event={"ID":"85da2141-e440-4d43-8f34-47c130cedfe3","Type":"ContainerStarted","Data":"7904d9bb57160284dab511be9c76a167b8729833913a3cb5dde3dd507f8f3c35"} Feb 26 16:05:06 crc kubenswrapper[4907]: I0226 16:05:06.649119 4907 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/placement-7c4f4876c6-sk5mm" Feb 26 16:05:06 crc kubenswrapper[4907]: I0226 16:05:06.649141 4907 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/placement-7c4f4876c6-sk5mm" Feb 26 16:05:06 crc kubenswrapper[4907]: I0226 16:05:06.700990 4907 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/placement-7c4f4876c6-sk5mm" podStartSLOduration=4.70096824 podStartE2EDuration="4.70096824s" podCreationTimestamp="2026-02-26 16:05:02 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-26 16:05:06.693465181 +0000 UTC m=+1369.212027030" 
watchObservedRunningTime="2026-02-26 16:05:06.70096824 +0000 UTC m=+1369.219530089" Feb 26 16:05:07 crc kubenswrapper[4907]: I0226 16:05:07.667425 4907 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_neutron-6dbb49ff7b-8r7kc_41b49bfa-e783-4c0f-a0f6-f8dfdd5771d3/neutron-httpd/0.log" Feb 26 16:05:07 crc kubenswrapper[4907]: I0226 16:05:07.669200 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-6dbb49ff7b-8r7kc" event={"ID":"41b49bfa-e783-4c0f-a0f6-f8dfdd5771d3","Type":"ContainerStarted","Data":"f5b10bfa39ffbeb73ec5538efb9eedf69e6d0c41349242e8f25c8a54d368ca66"} Feb 26 16:05:07 crc kubenswrapper[4907]: I0226 16:05:07.670007 4907 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/neutron-6dbb49ff7b-8r7kc" Feb 26 16:05:07 crc kubenswrapper[4907]: I0226 16:05:07.751393 4907 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/horizon-6fccfb8496-4tqhr" podUID="911d5df8-d8e2-4552-9c75-33c5ab72646b" containerName="horizon" probeResult="failure" output="Get \"https://10.217.0.153:8443/dashboard/auth/login/?next=/dashboard/\": dial tcp 10.217.0.153:8443: connect: connection refused" Feb 26 16:05:08 crc kubenswrapper[4907]: I0226 16:05:08.155820 4907 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/horizon-76d88967b8-wmzcw" podUID="b35f87c4-e535-4901-8814-0b321b201158" containerName="horizon" probeResult="failure" output="Get \"https://10.217.0.154:8443/dashboard/auth/login/?next=/dashboard/\": dial tcp 10.217.0.154:8443: connect: connection refused" Feb 26 16:05:08 crc kubenswrapper[4907]: I0226 16:05:08.654362 4907 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-6b7b667979-qdsb5" Feb 26 16:05:08 crc kubenswrapper[4907]: I0226 16:05:08.786329 4907 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-56df8fb6b7-ssd6q"] Feb 26 16:05:08 crc kubenswrapper[4907]: I0226 16:05:08.787545 4907 
kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-56df8fb6b7-ssd6q" podUID="16fe2d8c-c6a9-49dc-96a8-fa766cdb98d2" containerName="dnsmasq-dns" containerID="cri-o://2f31d6369311bd67b490188786d5fc486c8f23a5573ac8fb19049224a8024306" gracePeriod=10 Feb 26 16:05:09 crc kubenswrapper[4907]: I0226 16:05:09.464165 4907 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/dnsmasq-dns-56df8fb6b7-ssd6q" podUID="16fe2d8c-c6a9-49dc-96a8-fa766cdb98d2" containerName="dnsmasq-dns" probeResult="failure" output="dial tcp 10.217.0.149:5353: connect: connection refused" Feb 26 16:05:09 crc kubenswrapper[4907]: I0226 16:05:09.724202 4907 generic.go:334] "Generic (PLEG): container finished" podID="16fe2d8c-c6a9-49dc-96a8-fa766cdb98d2" containerID="2f31d6369311bd67b490188786d5fc486c8f23a5573ac8fb19049224a8024306" exitCode=0 Feb 26 16:05:09 crc kubenswrapper[4907]: I0226 16:05:09.724275 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-56df8fb6b7-ssd6q" event={"ID":"16fe2d8c-c6a9-49dc-96a8-fa766cdb98d2","Type":"ContainerDied","Data":"2f31d6369311bd67b490188786d5fc486c8f23a5573ac8fb19049224a8024306"} Feb 26 16:05:09 crc kubenswrapper[4907]: I0226 16:05:09.726294 4907 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_neutron-6dbb49ff7b-8r7kc_41b49bfa-e783-4c0f-a0f6-f8dfdd5771d3/neutron-httpd/1.log" Feb 26 16:05:09 crc kubenswrapper[4907]: I0226 16:05:09.726846 4907 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_neutron-6dbb49ff7b-8r7kc_41b49bfa-e783-4c0f-a0f6-f8dfdd5771d3/neutron-httpd/0.log" Feb 26 16:05:09 crc kubenswrapper[4907]: I0226 16:05:09.730101 4907 generic.go:334] "Generic (PLEG): container finished" podID="41b49bfa-e783-4c0f-a0f6-f8dfdd5771d3" containerID="f5b10bfa39ffbeb73ec5538efb9eedf69e6d0c41349242e8f25c8a54d368ca66" exitCode=1 Feb 26 16:05:09 crc kubenswrapper[4907]: I0226 16:05:09.730143 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/neutron-6dbb49ff7b-8r7kc" event={"ID":"41b49bfa-e783-4c0f-a0f6-f8dfdd5771d3","Type":"ContainerDied","Data":"f5b10bfa39ffbeb73ec5538efb9eedf69e6d0c41349242e8f25c8a54d368ca66"} Feb 26 16:05:09 crc kubenswrapper[4907]: I0226 16:05:09.730174 4907 scope.go:117] "RemoveContainer" containerID="83485422366b12c8fef61540f7edc4a0ae20864be27c33bd1f6ef3405bd319b0" Feb 26 16:05:09 crc kubenswrapper[4907]: I0226 16:05:09.730863 4907 scope.go:117] "RemoveContainer" containerID="f5b10bfa39ffbeb73ec5538efb9eedf69e6d0c41349242e8f25c8a54d368ca66" Feb 26 16:05:09 crc kubenswrapper[4907]: E0226 16:05:09.731096 4907 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"neutron-httpd\" with CrashLoopBackOff: \"back-off 10s restarting failed container=neutron-httpd pod=neutron-6dbb49ff7b-8r7kc_openstack(41b49bfa-e783-4c0f-a0f6-f8dfdd5771d3)\"" pod="openstack/neutron-6dbb49ff7b-8r7kc" podUID="41b49bfa-e783-4c0f-a0f6-f8dfdd5771d3" Feb 26 16:05:10 crc kubenswrapper[4907]: I0226 16:05:10.761506 4907 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_neutron-6dbb49ff7b-8r7kc_41b49bfa-e783-4c0f-a0f6-f8dfdd5771d3/neutron-httpd/1.log" Feb 26 16:05:10 crc kubenswrapper[4907]: I0226 16:05:10.942228 4907 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-56df8fb6b7-ssd6q" Feb 26 16:05:11 crc kubenswrapper[4907]: I0226 16:05:11.017616 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-72x8p\" (UniqueName: \"kubernetes.io/projected/16fe2d8c-c6a9-49dc-96a8-fa766cdb98d2-kube-api-access-72x8p\") pod \"16fe2d8c-c6a9-49dc-96a8-fa766cdb98d2\" (UID: \"16fe2d8c-c6a9-49dc-96a8-fa766cdb98d2\") " Feb 26 16:05:11 crc kubenswrapper[4907]: I0226 16:05:11.017742 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/16fe2d8c-c6a9-49dc-96a8-fa766cdb98d2-config\") pod \"16fe2d8c-c6a9-49dc-96a8-fa766cdb98d2\" (UID: \"16fe2d8c-c6a9-49dc-96a8-fa766cdb98d2\") " Feb 26 16:05:11 crc kubenswrapper[4907]: I0226 16:05:11.017765 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/16fe2d8c-c6a9-49dc-96a8-fa766cdb98d2-dns-svc\") pod \"16fe2d8c-c6a9-49dc-96a8-fa766cdb98d2\" (UID: \"16fe2d8c-c6a9-49dc-96a8-fa766cdb98d2\") " Feb 26 16:05:11 crc kubenswrapper[4907]: I0226 16:05:11.017806 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/16fe2d8c-c6a9-49dc-96a8-fa766cdb98d2-ovsdbserver-sb\") pod \"16fe2d8c-c6a9-49dc-96a8-fa766cdb98d2\" (UID: \"16fe2d8c-c6a9-49dc-96a8-fa766cdb98d2\") " Feb 26 16:05:11 crc kubenswrapper[4907]: I0226 16:05:11.017831 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/16fe2d8c-c6a9-49dc-96a8-fa766cdb98d2-dns-swift-storage-0\") pod \"16fe2d8c-c6a9-49dc-96a8-fa766cdb98d2\" (UID: \"16fe2d8c-c6a9-49dc-96a8-fa766cdb98d2\") " Feb 26 16:05:11 crc kubenswrapper[4907]: I0226 16:05:11.017981 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" 
(UniqueName: \"kubernetes.io/configmap/16fe2d8c-c6a9-49dc-96a8-fa766cdb98d2-ovsdbserver-nb\") pod \"16fe2d8c-c6a9-49dc-96a8-fa766cdb98d2\" (UID: \"16fe2d8c-c6a9-49dc-96a8-fa766cdb98d2\") " Feb 26 16:05:11 crc kubenswrapper[4907]: I0226 16:05:11.023058 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/16fe2d8c-c6a9-49dc-96a8-fa766cdb98d2-kube-api-access-72x8p" (OuterVolumeSpecName: "kube-api-access-72x8p") pod "16fe2d8c-c6a9-49dc-96a8-fa766cdb98d2" (UID: "16fe2d8c-c6a9-49dc-96a8-fa766cdb98d2"). InnerVolumeSpecName "kube-api-access-72x8p". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 26 16:05:11 crc kubenswrapper[4907]: I0226 16:05:11.066543 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/16fe2d8c-c6a9-49dc-96a8-fa766cdb98d2-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "16fe2d8c-c6a9-49dc-96a8-fa766cdb98d2" (UID: "16fe2d8c-c6a9-49dc-96a8-fa766cdb98d2"). InnerVolumeSpecName "dns-swift-storage-0". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 26 16:05:11 crc kubenswrapper[4907]: I0226 16:05:11.067522 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/16fe2d8c-c6a9-49dc-96a8-fa766cdb98d2-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "16fe2d8c-c6a9-49dc-96a8-fa766cdb98d2" (UID: "16fe2d8c-c6a9-49dc-96a8-fa766cdb98d2"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 26 16:05:11 crc kubenswrapper[4907]: I0226 16:05:11.070698 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/16fe2d8c-c6a9-49dc-96a8-fa766cdb98d2-config" (OuterVolumeSpecName: "config") pod "16fe2d8c-c6a9-49dc-96a8-fa766cdb98d2" (UID: "16fe2d8c-c6a9-49dc-96a8-fa766cdb98d2"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 26 16:05:11 crc kubenswrapper[4907]: I0226 16:05:11.084707 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/16fe2d8c-c6a9-49dc-96a8-fa766cdb98d2-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "16fe2d8c-c6a9-49dc-96a8-fa766cdb98d2" (UID: "16fe2d8c-c6a9-49dc-96a8-fa766cdb98d2"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 26 16:05:11 crc kubenswrapper[4907]: I0226 16:05:11.084848 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/16fe2d8c-c6a9-49dc-96a8-fa766cdb98d2-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "16fe2d8c-c6a9-49dc-96a8-fa766cdb98d2" (UID: "16fe2d8c-c6a9-49dc-96a8-fa766cdb98d2"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 26 16:05:11 crc kubenswrapper[4907]: I0226 16:05:11.121237 4907 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/16fe2d8c-c6a9-49dc-96a8-fa766cdb98d2-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Feb 26 16:05:11 crc kubenswrapper[4907]: I0226 16:05:11.121277 4907 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-72x8p\" (UniqueName: \"kubernetes.io/projected/16fe2d8c-c6a9-49dc-96a8-fa766cdb98d2-kube-api-access-72x8p\") on node \"crc\" DevicePath \"\"" Feb 26 16:05:11 crc kubenswrapper[4907]: I0226 16:05:11.121293 4907 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/16fe2d8c-c6a9-49dc-96a8-fa766cdb98d2-dns-svc\") on node \"crc\" DevicePath \"\"" Feb 26 16:05:11 crc kubenswrapper[4907]: I0226 16:05:11.121304 4907 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/16fe2d8c-c6a9-49dc-96a8-fa766cdb98d2-config\") on node \"crc\" DevicePath \"\"" Feb 26 16:05:11 crc 
kubenswrapper[4907]: I0226 16:05:11.121314 4907 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/16fe2d8c-c6a9-49dc-96a8-fa766cdb98d2-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Feb 26 16:05:11 crc kubenswrapper[4907]: I0226 16:05:11.121325 4907 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/16fe2d8c-c6a9-49dc-96a8-fa766cdb98d2-dns-swift-storage-0\") on node \"crc\" DevicePath \"\"" Feb 26 16:05:11 crc kubenswrapper[4907]: I0226 16:05:11.778726 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-56df8fb6b7-ssd6q" event={"ID":"16fe2d8c-c6a9-49dc-96a8-fa766cdb98d2","Type":"ContainerDied","Data":"2f8ff938b8ca8578b1c2010cd415367806c641cbf907da82b020cb3994b6d420"} Feb 26 16:05:11 crc kubenswrapper[4907]: I0226 16:05:11.778787 4907 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-56df8fb6b7-ssd6q" Feb 26 16:05:11 crc kubenswrapper[4907]: I0226 16:05:11.778791 4907 scope.go:117] "RemoveContainer" containerID="2f31d6369311bd67b490188786d5fc486c8f23a5573ac8fb19049224a8024306" Feb 26 16:05:11 crc kubenswrapper[4907]: I0226 16:05:11.830185 4907 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-56df8fb6b7-ssd6q"] Feb 26 16:05:11 crc kubenswrapper[4907]: I0226 16:05:11.869624 4907 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-56df8fb6b7-ssd6q"] Feb 26 16:05:12 crc kubenswrapper[4907]: I0226 16:05:12.139967 4907 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="16fe2d8c-c6a9-49dc-96a8-fa766cdb98d2" path="/var/lib/kubelet/pods/16fe2d8c-c6a9-49dc-96a8-fa766cdb98d2/volumes" Feb 26 16:05:17 crc kubenswrapper[4907]: I0226 16:05:17.748729 4907 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/horizon-6fccfb8496-4tqhr" podUID="911d5df8-d8e2-4552-9c75-33c5ab72646b" 
containerName="horizon" probeResult="failure" output="Get \"https://10.217.0.153:8443/dashboard/auth/login/?next=/dashboard/\": dial tcp 10.217.0.153:8443: connect: connection refused" Feb 26 16:05:18 crc kubenswrapper[4907]: I0226 16:05:18.154742 4907 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/horizon-76d88967b8-wmzcw" podUID="b35f87c4-e535-4901-8814-0b321b201158" containerName="horizon" probeResult="failure" output="Get \"https://10.217.0.154:8443/dashboard/auth/login/?next=/dashboard/\": dial tcp 10.217.0.154:8443: connect: connection refused" Feb 26 16:05:18 crc kubenswrapper[4907]: I0226 16:05:18.409036 4907 scope.go:117] "RemoveContainer" containerID="39335f4f2b14533ae9264b2cac3796ab9c192b8b3084a213715ab7bd87a34764" Feb 26 16:05:18 crc kubenswrapper[4907]: I0226 16:05:18.530567 4907 patch_prober.go:28] interesting pod/machine-config-daemon-v5ng6 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 26 16:05:18 crc kubenswrapper[4907]: I0226 16:05:18.530897 4907 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-v5ng6" podUID="917eebf3-db36-47b8-af0a-b80d042fddab" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 26 16:05:20 crc kubenswrapper[4907]: I0226 16:05:20.859281 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"429e4875-18c7-4a0a-bfea-135d7aec6ba0","Type":"ContainerStarted","Data":"9a3a1c6105cb7dc23bacbfc28b24ffeaba316b1f8c929f39ea03a8bb9a542443"} Feb 26 16:05:20 crc kubenswrapper[4907]: I0226 16:05:20.859866 4907 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0" Feb 26 16:05:20 crc 
kubenswrapper[4907]: I0226 16:05:20.859477 4907 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="429e4875-18c7-4a0a-bfea-135d7aec6ba0" containerName="sg-core" containerID="cri-o://8eafec6f37f7ac496b7287bbb919329db3d0805025d4c66fd6ae06dc3b992a3e" gracePeriod=30 Feb 26 16:05:20 crc kubenswrapper[4907]: I0226 16:05:20.859456 4907 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="429e4875-18c7-4a0a-bfea-135d7aec6ba0" containerName="proxy-httpd" containerID="cri-o://9a3a1c6105cb7dc23bacbfc28b24ffeaba316b1f8c929f39ea03a8bb9a542443" gracePeriod=30 Feb 26 16:05:20 crc kubenswrapper[4907]: I0226 16:05:20.859429 4907 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="429e4875-18c7-4a0a-bfea-135d7aec6ba0" containerName="ceilometer-central-agent" containerID="cri-o://6a86b7b8900988216a5e3f196d54892989a14bb69517093b0a2fb2792a439ae8" gracePeriod=30 Feb 26 16:05:20 crc kubenswrapper[4907]: I0226 16:05:20.859489 4907 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="429e4875-18c7-4a0a-bfea-135d7aec6ba0" containerName="ceilometer-notification-agent" containerID="cri-o://738b664b1aa529968ea7a0fe87f5d35158f6fc7d127775ad3c58c9db205eeeb8" gracePeriod=30 Feb 26 16:05:20 crc kubenswrapper[4907]: I0226 16:05:20.896823 4907 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceilometer-0" podStartSLOduration=3.685506588 podStartE2EDuration="1m22.896800829s" podCreationTimestamp="2026-02-26 16:03:58 +0000 UTC" firstStartedPulling="2026-02-26 16:04:00.466472627 +0000 UTC m=+1302.985034476" lastFinishedPulling="2026-02-26 16:05:19.677766818 +0000 UTC m=+1382.196328717" observedRunningTime="2026-02-26 16:05:20.890912719 +0000 UTC m=+1383.409474568" watchObservedRunningTime="2026-02-26 16:05:20.896800829 +0000 UTC m=+1383.415362678" Feb 26 
16:05:21 crc kubenswrapper[4907]: I0226 16:05:21.873060 4907 generic.go:334] "Generic (PLEG): container finished" podID="429e4875-18c7-4a0a-bfea-135d7aec6ba0" containerID="9a3a1c6105cb7dc23bacbfc28b24ffeaba316b1f8c929f39ea03a8bb9a542443" exitCode=0 Feb 26 16:05:21 crc kubenswrapper[4907]: I0226 16:05:21.873545 4907 generic.go:334] "Generic (PLEG): container finished" podID="429e4875-18c7-4a0a-bfea-135d7aec6ba0" containerID="8eafec6f37f7ac496b7287bbb919329db3d0805025d4c66fd6ae06dc3b992a3e" exitCode=2 Feb 26 16:05:21 crc kubenswrapper[4907]: I0226 16:05:21.873557 4907 generic.go:334] "Generic (PLEG): container finished" podID="429e4875-18c7-4a0a-bfea-135d7aec6ba0" containerID="738b664b1aa529968ea7a0fe87f5d35158f6fc7d127775ad3c58c9db205eeeb8" exitCode=0 Feb 26 16:05:21 crc kubenswrapper[4907]: I0226 16:05:21.873566 4907 generic.go:334] "Generic (PLEG): container finished" podID="429e4875-18c7-4a0a-bfea-135d7aec6ba0" containerID="6a86b7b8900988216a5e3f196d54892989a14bb69517093b0a2fb2792a439ae8" exitCode=0 Feb 26 16:05:21 crc kubenswrapper[4907]: I0226 16:05:21.873151 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"429e4875-18c7-4a0a-bfea-135d7aec6ba0","Type":"ContainerDied","Data":"9a3a1c6105cb7dc23bacbfc28b24ffeaba316b1f8c929f39ea03a8bb9a542443"} Feb 26 16:05:21 crc kubenswrapper[4907]: I0226 16:05:21.873669 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"429e4875-18c7-4a0a-bfea-135d7aec6ba0","Type":"ContainerDied","Data":"8eafec6f37f7ac496b7287bbb919329db3d0805025d4c66fd6ae06dc3b992a3e"} Feb 26 16:05:21 crc kubenswrapper[4907]: I0226 16:05:21.873686 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"429e4875-18c7-4a0a-bfea-135d7aec6ba0","Type":"ContainerDied","Data":"738b664b1aa529968ea7a0fe87f5d35158f6fc7d127775ad3c58c9db205eeeb8"} Feb 26 16:05:21 crc kubenswrapper[4907]: I0226 16:05:21.873697 4907 kubelet.go:2453] "SyncLoop 
(PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"429e4875-18c7-4a0a-bfea-135d7aec6ba0","Type":"ContainerDied","Data":"6a86b7b8900988216a5e3f196d54892989a14bb69517093b0a2fb2792a439ae8"} Feb 26 16:05:22 crc kubenswrapper[4907]: I0226 16:05:22.127441 4907 scope.go:117] "RemoveContainer" containerID="f5b10bfa39ffbeb73ec5538efb9eedf69e6d0c41349242e8f25c8a54d368ca66" Feb 26 16:05:22 crc kubenswrapper[4907]: I0226 16:05:22.406799 4907 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Feb 26 16:05:22 crc kubenswrapper[4907]: I0226 16:05:22.483402 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/429e4875-18c7-4a0a-bfea-135d7aec6ba0-combined-ca-bundle\") pod \"429e4875-18c7-4a0a-bfea-135d7aec6ba0\" (UID: \"429e4875-18c7-4a0a-bfea-135d7aec6ba0\") " Feb 26 16:05:22 crc kubenswrapper[4907]: I0226 16:05:22.483828 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/429e4875-18c7-4a0a-bfea-135d7aec6ba0-config-data\") pod \"429e4875-18c7-4a0a-bfea-135d7aec6ba0\" (UID: \"429e4875-18c7-4a0a-bfea-135d7aec6ba0\") " Feb 26 16:05:22 crc kubenswrapper[4907]: I0226 16:05:22.483980 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/429e4875-18c7-4a0a-bfea-135d7aec6ba0-log-httpd\") pod \"429e4875-18c7-4a0a-bfea-135d7aec6ba0\" (UID: \"429e4875-18c7-4a0a-bfea-135d7aec6ba0\") " Feb 26 16:05:22 crc kubenswrapper[4907]: I0226 16:05:22.484087 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-487fd\" (UniqueName: \"kubernetes.io/projected/429e4875-18c7-4a0a-bfea-135d7aec6ba0-kube-api-access-487fd\") pod \"429e4875-18c7-4a0a-bfea-135d7aec6ba0\" (UID: \"429e4875-18c7-4a0a-bfea-135d7aec6ba0\") " Feb 26 16:05:22 crc 
kubenswrapper[4907]: I0226 16:05:22.484193 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/429e4875-18c7-4a0a-bfea-135d7aec6ba0-sg-core-conf-yaml\") pod \"429e4875-18c7-4a0a-bfea-135d7aec6ba0\" (UID: \"429e4875-18c7-4a0a-bfea-135d7aec6ba0\") " Feb 26 16:05:22 crc kubenswrapper[4907]: I0226 16:05:22.484300 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/429e4875-18c7-4a0a-bfea-135d7aec6ba0-scripts\") pod \"429e4875-18c7-4a0a-bfea-135d7aec6ba0\" (UID: \"429e4875-18c7-4a0a-bfea-135d7aec6ba0\") " Feb 26 16:05:22 crc kubenswrapper[4907]: I0226 16:05:22.484398 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/429e4875-18c7-4a0a-bfea-135d7aec6ba0-run-httpd\") pod \"429e4875-18c7-4a0a-bfea-135d7aec6ba0\" (UID: \"429e4875-18c7-4a0a-bfea-135d7aec6ba0\") " Feb 26 16:05:22 crc kubenswrapper[4907]: I0226 16:05:22.486771 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/429e4875-18c7-4a0a-bfea-135d7aec6ba0-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "429e4875-18c7-4a0a-bfea-135d7aec6ba0" (UID: "429e4875-18c7-4a0a-bfea-135d7aec6ba0"). InnerVolumeSpecName "log-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 26 16:05:22 crc kubenswrapper[4907]: I0226 16:05:22.487712 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/429e4875-18c7-4a0a-bfea-135d7aec6ba0-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "429e4875-18c7-4a0a-bfea-135d7aec6ba0" (UID: "429e4875-18c7-4a0a-bfea-135d7aec6ba0"). InnerVolumeSpecName "run-httpd". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 26 16:05:22 crc kubenswrapper[4907]: I0226 16:05:22.503098 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/429e4875-18c7-4a0a-bfea-135d7aec6ba0-kube-api-access-487fd" (OuterVolumeSpecName: "kube-api-access-487fd") pod "429e4875-18c7-4a0a-bfea-135d7aec6ba0" (UID: "429e4875-18c7-4a0a-bfea-135d7aec6ba0"). InnerVolumeSpecName "kube-api-access-487fd". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 26 16:05:22 crc kubenswrapper[4907]: I0226 16:05:22.518264 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/429e4875-18c7-4a0a-bfea-135d7aec6ba0-scripts" (OuterVolumeSpecName: "scripts") pod "429e4875-18c7-4a0a-bfea-135d7aec6ba0" (UID: "429e4875-18c7-4a0a-bfea-135d7aec6ba0"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 26 16:05:22 crc kubenswrapper[4907]: I0226 16:05:22.534840 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/429e4875-18c7-4a0a-bfea-135d7aec6ba0-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "429e4875-18c7-4a0a-bfea-135d7aec6ba0" (UID: "429e4875-18c7-4a0a-bfea-135d7aec6ba0"). InnerVolumeSpecName "sg-core-conf-yaml". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 26 16:05:22 crc kubenswrapper[4907]: I0226 16:05:22.586873 4907 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/429e4875-18c7-4a0a-bfea-135d7aec6ba0-log-httpd\") on node \"crc\" DevicePath \"\"" Feb 26 16:05:22 crc kubenswrapper[4907]: I0226 16:05:22.586909 4907 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-487fd\" (UniqueName: \"kubernetes.io/projected/429e4875-18c7-4a0a-bfea-135d7aec6ba0-kube-api-access-487fd\") on node \"crc\" DevicePath \"\"" Feb 26 16:05:22 crc kubenswrapper[4907]: I0226 16:05:22.586919 4907 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/429e4875-18c7-4a0a-bfea-135d7aec6ba0-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\"" Feb 26 16:05:22 crc kubenswrapper[4907]: I0226 16:05:22.586927 4907 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/429e4875-18c7-4a0a-bfea-135d7aec6ba0-scripts\") on node \"crc\" DevicePath \"\"" Feb 26 16:05:22 crc kubenswrapper[4907]: I0226 16:05:22.586936 4907 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/429e4875-18c7-4a0a-bfea-135d7aec6ba0-run-httpd\") on node \"crc\" DevicePath \"\"" Feb 26 16:05:22 crc kubenswrapper[4907]: I0226 16:05:22.593127 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/429e4875-18c7-4a0a-bfea-135d7aec6ba0-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "429e4875-18c7-4a0a-bfea-135d7aec6ba0" (UID: "429e4875-18c7-4a0a-bfea-135d7aec6ba0"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 26 16:05:22 crc kubenswrapper[4907]: I0226 16:05:22.661092 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/429e4875-18c7-4a0a-bfea-135d7aec6ba0-config-data" (OuterVolumeSpecName: "config-data") pod "429e4875-18c7-4a0a-bfea-135d7aec6ba0" (UID: "429e4875-18c7-4a0a-bfea-135d7aec6ba0"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 26 16:05:22 crc kubenswrapper[4907]: I0226 16:05:22.688335 4907 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/429e4875-18c7-4a0a-bfea-135d7aec6ba0-config-data\") on node \"crc\" DevicePath \"\"" Feb 26 16:05:22 crc kubenswrapper[4907]: I0226 16:05:22.688365 4907 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/429e4875-18c7-4a0a-bfea-135d7aec6ba0-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 26 16:05:22 crc kubenswrapper[4907]: I0226 16:05:22.885504 4907 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_neutron-6dbb49ff7b-8r7kc_41b49bfa-e783-4c0f-a0f6-f8dfdd5771d3/neutron-httpd/1.log" Feb 26 16:05:22 crc kubenswrapper[4907]: I0226 16:05:22.886106 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-6dbb49ff7b-8r7kc" event={"ID":"41b49bfa-e783-4c0f-a0f6-f8dfdd5771d3","Type":"ContainerStarted","Data":"2a36d1d20d0b287b23e7dcdc86288043be01d77acf00445bd9e70ca22a49b6c8"} Feb 26 16:05:22 crc kubenswrapper[4907]: I0226 16:05:22.886353 4907 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/neutron-6dbb49ff7b-8r7kc" Feb 26 16:05:22 crc kubenswrapper[4907]: I0226 16:05:22.890112 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" 
event={"ID":"429e4875-18c7-4a0a-bfea-135d7aec6ba0","Type":"ContainerDied","Data":"0457368ebf2e749dc65e07a1276373b2f070582382c01ef1f135f773ce5e14af"} Feb 26 16:05:22 crc kubenswrapper[4907]: I0226 16:05:22.890162 4907 scope.go:117] "RemoveContainer" containerID="9a3a1c6105cb7dc23bacbfc28b24ffeaba316b1f8c929f39ea03a8bb9a542443" Feb 26 16:05:22 crc kubenswrapper[4907]: I0226 16:05:22.890176 4907 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Feb 26 16:05:22 crc kubenswrapper[4907]: I0226 16:05:22.931200 4907 scope.go:117] "RemoveContainer" containerID="8eafec6f37f7ac496b7287bbb919329db3d0805025d4c66fd6ae06dc3b992a3e" Feb 26 16:05:22 crc kubenswrapper[4907]: I0226 16:05:22.935686 4907 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Feb 26 16:05:22 crc kubenswrapper[4907]: I0226 16:05:22.949729 4907 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ceilometer-0"] Feb 26 16:05:22 crc kubenswrapper[4907]: I0226 16:05:22.965201 4907 scope.go:117] "RemoveContainer" containerID="738b664b1aa529968ea7a0fe87f5d35158f6fc7d127775ad3c58c9db205eeeb8" Feb 26 16:05:22 crc kubenswrapper[4907]: I0226 16:05:22.965523 4907 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"] Feb 26 16:05:22 crc kubenswrapper[4907]: E0226 16:05:22.966086 4907 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="429e4875-18c7-4a0a-bfea-135d7aec6ba0" containerName="sg-core" Feb 26 16:05:22 crc kubenswrapper[4907]: I0226 16:05:22.966198 4907 state_mem.go:107] "Deleted CPUSet assignment" podUID="429e4875-18c7-4a0a-bfea-135d7aec6ba0" containerName="sg-core" Feb 26 16:05:22 crc kubenswrapper[4907]: E0226 16:05:22.966285 4907 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="16fe2d8c-c6a9-49dc-96a8-fa766cdb98d2" containerName="init" Feb 26 16:05:22 crc kubenswrapper[4907]: I0226 16:05:22.966865 4907 state_mem.go:107] "Deleted CPUSet assignment" 
podUID="16fe2d8c-c6a9-49dc-96a8-fa766cdb98d2" containerName="init" Feb 26 16:05:22 crc kubenswrapper[4907]: E0226 16:05:22.966997 4907 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="16fe2d8c-c6a9-49dc-96a8-fa766cdb98d2" containerName="dnsmasq-dns" Feb 26 16:05:22 crc kubenswrapper[4907]: I0226 16:05:22.967072 4907 state_mem.go:107] "Deleted CPUSet assignment" podUID="16fe2d8c-c6a9-49dc-96a8-fa766cdb98d2" containerName="dnsmasq-dns" Feb 26 16:05:22 crc kubenswrapper[4907]: E0226 16:05:22.967185 4907 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="429e4875-18c7-4a0a-bfea-135d7aec6ba0" containerName="ceilometer-central-agent" Feb 26 16:05:22 crc kubenswrapper[4907]: I0226 16:05:22.967255 4907 state_mem.go:107] "Deleted CPUSet assignment" podUID="429e4875-18c7-4a0a-bfea-135d7aec6ba0" containerName="ceilometer-central-agent" Feb 26 16:05:22 crc kubenswrapper[4907]: E0226 16:05:22.967327 4907 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="429e4875-18c7-4a0a-bfea-135d7aec6ba0" containerName="proxy-httpd" Feb 26 16:05:22 crc kubenswrapper[4907]: I0226 16:05:22.967394 4907 state_mem.go:107] "Deleted CPUSet assignment" podUID="429e4875-18c7-4a0a-bfea-135d7aec6ba0" containerName="proxy-httpd" Feb 26 16:05:22 crc kubenswrapper[4907]: E0226 16:05:22.967483 4907 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="429e4875-18c7-4a0a-bfea-135d7aec6ba0" containerName="ceilometer-notification-agent" Feb 26 16:05:22 crc kubenswrapper[4907]: I0226 16:05:22.967555 4907 state_mem.go:107] "Deleted CPUSet assignment" podUID="429e4875-18c7-4a0a-bfea-135d7aec6ba0" containerName="ceilometer-notification-agent" Feb 26 16:05:22 crc kubenswrapper[4907]: I0226 16:05:22.967899 4907 memory_manager.go:354] "RemoveStaleState removing state" podUID="429e4875-18c7-4a0a-bfea-135d7aec6ba0" containerName="proxy-httpd" Feb 26 16:05:22 crc kubenswrapper[4907]: I0226 16:05:22.968004 4907 memory_manager.go:354] "RemoveStaleState 
removing state" podUID="16fe2d8c-c6a9-49dc-96a8-fa766cdb98d2" containerName="dnsmasq-dns" Feb 26 16:05:22 crc kubenswrapper[4907]: I0226 16:05:22.968087 4907 memory_manager.go:354] "RemoveStaleState removing state" podUID="429e4875-18c7-4a0a-bfea-135d7aec6ba0" containerName="sg-core" Feb 26 16:05:22 crc kubenswrapper[4907]: I0226 16:05:22.968165 4907 memory_manager.go:354] "RemoveStaleState removing state" podUID="429e4875-18c7-4a0a-bfea-135d7aec6ba0" containerName="ceilometer-notification-agent" Feb 26 16:05:22 crc kubenswrapper[4907]: I0226 16:05:22.968244 4907 memory_manager.go:354] "RemoveStaleState removing state" podUID="429e4875-18c7-4a0a-bfea-135d7aec6ba0" containerName="ceilometer-central-agent" Feb 26 16:05:22 crc kubenswrapper[4907]: I0226 16:05:22.970912 4907 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Feb 26 16:05:22 crc kubenswrapper[4907]: I0226 16:05:22.978626 4907 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Feb 26 16:05:22 crc kubenswrapper[4907]: I0226 16:05:22.987973 4907 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts" Feb 26 16:05:22 crc kubenswrapper[4907]: I0226 16:05:22.988209 4907 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data" Feb 26 16:05:23 crc kubenswrapper[4907]: I0226 16:05:23.035336 4907 scope.go:117] "RemoveContainer" containerID="6a86b7b8900988216a5e3f196d54892989a14bb69517093b0a2fb2792a439ae8" Feb 26 16:05:23 crc kubenswrapper[4907]: I0226 16:05:23.081105 4907 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Feb 26 16:05:23 crc kubenswrapper[4907]: E0226 16:05:23.082235 4907 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[combined-ca-bundle config-data kube-api-access-k2t2l log-httpd run-httpd scripts sg-core-conf-yaml], unattached volumes=[], failed to process volumes=[]: context 
canceled" pod="openstack/ceilometer-0" podUID="01679220-f521-4841-93a4-07a053b81ab1" Feb 26 16:05:23 crc kubenswrapper[4907]: I0226 16:05:23.096902 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/01679220-f521-4841-93a4-07a053b81ab1-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"01679220-f521-4841-93a4-07a053b81ab1\") " pod="openstack/ceilometer-0" Feb 26 16:05:23 crc kubenswrapper[4907]: I0226 16:05:23.097155 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/01679220-f521-4841-93a4-07a053b81ab1-log-httpd\") pod \"ceilometer-0\" (UID: \"01679220-f521-4841-93a4-07a053b81ab1\") " pod="openstack/ceilometer-0" Feb 26 16:05:23 crc kubenswrapper[4907]: I0226 16:05:23.097383 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/01679220-f521-4841-93a4-07a053b81ab1-config-data\") pod \"ceilometer-0\" (UID: \"01679220-f521-4841-93a4-07a053b81ab1\") " pod="openstack/ceilometer-0" Feb 26 16:05:23 crc kubenswrapper[4907]: I0226 16:05:23.097623 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/01679220-f521-4841-93a4-07a053b81ab1-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"01679220-f521-4841-93a4-07a053b81ab1\") " pod="openstack/ceilometer-0" Feb 26 16:05:23 crc kubenswrapper[4907]: I0226 16:05:23.097904 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/01679220-f521-4841-93a4-07a053b81ab1-run-httpd\") pod \"ceilometer-0\" (UID: \"01679220-f521-4841-93a4-07a053b81ab1\") " pod="openstack/ceilometer-0" Feb 26 16:05:23 crc kubenswrapper[4907]: I0226 
16:05:23.098263 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/01679220-f521-4841-93a4-07a053b81ab1-scripts\") pod \"ceilometer-0\" (UID: \"01679220-f521-4841-93a4-07a053b81ab1\") " pod="openstack/ceilometer-0" Feb 26 16:05:23 crc kubenswrapper[4907]: I0226 16:05:23.098524 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-k2t2l\" (UniqueName: \"kubernetes.io/projected/01679220-f521-4841-93a4-07a053b81ab1-kube-api-access-k2t2l\") pod \"ceilometer-0\" (UID: \"01679220-f521-4841-93a4-07a053b81ab1\") " pod="openstack/ceilometer-0" Feb 26 16:05:23 crc kubenswrapper[4907]: I0226 16:05:23.200518 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/01679220-f521-4841-93a4-07a053b81ab1-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"01679220-f521-4841-93a4-07a053b81ab1\") " pod="openstack/ceilometer-0" Feb 26 16:05:23 crc kubenswrapper[4907]: I0226 16:05:23.201732 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/01679220-f521-4841-93a4-07a053b81ab1-log-httpd\") pod \"ceilometer-0\" (UID: \"01679220-f521-4841-93a4-07a053b81ab1\") " pod="openstack/ceilometer-0" Feb 26 16:05:23 crc kubenswrapper[4907]: I0226 16:05:23.202359 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/01679220-f521-4841-93a4-07a053b81ab1-log-httpd\") pod \"ceilometer-0\" (UID: \"01679220-f521-4841-93a4-07a053b81ab1\") " pod="openstack/ceilometer-0" Feb 26 16:05:23 crc kubenswrapper[4907]: I0226 16:05:23.202913 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/01679220-f521-4841-93a4-07a053b81ab1-config-data\") pod 
\"ceilometer-0\" (UID: \"01679220-f521-4841-93a4-07a053b81ab1\") " pod="openstack/ceilometer-0" Feb 26 16:05:23 crc kubenswrapper[4907]: I0226 16:05:23.203483 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/01679220-f521-4841-93a4-07a053b81ab1-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"01679220-f521-4841-93a4-07a053b81ab1\") " pod="openstack/ceilometer-0" Feb 26 16:05:23 crc kubenswrapper[4907]: I0226 16:05:23.203720 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/01679220-f521-4841-93a4-07a053b81ab1-run-httpd\") pod \"ceilometer-0\" (UID: \"01679220-f521-4841-93a4-07a053b81ab1\") " pod="openstack/ceilometer-0" Feb 26 16:05:23 crc kubenswrapper[4907]: I0226 16:05:23.203895 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/01679220-f521-4841-93a4-07a053b81ab1-scripts\") pod \"ceilometer-0\" (UID: \"01679220-f521-4841-93a4-07a053b81ab1\") " pod="openstack/ceilometer-0" Feb 26 16:05:23 crc kubenswrapper[4907]: I0226 16:05:23.204042 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-k2t2l\" (UniqueName: \"kubernetes.io/projected/01679220-f521-4841-93a4-07a053b81ab1-kube-api-access-k2t2l\") pod \"ceilometer-0\" (UID: \"01679220-f521-4841-93a4-07a053b81ab1\") " pod="openstack/ceilometer-0" Feb 26 16:05:23 crc kubenswrapper[4907]: I0226 16:05:23.204937 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/01679220-f521-4841-93a4-07a053b81ab1-run-httpd\") pod \"ceilometer-0\" (UID: \"01679220-f521-4841-93a4-07a053b81ab1\") " pod="openstack/ceilometer-0" Feb 26 16:05:23 crc kubenswrapper[4907]: I0226 16:05:23.205310 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/01679220-f521-4841-93a4-07a053b81ab1-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"01679220-f521-4841-93a4-07a053b81ab1\") " pod="openstack/ceilometer-0" Feb 26 16:05:23 crc kubenswrapper[4907]: I0226 16:05:23.206269 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/01679220-f521-4841-93a4-07a053b81ab1-config-data\") pod \"ceilometer-0\" (UID: \"01679220-f521-4841-93a4-07a053b81ab1\") " pod="openstack/ceilometer-0" Feb 26 16:05:23 crc kubenswrapper[4907]: I0226 16:05:23.208880 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/01679220-f521-4841-93a4-07a053b81ab1-scripts\") pod \"ceilometer-0\" (UID: \"01679220-f521-4841-93a4-07a053b81ab1\") " pod="openstack/ceilometer-0" Feb 26 16:05:23 crc kubenswrapper[4907]: I0226 16:05:23.209425 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/01679220-f521-4841-93a4-07a053b81ab1-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"01679220-f521-4841-93a4-07a053b81ab1\") " pod="openstack/ceilometer-0" Feb 26 16:05:23 crc kubenswrapper[4907]: I0226 16:05:23.221236 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-k2t2l\" (UniqueName: \"kubernetes.io/projected/01679220-f521-4841-93a4-07a053b81ab1-kube-api-access-k2t2l\") pod \"ceilometer-0\" (UID: \"01679220-f521-4841-93a4-07a053b81ab1\") " pod="openstack/ceilometer-0" Feb 26 16:05:23 crc kubenswrapper[4907]: I0226 16:05:23.900026 4907 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_neutron-6dbb49ff7b-8r7kc_41b49bfa-e783-4c0f-a0f6-f8dfdd5771d3/neutron-httpd/2.log" Feb 26 16:05:23 crc kubenswrapper[4907]: I0226 16:05:23.900581 4907 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack_neutron-6dbb49ff7b-8r7kc_41b49bfa-e783-4c0f-a0f6-f8dfdd5771d3/neutron-httpd/1.log" Feb 26 16:05:23 crc kubenswrapper[4907]: I0226 16:05:23.900964 4907 generic.go:334] "Generic (PLEG): container finished" podID="41b49bfa-e783-4c0f-a0f6-f8dfdd5771d3" containerID="2a36d1d20d0b287b23e7dcdc86288043be01d77acf00445bd9e70ca22a49b6c8" exitCode=1 Feb 26 16:05:23 crc kubenswrapper[4907]: I0226 16:05:23.901032 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-6dbb49ff7b-8r7kc" event={"ID":"41b49bfa-e783-4c0f-a0f6-f8dfdd5771d3","Type":"ContainerDied","Data":"2a36d1d20d0b287b23e7dcdc86288043be01d77acf00445bd9e70ca22a49b6c8"} Feb 26 16:05:23 crc kubenswrapper[4907]: I0226 16:05:23.901069 4907 scope.go:117] "RemoveContainer" containerID="f5b10bfa39ffbeb73ec5538efb9eedf69e6d0c41349242e8f25c8a54d368ca66" Feb 26 16:05:23 crc kubenswrapper[4907]: I0226 16:05:23.901769 4907 scope.go:117] "RemoveContainer" containerID="2a36d1d20d0b287b23e7dcdc86288043be01d77acf00445bd9e70ca22a49b6c8" Feb 26 16:05:23 crc kubenswrapper[4907]: E0226 16:05:23.902031 4907 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"neutron-httpd\" with CrashLoopBackOff: \"back-off 20s restarting failed container=neutron-httpd pod=neutron-6dbb49ff7b-8r7kc_openstack(41b49bfa-e783-4c0f-a0f6-f8dfdd5771d3)\"" pod="openstack/neutron-6dbb49ff7b-8r7kc" podUID="41b49bfa-e783-4c0f-a0f6-f8dfdd5771d3" Feb 26 16:05:23 crc kubenswrapper[4907]: I0226 16:05:23.907744 4907 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Feb 26 16:05:23 crc kubenswrapper[4907]: I0226 16:05:23.945056 4907 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openstack/neutron-6dbb49ff7b-8r7kc" Feb 26 16:05:23 crc kubenswrapper[4907]: I0226 16:05:23.948709 4907 prober.go:107] "Probe failed" probeType="Liveness" pod="openstack/neutron-6dbb49ff7b-8r7kc" podUID="41b49bfa-e783-4c0f-a0f6-f8dfdd5771d3" containerName="neutron-api" probeResult="failure" output="Get \"http://10.217.0.159:9696/\": dial tcp 10.217.0.159:9696: connect: connection refused" Feb 26 16:05:23 crc kubenswrapper[4907]: I0226 16:05:23.963247 4907 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Feb 26 16:05:24 crc kubenswrapper[4907]: I0226 16:05:24.119844 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/01679220-f521-4841-93a4-07a053b81ab1-sg-core-conf-yaml\") pod \"01679220-f521-4841-93a4-07a053b81ab1\" (UID: \"01679220-f521-4841-93a4-07a053b81ab1\") " Feb 26 16:05:24 crc kubenswrapper[4907]: I0226 16:05:24.120021 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/01679220-f521-4841-93a4-07a053b81ab1-scripts\") pod \"01679220-f521-4841-93a4-07a053b81ab1\" (UID: \"01679220-f521-4841-93a4-07a053b81ab1\") " Feb 26 16:05:24 crc kubenswrapper[4907]: I0226 16:05:24.120087 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/01679220-f521-4841-93a4-07a053b81ab1-run-httpd\") pod \"01679220-f521-4841-93a4-07a053b81ab1\" (UID: \"01679220-f521-4841-93a4-07a053b81ab1\") " Feb 26 16:05:24 crc kubenswrapper[4907]: I0226 16:05:24.120168 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-k2t2l\" (UniqueName: 
\"kubernetes.io/projected/01679220-f521-4841-93a4-07a053b81ab1-kube-api-access-k2t2l\") pod \"01679220-f521-4841-93a4-07a053b81ab1\" (UID: \"01679220-f521-4841-93a4-07a053b81ab1\") " Feb 26 16:05:24 crc kubenswrapper[4907]: I0226 16:05:24.120256 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/01679220-f521-4841-93a4-07a053b81ab1-config-data\") pod \"01679220-f521-4841-93a4-07a053b81ab1\" (UID: \"01679220-f521-4841-93a4-07a053b81ab1\") " Feb 26 16:05:24 crc kubenswrapper[4907]: I0226 16:05:24.120315 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/01679220-f521-4841-93a4-07a053b81ab1-combined-ca-bundle\") pod \"01679220-f521-4841-93a4-07a053b81ab1\" (UID: \"01679220-f521-4841-93a4-07a053b81ab1\") " Feb 26 16:05:24 crc kubenswrapper[4907]: I0226 16:05:24.120358 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/01679220-f521-4841-93a4-07a053b81ab1-log-httpd\") pod \"01679220-f521-4841-93a4-07a053b81ab1\" (UID: \"01679220-f521-4841-93a4-07a053b81ab1\") " Feb 26 16:05:24 crc kubenswrapper[4907]: I0226 16:05:24.120973 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/01679220-f521-4841-93a4-07a053b81ab1-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "01679220-f521-4841-93a4-07a053b81ab1" (UID: "01679220-f521-4841-93a4-07a053b81ab1"). InnerVolumeSpecName "run-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 26 16:05:24 crc kubenswrapper[4907]: I0226 16:05:24.121094 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/01679220-f521-4841-93a4-07a053b81ab1-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "01679220-f521-4841-93a4-07a053b81ab1" (UID: "01679220-f521-4841-93a4-07a053b81ab1"). 
InnerVolumeSpecName "log-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 26 16:05:24 crc kubenswrapper[4907]: I0226 16:05:24.130639 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/01679220-f521-4841-93a4-07a053b81ab1-kube-api-access-k2t2l" (OuterVolumeSpecName: "kube-api-access-k2t2l") pod "01679220-f521-4841-93a4-07a053b81ab1" (UID: "01679220-f521-4841-93a4-07a053b81ab1"). InnerVolumeSpecName "kube-api-access-k2t2l". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 26 16:05:24 crc kubenswrapper[4907]: I0226 16:05:24.133722 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/01679220-f521-4841-93a4-07a053b81ab1-config-data" (OuterVolumeSpecName: "config-data") pod "01679220-f521-4841-93a4-07a053b81ab1" (UID: "01679220-f521-4841-93a4-07a053b81ab1"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 26 16:05:24 crc kubenswrapper[4907]: I0226 16:05:24.135796 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/01679220-f521-4841-93a4-07a053b81ab1-scripts" (OuterVolumeSpecName: "scripts") pod "01679220-f521-4841-93a4-07a053b81ab1" (UID: "01679220-f521-4841-93a4-07a053b81ab1"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 26 16:05:24 crc kubenswrapper[4907]: I0226 16:05:24.137071 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/01679220-f521-4841-93a4-07a053b81ab1-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "01679220-f521-4841-93a4-07a053b81ab1" (UID: "01679220-f521-4841-93a4-07a053b81ab1"). InnerVolumeSpecName "sg-core-conf-yaml". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 26 16:05:24 crc kubenswrapper[4907]: I0226 16:05:24.139739 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/01679220-f521-4841-93a4-07a053b81ab1-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "01679220-f521-4841-93a4-07a053b81ab1" (UID: "01679220-f521-4841-93a4-07a053b81ab1"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 26 16:05:24 crc kubenswrapper[4907]: I0226 16:05:24.144436 4907 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="429e4875-18c7-4a0a-bfea-135d7aec6ba0" path="/var/lib/kubelet/pods/429e4875-18c7-4a0a-bfea-135d7aec6ba0/volumes" Feb 26 16:05:24 crc kubenswrapper[4907]: I0226 16:05:24.223043 4907 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/01679220-f521-4841-93a4-07a053b81ab1-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 26 16:05:24 crc kubenswrapper[4907]: I0226 16:05:24.223073 4907 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/01679220-f521-4841-93a4-07a053b81ab1-log-httpd\") on node \"crc\" DevicePath \"\"" Feb 26 16:05:24 crc kubenswrapper[4907]: I0226 16:05:24.223084 4907 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/01679220-f521-4841-93a4-07a053b81ab1-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\"" Feb 26 16:05:24 crc kubenswrapper[4907]: I0226 16:05:24.223094 4907 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/01679220-f521-4841-93a4-07a053b81ab1-scripts\") on node \"crc\" DevicePath \"\"" Feb 26 16:05:24 crc kubenswrapper[4907]: I0226 16:05:24.223108 4907 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: 
\"kubernetes.io/empty-dir/01679220-f521-4841-93a4-07a053b81ab1-run-httpd\") on node \"crc\" DevicePath \"\"" Feb 26 16:05:24 crc kubenswrapper[4907]: I0226 16:05:24.223117 4907 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-k2t2l\" (UniqueName: \"kubernetes.io/projected/01679220-f521-4841-93a4-07a053b81ab1-kube-api-access-k2t2l\") on node \"crc\" DevicePath \"\"" Feb 26 16:05:24 crc kubenswrapper[4907]: I0226 16:05:24.223125 4907 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/01679220-f521-4841-93a4-07a053b81ab1-config-data\") on node \"crc\" DevicePath \"\"" Feb 26 16:05:24 crc kubenswrapper[4907]: I0226 16:05:24.916886 4907 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_neutron-6dbb49ff7b-8r7kc_41b49bfa-e783-4c0f-a0f6-f8dfdd5771d3/neutron-httpd/2.log" Feb 26 16:05:24 crc kubenswrapper[4907]: I0226 16:05:24.917364 4907 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Feb 26 16:05:24 crc kubenswrapper[4907]: I0226 16:05:24.917896 4907 scope.go:117] "RemoveContainer" containerID="2a36d1d20d0b287b23e7dcdc86288043be01d77acf00445bd9e70ca22a49b6c8" Feb 26 16:05:24 crc kubenswrapper[4907]: E0226 16:05:24.918116 4907 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"neutron-httpd\" with CrashLoopBackOff: \"back-off 20s restarting failed container=neutron-httpd pod=neutron-6dbb49ff7b-8r7kc_openstack(41b49bfa-e783-4c0f-a0f6-f8dfdd5771d3)\"" pod="openstack/neutron-6dbb49ff7b-8r7kc" podUID="41b49bfa-e783-4c0f-a0f6-f8dfdd5771d3" Feb 26 16:05:24 crc kubenswrapper[4907]: I0226 16:05:24.995816 4907 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Feb 26 16:05:25 crc kubenswrapper[4907]: I0226 16:05:25.004828 4907 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ceilometer-0"] Feb 26 16:05:25 crc kubenswrapper[4907]: I0226 16:05:25.015071 
4907 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"] Feb 26 16:05:25 crc kubenswrapper[4907]: I0226 16:05:25.019309 4907 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Feb 26 16:05:25 crc kubenswrapper[4907]: I0226 16:05:25.027972 4907 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts" Feb 26 16:05:25 crc kubenswrapper[4907]: I0226 16:05:25.028304 4907 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data" Feb 26 16:05:25 crc kubenswrapper[4907]: I0226 16:05:25.071881 4907 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Feb 26 16:05:25 crc kubenswrapper[4907]: I0226 16:05:25.154562 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9db5f721-707b-490c-917f-b3b2a85af07c-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"9db5f721-707b-490c-917f-b3b2a85af07c\") " pod="openstack/ceilometer-0" Feb 26 16:05:25 crc kubenswrapper[4907]: I0226 16:05:25.154686 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/9db5f721-707b-490c-917f-b3b2a85af07c-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"9db5f721-707b-490c-917f-b3b2a85af07c\") " pod="openstack/ceilometer-0" Feb 26 16:05:25 crc kubenswrapper[4907]: I0226 16:05:25.154770 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tpp5j\" (UniqueName: \"kubernetes.io/projected/9db5f721-707b-490c-917f-b3b2a85af07c-kube-api-access-tpp5j\") pod \"ceilometer-0\" (UID: \"9db5f721-707b-490c-917f-b3b2a85af07c\") " pod="openstack/ceilometer-0" Feb 26 16:05:25 crc kubenswrapper[4907]: I0226 16:05:25.154795 4907 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/9db5f721-707b-490c-917f-b3b2a85af07c-log-httpd\") pod \"ceilometer-0\" (UID: \"9db5f721-707b-490c-917f-b3b2a85af07c\") " pod="openstack/ceilometer-0" Feb 26 16:05:25 crc kubenswrapper[4907]: I0226 16:05:25.154838 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/9db5f721-707b-490c-917f-b3b2a85af07c-scripts\") pod \"ceilometer-0\" (UID: \"9db5f721-707b-490c-917f-b3b2a85af07c\") " pod="openstack/ceilometer-0" Feb 26 16:05:25 crc kubenswrapper[4907]: I0226 16:05:25.154910 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/9db5f721-707b-490c-917f-b3b2a85af07c-run-httpd\") pod \"ceilometer-0\" (UID: \"9db5f721-707b-490c-917f-b3b2a85af07c\") " pod="openstack/ceilometer-0" Feb 26 16:05:25 crc kubenswrapper[4907]: I0226 16:05:25.154931 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9db5f721-707b-490c-917f-b3b2a85af07c-config-data\") pod \"ceilometer-0\" (UID: \"9db5f721-707b-490c-917f-b3b2a85af07c\") " pod="openstack/ceilometer-0" Feb 26 16:05:25 crc kubenswrapper[4907]: I0226 16:05:25.258127 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/9db5f721-707b-490c-917f-b3b2a85af07c-run-httpd\") pod \"ceilometer-0\" (UID: \"9db5f721-707b-490c-917f-b3b2a85af07c\") " pod="openstack/ceilometer-0" Feb 26 16:05:25 crc kubenswrapper[4907]: I0226 16:05:25.258187 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9db5f721-707b-490c-917f-b3b2a85af07c-config-data\") pod \"ceilometer-0\" (UID: 
\"9db5f721-707b-490c-917f-b3b2a85af07c\") " pod="openstack/ceilometer-0" Feb 26 16:05:25 crc kubenswrapper[4907]: I0226 16:05:25.258270 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9db5f721-707b-490c-917f-b3b2a85af07c-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"9db5f721-707b-490c-917f-b3b2a85af07c\") " pod="openstack/ceilometer-0" Feb 26 16:05:25 crc kubenswrapper[4907]: I0226 16:05:25.258333 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/9db5f721-707b-490c-917f-b3b2a85af07c-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"9db5f721-707b-490c-917f-b3b2a85af07c\") " pod="openstack/ceilometer-0" Feb 26 16:05:25 crc kubenswrapper[4907]: I0226 16:05:25.258492 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tpp5j\" (UniqueName: \"kubernetes.io/projected/9db5f721-707b-490c-917f-b3b2a85af07c-kube-api-access-tpp5j\") pod \"ceilometer-0\" (UID: \"9db5f721-707b-490c-917f-b3b2a85af07c\") " pod="openstack/ceilometer-0" Feb 26 16:05:25 crc kubenswrapper[4907]: I0226 16:05:25.258531 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/9db5f721-707b-490c-917f-b3b2a85af07c-log-httpd\") pod \"ceilometer-0\" (UID: \"9db5f721-707b-490c-917f-b3b2a85af07c\") " pod="openstack/ceilometer-0" Feb 26 16:05:25 crc kubenswrapper[4907]: I0226 16:05:25.258626 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/9db5f721-707b-490c-917f-b3b2a85af07c-scripts\") pod \"ceilometer-0\" (UID: \"9db5f721-707b-490c-917f-b3b2a85af07c\") " pod="openstack/ceilometer-0" Feb 26 16:05:25 crc kubenswrapper[4907]: I0226 16:05:25.259174 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/9db5f721-707b-490c-917f-b3b2a85af07c-run-httpd\") pod \"ceilometer-0\" (UID: \"9db5f721-707b-490c-917f-b3b2a85af07c\") " pod="openstack/ceilometer-0" Feb 26 16:05:25 crc kubenswrapper[4907]: I0226 16:05:25.259933 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/9db5f721-707b-490c-917f-b3b2a85af07c-log-httpd\") pod \"ceilometer-0\" (UID: \"9db5f721-707b-490c-917f-b3b2a85af07c\") " pod="openstack/ceilometer-0" Feb 26 16:05:25 crc kubenswrapper[4907]: I0226 16:05:25.265558 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/9db5f721-707b-490c-917f-b3b2a85af07c-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"9db5f721-707b-490c-917f-b3b2a85af07c\") " pod="openstack/ceilometer-0" Feb 26 16:05:25 crc kubenswrapper[4907]: I0226 16:05:25.265794 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/9db5f721-707b-490c-917f-b3b2a85af07c-scripts\") pod \"ceilometer-0\" (UID: \"9db5f721-707b-490c-917f-b3b2a85af07c\") " pod="openstack/ceilometer-0" Feb 26 16:05:25 crc kubenswrapper[4907]: I0226 16:05:25.266113 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9db5f721-707b-490c-917f-b3b2a85af07c-config-data\") pod \"ceilometer-0\" (UID: \"9db5f721-707b-490c-917f-b3b2a85af07c\") " pod="openstack/ceilometer-0" Feb 26 16:05:25 crc kubenswrapper[4907]: I0226 16:05:25.266807 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9db5f721-707b-490c-917f-b3b2a85af07c-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"9db5f721-707b-490c-917f-b3b2a85af07c\") " pod="openstack/ceilometer-0" Feb 26 16:05:25 crc kubenswrapper[4907]: I0226 16:05:25.296319 4907 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tpp5j\" (UniqueName: \"kubernetes.io/projected/9db5f721-707b-490c-917f-b3b2a85af07c-kube-api-access-tpp5j\") pod \"ceilometer-0\" (UID: \"9db5f721-707b-490c-917f-b3b2a85af07c\") " pod="openstack/ceilometer-0" Feb 26 16:05:25 crc kubenswrapper[4907]: I0226 16:05:25.391241 4907 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Feb 26 16:05:25 crc kubenswrapper[4907]: I0226 16:05:25.923798 4907 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Feb 26 16:05:25 crc kubenswrapper[4907]: I0226 16:05:25.929084 4907 generic.go:334] "Generic (PLEG): container finished" podID="a02d2622-77ed-4949-95b5-4f5ae5f1c47d" containerID="e0735ae758d881be5c47962c25e0c5beb10fc882e59ecc8f9b0a5ceff33abcb6" exitCode=0 Feb 26 16:05:25 crc kubenswrapper[4907]: I0226 16:05:25.929746 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-sync-slrvx" event={"ID":"a02d2622-77ed-4949-95b5-4f5ae5f1c47d","Type":"ContainerDied","Data":"e0735ae758d881be5c47962c25e0c5beb10fc882e59ecc8f9b0a5ceff33abcb6"} Feb 26 16:05:25 crc kubenswrapper[4907]: I0226 16:05:25.929861 4907 scope.go:117] "RemoveContainer" containerID="2a36d1d20d0b287b23e7dcdc86288043be01d77acf00445bd9e70ca22a49b6c8" Feb 26 16:05:25 crc kubenswrapper[4907]: E0226 16:05:25.930093 4907 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"neutron-httpd\" with CrashLoopBackOff: \"back-off 20s restarting failed container=neutron-httpd pod=neutron-6dbb49ff7b-8r7kc_openstack(41b49bfa-e783-4c0f-a0f6-f8dfdd5771d3)\"" pod="openstack/neutron-6dbb49ff7b-8r7kc" podUID="41b49bfa-e783-4c0f-a0f6-f8dfdd5771d3" Feb 26 16:05:26 crc kubenswrapper[4907]: I0226 16:05:26.147190 4907 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="01679220-f521-4841-93a4-07a053b81ab1" 
path="/var/lib/kubelet/pods/01679220-f521-4841-93a4-07a053b81ab1/volumes" Feb 26 16:05:26 crc kubenswrapper[4907]: I0226 16:05:26.847961 4907 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/neutron-8656797c97-kv5w2" podUID="5a680379-891d-45b5-bfac-04c44ab3e5d4" containerName="neutron-httpd" probeResult="failure" output="HTTP probe failed with statuscode: 503" Feb 26 16:05:26 crc kubenswrapper[4907]: I0226 16:05:26.849827 4907 prober.go:107] "Probe failed" probeType="Liveness" pod="openstack/neutron-8656797c97-kv5w2" podUID="5a680379-891d-45b5-bfac-04c44ab3e5d4" containerName="neutron-httpd" probeResult="failure" output="HTTP probe failed with statuscode: 503" Feb 26 16:05:26 crc kubenswrapper[4907]: I0226 16:05:26.850578 4907 prober.go:107] "Probe failed" probeType="Liveness" pod="openstack/neutron-8656797c97-kv5w2" podUID="5a680379-891d-45b5-bfac-04c44ab3e5d4" containerName="neutron-api" probeResult="failure" output="HTTP probe failed with statuscode: 503" Feb 26 16:05:26 crc kubenswrapper[4907]: I0226 16:05:26.940448 4907 generic.go:334] "Generic (PLEG): container finished" podID="c98fd629-273b-4c87-a07c-4a482064a5a3" containerID="0b6026eb615d38ba839f0ba2755147d5e3528ad58900bc626575611dfdbfdd95" exitCode=0 Feb 26 16:05:26 crc kubenswrapper[4907]: I0226 16:05:26.940506 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-sync-xvvbl" event={"ID":"c98fd629-273b-4c87-a07c-4a482064a5a3","Type":"ContainerDied","Data":"0b6026eb615d38ba839f0ba2755147d5e3528ad58900bc626575611dfdbfdd95"} Feb 26 16:05:26 crc kubenswrapper[4907]: I0226 16:05:26.949357 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"9db5f721-707b-490c-917f-b3b2a85af07c","Type":"ContainerStarted","Data":"b9f6c67b6afcd8e9c24e088d3c406309aab828e08ec4e7a7619487dceedb4bd7"} Feb 26 16:05:26 crc kubenswrapper[4907]: I0226 16:05:26.949398 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" 
event={"ID":"9db5f721-707b-490c-917f-b3b2a85af07c","Type":"ContainerStarted","Data":"5a71a03405c136f01cda642c4e413f563b131353601cd16daa4d26321b46eaf1"} Feb 26 16:05:27 crc kubenswrapper[4907]: I0226 16:05:27.335875 4907 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-db-sync-slrvx" Feb 26 16:05:27 crc kubenswrapper[4907]: I0226 16:05:27.383956 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9vdjh\" (UniqueName: \"kubernetes.io/projected/a02d2622-77ed-4949-95b5-4f5ae5f1c47d-kube-api-access-9vdjh\") pod \"a02d2622-77ed-4949-95b5-4f5ae5f1c47d\" (UID: \"a02d2622-77ed-4949-95b5-4f5ae5f1c47d\") " Feb 26 16:05:27 crc kubenswrapper[4907]: I0226 16:05:27.384037 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/a02d2622-77ed-4949-95b5-4f5ae5f1c47d-db-sync-config-data\") pod \"a02d2622-77ed-4949-95b5-4f5ae5f1c47d\" (UID: \"a02d2622-77ed-4949-95b5-4f5ae5f1c47d\") " Feb 26 16:05:27 crc kubenswrapper[4907]: I0226 16:05:27.384159 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a02d2622-77ed-4949-95b5-4f5ae5f1c47d-combined-ca-bundle\") pod \"a02d2622-77ed-4949-95b5-4f5ae5f1c47d\" (UID: \"a02d2622-77ed-4949-95b5-4f5ae5f1c47d\") " Feb 26 16:05:27 crc kubenswrapper[4907]: I0226 16:05:27.395908 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a02d2622-77ed-4949-95b5-4f5ae5f1c47d-kube-api-access-9vdjh" (OuterVolumeSpecName: "kube-api-access-9vdjh") pod "a02d2622-77ed-4949-95b5-4f5ae5f1c47d" (UID: "a02d2622-77ed-4949-95b5-4f5ae5f1c47d"). InnerVolumeSpecName "kube-api-access-9vdjh". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 26 16:05:27 crc kubenswrapper[4907]: I0226 16:05:27.402362 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a02d2622-77ed-4949-95b5-4f5ae5f1c47d-db-sync-config-data" (OuterVolumeSpecName: "db-sync-config-data") pod "a02d2622-77ed-4949-95b5-4f5ae5f1c47d" (UID: "a02d2622-77ed-4949-95b5-4f5ae5f1c47d"). InnerVolumeSpecName "db-sync-config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 26 16:05:27 crc kubenswrapper[4907]: I0226 16:05:27.422718 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a02d2622-77ed-4949-95b5-4f5ae5f1c47d-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "a02d2622-77ed-4949-95b5-4f5ae5f1c47d" (UID: "a02d2622-77ed-4949-95b5-4f5ae5f1c47d"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 26 16:05:27 crc kubenswrapper[4907]: I0226 16:05:27.486159 4907 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a02d2622-77ed-4949-95b5-4f5ae5f1c47d-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 26 16:05:27 crc kubenswrapper[4907]: I0226 16:05:27.486196 4907 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9vdjh\" (UniqueName: \"kubernetes.io/projected/a02d2622-77ed-4949-95b5-4f5ae5f1c47d-kube-api-access-9vdjh\") on node \"crc\" DevicePath \"\"" Feb 26 16:05:27 crc kubenswrapper[4907]: I0226 16:05:27.486211 4907 reconciler_common.go:293] "Volume detached for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/a02d2622-77ed-4949-95b5-4f5ae5f1c47d-db-sync-config-data\") on node \"crc\" DevicePath \"\"" Feb 26 16:05:27 crc kubenswrapper[4907]: I0226 16:05:27.748696 4907 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/horizon-6fccfb8496-4tqhr" podUID="911d5df8-d8e2-4552-9c75-33c5ab72646b" 
containerName="horizon" probeResult="failure" output="Get \"https://10.217.0.153:8443/dashboard/auth/login/?next=/dashboard/\": dial tcp 10.217.0.153:8443: connect: connection refused" Feb 26 16:05:27 crc kubenswrapper[4907]: I0226 16:05:27.749071 4907 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/horizon-6fccfb8496-4tqhr" Feb 26 16:05:27 crc kubenswrapper[4907]: I0226 16:05:27.749930 4907 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="horizon" containerStatusID={"Type":"cri-o","ID":"5f606b9ab89532e105117c7cf76e6d48e275002733a615d726e58c1777c18aad"} pod="openstack/horizon-6fccfb8496-4tqhr" containerMessage="Container horizon failed startup probe, will be restarted" Feb 26 16:05:27 crc kubenswrapper[4907]: I0226 16:05:27.749977 4907 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/horizon-6fccfb8496-4tqhr" podUID="911d5df8-d8e2-4552-9c75-33c5ab72646b" containerName="horizon" containerID="cri-o://5f606b9ab89532e105117c7cf76e6d48e275002733a615d726e58c1777c18aad" gracePeriod=30 Feb 26 16:05:27 crc kubenswrapper[4907]: I0226 16:05:27.959461 4907 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-db-sync-slrvx" Feb 26 16:05:27 crc kubenswrapper[4907]: I0226 16:05:27.959500 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-sync-slrvx" event={"ID":"a02d2622-77ed-4949-95b5-4f5ae5f1c47d","Type":"ContainerDied","Data":"7f174eb188a2d21ce5510fcc0be89fb379aa859301ab35fd121402c3359f91d8"} Feb 26 16:05:27 crc kubenswrapper[4907]: I0226 16:05:27.959543 4907 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="7f174eb188a2d21ce5510fcc0be89fb379aa859301ab35fd121402c3359f91d8" Feb 26 16:05:27 crc kubenswrapper[4907]: I0226 16:05:27.961756 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"9db5f721-707b-490c-917f-b3b2a85af07c","Type":"ContainerStarted","Data":"7d2427c956d607ef4cf148c0c113847a907484ad07c5294237e191bf9cdc5e29"} Feb 26 16:05:28 crc kubenswrapper[4907]: I0226 16:05:28.167199 4907 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/horizon-76d88967b8-wmzcw" podUID="b35f87c4-e535-4901-8814-0b321b201158" containerName="horizon" probeResult="failure" output="Get \"https://10.217.0.154:8443/dashboard/auth/login/?next=/dashboard/\": dial tcp 10.217.0.154:8443: connect: connection refused" Feb 26 16:05:28 crc kubenswrapper[4907]: I0226 16:05:28.167274 4907 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/horizon-76d88967b8-wmzcw" Feb 26 16:05:28 crc kubenswrapper[4907]: I0226 16:05:28.168094 4907 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="horizon" containerStatusID={"Type":"cri-o","ID":"c2b6ec3e96a2871e49421792b819e7d8811902b2acc4ebf5cb6213f4794ef38f"} pod="openstack/horizon-76d88967b8-wmzcw" containerMessage="Container horizon failed startup probe, will be restarted" Feb 26 16:05:28 crc kubenswrapper[4907]: I0226 16:05:28.168128 4907 kuberuntime_container.go:808] "Killing container with a grace period" 
pod="openstack/horizon-76d88967b8-wmzcw" podUID="b35f87c4-e535-4901-8814-0b321b201158" containerName="horizon" containerID="cri-o://c2b6ec3e96a2871e49421792b819e7d8811902b2acc4ebf5cb6213f4794ef38f" gracePeriod=30 Feb 26 16:05:28 crc kubenswrapper[4907]: I0226 16:05:28.405453 4907 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-worker-d9b8ff5ff-b7kpr"] Feb 26 16:05:28 crc kubenswrapper[4907]: E0226 16:05:28.405918 4907 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a02d2622-77ed-4949-95b5-4f5ae5f1c47d" containerName="barbican-db-sync" Feb 26 16:05:28 crc kubenswrapper[4907]: I0226 16:05:28.405935 4907 state_mem.go:107] "Deleted CPUSet assignment" podUID="a02d2622-77ed-4949-95b5-4f5ae5f1c47d" containerName="barbican-db-sync" Feb 26 16:05:28 crc kubenswrapper[4907]: I0226 16:05:28.406161 4907 memory_manager.go:354] "RemoveStaleState removing state" podUID="a02d2622-77ed-4949-95b5-4f5ae5f1c47d" containerName="barbican-db-sync" Feb 26 16:05:28 crc kubenswrapper[4907]: I0226 16:05:28.407261 4907 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-worker-d9b8ff5ff-b7kpr" Feb 26 16:05:28 crc kubenswrapper[4907]: I0226 16:05:28.419341 4907 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-worker-config-data" Feb 26 16:05:28 crc kubenswrapper[4907]: I0226 16:05:28.426733 4907 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-barbican-dockercfg-fhpq9" Feb 26 16:05:28 crc kubenswrapper[4907]: I0226 16:05:28.437574 4907 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-config-data" Feb 26 16:05:28 crc kubenswrapper[4907]: I0226 16:05:28.451459 4907 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-worker-d9b8ff5ff-b7kpr"] Feb 26 16:05:28 crc kubenswrapper[4907]: I0226 16:05:28.487656 4907 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-keystone-listener-7f8d9cb4c8-5jdnw"] Feb 26 16:05:28 crc kubenswrapper[4907]: I0226 16:05:28.504921 4907 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-db-sync-xvvbl" Feb 26 16:05:28 crc kubenswrapper[4907]: I0226 16:05:28.505939 4907 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-keystone-listener-7f8d9cb4c8-5jdnw" Feb 26 16:05:28 crc kubenswrapper[4907]: I0226 16:05:28.516123 4907 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-keystone-listener-config-data" Feb 26 16:05:28 crc kubenswrapper[4907]: I0226 16:05:28.527028 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/a0ed716e-493d-4590-81a0-203b8618cf61-logs\") pod \"barbican-worker-d9b8ff5ff-b7kpr\" (UID: \"a0ed716e-493d-4590-81a0-203b8618cf61\") " pod="openstack/barbican-worker-d9b8ff5ff-b7kpr" Feb 26 16:05:28 crc kubenswrapper[4907]: I0226 16:05:28.527081 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8bngp\" (UniqueName: \"kubernetes.io/projected/a0ed716e-493d-4590-81a0-203b8618cf61-kube-api-access-8bngp\") pod \"barbican-worker-d9b8ff5ff-b7kpr\" (UID: \"a0ed716e-493d-4590-81a0-203b8618cf61\") " pod="openstack/barbican-worker-d9b8ff5ff-b7kpr" Feb 26 16:05:28 crc kubenswrapper[4907]: I0226 16:05:28.527205 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/a0ed716e-493d-4590-81a0-203b8618cf61-config-data-custom\") pod \"barbican-worker-d9b8ff5ff-b7kpr\" (UID: \"a0ed716e-493d-4590-81a0-203b8618cf61\") " pod="openstack/barbican-worker-d9b8ff5ff-b7kpr" Feb 26 16:05:28 crc kubenswrapper[4907]: I0226 16:05:28.527256 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a0ed716e-493d-4590-81a0-203b8618cf61-config-data\") pod \"barbican-worker-d9b8ff5ff-b7kpr\" (UID: \"a0ed716e-493d-4590-81a0-203b8618cf61\") " pod="openstack/barbican-worker-d9b8ff5ff-b7kpr" Feb 26 16:05:28 crc kubenswrapper[4907]: I0226 16:05:28.527342 4907 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a0ed716e-493d-4590-81a0-203b8618cf61-combined-ca-bundle\") pod \"barbican-worker-d9b8ff5ff-b7kpr\" (UID: \"a0ed716e-493d-4590-81a0-203b8618cf61\") " pod="openstack/barbican-worker-d9b8ff5ff-b7kpr" Feb 26 16:05:28 crc kubenswrapper[4907]: I0226 16:05:28.544640 4907 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-keystone-listener-7f8d9cb4c8-5jdnw"] Feb 26 16:05:28 crc kubenswrapper[4907]: I0226 16:05:28.582794 4907 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-848cf88cfc-xqzwv"] Feb 26 16:05:28 crc kubenswrapper[4907]: E0226 16:05:28.583325 4907 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c98fd629-273b-4c87-a07c-4a482064a5a3" containerName="cinder-db-sync" Feb 26 16:05:28 crc kubenswrapper[4907]: I0226 16:05:28.583343 4907 state_mem.go:107] "Deleted CPUSet assignment" podUID="c98fd629-273b-4c87-a07c-4a482064a5a3" containerName="cinder-db-sync" Feb 26 16:05:28 crc kubenswrapper[4907]: I0226 16:05:28.583563 4907 memory_manager.go:354] "RemoveStaleState removing state" podUID="c98fd629-273b-4c87-a07c-4a482064a5a3" containerName="cinder-db-sync" Feb 26 16:05:28 crc kubenswrapper[4907]: I0226 16:05:28.584760 4907 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-848cf88cfc-xqzwv" Feb 26 16:05:28 crc kubenswrapper[4907]: I0226 16:05:28.629173 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/c98fd629-273b-4c87-a07c-4a482064a5a3-scripts\") pod \"c98fd629-273b-4c87-a07c-4a482064a5a3\" (UID: \"c98fd629-273b-4c87-a07c-4a482064a5a3\") " Feb 26 16:05:28 crc kubenswrapper[4907]: I0226 16:05:28.629230 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c98fd629-273b-4c87-a07c-4a482064a5a3-combined-ca-bundle\") pod \"c98fd629-273b-4c87-a07c-4a482064a5a3\" (UID: \"c98fd629-273b-4c87-a07c-4a482064a5a3\") " Feb 26 16:05:28 crc kubenswrapper[4907]: I0226 16:05:28.629340 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/c98fd629-273b-4c87-a07c-4a482064a5a3-db-sync-config-data\") pod \"c98fd629-273b-4c87-a07c-4a482064a5a3\" (UID: \"c98fd629-273b-4c87-a07c-4a482064a5a3\") " Feb 26 16:05:28 crc kubenswrapper[4907]: I0226 16:05:28.629367 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-sfcpk\" (UniqueName: \"kubernetes.io/projected/c98fd629-273b-4c87-a07c-4a482064a5a3-kube-api-access-sfcpk\") pod \"c98fd629-273b-4c87-a07c-4a482064a5a3\" (UID: \"c98fd629-273b-4c87-a07c-4a482064a5a3\") " Feb 26 16:05:28 crc kubenswrapper[4907]: I0226 16:05:28.629383 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/c98fd629-273b-4c87-a07c-4a482064a5a3-etc-machine-id\") pod \"c98fd629-273b-4c87-a07c-4a482064a5a3\" (UID: \"c98fd629-273b-4c87-a07c-4a482064a5a3\") " Feb 26 16:05:28 crc kubenswrapper[4907]: I0226 16:05:28.629442 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume 
\"config-data\" (UniqueName: \"kubernetes.io/secret/c98fd629-273b-4c87-a07c-4a482064a5a3-config-data\") pod \"c98fd629-273b-4c87-a07c-4a482064a5a3\" (UID: \"c98fd629-273b-4c87-a07c-4a482064a5a3\") " Feb 26 16:05:28 crc kubenswrapper[4907]: I0226 16:05:28.629760 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/a0449539-dbf4-4306-9dd9-db95f762a48a-logs\") pod \"barbican-keystone-listener-7f8d9cb4c8-5jdnw\" (UID: \"a0449539-dbf4-4306-9dd9-db95f762a48a\") " pod="openstack/barbican-keystone-listener-7f8d9cb4c8-5jdnw" Feb 26 16:05:28 crc kubenswrapper[4907]: I0226 16:05:28.629795 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/5a61624f-5b40-447b-8da6-195ff8458f6a-ovsdbserver-sb\") pod \"dnsmasq-dns-848cf88cfc-xqzwv\" (UID: \"5a61624f-5b40-447b-8da6-195ff8458f6a\") " pod="openstack/dnsmasq-dns-848cf88cfc-xqzwv" Feb 26 16:05:28 crc kubenswrapper[4907]: I0226 16:05:28.629859 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-klshp\" (UniqueName: \"kubernetes.io/projected/a0449539-dbf4-4306-9dd9-db95f762a48a-kube-api-access-klshp\") pod \"barbican-keystone-listener-7f8d9cb4c8-5jdnw\" (UID: \"a0449539-dbf4-4306-9dd9-db95f762a48a\") " pod="openstack/barbican-keystone-listener-7f8d9cb4c8-5jdnw" Feb 26 16:05:28 crc kubenswrapper[4907]: I0226 16:05:28.629888 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/a0ed716e-493d-4590-81a0-203b8618cf61-config-data-custom\") pod \"barbican-worker-d9b8ff5ff-b7kpr\" (UID: \"a0ed716e-493d-4590-81a0-203b8618cf61\") " pod="openstack/barbican-worker-d9b8ff5ff-b7kpr" Feb 26 16:05:28 crc kubenswrapper[4907]: I0226 16:05:28.629910 4907 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kbvsg\" (UniqueName: \"kubernetes.io/projected/5a61624f-5b40-447b-8da6-195ff8458f6a-kube-api-access-kbvsg\") pod \"dnsmasq-dns-848cf88cfc-xqzwv\" (UID: \"5a61624f-5b40-447b-8da6-195ff8458f6a\") " pod="openstack/dnsmasq-dns-848cf88cfc-xqzwv" Feb 26 16:05:28 crc kubenswrapper[4907]: I0226 16:05:28.629944 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/5a61624f-5b40-447b-8da6-195ff8458f6a-dns-svc\") pod \"dnsmasq-dns-848cf88cfc-xqzwv\" (UID: \"5a61624f-5b40-447b-8da6-195ff8458f6a\") " pod="openstack/dnsmasq-dns-848cf88cfc-xqzwv" Feb 26 16:05:28 crc kubenswrapper[4907]: I0226 16:05:28.629977 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/5a61624f-5b40-447b-8da6-195ff8458f6a-ovsdbserver-nb\") pod \"dnsmasq-dns-848cf88cfc-xqzwv\" (UID: \"5a61624f-5b40-447b-8da6-195ff8458f6a\") " pod="openstack/dnsmasq-dns-848cf88cfc-xqzwv" Feb 26 16:05:28 crc kubenswrapper[4907]: I0226 16:05:28.630010 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a0ed716e-493d-4590-81a0-203b8618cf61-config-data\") pod \"barbican-worker-d9b8ff5ff-b7kpr\" (UID: \"a0ed716e-493d-4590-81a0-203b8618cf61\") " pod="openstack/barbican-worker-d9b8ff5ff-b7kpr" Feb 26 16:05:28 crc kubenswrapper[4907]: I0226 16:05:28.630028 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/5a61624f-5b40-447b-8da6-195ff8458f6a-dns-swift-storage-0\") pod \"dnsmasq-dns-848cf88cfc-xqzwv\" (UID: \"5a61624f-5b40-447b-8da6-195ff8458f6a\") " pod="openstack/dnsmasq-dns-848cf88cfc-xqzwv" Feb 26 16:05:28 crc kubenswrapper[4907]: I0226 16:05:28.630049 
4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5a61624f-5b40-447b-8da6-195ff8458f6a-config\") pod \"dnsmasq-dns-848cf88cfc-xqzwv\" (UID: \"5a61624f-5b40-447b-8da6-195ff8458f6a\") " pod="openstack/dnsmasq-dns-848cf88cfc-xqzwv" Feb 26 16:05:28 crc kubenswrapper[4907]: I0226 16:05:28.630084 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/a0449539-dbf4-4306-9dd9-db95f762a48a-config-data-custom\") pod \"barbican-keystone-listener-7f8d9cb4c8-5jdnw\" (UID: \"a0449539-dbf4-4306-9dd9-db95f762a48a\") " pod="openstack/barbican-keystone-listener-7f8d9cb4c8-5jdnw" Feb 26 16:05:28 crc kubenswrapper[4907]: I0226 16:05:28.630113 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a0449539-dbf4-4306-9dd9-db95f762a48a-config-data\") pod \"barbican-keystone-listener-7f8d9cb4c8-5jdnw\" (UID: \"a0449539-dbf4-4306-9dd9-db95f762a48a\") " pod="openstack/barbican-keystone-listener-7f8d9cb4c8-5jdnw" Feb 26 16:05:28 crc kubenswrapper[4907]: I0226 16:05:28.630169 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a0ed716e-493d-4590-81a0-203b8618cf61-combined-ca-bundle\") pod \"barbican-worker-d9b8ff5ff-b7kpr\" (UID: \"a0ed716e-493d-4590-81a0-203b8618cf61\") " pod="openstack/barbican-worker-d9b8ff5ff-b7kpr" Feb 26 16:05:28 crc kubenswrapper[4907]: I0226 16:05:28.630193 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a0449539-dbf4-4306-9dd9-db95f762a48a-combined-ca-bundle\") pod \"barbican-keystone-listener-7f8d9cb4c8-5jdnw\" (UID: \"a0449539-dbf4-4306-9dd9-db95f762a48a\") " 
pod="openstack/barbican-keystone-listener-7f8d9cb4c8-5jdnw" Feb 26 16:05:28 crc kubenswrapper[4907]: I0226 16:05:28.630229 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/a0ed716e-493d-4590-81a0-203b8618cf61-logs\") pod \"barbican-worker-d9b8ff5ff-b7kpr\" (UID: \"a0ed716e-493d-4590-81a0-203b8618cf61\") " pod="openstack/barbican-worker-d9b8ff5ff-b7kpr" Feb 26 16:05:28 crc kubenswrapper[4907]: I0226 16:05:28.630255 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8bngp\" (UniqueName: \"kubernetes.io/projected/a0ed716e-493d-4590-81a0-203b8618cf61-kube-api-access-8bngp\") pod \"barbican-worker-d9b8ff5ff-b7kpr\" (UID: \"a0ed716e-493d-4590-81a0-203b8618cf61\") " pod="openstack/barbican-worker-d9b8ff5ff-b7kpr" Feb 26 16:05:28 crc kubenswrapper[4907]: I0226 16:05:28.638362 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/a0ed716e-493d-4590-81a0-203b8618cf61-logs\") pod \"barbican-worker-d9b8ff5ff-b7kpr\" (UID: \"a0ed716e-493d-4590-81a0-203b8618cf61\") " pod="openstack/barbican-worker-d9b8ff5ff-b7kpr" Feb 26 16:05:28 crc kubenswrapper[4907]: I0226 16:05:28.638455 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/c98fd629-273b-4c87-a07c-4a482064a5a3-etc-machine-id" (OuterVolumeSpecName: "etc-machine-id") pod "c98fd629-273b-4c87-a07c-4a482064a5a3" (UID: "c98fd629-273b-4c87-a07c-4a482064a5a3"). InnerVolumeSpecName "etc-machine-id". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 26 16:05:28 crc kubenswrapper[4907]: I0226 16:05:28.653686 4907 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-848cf88cfc-xqzwv"] Feb 26 16:05:28 crc kubenswrapper[4907]: I0226 16:05:28.657879 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8bngp\" (UniqueName: \"kubernetes.io/projected/a0ed716e-493d-4590-81a0-203b8618cf61-kube-api-access-8bngp\") pod \"barbican-worker-d9b8ff5ff-b7kpr\" (UID: \"a0ed716e-493d-4590-81a0-203b8618cf61\") " pod="openstack/barbican-worker-d9b8ff5ff-b7kpr" Feb 26 16:05:28 crc kubenswrapper[4907]: I0226 16:05:28.678915 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c98fd629-273b-4c87-a07c-4a482064a5a3-kube-api-access-sfcpk" (OuterVolumeSpecName: "kube-api-access-sfcpk") pod "c98fd629-273b-4c87-a07c-4a482064a5a3" (UID: "c98fd629-273b-4c87-a07c-4a482064a5a3"). InnerVolumeSpecName "kube-api-access-sfcpk". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 26 16:05:28 crc kubenswrapper[4907]: I0226 16:05:28.679766 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c98fd629-273b-4c87-a07c-4a482064a5a3-scripts" (OuterVolumeSpecName: "scripts") pod "c98fd629-273b-4c87-a07c-4a482064a5a3" (UID: "c98fd629-273b-4c87-a07c-4a482064a5a3"). InnerVolumeSpecName "scripts". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 26 16:05:28 crc kubenswrapper[4907]: I0226 16:05:28.680376 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a0ed716e-493d-4590-81a0-203b8618cf61-combined-ca-bundle\") pod \"barbican-worker-d9b8ff5ff-b7kpr\" (UID: \"a0ed716e-493d-4590-81a0-203b8618cf61\") " pod="openstack/barbican-worker-d9b8ff5ff-b7kpr" Feb 26 16:05:28 crc kubenswrapper[4907]: I0226 16:05:28.700497 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/a0ed716e-493d-4590-81a0-203b8618cf61-config-data-custom\") pod \"barbican-worker-d9b8ff5ff-b7kpr\" (UID: \"a0ed716e-493d-4590-81a0-203b8618cf61\") " pod="openstack/barbican-worker-d9b8ff5ff-b7kpr" Feb 26 16:05:28 crc kubenswrapper[4907]: I0226 16:05:28.715699 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c98fd629-273b-4c87-a07c-4a482064a5a3-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "c98fd629-273b-4c87-a07c-4a482064a5a3" (UID: "c98fd629-273b-4c87-a07c-4a482064a5a3"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 26 16:05:28 crc kubenswrapper[4907]: I0226 16:05:28.715760 4907 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-api-766c5c4f46-9j8qd"] Feb 26 16:05:28 crc kubenswrapper[4907]: I0226 16:05:28.717267 4907 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-api-766c5c4f46-9j8qd" Feb 26 16:05:28 crc kubenswrapper[4907]: I0226 16:05:28.719386 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a0ed716e-493d-4590-81a0-203b8618cf61-config-data\") pod \"barbican-worker-d9b8ff5ff-b7kpr\" (UID: \"a0ed716e-493d-4590-81a0-203b8618cf61\") " pod="openstack/barbican-worker-d9b8ff5ff-b7kpr" Feb 26 16:05:28 crc kubenswrapper[4907]: I0226 16:05:28.722969 4907 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-api-766c5c4f46-9j8qd"] Feb 26 16:05:28 crc kubenswrapper[4907]: I0226 16:05:28.726464 4907 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-api-config-data" Feb 26 16:05:28 crc kubenswrapper[4907]: I0226 16:05:28.730712 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c98fd629-273b-4c87-a07c-4a482064a5a3-db-sync-config-data" (OuterVolumeSpecName: "db-sync-config-data") pod "c98fd629-273b-4c87-a07c-4a482064a5a3" (UID: "c98fd629-273b-4c87-a07c-4a482064a5a3"). InnerVolumeSpecName "db-sync-config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 26 16:05:28 crc kubenswrapper[4907]: I0226 16:05:28.732691 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/5a61624f-5b40-447b-8da6-195ff8458f6a-ovsdbserver-sb\") pod \"dnsmasq-dns-848cf88cfc-xqzwv\" (UID: \"5a61624f-5b40-447b-8da6-195ff8458f6a\") " pod="openstack/dnsmasq-dns-848cf88cfc-xqzwv" Feb 26 16:05:28 crc kubenswrapper[4907]: I0226 16:05:28.732827 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-klshp\" (UniqueName: \"kubernetes.io/projected/a0449539-dbf4-4306-9dd9-db95f762a48a-kube-api-access-klshp\") pod \"barbican-keystone-listener-7f8d9cb4c8-5jdnw\" (UID: \"a0449539-dbf4-4306-9dd9-db95f762a48a\") " pod="openstack/barbican-keystone-listener-7f8d9cb4c8-5jdnw" Feb 26 16:05:28 crc kubenswrapper[4907]: I0226 16:05:28.732866 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kbvsg\" (UniqueName: \"kubernetes.io/projected/5a61624f-5b40-447b-8da6-195ff8458f6a-kube-api-access-kbvsg\") pod \"dnsmasq-dns-848cf88cfc-xqzwv\" (UID: \"5a61624f-5b40-447b-8da6-195ff8458f6a\") " pod="openstack/dnsmasq-dns-848cf88cfc-xqzwv" Feb 26 16:05:28 crc kubenswrapper[4907]: I0226 16:05:28.732908 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/5a61624f-5b40-447b-8da6-195ff8458f6a-dns-svc\") pod \"dnsmasq-dns-848cf88cfc-xqzwv\" (UID: \"5a61624f-5b40-447b-8da6-195ff8458f6a\") " pod="openstack/dnsmasq-dns-848cf88cfc-xqzwv" Feb 26 16:05:28 crc kubenswrapper[4907]: I0226 16:05:28.732935 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/5a61624f-5b40-447b-8da6-195ff8458f6a-ovsdbserver-nb\") pod \"dnsmasq-dns-848cf88cfc-xqzwv\" (UID: \"5a61624f-5b40-447b-8da6-195ff8458f6a\") " 
pod="openstack/dnsmasq-dns-848cf88cfc-xqzwv" Feb 26 16:05:28 crc kubenswrapper[4907]: I0226 16:05:28.732980 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/5a61624f-5b40-447b-8da6-195ff8458f6a-dns-swift-storage-0\") pod \"dnsmasq-dns-848cf88cfc-xqzwv\" (UID: \"5a61624f-5b40-447b-8da6-195ff8458f6a\") " pod="openstack/dnsmasq-dns-848cf88cfc-xqzwv" Feb 26 16:05:28 crc kubenswrapper[4907]: I0226 16:05:28.733006 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5a61624f-5b40-447b-8da6-195ff8458f6a-config\") pod \"dnsmasq-dns-848cf88cfc-xqzwv\" (UID: \"5a61624f-5b40-447b-8da6-195ff8458f6a\") " pod="openstack/dnsmasq-dns-848cf88cfc-xqzwv" Feb 26 16:05:28 crc kubenswrapper[4907]: I0226 16:05:28.733061 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/a0449539-dbf4-4306-9dd9-db95f762a48a-config-data-custom\") pod \"barbican-keystone-listener-7f8d9cb4c8-5jdnw\" (UID: \"a0449539-dbf4-4306-9dd9-db95f762a48a\") " pod="openstack/barbican-keystone-listener-7f8d9cb4c8-5jdnw" Feb 26 16:05:28 crc kubenswrapper[4907]: I0226 16:05:28.733090 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a0449539-dbf4-4306-9dd9-db95f762a48a-config-data\") pod \"barbican-keystone-listener-7f8d9cb4c8-5jdnw\" (UID: \"a0449539-dbf4-4306-9dd9-db95f762a48a\") " pod="openstack/barbican-keystone-listener-7f8d9cb4c8-5jdnw" Feb 26 16:05:28 crc kubenswrapper[4907]: I0226 16:05:28.733173 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a0449539-dbf4-4306-9dd9-db95f762a48a-combined-ca-bundle\") pod \"barbican-keystone-listener-7f8d9cb4c8-5jdnw\" (UID: 
\"a0449539-dbf4-4306-9dd9-db95f762a48a\") " pod="openstack/barbican-keystone-listener-7f8d9cb4c8-5jdnw" Feb 26 16:05:28 crc kubenswrapper[4907]: I0226 16:05:28.733240 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/a0449539-dbf4-4306-9dd9-db95f762a48a-logs\") pod \"barbican-keystone-listener-7f8d9cb4c8-5jdnw\" (UID: \"a0449539-dbf4-4306-9dd9-db95f762a48a\") " pod="openstack/barbican-keystone-listener-7f8d9cb4c8-5jdnw" Feb 26 16:05:28 crc kubenswrapper[4907]: I0226 16:05:28.733312 4907 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/c98fd629-273b-4c87-a07c-4a482064a5a3-scripts\") on node \"crc\" DevicePath \"\"" Feb 26 16:05:28 crc kubenswrapper[4907]: I0226 16:05:28.733324 4907 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c98fd629-273b-4c87-a07c-4a482064a5a3-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 26 16:05:28 crc kubenswrapper[4907]: I0226 16:05:28.733335 4907 reconciler_common.go:293] "Volume detached for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/c98fd629-273b-4c87-a07c-4a482064a5a3-db-sync-config-data\") on node \"crc\" DevicePath \"\"" Feb 26 16:05:28 crc kubenswrapper[4907]: I0226 16:05:28.733344 4907 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-sfcpk\" (UniqueName: \"kubernetes.io/projected/c98fd629-273b-4c87-a07c-4a482064a5a3-kube-api-access-sfcpk\") on node \"crc\" DevicePath \"\"" Feb 26 16:05:28 crc kubenswrapper[4907]: I0226 16:05:28.733377 4907 reconciler_common.go:293] "Volume detached for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/c98fd629-273b-4c87-a07c-4a482064a5a3-etc-machine-id\") on node \"crc\" DevicePath \"\"" Feb 26 16:05:28 crc kubenswrapper[4907]: I0226 16:05:28.734050 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" 
(UniqueName: \"kubernetes.io/empty-dir/a0449539-dbf4-4306-9dd9-db95f762a48a-logs\") pod \"barbican-keystone-listener-7f8d9cb4c8-5jdnw\" (UID: \"a0449539-dbf4-4306-9dd9-db95f762a48a\") " pod="openstack/barbican-keystone-listener-7f8d9cb4c8-5jdnw" Feb 26 16:05:28 crc kubenswrapper[4907]: I0226 16:05:28.736133 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/5a61624f-5b40-447b-8da6-195ff8458f6a-ovsdbserver-sb\") pod \"dnsmasq-dns-848cf88cfc-xqzwv\" (UID: \"5a61624f-5b40-447b-8da6-195ff8458f6a\") " pod="openstack/dnsmasq-dns-848cf88cfc-xqzwv" Feb 26 16:05:28 crc kubenswrapper[4907]: I0226 16:05:28.741127 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/5a61624f-5b40-447b-8da6-195ff8458f6a-dns-swift-storage-0\") pod \"dnsmasq-dns-848cf88cfc-xqzwv\" (UID: \"5a61624f-5b40-447b-8da6-195ff8458f6a\") " pod="openstack/dnsmasq-dns-848cf88cfc-xqzwv" Feb 26 16:05:28 crc kubenswrapper[4907]: I0226 16:05:28.741911 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/5a61624f-5b40-447b-8da6-195ff8458f6a-dns-svc\") pod \"dnsmasq-dns-848cf88cfc-xqzwv\" (UID: \"5a61624f-5b40-447b-8da6-195ff8458f6a\") " pod="openstack/dnsmasq-dns-848cf88cfc-xqzwv" Feb 26 16:05:28 crc kubenswrapper[4907]: I0226 16:05:28.742637 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/5a61624f-5b40-447b-8da6-195ff8458f6a-ovsdbserver-nb\") pod \"dnsmasq-dns-848cf88cfc-xqzwv\" (UID: \"5a61624f-5b40-447b-8da6-195ff8458f6a\") " pod="openstack/dnsmasq-dns-848cf88cfc-xqzwv" Feb 26 16:05:28 crc kubenswrapper[4907]: I0226 16:05:28.747341 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: 
\"kubernetes.io/secret/a0449539-dbf4-4306-9dd9-db95f762a48a-config-data\") pod \"barbican-keystone-listener-7f8d9cb4c8-5jdnw\" (UID: \"a0449539-dbf4-4306-9dd9-db95f762a48a\") " pod="openstack/barbican-keystone-listener-7f8d9cb4c8-5jdnw" Feb 26 16:05:28 crc kubenswrapper[4907]: I0226 16:05:28.750134 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5a61624f-5b40-447b-8da6-195ff8458f6a-config\") pod \"dnsmasq-dns-848cf88cfc-xqzwv\" (UID: \"5a61624f-5b40-447b-8da6-195ff8458f6a\") " pod="openstack/dnsmasq-dns-848cf88cfc-xqzwv" Feb 26 16:05:28 crc kubenswrapper[4907]: I0226 16:05:28.758305 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/a0449539-dbf4-4306-9dd9-db95f762a48a-config-data-custom\") pod \"barbican-keystone-listener-7f8d9cb4c8-5jdnw\" (UID: \"a0449539-dbf4-4306-9dd9-db95f762a48a\") " pod="openstack/barbican-keystone-listener-7f8d9cb4c8-5jdnw" Feb 26 16:05:28 crc kubenswrapper[4907]: I0226 16:05:28.758830 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a0449539-dbf4-4306-9dd9-db95f762a48a-combined-ca-bundle\") pod \"barbican-keystone-listener-7f8d9cb4c8-5jdnw\" (UID: \"a0449539-dbf4-4306-9dd9-db95f762a48a\") " pod="openstack/barbican-keystone-listener-7f8d9cb4c8-5jdnw" Feb 26 16:05:28 crc kubenswrapper[4907]: I0226 16:05:28.771159 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kbvsg\" (UniqueName: \"kubernetes.io/projected/5a61624f-5b40-447b-8da6-195ff8458f6a-kube-api-access-kbvsg\") pod \"dnsmasq-dns-848cf88cfc-xqzwv\" (UID: \"5a61624f-5b40-447b-8da6-195ff8458f6a\") " pod="openstack/dnsmasq-dns-848cf88cfc-xqzwv" Feb 26 16:05:28 crc kubenswrapper[4907]: I0226 16:05:28.772007 4907 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-worker-d9b8ff5ff-b7kpr" Feb 26 16:05:28 crc kubenswrapper[4907]: I0226 16:05:28.777518 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c98fd629-273b-4c87-a07c-4a482064a5a3-config-data" (OuterVolumeSpecName: "config-data") pod "c98fd629-273b-4c87-a07c-4a482064a5a3" (UID: "c98fd629-273b-4c87-a07c-4a482064a5a3"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 26 16:05:28 crc kubenswrapper[4907]: I0226 16:05:28.786225 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-klshp\" (UniqueName: \"kubernetes.io/projected/a0449539-dbf4-4306-9dd9-db95f762a48a-kube-api-access-klshp\") pod \"barbican-keystone-listener-7f8d9cb4c8-5jdnw\" (UID: \"a0449539-dbf4-4306-9dd9-db95f762a48a\") " pod="openstack/barbican-keystone-listener-7f8d9cb4c8-5jdnw" Feb 26 16:05:28 crc kubenswrapper[4907]: I0226 16:05:28.792373 4907 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-848cf88cfc-xqzwv" Feb 26 16:05:28 crc kubenswrapper[4907]: I0226 16:05:28.847459 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/18111fe1-07d0-420e-bc61-457532bdb122-config-data-custom\") pod \"barbican-api-766c5c4f46-9j8qd\" (UID: \"18111fe1-07d0-420e-bc61-457532bdb122\") " pod="openstack/barbican-api-766c5c4f46-9j8qd" Feb 26 16:05:28 crc kubenswrapper[4907]: I0226 16:05:28.847627 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/18111fe1-07d0-420e-bc61-457532bdb122-logs\") pod \"barbican-api-766c5c4f46-9j8qd\" (UID: \"18111fe1-07d0-420e-bc61-457532bdb122\") " pod="openstack/barbican-api-766c5c4f46-9j8qd" Feb 26 16:05:28 crc kubenswrapper[4907]: I0226 16:05:28.847664 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cx4kz\" (UniqueName: \"kubernetes.io/projected/18111fe1-07d0-420e-bc61-457532bdb122-kube-api-access-cx4kz\") pod \"barbican-api-766c5c4f46-9j8qd\" (UID: \"18111fe1-07d0-420e-bc61-457532bdb122\") " pod="openstack/barbican-api-766c5c4f46-9j8qd" Feb 26 16:05:28 crc kubenswrapper[4907]: I0226 16:05:28.847762 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/18111fe1-07d0-420e-bc61-457532bdb122-config-data\") pod \"barbican-api-766c5c4f46-9j8qd\" (UID: \"18111fe1-07d0-420e-bc61-457532bdb122\") " pod="openstack/barbican-api-766c5c4f46-9j8qd" Feb 26 16:05:28 crc kubenswrapper[4907]: I0226 16:05:28.847796 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/18111fe1-07d0-420e-bc61-457532bdb122-combined-ca-bundle\") pod 
\"barbican-api-766c5c4f46-9j8qd\" (UID: \"18111fe1-07d0-420e-bc61-457532bdb122\") " pod="openstack/barbican-api-766c5c4f46-9j8qd" Feb 26 16:05:28 crc kubenswrapper[4907]: I0226 16:05:28.847864 4907 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c98fd629-273b-4c87-a07c-4a482064a5a3-config-data\") on node \"crc\" DevicePath \"\"" Feb 26 16:05:28 crc kubenswrapper[4907]: I0226 16:05:28.948986 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/18111fe1-07d0-420e-bc61-457532bdb122-config-data\") pod \"barbican-api-766c5c4f46-9j8qd\" (UID: \"18111fe1-07d0-420e-bc61-457532bdb122\") " pod="openstack/barbican-api-766c5c4f46-9j8qd" Feb 26 16:05:28 crc kubenswrapper[4907]: I0226 16:05:28.949309 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/18111fe1-07d0-420e-bc61-457532bdb122-combined-ca-bundle\") pod \"barbican-api-766c5c4f46-9j8qd\" (UID: \"18111fe1-07d0-420e-bc61-457532bdb122\") " pod="openstack/barbican-api-766c5c4f46-9j8qd" Feb 26 16:05:28 crc kubenswrapper[4907]: I0226 16:05:28.949375 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/18111fe1-07d0-420e-bc61-457532bdb122-config-data-custom\") pod \"barbican-api-766c5c4f46-9j8qd\" (UID: \"18111fe1-07d0-420e-bc61-457532bdb122\") " pod="openstack/barbican-api-766c5c4f46-9j8qd" Feb 26 16:05:28 crc kubenswrapper[4907]: I0226 16:05:28.949490 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/18111fe1-07d0-420e-bc61-457532bdb122-logs\") pod \"barbican-api-766c5c4f46-9j8qd\" (UID: \"18111fe1-07d0-420e-bc61-457532bdb122\") " pod="openstack/barbican-api-766c5c4f46-9j8qd" Feb 26 16:05:28 crc kubenswrapper[4907]: I0226 16:05:28.949526 
4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cx4kz\" (UniqueName: \"kubernetes.io/projected/18111fe1-07d0-420e-bc61-457532bdb122-kube-api-access-cx4kz\") pod \"barbican-api-766c5c4f46-9j8qd\" (UID: \"18111fe1-07d0-420e-bc61-457532bdb122\") " pod="openstack/barbican-api-766c5c4f46-9j8qd" Feb 26 16:05:28 crc kubenswrapper[4907]: I0226 16:05:28.952182 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/18111fe1-07d0-420e-bc61-457532bdb122-logs\") pod \"barbican-api-766c5c4f46-9j8qd\" (UID: \"18111fe1-07d0-420e-bc61-457532bdb122\") " pod="openstack/barbican-api-766c5c4f46-9j8qd" Feb 26 16:05:28 crc kubenswrapper[4907]: I0226 16:05:28.956568 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/18111fe1-07d0-420e-bc61-457532bdb122-combined-ca-bundle\") pod \"barbican-api-766c5c4f46-9j8qd\" (UID: \"18111fe1-07d0-420e-bc61-457532bdb122\") " pod="openstack/barbican-api-766c5c4f46-9j8qd" Feb 26 16:05:28 crc kubenswrapper[4907]: I0226 16:05:28.981169 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/18111fe1-07d0-420e-bc61-457532bdb122-config-data\") pod \"barbican-api-766c5c4f46-9j8qd\" (UID: \"18111fe1-07d0-420e-bc61-457532bdb122\") " pod="openstack/barbican-api-766c5c4f46-9j8qd" Feb 26 16:05:28 crc kubenswrapper[4907]: I0226 16:05:28.981531 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/18111fe1-07d0-420e-bc61-457532bdb122-config-data-custom\") pod \"barbican-api-766c5c4f46-9j8qd\" (UID: \"18111fe1-07d0-420e-bc61-457532bdb122\") " pod="openstack/barbican-api-766c5c4f46-9j8qd" Feb 26 16:05:28 crc kubenswrapper[4907]: I0226 16:05:28.990920 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"kube-api-access-cx4kz\" (UniqueName: \"kubernetes.io/projected/18111fe1-07d0-420e-bc61-457532bdb122-kube-api-access-cx4kz\") pod \"barbican-api-766c5c4f46-9j8qd\" (UID: \"18111fe1-07d0-420e-bc61-457532bdb122\") " pod="openstack/barbican-api-766c5c4f46-9j8qd" Feb 26 16:05:29 crc kubenswrapper[4907]: I0226 16:05:29.049495 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-sync-xvvbl" event={"ID":"c98fd629-273b-4c87-a07c-4a482064a5a3","Type":"ContainerDied","Data":"b308d5b26b10adbd8d37d83e53aa48b2e59f9e627577cd43ab18b13bfa2bc4b7"} Feb 26 16:05:29 crc kubenswrapper[4907]: I0226 16:05:29.049531 4907 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="b308d5b26b10adbd8d37d83e53aa48b2e59f9e627577cd43ab18b13bfa2bc4b7" Feb 26 16:05:29 crc kubenswrapper[4907]: I0226 16:05:29.049642 4907 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-db-sync-xvvbl" Feb 26 16:05:29 crc kubenswrapper[4907]: I0226 16:05:29.064972 4907 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-keystone-listener-7f8d9cb4c8-5jdnw" Feb 26 16:05:29 crc kubenswrapper[4907]: I0226 16:05:29.083171 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"9db5f721-707b-490c-917f-b3b2a85af07c","Type":"ContainerStarted","Data":"59d0fd0269ec491062ba3cef75eb411111e2fece4fdb177e8d8680c56412909d"} Feb 26 16:05:29 crc kubenswrapper[4907]: I0226 16:05:29.113613 4907 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-api-766c5c4f46-9j8qd" Feb 26 16:05:29 crc kubenswrapper[4907]: I0226 16:05:29.515656 4907 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-scheduler-0"] Feb 26 16:05:29 crc kubenswrapper[4907]: I0226 16:05:29.517369 4907 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-scheduler-0" Feb 26 16:05:29 crc kubenswrapper[4907]: I0226 16:05:29.534258 4907 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-scripts" Feb 26 16:05:29 crc kubenswrapper[4907]: I0226 16:05:29.537719 4907 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-cinder-dockercfg-rm5vl" Feb 26 16:05:29 crc kubenswrapper[4907]: I0226 16:05:29.537952 4907 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-scheduler-config-data" Feb 26 16:05:29 crc kubenswrapper[4907]: I0226 16:05:29.538072 4907 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-config-data" Feb 26 16:05:29 crc kubenswrapper[4907]: I0226 16:05:29.562564 4907 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-848cf88cfc-xqzwv"] Feb 26 16:05:29 crc kubenswrapper[4907]: I0226 16:05:29.579068 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/a140df23-061c-4941-855b-3c829a96d63e-config-data-custom\") pod \"cinder-scheduler-0\" (UID: \"a140df23-061c-4941-855b-3c829a96d63e\") " pod="openstack/cinder-scheduler-0" Feb 26 16:05:29 crc kubenswrapper[4907]: I0226 16:05:29.579181 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/a140df23-061c-4941-855b-3c829a96d63e-etc-machine-id\") pod \"cinder-scheduler-0\" (UID: \"a140df23-061c-4941-855b-3c829a96d63e\") " pod="openstack/cinder-scheduler-0" Feb 26 16:05:29 crc kubenswrapper[4907]: I0226 16:05:29.579209 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/a140df23-061c-4941-855b-3c829a96d63e-scripts\") pod \"cinder-scheduler-0\" (UID: 
\"a140df23-061c-4941-855b-3c829a96d63e\") " pod="openstack/cinder-scheduler-0" Feb 26 16:05:29 crc kubenswrapper[4907]: I0226 16:05:29.579245 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-x5cwj\" (UniqueName: \"kubernetes.io/projected/a140df23-061c-4941-855b-3c829a96d63e-kube-api-access-x5cwj\") pod \"cinder-scheduler-0\" (UID: \"a140df23-061c-4941-855b-3c829a96d63e\") " pod="openstack/cinder-scheduler-0" Feb 26 16:05:29 crc kubenswrapper[4907]: I0226 16:05:29.579307 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a140df23-061c-4941-855b-3c829a96d63e-config-data\") pod \"cinder-scheduler-0\" (UID: \"a140df23-061c-4941-855b-3c829a96d63e\") " pod="openstack/cinder-scheduler-0" Feb 26 16:05:29 crc kubenswrapper[4907]: I0226 16:05:29.579329 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a140df23-061c-4941-855b-3c829a96d63e-combined-ca-bundle\") pod \"cinder-scheduler-0\" (UID: \"a140df23-061c-4941-855b-3c829a96d63e\") " pod="openstack/cinder-scheduler-0" Feb 26 16:05:29 crc kubenswrapper[4907]: I0226 16:05:29.625729 4907 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-worker-d9b8ff5ff-b7kpr"] Feb 26 16:05:29 crc kubenswrapper[4907]: I0226 16:05:29.640433 4907 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-scheduler-0"] Feb 26 16:05:29 crc kubenswrapper[4907]: I0226 16:05:29.668033 4907 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-6578955fd5-khgm9"] Feb 26 16:05:29 crc kubenswrapper[4907]: I0226 16:05:29.669825 4907 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-6578955fd5-khgm9" Feb 26 16:05:29 crc kubenswrapper[4907]: I0226 16:05:29.684810 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a140df23-061c-4941-855b-3c829a96d63e-config-data\") pod \"cinder-scheduler-0\" (UID: \"a140df23-061c-4941-855b-3c829a96d63e\") " pod="openstack/cinder-scheduler-0" Feb 26 16:05:29 crc kubenswrapper[4907]: I0226 16:05:29.684856 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a140df23-061c-4941-855b-3c829a96d63e-combined-ca-bundle\") pod \"cinder-scheduler-0\" (UID: \"a140df23-061c-4941-855b-3c829a96d63e\") " pod="openstack/cinder-scheduler-0" Feb 26 16:05:29 crc kubenswrapper[4907]: I0226 16:05:29.684915 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/a140df23-061c-4941-855b-3c829a96d63e-config-data-custom\") pod \"cinder-scheduler-0\" (UID: \"a140df23-061c-4941-855b-3c829a96d63e\") " pod="openstack/cinder-scheduler-0" Feb 26 16:05:29 crc kubenswrapper[4907]: I0226 16:05:29.684979 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/a140df23-061c-4941-855b-3c829a96d63e-etc-machine-id\") pod \"cinder-scheduler-0\" (UID: \"a140df23-061c-4941-855b-3c829a96d63e\") " pod="openstack/cinder-scheduler-0" Feb 26 16:05:29 crc kubenswrapper[4907]: I0226 16:05:29.685000 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/a140df23-061c-4941-855b-3c829a96d63e-scripts\") pod \"cinder-scheduler-0\" (UID: \"a140df23-061c-4941-855b-3c829a96d63e\") " pod="openstack/cinder-scheduler-0" Feb 26 16:05:29 crc kubenswrapper[4907]: I0226 16:05:29.685041 4907 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"kube-api-access-x5cwj\" (UniqueName: \"kubernetes.io/projected/a140df23-061c-4941-855b-3c829a96d63e-kube-api-access-x5cwj\") pod \"cinder-scheduler-0\" (UID: \"a140df23-061c-4941-855b-3c829a96d63e\") " pod="openstack/cinder-scheduler-0" Feb 26 16:05:29 crc kubenswrapper[4907]: I0226 16:05:29.685427 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/a140df23-061c-4941-855b-3c829a96d63e-etc-machine-id\") pod \"cinder-scheduler-0\" (UID: \"a140df23-061c-4941-855b-3c829a96d63e\") " pod="openstack/cinder-scheduler-0" Feb 26 16:05:29 crc kubenswrapper[4907]: I0226 16:05:29.697639 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/a140df23-061c-4941-855b-3c829a96d63e-config-data-custom\") pod \"cinder-scheduler-0\" (UID: \"a140df23-061c-4941-855b-3c829a96d63e\") " pod="openstack/cinder-scheduler-0" Feb 26 16:05:29 crc kubenswrapper[4907]: I0226 16:05:29.705623 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a140df23-061c-4941-855b-3c829a96d63e-config-data\") pod \"cinder-scheduler-0\" (UID: \"a140df23-061c-4941-855b-3c829a96d63e\") " pod="openstack/cinder-scheduler-0" Feb 26 16:05:29 crc kubenswrapper[4907]: I0226 16:05:29.718452 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a140df23-061c-4941-855b-3c829a96d63e-combined-ca-bundle\") pod \"cinder-scheduler-0\" (UID: \"a140df23-061c-4941-855b-3c829a96d63e\") " pod="openstack/cinder-scheduler-0" Feb 26 16:05:29 crc kubenswrapper[4907]: I0226 16:05:29.736824 4907 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-6578955fd5-khgm9"] Feb 26 16:05:29 crc kubenswrapper[4907]: I0226 16:05:29.738253 4907 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"kube-api-access-x5cwj\" (UniqueName: \"kubernetes.io/projected/a140df23-061c-4941-855b-3c829a96d63e-kube-api-access-x5cwj\") pod \"cinder-scheduler-0\" (UID: \"a140df23-061c-4941-855b-3c829a96d63e\") " pod="openstack/cinder-scheduler-0" Feb 26 16:05:29 crc kubenswrapper[4907]: I0226 16:05:29.744049 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/a140df23-061c-4941-855b-3c829a96d63e-scripts\") pod \"cinder-scheduler-0\" (UID: \"a140df23-061c-4941-855b-3c829a96d63e\") " pod="openstack/cinder-scheduler-0" Feb 26 16:05:29 crc kubenswrapper[4907]: I0226 16:05:29.790734 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/aa93ec18-05b5-4814-989a-ec50a85bba83-config\") pod \"dnsmasq-dns-6578955fd5-khgm9\" (UID: \"aa93ec18-05b5-4814-989a-ec50a85bba83\") " pod="openstack/dnsmasq-dns-6578955fd5-khgm9" Feb 26 16:05:29 crc kubenswrapper[4907]: I0226 16:05:29.790792 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/aa93ec18-05b5-4814-989a-ec50a85bba83-ovsdbserver-sb\") pod \"dnsmasq-dns-6578955fd5-khgm9\" (UID: \"aa93ec18-05b5-4814-989a-ec50a85bba83\") " pod="openstack/dnsmasq-dns-6578955fd5-khgm9" Feb 26 16:05:29 crc kubenswrapper[4907]: I0226 16:05:29.790829 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/aa93ec18-05b5-4814-989a-ec50a85bba83-dns-svc\") pod \"dnsmasq-dns-6578955fd5-khgm9\" (UID: \"aa93ec18-05b5-4814-989a-ec50a85bba83\") " pod="openstack/dnsmasq-dns-6578955fd5-khgm9" Feb 26 16:05:29 crc kubenswrapper[4907]: I0226 16:05:29.791126 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"kube-api-access-s42sx\" (UniqueName: \"kubernetes.io/projected/aa93ec18-05b5-4814-989a-ec50a85bba83-kube-api-access-s42sx\") pod \"dnsmasq-dns-6578955fd5-khgm9\" (UID: \"aa93ec18-05b5-4814-989a-ec50a85bba83\") " pod="openstack/dnsmasq-dns-6578955fd5-khgm9" Feb 26 16:05:29 crc kubenswrapper[4907]: I0226 16:05:29.791222 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/aa93ec18-05b5-4814-989a-ec50a85bba83-dns-swift-storage-0\") pod \"dnsmasq-dns-6578955fd5-khgm9\" (UID: \"aa93ec18-05b5-4814-989a-ec50a85bba83\") " pod="openstack/dnsmasq-dns-6578955fd5-khgm9" Feb 26 16:05:29 crc kubenswrapper[4907]: I0226 16:05:29.791371 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/aa93ec18-05b5-4814-989a-ec50a85bba83-ovsdbserver-nb\") pod \"dnsmasq-dns-6578955fd5-khgm9\" (UID: \"aa93ec18-05b5-4814-989a-ec50a85bba83\") " pod="openstack/dnsmasq-dns-6578955fd5-khgm9" Feb 26 16:05:29 crc kubenswrapper[4907]: I0226 16:05:29.861204 4907 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-scheduler-0" Feb 26 16:05:29 crc kubenswrapper[4907]: I0226 16:05:29.895869 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/aa93ec18-05b5-4814-989a-ec50a85bba83-ovsdbserver-nb\") pod \"dnsmasq-dns-6578955fd5-khgm9\" (UID: \"aa93ec18-05b5-4814-989a-ec50a85bba83\") " pod="openstack/dnsmasq-dns-6578955fd5-khgm9" Feb 26 16:05:29 crc kubenswrapper[4907]: I0226 16:05:29.895948 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/aa93ec18-05b5-4814-989a-ec50a85bba83-config\") pod \"dnsmasq-dns-6578955fd5-khgm9\" (UID: \"aa93ec18-05b5-4814-989a-ec50a85bba83\") " pod="openstack/dnsmasq-dns-6578955fd5-khgm9" Feb 26 16:05:29 crc kubenswrapper[4907]: I0226 16:05:29.895985 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/aa93ec18-05b5-4814-989a-ec50a85bba83-ovsdbserver-sb\") pod \"dnsmasq-dns-6578955fd5-khgm9\" (UID: \"aa93ec18-05b5-4814-989a-ec50a85bba83\") " pod="openstack/dnsmasq-dns-6578955fd5-khgm9" Feb 26 16:05:29 crc kubenswrapper[4907]: I0226 16:05:29.896014 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/aa93ec18-05b5-4814-989a-ec50a85bba83-dns-svc\") pod \"dnsmasq-dns-6578955fd5-khgm9\" (UID: \"aa93ec18-05b5-4814-989a-ec50a85bba83\") " pod="openstack/dnsmasq-dns-6578955fd5-khgm9" Feb 26 16:05:29 crc kubenswrapper[4907]: I0226 16:05:29.896068 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s42sx\" (UniqueName: \"kubernetes.io/projected/aa93ec18-05b5-4814-989a-ec50a85bba83-kube-api-access-s42sx\") pod \"dnsmasq-dns-6578955fd5-khgm9\" (UID: \"aa93ec18-05b5-4814-989a-ec50a85bba83\") " pod="openstack/dnsmasq-dns-6578955fd5-khgm9" Feb 26 
16:05:29 crc kubenswrapper[4907]: I0226 16:05:29.896094 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/aa93ec18-05b5-4814-989a-ec50a85bba83-dns-swift-storage-0\") pod \"dnsmasq-dns-6578955fd5-khgm9\" (UID: \"aa93ec18-05b5-4814-989a-ec50a85bba83\") " pod="openstack/dnsmasq-dns-6578955fd5-khgm9" Feb 26 16:05:29 crc kubenswrapper[4907]: I0226 16:05:29.897080 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/aa93ec18-05b5-4814-989a-ec50a85bba83-dns-swift-storage-0\") pod \"dnsmasq-dns-6578955fd5-khgm9\" (UID: \"aa93ec18-05b5-4814-989a-ec50a85bba83\") " pod="openstack/dnsmasq-dns-6578955fd5-khgm9" Feb 26 16:05:29 crc kubenswrapper[4907]: I0226 16:05:29.900209 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/aa93ec18-05b5-4814-989a-ec50a85bba83-ovsdbserver-sb\") pod \"dnsmasq-dns-6578955fd5-khgm9\" (UID: \"aa93ec18-05b5-4814-989a-ec50a85bba83\") " pod="openstack/dnsmasq-dns-6578955fd5-khgm9" Feb 26 16:05:29 crc kubenswrapper[4907]: I0226 16:05:29.910183 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/aa93ec18-05b5-4814-989a-ec50a85bba83-dns-svc\") pod \"dnsmasq-dns-6578955fd5-khgm9\" (UID: \"aa93ec18-05b5-4814-989a-ec50a85bba83\") " pod="openstack/dnsmasq-dns-6578955fd5-khgm9" Feb 26 16:05:29 crc kubenswrapper[4907]: I0226 16:05:29.913347 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/aa93ec18-05b5-4814-989a-ec50a85bba83-config\") pod \"dnsmasq-dns-6578955fd5-khgm9\" (UID: \"aa93ec18-05b5-4814-989a-ec50a85bba83\") " pod="openstack/dnsmasq-dns-6578955fd5-khgm9" Feb 26 16:05:29 crc kubenswrapper[4907]: I0226 16:05:29.916500 4907 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/aa93ec18-05b5-4814-989a-ec50a85bba83-ovsdbserver-nb\") pod \"dnsmasq-dns-6578955fd5-khgm9\" (UID: \"aa93ec18-05b5-4814-989a-ec50a85bba83\") " pod="openstack/dnsmasq-dns-6578955fd5-khgm9" Feb 26 16:05:29 crc kubenswrapper[4907]: I0226 16:05:29.957185 4907 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-848cf88cfc-xqzwv"] Feb 26 16:05:29 crc kubenswrapper[4907]: I0226 16:05:29.972727 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-s42sx\" (UniqueName: \"kubernetes.io/projected/aa93ec18-05b5-4814-989a-ec50a85bba83-kube-api-access-s42sx\") pod \"dnsmasq-dns-6578955fd5-khgm9\" (UID: \"aa93ec18-05b5-4814-989a-ec50a85bba83\") " pod="openstack/dnsmasq-dns-6578955fd5-khgm9" Feb 26 16:05:30 crc kubenswrapper[4907]: I0226 16:05:30.046679 4907 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-keystone-listener-7f8d9cb4c8-5jdnw"] Feb 26 16:05:30 crc kubenswrapper[4907]: I0226 16:05:30.065741 4907 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-api-0"] Feb 26 16:05:30 crc kubenswrapper[4907]: I0226 16:05:30.074727 4907 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-api-0" Feb 26 16:05:30 crc kubenswrapper[4907]: I0226 16:05:30.090958 4907 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-api-config-data" Feb 26 16:05:30 crc kubenswrapper[4907]: I0226 16:05:30.094559 4907 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-api-0"] Feb 26 16:05:30 crc kubenswrapper[4907]: I0226 16:05:30.106901 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/466a75e1-c85d-4d33-b9c7-6916eca1ebe1-combined-ca-bundle\") pod \"cinder-api-0\" (UID: \"466a75e1-c85d-4d33-b9c7-6916eca1ebe1\") " pod="openstack/cinder-api-0" Feb 26 16:05:30 crc kubenswrapper[4907]: I0226 16:05:30.106956 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/466a75e1-c85d-4d33-b9c7-6916eca1ebe1-logs\") pod \"cinder-api-0\" (UID: \"466a75e1-c85d-4d33-b9c7-6916eca1ebe1\") " pod="openstack/cinder-api-0" Feb 26 16:05:30 crc kubenswrapper[4907]: I0226 16:05:30.107000 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/466a75e1-c85d-4d33-b9c7-6916eca1ebe1-etc-machine-id\") pod \"cinder-api-0\" (UID: \"466a75e1-c85d-4d33-b9c7-6916eca1ebe1\") " pod="openstack/cinder-api-0" Feb 26 16:05:30 crc kubenswrapper[4907]: I0226 16:05:30.107029 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-w58f8\" (UniqueName: \"kubernetes.io/projected/466a75e1-c85d-4d33-b9c7-6916eca1ebe1-kube-api-access-w58f8\") pod \"cinder-api-0\" (UID: \"466a75e1-c85d-4d33-b9c7-6916eca1ebe1\") " pod="openstack/cinder-api-0" Feb 26 16:05:30 crc kubenswrapper[4907]: I0226 16:05:30.107044 4907 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/466a75e1-c85d-4d33-b9c7-6916eca1ebe1-config-data-custom\") pod \"cinder-api-0\" (UID: \"466a75e1-c85d-4d33-b9c7-6916eca1ebe1\") " pod="openstack/cinder-api-0" Feb 26 16:05:30 crc kubenswrapper[4907]: I0226 16:05:30.107063 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/466a75e1-c85d-4d33-b9c7-6916eca1ebe1-config-data\") pod \"cinder-api-0\" (UID: \"466a75e1-c85d-4d33-b9c7-6916eca1ebe1\") " pod="openstack/cinder-api-0" Feb 26 16:05:30 crc kubenswrapper[4907]: I0226 16:05:30.107095 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/466a75e1-c85d-4d33-b9c7-6916eca1ebe1-scripts\") pod \"cinder-api-0\" (UID: \"466a75e1-c85d-4d33-b9c7-6916eca1ebe1\") " pod="openstack/cinder-api-0" Feb 26 16:05:30 crc kubenswrapper[4907]: I0226 16:05:30.152621 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-worker-d9b8ff5ff-b7kpr" event={"ID":"a0ed716e-493d-4590-81a0-203b8618cf61","Type":"ContainerStarted","Data":"f42400255501cf21da65d2def77d45b13c2e17e15d70bc40abc2ae1de7dccb66"} Feb 26 16:05:30 crc kubenswrapper[4907]: I0226 16:05:30.152989 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-keystone-listener-7f8d9cb4c8-5jdnw" event={"ID":"a0449539-dbf4-4306-9dd9-db95f762a48a","Type":"ContainerStarted","Data":"c859f15b9135751edf96f0c6f18b09055910b3de0976ca63c60af913a8a30b6c"} Feb 26 16:05:30 crc kubenswrapper[4907]: I0226 16:05:30.153355 4907 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-6578955fd5-khgm9" Feb 26 16:05:30 crc kubenswrapper[4907]: I0226 16:05:30.180342 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-848cf88cfc-xqzwv" event={"ID":"5a61624f-5b40-447b-8da6-195ff8458f6a","Type":"ContainerStarted","Data":"e783f7460eed4efee9df16a26574b2e15ad512b10b07276e884e9c61f0e39788"} Feb 26 16:05:30 crc kubenswrapper[4907]: I0226 16:05:30.209656 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/466a75e1-c85d-4d33-b9c7-6916eca1ebe1-combined-ca-bundle\") pod \"cinder-api-0\" (UID: \"466a75e1-c85d-4d33-b9c7-6916eca1ebe1\") " pod="openstack/cinder-api-0" Feb 26 16:05:30 crc kubenswrapper[4907]: I0226 16:05:30.209712 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/466a75e1-c85d-4d33-b9c7-6916eca1ebe1-logs\") pod \"cinder-api-0\" (UID: \"466a75e1-c85d-4d33-b9c7-6916eca1ebe1\") " pod="openstack/cinder-api-0" Feb 26 16:05:30 crc kubenswrapper[4907]: I0226 16:05:30.209765 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/466a75e1-c85d-4d33-b9c7-6916eca1ebe1-etc-machine-id\") pod \"cinder-api-0\" (UID: \"466a75e1-c85d-4d33-b9c7-6916eca1ebe1\") " pod="openstack/cinder-api-0" Feb 26 16:05:30 crc kubenswrapper[4907]: I0226 16:05:30.209799 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-w58f8\" (UniqueName: \"kubernetes.io/projected/466a75e1-c85d-4d33-b9c7-6916eca1ebe1-kube-api-access-w58f8\") pod \"cinder-api-0\" (UID: \"466a75e1-c85d-4d33-b9c7-6916eca1ebe1\") " pod="openstack/cinder-api-0" Feb 26 16:05:30 crc kubenswrapper[4907]: I0226 16:05:30.209821 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: 
\"kubernetes.io/secret/466a75e1-c85d-4d33-b9c7-6916eca1ebe1-config-data-custom\") pod \"cinder-api-0\" (UID: \"466a75e1-c85d-4d33-b9c7-6916eca1ebe1\") " pod="openstack/cinder-api-0" Feb 26 16:05:30 crc kubenswrapper[4907]: I0226 16:05:30.209847 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/466a75e1-c85d-4d33-b9c7-6916eca1ebe1-config-data\") pod \"cinder-api-0\" (UID: \"466a75e1-c85d-4d33-b9c7-6916eca1ebe1\") " pod="openstack/cinder-api-0" Feb 26 16:05:30 crc kubenswrapper[4907]: I0226 16:05:30.209893 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/466a75e1-c85d-4d33-b9c7-6916eca1ebe1-scripts\") pod \"cinder-api-0\" (UID: \"466a75e1-c85d-4d33-b9c7-6916eca1ebe1\") " pod="openstack/cinder-api-0" Feb 26 16:05:30 crc kubenswrapper[4907]: I0226 16:05:30.210732 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/466a75e1-c85d-4d33-b9c7-6916eca1ebe1-etc-machine-id\") pod \"cinder-api-0\" (UID: \"466a75e1-c85d-4d33-b9c7-6916eca1ebe1\") " pod="openstack/cinder-api-0" Feb 26 16:05:30 crc kubenswrapper[4907]: I0226 16:05:30.213314 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/466a75e1-c85d-4d33-b9c7-6916eca1ebe1-logs\") pod \"cinder-api-0\" (UID: \"466a75e1-c85d-4d33-b9c7-6916eca1ebe1\") " pod="openstack/cinder-api-0" Feb 26 16:05:30 crc kubenswrapper[4907]: I0226 16:05:30.216535 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/466a75e1-c85d-4d33-b9c7-6916eca1ebe1-combined-ca-bundle\") pod \"cinder-api-0\" (UID: \"466a75e1-c85d-4d33-b9c7-6916eca1ebe1\") " pod="openstack/cinder-api-0" Feb 26 16:05:30 crc kubenswrapper[4907]: I0226 16:05:30.222224 4907 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/466a75e1-c85d-4d33-b9c7-6916eca1ebe1-config-data-custom\") pod \"cinder-api-0\" (UID: \"466a75e1-c85d-4d33-b9c7-6916eca1ebe1\") " pod="openstack/cinder-api-0" Feb 26 16:05:30 crc kubenswrapper[4907]: I0226 16:05:30.222542 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/466a75e1-c85d-4d33-b9c7-6916eca1ebe1-scripts\") pod \"cinder-api-0\" (UID: \"466a75e1-c85d-4d33-b9c7-6916eca1ebe1\") " pod="openstack/cinder-api-0" Feb 26 16:05:30 crc kubenswrapper[4907]: I0226 16:05:30.230253 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/466a75e1-c85d-4d33-b9c7-6916eca1ebe1-config-data\") pod \"cinder-api-0\" (UID: \"466a75e1-c85d-4d33-b9c7-6916eca1ebe1\") " pod="openstack/cinder-api-0" Feb 26 16:05:30 crc kubenswrapper[4907]: I0226 16:05:30.244008 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-w58f8\" (UniqueName: \"kubernetes.io/projected/466a75e1-c85d-4d33-b9c7-6916eca1ebe1-kube-api-access-w58f8\") pod \"cinder-api-0\" (UID: \"466a75e1-c85d-4d33-b9c7-6916eca1ebe1\") " pod="openstack/cinder-api-0" Feb 26 16:05:30 crc kubenswrapper[4907]: I0226 16:05:30.299188 4907 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-api-766c5c4f46-9j8qd"] Feb 26 16:05:30 crc kubenswrapper[4907]: W0226 16:05:30.318794 4907 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod18111fe1_07d0_420e_bc61_457532bdb122.slice/crio-6a521e2c83616303952eea4e1e6b6b39bd53bb69c8fd4e4d715a564a7419be77 WatchSource:0}: Error finding container 6a521e2c83616303952eea4e1e6b6b39bd53bb69c8fd4e4d715a564a7419be77: Status 404 returned error can't find the container with id 6a521e2c83616303952eea4e1e6b6b39bd53bb69c8fd4e4d715a564a7419be77 
Feb 26 16:05:30 crc kubenswrapper[4907]: I0226 16:05:30.424192 4907 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-api-0" Feb 26 16:05:30 crc kubenswrapper[4907]: I0226 16:05:30.669559 4907 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-scheduler-0"] Feb 26 16:05:30 crc kubenswrapper[4907]: W0226 16:05:30.708611 4907 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-poda140df23_061c_4941_855b_3c829a96d63e.slice/crio-9b600d5ab8673077a9cf46fc59d20a9da044aa282fff1f10137aff37b15eb713 WatchSource:0}: Error finding container 9b600d5ab8673077a9cf46fc59d20a9da044aa282fff1f10137aff37b15eb713: Status 404 returned error can't find the container with id 9b600d5ab8673077a9cf46fc59d20a9da044aa282fff1f10137aff37b15eb713 Feb 26 16:05:30 crc kubenswrapper[4907]: I0226 16:05:30.972148 4907 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-6578955fd5-khgm9"] Feb 26 16:05:31 crc kubenswrapper[4907]: I0226 16:05:31.231823 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6578955fd5-khgm9" event={"ID":"aa93ec18-05b5-4814-989a-ec50a85bba83","Type":"ContainerStarted","Data":"b98a2e97db98747c096fa8cb29eef9699b999b504ca42b9b1797c7464f9a5c94"} Feb 26 16:05:31 crc kubenswrapper[4907]: I0226 16:05:31.237267 4907 generic.go:334] "Generic (PLEG): container finished" podID="5a61624f-5b40-447b-8da6-195ff8458f6a" containerID="ad637159f8c6a05c8686a911f462a35e84da11fe3ac09c55da3c89e478501f3d" exitCode=0 Feb 26 16:05:31 crc kubenswrapper[4907]: I0226 16:05:31.237495 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-848cf88cfc-xqzwv" event={"ID":"5a61624f-5b40-447b-8da6-195ff8458f6a","Type":"ContainerDied","Data":"ad637159f8c6a05c8686a911f462a35e84da11fe3ac09c55da3c89e478501f3d"} Feb 26 16:05:31 crc kubenswrapper[4907]: I0226 16:05:31.255036 4907 kubelet.go:2453] 
"SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"a140df23-061c-4941-855b-3c829a96d63e","Type":"ContainerStarted","Data":"9b600d5ab8673077a9cf46fc59d20a9da044aa282fff1f10137aff37b15eb713"} Feb 26 16:05:31 crc kubenswrapper[4907]: I0226 16:05:31.260932 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-766c5c4f46-9j8qd" event={"ID":"18111fe1-07d0-420e-bc61-457532bdb122","Type":"ContainerStarted","Data":"8ca2eaf129ea9f72949ee76c0734571655cd2f8eaf2c2646647fae90c038305a"} Feb 26 16:05:31 crc kubenswrapper[4907]: I0226 16:05:31.260992 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-766c5c4f46-9j8qd" event={"ID":"18111fe1-07d0-420e-bc61-457532bdb122","Type":"ContainerStarted","Data":"6a521e2c83616303952eea4e1e6b6b39bd53bb69c8fd4e4d715a564a7419be77"} Feb 26 16:05:31 crc kubenswrapper[4907]: W0226 16:05:31.279398 4907 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod466a75e1_c85d_4d33_b9c7_6916eca1ebe1.slice/crio-e8ababb499c81f65ef140ef8e984f39a8a7bad3f400ba836d1870012d035b066 WatchSource:0}: Error finding container e8ababb499c81f65ef140ef8e984f39a8a7bad3f400ba836d1870012d035b066: Status 404 returned error can't find the container with id e8ababb499c81f65ef140ef8e984f39a8a7bad3f400ba836d1870012d035b066 Feb 26 16:05:31 crc kubenswrapper[4907]: I0226 16:05:31.288654 4907 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-api-0"] Feb 26 16:05:31 crc kubenswrapper[4907]: I0226 16:05:31.709059 4907 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-848cf88cfc-xqzwv" Feb 26 16:05:31 crc kubenswrapper[4907]: I0226 16:05:31.853033 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/5a61624f-5b40-447b-8da6-195ff8458f6a-ovsdbserver-nb\") pod \"5a61624f-5b40-447b-8da6-195ff8458f6a\" (UID: \"5a61624f-5b40-447b-8da6-195ff8458f6a\") " Feb 26 16:05:31 crc kubenswrapper[4907]: I0226 16:05:31.853091 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/5a61624f-5b40-447b-8da6-195ff8458f6a-dns-swift-storage-0\") pod \"5a61624f-5b40-447b-8da6-195ff8458f6a\" (UID: \"5a61624f-5b40-447b-8da6-195ff8458f6a\") " Feb 26 16:05:31 crc kubenswrapper[4907]: I0226 16:05:31.853202 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-kbvsg\" (UniqueName: \"kubernetes.io/projected/5a61624f-5b40-447b-8da6-195ff8458f6a-kube-api-access-kbvsg\") pod \"5a61624f-5b40-447b-8da6-195ff8458f6a\" (UID: \"5a61624f-5b40-447b-8da6-195ff8458f6a\") " Feb 26 16:05:31 crc kubenswrapper[4907]: I0226 16:05:31.853253 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5a61624f-5b40-447b-8da6-195ff8458f6a-config\") pod \"5a61624f-5b40-447b-8da6-195ff8458f6a\" (UID: \"5a61624f-5b40-447b-8da6-195ff8458f6a\") " Feb 26 16:05:31 crc kubenswrapper[4907]: I0226 16:05:31.853298 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/5a61624f-5b40-447b-8da6-195ff8458f6a-dns-svc\") pod \"5a61624f-5b40-447b-8da6-195ff8458f6a\" (UID: \"5a61624f-5b40-447b-8da6-195ff8458f6a\") " Feb 26 16:05:31 crc kubenswrapper[4907]: I0226 16:05:31.853343 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" 
(UniqueName: \"kubernetes.io/configmap/5a61624f-5b40-447b-8da6-195ff8458f6a-ovsdbserver-sb\") pod \"5a61624f-5b40-447b-8da6-195ff8458f6a\" (UID: \"5a61624f-5b40-447b-8da6-195ff8458f6a\") " Feb 26 16:05:31 crc kubenswrapper[4907]: I0226 16:05:31.875827 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5a61624f-5b40-447b-8da6-195ff8458f6a-kube-api-access-kbvsg" (OuterVolumeSpecName: "kube-api-access-kbvsg") pod "5a61624f-5b40-447b-8da6-195ff8458f6a" (UID: "5a61624f-5b40-447b-8da6-195ff8458f6a"). InnerVolumeSpecName "kube-api-access-kbvsg". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 26 16:05:31 crc kubenswrapper[4907]: I0226 16:05:31.880183 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5a61624f-5b40-447b-8da6-195ff8458f6a-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "5a61624f-5b40-447b-8da6-195ff8458f6a" (UID: "5a61624f-5b40-447b-8da6-195ff8458f6a"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 26 16:05:31 crc kubenswrapper[4907]: I0226 16:05:31.882941 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5a61624f-5b40-447b-8da6-195ff8458f6a-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "5a61624f-5b40-447b-8da6-195ff8458f6a" (UID: "5a61624f-5b40-447b-8da6-195ff8458f6a"). InnerVolumeSpecName "dns-swift-storage-0". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 26 16:05:31 crc kubenswrapper[4907]: I0226 16:05:31.887207 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5a61624f-5b40-447b-8da6-195ff8458f6a-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "5a61624f-5b40-447b-8da6-195ff8458f6a" (UID: "5a61624f-5b40-447b-8da6-195ff8458f6a"). InnerVolumeSpecName "dns-svc". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 26 16:05:31 crc kubenswrapper[4907]: I0226 16:05:31.914085 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5a61624f-5b40-447b-8da6-195ff8458f6a-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "5a61624f-5b40-447b-8da6-195ff8458f6a" (UID: "5a61624f-5b40-447b-8da6-195ff8458f6a"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 26 16:05:31 crc kubenswrapper[4907]: I0226 16:05:31.918278 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5a61624f-5b40-447b-8da6-195ff8458f6a-config" (OuterVolumeSpecName: "config") pod "5a61624f-5b40-447b-8da6-195ff8458f6a" (UID: "5a61624f-5b40-447b-8da6-195ff8458f6a"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 26 16:05:31 crc kubenswrapper[4907]: I0226 16:05:31.955099 4907 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-kbvsg\" (UniqueName: \"kubernetes.io/projected/5a61624f-5b40-447b-8da6-195ff8458f6a-kube-api-access-kbvsg\") on node \"crc\" DevicePath \"\"" Feb 26 16:05:31 crc kubenswrapper[4907]: I0226 16:05:31.955135 4907 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5a61624f-5b40-447b-8da6-195ff8458f6a-config\") on node \"crc\" DevicePath \"\"" Feb 26 16:05:31 crc kubenswrapper[4907]: I0226 16:05:31.955144 4907 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/5a61624f-5b40-447b-8da6-195ff8458f6a-dns-svc\") on node \"crc\" DevicePath \"\"" Feb 26 16:05:31 crc kubenswrapper[4907]: I0226 16:05:31.955153 4907 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/5a61624f-5b40-447b-8da6-195ff8458f6a-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Feb 26 16:05:31 crc 
kubenswrapper[4907]: I0226 16:05:31.955162 4907 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/5a61624f-5b40-447b-8da6-195ff8458f6a-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Feb 26 16:05:31 crc kubenswrapper[4907]: I0226 16:05:31.955170 4907 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/5a61624f-5b40-447b-8da6-195ff8458f6a-dns-swift-storage-0\") on node \"crc\" DevicePath \"\"" Feb 26 16:05:32 crc kubenswrapper[4907]: I0226 16:05:32.271666 4907 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-848cf88cfc-xqzwv" Feb 26 16:05:32 crc kubenswrapper[4907]: I0226 16:05:32.271666 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-848cf88cfc-xqzwv" event={"ID":"5a61624f-5b40-447b-8da6-195ff8458f6a","Type":"ContainerDied","Data":"e783f7460eed4efee9df16a26574b2e15ad512b10b07276e884e9c61f0e39788"} Feb 26 16:05:32 crc kubenswrapper[4907]: I0226 16:05:32.271822 4907 scope.go:117] "RemoveContainer" containerID="ad637159f8c6a05c8686a911f462a35e84da11fe3ac09c55da3c89e478501f3d" Feb 26 16:05:32 crc kubenswrapper[4907]: I0226 16:05:32.273377 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"466a75e1-c85d-4d33-b9c7-6916eca1ebe1","Type":"ContainerStarted","Data":"e8ababb499c81f65ef140ef8e984f39a8a7bad3f400ba836d1870012d035b066"} Feb 26 16:05:32 crc kubenswrapper[4907]: I0226 16:05:32.323775 4907 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-848cf88cfc-xqzwv"] Feb 26 16:05:32 crc kubenswrapper[4907]: I0226 16:05:32.345849 4907 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-848cf88cfc-xqzwv"] Feb 26 16:05:33 crc kubenswrapper[4907]: I0226 16:05:33.438623 4907 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-api-0"] Feb 26 16:05:34 
crc kubenswrapper[4907]: I0226 16:05:34.151315 4907 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5a61624f-5b40-447b-8da6-195ff8458f6a" path="/var/lib/kubelet/pods/5a61624f-5b40-447b-8da6-195ff8458f6a/volumes" Feb 26 16:05:35 crc kubenswrapper[4907]: I0226 16:05:35.368220 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-keystone-listener-7f8d9cb4c8-5jdnw" event={"ID":"a0449539-dbf4-4306-9dd9-db95f762a48a","Type":"ContainerStarted","Data":"163d86adcbe2fd0356ce74f95b7c84397f81bfb771ccc081d0ea78590fecd6aa"} Feb 26 16:05:35 crc kubenswrapper[4907]: I0226 16:05:35.369574 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-766c5c4f46-9j8qd" event={"ID":"18111fe1-07d0-420e-bc61-457532bdb122","Type":"ContainerStarted","Data":"f4989c8a6447adef0894aa6de4de8e66f5c50e42f779fcb39c2666777f3c7e46"} Feb 26 16:05:35 crc kubenswrapper[4907]: I0226 16:05:35.370735 4907 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/barbican-api-766c5c4f46-9j8qd" Feb 26 16:05:35 crc kubenswrapper[4907]: I0226 16:05:35.370885 4907 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/barbican-api-766c5c4f46-9j8qd" Feb 26 16:05:35 crc kubenswrapper[4907]: I0226 16:05:35.372467 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"466a75e1-c85d-4d33-b9c7-6916eca1ebe1","Type":"ContainerStarted","Data":"ca3cee29ee6bdb8de8f7b2a9bc3d4fb4b429a63857b4f97f82685c2164a62a31"} Feb 26 16:05:35 crc kubenswrapper[4907]: I0226 16:05:35.373964 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-worker-d9b8ff5ff-b7kpr" event={"ID":"a0ed716e-493d-4590-81a0-203b8618cf61","Type":"ContainerStarted","Data":"47928511ab321368b6e6eb62d7ad350aa8a1ea634f416b4963183ea3b04a334d"} Feb 26 16:05:35 crc kubenswrapper[4907]: I0226 16:05:35.376480 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" 
event={"ID":"9db5f721-707b-490c-917f-b3b2a85af07c","Type":"ContainerStarted","Data":"ec96daad476960899ae24b8d7b97b3b4d5268b7123c5940d0d408ce586513717"} Feb 26 16:05:35 crc kubenswrapper[4907]: I0226 16:05:35.377298 4907 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0" Feb 26 16:05:35 crc kubenswrapper[4907]: I0226 16:05:35.380092 4907 generic.go:334] "Generic (PLEG): container finished" podID="aa93ec18-05b5-4814-989a-ec50a85bba83" containerID="d14253eb862cfde6eef00be8eaa22a5abbf3f781a7378501c6ee533869c01a6c" exitCode=0 Feb 26 16:05:35 crc kubenswrapper[4907]: I0226 16:05:35.380129 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6578955fd5-khgm9" event={"ID":"aa93ec18-05b5-4814-989a-ec50a85bba83","Type":"ContainerDied","Data":"d14253eb862cfde6eef00be8eaa22a5abbf3f781a7378501c6ee533869c01a6c"} Feb 26 16:05:35 crc kubenswrapper[4907]: I0226 16:05:35.387328 4907 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/barbican-api-766c5c4f46-9j8qd" podUID="18111fe1-07d0-420e-bc61-457532bdb122" containerName="barbican-api-log" probeResult="failure" output="Get \"http://10.217.0.168:9311/healthcheck\": dial tcp 10.217.0.168:9311: connect: connection refused" Feb 26 16:05:35 crc kubenswrapper[4907]: I0226 16:05:35.437505 4907 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceilometer-0" podStartSLOduration=4.495809144 podStartE2EDuration="11.437484028s" podCreationTimestamp="2026-02-26 16:05:24 +0000 UTC" firstStartedPulling="2026-02-26 16:05:25.92695369 +0000 UTC m=+1388.445515539" lastFinishedPulling="2026-02-26 16:05:32.868628574 +0000 UTC m=+1395.387190423" observedRunningTime="2026-02-26 16:05:35.431243108 +0000 UTC m=+1397.949804987" watchObservedRunningTime="2026-02-26 16:05:35.437484028 +0000 UTC m=+1397.956045877" Feb 26 16:05:35 crc kubenswrapper[4907]: I0226 16:05:35.442163 4907 pod_startup_latency_tracker.go:104] "Observed pod startup 
duration" pod="openstack/barbican-api-766c5c4f46-9j8qd" podStartSLOduration=7.442147859 podStartE2EDuration="7.442147859s" podCreationTimestamp="2026-02-26 16:05:28 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-26 16:05:35.407773848 +0000 UTC m=+1397.926335707" watchObservedRunningTime="2026-02-26 16:05:35.442147859 +0000 UTC m=+1397.960709718" Feb 26 16:05:36 crc kubenswrapper[4907]: I0226 16:05:36.291567 4907 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/placement-7c4f4876c6-sk5mm" Feb 26 16:05:36 crc kubenswrapper[4907]: I0226 16:05:36.299140 4907 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/placement-7c4f4876c6-sk5mm" Feb 26 16:05:36 crc kubenswrapper[4907]: I0226 16:05:36.407902 4907 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-api-6f5746579b-4xjhs"] Feb 26 16:05:36 crc kubenswrapper[4907]: E0226 16:05:36.408305 4907 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5a61624f-5b40-447b-8da6-195ff8458f6a" containerName="init" Feb 26 16:05:36 crc kubenswrapper[4907]: I0226 16:05:36.408316 4907 state_mem.go:107] "Deleted CPUSet assignment" podUID="5a61624f-5b40-447b-8da6-195ff8458f6a" containerName="init" Feb 26 16:05:36 crc kubenswrapper[4907]: I0226 16:05:36.408498 4907 memory_manager.go:354] "RemoveStaleState removing state" podUID="5a61624f-5b40-447b-8da6-195ff8458f6a" containerName="init" Feb 26 16:05:36 crc kubenswrapper[4907]: I0226 16:05:36.409376 4907 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-api-6f5746579b-4xjhs" Feb 26 16:05:36 crc kubenswrapper[4907]: I0226 16:05:36.417722 4907 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-barbican-internal-svc" Feb 26 16:05:36 crc kubenswrapper[4907]: I0226 16:05:36.417969 4907 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-barbican-public-svc" Feb 26 16:05:36 crc kubenswrapper[4907]: I0226 16:05:36.439270 4907 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-api-6f5746579b-4xjhs"] Feb 26 16:05:36 crc kubenswrapper[4907]: I0226 16:05:36.440110 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"a140df23-061c-4941-855b-3c829a96d63e","Type":"ContainerStarted","Data":"fc2090ab19285a87d4abfaf72a5de580432719b47c4f5656ad92e60309f9e518"} Feb 26 16:05:36 crc kubenswrapper[4907]: I0226 16:05:36.445705 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"466a75e1-c85d-4d33-b9c7-6916eca1ebe1","Type":"ContainerStarted","Data":"ed7e31e54a126efdd2512d4f3d279de091b4217cb1a0424836476da4c8d3b317"} Feb 26 16:05:36 crc kubenswrapper[4907]: I0226 16:05:36.446091 4907 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/cinder-api-0" podUID="466a75e1-c85d-4d33-b9c7-6916eca1ebe1" containerName="cinder-api-log" containerID="cri-o://ca3cee29ee6bdb8de8f7b2a9bc3d4fb4b429a63857b4f97f82685c2164a62a31" gracePeriod=30 Feb 26 16:05:36 crc kubenswrapper[4907]: I0226 16:05:36.446164 4907 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/cinder-api-0" podUID="466a75e1-c85d-4d33-b9c7-6916eca1ebe1" containerName="cinder-api" containerID="cri-o://ed7e31e54a126efdd2512d4f3d279de091b4217cb1a0424836476da4c8d3b317" gracePeriod=30 Feb 26 16:05:36 crc kubenswrapper[4907]: I0226 16:05:36.446533 4907 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" 
pod="openstack/cinder-api-0" Feb 26 16:05:36 crc kubenswrapper[4907]: I0226 16:05:36.462555 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-worker-d9b8ff5ff-b7kpr" event={"ID":"a0ed716e-493d-4590-81a0-203b8618cf61","Type":"ContainerStarted","Data":"71c36525f37133bbfaba01c15e555bfba45e37e6c81ac82de330e4f035e1e12f"} Feb 26 16:05:36 crc kubenswrapper[4907]: I0226 16:05:36.495814 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6578955fd5-khgm9" event={"ID":"aa93ec18-05b5-4814-989a-ec50a85bba83","Type":"ContainerStarted","Data":"7f740691bec51bde462269589c8cfd003c8166fcb69420f9e56b03bbec1c7256"} Feb 26 16:05:36 crc kubenswrapper[4907]: I0226 16:05:36.499473 4907 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-6578955fd5-khgm9" Feb 26 16:05:36 crc kubenswrapper[4907]: I0226 16:05:36.532644 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-keystone-listener-7f8d9cb4c8-5jdnw" event={"ID":"a0449539-dbf4-4306-9dd9-db95f762a48a","Type":"ContainerStarted","Data":"6d8e589ee3baa3ff6a981071a245f612d4b042eb42933a0fc2641609f5b0728e"} Feb 26 16:05:36 crc kubenswrapper[4907]: I0226 16:05:36.541736 4907 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cinder-api-0" podStartSLOduration=7.541718 podStartE2EDuration="7.541718s" podCreationTimestamp="2026-02-26 16:05:29 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-26 16:05:36.468611096 +0000 UTC m=+1398.987172965" watchObservedRunningTime="2026-02-26 16:05:36.541718 +0000 UTC m=+1399.060279849" Feb 26 16:05:36 crc kubenswrapper[4907]: I0226 16:05:36.559936 4907 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/barbican-worker-d9b8ff5ff-b7kpr" podStartSLOduration=3.985110605 podStartE2EDuration="8.559916564s" podCreationTimestamp="2026-02-26 
16:05:28 +0000 UTC" firstStartedPulling="2026-02-26 16:05:29.593020809 +0000 UTC m=+1392.111582658" lastFinishedPulling="2026-02-26 16:05:34.167826768 +0000 UTC m=+1396.686388617" observedRunningTime="2026-02-26 16:05:36.499888082 +0000 UTC m=+1399.018449951" watchObservedRunningTime="2026-02-26 16:05:36.559916564 +0000 UTC m=+1399.078478413" Feb 26 16:05:36 crc kubenswrapper[4907]: I0226 16:05:36.578339 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-29k6k\" (UniqueName: \"kubernetes.io/projected/f81805f8-b496-452b-b721-2861546c9367-kube-api-access-29k6k\") pod \"barbican-api-6f5746579b-4xjhs\" (UID: \"f81805f8-b496-452b-b721-2861546c9367\") " pod="openstack/barbican-api-6f5746579b-4xjhs" Feb 26 16:05:36 crc kubenswrapper[4907]: I0226 16:05:36.578420 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f81805f8-b496-452b-b721-2861546c9367-config-data\") pod \"barbican-api-6f5746579b-4xjhs\" (UID: \"f81805f8-b496-452b-b721-2861546c9367\") " pod="openstack/barbican-api-6f5746579b-4xjhs" Feb 26 16:05:36 crc kubenswrapper[4907]: I0226 16:05:36.578485 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/f81805f8-b496-452b-b721-2861546c9367-internal-tls-certs\") pod \"barbican-api-6f5746579b-4xjhs\" (UID: \"f81805f8-b496-452b-b721-2861546c9367\") " pod="openstack/barbican-api-6f5746579b-4xjhs" Feb 26 16:05:36 crc kubenswrapper[4907]: I0226 16:05:36.578535 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f81805f8-b496-452b-b721-2861546c9367-combined-ca-bundle\") pod \"barbican-api-6f5746579b-4xjhs\" (UID: \"f81805f8-b496-452b-b721-2861546c9367\") " pod="openstack/barbican-api-6f5746579b-4xjhs" 
Feb 26 16:05:36 crc kubenswrapper[4907]: I0226 16:05:36.578580 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/f81805f8-b496-452b-b721-2861546c9367-logs\") pod \"barbican-api-6f5746579b-4xjhs\" (UID: \"f81805f8-b496-452b-b721-2861546c9367\") " pod="openstack/barbican-api-6f5746579b-4xjhs" Feb 26 16:05:36 crc kubenswrapper[4907]: I0226 16:05:36.578659 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/f81805f8-b496-452b-b721-2861546c9367-public-tls-certs\") pod \"barbican-api-6f5746579b-4xjhs\" (UID: \"f81805f8-b496-452b-b721-2861546c9367\") " pod="openstack/barbican-api-6f5746579b-4xjhs" Feb 26 16:05:36 crc kubenswrapper[4907]: I0226 16:05:36.578747 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/f81805f8-b496-452b-b721-2861546c9367-config-data-custom\") pod \"barbican-api-6f5746579b-4xjhs\" (UID: \"f81805f8-b496-452b-b721-2861546c9367\") " pod="openstack/barbican-api-6f5746579b-4xjhs" Feb 26 16:05:36 crc kubenswrapper[4907]: I0226 16:05:36.708027 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/f81805f8-b496-452b-b721-2861546c9367-logs\") pod \"barbican-api-6f5746579b-4xjhs\" (UID: \"f81805f8-b496-452b-b721-2861546c9367\") " pod="openstack/barbican-api-6f5746579b-4xjhs" Feb 26 16:05:36 crc kubenswrapper[4907]: I0226 16:05:36.708122 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/f81805f8-b496-452b-b721-2861546c9367-public-tls-certs\") pod \"barbican-api-6f5746579b-4xjhs\" (UID: \"f81805f8-b496-452b-b721-2861546c9367\") " pod="openstack/barbican-api-6f5746579b-4xjhs" Feb 26 16:05:36 crc 
kubenswrapper[4907]: I0226 16:05:36.708212 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/f81805f8-b496-452b-b721-2861546c9367-config-data-custom\") pod \"barbican-api-6f5746579b-4xjhs\" (UID: \"f81805f8-b496-452b-b721-2861546c9367\") " pod="openstack/barbican-api-6f5746579b-4xjhs" Feb 26 16:05:36 crc kubenswrapper[4907]: I0226 16:05:36.708249 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-29k6k\" (UniqueName: \"kubernetes.io/projected/f81805f8-b496-452b-b721-2861546c9367-kube-api-access-29k6k\") pod \"barbican-api-6f5746579b-4xjhs\" (UID: \"f81805f8-b496-452b-b721-2861546c9367\") " pod="openstack/barbican-api-6f5746579b-4xjhs" Feb 26 16:05:36 crc kubenswrapper[4907]: I0226 16:05:36.708318 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f81805f8-b496-452b-b721-2861546c9367-config-data\") pod \"barbican-api-6f5746579b-4xjhs\" (UID: \"f81805f8-b496-452b-b721-2861546c9367\") " pod="openstack/barbican-api-6f5746579b-4xjhs" Feb 26 16:05:36 crc kubenswrapper[4907]: I0226 16:05:36.708370 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/f81805f8-b496-452b-b721-2861546c9367-internal-tls-certs\") pod \"barbican-api-6f5746579b-4xjhs\" (UID: \"f81805f8-b496-452b-b721-2861546c9367\") " pod="openstack/barbican-api-6f5746579b-4xjhs" Feb 26 16:05:36 crc kubenswrapper[4907]: I0226 16:05:36.708448 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f81805f8-b496-452b-b721-2861546c9367-combined-ca-bundle\") pod \"barbican-api-6f5746579b-4xjhs\" (UID: \"f81805f8-b496-452b-b721-2861546c9367\") " pod="openstack/barbican-api-6f5746579b-4xjhs" Feb 26 16:05:36 crc kubenswrapper[4907]: 
I0226 16:05:36.709108 4907 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-6578955fd5-khgm9" podStartSLOduration=7.709085013 podStartE2EDuration="7.709085013s" podCreationTimestamp="2026-02-26 16:05:29 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-26 16:05:36.535546833 +0000 UTC m=+1399.054108682" watchObservedRunningTime="2026-02-26 16:05:36.709085013 +0000 UTC m=+1399.227646882" Feb 26 16:05:36 crc kubenswrapper[4907]: I0226 16:05:36.726231 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/f81805f8-b496-452b-b721-2861546c9367-logs\") pod \"barbican-api-6f5746579b-4xjhs\" (UID: \"f81805f8-b496-452b-b721-2861546c9367\") " pod="openstack/barbican-api-6f5746579b-4xjhs" Feb 26 16:05:36 crc kubenswrapper[4907]: I0226 16:05:36.747884 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f81805f8-b496-452b-b721-2861546c9367-combined-ca-bundle\") pod \"barbican-api-6f5746579b-4xjhs\" (UID: \"f81805f8-b496-452b-b721-2861546c9367\") " pod="openstack/barbican-api-6f5746579b-4xjhs" Feb 26 16:05:36 crc kubenswrapper[4907]: I0226 16:05:36.769867 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f81805f8-b496-452b-b721-2861546c9367-config-data\") pod \"barbican-api-6f5746579b-4xjhs\" (UID: \"f81805f8-b496-452b-b721-2861546c9367\") " pod="openstack/barbican-api-6f5746579b-4xjhs" Feb 26 16:05:36 crc kubenswrapper[4907]: I0226 16:05:36.776996 4907 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/barbican-keystone-listener-7f8d9cb4c8-5jdnw" podStartSLOduration=4.576994527 podStartE2EDuration="8.776968492s" podCreationTimestamp="2026-02-26 16:05:28 +0000 UTC" 
firstStartedPulling="2026-02-26 16:05:29.996815703 +0000 UTC m=+1392.515377552" lastFinishedPulling="2026-02-26 16:05:34.196789668 +0000 UTC m=+1396.715351517" observedRunningTime="2026-02-26 16:05:36.576847058 +0000 UTC m=+1399.095408907" watchObservedRunningTime="2026-02-26 16:05:36.776968492 +0000 UTC m=+1399.295530341" Feb 26 16:05:36 crc kubenswrapper[4907]: I0226 16:05:36.780165 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/f81805f8-b496-452b-b721-2861546c9367-public-tls-certs\") pod \"barbican-api-6f5746579b-4xjhs\" (UID: \"f81805f8-b496-452b-b721-2861546c9367\") " pod="openstack/barbican-api-6f5746579b-4xjhs" Feb 26 16:05:36 crc kubenswrapper[4907]: I0226 16:05:36.790781 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/f81805f8-b496-452b-b721-2861546c9367-internal-tls-certs\") pod \"barbican-api-6f5746579b-4xjhs\" (UID: \"f81805f8-b496-452b-b721-2861546c9367\") " pod="openstack/barbican-api-6f5746579b-4xjhs" Feb 26 16:05:36 crc kubenswrapper[4907]: I0226 16:05:36.797557 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/f81805f8-b496-452b-b721-2861546c9367-config-data-custom\") pod \"barbican-api-6f5746579b-4xjhs\" (UID: \"f81805f8-b496-452b-b721-2861546c9367\") " pod="openstack/barbican-api-6f5746579b-4xjhs" Feb 26 16:05:36 crc kubenswrapper[4907]: I0226 16:05:36.810994 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-29k6k\" (UniqueName: \"kubernetes.io/projected/f81805f8-b496-452b-b721-2861546c9367-kube-api-access-29k6k\") pod \"barbican-api-6f5746579b-4xjhs\" (UID: \"f81805f8-b496-452b-b721-2861546c9367\") " pod="openstack/barbican-api-6f5746579b-4xjhs" Feb 26 16:05:37 crc kubenswrapper[4907]: I0226 16:05:37.060109 4907 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-api-6f5746579b-4xjhs" Feb 26 16:05:37 crc kubenswrapper[4907]: I0226 16:05:37.539699 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"a140df23-061c-4941-855b-3c829a96d63e","Type":"ContainerStarted","Data":"a625102b2c1469077b1b807e98392aef0565b381162ae06153ceb15d285d8550"} Feb 26 16:05:37 crc kubenswrapper[4907]: I0226 16:05:37.541619 4907 generic.go:334] "Generic (PLEG): container finished" podID="466a75e1-c85d-4d33-b9c7-6916eca1ebe1" containerID="ca3cee29ee6bdb8de8f7b2a9bc3d4fb4b429a63857b4f97f82685c2164a62a31" exitCode=143 Feb 26 16:05:37 crc kubenswrapper[4907]: I0226 16:05:37.542385 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"466a75e1-c85d-4d33-b9c7-6916eca1ebe1","Type":"ContainerDied","Data":"ca3cee29ee6bdb8de8f7b2a9bc3d4fb4b429a63857b4f97f82685c2164a62a31"} Feb 26 16:05:37 crc kubenswrapper[4907]: I0226 16:05:37.689122 4907 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cinder-scheduler-0" podStartSLOduration=5.233927045 podStartE2EDuration="8.689102093s" podCreationTimestamp="2026-02-26 16:05:29 +0000 UTC" firstStartedPulling="2026-02-26 16:05:30.71307856 +0000 UTC m=+1393.231640409" lastFinishedPulling="2026-02-26 16:05:34.168253608 +0000 UTC m=+1396.686815457" observedRunningTime="2026-02-26 16:05:37.569694123 +0000 UTC m=+1400.088255972" watchObservedRunningTime="2026-02-26 16:05:37.689102093 +0000 UTC m=+1400.207663942" Feb 26 16:05:37 crc kubenswrapper[4907]: I0226 16:05:37.702351 4907 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-api-6f5746579b-4xjhs"] Feb 26 16:05:38 crc kubenswrapper[4907]: I0226 16:05:38.245390 4907 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/keystone-86f7f47947-xzhlh" Feb 26 16:05:38 crc kubenswrapper[4907]: I0226 16:05:38.555452 4907 kubelet.go:2453] "SyncLoop (PLEG): event 
for pod" pod="openstack/barbican-api-6f5746579b-4xjhs" event={"ID":"f81805f8-b496-452b-b721-2861546c9367","Type":"ContainerStarted","Data":"579bd3cfa8e8fb25759160b1f26c895a25492b21308bf87c124d0681027f61ba"} Feb 26 16:05:38 crc kubenswrapper[4907]: I0226 16:05:38.555774 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-6f5746579b-4xjhs" event={"ID":"f81805f8-b496-452b-b721-2861546c9367","Type":"ContainerStarted","Data":"c21cb9d1cff3d0680c3f0768e1d522c8986401a63a22fc1183f78ef47bf033d8"} Feb 26 16:05:38 crc kubenswrapper[4907]: I0226 16:05:38.555788 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-6f5746579b-4xjhs" event={"ID":"f81805f8-b496-452b-b721-2861546c9367","Type":"ContainerStarted","Data":"4c79c4ff5255991a1e2e6fcbdd69f075123a504e94b7403ab414a0ce15250106"} Feb 26 16:05:38 crc kubenswrapper[4907]: I0226 16:05:38.555819 4907 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/barbican-api-6f5746579b-4xjhs" Feb 26 16:05:38 crc kubenswrapper[4907]: I0226 16:05:38.555835 4907 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/barbican-api-6f5746579b-4xjhs" Feb 26 16:05:38 crc kubenswrapper[4907]: I0226 16:05:38.577299 4907 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/barbican-api-6f5746579b-4xjhs" podStartSLOduration=2.577275571 podStartE2EDuration="2.577275571s" podCreationTimestamp="2026-02-26 16:05:36 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-26 16:05:38.575089708 +0000 UTC m=+1401.093651557" watchObservedRunningTime="2026-02-26 16:05:38.577275571 +0000 UTC m=+1401.095837420" Feb 26 16:05:39 crc kubenswrapper[4907]: I0226 16:05:39.127390 4907 scope.go:117] "RemoveContainer" containerID="2a36d1d20d0b287b23e7dcdc86288043be01d77acf00445bd9e70ca22a49b6c8" Feb 26 16:05:39 crc kubenswrapper[4907]: E0226 
16:05:39.127572 4907 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"neutron-httpd\" with CrashLoopBackOff: \"back-off 20s restarting failed container=neutron-httpd pod=neutron-6dbb49ff7b-8r7kc_openstack(41b49bfa-e783-4c0f-a0f6-f8dfdd5771d3)\"" pod="openstack/neutron-6dbb49ff7b-8r7kc" podUID="41b49bfa-e783-4c0f-a0f6-f8dfdd5771d3" Feb 26 16:05:39 crc kubenswrapper[4907]: I0226 16:05:39.862582 4907 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/cinder-scheduler-0" Feb 26 16:05:40 crc kubenswrapper[4907]: I0226 16:05:40.155649 4907 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-6578955fd5-khgm9" Feb 26 16:05:40 crc kubenswrapper[4907]: I0226 16:05:40.243616 4907 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-6b7b667979-qdsb5"] Feb 26 16:05:40 crc kubenswrapper[4907]: I0226 16:05:40.244055 4907 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-6b7b667979-qdsb5" podUID="72c07a62-59c5-47d0-8c74-766322267226" containerName="dnsmasq-dns" containerID="cri-o://d725e999a2b38af5b26a31d0df3f5b6ff6575a18dd3299c2b1a572449d67e5c5" gracePeriod=10 Feb 26 16:05:40 crc kubenswrapper[4907]: I0226 16:05:40.584865 4907 generic.go:334] "Generic (PLEG): container finished" podID="72c07a62-59c5-47d0-8c74-766322267226" containerID="d725e999a2b38af5b26a31d0df3f5b6ff6575a18dd3299c2b1a572449d67e5c5" exitCode=0 Feb 26 16:05:40 crc kubenswrapper[4907]: I0226 16:05:40.584943 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6b7b667979-qdsb5" event={"ID":"72c07a62-59c5-47d0-8c74-766322267226","Type":"ContainerDied","Data":"d725e999a2b38af5b26a31d0df3f5b6ff6575a18dd3299c2b1a572449d67e5c5"} Feb 26 16:05:40 crc kubenswrapper[4907]: I0226 16:05:40.856980 4907 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-6b7b667979-qdsb5" Feb 26 16:05:40 crc kubenswrapper[4907]: I0226 16:05:40.914110 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ns7zc\" (UniqueName: \"kubernetes.io/projected/72c07a62-59c5-47d0-8c74-766322267226-kube-api-access-ns7zc\") pod \"72c07a62-59c5-47d0-8c74-766322267226\" (UID: \"72c07a62-59c5-47d0-8c74-766322267226\") " Feb 26 16:05:40 crc kubenswrapper[4907]: I0226 16:05:40.914206 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/72c07a62-59c5-47d0-8c74-766322267226-dns-svc\") pod \"72c07a62-59c5-47d0-8c74-766322267226\" (UID: \"72c07a62-59c5-47d0-8c74-766322267226\") " Feb 26 16:05:40 crc kubenswrapper[4907]: I0226 16:05:40.914319 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/72c07a62-59c5-47d0-8c74-766322267226-ovsdbserver-sb\") pod \"72c07a62-59c5-47d0-8c74-766322267226\" (UID: \"72c07a62-59c5-47d0-8c74-766322267226\") " Feb 26 16:05:40 crc kubenswrapper[4907]: I0226 16:05:40.914415 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/72c07a62-59c5-47d0-8c74-766322267226-config\") pod \"72c07a62-59c5-47d0-8c74-766322267226\" (UID: \"72c07a62-59c5-47d0-8c74-766322267226\") " Feb 26 16:05:40 crc kubenswrapper[4907]: I0226 16:05:40.914443 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/72c07a62-59c5-47d0-8c74-766322267226-dns-swift-storage-0\") pod \"72c07a62-59c5-47d0-8c74-766322267226\" (UID: \"72c07a62-59c5-47d0-8c74-766322267226\") " Feb 26 16:05:40 crc kubenswrapper[4907]: I0226 16:05:40.915037 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" 
(UniqueName: \"kubernetes.io/configmap/72c07a62-59c5-47d0-8c74-766322267226-ovsdbserver-nb\") pod \"72c07a62-59c5-47d0-8c74-766322267226\" (UID: \"72c07a62-59c5-47d0-8c74-766322267226\") " Feb 26 16:05:40 crc kubenswrapper[4907]: I0226 16:05:40.958279 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/72c07a62-59c5-47d0-8c74-766322267226-kube-api-access-ns7zc" (OuterVolumeSpecName: "kube-api-access-ns7zc") pod "72c07a62-59c5-47d0-8c74-766322267226" (UID: "72c07a62-59c5-47d0-8c74-766322267226"). InnerVolumeSpecName "kube-api-access-ns7zc". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 26 16:05:40 crc kubenswrapper[4907]: I0226 16:05:40.996938 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/72c07a62-59c5-47d0-8c74-766322267226-config" (OuterVolumeSpecName: "config") pod "72c07a62-59c5-47d0-8c74-766322267226" (UID: "72c07a62-59c5-47d0-8c74-766322267226"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 26 16:05:41 crc kubenswrapper[4907]: I0226 16:05:41.008056 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/72c07a62-59c5-47d0-8c74-766322267226-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "72c07a62-59c5-47d0-8c74-766322267226" (UID: "72c07a62-59c5-47d0-8c74-766322267226"). InnerVolumeSpecName "dns-swift-storage-0". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 26 16:05:41 crc kubenswrapper[4907]: I0226 16:05:41.016794 4907 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/72c07a62-59c5-47d0-8c74-766322267226-config\") on node \"crc\" DevicePath \"\"" Feb 26 16:05:41 crc kubenswrapper[4907]: I0226 16:05:41.016822 4907 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/72c07a62-59c5-47d0-8c74-766322267226-dns-swift-storage-0\") on node \"crc\" DevicePath \"\"" Feb 26 16:05:41 crc kubenswrapper[4907]: I0226 16:05:41.016831 4907 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ns7zc\" (UniqueName: \"kubernetes.io/projected/72c07a62-59c5-47d0-8c74-766322267226-kube-api-access-ns7zc\") on node \"crc\" DevicePath \"\"" Feb 26 16:05:41 crc kubenswrapper[4907]: I0226 16:05:41.021904 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/72c07a62-59c5-47d0-8c74-766322267226-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "72c07a62-59c5-47d0-8c74-766322267226" (UID: "72c07a62-59c5-47d0-8c74-766322267226"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 26 16:05:41 crc kubenswrapper[4907]: I0226 16:05:41.044102 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/72c07a62-59c5-47d0-8c74-766322267226-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "72c07a62-59c5-47d0-8c74-766322267226" (UID: "72c07a62-59c5-47d0-8c74-766322267226"). InnerVolumeSpecName "dns-svc". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 26 16:05:41 crc kubenswrapper[4907]: I0226 16:05:41.055377 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/72c07a62-59c5-47d0-8c74-766322267226-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "72c07a62-59c5-47d0-8c74-766322267226" (UID: "72c07a62-59c5-47d0-8c74-766322267226"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 26 16:05:41 crc kubenswrapper[4907]: I0226 16:05:41.118316 4907 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/72c07a62-59c5-47d0-8c74-766322267226-dns-svc\") on node \"crc\" DevicePath \"\"" Feb 26 16:05:41 crc kubenswrapper[4907]: I0226 16:05:41.118356 4907 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/72c07a62-59c5-47d0-8c74-766322267226-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Feb 26 16:05:41 crc kubenswrapper[4907]: I0226 16:05:41.118368 4907 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/72c07a62-59c5-47d0-8c74-766322267226-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Feb 26 16:05:41 crc kubenswrapper[4907]: I0226 16:05:41.484808 4907 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/barbican-api-766c5c4f46-9j8qd" Feb 26 16:05:41 crc kubenswrapper[4907]: I0226 16:05:41.603123 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6b7b667979-qdsb5" event={"ID":"72c07a62-59c5-47d0-8c74-766322267226","Type":"ContainerDied","Data":"88ea84ff9ae452e264570e7cc71bebc39d32b787300de3745d8d1fea1e2ee95e"} Feb 26 16:05:41 crc kubenswrapper[4907]: I0226 16:05:41.603185 4907 scope.go:117] "RemoveContainer" containerID="d725e999a2b38af5b26a31d0df3f5b6ff6575a18dd3299c2b1a572449d67e5c5" Feb 26 16:05:41 crc 
kubenswrapper[4907]: I0226 16:05:41.603348 4907 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-6b7b667979-qdsb5" Feb 26 16:05:41 crc kubenswrapper[4907]: I0226 16:05:41.639169 4907 scope.go:117] "RemoveContainer" containerID="f0eb829c22e21a48b9e9adf06599e6d98e845d62ff5475408399a9a5d9f46967" Feb 26 16:05:41 crc kubenswrapper[4907]: I0226 16:05:41.645777 4907 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-6b7b667979-qdsb5"] Feb 26 16:05:41 crc kubenswrapper[4907]: I0226 16:05:41.702782 4907 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-6b7b667979-qdsb5"] Feb 26 16:05:42 crc kubenswrapper[4907]: I0226 16:05:42.136123 4907 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="72c07a62-59c5-47d0-8c74-766322267226" path="/var/lib/kubelet/pods/72c07a62-59c5-47d0-8c74-766322267226/volumes" Feb 26 16:05:42 crc kubenswrapper[4907]: I0226 16:05:42.500859 4907 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/openstackclient"] Feb 26 16:05:42 crc kubenswrapper[4907]: E0226 16:05:42.501414 4907 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="72c07a62-59c5-47d0-8c74-766322267226" containerName="init" Feb 26 16:05:42 crc kubenswrapper[4907]: I0226 16:05:42.501426 4907 state_mem.go:107] "Deleted CPUSet assignment" podUID="72c07a62-59c5-47d0-8c74-766322267226" containerName="init" Feb 26 16:05:42 crc kubenswrapper[4907]: E0226 16:05:42.501442 4907 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="72c07a62-59c5-47d0-8c74-766322267226" containerName="dnsmasq-dns" Feb 26 16:05:42 crc kubenswrapper[4907]: I0226 16:05:42.501448 4907 state_mem.go:107] "Deleted CPUSet assignment" podUID="72c07a62-59c5-47d0-8c74-766322267226" containerName="dnsmasq-dns" Feb 26 16:05:42 crc kubenswrapper[4907]: I0226 16:05:42.501610 4907 memory_manager.go:354] "RemoveStaleState removing state" 
podUID="72c07a62-59c5-47d0-8c74-766322267226" containerName="dnsmasq-dns" Feb 26 16:05:42 crc kubenswrapper[4907]: I0226 16:05:42.502148 4907 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/openstackclient" Feb 26 16:05:42 crc kubenswrapper[4907]: I0226 16:05:42.509020 4907 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-config-secret" Feb 26 16:05:42 crc kubenswrapper[4907]: I0226 16:05:42.509078 4907 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-config" Feb 26 16:05:42 crc kubenswrapper[4907]: I0226 16:05:42.510084 4907 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstackclient-openstackclient-dockercfg-6njs5" Feb 26 16:05:42 crc kubenswrapper[4907]: I0226 16:05:42.520451 4907 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstackclient"] Feb 26 16:05:42 crc kubenswrapper[4907]: I0226 16:05:42.545021 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/173e1a27-c6cc-47cf-9d1a-8e9e19fe3afa-openstack-config-secret\") pod \"openstackclient\" (UID: \"173e1a27-c6cc-47cf-9d1a-8e9e19fe3afa\") " pod="openstack/openstackclient" Feb 26 16:05:42 crc kubenswrapper[4907]: I0226 16:05:42.545146 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/173e1a27-c6cc-47cf-9d1a-8e9e19fe3afa-combined-ca-bundle\") pod \"openstackclient\" (UID: \"173e1a27-c6cc-47cf-9d1a-8e9e19fe3afa\") " pod="openstack/openstackclient" Feb 26 16:05:42 crc kubenswrapper[4907]: I0226 16:05:42.545191 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-b25sw\" (UniqueName: \"kubernetes.io/projected/173e1a27-c6cc-47cf-9d1a-8e9e19fe3afa-kube-api-access-b25sw\") pod 
\"openstackclient\" (UID: \"173e1a27-c6cc-47cf-9d1a-8e9e19fe3afa\") " pod="openstack/openstackclient" Feb 26 16:05:42 crc kubenswrapper[4907]: I0226 16:05:42.545236 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/173e1a27-c6cc-47cf-9d1a-8e9e19fe3afa-openstack-config\") pod \"openstackclient\" (UID: \"173e1a27-c6cc-47cf-9d1a-8e9e19fe3afa\") " pod="openstack/openstackclient" Feb 26 16:05:42 crc kubenswrapper[4907]: I0226 16:05:42.647198 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/173e1a27-c6cc-47cf-9d1a-8e9e19fe3afa-combined-ca-bundle\") pod \"openstackclient\" (UID: \"173e1a27-c6cc-47cf-9d1a-8e9e19fe3afa\") " pod="openstack/openstackclient" Feb 26 16:05:42 crc kubenswrapper[4907]: I0226 16:05:42.647269 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-b25sw\" (UniqueName: \"kubernetes.io/projected/173e1a27-c6cc-47cf-9d1a-8e9e19fe3afa-kube-api-access-b25sw\") pod \"openstackclient\" (UID: \"173e1a27-c6cc-47cf-9d1a-8e9e19fe3afa\") " pod="openstack/openstackclient" Feb 26 16:05:42 crc kubenswrapper[4907]: I0226 16:05:42.647314 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/173e1a27-c6cc-47cf-9d1a-8e9e19fe3afa-openstack-config\") pod \"openstackclient\" (UID: \"173e1a27-c6cc-47cf-9d1a-8e9e19fe3afa\") " pod="openstack/openstackclient" Feb 26 16:05:42 crc kubenswrapper[4907]: I0226 16:05:42.647412 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/173e1a27-c6cc-47cf-9d1a-8e9e19fe3afa-openstack-config-secret\") pod \"openstackclient\" (UID: \"173e1a27-c6cc-47cf-9d1a-8e9e19fe3afa\") " pod="openstack/openstackclient" Feb 26 16:05:42 crc 
kubenswrapper[4907]: I0226 16:05:42.648370 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/173e1a27-c6cc-47cf-9d1a-8e9e19fe3afa-openstack-config\") pod \"openstackclient\" (UID: \"173e1a27-c6cc-47cf-9d1a-8e9e19fe3afa\") " pod="openstack/openstackclient" Feb 26 16:05:42 crc kubenswrapper[4907]: I0226 16:05:42.655787 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/173e1a27-c6cc-47cf-9d1a-8e9e19fe3afa-combined-ca-bundle\") pod \"openstackclient\" (UID: \"173e1a27-c6cc-47cf-9d1a-8e9e19fe3afa\") " pod="openstack/openstackclient" Feb 26 16:05:42 crc kubenswrapper[4907]: I0226 16:05:42.656785 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/173e1a27-c6cc-47cf-9d1a-8e9e19fe3afa-openstack-config-secret\") pod \"openstackclient\" (UID: \"173e1a27-c6cc-47cf-9d1a-8e9e19fe3afa\") " pod="openstack/openstackclient" Feb 26 16:05:42 crc kubenswrapper[4907]: I0226 16:05:42.678202 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-b25sw\" (UniqueName: \"kubernetes.io/projected/173e1a27-c6cc-47cf-9d1a-8e9e19fe3afa-kube-api-access-b25sw\") pod \"openstackclient\" (UID: \"173e1a27-c6cc-47cf-9d1a-8e9e19fe3afa\") " pod="openstack/openstackclient" Feb 26 16:05:42 crc kubenswrapper[4907]: I0226 16:05:42.734243 4907 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/barbican-api-766c5c4f46-9j8qd" Feb 26 16:05:42 crc kubenswrapper[4907]: I0226 16:05:42.827367 4907 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/openstackclient" Feb 26 16:05:43 crc kubenswrapper[4907]: I0226 16:05:43.582061 4907 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstackclient"] Feb 26 16:05:43 crc kubenswrapper[4907]: W0226 16:05:43.608099 4907 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod173e1a27_c6cc_47cf_9d1a_8e9e19fe3afa.slice/crio-29ee6a7b6367e1a9af3d9c61e5618b1d4f6e54b5194041e48c3349bb43b261a1 WatchSource:0}: Error finding container 29ee6a7b6367e1a9af3d9c61e5618b1d4f6e54b5194041e48c3349bb43b261a1: Status 404 returned error can't find the container with id 29ee6a7b6367e1a9af3d9c61e5618b1d4f6e54b5194041e48c3349bb43b261a1 Feb 26 16:05:43 crc kubenswrapper[4907]: I0226 16:05:43.632605 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstackclient" event={"ID":"173e1a27-c6cc-47cf-9d1a-8e9e19fe3afa","Type":"ContainerStarted","Data":"29ee6a7b6367e1a9af3d9c61e5618b1d4f6e54b5194041e48c3349bb43b261a1"} Feb 26 16:05:45 crc kubenswrapper[4907]: I0226 16:05:45.186602 4907 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/cinder-scheduler-0" Feb 26 16:05:45 crc kubenswrapper[4907]: I0226 16:05:45.271075 4907 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-scheduler-0"] Feb 26 16:05:45 crc kubenswrapper[4907]: I0226 16:05:45.662013 4907 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/cinder-scheduler-0" podUID="a140df23-061c-4941-855b-3c829a96d63e" containerName="cinder-scheduler" containerID="cri-o://fc2090ab19285a87d4abfaf72a5de580432719b47c4f5656ad92e60309f9e518" gracePeriod=30 Feb 26 16:05:45 crc kubenswrapper[4907]: I0226 16:05:45.662083 4907 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/cinder-scheduler-0" podUID="a140df23-061c-4941-855b-3c829a96d63e" containerName="probe" 
containerID="cri-o://a625102b2c1469077b1b807e98392aef0565b381162ae06153ceb15d285d8550" gracePeriod=30 Feb 26 16:05:46 crc kubenswrapper[4907]: I0226 16:05:46.677286 4907 generic.go:334] "Generic (PLEG): container finished" podID="a140df23-061c-4941-855b-3c829a96d63e" containerID="a625102b2c1469077b1b807e98392aef0565b381162ae06153ceb15d285d8550" exitCode=0 Feb 26 16:05:46 crc kubenswrapper[4907]: I0226 16:05:46.677561 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"a140df23-061c-4941-855b-3c829a96d63e","Type":"ContainerDied","Data":"a625102b2c1469077b1b807e98392aef0565b381162ae06153ceb15d285d8550"} Feb 26 16:05:46 crc kubenswrapper[4907]: I0226 16:05:46.760736 4907 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Feb 26 16:05:46 crc kubenswrapper[4907]: I0226 16:05:46.761048 4907 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="9db5f721-707b-490c-917f-b3b2a85af07c" containerName="ceilometer-central-agent" containerID="cri-o://b9f6c67b6afcd8e9c24e088d3c406309aab828e08ec4e7a7619487dceedb4bd7" gracePeriod=30 Feb 26 16:05:46 crc kubenswrapper[4907]: I0226 16:05:46.761126 4907 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="9db5f721-707b-490c-917f-b3b2a85af07c" containerName="sg-core" containerID="cri-o://59d0fd0269ec491062ba3cef75eb411111e2fece4fdb177e8d8680c56412909d" gracePeriod=30 Feb 26 16:05:46 crc kubenswrapper[4907]: I0226 16:05:46.761152 4907 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="9db5f721-707b-490c-917f-b3b2a85af07c" containerName="proxy-httpd" containerID="cri-o://ec96daad476960899ae24b8d7b97b3b4d5268b7123c5940d0d408ce586513717" gracePeriod=30 Feb 26 16:05:46 crc kubenswrapper[4907]: I0226 16:05:46.761503 4907 kuberuntime_container.go:808] "Killing container with a grace period" 
pod="openstack/ceilometer-0" podUID="9db5f721-707b-490c-917f-b3b2a85af07c" containerName="ceilometer-notification-agent" containerID="cri-o://7d2427c956d607ef4cf148c0c113847a907484ad07c5294237e191bf9cdc5e29" gracePeriod=30 Feb 26 16:05:46 crc kubenswrapper[4907]: I0226 16:05:46.821624 4907 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/ceilometer-0" podUID="9db5f721-707b-490c-917f-b3b2a85af07c" containerName="proxy-httpd" probeResult="failure" output="Get \"http://10.217.0.164:3000/\": read tcp 10.217.0.2:56450->10.217.0.164:3000: read: connection reset by peer" Feb 26 16:05:47 crc kubenswrapper[4907]: I0226 16:05:47.175861 4907 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/swift-proxy-58d5d7785f-4fcrq"] Feb 26 16:05:47 crc kubenswrapper[4907]: I0226 16:05:47.177419 4907 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/swift-proxy-58d5d7785f-4fcrq" Feb 26 16:05:47 crc kubenswrapper[4907]: I0226 16:05:47.179526 4907 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-swift-internal-svc" Feb 26 16:05:47 crc kubenswrapper[4907]: I0226 16:05:47.179825 4907 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-swift-public-svc" Feb 26 16:05:47 crc kubenswrapper[4907]: I0226 16:05:47.180662 4907 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"swift-proxy-config-data" Feb 26 16:05:47 crc kubenswrapper[4907]: I0226 16:05:47.208766 4907 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/swift-proxy-58d5d7785f-4fcrq"] Feb 26 16:05:47 crc kubenswrapper[4907]: I0226 16:05:47.232532 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/80cd7152-934f-40c6-925c-a3f1f9dfca95-log-httpd\") pod \"swift-proxy-58d5d7785f-4fcrq\" (UID: \"80cd7152-934f-40c6-925c-a3f1f9dfca95\") " pod="openstack/swift-proxy-58d5d7785f-4fcrq" 
Feb 26 16:05:47 crc kubenswrapper[4907]: I0226 16:05:47.232601 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8fmw9\" (UniqueName: \"kubernetes.io/projected/80cd7152-934f-40c6-925c-a3f1f9dfca95-kube-api-access-8fmw9\") pod \"swift-proxy-58d5d7785f-4fcrq\" (UID: \"80cd7152-934f-40c6-925c-a3f1f9dfca95\") " pod="openstack/swift-proxy-58d5d7785f-4fcrq" Feb 26 16:05:47 crc kubenswrapper[4907]: I0226 16:05:47.232619 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/80cd7152-934f-40c6-925c-a3f1f9dfca95-etc-swift\") pod \"swift-proxy-58d5d7785f-4fcrq\" (UID: \"80cd7152-934f-40c6-925c-a3f1f9dfca95\") " pod="openstack/swift-proxy-58d5d7785f-4fcrq" Feb 26 16:05:47 crc kubenswrapper[4907]: I0226 16:05:47.232706 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/80cd7152-934f-40c6-925c-a3f1f9dfca95-combined-ca-bundle\") pod \"swift-proxy-58d5d7785f-4fcrq\" (UID: \"80cd7152-934f-40c6-925c-a3f1f9dfca95\") " pod="openstack/swift-proxy-58d5d7785f-4fcrq" Feb 26 16:05:47 crc kubenswrapper[4907]: I0226 16:05:47.232733 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/80cd7152-934f-40c6-925c-a3f1f9dfca95-public-tls-certs\") pod \"swift-proxy-58d5d7785f-4fcrq\" (UID: \"80cd7152-934f-40c6-925c-a3f1f9dfca95\") " pod="openstack/swift-proxy-58d5d7785f-4fcrq" Feb 26 16:05:47 crc kubenswrapper[4907]: I0226 16:05:47.232750 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/80cd7152-934f-40c6-925c-a3f1f9dfca95-run-httpd\") pod \"swift-proxy-58d5d7785f-4fcrq\" (UID: \"80cd7152-934f-40c6-925c-a3f1f9dfca95\") " 
pod="openstack/swift-proxy-58d5d7785f-4fcrq" Feb 26 16:05:47 crc kubenswrapper[4907]: I0226 16:05:47.232765 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/80cd7152-934f-40c6-925c-a3f1f9dfca95-config-data\") pod \"swift-proxy-58d5d7785f-4fcrq\" (UID: \"80cd7152-934f-40c6-925c-a3f1f9dfca95\") " pod="openstack/swift-proxy-58d5d7785f-4fcrq" Feb 26 16:05:47 crc kubenswrapper[4907]: I0226 16:05:47.232845 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/80cd7152-934f-40c6-925c-a3f1f9dfca95-internal-tls-certs\") pod \"swift-proxy-58d5d7785f-4fcrq\" (UID: \"80cd7152-934f-40c6-925c-a3f1f9dfca95\") " pod="openstack/swift-proxy-58d5d7785f-4fcrq" Feb 26 16:05:47 crc kubenswrapper[4907]: I0226 16:05:47.336096 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8fmw9\" (UniqueName: \"kubernetes.io/projected/80cd7152-934f-40c6-925c-a3f1f9dfca95-kube-api-access-8fmw9\") pod \"swift-proxy-58d5d7785f-4fcrq\" (UID: \"80cd7152-934f-40c6-925c-a3f1f9dfca95\") " pod="openstack/swift-proxy-58d5d7785f-4fcrq" Feb 26 16:05:47 crc kubenswrapper[4907]: I0226 16:05:47.336154 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/80cd7152-934f-40c6-925c-a3f1f9dfca95-etc-swift\") pod \"swift-proxy-58d5d7785f-4fcrq\" (UID: \"80cd7152-934f-40c6-925c-a3f1f9dfca95\") " pod="openstack/swift-proxy-58d5d7785f-4fcrq" Feb 26 16:05:47 crc kubenswrapper[4907]: I0226 16:05:47.336256 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/80cd7152-934f-40c6-925c-a3f1f9dfca95-combined-ca-bundle\") pod \"swift-proxy-58d5d7785f-4fcrq\" (UID: \"80cd7152-934f-40c6-925c-a3f1f9dfca95\") " 
pod="openstack/swift-proxy-58d5d7785f-4fcrq" Feb 26 16:05:47 crc kubenswrapper[4907]: I0226 16:05:47.336288 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/80cd7152-934f-40c6-925c-a3f1f9dfca95-public-tls-certs\") pod \"swift-proxy-58d5d7785f-4fcrq\" (UID: \"80cd7152-934f-40c6-925c-a3f1f9dfca95\") " pod="openstack/swift-proxy-58d5d7785f-4fcrq" Feb 26 16:05:47 crc kubenswrapper[4907]: I0226 16:05:47.336312 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/80cd7152-934f-40c6-925c-a3f1f9dfca95-run-httpd\") pod \"swift-proxy-58d5d7785f-4fcrq\" (UID: \"80cd7152-934f-40c6-925c-a3f1f9dfca95\") " pod="openstack/swift-proxy-58d5d7785f-4fcrq" Feb 26 16:05:47 crc kubenswrapper[4907]: I0226 16:05:47.336334 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/80cd7152-934f-40c6-925c-a3f1f9dfca95-config-data\") pod \"swift-proxy-58d5d7785f-4fcrq\" (UID: \"80cd7152-934f-40c6-925c-a3f1f9dfca95\") " pod="openstack/swift-proxy-58d5d7785f-4fcrq" Feb 26 16:05:47 crc kubenswrapper[4907]: I0226 16:05:47.336399 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/80cd7152-934f-40c6-925c-a3f1f9dfca95-internal-tls-certs\") pod \"swift-proxy-58d5d7785f-4fcrq\" (UID: \"80cd7152-934f-40c6-925c-a3f1f9dfca95\") " pod="openstack/swift-proxy-58d5d7785f-4fcrq" Feb 26 16:05:47 crc kubenswrapper[4907]: I0226 16:05:47.336437 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/80cd7152-934f-40c6-925c-a3f1f9dfca95-log-httpd\") pod \"swift-proxy-58d5d7785f-4fcrq\" (UID: \"80cd7152-934f-40c6-925c-a3f1f9dfca95\") " pod="openstack/swift-proxy-58d5d7785f-4fcrq" Feb 26 16:05:47 crc 
kubenswrapper[4907]: I0226 16:05:47.336988 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/80cd7152-934f-40c6-925c-a3f1f9dfca95-log-httpd\") pod \"swift-proxy-58d5d7785f-4fcrq\" (UID: \"80cd7152-934f-40c6-925c-a3f1f9dfca95\") " pod="openstack/swift-proxy-58d5d7785f-4fcrq" Feb 26 16:05:47 crc kubenswrapper[4907]: I0226 16:05:47.337246 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/80cd7152-934f-40c6-925c-a3f1f9dfca95-run-httpd\") pod \"swift-proxy-58d5d7785f-4fcrq\" (UID: \"80cd7152-934f-40c6-925c-a3f1f9dfca95\") " pod="openstack/swift-proxy-58d5d7785f-4fcrq" Feb 26 16:05:47 crc kubenswrapper[4907]: I0226 16:05:47.348757 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/80cd7152-934f-40c6-925c-a3f1f9dfca95-config-data\") pod \"swift-proxy-58d5d7785f-4fcrq\" (UID: \"80cd7152-934f-40c6-925c-a3f1f9dfca95\") " pod="openstack/swift-proxy-58d5d7785f-4fcrq" Feb 26 16:05:47 crc kubenswrapper[4907]: I0226 16:05:47.351979 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/80cd7152-934f-40c6-925c-a3f1f9dfca95-public-tls-certs\") pod \"swift-proxy-58d5d7785f-4fcrq\" (UID: \"80cd7152-934f-40c6-925c-a3f1f9dfca95\") " pod="openstack/swift-proxy-58d5d7785f-4fcrq" Feb 26 16:05:47 crc kubenswrapper[4907]: I0226 16:05:47.355316 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/80cd7152-934f-40c6-925c-a3f1f9dfca95-internal-tls-certs\") pod \"swift-proxy-58d5d7785f-4fcrq\" (UID: \"80cd7152-934f-40c6-925c-a3f1f9dfca95\") " pod="openstack/swift-proxy-58d5d7785f-4fcrq" Feb 26 16:05:47 crc kubenswrapper[4907]: I0226 16:05:47.355648 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"etc-swift\" (UniqueName: \"kubernetes.io/projected/80cd7152-934f-40c6-925c-a3f1f9dfca95-etc-swift\") pod \"swift-proxy-58d5d7785f-4fcrq\" (UID: \"80cd7152-934f-40c6-925c-a3f1f9dfca95\") " pod="openstack/swift-proxy-58d5d7785f-4fcrq" Feb 26 16:05:47 crc kubenswrapper[4907]: I0226 16:05:47.368978 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/80cd7152-934f-40c6-925c-a3f1f9dfca95-combined-ca-bundle\") pod \"swift-proxy-58d5d7785f-4fcrq\" (UID: \"80cd7152-934f-40c6-925c-a3f1f9dfca95\") " pod="openstack/swift-proxy-58d5d7785f-4fcrq" Feb 26 16:05:47 crc kubenswrapper[4907]: I0226 16:05:47.387839 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8fmw9\" (UniqueName: \"kubernetes.io/projected/80cd7152-934f-40c6-925c-a3f1f9dfca95-kube-api-access-8fmw9\") pod \"swift-proxy-58d5d7785f-4fcrq\" (UID: \"80cd7152-934f-40c6-925c-a3f1f9dfca95\") " pod="openstack/swift-proxy-58d5d7785f-4fcrq" Feb 26 16:05:47 crc kubenswrapper[4907]: I0226 16:05:47.495997 4907 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/swift-proxy-58d5d7785f-4fcrq" Feb 26 16:05:47 crc kubenswrapper[4907]: I0226 16:05:47.698401 4907 generic.go:334] "Generic (PLEG): container finished" podID="a140df23-061c-4941-855b-3c829a96d63e" containerID="fc2090ab19285a87d4abfaf72a5de580432719b47c4f5656ad92e60309f9e518" exitCode=0 Feb 26 16:05:47 crc kubenswrapper[4907]: I0226 16:05:47.698480 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"a140df23-061c-4941-855b-3c829a96d63e","Type":"ContainerDied","Data":"fc2090ab19285a87d4abfaf72a5de580432719b47c4f5656ad92e60309f9e518"} Feb 26 16:05:47 crc kubenswrapper[4907]: I0226 16:05:47.707120 4907 generic.go:334] "Generic (PLEG): container finished" podID="9db5f721-707b-490c-917f-b3b2a85af07c" containerID="ec96daad476960899ae24b8d7b97b3b4d5268b7123c5940d0d408ce586513717" exitCode=0 Feb 26 16:05:47 crc kubenswrapper[4907]: I0226 16:05:47.707158 4907 generic.go:334] "Generic (PLEG): container finished" podID="9db5f721-707b-490c-917f-b3b2a85af07c" containerID="59d0fd0269ec491062ba3cef75eb411111e2fece4fdb177e8d8680c56412909d" exitCode=2 Feb 26 16:05:47 crc kubenswrapper[4907]: I0226 16:05:47.707166 4907 generic.go:334] "Generic (PLEG): container finished" podID="9db5f721-707b-490c-917f-b3b2a85af07c" containerID="b9f6c67b6afcd8e9c24e088d3c406309aab828e08ec4e7a7619487dceedb4bd7" exitCode=0 Feb 26 16:05:47 crc kubenswrapper[4907]: I0226 16:05:47.707188 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"9db5f721-707b-490c-917f-b3b2a85af07c","Type":"ContainerDied","Data":"ec96daad476960899ae24b8d7b97b3b4d5268b7123c5940d0d408ce586513717"} Feb 26 16:05:47 crc kubenswrapper[4907]: I0226 16:05:47.707220 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"9db5f721-707b-490c-917f-b3b2a85af07c","Type":"ContainerDied","Data":"59d0fd0269ec491062ba3cef75eb411111e2fece4fdb177e8d8680c56412909d"} Feb 26 
16:05:47 crc kubenswrapper[4907]: I0226 16:05:47.707233 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"9db5f721-707b-490c-917f-b3b2a85af07c","Type":"ContainerDied","Data":"b9f6c67b6afcd8e9c24e088d3c406309aab828e08ec4e7a7619487dceedb4bd7"} Feb 26 16:05:48 crc kubenswrapper[4907]: I0226 16:05:48.322065 4907 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-scheduler-0" Feb 26 16:05:48 crc kubenswrapper[4907]: I0226 16:05:48.338842 4907 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Feb 26 16:05:48 crc kubenswrapper[4907]: I0226 16:05:48.362176 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/a140df23-061c-4941-855b-3c829a96d63e-etc-machine-id\") pod \"a140df23-061c-4941-855b-3c829a96d63e\" (UID: \"a140df23-061c-4941-855b-3c829a96d63e\") " Feb 26 16:05:48 crc kubenswrapper[4907]: I0226 16:05:48.362243 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9db5f721-707b-490c-917f-b3b2a85af07c-config-data\") pod \"9db5f721-707b-490c-917f-b3b2a85af07c\" (UID: \"9db5f721-707b-490c-917f-b3b2a85af07c\") " Feb 26 16:05:48 crc kubenswrapper[4907]: I0226 16:05:48.362294 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/a140df23-061c-4941-855b-3c829a96d63e-config-data-custom\") pod \"a140df23-061c-4941-855b-3c829a96d63e\" (UID: \"a140df23-061c-4941-855b-3c829a96d63e\") " Feb 26 16:05:48 crc kubenswrapper[4907]: I0226 16:05:48.362315 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-tpp5j\" (UniqueName: \"kubernetes.io/projected/9db5f721-707b-490c-917f-b3b2a85af07c-kube-api-access-tpp5j\") pod 
\"9db5f721-707b-490c-917f-b3b2a85af07c\" (UID: \"9db5f721-707b-490c-917f-b3b2a85af07c\") " Feb 26 16:05:48 crc kubenswrapper[4907]: I0226 16:05:48.362360 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/9db5f721-707b-490c-917f-b3b2a85af07c-run-httpd\") pod \"9db5f721-707b-490c-917f-b3b2a85af07c\" (UID: \"9db5f721-707b-490c-917f-b3b2a85af07c\") " Feb 26 16:05:48 crc kubenswrapper[4907]: I0226 16:05:48.362381 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a140df23-061c-4941-855b-3c829a96d63e-config-data\") pod \"a140df23-061c-4941-855b-3c829a96d63e\" (UID: \"a140df23-061c-4941-855b-3c829a96d63e\") " Feb 26 16:05:48 crc kubenswrapper[4907]: I0226 16:05:48.362399 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/9db5f721-707b-490c-917f-b3b2a85af07c-sg-core-conf-yaml\") pod \"9db5f721-707b-490c-917f-b3b2a85af07c\" (UID: \"9db5f721-707b-490c-917f-b3b2a85af07c\") " Feb 26 16:05:48 crc kubenswrapper[4907]: I0226 16:05:48.362433 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/9db5f721-707b-490c-917f-b3b2a85af07c-log-httpd\") pod \"9db5f721-707b-490c-917f-b3b2a85af07c\" (UID: \"9db5f721-707b-490c-917f-b3b2a85af07c\") " Feb 26 16:05:48 crc kubenswrapper[4907]: I0226 16:05:48.362454 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9db5f721-707b-490c-917f-b3b2a85af07c-combined-ca-bundle\") pod \"9db5f721-707b-490c-917f-b3b2a85af07c\" (UID: \"9db5f721-707b-490c-917f-b3b2a85af07c\") " Feb 26 16:05:48 crc kubenswrapper[4907]: I0226 16:05:48.362470 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume 
\"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a140df23-061c-4941-855b-3c829a96d63e-combined-ca-bundle\") pod \"a140df23-061c-4941-855b-3c829a96d63e\" (UID: \"a140df23-061c-4941-855b-3c829a96d63e\") " Feb 26 16:05:48 crc kubenswrapper[4907]: I0226 16:05:48.362515 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/a140df23-061c-4941-855b-3c829a96d63e-scripts\") pod \"a140df23-061c-4941-855b-3c829a96d63e\" (UID: \"a140df23-061c-4941-855b-3c829a96d63e\") " Feb 26 16:05:48 crc kubenswrapper[4907]: I0226 16:05:48.362533 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/9db5f721-707b-490c-917f-b3b2a85af07c-scripts\") pod \"9db5f721-707b-490c-917f-b3b2a85af07c\" (UID: \"9db5f721-707b-490c-917f-b3b2a85af07c\") " Feb 26 16:05:48 crc kubenswrapper[4907]: I0226 16:05:48.362579 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-x5cwj\" (UniqueName: \"kubernetes.io/projected/a140df23-061c-4941-855b-3c829a96d63e-kube-api-access-x5cwj\") pod \"a140df23-061c-4941-855b-3c829a96d63e\" (UID: \"a140df23-061c-4941-855b-3c829a96d63e\") " Feb 26 16:05:48 crc kubenswrapper[4907]: I0226 16:05:48.423274 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/9db5f721-707b-490c-917f-b3b2a85af07c-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "9db5f721-707b-490c-917f-b3b2a85af07c" (UID: "9db5f721-707b-490c-917f-b3b2a85af07c"). InnerVolumeSpecName "log-httpd". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 26 16:05:48 crc kubenswrapper[4907]: I0226 16:05:48.424249 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/a140df23-061c-4941-855b-3c829a96d63e-etc-machine-id" (OuterVolumeSpecName: "etc-machine-id") pod "a140df23-061c-4941-855b-3c829a96d63e" (UID: "a140df23-061c-4941-855b-3c829a96d63e"). InnerVolumeSpecName "etc-machine-id". PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 26 16:05:48 crc kubenswrapper[4907]: I0226 16:05:48.450078 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9db5f721-707b-490c-917f-b3b2a85af07c-kube-api-access-tpp5j" (OuterVolumeSpecName: "kube-api-access-tpp5j") pod "9db5f721-707b-490c-917f-b3b2a85af07c" (UID: "9db5f721-707b-490c-917f-b3b2a85af07c"). InnerVolumeSpecName "kube-api-access-tpp5j". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 26 16:05:48 crc kubenswrapper[4907]: I0226 16:05:48.457001 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a140df23-061c-4941-855b-3c829a96d63e-scripts" (OuterVolumeSpecName: "scripts") pod "a140df23-061c-4941-855b-3c829a96d63e" (UID: "a140df23-061c-4941-855b-3c829a96d63e"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 26 16:05:48 crc kubenswrapper[4907]: I0226 16:05:48.459056 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9db5f721-707b-490c-917f-b3b2a85af07c-scripts" (OuterVolumeSpecName: "scripts") pod "9db5f721-707b-490c-917f-b3b2a85af07c" (UID: "9db5f721-707b-490c-917f-b3b2a85af07c"). InnerVolumeSpecName "scripts". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 26 16:05:48 crc kubenswrapper[4907]: I0226 16:05:48.467920 4907 reconciler_common.go:293] "Volume detached for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/a140df23-061c-4941-855b-3c829a96d63e-etc-machine-id\") on node \"crc\" DevicePath \"\"" Feb 26 16:05:48 crc kubenswrapper[4907]: I0226 16:05:48.467956 4907 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-tpp5j\" (UniqueName: \"kubernetes.io/projected/9db5f721-707b-490c-917f-b3b2a85af07c-kube-api-access-tpp5j\") on node \"crc\" DevicePath \"\"" Feb 26 16:05:48 crc kubenswrapper[4907]: I0226 16:05:48.467977 4907 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/9db5f721-707b-490c-917f-b3b2a85af07c-log-httpd\") on node \"crc\" DevicePath \"\"" Feb 26 16:05:48 crc kubenswrapper[4907]: I0226 16:05:48.467988 4907 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/a140df23-061c-4941-855b-3c829a96d63e-scripts\") on node \"crc\" DevicePath \"\"" Feb 26 16:05:48 crc kubenswrapper[4907]: I0226 16:05:48.468000 4907 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/9db5f721-707b-490c-917f-b3b2a85af07c-scripts\") on node \"crc\" DevicePath \"\"" Feb 26 16:05:48 crc kubenswrapper[4907]: I0226 16:05:48.472444 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a140df23-061c-4941-855b-3c829a96d63e-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "a140df23-061c-4941-855b-3c829a96d63e" (UID: "a140df23-061c-4941-855b-3c829a96d63e"). InnerVolumeSpecName "config-data-custom". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 26 16:05:48 crc kubenswrapper[4907]: I0226 16:05:48.475328 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a140df23-061c-4941-855b-3c829a96d63e-kube-api-access-x5cwj" (OuterVolumeSpecName: "kube-api-access-x5cwj") pod "a140df23-061c-4941-855b-3c829a96d63e" (UID: "a140df23-061c-4941-855b-3c829a96d63e"). InnerVolumeSpecName "kube-api-access-x5cwj". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 26 16:05:48 crc kubenswrapper[4907]: I0226 16:05:48.531058 4907 patch_prober.go:28] interesting pod/machine-config-daemon-v5ng6 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 26 16:05:48 crc kubenswrapper[4907]: I0226 16:05:48.531115 4907 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-v5ng6" podUID="917eebf3-db36-47b8-af0a-b80d042fddab" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 26 16:05:48 crc kubenswrapper[4907]: I0226 16:05:48.531154 4907 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-v5ng6" Feb 26 16:05:48 crc kubenswrapper[4907]: I0226 16:05:48.532124 4907 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"39faa61e9e899f01de0dcddf00d83aac761ca87f8fd53bc6d256f2980199847a"} pod="openshift-machine-config-operator/machine-config-daemon-v5ng6" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Feb 26 16:05:48 crc kubenswrapper[4907]: I0226 16:05:48.532194 4907 
kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-v5ng6" podUID="917eebf3-db36-47b8-af0a-b80d042fddab" containerName="machine-config-daemon" containerID="cri-o://39faa61e9e899f01de0dcddf00d83aac761ca87f8fd53bc6d256f2980199847a" gracePeriod=600 Feb 26 16:05:48 crc kubenswrapper[4907]: I0226 16:05:48.569984 4907 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/a140df23-061c-4941-855b-3c829a96d63e-config-data-custom\") on node \"crc\" DevicePath \"\"" Feb 26 16:05:48 crc kubenswrapper[4907]: I0226 16:05:48.570351 4907 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-x5cwj\" (UniqueName: \"kubernetes.io/projected/a140df23-061c-4941-855b-3c829a96d63e-kube-api-access-x5cwj\") on node \"crc\" DevicePath \"\"" Feb 26 16:05:48 crc kubenswrapper[4907]: I0226 16:05:48.642783 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/9db5f721-707b-490c-917f-b3b2a85af07c-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "9db5f721-707b-490c-917f-b3b2a85af07c" (UID: "9db5f721-707b-490c-917f-b3b2a85af07c"). InnerVolumeSpecName "run-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 26 16:05:48 crc kubenswrapper[4907]: I0226 16:05:48.649826 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9db5f721-707b-490c-917f-b3b2a85af07c-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "9db5f721-707b-490c-917f-b3b2a85af07c" (UID: "9db5f721-707b-490c-917f-b3b2a85af07c"). InnerVolumeSpecName "sg-core-conf-yaml". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 26 16:05:48 crc kubenswrapper[4907]: I0226 16:05:48.683993 4907 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/9db5f721-707b-490c-917f-b3b2a85af07c-run-httpd\") on node \"crc\" DevicePath \"\"" Feb 26 16:05:48 crc kubenswrapper[4907]: I0226 16:05:48.684042 4907 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/9db5f721-707b-490c-917f-b3b2a85af07c-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\"" Feb 26 16:05:48 crc kubenswrapper[4907]: I0226 16:05:48.712385 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9db5f721-707b-490c-917f-b3b2a85af07c-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "9db5f721-707b-490c-917f-b3b2a85af07c" (UID: "9db5f721-707b-490c-917f-b3b2a85af07c"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 26 16:05:48 crc kubenswrapper[4907]: I0226 16:05:48.721612 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a140df23-061c-4941-855b-3c829a96d63e-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "a140df23-061c-4941-855b-3c829a96d63e" (UID: "a140df23-061c-4941-855b-3c829a96d63e"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 26 16:05:48 crc kubenswrapper[4907]: I0226 16:05:48.740247 4907 generic.go:334] "Generic (PLEG): container finished" podID="917eebf3-db36-47b8-af0a-b80d042fddab" containerID="39faa61e9e899f01de0dcddf00d83aac761ca87f8fd53bc6d256f2980199847a" exitCode=0 Feb 26 16:05:48 crc kubenswrapper[4907]: I0226 16:05:48.740341 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-v5ng6" event={"ID":"917eebf3-db36-47b8-af0a-b80d042fddab","Type":"ContainerDied","Data":"39faa61e9e899f01de0dcddf00d83aac761ca87f8fd53bc6d256f2980199847a"} Feb 26 16:05:48 crc kubenswrapper[4907]: I0226 16:05:48.740389 4907 scope.go:117] "RemoveContainer" containerID="2db300a26f9a65971b75abb9b1132aae00d9a358285f4cb580b858c6563b8062" Feb 26 16:05:48 crc kubenswrapper[4907]: I0226 16:05:48.775894 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"a140df23-061c-4941-855b-3c829a96d63e","Type":"ContainerDied","Data":"9b600d5ab8673077a9cf46fc59d20a9da044aa282fff1f10137aff37b15eb713"} Feb 26 16:05:48 crc kubenswrapper[4907]: I0226 16:05:48.776044 4907 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-scheduler-0" Feb 26 16:05:48 crc kubenswrapper[4907]: I0226 16:05:48.786982 4907 generic.go:334] "Generic (PLEG): container finished" podID="9db5f721-707b-490c-917f-b3b2a85af07c" containerID="7d2427c956d607ef4cf148c0c113847a907484ad07c5294237e191bf9cdc5e29" exitCode=0 Feb 26 16:05:48 crc kubenswrapper[4907]: I0226 16:05:48.787042 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"9db5f721-707b-490c-917f-b3b2a85af07c","Type":"ContainerDied","Data":"7d2427c956d607ef4cf148c0c113847a907484ad07c5294237e191bf9cdc5e29"} Feb 26 16:05:48 crc kubenswrapper[4907]: I0226 16:05:48.787070 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"9db5f721-707b-490c-917f-b3b2a85af07c","Type":"ContainerDied","Data":"5a71a03405c136f01cda642c4e413f563b131353601cd16daa4d26321b46eaf1"} Feb 26 16:05:48 crc kubenswrapper[4907]: I0226 16:05:48.787155 4907 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Feb 26 16:05:48 crc kubenswrapper[4907]: I0226 16:05:48.789038 4907 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9db5f721-707b-490c-917f-b3b2a85af07c-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 26 16:05:48 crc kubenswrapper[4907]: I0226 16:05:48.789064 4907 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a140df23-061c-4941-855b-3c829a96d63e-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 26 16:05:48 crc kubenswrapper[4907]: I0226 16:05:48.830908 4907 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/swift-proxy-58d5d7785f-4fcrq"] Feb 26 16:05:48 crc kubenswrapper[4907]: I0226 16:05:48.848727 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9db5f721-707b-490c-917f-b3b2a85af07c-config-data" (OuterVolumeSpecName: "config-data") pod "9db5f721-707b-490c-917f-b3b2a85af07c" (UID: "9db5f721-707b-490c-917f-b3b2a85af07c"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 26 16:05:48 crc kubenswrapper[4907]: I0226 16:05:48.865728 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a140df23-061c-4941-855b-3c829a96d63e-config-data" (OuterVolumeSpecName: "config-data") pod "a140df23-061c-4941-855b-3c829a96d63e" (UID: "a140df23-061c-4941-855b-3c829a96d63e"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 26 16:05:48 crc kubenswrapper[4907]: I0226 16:05:48.891167 4907 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9db5f721-707b-490c-917f-b3b2a85af07c-config-data\") on node \"crc\" DevicePath \"\"" Feb 26 16:05:48 crc kubenswrapper[4907]: I0226 16:05:48.891199 4907 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a140df23-061c-4941-855b-3c829a96d63e-config-data\") on node \"crc\" DevicePath \"\"" Feb 26 16:05:48 crc kubenswrapper[4907]: I0226 16:05:48.901934 4907 scope.go:117] "RemoveContainer" containerID="a625102b2c1469077b1b807e98392aef0565b381162ae06153ceb15d285d8550" Feb 26 16:05:48 crc kubenswrapper[4907]: I0226 16:05:48.937580 4907 scope.go:117] "RemoveContainer" containerID="fc2090ab19285a87d4abfaf72a5de580432719b47c4f5656ad92e60309f9e518" Feb 26 16:05:49 crc kubenswrapper[4907]: I0226 16:05:49.002159 4907 scope.go:117] "RemoveContainer" containerID="ec96daad476960899ae24b8d7b97b3b4d5268b7123c5940d0d408ce586513717" Feb 26 16:05:49 crc kubenswrapper[4907]: I0226 16:05:49.048642 4907 scope.go:117] "RemoveContainer" containerID="59d0fd0269ec491062ba3cef75eb411111e2fece4fdb177e8d8680c56412909d" Feb 26 16:05:49 crc kubenswrapper[4907]: I0226 16:05:49.168356 4907 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Feb 26 16:05:49 crc kubenswrapper[4907]: I0226 16:05:49.187124 4907 scope.go:117] "RemoveContainer" containerID="7d2427c956d607ef4cf148c0c113847a907484ad07c5294237e191bf9cdc5e29" Feb 26 16:05:49 crc kubenswrapper[4907]: I0226 16:05:49.195722 4907 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ceilometer-0"] Feb 26 16:05:49 crc kubenswrapper[4907]: I0226 16:05:49.237280 4907 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"] Feb 26 16:05:49 crc kubenswrapper[4907]: E0226 16:05:49.237663 4907 cpu_manager.go:410] 
"RemoveStaleState: removing container" podUID="9db5f721-707b-490c-917f-b3b2a85af07c" containerName="ceilometer-central-agent" Feb 26 16:05:49 crc kubenswrapper[4907]: I0226 16:05:49.237679 4907 state_mem.go:107] "Deleted CPUSet assignment" podUID="9db5f721-707b-490c-917f-b3b2a85af07c" containerName="ceilometer-central-agent" Feb 26 16:05:49 crc kubenswrapper[4907]: E0226 16:05:49.237703 4907 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a140df23-061c-4941-855b-3c829a96d63e" containerName="cinder-scheduler" Feb 26 16:05:49 crc kubenswrapper[4907]: I0226 16:05:49.237711 4907 state_mem.go:107] "Deleted CPUSet assignment" podUID="a140df23-061c-4941-855b-3c829a96d63e" containerName="cinder-scheduler" Feb 26 16:05:49 crc kubenswrapper[4907]: E0226 16:05:49.237727 4907 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9db5f721-707b-490c-917f-b3b2a85af07c" containerName="ceilometer-notification-agent" Feb 26 16:05:49 crc kubenswrapper[4907]: I0226 16:05:49.237733 4907 state_mem.go:107] "Deleted CPUSet assignment" podUID="9db5f721-707b-490c-917f-b3b2a85af07c" containerName="ceilometer-notification-agent" Feb 26 16:05:49 crc kubenswrapper[4907]: E0226 16:05:49.237747 4907 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9db5f721-707b-490c-917f-b3b2a85af07c" containerName="proxy-httpd" Feb 26 16:05:49 crc kubenswrapper[4907]: I0226 16:05:49.237753 4907 state_mem.go:107] "Deleted CPUSet assignment" podUID="9db5f721-707b-490c-917f-b3b2a85af07c" containerName="proxy-httpd" Feb 26 16:05:49 crc kubenswrapper[4907]: E0226 16:05:49.237764 4907 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9db5f721-707b-490c-917f-b3b2a85af07c" containerName="sg-core" Feb 26 16:05:49 crc kubenswrapper[4907]: I0226 16:05:49.237770 4907 state_mem.go:107] "Deleted CPUSet assignment" podUID="9db5f721-707b-490c-917f-b3b2a85af07c" containerName="sg-core" Feb 26 16:05:49 crc kubenswrapper[4907]: E0226 16:05:49.237792 4907 
cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a140df23-061c-4941-855b-3c829a96d63e" containerName="probe" Feb 26 16:05:49 crc kubenswrapper[4907]: I0226 16:05:49.237798 4907 state_mem.go:107] "Deleted CPUSet assignment" podUID="a140df23-061c-4941-855b-3c829a96d63e" containerName="probe" Feb 26 16:05:49 crc kubenswrapper[4907]: I0226 16:05:49.237950 4907 memory_manager.go:354] "RemoveStaleState removing state" podUID="9db5f721-707b-490c-917f-b3b2a85af07c" containerName="ceilometer-notification-agent" Feb 26 16:05:49 crc kubenswrapper[4907]: I0226 16:05:49.237962 4907 memory_manager.go:354] "RemoveStaleState removing state" podUID="a140df23-061c-4941-855b-3c829a96d63e" containerName="cinder-scheduler" Feb 26 16:05:49 crc kubenswrapper[4907]: I0226 16:05:49.237977 4907 memory_manager.go:354] "RemoveStaleState removing state" podUID="a140df23-061c-4941-855b-3c829a96d63e" containerName="probe" Feb 26 16:05:49 crc kubenswrapper[4907]: I0226 16:05:49.237988 4907 memory_manager.go:354] "RemoveStaleState removing state" podUID="9db5f721-707b-490c-917f-b3b2a85af07c" containerName="sg-core" Feb 26 16:05:49 crc kubenswrapper[4907]: I0226 16:05:49.237995 4907 memory_manager.go:354] "RemoveStaleState removing state" podUID="9db5f721-707b-490c-917f-b3b2a85af07c" containerName="proxy-httpd" Feb 26 16:05:49 crc kubenswrapper[4907]: I0226 16:05:49.238011 4907 memory_manager.go:354] "RemoveStaleState removing state" podUID="9db5f721-707b-490c-917f-b3b2a85af07c" containerName="ceilometer-central-agent" Feb 26 16:05:49 crc kubenswrapper[4907]: I0226 16:05:49.239467 4907 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Feb 26 16:05:49 crc kubenswrapper[4907]: I0226 16:05:49.241969 4907 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts" Feb 26 16:05:49 crc kubenswrapper[4907]: I0226 16:05:49.242138 4907 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data" Feb 26 16:05:49 crc kubenswrapper[4907]: I0226 16:05:49.264956 4907 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-scheduler-0"] Feb 26 16:05:49 crc kubenswrapper[4907]: I0226 16:05:49.266083 4907 scope.go:117] "RemoveContainer" containerID="b9f6c67b6afcd8e9c24e088d3c406309aab828e08ec4e7a7619487dceedb4bd7" Feb 26 16:05:49 crc kubenswrapper[4907]: I0226 16:05:49.279558 4907 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/cinder-scheduler-0"] Feb 26 16:05:49 crc kubenswrapper[4907]: I0226 16:05:49.289659 4907 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Feb 26 16:05:49 crc kubenswrapper[4907]: I0226 16:05:49.300660 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/a114e8dd-3cb1-4b1a-8f49-48b99c39da3b-scripts\") pod \"ceilometer-0\" (UID: \"a114e8dd-3cb1-4b1a-8f49-48b99c39da3b\") " pod="openstack/ceilometer-0" Feb 26 16:05:49 crc kubenswrapper[4907]: I0226 16:05:49.300718 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a114e8dd-3cb1-4b1a-8f49-48b99c39da3b-config-data\") pod \"ceilometer-0\" (UID: \"a114e8dd-3cb1-4b1a-8f49-48b99c39da3b\") " pod="openstack/ceilometer-0" Feb 26 16:05:49 crc kubenswrapper[4907]: I0226 16:05:49.300770 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: 
\"kubernetes.io/secret/a114e8dd-3cb1-4b1a-8f49-48b99c39da3b-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"a114e8dd-3cb1-4b1a-8f49-48b99c39da3b\") " pod="openstack/ceilometer-0" Feb 26 16:05:49 crc kubenswrapper[4907]: I0226 16:05:49.300791 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/a114e8dd-3cb1-4b1a-8f49-48b99c39da3b-log-httpd\") pod \"ceilometer-0\" (UID: \"a114e8dd-3cb1-4b1a-8f49-48b99c39da3b\") " pod="openstack/ceilometer-0" Feb 26 16:05:49 crc kubenswrapper[4907]: I0226 16:05:49.300841 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/a114e8dd-3cb1-4b1a-8f49-48b99c39da3b-run-httpd\") pod \"ceilometer-0\" (UID: \"a114e8dd-3cb1-4b1a-8f49-48b99c39da3b\") " pod="openstack/ceilometer-0" Feb 26 16:05:49 crc kubenswrapper[4907]: I0226 16:05:49.300863 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5ksv6\" (UniqueName: \"kubernetes.io/projected/a114e8dd-3cb1-4b1a-8f49-48b99c39da3b-kube-api-access-5ksv6\") pod \"ceilometer-0\" (UID: \"a114e8dd-3cb1-4b1a-8f49-48b99c39da3b\") " pod="openstack/ceilometer-0" Feb 26 16:05:49 crc kubenswrapper[4907]: I0226 16:05:49.300903 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a114e8dd-3cb1-4b1a-8f49-48b99c39da3b-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"a114e8dd-3cb1-4b1a-8f49-48b99c39da3b\") " pod="openstack/ceilometer-0" Feb 26 16:05:49 crc kubenswrapper[4907]: I0226 16:05:49.303150 4907 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-scheduler-0"] Feb 26 16:05:49 crc kubenswrapper[4907]: I0226 16:05:49.304551 4907 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-scheduler-0" Feb 26 16:05:49 crc kubenswrapper[4907]: I0226 16:05:49.309781 4907 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-scheduler-config-data" Feb 26 16:05:49 crc kubenswrapper[4907]: I0226 16:05:49.312637 4907 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-scheduler-0"] Feb 26 16:05:49 crc kubenswrapper[4907]: I0226 16:05:49.375186 4907 scope.go:117] "RemoveContainer" containerID="ec96daad476960899ae24b8d7b97b3b4d5268b7123c5940d0d408ce586513717" Feb 26 16:05:49 crc kubenswrapper[4907]: E0226 16:05:49.376839 4907 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ec96daad476960899ae24b8d7b97b3b4d5268b7123c5940d0d408ce586513717\": container with ID starting with ec96daad476960899ae24b8d7b97b3b4d5268b7123c5940d0d408ce586513717 not found: ID does not exist" containerID="ec96daad476960899ae24b8d7b97b3b4d5268b7123c5940d0d408ce586513717" Feb 26 16:05:49 crc kubenswrapper[4907]: I0226 16:05:49.376895 4907 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ec96daad476960899ae24b8d7b97b3b4d5268b7123c5940d0d408ce586513717"} err="failed to get container status \"ec96daad476960899ae24b8d7b97b3b4d5268b7123c5940d0d408ce586513717\": rpc error: code = NotFound desc = could not find container \"ec96daad476960899ae24b8d7b97b3b4d5268b7123c5940d0d408ce586513717\": container with ID starting with ec96daad476960899ae24b8d7b97b3b4d5268b7123c5940d0d408ce586513717 not found: ID does not exist" Feb 26 16:05:49 crc kubenswrapper[4907]: I0226 16:05:49.376923 4907 scope.go:117] "RemoveContainer" containerID="59d0fd0269ec491062ba3cef75eb411111e2fece4fdb177e8d8680c56412909d" Feb 26 16:05:49 crc kubenswrapper[4907]: E0226 16:05:49.381863 4907 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container 
\"59d0fd0269ec491062ba3cef75eb411111e2fece4fdb177e8d8680c56412909d\": container with ID starting with 59d0fd0269ec491062ba3cef75eb411111e2fece4fdb177e8d8680c56412909d not found: ID does not exist" containerID="59d0fd0269ec491062ba3cef75eb411111e2fece4fdb177e8d8680c56412909d" Feb 26 16:05:49 crc kubenswrapper[4907]: I0226 16:05:49.381906 4907 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"59d0fd0269ec491062ba3cef75eb411111e2fece4fdb177e8d8680c56412909d"} err="failed to get container status \"59d0fd0269ec491062ba3cef75eb411111e2fece4fdb177e8d8680c56412909d\": rpc error: code = NotFound desc = could not find container \"59d0fd0269ec491062ba3cef75eb411111e2fece4fdb177e8d8680c56412909d\": container with ID starting with 59d0fd0269ec491062ba3cef75eb411111e2fece4fdb177e8d8680c56412909d not found: ID does not exist" Feb 26 16:05:49 crc kubenswrapper[4907]: I0226 16:05:49.381930 4907 scope.go:117] "RemoveContainer" containerID="7d2427c956d607ef4cf148c0c113847a907484ad07c5294237e191bf9cdc5e29" Feb 26 16:05:49 crc kubenswrapper[4907]: E0226 16:05:49.382210 4907 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"7d2427c956d607ef4cf148c0c113847a907484ad07c5294237e191bf9cdc5e29\": container with ID starting with 7d2427c956d607ef4cf148c0c113847a907484ad07c5294237e191bf9cdc5e29 not found: ID does not exist" containerID="7d2427c956d607ef4cf148c0c113847a907484ad07c5294237e191bf9cdc5e29" Feb 26 16:05:49 crc kubenswrapper[4907]: I0226 16:05:49.382223 4907 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7d2427c956d607ef4cf148c0c113847a907484ad07c5294237e191bf9cdc5e29"} err="failed to get container status \"7d2427c956d607ef4cf148c0c113847a907484ad07c5294237e191bf9cdc5e29\": rpc error: code = NotFound desc = could not find container \"7d2427c956d607ef4cf148c0c113847a907484ad07c5294237e191bf9cdc5e29\": container with ID 
starting with 7d2427c956d607ef4cf148c0c113847a907484ad07c5294237e191bf9cdc5e29 not found: ID does not exist" Feb 26 16:05:49 crc kubenswrapper[4907]: I0226 16:05:49.382237 4907 scope.go:117] "RemoveContainer" containerID="b9f6c67b6afcd8e9c24e088d3c406309aab828e08ec4e7a7619487dceedb4bd7" Feb 26 16:05:49 crc kubenswrapper[4907]: E0226 16:05:49.397138 4907 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b9f6c67b6afcd8e9c24e088d3c406309aab828e08ec4e7a7619487dceedb4bd7\": container with ID starting with b9f6c67b6afcd8e9c24e088d3c406309aab828e08ec4e7a7619487dceedb4bd7 not found: ID does not exist" containerID="b9f6c67b6afcd8e9c24e088d3c406309aab828e08ec4e7a7619487dceedb4bd7" Feb 26 16:05:49 crc kubenswrapper[4907]: I0226 16:05:49.397179 4907 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b9f6c67b6afcd8e9c24e088d3c406309aab828e08ec4e7a7619487dceedb4bd7"} err="failed to get container status \"b9f6c67b6afcd8e9c24e088d3c406309aab828e08ec4e7a7619487dceedb4bd7\": rpc error: code = NotFound desc = could not find container \"b9f6c67b6afcd8e9c24e088d3c406309aab828e08ec4e7a7619487dceedb4bd7\": container with ID starting with b9f6c67b6afcd8e9c24e088d3c406309aab828e08ec4e7a7619487dceedb4bd7 not found: ID does not exist" Feb 26 16:05:49 crc kubenswrapper[4907]: I0226 16:05:49.402946 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/00c049ce-b973-4246-ae47-5fb2a6789fbb-config-data-custom\") pod \"cinder-scheduler-0\" (UID: \"00c049ce-b973-4246-ae47-5fb2a6789fbb\") " pod="openstack/cinder-scheduler-0" Feb 26 16:05:49 crc kubenswrapper[4907]: I0226 16:05:49.402996 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/a114e8dd-3cb1-4b1a-8f49-48b99c39da3b-sg-core-conf-yaml\") 
pod \"ceilometer-0\" (UID: \"a114e8dd-3cb1-4b1a-8f49-48b99c39da3b\") " pod="openstack/ceilometer-0" Feb 26 16:05:49 crc kubenswrapper[4907]: I0226 16:05:49.403015 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/a114e8dd-3cb1-4b1a-8f49-48b99c39da3b-log-httpd\") pod \"ceilometer-0\" (UID: \"a114e8dd-3cb1-4b1a-8f49-48b99c39da3b\") " pod="openstack/ceilometer-0" Feb 26 16:05:49 crc kubenswrapper[4907]: I0226 16:05:49.403036 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lz884\" (UniqueName: \"kubernetes.io/projected/00c049ce-b973-4246-ae47-5fb2a6789fbb-kube-api-access-lz884\") pod \"cinder-scheduler-0\" (UID: \"00c049ce-b973-4246-ae47-5fb2a6789fbb\") " pod="openstack/cinder-scheduler-0" Feb 26 16:05:49 crc kubenswrapper[4907]: I0226 16:05:49.403073 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/00c049ce-b973-4246-ae47-5fb2a6789fbb-scripts\") pod \"cinder-scheduler-0\" (UID: \"00c049ce-b973-4246-ae47-5fb2a6789fbb\") " pod="openstack/cinder-scheduler-0" Feb 26 16:05:49 crc kubenswrapper[4907]: I0226 16:05:49.403100 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/00c049ce-b973-4246-ae47-5fb2a6789fbb-combined-ca-bundle\") pod \"cinder-scheduler-0\" (UID: \"00c049ce-b973-4246-ae47-5fb2a6789fbb\") " pod="openstack/cinder-scheduler-0" Feb 26 16:05:49 crc kubenswrapper[4907]: I0226 16:05:49.403126 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/a114e8dd-3cb1-4b1a-8f49-48b99c39da3b-run-httpd\") pod \"ceilometer-0\" (UID: \"a114e8dd-3cb1-4b1a-8f49-48b99c39da3b\") " pod="openstack/ceilometer-0" Feb 26 16:05:49 crc 
kubenswrapper[4907]: I0226 16:05:49.403146 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/00c049ce-b973-4246-ae47-5fb2a6789fbb-etc-machine-id\") pod \"cinder-scheduler-0\" (UID: \"00c049ce-b973-4246-ae47-5fb2a6789fbb\") " pod="openstack/cinder-scheduler-0" Feb 26 16:05:49 crc kubenswrapper[4907]: I0226 16:05:49.403163 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5ksv6\" (UniqueName: \"kubernetes.io/projected/a114e8dd-3cb1-4b1a-8f49-48b99c39da3b-kube-api-access-5ksv6\") pod \"ceilometer-0\" (UID: \"a114e8dd-3cb1-4b1a-8f49-48b99c39da3b\") " pod="openstack/ceilometer-0" Feb 26 16:05:49 crc kubenswrapper[4907]: I0226 16:05:49.403206 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a114e8dd-3cb1-4b1a-8f49-48b99c39da3b-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"a114e8dd-3cb1-4b1a-8f49-48b99c39da3b\") " pod="openstack/ceilometer-0" Feb 26 16:05:49 crc kubenswrapper[4907]: I0226 16:05:49.403234 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/00c049ce-b973-4246-ae47-5fb2a6789fbb-config-data\") pod \"cinder-scheduler-0\" (UID: \"00c049ce-b973-4246-ae47-5fb2a6789fbb\") " pod="openstack/cinder-scheduler-0" Feb 26 16:05:49 crc kubenswrapper[4907]: I0226 16:05:49.403250 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/a114e8dd-3cb1-4b1a-8f49-48b99c39da3b-scripts\") pod \"ceilometer-0\" (UID: \"a114e8dd-3cb1-4b1a-8f49-48b99c39da3b\") " pod="openstack/ceilometer-0" Feb 26 16:05:49 crc kubenswrapper[4907]: I0226 16:05:49.403286 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: 
\"kubernetes.io/secret/a114e8dd-3cb1-4b1a-8f49-48b99c39da3b-config-data\") pod \"ceilometer-0\" (UID: \"a114e8dd-3cb1-4b1a-8f49-48b99c39da3b\") " pod="openstack/ceilometer-0" Feb 26 16:05:49 crc kubenswrapper[4907]: I0226 16:05:49.406333 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/a114e8dd-3cb1-4b1a-8f49-48b99c39da3b-log-httpd\") pod \"ceilometer-0\" (UID: \"a114e8dd-3cb1-4b1a-8f49-48b99c39da3b\") " pod="openstack/ceilometer-0" Feb 26 16:05:49 crc kubenswrapper[4907]: I0226 16:05:49.406353 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/a114e8dd-3cb1-4b1a-8f49-48b99c39da3b-run-httpd\") pod \"ceilometer-0\" (UID: \"a114e8dd-3cb1-4b1a-8f49-48b99c39da3b\") " pod="openstack/ceilometer-0" Feb 26 16:05:49 crc kubenswrapper[4907]: I0226 16:05:49.413180 4907 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/neutron-6dbb49ff7b-8r7kc"] Feb 26 16:05:49 crc kubenswrapper[4907]: I0226 16:05:49.414666 4907 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/neutron-6dbb49ff7b-8r7kc" podUID="41b49bfa-e783-4c0f-a0f6-f8dfdd5771d3" containerName="neutron-api" containerID="cri-o://28547d3c9f949eac0b565051b0ecc7b68dbb2b883312a78d6701a837e0cc4239" gracePeriod=30 Feb 26 16:05:49 crc kubenswrapper[4907]: I0226 16:05:49.414269 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/a114e8dd-3cb1-4b1a-8f49-48b99c39da3b-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"a114e8dd-3cb1-4b1a-8f49-48b99c39da3b\") " pod="openstack/ceilometer-0" Feb 26 16:05:49 crc kubenswrapper[4907]: I0226 16:05:49.415853 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a114e8dd-3cb1-4b1a-8f49-48b99c39da3b-config-data\") pod \"ceilometer-0\" (UID: 
\"a114e8dd-3cb1-4b1a-8f49-48b99c39da3b\") " pod="openstack/ceilometer-0" Feb 26 16:05:49 crc kubenswrapper[4907]: I0226 16:05:49.423011 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/a114e8dd-3cb1-4b1a-8f49-48b99c39da3b-scripts\") pod \"ceilometer-0\" (UID: \"a114e8dd-3cb1-4b1a-8f49-48b99c39da3b\") " pod="openstack/ceilometer-0" Feb 26 16:05:49 crc kubenswrapper[4907]: I0226 16:05:49.423140 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a114e8dd-3cb1-4b1a-8f49-48b99c39da3b-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"a114e8dd-3cb1-4b1a-8f49-48b99c39da3b\") " pod="openstack/ceilometer-0" Feb 26 16:05:49 crc kubenswrapper[4907]: I0226 16:05:49.435735 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5ksv6\" (UniqueName: \"kubernetes.io/projected/a114e8dd-3cb1-4b1a-8f49-48b99c39da3b-kube-api-access-5ksv6\") pod \"ceilometer-0\" (UID: \"a114e8dd-3cb1-4b1a-8f49-48b99c39da3b\") " pod="openstack/ceilometer-0" Feb 26 16:05:49 crc kubenswrapper[4907]: I0226 16:05:49.480694 4907 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/neutron-6db49c6bf7-w2792"] Feb 26 16:05:49 crc kubenswrapper[4907]: I0226 16:05:49.482711 4907 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-6db49c6bf7-w2792" Feb 26 16:05:49 crc kubenswrapper[4907]: I0226 16:05:49.505835 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/00c049ce-b973-4246-ae47-5fb2a6789fbb-config-data\") pod \"cinder-scheduler-0\" (UID: \"00c049ce-b973-4246-ae47-5fb2a6789fbb\") " pod="openstack/cinder-scheduler-0" Feb 26 16:05:49 crc kubenswrapper[4907]: I0226 16:05:49.506179 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/00c049ce-b973-4246-ae47-5fb2a6789fbb-config-data-custom\") pod \"cinder-scheduler-0\" (UID: \"00c049ce-b973-4246-ae47-5fb2a6789fbb\") " pod="openstack/cinder-scheduler-0" Feb 26 16:05:49 crc kubenswrapper[4907]: I0226 16:05:49.506317 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lz884\" (UniqueName: \"kubernetes.io/projected/00c049ce-b973-4246-ae47-5fb2a6789fbb-kube-api-access-lz884\") pod \"cinder-scheduler-0\" (UID: \"00c049ce-b973-4246-ae47-5fb2a6789fbb\") " pod="openstack/cinder-scheduler-0" Feb 26 16:05:49 crc kubenswrapper[4907]: I0226 16:05:49.506436 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/00c049ce-b973-4246-ae47-5fb2a6789fbb-scripts\") pod \"cinder-scheduler-0\" (UID: \"00c049ce-b973-4246-ae47-5fb2a6789fbb\") " pod="openstack/cinder-scheduler-0" Feb 26 16:05:49 crc kubenswrapper[4907]: I0226 16:05:49.506562 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/00c049ce-b973-4246-ae47-5fb2a6789fbb-combined-ca-bundle\") pod \"cinder-scheduler-0\" (UID: \"00c049ce-b973-4246-ae47-5fb2a6789fbb\") " pod="openstack/cinder-scheduler-0" Feb 26 16:05:49 crc kubenswrapper[4907]: I0226 16:05:49.506669 4907 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/00c049ce-b973-4246-ae47-5fb2a6789fbb-etc-machine-id\") pod \"cinder-scheduler-0\" (UID: \"00c049ce-b973-4246-ae47-5fb2a6789fbb\") " pod="openstack/cinder-scheduler-0" Feb 26 16:05:49 crc kubenswrapper[4907]: I0226 16:05:49.506813 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/00c049ce-b973-4246-ae47-5fb2a6789fbb-etc-machine-id\") pod \"cinder-scheduler-0\" (UID: \"00c049ce-b973-4246-ae47-5fb2a6789fbb\") " pod="openstack/cinder-scheduler-0" Feb 26 16:05:49 crc kubenswrapper[4907]: I0226 16:05:49.518768 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/00c049ce-b973-4246-ae47-5fb2a6789fbb-config-data\") pod \"cinder-scheduler-0\" (UID: \"00c049ce-b973-4246-ae47-5fb2a6789fbb\") " pod="openstack/cinder-scheduler-0" Feb 26 16:05:49 crc kubenswrapper[4907]: I0226 16:05:49.522166 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/00c049ce-b973-4246-ae47-5fb2a6789fbb-scripts\") pod \"cinder-scheduler-0\" (UID: \"00c049ce-b973-4246-ae47-5fb2a6789fbb\") " pod="openstack/cinder-scheduler-0" Feb 26 16:05:49 crc kubenswrapper[4907]: I0226 16:05:49.522753 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/00c049ce-b973-4246-ae47-5fb2a6789fbb-config-data-custom\") pod \"cinder-scheduler-0\" (UID: \"00c049ce-b973-4246-ae47-5fb2a6789fbb\") " pod="openstack/cinder-scheduler-0" Feb 26 16:05:49 crc kubenswrapper[4907]: I0226 16:05:49.533669 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/00c049ce-b973-4246-ae47-5fb2a6789fbb-combined-ca-bundle\") pod 
\"cinder-scheduler-0\" (UID: \"00c049ce-b973-4246-ae47-5fb2a6789fbb\") " pod="openstack/cinder-scheduler-0" Feb 26 16:05:49 crc kubenswrapper[4907]: I0226 16:05:49.540010 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lz884\" (UniqueName: \"kubernetes.io/projected/00c049ce-b973-4246-ae47-5fb2a6789fbb-kube-api-access-lz884\") pod \"cinder-scheduler-0\" (UID: \"00c049ce-b973-4246-ae47-5fb2a6789fbb\") " pod="openstack/cinder-scheduler-0" Feb 26 16:05:49 crc kubenswrapper[4907]: I0226 16:05:49.554596 4907 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-6db49c6bf7-w2792"] Feb 26 16:05:49 crc kubenswrapper[4907]: I0226 16:05:49.599327 4907 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Feb 26 16:05:49 crc kubenswrapper[4907]: I0226 16:05:49.608042 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-txl44\" (UniqueName: \"kubernetes.io/projected/3996ac72-7ea7-4e6f-a714-1a0597f15fde-kube-api-access-txl44\") pod \"neutron-6db49c6bf7-w2792\" (UID: \"3996ac72-7ea7-4e6f-a714-1a0597f15fde\") " pod="openstack/neutron-6db49c6bf7-w2792" Feb 26 16:05:49 crc kubenswrapper[4907]: I0226 16:05:49.608119 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/3996ac72-7ea7-4e6f-a714-1a0597f15fde-internal-tls-certs\") pod \"neutron-6db49c6bf7-w2792\" (UID: \"3996ac72-7ea7-4e6f-a714-1a0597f15fde\") " pod="openstack/neutron-6db49c6bf7-w2792" Feb 26 16:05:49 crc kubenswrapper[4907]: I0226 16:05:49.608409 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/3996ac72-7ea7-4e6f-a714-1a0597f15fde-config\") pod \"neutron-6db49c6bf7-w2792\" (UID: \"3996ac72-7ea7-4e6f-a714-1a0597f15fde\") " 
pod="openstack/neutron-6db49c6bf7-w2792" Feb 26 16:05:49 crc kubenswrapper[4907]: I0226 16:05:49.608576 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3996ac72-7ea7-4e6f-a714-1a0597f15fde-combined-ca-bundle\") pod \"neutron-6db49c6bf7-w2792\" (UID: \"3996ac72-7ea7-4e6f-a714-1a0597f15fde\") " pod="openstack/neutron-6db49c6bf7-w2792" Feb 26 16:05:49 crc kubenswrapper[4907]: I0226 16:05:49.608702 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/3996ac72-7ea7-4e6f-a714-1a0597f15fde-httpd-config\") pod \"neutron-6db49c6bf7-w2792\" (UID: \"3996ac72-7ea7-4e6f-a714-1a0597f15fde\") " pod="openstack/neutron-6db49c6bf7-w2792" Feb 26 16:05:49 crc kubenswrapper[4907]: I0226 16:05:49.608867 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/3996ac72-7ea7-4e6f-a714-1a0597f15fde-public-tls-certs\") pod \"neutron-6db49c6bf7-w2792\" (UID: \"3996ac72-7ea7-4e6f-a714-1a0597f15fde\") " pod="openstack/neutron-6db49c6bf7-w2792" Feb 26 16:05:49 crc kubenswrapper[4907]: I0226 16:05:49.608959 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/3996ac72-7ea7-4e6f-a714-1a0597f15fde-ovndb-tls-certs\") pod \"neutron-6db49c6bf7-w2792\" (UID: \"3996ac72-7ea7-4e6f-a714-1a0597f15fde\") " pod="openstack/neutron-6db49c6bf7-w2792" Feb 26 16:05:49 crc kubenswrapper[4907]: I0226 16:05:49.638975 4907 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-scheduler-0" Feb 26 16:05:49 crc kubenswrapper[4907]: I0226 16:05:49.710748 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/3996ac72-7ea7-4e6f-a714-1a0597f15fde-internal-tls-certs\") pod \"neutron-6db49c6bf7-w2792\" (UID: \"3996ac72-7ea7-4e6f-a714-1a0597f15fde\") " pod="openstack/neutron-6db49c6bf7-w2792" Feb 26 16:05:49 crc kubenswrapper[4907]: I0226 16:05:49.710919 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/3996ac72-7ea7-4e6f-a714-1a0597f15fde-config\") pod \"neutron-6db49c6bf7-w2792\" (UID: \"3996ac72-7ea7-4e6f-a714-1a0597f15fde\") " pod="openstack/neutron-6db49c6bf7-w2792" Feb 26 16:05:49 crc kubenswrapper[4907]: I0226 16:05:49.711021 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3996ac72-7ea7-4e6f-a714-1a0597f15fde-combined-ca-bundle\") pod \"neutron-6db49c6bf7-w2792\" (UID: \"3996ac72-7ea7-4e6f-a714-1a0597f15fde\") " pod="openstack/neutron-6db49c6bf7-w2792" Feb 26 16:05:49 crc kubenswrapper[4907]: I0226 16:05:49.711102 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/3996ac72-7ea7-4e6f-a714-1a0597f15fde-httpd-config\") pod \"neutron-6db49c6bf7-w2792\" (UID: \"3996ac72-7ea7-4e6f-a714-1a0597f15fde\") " pod="openstack/neutron-6db49c6bf7-w2792" Feb 26 16:05:49 crc kubenswrapper[4907]: I0226 16:05:49.711188 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/3996ac72-7ea7-4e6f-a714-1a0597f15fde-public-tls-certs\") pod \"neutron-6db49c6bf7-w2792\" (UID: \"3996ac72-7ea7-4e6f-a714-1a0597f15fde\") " pod="openstack/neutron-6db49c6bf7-w2792" Feb 26 16:05:49 crc kubenswrapper[4907]: I0226 
16:05:49.711265 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/3996ac72-7ea7-4e6f-a714-1a0597f15fde-ovndb-tls-certs\") pod \"neutron-6db49c6bf7-w2792\" (UID: \"3996ac72-7ea7-4e6f-a714-1a0597f15fde\") " pod="openstack/neutron-6db49c6bf7-w2792" Feb 26 16:05:49 crc kubenswrapper[4907]: I0226 16:05:49.711357 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-txl44\" (UniqueName: \"kubernetes.io/projected/3996ac72-7ea7-4e6f-a714-1a0597f15fde-kube-api-access-txl44\") pod \"neutron-6db49c6bf7-w2792\" (UID: \"3996ac72-7ea7-4e6f-a714-1a0597f15fde\") " pod="openstack/neutron-6db49c6bf7-w2792" Feb 26 16:05:49 crc kubenswrapper[4907]: I0226 16:05:49.715783 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/3996ac72-7ea7-4e6f-a714-1a0597f15fde-public-tls-certs\") pod \"neutron-6db49c6bf7-w2792\" (UID: \"3996ac72-7ea7-4e6f-a714-1a0597f15fde\") " pod="openstack/neutron-6db49c6bf7-w2792" Feb 26 16:05:49 crc kubenswrapper[4907]: I0226 16:05:49.724261 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3996ac72-7ea7-4e6f-a714-1a0597f15fde-combined-ca-bundle\") pod \"neutron-6db49c6bf7-w2792\" (UID: \"3996ac72-7ea7-4e6f-a714-1a0597f15fde\") " pod="openstack/neutron-6db49c6bf7-w2792" Feb 26 16:05:49 crc kubenswrapper[4907]: I0226 16:05:49.727533 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/3996ac72-7ea7-4e6f-a714-1a0597f15fde-httpd-config\") pod \"neutron-6db49c6bf7-w2792\" (UID: \"3996ac72-7ea7-4e6f-a714-1a0597f15fde\") " pod="openstack/neutron-6db49c6bf7-w2792" Feb 26 16:05:49 crc kubenswrapper[4907]: I0226 16:05:49.729516 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" 
(UniqueName: \"kubernetes.io/secret/3996ac72-7ea7-4e6f-a714-1a0597f15fde-config\") pod \"neutron-6db49c6bf7-w2792\" (UID: \"3996ac72-7ea7-4e6f-a714-1a0597f15fde\") " pod="openstack/neutron-6db49c6bf7-w2792" Feb 26 16:05:49 crc kubenswrapper[4907]: I0226 16:05:49.732796 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/3996ac72-7ea7-4e6f-a714-1a0597f15fde-internal-tls-certs\") pod \"neutron-6db49c6bf7-w2792\" (UID: \"3996ac72-7ea7-4e6f-a714-1a0597f15fde\") " pod="openstack/neutron-6db49c6bf7-w2792" Feb 26 16:05:49 crc kubenswrapper[4907]: I0226 16:05:49.737880 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/3996ac72-7ea7-4e6f-a714-1a0597f15fde-ovndb-tls-certs\") pod \"neutron-6db49c6bf7-w2792\" (UID: \"3996ac72-7ea7-4e6f-a714-1a0597f15fde\") " pod="openstack/neutron-6db49c6bf7-w2792" Feb 26 16:05:49 crc kubenswrapper[4907]: I0226 16:05:49.760281 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-txl44\" (UniqueName: \"kubernetes.io/projected/3996ac72-7ea7-4e6f-a714-1a0597f15fde-kube-api-access-txl44\") pod \"neutron-6db49c6bf7-w2792\" (UID: \"3996ac72-7ea7-4e6f-a714-1a0597f15fde\") " pod="openstack/neutron-6db49c6bf7-w2792" Feb 26 16:05:49 crc kubenswrapper[4907]: I0226 16:05:49.799352 4907 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-6db49c6bf7-w2792" Feb 26 16:05:49 crc kubenswrapper[4907]: I0226 16:05:49.915077 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-v5ng6" event={"ID":"917eebf3-db36-47b8-af0a-b80d042fddab","Type":"ContainerStarted","Data":"b46bef3acd92cfa3cb8f5894a729a1bb1795fbc69b7b7c5835186a0b609a6e46"} Feb 26 16:05:49 crc kubenswrapper[4907]: I0226 16:05:49.934268 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-proxy-58d5d7785f-4fcrq" event={"ID":"80cd7152-934f-40c6-925c-a3f1f9dfca95","Type":"ContainerStarted","Data":"a3eef0c515405d51201b9e24b5d2a34ccaf05b48fe6422744b930fc4ba563250"} Feb 26 16:05:49 crc kubenswrapper[4907]: I0226 16:05:49.934316 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-proxy-58d5d7785f-4fcrq" event={"ID":"80cd7152-934f-40c6-925c-a3f1f9dfca95","Type":"ContainerStarted","Data":"5642c193a6902562d7b56dd83a9feca4afd88c7657aff43bf1b6683b3b7dc11d"} Feb 26 16:05:50 crc kubenswrapper[4907]: I0226 16:05:50.167400 4907 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9db5f721-707b-490c-917f-b3b2a85af07c" path="/var/lib/kubelet/pods/9db5f721-707b-490c-917f-b3b2a85af07c/volumes" Feb 26 16:05:50 crc kubenswrapper[4907]: I0226 16:05:50.168650 4907 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a140df23-061c-4941-855b-3c829a96d63e" path="/var/lib/kubelet/pods/a140df23-061c-4941-855b-3c829a96d63e/volumes" Feb 26 16:05:50 crc kubenswrapper[4907]: I0226 16:05:50.388319 4907 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-scheduler-0"] Feb 26 16:05:50 crc kubenswrapper[4907]: I0226 16:05:50.467746 4907 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/cinder-api-0" podUID="466a75e1-c85d-4d33-b9c7-6916eca1ebe1" containerName="cinder-api" probeResult="failure" output="Get \"http://10.217.0.171:8776/healthcheck\": context 
deadline exceeded (Client.Timeout exceeded while awaiting headers)" Feb 26 16:05:50 crc kubenswrapper[4907]: I0226 16:05:50.519640 4907 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Feb 26 16:05:50 crc kubenswrapper[4907]: W0226 16:05:50.522680 4907 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-poda114e8dd_3cb1_4b1a_8f49_48b99c39da3b.slice/crio-4f6a71d6fd6a3e58ce80a4d756b4969f440aaec2e5fa2d4cc31613127f2b96b4 WatchSource:0}: Error finding container 4f6a71d6fd6a3e58ce80a4d756b4969f440aaec2e5fa2d4cc31613127f2b96b4: Status 404 returned error can't find the container with id 4f6a71d6fd6a3e58ce80a4d756b4969f440aaec2e5fa2d4cc31613127f2b96b4 Feb 26 16:05:50 crc kubenswrapper[4907]: I0226 16:05:50.683403 4907 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/barbican-api-6f5746579b-4xjhs" Feb 26 16:05:50 crc kubenswrapper[4907]: I0226 16:05:50.752515 4907 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-6db49c6bf7-w2792"] Feb 26 16:05:50 crc kubenswrapper[4907]: W0226 16:05:50.764760 4907 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod3996ac72_7ea7_4e6f_a714_1a0597f15fde.slice/crio-d50606e788e10e7d4b65d0f19322d7bc244927724f78b242bbb904a5b3e925a7 WatchSource:0}: Error finding container d50606e788e10e7d4b65d0f19322d7bc244927724f78b242bbb904a5b3e925a7: Status 404 returned error can't find the container with id d50606e788e10e7d4b65d0f19322d7bc244927724f78b242bbb904a5b3e925a7 Feb 26 16:05:50 crc kubenswrapper[4907]: I0226 16:05:50.959401 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"00c049ce-b973-4246-ae47-5fb2a6789fbb","Type":"ContainerStarted","Data":"eaeafb816808955f3f65f6a1f8d897d6618b81f9cee68cd41fdf0ea71c99d849"} Feb 26 16:05:50 crc kubenswrapper[4907]: I0226 
16:05:50.961202 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"a114e8dd-3cb1-4b1a-8f49-48b99c39da3b","Type":"ContainerStarted","Data":"4f6a71d6fd6a3e58ce80a4d756b4969f440aaec2e5fa2d4cc31613127f2b96b4"} Feb 26 16:05:50 crc kubenswrapper[4907]: I0226 16:05:50.962773 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-6db49c6bf7-w2792" event={"ID":"3996ac72-7ea7-4e6f-a714-1a0597f15fde","Type":"ContainerStarted","Data":"d50606e788e10e7d4b65d0f19322d7bc244927724f78b242bbb904a5b3e925a7"} Feb 26 16:05:50 crc kubenswrapper[4907]: I0226 16:05:50.976606 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-proxy-58d5d7785f-4fcrq" event={"ID":"80cd7152-934f-40c6-925c-a3f1f9dfca95","Type":"ContainerStarted","Data":"f78e12a503c0673ff7242acdb95f43c658c9a60afe824f7cddd2b947b214e54e"} Feb 26 16:05:50 crc kubenswrapper[4907]: I0226 16:05:50.976660 4907 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/swift-proxy-58d5d7785f-4fcrq" Feb 26 16:05:50 crc kubenswrapper[4907]: I0226 16:05:50.976684 4907 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/swift-proxy-58d5d7785f-4fcrq" Feb 26 16:05:51 crc kubenswrapper[4907]: I0226 16:05:51.009032 4907 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/swift-proxy-58d5d7785f-4fcrq" podStartSLOduration=4.009014226 podStartE2EDuration="4.009014226s" podCreationTimestamp="2026-02-26 16:05:47 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-26 16:05:51.004860847 +0000 UTC m=+1413.523422686" watchObservedRunningTime="2026-02-26 16:05:51.009014226 +0000 UTC m=+1413.527576075" Feb 26 16:05:51 crc kubenswrapper[4907]: I0226 16:05:51.218262 4907 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/barbican-api-6f5746579b-4xjhs" Feb 26 
16:05:51 crc kubenswrapper[4907]: I0226 16:05:51.405229 4907 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-api-766c5c4f46-9j8qd"] Feb 26 16:05:51 crc kubenswrapper[4907]: I0226 16:05:51.405451 4907 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/barbican-api-766c5c4f46-9j8qd" podUID="18111fe1-07d0-420e-bc61-457532bdb122" containerName="barbican-api-log" containerID="cri-o://8ca2eaf129ea9f72949ee76c0734571655cd2f8eaf2c2646647fae90c038305a" gracePeriod=30 Feb 26 16:05:51 crc kubenswrapper[4907]: I0226 16:05:51.405887 4907 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/barbican-api-766c5c4f46-9j8qd" podUID="18111fe1-07d0-420e-bc61-457532bdb122" containerName="barbican-api" containerID="cri-o://f4989c8a6447adef0894aa6de4de8e66f5c50e42f779fcb39c2666777f3c7e46" gracePeriod=30 Feb 26 16:05:52 crc kubenswrapper[4907]: I0226 16:05:52.030917 4907 generic.go:334] "Generic (PLEG): container finished" podID="18111fe1-07d0-420e-bc61-457532bdb122" containerID="8ca2eaf129ea9f72949ee76c0734571655cd2f8eaf2c2646647fae90c038305a" exitCode=143 Feb 26 16:05:52 crc kubenswrapper[4907]: I0226 16:05:52.031102 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-766c5c4f46-9j8qd" event={"ID":"18111fe1-07d0-420e-bc61-457532bdb122","Type":"ContainerDied","Data":"8ca2eaf129ea9f72949ee76c0734571655cd2f8eaf2c2646647fae90c038305a"} Feb 26 16:05:52 crc kubenswrapper[4907]: I0226 16:05:52.042951 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"00c049ce-b973-4246-ae47-5fb2a6789fbb","Type":"ContainerStarted","Data":"c5f53a8df9b09114ed4629e15297abebcc193d43d5e23d0c34fefd8c801f7c31"} Feb 26 16:05:52 crc kubenswrapper[4907]: I0226 16:05:52.045642 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" 
event={"ID":"a114e8dd-3cb1-4b1a-8f49-48b99c39da3b","Type":"ContainerStarted","Data":"191f82e8d71c55d97bd6521f30700a2a6dd03522e595c49eb8512eaeed42b5f3"} Feb 26 16:05:52 crc kubenswrapper[4907]: I0226 16:05:52.069857 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-6db49c6bf7-w2792" event={"ID":"3996ac72-7ea7-4e6f-a714-1a0597f15fde","Type":"ContainerStarted","Data":"feeedfbfec42796a24ddc06102b1fcd309c7a820666d99ea9d13ce5b06647702"} Feb 26 16:05:52 crc kubenswrapper[4907]: I0226 16:05:52.991364 4907 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_neutron-6dbb49ff7b-8r7kc_41b49bfa-e783-4c0f-a0f6-f8dfdd5771d3/neutron-httpd/2.log" Feb 26 16:05:52 crc kubenswrapper[4907]: I0226 16:05:52.992492 4907 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-6dbb49ff7b-8r7kc" Feb 26 16:05:53 crc kubenswrapper[4907]: I0226 16:05:53.102374 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-6db49c6bf7-w2792" event={"ID":"3996ac72-7ea7-4e6f-a714-1a0597f15fde","Type":"ContainerStarted","Data":"d0a0a883dff01004d291681e23aba5acfec4d2dd11e396036ca0a77d9176b3ec"} Feb 26 16:05:53 crc kubenswrapper[4907]: I0226 16:05:53.102700 4907 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/neutron-6db49c6bf7-w2792" Feb 26 16:05:53 crc kubenswrapper[4907]: I0226 16:05:53.111928 4907 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_neutron-6dbb49ff7b-8r7kc_41b49bfa-e783-4c0f-a0f6-f8dfdd5771d3/neutron-httpd/2.log" Feb 26 16:05:53 crc kubenswrapper[4907]: I0226 16:05:53.112575 4907 generic.go:334] "Generic (PLEG): container finished" podID="41b49bfa-e783-4c0f-a0f6-f8dfdd5771d3" containerID="28547d3c9f949eac0b565051b0ecc7b68dbb2b883312a78d6701a837e0cc4239" exitCode=0 Feb 26 16:05:53 crc kubenswrapper[4907]: I0226 16:05:53.113534 4907 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-6dbb49ff7b-8r7kc" Feb 26 16:05:53 crc kubenswrapper[4907]: I0226 16:05:53.113770 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-6dbb49ff7b-8r7kc" event={"ID":"41b49bfa-e783-4c0f-a0f6-f8dfdd5771d3","Type":"ContainerDied","Data":"28547d3c9f949eac0b565051b0ecc7b68dbb2b883312a78d6701a837e0cc4239"} Feb 26 16:05:53 crc kubenswrapper[4907]: I0226 16:05:53.113803 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-6dbb49ff7b-8r7kc" event={"ID":"41b49bfa-e783-4c0f-a0f6-f8dfdd5771d3","Type":"ContainerDied","Data":"63a885ce395c9f796d02aadd511cfc684cb514a02c72dd22708f6eb5b672485b"} Feb 26 16:05:53 crc kubenswrapper[4907]: I0226 16:05:53.113824 4907 scope.go:117] "RemoveContainer" containerID="2a36d1d20d0b287b23e7dcdc86288043be01d77acf00445bd9e70ca22a49b6c8" Feb 26 16:05:53 crc kubenswrapper[4907]: I0226 16:05:53.155950 4907 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/neutron-6db49c6bf7-w2792" podStartSLOduration=4.155930983 podStartE2EDuration="4.155930983s" podCreationTimestamp="2026-02-26 16:05:49 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-26 16:05:53.155842401 +0000 UTC m=+1415.674404250" watchObservedRunningTime="2026-02-26 16:05:53.155930983 +0000 UTC m=+1415.674492852" Feb 26 16:05:53 crc kubenswrapper[4907]: I0226 16:05:53.158085 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/41b49bfa-e783-4c0f-a0f6-f8dfdd5771d3-config\") pod \"41b49bfa-e783-4c0f-a0f6-f8dfdd5771d3\" (UID: \"41b49bfa-e783-4c0f-a0f6-f8dfdd5771d3\") " Feb 26 16:05:53 crc kubenswrapper[4907]: I0226 16:05:53.158163 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/41b49bfa-e783-4c0f-a0f6-f8dfdd5771d3-combined-ca-bundle\") pod \"41b49bfa-e783-4c0f-a0f6-f8dfdd5771d3\" (UID: \"41b49bfa-e783-4c0f-a0f6-f8dfdd5771d3\") " Feb 26 16:05:53 crc kubenswrapper[4907]: I0226 16:05:53.158240 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qtdmw\" (UniqueName: \"kubernetes.io/projected/41b49bfa-e783-4c0f-a0f6-f8dfdd5771d3-kube-api-access-qtdmw\") pod \"41b49bfa-e783-4c0f-a0f6-f8dfdd5771d3\" (UID: \"41b49bfa-e783-4c0f-a0f6-f8dfdd5771d3\") " Feb 26 16:05:53 crc kubenswrapper[4907]: I0226 16:05:53.158267 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/41b49bfa-e783-4c0f-a0f6-f8dfdd5771d3-httpd-config\") pod \"41b49bfa-e783-4c0f-a0f6-f8dfdd5771d3\" (UID: \"41b49bfa-e783-4c0f-a0f6-f8dfdd5771d3\") " Feb 26 16:05:53 crc kubenswrapper[4907]: I0226 16:05:53.158320 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/41b49bfa-e783-4c0f-a0f6-f8dfdd5771d3-ovndb-tls-certs\") pod \"41b49bfa-e783-4c0f-a0f6-f8dfdd5771d3\" (UID: \"41b49bfa-e783-4c0f-a0f6-f8dfdd5771d3\") " Feb 26 16:05:53 crc kubenswrapper[4907]: I0226 16:05:53.190900 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/41b49bfa-e783-4c0f-a0f6-f8dfdd5771d3-kube-api-access-qtdmw" (OuterVolumeSpecName: "kube-api-access-qtdmw") pod "41b49bfa-e783-4c0f-a0f6-f8dfdd5771d3" (UID: "41b49bfa-e783-4c0f-a0f6-f8dfdd5771d3"). InnerVolumeSpecName "kube-api-access-qtdmw". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 26 16:05:53 crc kubenswrapper[4907]: I0226 16:05:53.191320 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/41b49bfa-e783-4c0f-a0f6-f8dfdd5771d3-httpd-config" (OuterVolumeSpecName: "httpd-config") pod "41b49bfa-e783-4c0f-a0f6-f8dfdd5771d3" (UID: "41b49bfa-e783-4c0f-a0f6-f8dfdd5771d3"). InnerVolumeSpecName "httpd-config". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 26 16:05:53 crc kubenswrapper[4907]: I0226 16:05:53.205279 4907 scope.go:117] "RemoveContainer" containerID="28547d3c9f949eac0b565051b0ecc7b68dbb2b883312a78d6701a837e0cc4239" Feb 26 16:05:53 crc kubenswrapper[4907]: I0226 16:05:53.250886 4907 scope.go:117] "RemoveContainer" containerID="2a36d1d20d0b287b23e7dcdc86288043be01d77acf00445bd9e70ca22a49b6c8" Feb 26 16:05:53 crc kubenswrapper[4907]: E0226 16:05:53.251979 4907 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"2a36d1d20d0b287b23e7dcdc86288043be01d77acf00445bd9e70ca22a49b6c8\": container with ID starting with 2a36d1d20d0b287b23e7dcdc86288043be01d77acf00445bd9e70ca22a49b6c8 not found: ID does not exist" containerID="2a36d1d20d0b287b23e7dcdc86288043be01d77acf00445bd9e70ca22a49b6c8" Feb 26 16:05:53 crc kubenswrapper[4907]: I0226 16:05:53.252010 4907 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2a36d1d20d0b287b23e7dcdc86288043be01d77acf00445bd9e70ca22a49b6c8"} err="failed to get container status \"2a36d1d20d0b287b23e7dcdc86288043be01d77acf00445bd9e70ca22a49b6c8\": rpc error: code = NotFound desc = could not find container \"2a36d1d20d0b287b23e7dcdc86288043be01d77acf00445bd9e70ca22a49b6c8\": container with ID starting with 2a36d1d20d0b287b23e7dcdc86288043be01d77acf00445bd9e70ca22a49b6c8 not found: ID does not exist" Feb 26 16:05:53 crc kubenswrapper[4907]: I0226 16:05:53.252030 4907 scope.go:117] 
"RemoveContainer" containerID="28547d3c9f949eac0b565051b0ecc7b68dbb2b883312a78d6701a837e0cc4239" Feb 26 16:05:53 crc kubenswrapper[4907]: E0226 16:05:53.252489 4907 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"28547d3c9f949eac0b565051b0ecc7b68dbb2b883312a78d6701a837e0cc4239\": container with ID starting with 28547d3c9f949eac0b565051b0ecc7b68dbb2b883312a78d6701a837e0cc4239 not found: ID does not exist" containerID="28547d3c9f949eac0b565051b0ecc7b68dbb2b883312a78d6701a837e0cc4239" Feb 26 16:05:53 crc kubenswrapper[4907]: I0226 16:05:53.252514 4907 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"28547d3c9f949eac0b565051b0ecc7b68dbb2b883312a78d6701a837e0cc4239"} err="failed to get container status \"28547d3c9f949eac0b565051b0ecc7b68dbb2b883312a78d6701a837e0cc4239\": rpc error: code = NotFound desc = could not find container \"28547d3c9f949eac0b565051b0ecc7b68dbb2b883312a78d6701a837e0cc4239\": container with ID starting with 28547d3c9f949eac0b565051b0ecc7b68dbb2b883312a78d6701a837e0cc4239 not found: ID does not exist" Feb 26 16:05:53 crc kubenswrapper[4907]: I0226 16:05:53.263185 4907 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qtdmw\" (UniqueName: \"kubernetes.io/projected/41b49bfa-e783-4c0f-a0f6-f8dfdd5771d3-kube-api-access-qtdmw\") on node \"crc\" DevicePath \"\"" Feb 26 16:05:53 crc kubenswrapper[4907]: I0226 16:05:53.263918 4907 reconciler_common.go:293] "Volume detached for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/41b49bfa-e783-4c0f-a0f6-f8dfdd5771d3-httpd-config\") on node \"crc\" DevicePath \"\"" Feb 26 16:05:53 crc kubenswrapper[4907]: I0226 16:05:53.282781 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/41b49bfa-e783-4c0f-a0f6-f8dfdd5771d3-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod 
"41b49bfa-e783-4c0f-a0f6-f8dfdd5771d3" (UID: "41b49bfa-e783-4c0f-a0f6-f8dfdd5771d3"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 26 16:05:53 crc kubenswrapper[4907]: I0226 16:05:53.296674 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/41b49bfa-e783-4c0f-a0f6-f8dfdd5771d3-ovndb-tls-certs" (OuterVolumeSpecName: "ovndb-tls-certs") pod "41b49bfa-e783-4c0f-a0f6-f8dfdd5771d3" (UID: "41b49bfa-e783-4c0f-a0f6-f8dfdd5771d3"). InnerVolumeSpecName "ovndb-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 26 16:05:53 crc kubenswrapper[4907]: I0226 16:05:53.316183 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/41b49bfa-e783-4c0f-a0f6-f8dfdd5771d3-config" (OuterVolumeSpecName: "config") pod "41b49bfa-e783-4c0f-a0f6-f8dfdd5771d3" (UID: "41b49bfa-e783-4c0f-a0f6-f8dfdd5771d3"). InnerVolumeSpecName "config". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 26 16:05:53 crc kubenswrapper[4907]: I0226 16:05:53.366329 4907 reconciler_common.go:293] "Volume detached for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/41b49bfa-e783-4c0f-a0f6-f8dfdd5771d3-ovndb-tls-certs\") on node \"crc\" DevicePath \"\"" Feb 26 16:05:53 crc kubenswrapper[4907]: I0226 16:05:53.366368 4907 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/secret/41b49bfa-e783-4c0f-a0f6-f8dfdd5771d3-config\") on node \"crc\" DevicePath \"\"" Feb 26 16:05:53 crc kubenswrapper[4907]: I0226 16:05:53.366378 4907 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/41b49bfa-e783-4c0f-a0f6-f8dfdd5771d3-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 26 16:05:53 crc kubenswrapper[4907]: I0226 16:05:53.453125 4907 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/neutron-6dbb49ff7b-8r7kc"] Feb 26 16:05:53 
crc kubenswrapper[4907]: I0226 16:05:53.461420 4907 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/neutron-6dbb49ff7b-8r7kc"] Feb 26 16:05:54 crc kubenswrapper[4907]: I0226 16:05:54.138844 4907 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="41b49bfa-e783-4c0f-a0f6-f8dfdd5771d3" path="/var/lib/kubelet/pods/41b49bfa-e783-4c0f-a0f6-f8dfdd5771d3/volumes" Feb 26 16:05:54 crc kubenswrapper[4907]: I0226 16:05:54.139821 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"00c049ce-b973-4246-ae47-5fb2a6789fbb","Type":"ContainerStarted","Data":"4085ac2625133cb2c892a62c83cea0781b8e9be2ba290e663a93fee6471e7d77"} Feb 26 16:05:54 crc kubenswrapper[4907]: I0226 16:05:54.151974 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"a114e8dd-3cb1-4b1a-8f49-48b99c39da3b","Type":"ContainerStarted","Data":"4fe7393b6c3f68c93befaf6b19d674b51c8a99443a2fa24cad7b655b7d3b5849"} Feb 26 16:05:54 crc kubenswrapper[4907]: I0226 16:05:54.162164 4907 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cinder-scheduler-0" podStartSLOduration=5.162144658 podStartE2EDuration="5.162144658s" podCreationTimestamp="2026-02-26 16:05:49 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-26 16:05:54.155895519 +0000 UTC m=+1416.674457368" watchObservedRunningTime="2026-02-26 16:05:54.162144658 +0000 UTC m=+1416.680706507" Feb 26 16:05:54 crc kubenswrapper[4907]: I0226 16:05:54.640344 4907 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/cinder-scheduler-0" Feb 26 16:05:54 crc kubenswrapper[4907]: I0226 16:05:54.901805 4907 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/barbican-api-766c5c4f46-9j8qd" podUID="18111fe1-07d0-420e-bc61-457532bdb122" containerName="barbican-api-log" probeResult="failure" 
output="Get \"http://10.217.0.168:9311/healthcheck\": read tcp 10.217.0.2:39128->10.217.0.168:9311: read: connection reset by peer" Feb 26 16:05:54 crc kubenswrapper[4907]: I0226 16:05:54.902851 4907 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/barbican-api-766c5c4f46-9j8qd" podUID="18111fe1-07d0-420e-bc61-457532bdb122" containerName="barbican-api" probeResult="failure" output="Get \"http://10.217.0.168:9311/healthcheck\": read tcp 10.217.0.2:39120->10.217.0.168:9311: read: connection reset by peer" Feb 26 16:05:55 crc kubenswrapper[4907]: I0226 16:05:55.203846 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"a114e8dd-3cb1-4b1a-8f49-48b99c39da3b","Type":"ContainerStarted","Data":"e52ae1e1152905e752807f8c44657b795a66f16e7ed728b7d70f51708d7de6b2"} Feb 26 16:05:55 crc kubenswrapper[4907]: I0226 16:05:55.229051 4907 generic.go:334] "Generic (PLEG): container finished" podID="18111fe1-07d0-420e-bc61-457532bdb122" containerID="f4989c8a6447adef0894aa6de4de8e66f5c50e42f779fcb39c2666777f3c7e46" exitCode=0 Feb 26 16:05:55 crc kubenswrapper[4907]: I0226 16:05:55.229882 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-766c5c4f46-9j8qd" event={"ID":"18111fe1-07d0-420e-bc61-457532bdb122","Type":"ContainerDied","Data":"f4989c8a6447adef0894aa6de4de8e66f5c50e42f779fcb39c2666777f3c7e46"} Feb 26 16:05:55 crc kubenswrapper[4907]: I0226 16:05:55.393157 4907 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-api-766c5c4f46-9j8qd" Feb 26 16:05:55 crc kubenswrapper[4907]: I0226 16:05:55.401049 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/18111fe1-07d0-420e-bc61-457532bdb122-config-data-custom\") pod \"18111fe1-07d0-420e-bc61-457532bdb122\" (UID: \"18111fe1-07d0-420e-bc61-457532bdb122\") " Feb 26 16:05:55 crc kubenswrapper[4907]: I0226 16:05:55.401098 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/18111fe1-07d0-420e-bc61-457532bdb122-logs\") pod \"18111fe1-07d0-420e-bc61-457532bdb122\" (UID: \"18111fe1-07d0-420e-bc61-457532bdb122\") " Feb 26 16:05:55 crc kubenswrapper[4907]: I0226 16:05:55.401138 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/18111fe1-07d0-420e-bc61-457532bdb122-config-data\") pod \"18111fe1-07d0-420e-bc61-457532bdb122\" (UID: \"18111fe1-07d0-420e-bc61-457532bdb122\") " Feb 26 16:05:55 crc kubenswrapper[4907]: I0226 16:05:55.401175 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-cx4kz\" (UniqueName: \"kubernetes.io/projected/18111fe1-07d0-420e-bc61-457532bdb122-kube-api-access-cx4kz\") pod \"18111fe1-07d0-420e-bc61-457532bdb122\" (UID: \"18111fe1-07d0-420e-bc61-457532bdb122\") " Feb 26 16:05:55 crc kubenswrapper[4907]: I0226 16:05:55.401210 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/18111fe1-07d0-420e-bc61-457532bdb122-combined-ca-bundle\") pod \"18111fe1-07d0-420e-bc61-457532bdb122\" (UID: \"18111fe1-07d0-420e-bc61-457532bdb122\") " Feb 26 16:05:55 crc kubenswrapper[4907]: I0226 16:05:55.403068 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/empty-dir/18111fe1-07d0-420e-bc61-457532bdb122-logs" (OuterVolumeSpecName: "logs") pod "18111fe1-07d0-420e-bc61-457532bdb122" (UID: "18111fe1-07d0-420e-bc61-457532bdb122"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 26 16:05:55 crc kubenswrapper[4907]: I0226 16:05:55.411216 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/18111fe1-07d0-420e-bc61-457532bdb122-kube-api-access-cx4kz" (OuterVolumeSpecName: "kube-api-access-cx4kz") pod "18111fe1-07d0-420e-bc61-457532bdb122" (UID: "18111fe1-07d0-420e-bc61-457532bdb122"). InnerVolumeSpecName "kube-api-access-cx4kz". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 26 16:05:55 crc kubenswrapper[4907]: I0226 16:05:55.422509 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/18111fe1-07d0-420e-bc61-457532bdb122-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "18111fe1-07d0-420e-bc61-457532bdb122" (UID: "18111fe1-07d0-420e-bc61-457532bdb122"). InnerVolumeSpecName "config-data-custom". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 26 16:05:55 crc kubenswrapper[4907]: I0226 16:05:55.500763 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/18111fe1-07d0-420e-bc61-457532bdb122-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "18111fe1-07d0-420e-bc61-457532bdb122" (UID: "18111fe1-07d0-420e-bc61-457532bdb122"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 26 16:05:55 crc kubenswrapper[4907]: I0226 16:05:55.503226 4907 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/18111fe1-07d0-420e-bc61-457532bdb122-config-data-custom\") on node \"crc\" DevicePath \"\"" Feb 26 16:05:55 crc kubenswrapper[4907]: I0226 16:05:55.503267 4907 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/18111fe1-07d0-420e-bc61-457532bdb122-logs\") on node \"crc\" DevicePath \"\"" Feb 26 16:05:55 crc kubenswrapper[4907]: I0226 16:05:55.503278 4907 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-cx4kz\" (UniqueName: \"kubernetes.io/projected/18111fe1-07d0-420e-bc61-457532bdb122-kube-api-access-cx4kz\") on node \"crc\" DevicePath \"\"" Feb 26 16:05:55 crc kubenswrapper[4907]: I0226 16:05:55.503291 4907 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/18111fe1-07d0-420e-bc61-457532bdb122-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 26 16:05:55 crc kubenswrapper[4907]: I0226 16:05:55.512775 4907 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/cinder-api-0" podUID="466a75e1-c85d-4d33-b9c7-6916eca1ebe1" containerName="cinder-api" probeResult="failure" output="Get \"http://10.217.0.171:8776/healthcheck\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Feb 26 16:05:55 crc kubenswrapper[4907]: I0226 16:05:55.549063 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/18111fe1-07d0-420e-bc61-457532bdb122-config-data" (OuterVolumeSpecName: "config-data") pod "18111fe1-07d0-420e-bc61-457532bdb122" (UID: "18111fe1-07d0-420e-bc61-457532bdb122"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 26 16:05:55 crc kubenswrapper[4907]: I0226 16:05:55.604447 4907 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/18111fe1-07d0-420e-bc61-457532bdb122-config-data\") on node \"crc\" DevicePath \"\"" Feb 26 16:05:56 crc kubenswrapper[4907]: I0226 16:05:56.251556 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-766c5c4f46-9j8qd" event={"ID":"18111fe1-07d0-420e-bc61-457532bdb122","Type":"ContainerDied","Data":"6a521e2c83616303952eea4e1e6b6b39bd53bb69c8fd4e4d715a564a7419be77"} Feb 26 16:05:56 crc kubenswrapper[4907]: I0226 16:05:56.252004 4907 scope.go:117] "RemoveContainer" containerID="f4989c8a6447adef0894aa6de4de8e66f5c50e42f779fcb39c2666777f3c7e46" Feb 26 16:05:56 crc kubenswrapper[4907]: I0226 16:05:56.252975 4907 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-api-766c5c4f46-9j8qd" Feb 26 16:05:56 crc kubenswrapper[4907]: I0226 16:05:56.299417 4907 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-api-766c5c4f46-9j8qd"] Feb 26 16:05:56 crc kubenswrapper[4907]: I0226 16:05:56.309359 4907 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/barbican-api-766c5c4f46-9j8qd"] Feb 26 16:05:56 crc kubenswrapper[4907]: I0226 16:05:56.318314 4907 scope.go:117] "RemoveContainer" containerID="8ca2eaf129ea9f72949ee76c0734571655cd2f8eaf2c2646647fae90c038305a" Feb 26 16:05:56 crc kubenswrapper[4907]: I0226 16:05:56.861184 4907 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/neutron-8656797c97-kv5w2" Feb 26 16:05:57 crc kubenswrapper[4907]: I0226 16:05:57.444108 4907 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Feb 26 16:05:57 crc kubenswrapper[4907]: I0226 16:05:57.503784 4907 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" 
pod="openstack/swift-proxy-58d5d7785f-4fcrq" Feb 26 16:05:57 crc kubenswrapper[4907]: I0226 16:05:57.508550 4907 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/swift-proxy-58d5d7785f-4fcrq" Feb 26 16:05:58 crc kubenswrapper[4907]: I0226 16:05:58.068889 4907 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/cinder-api-0" Feb 26 16:05:58 crc kubenswrapper[4907]: I0226 16:05:58.151386 4907 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="18111fe1-07d0-420e-bc61-457532bdb122" path="/var/lib/kubelet/pods/18111fe1-07d0-420e-bc61-457532bdb122/volumes" Feb 26 16:05:58 crc kubenswrapper[4907]: I0226 16:05:58.312838 4907 generic.go:334] "Generic (PLEG): container finished" podID="911d5df8-d8e2-4552-9c75-33c5ab72646b" containerID="5f606b9ab89532e105117c7cf76e6d48e275002733a615d726e58c1777c18aad" exitCode=137 Feb 26 16:05:58 crc kubenswrapper[4907]: I0226 16:05:58.313866 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-6fccfb8496-4tqhr" event={"ID":"911d5df8-d8e2-4552-9c75-33c5ab72646b","Type":"ContainerDied","Data":"5f606b9ab89532e105117c7cf76e6d48e275002733a615d726e58c1777c18aad"} Feb 26 16:05:59 crc kubenswrapper[4907]: I0226 16:05:59.327051 4907 generic.go:334] "Generic (PLEG): container finished" podID="b35f87c4-e535-4901-8814-0b321b201158" containerID="c2b6ec3e96a2871e49421792b819e7d8811902b2acc4ebf5cb6213f4794ef38f" exitCode=137 Feb 26 16:05:59 crc kubenswrapper[4907]: I0226 16:05:59.327194 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-76d88967b8-wmzcw" event={"ID":"b35f87c4-e535-4901-8814-0b321b201158","Type":"ContainerDied","Data":"c2b6ec3e96a2871e49421792b819e7d8811902b2acc4ebf5cb6213f4794ef38f"} Feb 26 16:05:59 crc kubenswrapper[4907]: I0226 16:05:59.946232 4907 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/cinder-scheduler-0" Feb 26 16:06:00 crc kubenswrapper[4907]: I0226 
16:06:00.167918 4907 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29535366-dqhh6"] Feb 26 16:06:00 crc kubenswrapper[4907]: E0226 16:06:00.168384 4907 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="41b49bfa-e783-4c0f-a0f6-f8dfdd5771d3" containerName="neutron-api" Feb 26 16:06:00 crc kubenswrapper[4907]: I0226 16:06:00.168405 4907 state_mem.go:107] "Deleted CPUSet assignment" podUID="41b49bfa-e783-4c0f-a0f6-f8dfdd5771d3" containerName="neutron-api" Feb 26 16:06:00 crc kubenswrapper[4907]: E0226 16:06:00.168434 4907 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="41b49bfa-e783-4c0f-a0f6-f8dfdd5771d3" containerName="neutron-httpd" Feb 26 16:06:00 crc kubenswrapper[4907]: I0226 16:06:00.168442 4907 state_mem.go:107] "Deleted CPUSet assignment" podUID="41b49bfa-e783-4c0f-a0f6-f8dfdd5771d3" containerName="neutron-httpd" Feb 26 16:06:00 crc kubenswrapper[4907]: E0226 16:06:00.168453 4907 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="18111fe1-07d0-420e-bc61-457532bdb122" containerName="barbican-api-log" Feb 26 16:06:00 crc kubenswrapper[4907]: I0226 16:06:00.168461 4907 state_mem.go:107] "Deleted CPUSet assignment" podUID="18111fe1-07d0-420e-bc61-457532bdb122" containerName="barbican-api-log" Feb 26 16:06:00 crc kubenswrapper[4907]: E0226 16:06:00.168485 4907 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="41b49bfa-e783-4c0f-a0f6-f8dfdd5771d3" containerName="neutron-httpd" Feb 26 16:06:00 crc kubenswrapper[4907]: I0226 16:06:00.168492 4907 state_mem.go:107] "Deleted CPUSet assignment" podUID="41b49bfa-e783-4c0f-a0f6-f8dfdd5771d3" containerName="neutron-httpd" Feb 26 16:06:00 crc kubenswrapper[4907]: E0226 16:06:00.168505 4907 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="18111fe1-07d0-420e-bc61-457532bdb122" containerName="barbican-api" Feb 26 16:06:00 crc kubenswrapper[4907]: I0226 16:06:00.168513 4907 state_mem.go:107] "Deleted 
CPUSet assignment" podUID="18111fe1-07d0-420e-bc61-457532bdb122" containerName="barbican-api" Feb 26 16:06:00 crc kubenswrapper[4907]: I0226 16:06:00.168781 4907 memory_manager.go:354] "RemoveStaleState removing state" podUID="18111fe1-07d0-420e-bc61-457532bdb122" containerName="barbican-api-log" Feb 26 16:06:00 crc kubenswrapper[4907]: I0226 16:06:00.168800 4907 memory_manager.go:354] "RemoveStaleState removing state" podUID="41b49bfa-e783-4c0f-a0f6-f8dfdd5771d3" containerName="neutron-api" Feb 26 16:06:00 crc kubenswrapper[4907]: I0226 16:06:00.168814 4907 memory_manager.go:354] "RemoveStaleState removing state" podUID="41b49bfa-e783-4c0f-a0f6-f8dfdd5771d3" containerName="neutron-httpd" Feb 26 16:06:00 crc kubenswrapper[4907]: I0226 16:06:00.168823 4907 memory_manager.go:354] "RemoveStaleState removing state" podUID="41b49bfa-e783-4c0f-a0f6-f8dfdd5771d3" containerName="neutron-httpd" Feb 26 16:06:00 crc kubenswrapper[4907]: I0226 16:06:00.168831 4907 memory_manager.go:354] "RemoveStaleState removing state" podUID="18111fe1-07d0-420e-bc61-457532bdb122" containerName="barbican-api" Feb 26 16:06:00 crc kubenswrapper[4907]: I0226 16:06:00.169611 4907 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29535366-dqhh6" Feb 26 16:06:00 crc kubenswrapper[4907]: I0226 16:06:00.173034 4907 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-n2mrp" Feb 26 16:06:00 crc kubenswrapper[4907]: I0226 16:06:00.173224 4907 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Feb 26 16:06:00 crc kubenswrapper[4907]: I0226 16:06:00.175941 4907 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Feb 26 16:06:00 crc kubenswrapper[4907]: I0226 16:06:00.189230 4907 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29535366-dqhh6"] Feb 26 16:06:00 crc kubenswrapper[4907]: I0226 16:06:00.301245 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bq4f5\" (UniqueName: \"kubernetes.io/projected/023cbc5f-da0e-4a5e-bc63-18385f44d228-kube-api-access-bq4f5\") pod \"auto-csr-approver-29535366-dqhh6\" (UID: \"023cbc5f-da0e-4a5e-bc63-18385f44d228\") " pod="openshift-infra/auto-csr-approver-29535366-dqhh6" Feb 26 16:06:00 crc kubenswrapper[4907]: I0226 16:06:00.405388 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bq4f5\" (UniqueName: \"kubernetes.io/projected/023cbc5f-da0e-4a5e-bc63-18385f44d228-kube-api-access-bq4f5\") pod \"auto-csr-approver-29535366-dqhh6\" (UID: \"023cbc5f-da0e-4a5e-bc63-18385f44d228\") " pod="openshift-infra/auto-csr-approver-29535366-dqhh6" Feb 26 16:06:00 crc kubenswrapper[4907]: I0226 16:06:00.437334 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bq4f5\" (UniqueName: \"kubernetes.io/projected/023cbc5f-da0e-4a5e-bc63-18385f44d228-kube-api-access-bq4f5\") pod \"auto-csr-approver-29535366-dqhh6\" (UID: \"023cbc5f-da0e-4a5e-bc63-18385f44d228\") " 
pod="openshift-infra/auto-csr-approver-29535366-dqhh6" Feb 26 16:06:00 crc kubenswrapper[4907]: I0226 16:06:00.494436 4907 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29535366-dqhh6" Feb 26 16:06:06 crc kubenswrapper[4907]: I0226 16:06:06.109198 4907 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-external-api-0"] Feb 26 16:06:06 crc kubenswrapper[4907]: I0226 16:06:06.109971 4907 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-external-api-0" podUID="361750c4-3d82-437e-abc0-4e20302d20cf" containerName="glance-log" containerID="cri-o://5b645c4cc55c466b58e79b5f1292c773cf90139e56d5e08d260e34f754fdac57" gracePeriod=30 Feb 26 16:06:06 crc kubenswrapper[4907]: I0226 16:06:06.110077 4907 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-external-api-0" podUID="361750c4-3d82-437e-abc0-4e20302d20cf" containerName="glance-httpd" containerID="cri-o://bcdf7f251072c281b799d39208a89ff8fa1387f0ca8230ec0b2263b3f0d3c06e" gracePeriod=30 Feb 26 16:06:06 crc kubenswrapper[4907]: I0226 16:06:06.429200 4907 generic.go:334] "Generic (PLEG): container finished" podID="361750c4-3d82-437e-abc0-4e20302d20cf" containerID="5b645c4cc55c466b58e79b5f1292c773cf90139e56d5e08d260e34f754fdac57" exitCode=143 Feb 26 16:06:06 crc kubenswrapper[4907]: I0226 16:06:06.429658 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"361750c4-3d82-437e-abc0-4e20302d20cf","Type":"ContainerDied","Data":"5b645c4cc55c466b58e79b5f1292c773cf90139e56d5e08d260e34f754fdac57"} Feb 26 16:06:06 crc kubenswrapper[4907]: I0226 16:06:06.720451 4907 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29535366-dqhh6"] Feb 26 16:06:07 crc kubenswrapper[4907]: I0226 16:06:06.987706 4907 kubelet.go:2421] "SyncLoop ADD" source="api" 
pods=["openstack/nova-api-db-create-k8vd5"] Feb 26 16:06:07 crc kubenswrapper[4907]: E0226 16:06:06.988118 4907 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="41b49bfa-e783-4c0f-a0f6-f8dfdd5771d3" containerName="neutron-httpd" Feb 26 16:06:07 crc kubenswrapper[4907]: I0226 16:06:06.988129 4907 state_mem.go:107] "Deleted CPUSet assignment" podUID="41b49bfa-e783-4c0f-a0f6-f8dfdd5771d3" containerName="neutron-httpd" Feb 26 16:06:07 crc kubenswrapper[4907]: I0226 16:06:06.988288 4907 memory_manager.go:354] "RemoveStaleState removing state" podUID="41b49bfa-e783-4c0f-a0f6-f8dfdd5771d3" containerName="neutron-httpd" Feb 26 16:06:07 crc kubenswrapper[4907]: I0226 16:06:07.003743 4907 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-db-create-k8vd5" Feb 26 16:06:07 crc kubenswrapper[4907]: I0226 16:06:07.021250 4907 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-db-create-k8vd5"] Feb 26 16:06:07 crc kubenswrapper[4907]: I0226 16:06:07.069999 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/693a0231-a18d-4141-a46f-5911644101a4-operator-scripts\") pod \"nova-api-db-create-k8vd5\" (UID: \"693a0231-a18d-4141-a46f-5911644101a4\") " pod="openstack/nova-api-db-create-k8vd5" Feb 26 16:06:07 crc kubenswrapper[4907]: I0226 16:06:07.070176 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-k4v5p\" (UniqueName: \"kubernetes.io/projected/693a0231-a18d-4141-a46f-5911644101a4-kube-api-access-k4v5p\") pod \"nova-api-db-create-k8vd5\" (UID: \"693a0231-a18d-4141-a46f-5911644101a4\") " pod="openstack/nova-api-db-create-k8vd5" Feb 26 16:06:07 crc kubenswrapper[4907]: I0226 16:06:07.118499 4907 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell0-db-create-pn8lr"] Feb 26 16:06:07 crc kubenswrapper[4907]: 
I0226 16:06:07.120059 4907 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-db-create-pn8lr" Feb 26 16:06:07 crc kubenswrapper[4907]: I0226 16:06:07.164288 4907 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-db-create-pn8lr"] Feb 26 16:06:07 crc kubenswrapper[4907]: I0226 16:06:07.174347 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-k4v5p\" (UniqueName: \"kubernetes.io/projected/693a0231-a18d-4141-a46f-5911644101a4-kube-api-access-k4v5p\") pod \"nova-api-db-create-k8vd5\" (UID: \"693a0231-a18d-4141-a46f-5911644101a4\") " pod="openstack/nova-api-db-create-k8vd5" Feb 26 16:06:07 crc kubenswrapper[4907]: I0226 16:06:07.180347 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/693a0231-a18d-4141-a46f-5911644101a4-operator-scripts\") pod \"nova-api-db-create-k8vd5\" (UID: \"693a0231-a18d-4141-a46f-5911644101a4\") " pod="openstack/nova-api-db-create-k8vd5" Feb 26 16:06:07 crc kubenswrapper[4907]: I0226 16:06:07.182468 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/693a0231-a18d-4141-a46f-5911644101a4-operator-scripts\") pod \"nova-api-db-create-k8vd5\" (UID: \"693a0231-a18d-4141-a46f-5911644101a4\") " pod="openstack/nova-api-db-create-k8vd5" Feb 26 16:06:07 crc kubenswrapper[4907]: I0226 16:06:07.186141 4907 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-api-0" Feb 26 16:06:07 crc kubenswrapper[4907]: I0226 16:06:07.234249 4907 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-db-create-5hgql"] Feb 26 16:06:07 crc kubenswrapper[4907]: E0226 16:06:07.234961 4907 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="466a75e1-c85d-4d33-b9c7-6916eca1ebe1" containerName="cinder-api-log" Feb 26 16:06:07 crc kubenswrapper[4907]: I0226 16:06:07.234985 4907 state_mem.go:107] "Deleted CPUSet assignment" podUID="466a75e1-c85d-4d33-b9c7-6916eca1ebe1" containerName="cinder-api-log" Feb 26 16:06:07 crc kubenswrapper[4907]: E0226 16:06:07.235014 4907 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="466a75e1-c85d-4d33-b9c7-6916eca1ebe1" containerName="cinder-api" Feb 26 16:06:07 crc kubenswrapper[4907]: I0226 16:06:07.235022 4907 state_mem.go:107] "Deleted CPUSet assignment" podUID="466a75e1-c85d-4d33-b9c7-6916eca1ebe1" containerName="cinder-api" Feb 26 16:06:07 crc kubenswrapper[4907]: I0226 16:06:07.235243 4907 memory_manager.go:354] "RemoveStaleState removing state" podUID="466a75e1-c85d-4d33-b9c7-6916eca1ebe1" containerName="cinder-api" Feb 26 16:06:07 crc kubenswrapper[4907]: I0226 16:06:07.235268 4907 memory_manager.go:354] "RemoveStaleState removing state" podUID="466a75e1-c85d-4d33-b9c7-6916eca1ebe1" containerName="cinder-api-log" Feb 26 16:06:07 crc kubenswrapper[4907]: I0226 16:06:07.236039 4907 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-db-create-5hgql" Feb 26 16:06:07 crc kubenswrapper[4907]: I0226 16:06:07.243721 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-k4v5p\" (UniqueName: \"kubernetes.io/projected/693a0231-a18d-4141-a46f-5911644101a4-kube-api-access-k4v5p\") pod \"nova-api-db-create-k8vd5\" (UID: \"693a0231-a18d-4141-a46f-5911644101a4\") " pod="openstack/nova-api-db-create-k8vd5" Feb 26 16:06:07 crc kubenswrapper[4907]: I0226 16:06:07.269023 4907 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-api-2e48-account-create-update-8mvk9"] Feb 26 16:06:07 crc kubenswrapper[4907]: I0226 16:06:07.270707 4907 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-2e48-account-create-update-8mvk9" Feb 26 16:06:07 crc kubenswrapper[4907]: I0226 16:06:07.277921 4907 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-api-db-secret" Feb 26 16:06:07 crc kubenswrapper[4907]: I0226 16:06:07.287494 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/466a75e1-c85d-4d33-b9c7-6916eca1ebe1-etc-machine-id\") pod \"466a75e1-c85d-4d33-b9c7-6916eca1ebe1\" (UID: \"466a75e1-c85d-4d33-b9c7-6916eca1ebe1\") " Feb 26 16:06:07 crc kubenswrapper[4907]: I0226 16:06:07.287565 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/466a75e1-c85d-4d33-b9c7-6916eca1ebe1-combined-ca-bundle\") pod \"466a75e1-c85d-4d33-b9c7-6916eca1ebe1\" (UID: \"466a75e1-c85d-4d33-b9c7-6916eca1ebe1\") " Feb 26 16:06:07 crc kubenswrapper[4907]: I0226 16:06:07.287630 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/466a75e1-c85d-4d33-b9c7-6916eca1ebe1-logs\") pod \"466a75e1-c85d-4d33-b9c7-6916eca1ebe1\" (UID: 
\"466a75e1-c85d-4d33-b9c7-6916eca1ebe1\") " Feb 26 16:06:07 crc kubenswrapper[4907]: I0226 16:06:07.287722 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-w58f8\" (UniqueName: \"kubernetes.io/projected/466a75e1-c85d-4d33-b9c7-6916eca1ebe1-kube-api-access-w58f8\") pod \"466a75e1-c85d-4d33-b9c7-6916eca1ebe1\" (UID: \"466a75e1-c85d-4d33-b9c7-6916eca1ebe1\") " Feb 26 16:06:07 crc kubenswrapper[4907]: I0226 16:06:07.287752 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/466a75e1-c85d-4d33-b9c7-6916eca1ebe1-config-data-custom\") pod \"466a75e1-c85d-4d33-b9c7-6916eca1ebe1\" (UID: \"466a75e1-c85d-4d33-b9c7-6916eca1ebe1\") " Feb 26 16:06:07 crc kubenswrapper[4907]: I0226 16:06:07.287771 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/466a75e1-c85d-4d33-b9c7-6916eca1ebe1-scripts\") pod \"466a75e1-c85d-4d33-b9c7-6916eca1ebe1\" (UID: \"466a75e1-c85d-4d33-b9c7-6916eca1ebe1\") " Feb 26 16:06:07 crc kubenswrapper[4907]: I0226 16:06:07.287837 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/466a75e1-c85d-4d33-b9c7-6916eca1ebe1-config-data\") pod \"466a75e1-c85d-4d33-b9c7-6916eca1ebe1\" (UID: \"466a75e1-c85d-4d33-b9c7-6916eca1ebe1\") " Feb 26 16:06:07 crc kubenswrapper[4907]: I0226 16:06:07.288245 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7ct9q\" (UniqueName: \"kubernetes.io/projected/e97b768b-99a2-4a89-b88e-e5ccbbf8d23f-kube-api-access-7ct9q\") pod \"nova-cell1-db-create-5hgql\" (UID: \"e97b768b-99a2-4a89-b88e-e5ccbbf8d23f\") " pod="openstack/nova-cell1-db-create-5hgql" Feb 26 16:06:07 crc kubenswrapper[4907]: I0226 16:06:07.288296 4907 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/e97b768b-99a2-4a89-b88e-e5ccbbf8d23f-operator-scripts\") pod \"nova-cell1-db-create-5hgql\" (UID: \"e97b768b-99a2-4a89-b88e-e5ccbbf8d23f\") " pod="openstack/nova-cell1-db-create-5hgql" Feb 26 16:06:07 crc kubenswrapper[4907]: I0226 16:06:07.288333 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-f2qts\" (UniqueName: \"kubernetes.io/projected/9022005c-a270-4ad2-b526-10bb125dfff3-kube-api-access-f2qts\") pod \"nova-api-2e48-account-create-update-8mvk9\" (UID: \"9022005c-a270-4ad2-b526-10bb125dfff3\") " pod="openstack/nova-api-2e48-account-create-update-8mvk9" Feb 26 16:06:07 crc kubenswrapper[4907]: I0226 16:06:07.288384 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rhvxd\" (UniqueName: \"kubernetes.io/projected/835cf533-cc08-4ce6-b0e1-ed3a8a2a88ea-kube-api-access-rhvxd\") pod \"nova-cell0-db-create-pn8lr\" (UID: \"835cf533-cc08-4ce6-b0e1-ed3a8a2a88ea\") " pod="openstack/nova-cell0-db-create-pn8lr" Feb 26 16:06:07 crc kubenswrapper[4907]: I0226 16:06:07.288384 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/466a75e1-c85d-4d33-b9c7-6916eca1ebe1-etc-machine-id" (OuterVolumeSpecName: "etc-machine-id") pod "466a75e1-c85d-4d33-b9c7-6916eca1ebe1" (UID: "466a75e1-c85d-4d33-b9c7-6916eca1ebe1"). InnerVolumeSpecName "etc-machine-id". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 26 16:06:07 crc kubenswrapper[4907]: I0226 16:06:07.288412 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/835cf533-cc08-4ce6-b0e1-ed3a8a2a88ea-operator-scripts\") pod \"nova-cell0-db-create-pn8lr\" (UID: \"835cf533-cc08-4ce6-b0e1-ed3a8a2a88ea\") " pod="openstack/nova-cell0-db-create-pn8lr" Feb 26 16:06:07 crc kubenswrapper[4907]: I0226 16:06:07.288474 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/9022005c-a270-4ad2-b526-10bb125dfff3-operator-scripts\") pod \"nova-api-2e48-account-create-update-8mvk9\" (UID: \"9022005c-a270-4ad2-b526-10bb125dfff3\") " pod="openstack/nova-api-2e48-account-create-update-8mvk9" Feb 26 16:06:07 crc kubenswrapper[4907]: I0226 16:06:07.288540 4907 reconciler_common.go:293] "Volume detached for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/466a75e1-c85d-4d33-b9c7-6916eca1ebe1-etc-machine-id\") on node \"crc\" DevicePath \"\"" Feb 26 16:06:07 crc kubenswrapper[4907]: I0226 16:06:07.289288 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/466a75e1-c85d-4d33-b9c7-6916eca1ebe1-logs" (OuterVolumeSpecName: "logs") pod "466a75e1-c85d-4d33-b9c7-6916eca1ebe1" (UID: "466a75e1-c85d-4d33-b9c7-6916eca1ebe1"). InnerVolumeSpecName "logs". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 26 16:06:07 crc kubenswrapper[4907]: I0226 16:06:07.296082 4907 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-db-create-5hgql"] Feb 26 16:06:07 crc kubenswrapper[4907]: I0226 16:06:07.330341 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/466a75e1-c85d-4d33-b9c7-6916eca1ebe1-kube-api-access-w58f8" (OuterVolumeSpecName: "kube-api-access-w58f8") pod "466a75e1-c85d-4d33-b9c7-6916eca1ebe1" (UID: "466a75e1-c85d-4d33-b9c7-6916eca1ebe1"). InnerVolumeSpecName "kube-api-access-w58f8". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 26 16:06:07 crc kubenswrapper[4907]: I0226 16:06:07.335599 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/466a75e1-c85d-4d33-b9c7-6916eca1ebe1-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "466a75e1-c85d-4d33-b9c7-6916eca1ebe1" (UID: "466a75e1-c85d-4d33-b9c7-6916eca1ebe1"). InnerVolumeSpecName "config-data-custom". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 26 16:06:07 crc kubenswrapper[4907]: I0226 16:06:07.342747 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/466a75e1-c85d-4d33-b9c7-6916eca1ebe1-scripts" (OuterVolumeSpecName: "scripts") pod "466a75e1-c85d-4d33-b9c7-6916eca1ebe1" (UID: "466a75e1-c85d-4d33-b9c7-6916eca1ebe1"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 26 16:06:07 crc kubenswrapper[4907]: I0226 16:06:07.343707 4907 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-db-create-k8vd5" Feb 26 16:06:07 crc kubenswrapper[4907]: I0226 16:06:07.384412 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/466a75e1-c85d-4d33-b9c7-6916eca1ebe1-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "466a75e1-c85d-4d33-b9c7-6916eca1ebe1" (UID: "466a75e1-c85d-4d33-b9c7-6916eca1ebe1"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 26 16:06:07 crc kubenswrapper[4907]: I0226 16:06:07.393109 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/9022005c-a270-4ad2-b526-10bb125dfff3-operator-scripts\") pod \"nova-api-2e48-account-create-update-8mvk9\" (UID: \"9022005c-a270-4ad2-b526-10bb125dfff3\") " pod="openstack/nova-api-2e48-account-create-update-8mvk9" Feb 26 16:06:07 crc kubenswrapper[4907]: I0226 16:06:07.393229 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7ct9q\" (UniqueName: \"kubernetes.io/projected/e97b768b-99a2-4a89-b88e-e5ccbbf8d23f-kube-api-access-7ct9q\") pod \"nova-cell1-db-create-5hgql\" (UID: \"e97b768b-99a2-4a89-b88e-e5ccbbf8d23f\") " pod="openstack/nova-cell1-db-create-5hgql" Feb 26 16:06:07 crc kubenswrapper[4907]: I0226 16:06:07.393280 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/e97b768b-99a2-4a89-b88e-e5ccbbf8d23f-operator-scripts\") pod \"nova-cell1-db-create-5hgql\" (UID: \"e97b768b-99a2-4a89-b88e-e5ccbbf8d23f\") " pod="openstack/nova-cell1-db-create-5hgql" Feb 26 16:06:07 crc kubenswrapper[4907]: I0226 16:06:07.393312 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-f2qts\" (UniqueName: \"kubernetes.io/projected/9022005c-a270-4ad2-b526-10bb125dfff3-kube-api-access-f2qts\") pod 
\"nova-api-2e48-account-create-update-8mvk9\" (UID: \"9022005c-a270-4ad2-b526-10bb125dfff3\") " pod="openstack/nova-api-2e48-account-create-update-8mvk9" Feb 26 16:06:07 crc kubenswrapper[4907]: I0226 16:06:07.393354 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rhvxd\" (UniqueName: \"kubernetes.io/projected/835cf533-cc08-4ce6-b0e1-ed3a8a2a88ea-kube-api-access-rhvxd\") pod \"nova-cell0-db-create-pn8lr\" (UID: \"835cf533-cc08-4ce6-b0e1-ed3a8a2a88ea\") " pod="openstack/nova-cell0-db-create-pn8lr" Feb 26 16:06:07 crc kubenswrapper[4907]: I0226 16:06:07.393385 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/835cf533-cc08-4ce6-b0e1-ed3a8a2a88ea-operator-scripts\") pod \"nova-cell0-db-create-pn8lr\" (UID: \"835cf533-cc08-4ce6-b0e1-ed3a8a2a88ea\") " pod="openstack/nova-cell0-db-create-pn8lr" Feb 26 16:06:07 crc kubenswrapper[4907]: I0226 16:06:07.393479 4907 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/466a75e1-c85d-4d33-b9c7-6916eca1ebe1-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 26 16:06:07 crc kubenswrapper[4907]: I0226 16:06:07.393490 4907 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/466a75e1-c85d-4d33-b9c7-6916eca1ebe1-logs\") on node \"crc\" DevicePath \"\"" Feb 26 16:06:07 crc kubenswrapper[4907]: I0226 16:06:07.393499 4907 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-w58f8\" (UniqueName: \"kubernetes.io/projected/466a75e1-c85d-4d33-b9c7-6916eca1ebe1-kube-api-access-w58f8\") on node \"crc\" DevicePath \"\"" Feb 26 16:06:07 crc kubenswrapper[4907]: I0226 16:06:07.393508 4907 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/466a75e1-c85d-4d33-b9c7-6916eca1ebe1-config-data-custom\") on node 
\"crc\" DevicePath \"\"" Feb 26 16:06:07 crc kubenswrapper[4907]: I0226 16:06:07.393516 4907 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/466a75e1-c85d-4d33-b9c7-6916eca1ebe1-scripts\") on node \"crc\" DevicePath \"\"" Feb 26 16:06:07 crc kubenswrapper[4907]: I0226 16:06:07.394160 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/835cf533-cc08-4ce6-b0e1-ed3a8a2a88ea-operator-scripts\") pod \"nova-cell0-db-create-pn8lr\" (UID: \"835cf533-cc08-4ce6-b0e1-ed3a8a2a88ea\") " pod="openstack/nova-cell0-db-create-pn8lr" Feb 26 16:06:07 crc kubenswrapper[4907]: I0226 16:06:07.394759 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/9022005c-a270-4ad2-b526-10bb125dfff3-operator-scripts\") pod \"nova-api-2e48-account-create-update-8mvk9\" (UID: \"9022005c-a270-4ad2-b526-10bb125dfff3\") " pod="openstack/nova-api-2e48-account-create-update-8mvk9" Feb 26 16:06:07 crc kubenswrapper[4907]: I0226 16:06:07.397912 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/e97b768b-99a2-4a89-b88e-e5ccbbf8d23f-operator-scripts\") pod \"nova-cell1-db-create-5hgql\" (UID: \"e97b768b-99a2-4a89-b88e-e5ccbbf8d23f\") " pod="openstack/nova-cell1-db-create-5hgql" Feb 26 16:06:07 crc kubenswrapper[4907]: I0226 16:06:07.441264 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rhvxd\" (UniqueName: \"kubernetes.io/projected/835cf533-cc08-4ce6-b0e1-ed3a8a2a88ea-kube-api-access-rhvxd\") pod \"nova-cell0-db-create-pn8lr\" (UID: \"835cf533-cc08-4ce6-b0e1-ed3a8a2a88ea\") " pod="openstack/nova-cell0-db-create-pn8lr" Feb 26 16:06:07 crc kubenswrapper[4907]: I0226 16:06:07.443161 4907 kubelet.go:2428] "SyncLoop UPDATE" source="api" 
pods=["openstack/nova-api-2e48-account-create-update-8mvk9"] Feb 26 16:06:07 crc kubenswrapper[4907]: I0226 16:06:07.444376 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7ct9q\" (UniqueName: \"kubernetes.io/projected/e97b768b-99a2-4a89-b88e-e5ccbbf8d23f-kube-api-access-7ct9q\") pod \"nova-cell1-db-create-5hgql\" (UID: \"e97b768b-99a2-4a89-b88e-e5ccbbf8d23f\") " pod="openstack/nova-cell1-db-create-5hgql" Feb 26 16:06:07 crc kubenswrapper[4907]: I0226 16:06:07.454881 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-f2qts\" (UniqueName: \"kubernetes.io/projected/9022005c-a270-4ad2-b526-10bb125dfff3-kube-api-access-f2qts\") pod \"nova-api-2e48-account-create-update-8mvk9\" (UID: \"9022005c-a270-4ad2-b526-10bb125dfff3\") " pod="openstack/nova-api-2e48-account-create-update-8mvk9" Feb 26 16:06:07 crc kubenswrapper[4907]: I0226 16:06:07.481841 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29535366-dqhh6" event={"ID":"023cbc5f-da0e-4a5e-bc63-18385f44d228","Type":"ContainerStarted","Data":"621aed023f155a1220fdb18cb5c4a8114e7c89cadc98c5aee915d6d6f602caac"} Feb 26 16:06:07 crc kubenswrapper[4907]: I0226 16:06:07.489801 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/466a75e1-c85d-4d33-b9c7-6916eca1ebe1-config-data" (OuterVolumeSpecName: "config-data") pod "466a75e1-c85d-4d33-b9c7-6916eca1ebe1" (UID: "466a75e1-c85d-4d33-b9c7-6916eca1ebe1"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 26 16:06:07 crc kubenswrapper[4907]: I0226 16:06:07.495633 4907 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/466a75e1-c85d-4d33-b9c7-6916eca1ebe1-config-data\") on node \"crc\" DevicePath \"\"" Feb 26 16:06:07 crc kubenswrapper[4907]: I0226 16:06:07.500559 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-6fccfb8496-4tqhr" event={"ID":"911d5df8-d8e2-4552-9c75-33c5ab72646b","Type":"ContainerStarted","Data":"0e9ea68de0c1e921e9ed4ee0e299561d11e0b96c063a8d42fd8a0ea1f0193bee"} Feb 26 16:06:07 crc kubenswrapper[4907]: I0226 16:06:07.523795 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-76d88967b8-wmzcw" event={"ID":"b35f87c4-e535-4901-8814-0b321b201158","Type":"ContainerStarted","Data":"a09830ab9c067f94a8fe072a6ed8e9195e12c6c572d7b1467cb8afc38542fb22"} Feb 26 16:06:07 crc kubenswrapper[4907]: I0226 16:06:07.538833 4907 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell0-870e-account-create-update-4v7m7"] Feb 26 16:06:07 crc kubenswrapper[4907]: I0226 16:06:07.539970 4907 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-870e-account-create-update-4v7m7" Feb 26 16:06:07 crc kubenswrapper[4907]: I0226 16:06:07.540879 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstackclient" event={"ID":"173e1a27-c6cc-47cf-9d1a-8e9e19fe3afa","Type":"ContainerStarted","Data":"79b214bf9a33ad3b4b7edffcb0ba6ecabdc2175214a6e8e6fc51650fc0c745a3"} Feb 26 16:06:07 crc kubenswrapper[4907]: I0226 16:06:07.549034 4907 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell0-db-secret" Feb 26 16:06:07 crc kubenswrapper[4907]: I0226 16:06:07.561929 4907 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-870e-account-create-update-4v7m7"] Feb 26 16:06:07 crc kubenswrapper[4907]: I0226 16:06:07.562372 4907 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-db-create-pn8lr" Feb 26 16:06:07 crc kubenswrapper[4907]: I0226 16:06:07.578666 4907 generic.go:334] "Generic (PLEG): container finished" podID="466a75e1-c85d-4d33-b9c7-6916eca1ebe1" containerID="ed7e31e54a126efdd2512d4f3d279de091b4217cb1a0424836476da4c8d3b317" exitCode=137 Feb 26 16:06:07 crc kubenswrapper[4907]: I0226 16:06:07.578746 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"466a75e1-c85d-4d33-b9c7-6916eca1ebe1","Type":"ContainerDied","Data":"ed7e31e54a126efdd2512d4f3d279de091b4217cb1a0424836476da4c8d3b317"} Feb 26 16:06:07 crc kubenswrapper[4907]: I0226 16:06:07.578771 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"466a75e1-c85d-4d33-b9c7-6916eca1ebe1","Type":"ContainerDied","Data":"e8ababb499c81f65ef140ef8e984f39a8a7bad3f400ba836d1870012d035b066"} Feb 26 16:06:07 crc kubenswrapper[4907]: I0226 16:06:07.578788 4907 scope.go:117] "RemoveContainer" containerID="ed7e31e54a126efdd2512d4f3d279de091b4217cb1a0424836476da4c8d3b317" Feb 26 16:06:07 crc kubenswrapper[4907]: I0226 
16:06:07.578912 4907 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-api-0" Feb 26 16:06:07 crc kubenswrapper[4907]: I0226 16:06:07.587381 4907 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-db-create-5hgql" Feb 26 16:06:07 crc kubenswrapper[4907]: I0226 16:06:07.597357 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/1b61b535-465a-4786-bba7-c33c3c5672a7-operator-scripts\") pod \"nova-cell0-870e-account-create-update-4v7m7\" (UID: \"1b61b535-465a-4786-bba7-c33c3c5672a7\") " pod="openstack/nova-cell0-870e-account-create-update-4v7m7" Feb 26 16:06:07 crc kubenswrapper[4907]: I0226 16:06:07.597580 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-q9v9d\" (UniqueName: \"kubernetes.io/projected/1b61b535-465a-4786-bba7-c33c3c5672a7-kube-api-access-q9v9d\") pod \"nova-cell0-870e-account-create-update-4v7m7\" (UID: \"1b61b535-465a-4786-bba7-c33c3c5672a7\") " pod="openstack/nova-cell0-870e-account-create-update-4v7m7" Feb 26 16:06:07 crc kubenswrapper[4907]: I0226 16:06:07.608780 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"a114e8dd-3cb1-4b1a-8f49-48b99c39da3b","Type":"ContainerStarted","Data":"c59ed0d9b2419de1e69d5cb40eb488a6e7a9eeffe8226fc39da6e0a48790911f"} Feb 26 16:06:07 crc kubenswrapper[4907]: I0226 16:06:07.608949 4907 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="a114e8dd-3cb1-4b1a-8f49-48b99c39da3b" containerName="ceilometer-central-agent" containerID="cri-o://191f82e8d71c55d97bd6521f30700a2a6dd03522e595c49eb8512eaeed42b5f3" gracePeriod=30 Feb 26 16:06:07 crc kubenswrapper[4907]: I0226 16:06:07.609196 4907 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" 
pod="openstack/ceilometer-0" Feb 26 16:06:07 crc kubenswrapper[4907]: I0226 16:06:07.609231 4907 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="a114e8dd-3cb1-4b1a-8f49-48b99c39da3b" containerName="proxy-httpd" containerID="cri-o://c59ed0d9b2419de1e69d5cb40eb488a6e7a9eeffe8226fc39da6e0a48790911f" gracePeriod=30 Feb 26 16:06:07 crc kubenswrapper[4907]: I0226 16:06:07.609272 4907 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="a114e8dd-3cb1-4b1a-8f49-48b99c39da3b" containerName="sg-core" containerID="cri-o://e52ae1e1152905e752807f8c44657b795a66f16e7ed728b7d70f51708d7de6b2" gracePeriod=30 Feb 26 16:06:07 crc kubenswrapper[4907]: I0226 16:06:07.609303 4907 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="a114e8dd-3cb1-4b1a-8f49-48b99c39da3b" containerName="ceilometer-notification-agent" containerID="cri-o://4fe7393b6c3f68c93befaf6b19d674b51c8a99443a2fa24cad7b655b7d3b5849" gracePeriod=30 Feb 26 16:06:07 crc kubenswrapper[4907]: I0226 16:06:07.670282 4907 scope.go:117] "RemoveContainer" containerID="ca3cee29ee6bdb8de8f7b2a9bc3d4fb4b429a63857b4f97f82685c2164a62a31" Feb 26 16:06:07 crc kubenswrapper[4907]: I0226 16:06:07.708107 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/1b61b535-465a-4786-bba7-c33c3c5672a7-operator-scripts\") pod \"nova-cell0-870e-account-create-update-4v7m7\" (UID: \"1b61b535-465a-4786-bba7-c33c3c5672a7\") " pod="openstack/nova-cell0-870e-account-create-update-4v7m7" Feb 26 16:06:07 crc kubenswrapper[4907]: I0226 16:06:07.708301 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-q9v9d\" (UniqueName: \"kubernetes.io/projected/1b61b535-465a-4786-bba7-c33c3c5672a7-kube-api-access-q9v9d\") pod \"nova-cell0-870e-account-create-update-4v7m7\" 
(UID: \"1b61b535-465a-4786-bba7-c33c3c5672a7\") " pod="openstack/nova-cell0-870e-account-create-update-4v7m7" Feb 26 16:06:07 crc kubenswrapper[4907]: I0226 16:06:07.710283 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/1b61b535-465a-4786-bba7-c33c3c5672a7-operator-scripts\") pod \"nova-cell0-870e-account-create-update-4v7m7\" (UID: \"1b61b535-465a-4786-bba7-c33c3c5672a7\") " pod="openstack/nova-cell0-870e-account-create-update-4v7m7" Feb 26 16:06:07 crc kubenswrapper[4907]: I0226 16:06:07.721529 4907 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/openstackclient" podStartSLOduration=2.946700426 podStartE2EDuration="25.721507936s" podCreationTimestamp="2026-02-26 16:05:42 +0000 UTC" firstStartedPulling="2026-02-26 16:05:43.619071189 +0000 UTC m=+1406.137633048" lastFinishedPulling="2026-02-26 16:06:06.393878709 +0000 UTC m=+1428.912440558" observedRunningTime="2026-02-26 16:06:07.598205587 +0000 UTC m=+1430.116767436" watchObservedRunningTime="2026-02-26 16:06:07.721507936 +0000 UTC m=+1430.240069785" Feb 26 16:06:07 crc kubenswrapper[4907]: I0226 16:06:07.723255 4907 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-2e48-account-create-update-8mvk9" Feb 26 16:06:07 crc kubenswrapper[4907]: I0226 16:06:07.751016 4907 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/horizon-6fccfb8496-4tqhr" Feb 26 16:06:07 crc kubenswrapper[4907]: I0226 16:06:07.751818 4907 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/horizon-6fccfb8496-4tqhr" Feb 26 16:06:07 crc kubenswrapper[4907]: I0226 16:06:07.759252 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-q9v9d\" (UniqueName: \"kubernetes.io/projected/1b61b535-465a-4786-bba7-c33c3c5672a7-kube-api-access-q9v9d\") pod \"nova-cell0-870e-account-create-update-4v7m7\" (UID: \"1b61b535-465a-4786-bba7-c33c3c5672a7\") " pod="openstack/nova-cell0-870e-account-create-update-4v7m7" Feb 26 16:06:07 crc kubenswrapper[4907]: I0226 16:06:07.785724 4907 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-0586-account-create-update-p76kt"] Feb 26 16:06:07 crc kubenswrapper[4907]: I0226 16:06:07.787808 4907 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-0586-account-create-update-p76kt" Feb 26 16:06:07 crc kubenswrapper[4907]: I0226 16:06:07.791264 4907 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-db-secret" Feb 26 16:06:07 crc kubenswrapper[4907]: I0226 16:06:07.805132 4907 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-0586-account-create-update-p76kt"] Feb 26 16:06:07 crc kubenswrapper[4907]: I0226 16:06:07.809904 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5dbgl\" (UniqueName: \"kubernetes.io/projected/e3b617db-d4f3-448a-b544-0cd38d51728b-kube-api-access-5dbgl\") pod \"nova-cell1-0586-account-create-update-p76kt\" (UID: \"e3b617db-d4f3-448a-b544-0cd38d51728b\") " pod="openstack/nova-cell1-0586-account-create-update-p76kt" Feb 26 16:06:07 crc kubenswrapper[4907]: I0226 16:06:07.823651 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/e3b617db-d4f3-448a-b544-0cd38d51728b-operator-scripts\") pod \"nova-cell1-0586-account-create-update-p76kt\" (UID: \"e3b617db-d4f3-448a-b544-0cd38d51728b\") " pod="openstack/nova-cell1-0586-account-create-update-p76kt" Feb 26 16:06:07 crc kubenswrapper[4907]: I0226 16:06:07.820264 4907 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceilometer-0" podStartSLOduration=2.921450119 podStartE2EDuration="18.820240654s" podCreationTimestamp="2026-02-26 16:05:49 +0000 UTC" firstStartedPulling="2026-02-26 16:05:50.528463332 +0000 UTC m=+1413.047025181" lastFinishedPulling="2026-02-26 16:06:06.427253867 +0000 UTC m=+1428.945815716" observedRunningTime="2026-02-26 16:06:07.67630948 +0000 UTC m=+1430.194871329" watchObservedRunningTime="2026-02-26 16:06:07.820240654 +0000 UTC m=+1430.338802513" Feb 26 16:06:07 crc kubenswrapper[4907]: I0226 16:06:07.866756 
4907 scope.go:117] "RemoveContainer" containerID="ed7e31e54a126efdd2512d4f3d279de091b4217cb1a0424836476da4c8d3b317" Feb 26 16:06:07 crc kubenswrapper[4907]: E0226 16:06:07.871132 4907 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ed7e31e54a126efdd2512d4f3d279de091b4217cb1a0424836476da4c8d3b317\": container with ID starting with ed7e31e54a126efdd2512d4f3d279de091b4217cb1a0424836476da4c8d3b317 not found: ID does not exist" containerID="ed7e31e54a126efdd2512d4f3d279de091b4217cb1a0424836476da4c8d3b317" Feb 26 16:06:07 crc kubenswrapper[4907]: I0226 16:06:07.871169 4907 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ed7e31e54a126efdd2512d4f3d279de091b4217cb1a0424836476da4c8d3b317"} err="failed to get container status \"ed7e31e54a126efdd2512d4f3d279de091b4217cb1a0424836476da4c8d3b317\": rpc error: code = NotFound desc = could not find container \"ed7e31e54a126efdd2512d4f3d279de091b4217cb1a0424836476da4c8d3b317\": container with ID starting with ed7e31e54a126efdd2512d4f3d279de091b4217cb1a0424836476da4c8d3b317 not found: ID does not exist" Feb 26 16:06:07 crc kubenswrapper[4907]: I0226 16:06:07.871197 4907 scope.go:117] "RemoveContainer" containerID="ca3cee29ee6bdb8de8f7b2a9bc3d4fb4b429a63857b4f97f82685c2164a62a31" Feb 26 16:06:07 crc kubenswrapper[4907]: E0226 16:06:07.871868 4907 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ca3cee29ee6bdb8de8f7b2a9bc3d4fb4b429a63857b4f97f82685c2164a62a31\": container with ID starting with ca3cee29ee6bdb8de8f7b2a9bc3d4fb4b429a63857b4f97f82685c2164a62a31 not found: ID does not exist" containerID="ca3cee29ee6bdb8de8f7b2a9bc3d4fb4b429a63857b4f97f82685c2164a62a31" Feb 26 16:06:07 crc kubenswrapper[4907]: I0226 16:06:07.871890 4907 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"ca3cee29ee6bdb8de8f7b2a9bc3d4fb4b429a63857b4f97f82685c2164a62a31"} err="failed to get container status \"ca3cee29ee6bdb8de8f7b2a9bc3d4fb4b429a63857b4f97f82685c2164a62a31\": rpc error: code = NotFound desc = could not find container \"ca3cee29ee6bdb8de8f7b2a9bc3d4fb4b429a63857b4f97f82685c2164a62a31\": container with ID starting with ca3cee29ee6bdb8de8f7b2a9bc3d4fb4b429a63857b4f97f82685c2164a62a31 not found: ID does not exist" Feb 26 16:06:07 crc kubenswrapper[4907]: I0226 16:06:07.879214 4907 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-870e-account-create-update-4v7m7" Feb 26 16:06:07 crc kubenswrapper[4907]: I0226 16:06:07.903321 4907 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-api-0"] Feb 26 16:06:07 crc kubenswrapper[4907]: I0226 16:06:07.973688 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5dbgl\" (UniqueName: \"kubernetes.io/projected/e3b617db-d4f3-448a-b544-0cd38d51728b-kube-api-access-5dbgl\") pod \"nova-cell1-0586-account-create-update-p76kt\" (UID: \"e3b617db-d4f3-448a-b544-0cd38d51728b\") " pod="openstack/nova-cell1-0586-account-create-update-p76kt" Feb 26 16:06:07 crc kubenswrapper[4907]: I0226 16:06:07.974009 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/e3b617db-d4f3-448a-b544-0cd38d51728b-operator-scripts\") pod \"nova-cell1-0586-account-create-update-p76kt\" (UID: \"e3b617db-d4f3-448a-b544-0cd38d51728b\") " pod="openstack/nova-cell1-0586-account-create-update-p76kt" Feb 26 16:06:07 crc kubenswrapper[4907]: I0226 16:06:07.975436 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/e3b617db-d4f3-448a-b544-0cd38d51728b-operator-scripts\") pod \"nova-cell1-0586-account-create-update-p76kt\" (UID: 
\"e3b617db-d4f3-448a-b544-0cd38d51728b\") " pod="openstack/nova-cell1-0586-account-create-update-p76kt" Feb 26 16:06:08 crc kubenswrapper[4907]: I0226 16:06:08.014209 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5dbgl\" (UniqueName: \"kubernetes.io/projected/e3b617db-d4f3-448a-b544-0cd38d51728b-kube-api-access-5dbgl\") pod \"nova-cell1-0586-account-create-update-p76kt\" (UID: \"e3b617db-d4f3-448a-b544-0cd38d51728b\") " pod="openstack/nova-cell1-0586-account-create-update-p76kt" Feb 26 16:06:08 crc kubenswrapper[4907]: I0226 16:06:08.026153 4907 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/cinder-api-0"] Feb 26 16:06:08 crc kubenswrapper[4907]: I0226 16:06:08.091878 4907 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-api-0"] Feb 26 16:06:08 crc kubenswrapper[4907]: I0226 16:06:08.202609 4907 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-0586-account-create-update-p76kt" Feb 26 16:06:08 crc kubenswrapper[4907]: I0226 16:06:08.234401 4907 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-api-0" Feb 26 16:06:08 crc kubenswrapper[4907]: I0226 16:06:08.240530 4907 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-cinder-internal-svc" Feb 26 16:06:08 crc kubenswrapper[4907]: I0226 16:06:08.245832 4907 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-api-config-data" Feb 26 16:06:08 crc kubenswrapper[4907]: I0226 16:06:08.257838 4907 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-cinder-public-svc" Feb 26 16:06:08 crc kubenswrapper[4907]: I0226 16:06:08.283938 4907 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="466a75e1-c85d-4d33-b9c7-6916eca1ebe1" path="/var/lib/kubelet/pods/466a75e1-c85d-4d33-b9c7-6916eca1ebe1/volumes" Feb 26 16:06:08 crc kubenswrapper[4907]: I0226 16:06:08.284819 4907 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/horizon-76d88967b8-wmzcw" Feb 26 16:06:08 crc kubenswrapper[4907]: I0226 16:06:08.284840 4907 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-api-0"] Feb 26 16:06:08 crc kubenswrapper[4907]: I0226 16:06:08.284887 4907 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/horizon-76d88967b8-wmzcw" Feb 26 16:06:08 crc kubenswrapper[4907]: I0226 16:06:08.318877 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/193a5b34-9a06-4c8d-b3bc-53bc62485387-config-data\") pod \"cinder-api-0\" (UID: \"193a5b34-9a06-4c8d-b3bc-53bc62485387\") " pod="openstack/cinder-api-0" Feb 26 16:06:08 crc kubenswrapper[4907]: I0226 16:06:08.319040 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/193a5b34-9a06-4c8d-b3bc-53bc62485387-scripts\") pod \"cinder-api-0\" (UID: 
\"193a5b34-9a06-4c8d-b3bc-53bc62485387\") " pod="openstack/cinder-api-0" Feb 26 16:06:08 crc kubenswrapper[4907]: I0226 16:06:08.319131 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/193a5b34-9a06-4c8d-b3bc-53bc62485387-internal-tls-certs\") pod \"cinder-api-0\" (UID: \"193a5b34-9a06-4c8d-b3bc-53bc62485387\") " pod="openstack/cinder-api-0" Feb 26 16:06:08 crc kubenswrapper[4907]: I0226 16:06:08.319202 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/193a5b34-9a06-4c8d-b3bc-53bc62485387-combined-ca-bundle\") pod \"cinder-api-0\" (UID: \"193a5b34-9a06-4c8d-b3bc-53bc62485387\") " pod="openstack/cinder-api-0" Feb 26 16:06:08 crc kubenswrapper[4907]: I0226 16:06:08.319315 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/193a5b34-9a06-4c8d-b3bc-53bc62485387-etc-machine-id\") pod \"cinder-api-0\" (UID: \"193a5b34-9a06-4c8d-b3bc-53bc62485387\") " pod="openstack/cinder-api-0" Feb 26 16:06:08 crc kubenswrapper[4907]: I0226 16:06:08.319419 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-d2w6j\" (UniqueName: \"kubernetes.io/projected/193a5b34-9a06-4c8d-b3bc-53bc62485387-kube-api-access-d2w6j\") pod \"cinder-api-0\" (UID: \"193a5b34-9a06-4c8d-b3bc-53bc62485387\") " pod="openstack/cinder-api-0" Feb 26 16:06:08 crc kubenswrapper[4907]: I0226 16:06:08.319491 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/193a5b34-9a06-4c8d-b3bc-53bc62485387-config-data-custom\") pod \"cinder-api-0\" (UID: \"193a5b34-9a06-4c8d-b3bc-53bc62485387\") " pod="openstack/cinder-api-0" Feb 26 16:06:08 
crc kubenswrapper[4907]: I0226 16:06:08.319574 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/193a5b34-9a06-4c8d-b3bc-53bc62485387-public-tls-certs\") pod \"cinder-api-0\" (UID: \"193a5b34-9a06-4c8d-b3bc-53bc62485387\") " pod="openstack/cinder-api-0" Feb 26 16:06:08 crc kubenswrapper[4907]: I0226 16:06:08.319664 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/193a5b34-9a06-4c8d-b3bc-53bc62485387-logs\") pod \"cinder-api-0\" (UID: \"193a5b34-9a06-4c8d-b3bc-53bc62485387\") " pod="openstack/cinder-api-0" Feb 26 16:06:08 crc kubenswrapper[4907]: I0226 16:06:08.325457 4907 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-db-create-k8vd5"] Feb 26 16:06:08 crc kubenswrapper[4907]: I0226 16:06:08.410712 4907 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-db-create-pn8lr"] Feb 26 16:06:08 crc kubenswrapper[4907]: W0226 16:06:08.430962 4907 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod835cf533_cc08_4ce6_b0e1_ed3a8a2a88ea.slice/crio-ae2d0786139cc72e23d97e43d7798975f34d873b01394fb2a49b17e3406a9c5b WatchSource:0}: Error finding container ae2d0786139cc72e23d97e43d7798975f34d873b01394fb2a49b17e3406a9c5b: Status 404 returned error can't find the container with id ae2d0786139cc72e23d97e43d7798975f34d873b01394fb2a49b17e3406a9c5b Feb 26 16:06:08 crc kubenswrapper[4907]: I0226 16:06:08.440994 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/193a5b34-9a06-4c8d-b3bc-53bc62485387-config-data\") pod \"cinder-api-0\" (UID: \"193a5b34-9a06-4c8d-b3bc-53bc62485387\") " pod="openstack/cinder-api-0" Feb 26 16:06:08 crc kubenswrapper[4907]: I0226 16:06:08.441353 4907 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/193a5b34-9a06-4c8d-b3bc-53bc62485387-scripts\") pod \"cinder-api-0\" (UID: \"193a5b34-9a06-4c8d-b3bc-53bc62485387\") " pod="openstack/cinder-api-0" Feb 26 16:06:08 crc kubenswrapper[4907]: I0226 16:06:08.441404 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/193a5b34-9a06-4c8d-b3bc-53bc62485387-internal-tls-certs\") pod \"cinder-api-0\" (UID: \"193a5b34-9a06-4c8d-b3bc-53bc62485387\") " pod="openstack/cinder-api-0" Feb 26 16:06:08 crc kubenswrapper[4907]: I0226 16:06:08.441437 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/193a5b34-9a06-4c8d-b3bc-53bc62485387-combined-ca-bundle\") pod \"cinder-api-0\" (UID: \"193a5b34-9a06-4c8d-b3bc-53bc62485387\") " pod="openstack/cinder-api-0" Feb 26 16:06:08 crc kubenswrapper[4907]: I0226 16:06:08.441562 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/193a5b34-9a06-4c8d-b3bc-53bc62485387-etc-machine-id\") pod \"cinder-api-0\" (UID: \"193a5b34-9a06-4c8d-b3bc-53bc62485387\") " pod="openstack/cinder-api-0" Feb 26 16:06:08 crc kubenswrapper[4907]: I0226 16:06:08.441669 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-d2w6j\" (UniqueName: \"kubernetes.io/projected/193a5b34-9a06-4c8d-b3bc-53bc62485387-kube-api-access-d2w6j\") pod \"cinder-api-0\" (UID: \"193a5b34-9a06-4c8d-b3bc-53bc62485387\") " pod="openstack/cinder-api-0" Feb 26 16:06:08 crc kubenswrapper[4907]: I0226 16:06:08.441699 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/193a5b34-9a06-4c8d-b3bc-53bc62485387-config-data-custom\") pod 
\"cinder-api-0\" (UID: \"193a5b34-9a06-4c8d-b3bc-53bc62485387\") " pod="openstack/cinder-api-0" Feb 26 16:06:08 crc kubenswrapper[4907]: I0226 16:06:08.441762 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/193a5b34-9a06-4c8d-b3bc-53bc62485387-public-tls-certs\") pod \"cinder-api-0\" (UID: \"193a5b34-9a06-4c8d-b3bc-53bc62485387\") " pod="openstack/cinder-api-0" Feb 26 16:06:08 crc kubenswrapper[4907]: I0226 16:06:08.441797 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/193a5b34-9a06-4c8d-b3bc-53bc62485387-logs\") pod \"cinder-api-0\" (UID: \"193a5b34-9a06-4c8d-b3bc-53bc62485387\") " pod="openstack/cinder-api-0" Feb 26 16:06:08 crc kubenswrapper[4907]: I0226 16:06:08.442258 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/193a5b34-9a06-4c8d-b3bc-53bc62485387-logs\") pod \"cinder-api-0\" (UID: \"193a5b34-9a06-4c8d-b3bc-53bc62485387\") " pod="openstack/cinder-api-0" Feb 26 16:06:08 crc kubenswrapper[4907]: I0226 16:06:08.443933 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/193a5b34-9a06-4c8d-b3bc-53bc62485387-etc-machine-id\") pod \"cinder-api-0\" (UID: \"193a5b34-9a06-4c8d-b3bc-53bc62485387\") " pod="openstack/cinder-api-0" Feb 26 16:06:08 crc kubenswrapper[4907]: I0226 16:06:08.449967 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/193a5b34-9a06-4c8d-b3bc-53bc62485387-scripts\") pod \"cinder-api-0\" (UID: \"193a5b34-9a06-4c8d-b3bc-53bc62485387\") " pod="openstack/cinder-api-0" Feb 26 16:06:08 crc kubenswrapper[4907]: I0226 16:06:08.461378 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/193a5b34-9a06-4c8d-b3bc-53bc62485387-combined-ca-bundle\") pod \"cinder-api-0\" (UID: \"193a5b34-9a06-4c8d-b3bc-53bc62485387\") " pod="openstack/cinder-api-0" Feb 26 16:06:08 crc kubenswrapper[4907]: I0226 16:06:08.463827 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/193a5b34-9a06-4c8d-b3bc-53bc62485387-config-data-custom\") pod \"cinder-api-0\" (UID: \"193a5b34-9a06-4c8d-b3bc-53bc62485387\") " pod="openstack/cinder-api-0" Feb 26 16:06:08 crc kubenswrapper[4907]: I0226 16:06:08.466176 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/193a5b34-9a06-4c8d-b3bc-53bc62485387-internal-tls-certs\") pod \"cinder-api-0\" (UID: \"193a5b34-9a06-4c8d-b3bc-53bc62485387\") " pod="openstack/cinder-api-0" Feb 26 16:06:08 crc kubenswrapper[4907]: I0226 16:06:08.473828 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/193a5b34-9a06-4c8d-b3bc-53bc62485387-public-tls-certs\") pod \"cinder-api-0\" (UID: \"193a5b34-9a06-4c8d-b3bc-53bc62485387\") " pod="openstack/cinder-api-0" Feb 26 16:06:08 crc kubenswrapper[4907]: I0226 16:06:08.482452 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/193a5b34-9a06-4c8d-b3bc-53bc62485387-config-data\") pod \"cinder-api-0\" (UID: \"193a5b34-9a06-4c8d-b3bc-53bc62485387\") " pod="openstack/cinder-api-0" Feb 26 16:06:08 crc kubenswrapper[4907]: I0226 16:06:08.486603 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-d2w6j\" (UniqueName: \"kubernetes.io/projected/193a5b34-9a06-4c8d-b3bc-53bc62485387-kube-api-access-d2w6j\") pod \"cinder-api-0\" (UID: \"193a5b34-9a06-4c8d-b3bc-53bc62485387\") " pod="openstack/cinder-api-0" Feb 26 16:06:08 crc kubenswrapper[4907]: I0226 
16:06:08.660976 4907 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-db-create-5hgql"] Feb 26 16:06:08 crc kubenswrapper[4907]: I0226 16:06:08.661382 4907 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-api-0" Feb 26 16:06:08 crc kubenswrapper[4907]: I0226 16:06:08.674119 4907 generic.go:334] "Generic (PLEG): container finished" podID="a114e8dd-3cb1-4b1a-8f49-48b99c39da3b" containerID="e52ae1e1152905e752807f8c44657b795a66f16e7ed728b7d70f51708d7de6b2" exitCode=2 Feb 26 16:06:08 crc kubenswrapper[4907]: I0226 16:06:08.674148 4907 generic.go:334] "Generic (PLEG): container finished" podID="a114e8dd-3cb1-4b1a-8f49-48b99c39da3b" containerID="191f82e8d71c55d97bd6521f30700a2a6dd03522e595c49eb8512eaeed42b5f3" exitCode=0 Feb 26 16:06:08 crc kubenswrapper[4907]: I0226 16:06:08.674184 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"a114e8dd-3cb1-4b1a-8f49-48b99c39da3b","Type":"ContainerDied","Data":"e52ae1e1152905e752807f8c44657b795a66f16e7ed728b7d70f51708d7de6b2"} Feb 26 16:06:08 crc kubenswrapper[4907]: I0226 16:06:08.674209 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"a114e8dd-3cb1-4b1a-8f49-48b99c39da3b","Type":"ContainerDied","Data":"191f82e8d71c55d97bd6521f30700a2a6dd03522e595c49eb8512eaeed42b5f3"} Feb 26 16:06:08 crc kubenswrapper[4907]: I0226 16:06:08.675131 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-db-create-pn8lr" event={"ID":"835cf533-cc08-4ce6-b0e1-ed3a8a2a88ea","Type":"ContainerStarted","Data":"ae2d0786139cc72e23d97e43d7798975f34d873b01394fb2a49b17e3406a9c5b"} Feb 26 16:06:08 crc kubenswrapper[4907]: I0226 16:06:08.676993 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-db-create-k8vd5" 
event={"ID":"693a0231-a18d-4141-a46f-5911644101a4","Type":"ContainerStarted","Data":"0e2d12d1df8f623027fe578d582bed6cdea4d39e9f94c8eaa87c4a15945a0144"} Feb 26 16:06:08 crc kubenswrapper[4907]: W0226 16:06:08.708188 4907 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pode97b768b_99a2_4a89_b88e_e5ccbbf8d23f.slice/crio-2e935ecc72d026859daeb8f7e5670aecc42b781dd354a091c07a8eb4beb9dde6 WatchSource:0}: Error finding container 2e935ecc72d026859daeb8f7e5670aecc42b781dd354a091c07a8eb4beb9dde6: Status 404 returned error can't find the container with id 2e935ecc72d026859daeb8f7e5670aecc42b781dd354a091c07a8eb4beb9dde6 Feb 26 16:06:08 crc kubenswrapper[4907]: I0226 16:06:08.875297 4907 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-2e48-account-create-update-8mvk9"] Feb 26 16:06:08 crc kubenswrapper[4907]: I0226 16:06:08.947747 4907 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-0586-account-create-update-p76kt"] Feb 26 16:06:09 crc kubenswrapper[4907]: I0226 16:06:09.054304 4907 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-870e-account-create-update-4v7m7"] Feb 26 16:06:09 crc kubenswrapper[4907]: W0226 16:06:09.192860 4907 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod1b61b535_465a_4786_bba7_c33c3c5672a7.slice/crio-5778777129ea7b014322215f9154f4d0b86180e7b12629aca02b613873f3b3e5 WatchSource:0}: Error finding container 5778777129ea7b014322215f9154f4d0b86180e7b12629aca02b613873f3b3e5: Status 404 returned error can't find the container with id 5778777129ea7b014322215f9154f4d0b86180e7b12629aca02b613873f3b3e5 Feb 26 16:06:09 crc kubenswrapper[4907]: W0226 16:06:09.418732 4907 manager.go:1169] Failed to process watch event {EventType:0 
Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod193a5b34_9a06_4c8d_b3bc_53bc62485387.slice/crio-4df985c4d4eccc4d306fd11678dad67e014380805d084040ada76407bc5f4c0f WatchSource:0}: Error finding container 4df985c4d4eccc4d306fd11678dad67e014380805d084040ada76407bc5f4c0f: Status 404 returned error can't find the container with id 4df985c4d4eccc4d306fd11678dad67e014380805d084040ada76407bc5f4c0f Feb 26 16:06:09 crc kubenswrapper[4907]: I0226 16:06:09.430467 4907 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-api-0"] Feb 26 16:06:09 crc kubenswrapper[4907]: I0226 16:06:09.705503 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"193a5b34-9a06-4c8d-b3bc-53bc62485387","Type":"ContainerStarted","Data":"4df985c4d4eccc4d306fd11678dad67e014380805d084040ada76407bc5f4c0f"} Feb 26 16:06:09 crc kubenswrapper[4907]: I0226 16:06:09.711875 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-db-create-5hgql" event={"ID":"e97b768b-99a2-4a89-b88e-e5ccbbf8d23f","Type":"ContainerStarted","Data":"ece086edb6b098d7879ce0ac6c6001c702f186355bc8ed7b7ba91efeddeeb86c"} Feb 26 16:06:09 crc kubenswrapper[4907]: I0226 16:06:09.711932 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-db-create-5hgql" event={"ID":"e97b768b-99a2-4a89-b88e-e5ccbbf8d23f","Type":"ContainerStarted","Data":"2e935ecc72d026859daeb8f7e5670aecc42b781dd354a091c07a8eb4beb9dde6"} Feb 26 16:06:09 crc kubenswrapper[4907]: I0226 16:06:09.734249 4907 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell1-db-create-5hgql" podStartSLOduration=2.734227498 podStartE2EDuration="2.734227498s" podCreationTimestamp="2026-02-26 16:06:07 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-26 16:06:09.730618899 +0000 UTC m=+1432.249180768" 
watchObservedRunningTime="2026-02-26 16:06:09.734227498 +0000 UTC m=+1432.252789337" Feb 26 16:06:09 crc kubenswrapper[4907]: I0226 16:06:09.747639 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-870e-account-create-update-4v7m7" event={"ID":"1b61b535-465a-4786-bba7-c33c3c5672a7","Type":"ContainerStarted","Data":"5778777129ea7b014322215f9154f4d0b86180e7b12629aca02b613873f3b3e5"} Feb 26 16:06:09 crc kubenswrapper[4907]: I0226 16:06:09.808963 4907 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell0-870e-account-create-update-4v7m7" podStartSLOduration=2.808939837 podStartE2EDuration="2.808939837s" podCreationTimestamp="2026-02-26 16:06:07 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-26 16:06:09.776351569 +0000 UTC m=+1432.294913438" watchObservedRunningTime="2026-02-26 16:06:09.808939837 +0000 UTC m=+1432.327501686" Feb 26 16:06:09 crc kubenswrapper[4907]: I0226 16:06:09.812265 4907 generic.go:334] "Generic (PLEG): container finished" podID="a114e8dd-3cb1-4b1a-8f49-48b99c39da3b" containerID="4fe7393b6c3f68c93befaf6b19d674b51c8a99443a2fa24cad7b655b7d3b5849" exitCode=0 Feb 26 16:06:09 crc kubenswrapper[4907]: I0226 16:06:09.812355 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"a114e8dd-3cb1-4b1a-8f49-48b99c39da3b","Type":"ContainerDied","Data":"4fe7393b6c3f68c93befaf6b19d674b51c8a99443a2fa24cad7b655b7d3b5849"} Feb 26 16:06:09 crc kubenswrapper[4907]: I0226 16:06:09.832357 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-2e48-account-create-update-8mvk9" event={"ID":"9022005c-a270-4ad2-b526-10bb125dfff3","Type":"ContainerStarted","Data":"539e079b2271a050016ea91859cf6254983d307241063136feeef7db1658b543"} Feb 26 16:06:09 crc kubenswrapper[4907]: I0226 16:06:09.846672 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/nova-cell1-0586-account-create-update-p76kt" event={"ID":"e3b617db-d4f3-448a-b544-0cd38d51728b","Type":"ContainerStarted","Data":"9530360cbde14199aa44f419d8e6c61aacd345997005a7cb6750255f3d1ec4f0"} Feb 26 16:06:09 crc kubenswrapper[4907]: I0226 16:06:09.874088 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-db-create-k8vd5" event={"ID":"693a0231-a18d-4141-a46f-5911644101a4","Type":"ContainerStarted","Data":"be57f9925a937096f9d5f1073c8e249f92a0cdc53e6b154b95350aeea9a8f037"} Feb 26 16:06:09 crc kubenswrapper[4907]: I0226 16:06:09.888196 4907 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell1-0586-account-create-update-p76kt" podStartSLOduration=2.888171157 podStartE2EDuration="2.888171157s" podCreationTimestamp="2026-02-26 16:06:07 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-26 16:06:09.882216731 +0000 UTC m=+1432.400778580" watchObservedRunningTime="2026-02-26 16:06:09.888171157 +0000 UTC m=+1432.406733006" Feb 26 16:06:09 crc kubenswrapper[4907]: I0226 16:06:09.906961 4907 generic.go:334] "Generic (PLEG): container finished" podID="361750c4-3d82-437e-abc0-4e20302d20cf" containerID="bcdf7f251072c281b799d39208a89ff8fa1387f0ca8230ec0b2263b3f0d3c06e" exitCode=0 Feb 26 16:06:09 crc kubenswrapper[4907]: I0226 16:06:09.907030 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"361750c4-3d82-437e-abc0-4e20302d20cf","Type":"ContainerDied","Data":"bcdf7f251072c281b799d39208a89ff8fa1387f0ca8230ec0b2263b3f0d3c06e"} Feb 26 16:06:09 crc kubenswrapper[4907]: I0226 16:06:09.920418 4907 generic.go:334] "Generic (PLEG): container finished" podID="835cf533-cc08-4ce6-b0e1-ed3a8a2a88ea" containerID="42272f3871ae911865b015890b9c14369f0fb3712a8201d406df6721ba5053ed" exitCode=0 Feb 26 16:06:09 crc kubenswrapper[4907]: I0226 16:06:09.921998 
4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-db-create-pn8lr" event={"ID":"835cf533-cc08-4ce6-b0e1-ed3a8a2a88ea","Type":"ContainerDied","Data":"42272f3871ae911865b015890b9c14369f0fb3712a8201d406df6721ba5053ed"} Feb 26 16:06:10 crc kubenswrapper[4907]: I0226 16:06:10.260696 4907 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-external-api-0" Feb 26 16:06:10 crc kubenswrapper[4907]: I0226 16:06:10.362143 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/361750c4-3d82-437e-abc0-4e20302d20cf-config-data\") pod \"361750c4-3d82-437e-abc0-4e20302d20cf\" (UID: \"361750c4-3d82-437e-abc0-4e20302d20cf\") " Feb 26 16:06:10 crc kubenswrapper[4907]: I0226 16:06:10.362220 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/361750c4-3d82-437e-abc0-4e20302d20cf-logs\") pod \"361750c4-3d82-437e-abc0-4e20302d20cf\" (UID: \"361750c4-3d82-437e-abc0-4e20302d20cf\") " Feb 26 16:06:10 crc kubenswrapper[4907]: I0226 16:06:10.362267 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/361750c4-3d82-437e-abc0-4e20302d20cf-public-tls-certs\") pod \"361750c4-3d82-437e-abc0-4e20302d20cf\" (UID: \"361750c4-3d82-437e-abc0-4e20302d20cf\") " Feb 26 16:06:10 crc kubenswrapper[4907]: I0226 16:06:10.362373 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/361750c4-3d82-437e-abc0-4e20302d20cf-httpd-run\") pod \"361750c4-3d82-437e-abc0-4e20302d20cf\" (UID: \"361750c4-3d82-437e-abc0-4e20302d20cf\") " Feb 26 16:06:10 crc kubenswrapper[4907]: I0226 16:06:10.362448 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/361750c4-3d82-437e-abc0-4e20302d20cf-combined-ca-bundle\") pod \"361750c4-3d82-437e-abc0-4e20302d20cf\" (UID: \"361750c4-3d82-437e-abc0-4e20302d20cf\") " Feb 26 16:06:10 crc kubenswrapper[4907]: I0226 16:06:10.362480 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"glance\" (UniqueName: \"kubernetes.io/local-volume/local-storage04-crc\") pod \"361750c4-3d82-437e-abc0-4e20302d20cf\" (UID: \"361750c4-3d82-437e-abc0-4e20302d20cf\") " Feb 26 16:06:10 crc kubenswrapper[4907]: I0226 16:06:10.362517 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/361750c4-3d82-437e-abc0-4e20302d20cf-scripts\") pod \"361750c4-3d82-437e-abc0-4e20302d20cf\" (UID: \"361750c4-3d82-437e-abc0-4e20302d20cf\") " Feb 26 16:06:10 crc kubenswrapper[4907]: I0226 16:06:10.362561 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-w7bx2\" (UniqueName: \"kubernetes.io/projected/361750c4-3d82-437e-abc0-4e20302d20cf-kube-api-access-w7bx2\") pod \"361750c4-3d82-437e-abc0-4e20302d20cf\" (UID: \"361750c4-3d82-437e-abc0-4e20302d20cf\") " Feb 26 16:06:10 crc kubenswrapper[4907]: I0226 16:06:10.363127 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/361750c4-3d82-437e-abc0-4e20302d20cf-httpd-run" (OuterVolumeSpecName: "httpd-run") pod "361750c4-3d82-437e-abc0-4e20302d20cf" (UID: "361750c4-3d82-437e-abc0-4e20302d20cf"). InnerVolumeSpecName "httpd-run". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 26 16:06:10 crc kubenswrapper[4907]: I0226 16:06:10.364113 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/361750c4-3d82-437e-abc0-4e20302d20cf-logs" (OuterVolumeSpecName: "logs") pod "361750c4-3d82-437e-abc0-4e20302d20cf" (UID: "361750c4-3d82-437e-abc0-4e20302d20cf"). InnerVolumeSpecName "logs". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 26 16:06:10 crc kubenswrapper[4907]: I0226 16:06:10.370878 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/361750c4-3d82-437e-abc0-4e20302d20cf-kube-api-access-w7bx2" (OuterVolumeSpecName: "kube-api-access-w7bx2") pod "361750c4-3d82-437e-abc0-4e20302d20cf" (UID: "361750c4-3d82-437e-abc0-4e20302d20cf"). InnerVolumeSpecName "kube-api-access-w7bx2". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 26 16:06:10 crc kubenswrapper[4907]: I0226 16:06:10.379397 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage04-crc" (OuterVolumeSpecName: "glance") pod "361750c4-3d82-437e-abc0-4e20302d20cf" (UID: "361750c4-3d82-437e-abc0-4e20302d20cf"). InnerVolumeSpecName "local-storage04-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue "" Feb 26 16:06:10 crc kubenswrapper[4907]: I0226 16:06:10.384493 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/361750c4-3d82-437e-abc0-4e20302d20cf-scripts" (OuterVolumeSpecName: "scripts") pod "361750c4-3d82-437e-abc0-4e20302d20cf" (UID: "361750c4-3d82-437e-abc0-4e20302d20cf"). InnerVolumeSpecName "scripts". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 26 16:06:10 crc kubenswrapper[4907]: I0226 16:06:10.467239 4907 reconciler_common.go:293] "Volume detached for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/361750c4-3d82-437e-abc0-4e20302d20cf-httpd-run\") on node \"crc\" DevicePath \"\"" Feb 26 16:06:10 crc kubenswrapper[4907]: I0226 16:06:10.467318 4907 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage04-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage04-crc\") on node \"crc\" " Feb 26 16:06:10 crc kubenswrapper[4907]: I0226 16:06:10.467333 4907 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/361750c4-3d82-437e-abc0-4e20302d20cf-scripts\") on node \"crc\" DevicePath \"\"" Feb 26 16:06:10 crc kubenswrapper[4907]: I0226 16:06:10.467448 4907 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-w7bx2\" (UniqueName: \"kubernetes.io/projected/361750c4-3d82-437e-abc0-4e20302d20cf-kube-api-access-w7bx2\") on node \"crc\" DevicePath \"\"" Feb 26 16:06:10 crc kubenswrapper[4907]: I0226 16:06:10.467468 4907 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/361750c4-3d82-437e-abc0-4e20302d20cf-logs\") on node \"crc\" DevicePath \"\"" Feb 26 16:06:10 crc kubenswrapper[4907]: I0226 16:06:10.491681 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/361750c4-3d82-437e-abc0-4e20302d20cf-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "361750c4-3d82-437e-abc0-4e20302d20cf" (UID: "361750c4-3d82-437e-abc0-4e20302d20cf"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 26 16:06:10 crc kubenswrapper[4907]: I0226 16:06:10.509751 4907 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage04-crc" (UniqueName: "kubernetes.io/local-volume/local-storage04-crc") on node "crc" Feb 26 16:06:10 crc kubenswrapper[4907]: I0226 16:06:10.518500 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/361750c4-3d82-437e-abc0-4e20302d20cf-public-tls-certs" (OuterVolumeSpecName: "public-tls-certs") pod "361750c4-3d82-437e-abc0-4e20302d20cf" (UID: "361750c4-3d82-437e-abc0-4e20302d20cf"). InnerVolumeSpecName "public-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 26 16:06:10 crc kubenswrapper[4907]: I0226 16:06:10.544214 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/361750c4-3d82-437e-abc0-4e20302d20cf-config-data" (OuterVolumeSpecName: "config-data") pod "361750c4-3d82-437e-abc0-4e20302d20cf" (UID: "361750c4-3d82-437e-abc0-4e20302d20cf"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 26 16:06:10 crc kubenswrapper[4907]: I0226 16:06:10.568562 4907 reconciler_common.go:293] "Volume detached for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/361750c4-3d82-437e-abc0-4e20302d20cf-public-tls-certs\") on node \"crc\" DevicePath \"\"" Feb 26 16:06:10 crc kubenswrapper[4907]: I0226 16:06:10.568606 4907 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/361750c4-3d82-437e-abc0-4e20302d20cf-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 26 16:06:10 crc kubenswrapper[4907]: I0226 16:06:10.568619 4907 reconciler_common.go:293] "Volume detached for volume \"local-storage04-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage04-crc\") on node \"crc\" DevicePath \"\"" Feb 26 16:06:10 crc kubenswrapper[4907]: I0226 16:06:10.568635 4907 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/361750c4-3d82-437e-abc0-4e20302d20cf-config-data\") on node \"crc\" DevicePath \"\"" Feb 26 16:06:10 crc kubenswrapper[4907]: I0226 16:06:10.935176 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29535366-dqhh6" event={"ID":"023cbc5f-da0e-4a5e-bc63-18385f44d228","Type":"ContainerStarted","Data":"8e672af27d5f3faf809d9b7e50e114800481d22f3fccd9f003a26b0133dccd51"} Feb 26 16:06:10 crc kubenswrapper[4907]: I0226 16:06:10.945326 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"361750c4-3d82-437e-abc0-4e20302d20cf","Type":"ContainerDied","Data":"2eedba418a9ee29b10b82662a95c406340fe1d301e7879dc616010c9ba3b8792"} Feb 26 16:06:10 crc kubenswrapper[4907]: I0226 16:06:10.945378 4907 scope.go:117] "RemoveContainer" containerID="bcdf7f251072c281b799d39208a89ff8fa1387f0ca8230ec0b2263b3f0d3c06e" Feb 26 16:06:10 crc kubenswrapper[4907]: I0226 16:06:10.945511 4907 util.go:48] "No 
ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-external-api-0" Feb 26 16:06:10 crc kubenswrapper[4907]: I0226 16:06:10.963315 4907 generic.go:334] "Generic (PLEG): container finished" podID="e97b768b-99a2-4a89-b88e-e5ccbbf8d23f" containerID="ece086edb6b098d7879ce0ac6c6001c702f186355bc8ed7b7ba91efeddeeb86c" exitCode=0 Feb 26 16:06:10 crc kubenswrapper[4907]: I0226 16:06:10.963945 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-db-create-5hgql" event={"ID":"e97b768b-99a2-4a89-b88e-e5ccbbf8d23f","Type":"ContainerDied","Data":"ece086edb6b098d7879ce0ac6c6001c702f186355bc8ed7b7ba91efeddeeb86c"} Feb 26 16:06:10 crc kubenswrapper[4907]: I0226 16:06:10.967369 4907 generic.go:334] "Generic (PLEG): container finished" podID="1b61b535-465a-4786-bba7-c33c3c5672a7" containerID="31c89ce7138a9b38c8adebdce017bebb3b83fb10bd86f4d1277c6e732265a1f8" exitCode=0 Feb 26 16:06:10 crc kubenswrapper[4907]: I0226 16:06:10.967439 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-870e-account-create-update-4v7m7" event={"ID":"1b61b535-465a-4786-bba7-c33c3c5672a7","Type":"ContainerDied","Data":"31c89ce7138a9b38c8adebdce017bebb3b83fb10bd86f4d1277c6e732265a1f8"} Feb 26 16:06:10 crc kubenswrapper[4907]: I0226 16:06:10.967434 4907 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-infra/auto-csr-approver-29535366-dqhh6" podStartSLOduration=9.372405408 podStartE2EDuration="10.967409123s" podCreationTimestamp="2026-02-26 16:06:00 +0000 UTC" firstStartedPulling="2026-02-26 16:06:06.731839694 +0000 UTC m=+1429.250401543" lastFinishedPulling="2026-02-26 16:06:08.326843409 +0000 UTC m=+1430.845405258" observedRunningTime="2026-02-26 16:06:10.954433665 +0000 UTC m=+1433.472995514" watchObservedRunningTime="2026-02-26 16:06:10.967409123 +0000 UTC m=+1433.485970972" Feb 26 16:06:10 crc kubenswrapper[4907]: I0226 16:06:10.983919 4907 generic.go:334] "Generic (PLEG): 
container finished" podID="9022005c-a270-4ad2-b526-10bb125dfff3" containerID="5ba3eca10926a5a21a0d77adac5eccd592a1bba624a8872656821c5c6390f8ab" exitCode=0 Feb 26 16:06:10 crc kubenswrapper[4907]: I0226 16:06:10.984163 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-2e48-account-create-update-8mvk9" event={"ID":"9022005c-a270-4ad2-b526-10bb125dfff3","Type":"ContainerDied","Data":"5ba3eca10926a5a21a0d77adac5eccd592a1bba624a8872656821c5c6390f8ab"} Feb 26 16:06:10 crc kubenswrapper[4907]: I0226 16:06:10.994659 4907 generic.go:334] "Generic (PLEG): container finished" podID="e3b617db-d4f3-448a-b544-0cd38d51728b" containerID="58c078032ab12fb83c6311cbdcdc3b8ee96d3cfbe4c26e379f2a6f7f7387574a" exitCode=0 Feb 26 16:06:10 crc kubenswrapper[4907]: I0226 16:06:10.994825 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-0586-account-create-update-p76kt" event={"ID":"e3b617db-d4f3-448a-b544-0cd38d51728b","Type":"ContainerDied","Data":"58c078032ab12fb83c6311cbdcdc3b8ee96d3cfbe4c26e379f2a6f7f7387574a"} Feb 26 16:06:11 crc kubenswrapper[4907]: I0226 16:06:11.001537 4907 generic.go:334] "Generic (PLEG): container finished" podID="693a0231-a18d-4141-a46f-5911644101a4" containerID="be57f9925a937096f9d5f1073c8e249f92a0cdc53e6b154b95350aeea9a8f037" exitCode=0 Feb 26 16:06:11 crc kubenswrapper[4907]: I0226 16:06:11.001642 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-db-create-k8vd5" event={"ID":"693a0231-a18d-4141-a46f-5911644101a4","Type":"ContainerDied","Data":"be57f9925a937096f9d5f1073c8e249f92a0cdc53e6b154b95350aeea9a8f037"} Feb 26 16:06:11 crc kubenswrapper[4907]: I0226 16:06:11.028330 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"193a5b34-9a06-4c8d-b3bc-53bc62485387","Type":"ContainerStarted","Data":"7f24aaf3e6cc86ddd11f6f64cd31f562ac1234276cbc9ea3742dbc2bf8f0fc55"} Feb 26 16:06:11 crc kubenswrapper[4907]: I0226 16:06:11.035657 4907 
kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-external-api-0"] Feb 26 16:06:11 crc kubenswrapper[4907]: I0226 16:06:11.046957 4907 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-default-external-api-0"] Feb 26 16:06:11 crc kubenswrapper[4907]: I0226 16:06:11.064383 4907 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-default-external-api-0"] Feb 26 16:06:11 crc kubenswrapper[4907]: E0226 16:06:11.064992 4907 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="361750c4-3d82-437e-abc0-4e20302d20cf" containerName="glance-log" Feb 26 16:06:11 crc kubenswrapper[4907]: I0226 16:06:11.065011 4907 state_mem.go:107] "Deleted CPUSet assignment" podUID="361750c4-3d82-437e-abc0-4e20302d20cf" containerName="glance-log" Feb 26 16:06:11 crc kubenswrapper[4907]: E0226 16:06:11.065034 4907 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="361750c4-3d82-437e-abc0-4e20302d20cf" containerName="glance-httpd" Feb 26 16:06:11 crc kubenswrapper[4907]: I0226 16:06:11.065040 4907 state_mem.go:107] "Deleted CPUSet assignment" podUID="361750c4-3d82-437e-abc0-4e20302d20cf" containerName="glance-httpd" Feb 26 16:06:11 crc kubenswrapper[4907]: I0226 16:06:11.065272 4907 memory_manager.go:354] "RemoveStaleState removing state" podUID="361750c4-3d82-437e-abc0-4e20302d20cf" containerName="glance-httpd" Feb 26 16:06:11 crc kubenswrapper[4907]: I0226 16:06:11.065301 4907 memory_manager.go:354] "RemoveStaleState removing state" podUID="361750c4-3d82-437e-abc0-4e20302d20cf" containerName="glance-log" Feb 26 16:06:11 crc kubenswrapper[4907]: I0226 16:06:11.066309 4907 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-external-api-0" Feb 26 16:06:11 crc kubenswrapper[4907]: I0226 16:06:11.080813 4907 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-glance-default-public-svc" Feb 26 16:06:11 crc kubenswrapper[4907]: I0226 16:06:11.081932 4907 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-default-external-config-data" Feb 26 16:06:11 crc kubenswrapper[4907]: I0226 16:06:11.136976 4907 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-external-api-0"] Feb 26 16:06:11 crc kubenswrapper[4907]: I0226 16:06:11.183836 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/86460377-004c-4908-be6f-328241e8b5fb-config-data\") pod \"glance-default-external-api-0\" (UID: \"86460377-004c-4908-be6f-328241e8b5fb\") " pod="openstack/glance-default-external-api-0" Feb 26 16:06:11 crc kubenswrapper[4907]: I0226 16:06:11.184072 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/86460377-004c-4908-be6f-328241e8b5fb-logs\") pod \"glance-default-external-api-0\" (UID: \"86460377-004c-4908-be6f-328241e8b5fb\") " pod="openstack/glance-default-external-api-0" Feb 26 16:06:11 crc kubenswrapper[4907]: I0226 16:06:11.184153 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/86460377-004c-4908-be6f-328241e8b5fb-scripts\") pod \"glance-default-external-api-0\" (UID: \"86460377-004c-4908-be6f-328241e8b5fb\") " pod="openstack/glance-default-external-api-0" Feb 26 16:06:11 crc kubenswrapper[4907]: I0226 16:06:11.184218 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage04-crc\" (UniqueName: 
\"kubernetes.io/local-volume/local-storage04-crc\") pod \"glance-default-external-api-0\" (UID: \"86460377-004c-4908-be6f-328241e8b5fb\") " pod="openstack/glance-default-external-api-0" Feb 26 16:06:11 crc kubenswrapper[4907]: I0226 16:06:11.184287 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/86460377-004c-4908-be6f-328241e8b5fb-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"86460377-004c-4908-be6f-328241e8b5fb\") " pod="openstack/glance-default-external-api-0" Feb 26 16:06:11 crc kubenswrapper[4907]: I0226 16:06:11.184386 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4btb6\" (UniqueName: \"kubernetes.io/projected/86460377-004c-4908-be6f-328241e8b5fb-kube-api-access-4btb6\") pod \"glance-default-external-api-0\" (UID: \"86460377-004c-4908-be6f-328241e8b5fb\") " pod="openstack/glance-default-external-api-0" Feb 26 16:06:11 crc kubenswrapper[4907]: I0226 16:06:11.184520 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/86460377-004c-4908-be6f-328241e8b5fb-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"86460377-004c-4908-be6f-328241e8b5fb\") " pod="openstack/glance-default-external-api-0" Feb 26 16:06:11 crc kubenswrapper[4907]: I0226 16:06:11.184639 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/86460377-004c-4908-be6f-328241e8b5fb-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"86460377-004c-4908-be6f-328241e8b5fb\") " pod="openstack/glance-default-external-api-0" Feb 26 16:06:11 crc kubenswrapper[4907]: I0226 16:06:11.207751 4907 scope.go:117] "RemoveContainer" 
containerID="5b645c4cc55c466b58e79b5f1292c773cf90139e56d5e08d260e34f754fdac57" Feb 26 16:06:11 crc kubenswrapper[4907]: I0226 16:06:11.286621 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/86460377-004c-4908-be6f-328241e8b5fb-config-data\") pod \"glance-default-external-api-0\" (UID: \"86460377-004c-4908-be6f-328241e8b5fb\") " pod="openstack/glance-default-external-api-0" Feb 26 16:06:11 crc kubenswrapper[4907]: I0226 16:06:11.286657 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/86460377-004c-4908-be6f-328241e8b5fb-logs\") pod \"glance-default-external-api-0\" (UID: \"86460377-004c-4908-be6f-328241e8b5fb\") " pod="openstack/glance-default-external-api-0" Feb 26 16:06:11 crc kubenswrapper[4907]: I0226 16:06:11.286679 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage04-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage04-crc\") pod \"glance-default-external-api-0\" (UID: \"86460377-004c-4908-be6f-328241e8b5fb\") " pod="openstack/glance-default-external-api-0" Feb 26 16:06:11 crc kubenswrapper[4907]: I0226 16:06:11.286694 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/86460377-004c-4908-be6f-328241e8b5fb-scripts\") pod \"glance-default-external-api-0\" (UID: \"86460377-004c-4908-be6f-328241e8b5fb\") " pod="openstack/glance-default-external-api-0" Feb 26 16:06:11 crc kubenswrapper[4907]: I0226 16:06:11.286718 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/86460377-004c-4908-be6f-328241e8b5fb-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"86460377-004c-4908-be6f-328241e8b5fb\") " pod="openstack/glance-default-external-api-0" Feb 26 16:06:11 crc kubenswrapper[4907]: 
I0226 16:06:11.286789 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4btb6\" (UniqueName: \"kubernetes.io/projected/86460377-004c-4908-be6f-328241e8b5fb-kube-api-access-4btb6\") pod \"glance-default-external-api-0\" (UID: \"86460377-004c-4908-be6f-328241e8b5fb\") " pod="openstack/glance-default-external-api-0" Feb 26 16:06:11 crc kubenswrapper[4907]: I0226 16:06:11.286820 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/86460377-004c-4908-be6f-328241e8b5fb-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"86460377-004c-4908-be6f-328241e8b5fb\") " pod="openstack/glance-default-external-api-0" Feb 26 16:06:11 crc kubenswrapper[4907]: I0226 16:06:11.286854 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/86460377-004c-4908-be6f-328241e8b5fb-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"86460377-004c-4908-be6f-328241e8b5fb\") " pod="openstack/glance-default-external-api-0" Feb 26 16:06:11 crc kubenswrapper[4907]: I0226 16:06:11.288401 4907 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage04-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage04-crc\") pod \"glance-default-external-api-0\" (UID: \"86460377-004c-4908-be6f-328241e8b5fb\") device mount path \"/mnt/openstack/pv04\"" pod="openstack/glance-default-external-api-0" Feb 26 16:06:11 crc kubenswrapper[4907]: I0226 16:06:11.288502 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/86460377-004c-4908-be6f-328241e8b5fb-logs\") pod \"glance-default-external-api-0\" (UID: \"86460377-004c-4908-be6f-328241e8b5fb\") " pod="openstack/glance-default-external-api-0" Feb 26 16:06:11 crc kubenswrapper[4907]: I0226 16:06:11.288838 4907 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/86460377-004c-4908-be6f-328241e8b5fb-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"86460377-004c-4908-be6f-328241e8b5fb\") " pod="openstack/glance-default-external-api-0" Feb 26 16:06:11 crc kubenswrapper[4907]: I0226 16:06:11.315248 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/86460377-004c-4908-be6f-328241e8b5fb-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"86460377-004c-4908-be6f-328241e8b5fb\") " pod="openstack/glance-default-external-api-0" Feb 26 16:06:11 crc kubenswrapper[4907]: I0226 16:06:11.316323 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/86460377-004c-4908-be6f-328241e8b5fb-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"86460377-004c-4908-be6f-328241e8b5fb\") " pod="openstack/glance-default-external-api-0" Feb 26 16:06:11 crc kubenswrapper[4907]: I0226 16:06:11.322476 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/86460377-004c-4908-be6f-328241e8b5fb-scripts\") pod \"glance-default-external-api-0\" (UID: \"86460377-004c-4908-be6f-328241e8b5fb\") " pod="openstack/glance-default-external-api-0" Feb 26 16:06:11 crc kubenswrapper[4907]: I0226 16:06:11.327705 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4btb6\" (UniqueName: \"kubernetes.io/projected/86460377-004c-4908-be6f-328241e8b5fb-kube-api-access-4btb6\") pod \"glance-default-external-api-0\" (UID: \"86460377-004c-4908-be6f-328241e8b5fb\") " pod="openstack/glance-default-external-api-0" Feb 26 16:06:11 crc kubenswrapper[4907]: I0226 16:06:11.328651 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" 
(UniqueName: \"kubernetes.io/secret/86460377-004c-4908-be6f-328241e8b5fb-config-data\") pod \"glance-default-external-api-0\" (UID: \"86460377-004c-4908-be6f-328241e8b5fb\") " pod="openstack/glance-default-external-api-0" Feb 26 16:06:11 crc kubenswrapper[4907]: I0226 16:06:11.383047 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage04-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage04-crc\") pod \"glance-default-external-api-0\" (UID: \"86460377-004c-4908-be6f-328241e8b5fb\") " pod="openstack/glance-default-external-api-0" Feb 26 16:06:11 crc kubenswrapper[4907]: I0226 16:06:11.485074 4907 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-external-api-0" Feb 26 16:06:11 crc kubenswrapper[4907]: I0226 16:06:11.721522 4907 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-db-create-k8vd5" Feb 26 16:06:11 crc kubenswrapper[4907]: I0226 16:06:11.763379 4907 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-db-create-pn8lr" Feb 26 16:06:11 crc kubenswrapper[4907]: I0226 16:06:11.822807 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-k4v5p\" (UniqueName: \"kubernetes.io/projected/693a0231-a18d-4141-a46f-5911644101a4-kube-api-access-k4v5p\") pod \"693a0231-a18d-4141-a46f-5911644101a4\" (UID: \"693a0231-a18d-4141-a46f-5911644101a4\") " Feb 26 16:06:11 crc kubenswrapper[4907]: I0226 16:06:11.822977 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/835cf533-cc08-4ce6-b0e1-ed3a8a2a88ea-operator-scripts\") pod \"835cf533-cc08-4ce6-b0e1-ed3a8a2a88ea\" (UID: \"835cf533-cc08-4ce6-b0e1-ed3a8a2a88ea\") " Feb 26 16:06:11 crc kubenswrapper[4907]: I0226 16:06:11.823025 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rhvxd\" (UniqueName: \"kubernetes.io/projected/835cf533-cc08-4ce6-b0e1-ed3a8a2a88ea-kube-api-access-rhvxd\") pod \"835cf533-cc08-4ce6-b0e1-ed3a8a2a88ea\" (UID: \"835cf533-cc08-4ce6-b0e1-ed3a8a2a88ea\") " Feb 26 16:06:11 crc kubenswrapper[4907]: I0226 16:06:11.823113 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/693a0231-a18d-4141-a46f-5911644101a4-operator-scripts\") pod \"693a0231-a18d-4141-a46f-5911644101a4\" (UID: \"693a0231-a18d-4141-a46f-5911644101a4\") " Feb 26 16:06:11 crc kubenswrapper[4907]: I0226 16:06:11.828169 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/693a0231-a18d-4141-a46f-5911644101a4-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "693a0231-a18d-4141-a46f-5911644101a4" (UID: "693a0231-a18d-4141-a46f-5911644101a4"). InnerVolumeSpecName "operator-scripts". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 26 16:06:11 crc kubenswrapper[4907]: I0226 16:06:11.830224 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/835cf533-cc08-4ce6-b0e1-ed3a8a2a88ea-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "835cf533-cc08-4ce6-b0e1-ed3a8a2a88ea" (UID: "835cf533-cc08-4ce6-b0e1-ed3a8a2a88ea"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 26 16:06:11 crc kubenswrapper[4907]: I0226 16:06:11.840237 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/835cf533-cc08-4ce6-b0e1-ed3a8a2a88ea-kube-api-access-rhvxd" (OuterVolumeSpecName: "kube-api-access-rhvxd") pod "835cf533-cc08-4ce6-b0e1-ed3a8a2a88ea" (UID: "835cf533-cc08-4ce6-b0e1-ed3a8a2a88ea"). InnerVolumeSpecName "kube-api-access-rhvxd". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 26 16:06:11 crc kubenswrapper[4907]: I0226 16:06:11.856455 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/693a0231-a18d-4141-a46f-5911644101a4-kube-api-access-k4v5p" (OuterVolumeSpecName: "kube-api-access-k4v5p") pod "693a0231-a18d-4141-a46f-5911644101a4" (UID: "693a0231-a18d-4141-a46f-5911644101a4"). InnerVolumeSpecName "kube-api-access-k4v5p". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 26 16:06:11 crc kubenswrapper[4907]: I0226 16:06:11.924973 4907 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rhvxd\" (UniqueName: \"kubernetes.io/projected/835cf533-cc08-4ce6-b0e1-ed3a8a2a88ea-kube-api-access-rhvxd\") on node \"crc\" DevicePath \"\"" Feb 26 16:06:11 crc kubenswrapper[4907]: I0226 16:06:11.925004 4907 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/693a0231-a18d-4141-a46f-5911644101a4-operator-scripts\") on node \"crc\" DevicePath \"\"" Feb 26 16:06:11 crc kubenswrapper[4907]: I0226 16:06:11.925018 4907 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-k4v5p\" (UniqueName: \"kubernetes.io/projected/693a0231-a18d-4141-a46f-5911644101a4-kube-api-access-k4v5p\") on node \"crc\" DevicePath \"\"" Feb 26 16:06:11 crc kubenswrapper[4907]: I0226 16:06:11.925031 4907 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/835cf533-cc08-4ce6-b0e1-ed3a8a2a88ea-operator-scripts\") on node \"crc\" DevicePath \"\"" Feb 26 16:06:12 crc kubenswrapper[4907]: I0226 16:06:12.051569 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-db-create-pn8lr" event={"ID":"835cf533-cc08-4ce6-b0e1-ed3a8a2a88ea","Type":"ContainerDied","Data":"ae2d0786139cc72e23d97e43d7798975f34d873b01394fb2a49b17e3406a9c5b"} Feb 26 16:06:12 crc kubenswrapper[4907]: I0226 16:06:12.051880 4907 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="ae2d0786139cc72e23d97e43d7798975f34d873b01394fb2a49b17e3406a9c5b" Feb 26 16:06:12 crc kubenswrapper[4907]: I0226 16:06:12.051612 4907 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-db-create-pn8lr" Feb 26 16:06:12 crc kubenswrapper[4907]: I0226 16:06:12.061816 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-db-create-k8vd5" event={"ID":"693a0231-a18d-4141-a46f-5911644101a4","Type":"ContainerDied","Data":"0e2d12d1df8f623027fe578d582bed6cdea4d39e9f94c8eaa87c4a15945a0144"} Feb 26 16:06:12 crc kubenswrapper[4907]: I0226 16:06:12.061856 4907 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="0e2d12d1df8f623027fe578d582bed6cdea4d39e9f94c8eaa87c4a15945a0144" Feb 26 16:06:12 crc kubenswrapper[4907]: I0226 16:06:12.061932 4907 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-db-create-k8vd5" Feb 26 16:06:12 crc kubenswrapper[4907]: I0226 16:06:12.069060 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"193a5b34-9a06-4c8d-b3bc-53bc62485387","Type":"ContainerStarted","Data":"80d7720c7cf9fc04e5430239078e45cc58ad218b658369910c27b042ba383351"} Feb 26 16:06:12 crc kubenswrapper[4907]: I0226 16:06:12.070420 4907 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/cinder-api-0" Feb 26 16:06:12 crc kubenswrapper[4907]: I0226 16:06:12.097763 4907 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cinder-api-0" podStartSLOduration=5.097745479 podStartE2EDuration="5.097745479s" podCreationTimestamp="2026-02-26 16:06:07 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-26 16:06:12.096682213 +0000 UTC m=+1434.615244072" watchObservedRunningTime="2026-02-26 16:06:12.097745479 +0000 UTC m=+1434.616307328" Feb 26 16:06:12 crc kubenswrapper[4907]: I0226 16:06:12.169427 4907 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="361750c4-3d82-437e-abc0-4e20302d20cf" 
path="/var/lib/kubelet/pods/361750c4-3d82-437e-abc0-4e20302d20cf/volumes" Feb 26 16:06:12 crc kubenswrapper[4907]: I0226 16:06:12.415819 4907 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-external-api-0"] Feb 26 16:06:12 crc kubenswrapper[4907]: W0226 16:06:12.477901 4907 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod86460377_004c_4908_be6f_328241e8b5fb.slice/crio-eec53f96098c128f3af532f30e50756e9b11bc0ba9a29850201985a8e2c90e03 WatchSource:0}: Error finding container eec53f96098c128f3af532f30e50756e9b11bc0ba9a29850201985a8e2c90e03: Status 404 returned error can't find the container with id eec53f96098c128f3af532f30e50756e9b11bc0ba9a29850201985a8e2c90e03 Feb 26 16:06:12 crc kubenswrapper[4907]: I0226 16:06:12.855444 4907 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-2e48-account-create-update-8mvk9" Feb 26 16:06:12 crc kubenswrapper[4907]: I0226 16:06:12.969773 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-f2qts\" (UniqueName: \"kubernetes.io/projected/9022005c-a270-4ad2-b526-10bb125dfff3-kube-api-access-f2qts\") pod \"9022005c-a270-4ad2-b526-10bb125dfff3\" (UID: \"9022005c-a270-4ad2-b526-10bb125dfff3\") " Feb 26 16:06:12 crc kubenswrapper[4907]: I0226 16:06:12.969846 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/9022005c-a270-4ad2-b526-10bb125dfff3-operator-scripts\") pod \"9022005c-a270-4ad2-b526-10bb125dfff3\" (UID: \"9022005c-a270-4ad2-b526-10bb125dfff3\") " Feb 26 16:06:12 crc kubenswrapper[4907]: I0226 16:06:12.970568 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9022005c-a270-4ad2-b526-10bb125dfff3-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod 
"9022005c-a270-4ad2-b526-10bb125dfff3" (UID: "9022005c-a270-4ad2-b526-10bb125dfff3"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 26 16:06:12 crc kubenswrapper[4907]: I0226 16:06:12.975465 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9022005c-a270-4ad2-b526-10bb125dfff3-kube-api-access-f2qts" (OuterVolumeSpecName: "kube-api-access-f2qts") pod "9022005c-a270-4ad2-b526-10bb125dfff3" (UID: "9022005c-a270-4ad2-b526-10bb125dfff3"). InnerVolumeSpecName "kube-api-access-f2qts". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 26 16:06:13 crc kubenswrapper[4907]: I0226 16:06:13.019567 4907 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-870e-account-create-update-4v7m7" Feb 26 16:06:13 crc kubenswrapper[4907]: I0226 16:06:13.025583 4907 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-0586-account-create-update-p76kt" Feb 26 16:06:13 crc kubenswrapper[4907]: I0226 16:06:13.039666 4907 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-db-create-5hgql" Feb 26 16:06:13 crc kubenswrapper[4907]: I0226 16:06:13.071416 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-q9v9d\" (UniqueName: \"kubernetes.io/projected/1b61b535-465a-4786-bba7-c33c3c5672a7-kube-api-access-q9v9d\") pod \"1b61b535-465a-4786-bba7-c33c3c5672a7\" (UID: \"1b61b535-465a-4786-bba7-c33c3c5672a7\") " Feb 26 16:06:13 crc kubenswrapper[4907]: I0226 16:06:13.071475 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/1b61b535-465a-4786-bba7-c33c3c5672a7-operator-scripts\") pod \"1b61b535-465a-4786-bba7-c33c3c5672a7\" (UID: \"1b61b535-465a-4786-bba7-c33c3c5672a7\") " Feb 26 16:06:13 crc kubenswrapper[4907]: I0226 16:06:13.071579 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/e3b617db-d4f3-448a-b544-0cd38d51728b-operator-scripts\") pod \"e3b617db-d4f3-448a-b544-0cd38d51728b\" (UID: \"e3b617db-d4f3-448a-b544-0cd38d51728b\") " Feb 26 16:06:13 crc kubenswrapper[4907]: I0226 16:06:13.071835 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-5dbgl\" (UniqueName: \"kubernetes.io/projected/e3b617db-d4f3-448a-b544-0cd38d51728b-kube-api-access-5dbgl\") pod \"e3b617db-d4f3-448a-b544-0cd38d51728b\" (UID: \"e3b617db-d4f3-448a-b544-0cd38d51728b\") " Feb 26 16:06:13 crc kubenswrapper[4907]: I0226 16:06:13.072334 4907 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-f2qts\" (UniqueName: \"kubernetes.io/projected/9022005c-a270-4ad2-b526-10bb125dfff3-kube-api-access-f2qts\") on node \"crc\" DevicePath \"\"" Feb 26 16:06:13 crc kubenswrapper[4907]: I0226 16:06:13.072346 4907 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: 
\"kubernetes.io/configmap/9022005c-a270-4ad2-b526-10bb125dfff3-operator-scripts\") on node \"crc\" DevicePath \"\"" Feb 26 16:06:13 crc kubenswrapper[4907]: I0226 16:06:13.074754 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1b61b535-465a-4786-bba7-c33c3c5672a7-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "1b61b535-465a-4786-bba7-c33c3c5672a7" (UID: "1b61b535-465a-4786-bba7-c33c3c5672a7"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 26 16:06:13 crc kubenswrapper[4907]: I0226 16:06:13.076069 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e3b617db-d4f3-448a-b544-0cd38d51728b-kube-api-access-5dbgl" (OuterVolumeSpecName: "kube-api-access-5dbgl") pod "e3b617db-d4f3-448a-b544-0cd38d51728b" (UID: "e3b617db-d4f3-448a-b544-0cd38d51728b"). InnerVolumeSpecName "kube-api-access-5dbgl". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 26 16:06:13 crc kubenswrapper[4907]: I0226 16:06:13.076406 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e3b617db-d4f3-448a-b544-0cd38d51728b-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "e3b617db-d4f3-448a-b544-0cd38d51728b" (UID: "e3b617db-d4f3-448a-b544-0cd38d51728b"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 26 16:06:13 crc kubenswrapper[4907]: I0226 16:06:13.081190 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1b61b535-465a-4786-bba7-c33c3c5672a7-kube-api-access-q9v9d" (OuterVolumeSpecName: "kube-api-access-q9v9d") pod "1b61b535-465a-4786-bba7-c33c3c5672a7" (UID: "1b61b535-465a-4786-bba7-c33c3c5672a7"). InnerVolumeSpecName "kube-api-access-q9v9d". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 26 16:06:13 crc kubenswrapper[4907]: I0226 16:06:13.104417 4907 generic.go:334] "Generic (PLEG): container finished" podID="023cbc5f-da0e-4a5e-bc63-18385f44d228" containerID="8e672af27d5f3faf809d9b7e50e114800481d22f3fccd9f003a26b0133dccd51" exitCode=0 Feb 26 16:06:13 crc kubenswrapper[4907]: I0226 16:06:13.104518 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29535366-dqhh6" event={"ID":"023cbc5f-da0e-4a5e-bc63-18385f44d228","Type":"ContainerDied","Data":"8e672af27d5f3faf809d9b7e50e114800481d22f3fccd9f003a26b0133dccd51"} Feb 26 16:06:13 crc kubenswrapper[4907]: I0226 16:06:13.138181 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-db-create-5hgql" event={"ID":"e97b768b-99a2-4a89-b88e-e5ccbbf8d23f","Type":"ContainerDied","Data":"2e935ecc72d026859daeb8f7e5670aecc42b781dd354a091c07a8eb4beb9dde6"} Feb 26 16:06:13 crc kubenswrapper[4907]: I0226 16:06:13.138222 4907 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="2e935ecc72d026859daeb8f7e5670aecc42b781dd354a091c07a8eb4beb9dde6" Feb 26 16:06:13 crc kubenswrapper[4907]: I0226 16:06:13.138278 4907 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-db-create-5hgql" Feb 26 16:06:13 crc kubenswrapper[4907]: I0226 16:06:13.173158 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7ct9q\" (UniqueName: \"kubernetes.io/projected/e97b768b-99a2-4a89-b88e-e5ccbbf8d23f-kube-api-access-7ct9q\") pod \"e97b768b-99a2-4a89-b88e-e5ccbbf8d23f\" (UID: \"e97b768b-99a2-4a89-b88e-e5ccbbf8d23f\") " Feb 26 16:06:13 crc kubenswrapper[4907]: I0226 16:06:13.173252 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/e97b768b-99a2-4a89-b88e-e5ccbbf8d23f-operator-scripts\") pod \"e97b768b-99a2-4a89-b88e-e5ccbbf8d23f\" (UID: \"e97b768b-99a2-4a89-b88e-e5ccbbf8d23f\") " Feb 26 16:06:13 crc kubenswrapper[4907]: I0226 16:06:13.173814 4907 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-q9v9d\" (UniqueName: \"kubernetes.io/projected/1b61b535-465a-4786-bba7-c33c3c5672a7-kube-api-access-q9v9d\") on node \"crc\" DevicePath \"\"" Feb 26 16:06:13 crc kubenswrapper[4907]: I0226 16:06:13.173831 4907 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/1b61b535-465a-4786-bba7-c33c3c5672a7-operator-scripts\") on node \"crc\" DevicePath \"\"" Feb 26 16:06:13 crc kubenswrapper[4907]: I0226 16:06:13.173841 4907 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/e3b617db-d4f3-448a-b544-0cd38d51728b-operator-scripts\") on node \"crc\" DevicePath \"\"" Feb 26 16:06:13 crc kubenswrapper[4907]: I0226 16:06:13.173850 4907 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-5dbgl\" (UniqueName: \"kubernetes.io/projected/e3b617db-d4f3-448a-b544-0cd38d51728b-kube-api-access-5dbgl\") on node \"crc\" DevicePath \"\"" Feb 26 16:06:13 crc kubenswrapper[4907]: I0226 16:06:13.180284 4907 operation_generator.go:803] 
UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e97b768b-99a2-4a89-b88e-e5ccbbf8d23f-kube-api-access-7ct9q" (OuterVolumeSpecName: "kube-api-access-7ct9q") pod "e97b768b-99a2-4a89-b88e-e5ccbbf8d23f" (UID: "e97b768b-99a2-4a89-b88e-e5ccbbf8d23f"). InnerVolumeSpecName "kube-api-access-7ct9q". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 26 16:06:13 crc kubenswrapper[4907]: I0226 16:06:13.180580 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e97b768b-99a2-4a89-b88e-e5ccbbf8d23f-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "e97b768b-99a2-4a89-b88e-e5ccbbf8d23f" (UID: "e97b768b-99a2-4a89-b88e-e5ccbbf8d23f"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 26 16:06:13 crc kubenswrapper[4907]: I0226 16:06:13.196030 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-870e-account-create-update-4v7m7" event={"ID":"1b61b535-465a-4786-bba7-c33c3c5672a7","Type":"ContainerDied","Data":"5778777129ea7b014322215f9154f4d0b86180e7b12629aca02b613873f3b3e5"} Feb 26 16:06:13 crc kubenswrapper[4907]: I0226 16:06:13.196063 4907 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="5778777129ea7b014322215f9154f4d0b86180e7b12629aca02b613873f3b3e5" Feb 26 16:06:13 crc kubenswrapper[4907]: I0226 16:06:13.196099 4907 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-870e-account-create-update-4v7m7" Feb 26 16:06:13 crc kubenswrapper[4907]: I0226 16:06:13.207306 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"86460377-004c-4908-be6f-328241e8b5fb","Type":"ContainerStarted","Data":"eec53f96098c128f3af532f30e50756e9b11bc0ba9a29850201985a8e2c90e03"} Feb 26 16:06:13 crc kubenswrapper[4907]: I0226 16:06:13.230975 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-2e48-account-create-update-8mvk9" event={"ID":"9022005c-a270-4ad2-b526-10bb125dfff3","Type":"ContainerDied","Data":"539e079b2271a050016ea91859cf6254983d307241063136feeef7db1658b543"} Feb 26 16:06:13 crc kubenswrapper[4907]: I0226 16:06:13.231025 4907 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="539e079b2271a050016ea91859cf6254983d307241063136feeef7db1658b543" Feb 26 16:06:13 crc kubenswrapper[4907]: I0226 16:06:13.231112 4907 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-2e48-account-create-update-8mvk9" Feb 26 16:06:13 crc kubenswrapper[4907]: I0226 16:06:13.255877 4907 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-0586-account-create-update-p76kt" Feb 26 16:06:13 crc kubenswrapper[4907]: I0226 16:06:13.256454 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-0586-account-create-update-p76kt" event={"ID":"e3b617db-d4f3-448a-b544-0cd38d51728b","Type":"ContainerDied","Data":"9530360cbde14199aa44f419d8e6c61aacd345997005a7cb6750255f3d1ec4f0"} Feb 26 16:06:13 crc kubenswrapper[4907]: I0226 16:06:13.256500 4907 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="9530360cbde14199aa44f419d8e6c61aacd345997005a7cb6750255f3d1ec4f0" Feb 26 16:06:13 crc kubenswrapper[4907]: I0226 16:06:13.276468 4907 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7ct9q\" (UniqueName: \"kubernetes.io/projected/e97b768b-99a2-4a89-b88e-e5ccbbf8d23f-kube-api-access-7ct9q\") on node \"crc\" DevicePath \"\"" Feb 26 16:06:13 crc kubenswrapper[4907]: I0226 16:06:13.276514 4907 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/e97b768b-99a2-4a89-b88e-e5ccbbf8d23f-operator-scripts\") on node \"crc\" DevicePath \"\"" Feb 26 16:06:14 crc kubenswrapper[4907]: I0226 16:06:14.271858 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"86460377-004c-4908-be6f-328241e8b5fb","Type":"ContainerStarted","Data":"f9b7a11a4cafa43a53ea5982fdfc488466758c145ac271fc342a9970d6eda394"} Feb 26 16:06:14 crc kubenswrapper[4907]: I0226 16:06:14.805889 4907 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29535366-dqhh6" Feb 26 16:06:14 crc kubenswrapper[4907]: I0226 16:06:14.929759 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-bq4f5\" (UniqueName: \"kubernetes.io/projected/023cbc5f-da0e-4a5e-bc63-18385f44d228-kube-api-access-bq4f5\") pod \"023cbc5f-da0e-4a5e-bc63-18385f44d228\" (UID: \"023cbc5f-da0e-4a5e-bc63-18385f44d228\") " Feb 26 16:06:14 crc kubenswrapper[4907]: I0226 16:06:14.936441 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/023cbc5f-da0e-4a5e-bc63-18385f44d228-kube-api-access-bq4f5" (OuterVolumeSpecName: "kube-api-access-bq4f5") pod "023cbc5f-da0e-4a5e-bc63-18385f44d228" (UID: "023cbc5f-da0e-4a5e-bc63-18385f44d228"). InnerVolumeSpecName "kube-api-access-bq4f5". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 26 16:06:15 crc kubenswrapper[4907]: I0226 16:06:15.031768 4907 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-bq4f5\" (UniqueName: \"kubernetes.io/projected/023cbc5f-da0e-4a5e-bc63-18385f44d228-kube-api-access-bq4f5\") on node \"crc\" DevicePath \"\"" Feb 26 16:06:15 crc kubenswrapper[4907]: I0226 16:06:15.274099 4907 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29535360-drhn6"] Feb 26 16:06:15 crc kubenswrapper[4907]: I0226 16:06:15.282445 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"86460377-004c-4908-be6f-328241e8b5fb","Type":"ContainerStarted","Data":"2f401d720f9474ef730b550e606a6d30b7e53210b37330569d53586c4b8de43b"} Feb 26 16:06:15 crc kubenswrapper[4907]: I0226 16:06:15.283105 4907 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29535360-drhn6"] Feb 26 16:06:15 crc kubenswrapper[4907]: I0226 16:06:15.286555 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-infra/auto-csr-approver-29535366-dqhh6" event={"ID":"023cbc5f-da0e-4a5e-bc63-18385f44d228","Type":"ContainerDied","Data":"621aed023f155a1220fdb18cb5c4a8114e7c89cadc98c5aee915d6d6f602caac"} Feb 26 16:06:15 crc kubenswrapper[4907]: I0226 16:06:15.286762 4907 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="621aed023f155a1220fdb18cb5c4a8114e7c89cadc98c5aee915d6d6f602caac" Feb 26 16:06:15 crc kubenswrapper[4907]: I0226 16:06:15.286870 4907 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29535366-dqhh6" Feb 26 16:06:15 crc kubenswrapper[4907]: I0226 16:06:15.325838 4907 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-default-external-api-0" podStartSLOduration=4.325814909 podStartE2EDuration="4.325814909s" podCreationTimestamp="2026-02-26 16:06:11 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-26 16:06:15.317144677 +0000 UTC m=+1437.835706546" watchObservedRunningTime="2026-02-26 16:06:15.325814909 +0000 UTC m=+1437.844376758" Feb 26 16:06:16 crc kubenswrapper[4907]: I0226 16:06:16.137426 4907 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e9bb11b5-c26b-4877-bb98-a7a5a22654d6" path="/var/lib/kubelet/pods/e9bb11b5-c26b-4877-bb98-a7a5a22654d6/volumes" Feb 26 16:06:17 crc kubenswrapper[4907]: I0226 16:06:17.750022 4907 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/horizon-6fccfb8496-4tqhr" podUID="911d5df8-d8e2-4552-9c75-33c5ab72646b" containerName="horizon" probeResult="failure" output="Get \"https://10.217.0.153:8443/dashboard/auth/login/?next=/dashboard/\": dial tcp 10.217.0.153:8443: connect: connection refused" Feb 26 16:06:17 crc kubenswrapper[4907]: I0226 16:06:17.912667 4907 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell0-conductor-db-sync-4b5rh"] 
Feb 26 16:06:17 crc kubenswrapper[4907]: E0226 16:06:17.913134 4907 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9022005c-a270-4ad2-b526-10bb125dfff3" containerName="mariadb-account-create-update"
Feb 26 16:06:17 crc kubenswrapper[4907]: I0226 16:06:17.913158 4907 state_mem.go:107] "Deleted CPUSet assignment" podUID="9022005c-a270-4ad2-b526-10bb125dfff3" containerName="mariadb-account-create-update"
Feb 26 16:06:17 crc kubenswrapper[4907]: E0226 16:06:17.913170 4907 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1b61b535-465a-4786-bba7-c33c3c5672a7" containerName="mariadb-account-create-update"
Feb 26 16:06:17 crc kubenswrapper[4907]: I0226 16:06:17.913178 4907 state_mem.go:107] "Deleted CPUSet assignment" podUID="1b61b535-465a-4786-bba7-c33c3c5672a7" containerName="mariadb-account-create-update"
Feb 26 16:06:17 crc kubenswrapper[4907]: E0226 16:06:17.913190 4907 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="023cbc5f-da0e-4a5e-bc63-18385f44d228" containerName="oc"
Feb 26 16:06:17 crc kubenswrapper[4907]: I0226 16:06:17.913199 4907 state_mem.go:107] "Deleted CPUSet assignment" podUID="023cbc5f-da0e-4a5e-bc63-18385f44d228" containerName="oc"
Feb 26 16:06:17 crc kubenswrapper[4907]: E0226 16:06:17.913215 4907 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="835cf533-cc08-4ce6-b0e1-ed3a8a2a88ea" containerName="mariadb-database-create"
Feb 26 16:06:17 crc kubenswrapper[4907]: I0226 16:06:17.913223 4907 state_mem.go:107] "Deleted CPUSet assignment" podUID="835cf533-cc08-4ce6-b0e1-ed3a8a2a88ea" containerName="mariadb-database-create"
Feb 26 16:06:17 crc kubenswrapper[4907]: E0226 16:06:17.913242 4907 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e97b768b-99a2-4a89-b88e-e5ccbbf8d23f" containerName="mariadb-database-create"
Feb 26 16:06:17 crc kubenswrapper[4907]: I0226 16:06:17.913249 4907 state_mem.go:107] "Deleted CPUSet assignment" podUID="e97b768b-99a2-4a89-b88e-e5ccbbf8d23f" containerName="mariadb-database-create"
Feb 26 16:06:17 crc kubenswrapper[4907]: E0226 16:06:17.913270 4907 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="693a0231-a18d-4141-a46f-5911644101a4" containerName="mariadb-database-create"
Feb 26 16:06:17 crc kubenswrapper[4907]: I0226 16:06:17.913279 4907 state_mem.go:107] "Deleted CPUSet assignment" podUID="693a0231-a18d-4141-a46f-5911644101a4" containerName="mariadb-database-create"
Feb 26 16:06:17 crc kubenswrapper[4907]: E0226 16:06:17.913302 4907 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e3b617db-d4f3-448a-b544-0cd38d51728b" containerName="mariadb-account-create-update"
Feb 26 16:06:17 crc kubenswrapper[4907]: I0226 16:06:17.913310 4907 state_mem.go:107] "Deleted CPUSet assignment" podUID="e3b617db-d4f3-448a-b544-0cd38d51728b" containerName="mariadb-account-create-update"
Feb 26 16:06:17 crc kubenswrapper[4907]: I0226 16:06:17.913515 4907 memory_manager.go:354] "RemoveStaleState removing state" podUID="e97b768b-99a2-4a89-b88e-e5ccbbf8d23f" containerName="mariadb-database-create"
Feb 26 16:06:17 crc kubenswrapper[4907]: I0226 16:06:17.913537 4907 memory_manager.go:354] "RemoveStaleState removing state" podUID="e3b617db-d4f3-448a-b544-0cd38d51728b" containerName="mariadb-account-create-update"
Feb 26 16:06:17 crc kubenswrapper[4907]: I0226 16:06:17.913556 4907 memory_manager.go:354] "RemoveStaleState removing state" podUID="835cf533-cc08-4ce6-b0e1-ed3a8a2a88ea" containerName="mariadb-database-create"
Feb 26 16:06:17 crc kubenswrapper[4907]: I0226 16:06:17.913568 4907 memory_manager.go:354] "RemoveStaleState removing state" podUID="023cbc5f-da0e-4a5e-bc63-18385f44d228" containerName="oc"
Feb 26 16:06:17 crc kubenswrapper[4907]: I0226 16:06:17.913607 4907 memory_manager.go:354] "RemoveStaleState removing state" podUID="1b61b535-465a-4786-bba7-c33c3c5672a7" containerName="mariadb-account-create-update"
Feb 26 16:06:17 crc kubenswrapper[4907]: I0226 16:06:17.913621 4907 memory_manager.go:354] "RemoveStaleState removing state" podUID="9022005c-a270-4ad2-b526-10bb125dfff3" containerName="mariadb-account-create-update"
Feb 26 16:06:17 crc kubenswrapper[4907]: I0226 16:06:17.913634 4907 memory_manager.go:354] "RemoveStaleState removing state" podUID="693a0231-a18d-4141-a46f-5911644101a4" containerName="mariadb-database-create"
Feb 26 16:06:17 crc kubenswrapper[4907]: I0226 16:06:17.914356 4907 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-conductor-db-sync-4b5rh"
Feb 26 16:06:17 crc kubenswrapper[4907]: I0226 16:06:17.919089 4907 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell0-conductor-config-data"
Feb 26 16:06:17 crc kubenswrapper[4907]: I0226 16:06:17.919176 4907 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-nova-dockercfg-bhfw2"
Feb 26 16:06:17 crc kubenswrapper[4907]: I0226 16:06:17.919228 4907 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell0-conductor-scripts"
Feb 26 16:06:17 crc kubenswrapper[4907]: I0226 16:06:17.936961 4907 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-conductor-db-sync-4b5rh"]
Feb 26 16:06:17 crc kubenswrapper[4907]: I0226 16:06:17.982448 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4f76f68d-7dee-4f14-9fb1-e943db5b0533-config-data\") pod \"nova-cell0-conductor-db-sync-4b5rh\" (UID: \"4f76f68d-7dee-4f14-9fb1-e943db5b0533\") " pod="openstack/nova-cell0-conductor-db-sync-4b5rh"
Feb 26 16:06:17 crc kubenswrapper[4907]: I0226 16:06:17.982506 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4f76f68d-7dee-4f14-9fb1-e943db5b0533-combined-ca-bundle\") pod \"nova-cell0-conductor-db-sync-4b5rh\" (UID: \"4f76f68d-7dee-4f14-9fb1-e943db5b0533\") " pod="openstack/nova-cell0-conductor-db-sync-4b5rh"
Feb 26 16:06:17 crc kubenswrapper[4907]: I0226 16:06:17.982562 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/4f76f68d-7dee-4f14-9fb1-e943db5b0533-scripts\") pod \"nova-cell0-conductor-db-sync-4b5rh\" (UID: \"4f76f68d-7dee-4f14-9fb1-e943db5b0533\") " pod="openstack/nova-cell0-conductor-db-sync-4b5rh"
Feb 26 16:06:17 crc kubenswrapper[4907]: I0226 16:06:17.982636 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qj92t\" (UniqueName: \"kubernetes.io/projected/4f76f68d-7dee-4f14-9fb1-e943db5b0533-kube-api-access-qj92t\") pod \"nova-cell0-conductor-db-sync-4b5rh\" (UID: \"4f76f68d-7dee-4f14-9fb1-e943db5b0533\") " pod="openstack/nova-cell0-conductor-db-sync-4b5rh"
Feb 26 16:06:18 crc kubenswrapper[4907]: I0226 16:06:18.083699 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4f76f68d-7dee-4f14-9fb1-e943db5b0533-config-data\") pod \"nova-cell0-conductor-db-sync-4b5rh\" (UID: \"4f76f68d-7dee-4f14-9fb1-e943db5b0533\") " pod="openstack/nova-cell0-conductor-db-sync-4b5rh"
Feb 26 16:06:18 crc kubenswrapper[4907]: I0226 16:06:18.083750 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4f76f68d-7dee-4f14-9fb1-e943db5b0533-combined-ca-bundle\") pod \"nova-cell0-conductor-db-sync-4b5rh\" (UID: \"4f76f68d-7dee-4f14-9fb1-e943db5b0533\") " pod="openstack/nova-cell0-conductor-db-sync-4b5rh"
Feb 26 16:06:18 crc kubenswrapper[4907]: I0226 16:06:18.083803 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/4f76f68d-7dee-4f14-9fb1-e943db5b0533-scripts\") pod \"nova-cell0-conductor-db-sync-4b5rh\" (UID: \"4f76f68d-7dee-4f14-9fb1-e943db5b0533\") " pod="openstack/nova-cell0-conductor-db-sync-4b5rh"
Feb 26 16:06:18 crc kubenswrapper[4907]: I0226 16:06:18.083861 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qj92t\" (UniqueName: \"kubernetes.io/projected/4f76f68d-7dee-4f14-9fb1-e943db5b0533-kube-api-access-qj92t\") pod \"nova-cell0-conductor-db-sync-4b5rh\" (UID: \"4f76f68d-7dee-4f14-9fb1-e943db5b0533\") " pod="openstack/nova-cell0-conductor-db-sync-4b5rh"
Feb 26 16:06:18 crc kubenswrapper[4907]: I0226 16:06:18.086720 4907 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell0-conductor-scripts"
Feb 26 16:06:18 crc kubenswrapper[4907]: I0226 16:06:18.086955 4907 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell0-conductor-config-data"
Feb 26 16:06:18 crc kubenswrapper[4907]: I0226 16:06:18.093782 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4f76f68d-7dee-4f14-9fb1-e943db5b0533-combined-ca-bundle\") pod \"nova-cell0-conductor-db-sync-4b5rh\" (UID: \"4f76f68d-7dee-4f14-9fb1-e943db5b0533\") " pod="openstack/nova-cell0-conductor-db-sync-4b5rh"
Feb 26 16:06:18 crc kubenswrapper[4907]: I0226 16:06:18.099203 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4f76f68d-7dee-4f14-9fb1-e943db5b0533-config-data\") pod \"nova-cell0-conductor-db-sync-4b5rh\" (UID: \"4f76f68d-7dee-4f14-9fb1-e943db5b0533\") " pod="openstack/nova-cell0-conductor-db-sync-4b5rh"
Feb 26 16:06:18 crc kubenswrapper[4907]: I0226 16:06:18.104024 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/4f76f68d-7dee-4f14-9fb1-e943db5b0533-scripts\") pod \"nova-cell0-conductor-db-sync-4b5rh\" (UID: \"4f76f68d-7dee-4f14-9fb1-e943db5b0533\") " pod="openstack/nova-cell0-conductor-db-sync-4b5rh"
Feb 26 16:06:18 crc kubenswrapper[4907]: I0226 16:06:18.126198 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qj92t\" (UniqueName: \"kubernetes.io/projected/4f76f68d-7dee-4f14-9fb1-e943db5b0533-kube-api-access-qj92t\") pod \"nova-cell0-conductor-db-sync-4b5rh\" (UID: \"4f76f68d-7dee-4f14-9fb1-e943db5b0533\") " pod="openstack/nova-cell0-conductor-db-sync-4b5rh"
Feb 26 16:06:18 crc kubenswrapper[4907]: I0226 16:06:18.169943 4907 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/horizon-76d88967b8-wmzcw" podUID="b35f87c4-e535-4901-8814-0b321b201158" containerName="horizon" probeResult="failure" output="Get \"https://10.217.0.154:8443/dashboard/auth/login/?next=/dashboard/\": dial tcp 10.217.0.154:8443: connect: connection refused"
Feb 26 16:06:18 crc kubenswrapper[4907]: I0226 16:06:18.235448 4907 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-nova-dockercfg-bhfw2"
Feb 26 16:06:18 crc kubenswrapper[4907]: I0226 16:06:18.244425 4907 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-conductor-db-sync-4b5rh"
Feb 26 16:06:18 crc kubenswrapper[4907]: I0226 16:06:18.835268 4907 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-conductor-db-sync-4b5rh"]
Feb 26 16:06:18 crc kubenswrapper[4907]: I0226 16:06:18.970381 4907 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-internal-api-0"]
Feb 26 16:06:18 crc kubenswrapper[4907]: I0226 16:06:18.974915 4907 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-internal-api-0" podUID="2b1253ca-7753-4742-afc4-e786e4dcc6e0" containerName="glance-log" containerID="cri-o://2b1e2a238cf4f1c016f462472860a67cb5f341cf5f7c9b5a6d7a1cc54338beaa" gracePeriod=30
Feb 26 16:06:18 crc kubenswrapper[4907]: I0226 16:06:18.975063 4907 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-internal-api-0" podUID="2b1253ca-7753-4742-afc4-e786e4dcc6e0" containerName="glance-httpd" containerID="cri-o://61eed4a9713c166c961dabc5fe450a1963bba8e9756ddc8b568dd385b092c384" gracePeriod=30
Feb 26 16:06:19 crc kubenswrapper[4907]: I0226 16:06:19.443783 4907 generic.go:334] "Generic (PLEG): container finished" podID="2b1253ca-7753-4742-afc4-e786e4dcc6e0" containerID="2b1e2a238cf4f1c016f462472860a67cb5f341cf5f7c9b5a6d7a1cc54338beaa" exitCode=143
Feb 26 16:06:19 crc kubenswrapper[4907]: I0226 16:06:19.443901 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"2b1253ca-7753-4742-afc4-e786e4dcc6e0","Type":"ContainerDied","Data":"2b1e2a238cf4f1c016f462472860a67cb5f341cf5f7c9b5a6d7a1cc54338beaa"}
Feb 26 16:06:19 crc kubenswrapper[4907]: I0226 16:06:19.453024 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-db-sync-4b5rh" event={"ID":"4f76f68d-7dee-4f14-9fb1-e943db5b0533","Type":"ContainerStarted","Data":"353545cfed0d6a4d3617d794343ff084cbc9348562ba93fc7fa92ec250807d4b"}
Feb 26 16:06:19 crc kubenswrapper[4907]: I0226 16:06:19.609703 4907 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/ceilometer-0" podUID="a114e8dd-3cb1-4b1a-8f49-48b99c39da3b" containerName="proxy-httpd" probeResult="failure" output="HTTP probe failed with statuscode: 503"
Feb 26 16:06:19 crc kubenswrapper[4907]: I0226 16:06:19.837827 4907 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/neutron-6db49c6bf7-w2792"
Feb 26 16:06:19 crc kubenswrapper[4907]: I0226 16:06:19.949988 4907 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/neutron-8656797c97-kv5w2"]
Feb 26 16:06:19 crc kubenswrapper[4907]: I0226 16:06:19.955951 4907 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/neutron-8656797c97-kv5w2" podUID="5a680379-891d-45b5-bfac-04c44ab3e5d4" containerName="neutron-api" containerID="cri-o://2530bd04fa0ba0bf4cd0cfd6c481e904801ab981339361f5fcbb6a22f455fa21" gracePeriod=30
Feb 26 16:06:19 crc kubenswrapper[4907]: I0226 16:06:19.956434 4907 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/neutron-8656797c97-kv5w2" podUID="5a680379-891d-45b5-bfac-04c44ab3e5d4" containerName="neutron-httpd" containerID="cri-o://5e0e4bf5b7cdb36b844161a8e7adefce2272f720c39f3b8506c61b906f2a736d" gracePeriod=30
Feb 26 16:06:20 crc kubenswrapper[4907]: I0226 16:06:20.470559 4907 generic.go:334] "Generic (PLEG): container finished" podID="5a680379-891d-45b5-bfac-04c44ab3e5d4" containerID="5e0e4bf5b7cdb36b844161a8e7adefce2272f720c39f3b8506c61b906f2a736d" exitCode=0
Feb 26 16:06:20 crc kubenswrapper[4907]: I0226 16:06:20.470618 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-8656797c97-kv5w2" event={"ID":"5a680379-891d-45b5-bfac-04c44ab3e5d4","Type":"ContainerDied","Data":"5e0e4bf5b7cdb36b844161a8e7adefce2272f720c39f3b8506c61b906f2a736d"}
Feb 26 16:06:21 crc kubenswrapper[4907]: I0226 16:06:21.486029 4907 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-external-api-0"
Feb 26 16:06:21 crc kubenswrapper[4907]: I0226 16:06:21.486322 4907 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-external-api-0"
Feb 26 16:06:21 crc kubenswrapper[4907]: I0226 16:06:21.629500 4907 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-external-api-0"
Feb 26 16:06:21 crc kubenswrapper[4907]: I0226 16:06:21.676921 4907 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-external-api-0"
Feb 26 16:06:22 crc kubenswrapper[4907]: I0226 16:06:22.491006 4907 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/cinder-api-0"
Feb 26 16:06:22 crc kubenswrapper[4907]: I0226 16:06:22.502883 4907 generic.go:334] "Generic (PLEG): container finished" podID="2b1253ca-7753-4742-afc4-e786e4dcc6e0" containerID="61eed4a9713c166c961dabc5fe450a1963bba8e9756ddc8b568dd385b092c384" exitCode=0
Feb 26 16:06:22 crc kubenswrapper[4907]: I0226 16:06:22.502958 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"2b1253ca-7753-4742-afc4-e786e4dcc6e0","Type":"ContainerDied","Data":"61eed4a9713c166c961dabc5fe450a1963bba8e9756ddc8b568dd385b092c384"}
Feb 26 16:06:22 crc kubenswrapper[4907]: I0226 16:06:22.503449 4907 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-external-api-0"
Feb 26 16:06:22 crc kubenswrapper[4907]: I0226 16:06:22.503507 4907 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-external-api-0"
Feb 26 16:06:23 crc kubenswrapper[4907]: I0226 16:06:23.459666 4907 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-internal-api-0"
Feb 26 16:06:23 crc kubenswrapper[4907]: I0226 16:06:23.508993 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/2b1253ca-7753-4742-afc4-e786e4dcc6e0-scripts\") pod \"2b1253ca-7753-4742-afc4-e786e4dcc6e0\" (UID: \"2b1253ca-7753-4742-afc4-e786e4dcc6e0\") "
Feb 26 16:06:23 crc kubenswrapper[4907]: I0226 16:06:23.509180 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/2b1253ca-7753-4742-afc4-e786e4dcc6e0-logs\") pod \"2b1253ca-7753-4742-afc4-e786e4dcc6e0\" (UID: \"2b1253ca-7753-4742-afc4-e786e4dcc6e0\") "
Feb 26 16:06:23 crc kubenswrapper[4907]: I0226 16:06:23.509210 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"glance\" (UniqueName: \"kubernetes.io/local-volume/local-storage07-crc\") pod \"2b1253ca-7753-4742-afc4-e786e4dcc6e0\" (UID: \"2b1253ca-7753-4742-afc4-e786e4dcc6e0\") "
Feb 26 16:06:23 crc kubenswrapper[4907]: I0226 16:06:23.509236 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2b1253ca-7753-4742-afc4-e786e4dcc6e0-config-data\") pod \"2b1253ca-7753-4742-afc4-e786e4dcc6e0\" (UID: \"2b1253ca-7753-4742-afc4-e786e4dcc6e0\") "
Feb 26 16:06:23 crc kubenswrapper[4907]: I0226 16:06:23.509278 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2b1253ca-7753-4742-afc4-e786e4dcc6e0-combined-ca-bundle\") pod \"2b1253ca-7753-4742-afc4-e786e4dcc6e0\" (UID: \"2b1253ca-7753-4742-afc4-e786e4dcc6e0\") "
Feb 26 16:06:23 crc kubenswrapper[4907]: I0226 16:06:23.509307 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/2b1253ca-7753-4742-afc4-e786e4dcc6e0-internal-tls-certs\") pod \"2b1253ca-7753-4742-afc4-e786e4dcc6e0\" (UID: \"2b1253ca-7753-4742-afc4-e786e4dcc6e0\") "
Feb 26 16:06:23 crc kubenswrapper[4907]: I0226 16:06:23.509331 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/2b1253ca-7753-4742-afc4-e786e4dcc6e0-httpd-run\") pod \"2b1253ca-7753-4742-afc4-e786e4dcc6e0\" (UID: \"2b1253ca-7753-4742-afc4-e786e4dcc6e0\") "
Feb 26 16:06:23 crc kubenswrapper[4907]: I0226 16:06:23.509350 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4zhrb\" (UniqueName: \"kubernetes.io/projected/2b1253ca-7753-4742-afc4-e786e4dcc6e0-kube-api-access-4zhrb\") pod \"2b1253ca-7753-4742-afc4-e786e4dcc6e0\" (UID: \"2b1253ca-7753-4742-afc4-e786e4dcc6e0\") "
Feb 26 16:06:23 crc kubenswrapper[4907]: I0226 16:06:23.509554 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/2b1253ca-7753-4742-afc4-e786e4dcc6e0-logs" (OuterVolumeSpecName: "logs") pod "2b1253ca-7753-4742-afc4-e786e4dcc6e0" (UID: "2b1253ca-7753-4742-afc4-e786e4dcc6e0"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Feb 26 16:06:23 crc kubenswrapper[4907]: I0226 16:06:23.510095 4907 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/2b1253ca-7753-4742-afc4-e786e4dcc6e0-logs\") on node \"crc\" DevicePath \"\""
Feb 26 16:06:23 crc kubenswrapper[4907]: I0226 16:06:23.512581 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/2b1253ca-7753-4742-afc4-e786e4dcc6e0-httpd-run" (OuterVolumeSpecName: "httpd-run") pod "2b1253ca-7753-4742-afc4-e786e4dcc6e0" (UID: "2b1253ca-7753-4742-afc4-e786e4dcc6e0"). InnerVolumeSpecName "httpd-run". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Feb 26 16:06:23 crc kubenswrapper[4907]: I0226 16:06:23.534693 4907 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-internal-api-0"
Feb 26 16:06:23 crc kubenswrapper[4907]: I0226 16:06:23.535096 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"2b1253ca-7753-4742-afc4-e786e4dcc6e0","Type":"ContainerDied","Data":"28df9e7d43daead14430fe810b52816e3c44f0f7b378b1dd2c357bd86dd166f1"}
Feb 26 16:06:23 crc kubenswrapper[4907]: I0226 16:06:23.535136 4907 scope.go:117] "RemoveContainer" containerID="61eed4a9713c166c961dabc5fe450a1963bba8e9756ddc8b568dd385b092c384"
Feb 26 16:06:23 crc kubenswrapper[4907]: I0226 16:06:23.552260 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2b1253ca-7753-4742-afc4-e786e4dcc6e0-kube-api-access-4zhrb" (OuterVolumeSpecName: "kube-api-access-4zhrb") pod "2b1253ca-7753-4742-afc4-e786e4dcc6e0" (UID: "2b1253ca-7753-4742-afc4-e786e4dcc6e0"). InnerVolumeSpecName "kube-api-access-4zhrb". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 26 16:06:23 crc kubenswrapper[4907]: I0226 16:06:23.552543 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2b1253ca-7753-4742-afc4-e786e4dcc6e0-scripts" (OuterVolumeSpecName: "scripts") pod "2b1253ca-7753-4742-afc4-e786e4dcc6e0" (UID: "2b1253ca-7753-4742-afc4-e786e4dcc6e0"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 26 16:06:23 crc kubenswrapper[4907]: I0226 16:06:23.557779 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage07-crc" (OuterVolumeSpecName: "glance") pod "2b1253ca-7753-4742-afc4-e786e4dcc6e0" (UID: "2b1253ca-7753-4742-afc4-e786e4dcc6e0"). InnerVolumeSpecName "local-storage07-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue ""
Feb 26 16:06:23 crc kubenswrapper[4907]: I0226 16:06:23.619758 4907 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/2b1253ca-7753-4742-afc4-e786e4dcc6e0-scripts\") on node \"crc\" DevicePath \"\""
Feb 26 16:06:23 crc kubenswrapper[4907]: I0226 16:06:23.620007 4907 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage07-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage07-crc\") on node \"crc\" "
Feb 26 16:06:23 crc kubenswrapper[4907]: I0226 16:06:23.620071 4907 reconciler_common.go:293] "Volume detached for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/2b1253ca-7753-4742-afc4-e786e4dcc6e0-httpd-run\") on node \"crc\" DevicePath \"\""
Feb 26 16:06:23 crc kubenswrapper[4907]: I0226 16:06:23.620135 4907 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4zhrb\" (UniqueName: \"kubernetes.io/projected/2b1253ca-7753-4742-afc4-e786e4dcc6e0-kube-api-access-4zhrb\") on node \"crc\" DevicePath \"\""
Feb 26 16:06:23 crc kubenswrapper[4907]: I0226 16:06:23.653745 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2b1253ca-7753-4742-afc4-e786e4dcc6e0-internal-tls-certs" (OuterVolumeSpecName: "internal-tls-certs") pod "2b1253ca-7753-4742-afc4-e786e4dcc6e0" (UID: "2b1253ca-7753-4742-afc4-e786e4dcc6e0"). InnerVolumeSpecName "internal-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 26 16:06:23 crc kubenswrapper[4907]: I0226 16:06:23.689123 4907 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage07-crc" (UniqueName: "kubernetes.io/local-volume/local-storage07-crc") on node "crc"
Feb 26 16:06:23 crc kubenswrapper[4907]: I0226 16:06:23.691042 4907 scope.go:117] "RemoveContainer" containerID="2b1e2a238cf4f1c016f462472860a67cb5f341cf5f7c9b5a6d7a1cc54338beaa"
Feb 26 16:06:23 crc kubenswrapper[4907]: I0226 16:06:23.697847 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2b1253ca-7753-4742-afc4-e786e4dcc6e0-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "2b1253ca-7753-4742-afc4-e786e4dcc6e0" (UID: "2b1253ca-7753-4742-afc4-e786e4dcc6e0"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 26 16:06:23 crc kubenswrapper[4907]: I0226 16:06:23.725879 4907 reconciler_common.go:293] "Volume detached for volume \"local-storage07-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage07-crc\") on node \"crc\" DevicePath \"\""
Feb 26 16:06:23 crc kubenswrapper[4907]: I0226 16:06:23.725938 4907 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2b1253ca-7753-4742-afc4-e786e4dcc6e0-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Feb 26 16:06:23 crc kubenswrapper[4907]: I0226 16:06:23.725948 4907 reconciler_common.go:293] "Volume detached for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/2b1253ca-7753-4742-afc4-e786e4dcc6e0-internal-tls-certs\") on node \"crc\" DevicePath \"\""
Feb 26 16:06:23 crc kubenswrapper[4907]: I0226 16:06:23.750921 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2b1253ca-7753-4742-afc4-e786e4dcc6e0-config-data" (OuterVolumeSpecName: "config-data") pod "2b1253ca-7753-4742-afc4-e786e4dcc6e0" (UID: "2b1253ca-7753-4742-afc4-e786e4dcc6e0"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 26 16:06:23 crc kubenswrapper[4907]: I0226 16:06:23.827978 4907 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2b1253ca-7753-4742-afc4-e786e4dcc6e0-config-data\") on node \"crc\" DevicePath \"\""
Feb 26 16:06:23 crc kubenswrapper[4907]: I0226 16:06:23.901316 4907 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-internal-api-0"]
Feb 26 16:06:23 crc kubenswrapper[4907]: I0226 16:06:23.910705 4907 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-default-internal-api-0"]
Feb 26 16:06:23 crc kubenswrapper[4907]: I0226 16:06:23.921532 4907 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-default-internal-api-0"]
Feb 26 16:06:23 crc kubenswrapper[4907]: E0226 16:06:23.921995 4907 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2b1253ca-7753-4742-afc4-e786e4dcc6e0" containerName="glance-log"
Feb 26 16:06:23 crc kubenswrapper[4907]: I0226 16:06:23.922016 4907 state_mem.go:107] "Deleted CPUSet assignment" podUID="2b1253ca-7753-4742-afc4-e786e4dcc6e0" containerName="glance-log"
Feb 26 16:06:23 crc kubenswrapper[4907]: E0226 16:06:23.922038 4907 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2b1253ca-7753-4742-afc4-e786e4dcc6e0" containerName="glance-httpd"
Feb 26 16:06:23 crc kubenswrapper[4907]: I0226 16:06:23.922046 4907 state_mem.go:107] "Deleted CPUSet assignment" podUID="2b1253ca-7753-4742-afc4-e786e4dcc6e0" containerName="glance-httpd"
Feb 26 16:06:23 crc kubenswrapper[4907]: I0226 16:06:23.922338 4907 memory_manager.go:354] "RemoveStaleState removing state" podUID="2b1253ca-7753-4742-afc4-e786e4dcc6e0" containerName="glance-log"
Feb 26 16:06:23 crc kubenswrapper[4907]: I0226 16:06:23.922364 4907 memory_manager.go:354] "RemoveStaleState removing state" podUID="2b1253ca-7753-4742-afc4-e786e4dcc6e0" containerName="glance-httpd"
Feb 26 16:06:23 crc kubenswrapper[4907]: I0226 16:06:23.923870 4907 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-internal-api-0"
Feb 26 16:06:23 crc kubenswrapper[4907]: I0226 16:06:23.935558 4907 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-glance-default-internal-svc"
Feb 26 16:06:23 crc kubenswrapper[4907]: I0226 16:06:23.936133 4907 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-default-internal-config-data"
Feb 26 16:06:23 crc kubenswrapper[4907]: I0226 16:06:23.948308 4907 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-internal-api-0"]
Feb 26 16:06:24 crc kubenswrapper[4907]: I0226 16:06:24.038698 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/058a6068-cb3c-42f2-bbe5-7b4dbc71d194-config-data\") pod \"glance-default-internal-api-0\" (UID: \"058a6068-cb3c-42f2-bbe5-7b4dbc71d194\") " pod="openstack/glance-default-internal-api-0"
Feb 26 16:06:24 crc kubenswrapper[4907]: I0226 16:06:24.038789 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/058a6068-cb3c-42f2-bbe5-7b4dbc71d194-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"058a6068-cb3c-42f2-bbe5-7b4dbc71d194\") " pod="openstack/glance-default-internal-api-0"
Feb 26 16:06:24 crc kubenswrapper[4907]: I0226 16:06:24.038873 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/058a6068-cb3c-42f2-bbe5-7b4dbc71d194-logs\") pod \"glance-default-internal-api-0\" (UID: \"058a6068-cb3c-42f2-bbe5-7b4dbc71d194\") " pod="openstack/glance-default-internal-api-0"
Feb 26 16:06:24 crc kubenswrapper[4907]: I0226 16:06:24.038920 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/058a6068-cb3c-42f2-bbe5-7b4dbc71d194-scripts\") pod \"glance-default-internal-api-0\" (UID: \"058a6068-cb3c-42f2-bbe5-7b4dbc71d194\") " pod="openstack/glance-default-internal-api-0"
Feb 26 16:06:24 crc kubenswrapper[4907]: I0226 16:06:24.038975 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-shhhm\" (UniqueName: \"kubernetes.io/projected/058a6068-cb3c-42f2-bbe5-7b4dbc71d194-kube-api-access-shhhm\") pod \"glance-default-internal-api-0\" (UID: \"058a6068-cb3c-42f2-bbe5-7b4dbc71d194\") " pod="openstack/glance-default-internal-api-0"
Feb 26 16:06:24 crc kubenswrapper[4907]: I0226 16:06:24.039047 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/058a6068-cb3c-42f2-bbe5-7b4dbc71d194-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"058a6068-cb3c-42f2-bbe5-7b4dbc71d194\") " pod="openstack/glance-default-internal-api-0"
Feb 26 16:06:24 crc kubenswrapper[4907]: I0226 16:06:24.039109 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/058a6068-cb3c-42f2-bbe5-7b4dbc71d194-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"058a6068-cb3c-42f2-bbe5-7b4dbc71d194\") " pod="openstack/glance-default-internal-api-0"
Feb 26 16:06:24 crc kubenswrapper[4907]: I0226 16:06:24.039140 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage07-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage07-crc\") pod \"glance-default-internal-api-0\" (UID: \"058a6068-cb3c-42f2-bbe5-7b4dbc71d194\") " pod="openstack/glance-default-internal-api-0"
Feb 26 16:06:24 crc kubenswrapper[4907]: I0226 16:06:24.143073 4907 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="2b1253ca-7753-4742-afc4-e786e4dcc6e0" path="/var/lib/kubelet/pods/2b1253ca-7753-4742-afc4-e786e4dcc6e0/volumes"
Feb 26 16:06:24 crc kubenswrapper[4907]: I0226 16:06:24.145345 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/058a6068-cb3c-42f2-bbe5-7b4dbc71d194-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"058a6068-cb3c-42f2-bbe5-7b4dbc71d194\") " pod="openstack/glance-default-internal-api-0"
Feb 26 16:06:24 crc kubenswrapper[4907]: I0226 16:06:24.145438 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage07-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage07-crc\") pod \"glance-default-internal-api-0\" (UID: \"058a6068-cb3c-42f2-bbe5-7b4dbc71d194\") " pod="openstack/glance-default-internal-api-0"
Feb 26 16:06:24 crc kubenswrapper[4907]: I0226 16:06:24.145733 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/058a6068-cb3c-42f2-bbe5-7b4dbc71d194-config-data\") pod \"glance-default-internal-api-0\" (UID: \"058a6068-cb3c-42f2-bbe5-7b4dbc71d194\") " pod="openstack/glance-default-internal-api-0"
Feb 26 16:06:24 crc kubenswrapper[4907]: I0226 16:06:24.146066 4907 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage07-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage07-crc\") pod \"glance-default-internal-api-0\" (UID: \"058a6068-cb3c-42f2-bbe5-7b4dbc71d194\") device mount path \"/mnt/openstack/pv07\"" pod="openstack/glance-default-internal-api-0"
Feb 26 16:06:24 crc kubenswrapper[4907]: I0226 16:06:24.146341 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/058a6068-cb3c-42f2-bbe5-7b4dbc71d194-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"058a6068-cb3c-42f2-bbe5-7b4dbc71d194\") " pod="openstack/glance-default-internal-api-0"
Feb 26 16:06:24 crc kubenswrapper[4907]: I0226 16:06:24.146476 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/058a6068-cb3c-42f2-bbe5-7b4dbc71d194-logs\") pod \"glance-default-internal-api-0\" (UID: \"058a6068-cb3c-42f2-bbe5-7b4dbc71d194\") " pod="openstack/glance-default-internal-api-0"
Feb 26 16:06:24 crc kubenswrapper[4907]: I0226 16:06:24.146619 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/058a6068-cb3c-42f2-bbe5-7b4dbc71d194-scripts\") pod \"glance-default-internal-api-0\" (UID: \"058a6068-cb3c-42f2-bbe5-7b4dbc71d194\") " pod="openstack/glance-default-internal-api-0"
Feb 26 16:06:24 crc kubenswrapper[4907]: I0226 16:06:24.146868 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-shhhm\" (UniqueName: \"kubernetes.io/projected/058a6068-cb3c-42f2-bbe5-7b4dbc71d194-kube-api-access-shhhm\") pod \"glance-default-internal-api-0\" (UID: \"058a6068-cb3c-42f2-bbe5-7b4dbc71d194\") " pod="openstack/glance-default-internal-api-0"
Feb 26 16:06:24 crc kubenswrapper[4907]: I0226 16:06:24.146974 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/058a6068-cb3c-42f2-bbe5-7b4dbc71d194-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"058a6068-cb3c-42f2-bbe5-7b4dbc71d194\") " pod="openstack/glance-default-internal-api-0"
Feb 26 16:06:24 crc kubenswrapper[4907]: I0226 16:06:24.147089 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/058a6068-cb3c-42f2-bbe5-7b4dbc71d194-logs\") pod \"glance-default-internal-api-0\" (UID: \"058a6068-cb3c-42f2-bbe5-7b4dbc71d194\") " pod="openstack/glance-default-internal-api-0"
Feb 26 16:06:24 crc kubenswrapper[4907]: I0226 16:06:24.147310 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/058a6068-cb3c-42f2-bbe5-7b4dbc71d194-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"058a6068-cb3c-42f2-bbe5-7b4dbc71d194\") " pod="openstack/glance-default-internal-api-0"
Feb 26 16:06:24 crc kubenswrapper[4907]: I0226 16:06:24.162518 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/058a6068-cb3c-42f2-bbe5-7b4dbc71d194-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"058a6068-cb3c-42f2-bbe5-7b4dbc71d194\") " pod="openstack/glance-default-internal-api-0"
Feb 26 16:06:24 crc kubenswrapper[4907]: I0226 16:06:24.163423 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/058a6068-cb3c-42f2-bbe5-7b4dbc71d194-config-data\") pod \"glance-default-internal-api-0\" (UID: \"058a6068-cb3c-42f2-bbe5-7b4dbc71d194\") " pod="openstack/glance-default-internal-api-0"
Feb 26 16:06:24 crc kubenswrapper[4907]: I0226 16:06:24.164405 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-shhhm\" (UniqueName: \"kubernetes.io/projected/058a6068-cb3c-42f2-bbe5-7b4dbc71d194-kube-api-access-shhhm\") pod \"glance-default-internal-api-0\" (UID: \"058a6068-cb3c-42f2-bbe5-7b4dbc71d194\") " pod="openstack/glance-default-internal-api-0"
Feb 26 16:06:24 crc kubenswrapper[4907]: I0226 16:06:24.164664 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/058a6068-cb3c-42f2-bbe5-7b4dbc71d194-scripts\") pod \"glance-default-internal-api-0\" (UID: \"058a6068-cb3c-42f2-bbe5-7b4dbc71d194\") " pod="openstack/glance-default-internal-api-0"
Feb 26 16:06:24 crc kubenswrapper[4907]: I0226 16:06:24.178166 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/058a6068-cb3c-42f2-bbe5-7b4dbc71d194-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"058a6068-cb3c-42f2-bbe5-7b4dbc71d194\") " pod="openstack/glance-default-internal-api-0"
Feb 26 16:06:24 crc kubenswrapper[4907]: I0226 16:06:24.198639 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage07-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage07-crc\") pod \"glance-default-internal-api-0\" (UID: \"058a6068-cb3c-42f2-bbe5-7b4dbc71d194\") " pod="openstack/glance-default-internal-api-0"
Feb 26 16:06:24 crc kubenswrapper[4907]: I0226 16:06:24.306690 4907 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-internal-api-0"
Feb 26 16:06:25 crc kubenswrapper[4907]: I0226 16:06:25.103207 4907 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-internal-api-0"]
Feb 26 16:06:25 crc kubenswrapper[4907]: I0226 16:06:25.627728 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"058a6068-cb3c-42f2-bbe5-7b4dbc71d194","Type":"ContainerStarted","Data":"edc4ae96ba354187d3efc5062294da04acde92a335f321b36abdbd157df89942"}
Feb 26 16:06:26 crc kubenswrapper[4907]: I0226 16:06:26.638013 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"058a6068-cb3c-42f2-bbe5-7b4dbc71d194","Type":"ContainerStarted","Data":"2c448fd316d4e80c59c450a2c9cb3455c1f33824cce6234aae3fe49c9cd88e18"}
Feb 26 16:06:26 crc kubenswrapper[4907]: I0226 16:06:26.643153 4907 generic.go:334] "Generic (PLEG): container finished"
podID="5a680379-891d-45b5-bfac-04c44ab3e5d4" containerID="2530bd04fa0ba0bf4cd0cfd6c481e904801ab981339361f5fcbb6a22f455fa21" exitCode=0 Feb 26 16:06:26 crc kubenswrapper[4907]: I0226 16:06:26.643205 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-8656797c97-kv5w2" event={"ID":"5a680379-891d-45b5-bfac-04c44ab3e5d4","Type":"ContainerDied","Data":"2530bd04fa0ba0bf4cd0cfd6c481e904801ab981339361f5fcbb6a22f455fa21"} Feb 26 16:06:26 crc kubenswrapper[4907]: I0226 16:06:26.649812 4907 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-external-api-0" Feb 26 16:06:26 crc kubenswrapper[4907]: I0226 16:06:26.649963 4907 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Feb 26 16:06:26 crc kubenswrapper[4907]: I0226 16:06:26.828632 4907 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/neutron-8656797c97-kv5w2" podUID="5a680379-891d-45b5-bfac-04c44ab3e5d4" containerName="neutron-httpd" probeResult="failure" output="Get \"https://10.217.0.160:9696/\": dial tcp 10.217.0.160:9696: connect: connection refused" Feb 26 16:06:27 crc kubenswrapper[4907]: I0226 16:06:27.098259 4907 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-external-api-0" Feb 26 16:06:27 crc kubenswrapper[4907]: I0226 16:06:27.753492 4907 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/horizon-6fccfb8496-4tqhr" podUID="911d5df8-d8e2-4552-9c75-33c5ab72646b" containerName="horizon" probeResult="failure" output="Get \"https://10.217.0.153:8443/dashboard/auth/login/?next=/dashboard/\": dial tcp 10.217.0.153:8443: connect: connection refused" Feb 26 16:06:28 crc kubenswrapper[4907]: I0226 16:06:28.206134 4907 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/horizon-76d88967b8-wmzcw" podUID="b35f87c4-e535-4901-8814-0b321b201158" containerName="horizon" probeResult="failure" output="Get 
\"https://10.217.0.154:8443/dashboard/auth/login/?next=/dashboard/\": dial tcp 10.217.0.154:8443: connect: connection refused" Feb 26 16:06:34 crc kubenswrapper[4907]: I0226 16:06:34.255832 4907 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-8656797c97-kv5w2" Feb 26 16:06:34 crc kubenswrapper[4907]: I0226 16:06:34.327929 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-nwbd9\" (UniqueName: \"kubernetes.io/projected/5a680379-891d-45b5-bfac-04c44ab3e5d4-kube-api-access-nwbd9\") pod \"5a680379-891d-45b5-bfac-04c44ab3e5d4\" (UID: \"5a680379-891d-45b5-bfac-04c44ab3e5d4\") " Feb 26 16:06:34 crc kubenswrapper[4907]: I0226 16:06:34.327980 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/5a680379-891d-45b5-bfac-04c44ab3e5d4-ovndb-tls-certs\") pod \"5a680379-891d-45b5-bfac-04c44ab3e5d4\" (UID: \"5a680379-891d-45b5-bfac-04c44ab3e5d4\") " Feb 26 16:06:34 crc kubenswrapper[4907]: I0226 16:06:34.328082 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5a680379-891d-45b5-bfac-04c44ab3e5d4-combined-ca-bundle\") pod \"5a680379-891d-45b5-bfac-04c44ab3e5d4\" (UID: \"5a680379-891d-45b5-bfac-04c44ab3e5d4\") " Feb 26 16:06:34 crc kubenswrapper[4907]: I0226 16:06:34.328108 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/5a680379-891d-45b5-bfac-04c44ab3e5d4-httpd-config\") pod \"5a680379-891d-45b5-bfac-04c44ab3e5d4\" (UID: \"5a680379-891d-45b5-bfac-04c44ab3e5d4\") " Feb 26 16:06:34 crc kubenswrapper[4907]: I0226 16:06:34.328140 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/5a680379-891d-45b5-bfac-04c44ab3e5d4-config\") pod 
\"5a680379-891d-45b5-bfac-04c44ab3e5d4\" (UID: \"5a680379-891d-45b5-bfac-04c44ab3e5d4\") " Feb 26 16:06:34 crc kubenswrapper[4907]: I0226 16:06:34.328161 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/5a680379-891d-45b5-bfac-04c44ab3e5d4-internal-tls-certs\") pod \"5a680379-891d-45b5-bfac-04c44ab3e5d4\" (UID: \"5a680379-891d-45b5-bfac-04c44ab3e5d4\") " Feb 26 16:06:34 crc kubenswrapper[4907]: I0226 16:06:34.328219 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/5a680379-891d-45b5-bfac-04c44ab3e5d4-public-tls-certs\") pod \"5a680379-891d-45b5-bfac-04c44ab3e5d4\" (UID: \"5a680379-891d-45b5-bfac-04c44ab3e5d4\") " Feb 26 16:06:34 crc kubenswrapper[4907]: I0226 16:06:34.346895 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5a680379-891d-45b5-bfac-04c44ab3e5d4-httpd-config" (OuterVolumeSpecName: "httpd-config") pod "5a680379-891d-45b5-bfac-04c44ab3e5d4" (UID: "5a680379-891d-45b5-bfac-04c44ab3e5d4"). InnerVolumeSpecName "httpd-config". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 26 16:06:34 crc kubenswrapper[4907]: I0226 16:06:34.347416 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5a680379-891d-45b5-bfac-04c44ab3e5d4-kube-api-access-nwbd9" (OuterVolumeSpecName: "kube-api-access-nwbd9") pod "5a680379-891d-45b5-bfac-04c44ab3e5d4" (UID: "5a680379-891d-45b5-bfac-04c44ab3e5d4"). InnerVolumeSpecName "kube-api-access-nwbd9". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 26 16:06:34 crc kubenswrapper[4907]: I0226 16:06:34.434020 4907 reconciler_common.go:293] "Volume detached for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/5a680379-891d-45b5-bfac-04c44ab3e5d4-httpd-config\") on node \"crc\" DevicePath \"\"" Feb 26 16:06:34 crc kubenswrapper[4907]: I0226 16:06:34.434078 4907 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-nwbd9\" (UniqueName: \"kubernetes.io/projected/5a680379-891d-45b5-bfac-04c44ab3e5d4-kube-api-access-nwbd9\") on node \"crc\" DevicePath \"\"" Feb 26 16:06:34 crc kubenswrapper[4907]: I0226 16:06:34.436291 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5a680379-891d-45b5-bfac-04c44ab3e5d4-public-tls-certs" (OuterVolumeSpecName: "public-tls-certs") pod "5a680379-891d-45b5-bfac-04c44ab3e5d4" (UID: "5a680379-891d-45b5-bfac-04c44ab3e5d4"). InnerVolumeSpecName "public-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 26 16:06:34 crc kubenswrapper[4907]: I0226 16:06:34.498732 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5a680379-891d-45b5-bfac-04c44ab3e5d4-config" (OuterVolumeSpecName: "config") pod "5a680379-891d-45b5-bfac-04c44ab3e5d4" (UID: "5a680379-891d-45b5-bfac-04c44ab3e5d4"). InnerVolumeSpecName "config". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 26 16:06:34 crc kubenswrapper[4907]: I0226 16:06:34.520371 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5a680379-891d-45b5-bfac-04c44ab3e5d4-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "5a680379-891d-45b5-bfac-04c44ab3e5d4" (UID: "5a680379-891d-45b5-bfac-04c44ab3e5d4"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 26 16:06:34 crc kubenswrapper[4907]: I0226 16:06:34.528736 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5a680379-891d-45b5-bfac-04c44ab3e5d4-ovndb-tls-certs" (OuterVolumeSpecName: "ovndb-tls-certs") pod "5a680379-891d-45b5-bfac-04c44ab3e5d4" (UID: "5a680379-891d-45b5-bfac-04c44ab3e5d4"). InnerVolumeSpecName "ovndb-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 26 16:06:34 crc kubenswrapper[4907]: I0226 16:06:34.534947 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5a680379-891d-45b5-bfac-04c44ab3e5d4-internal-tls-certs" (OuterVolumeSpecName: "internal-tls-certs") pod "5a680379-891d-45b5-bfac-04c44ab3e5d4" (UID: "5a680379-891d-45b5-bfac-04c44ab3e5d4"). InnerVolumeSpecName "internal-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 26 16:06:34 crc kubenswrapper[4907]: I0226 16:06:34.536056 4907 reconciler_common.go:293] "Volume detached for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/5a680379-891d-45b5-bfac-04c44ab3e5d4-public-tls-certs\") on node \"crc\" DevicePath \"\"" Feb 26 16:06:34 crc kubenswrapper[4907]: I0226 16:06:34.536090 4907 reconciler_common.go:293] "Volume detached for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/5a680379-891d-45b5-bfac-04c44ab3e5d4-ovndb-tls-certs\") on node \"crc\" DevicePath \"\"" Feb 26 16:06:34 crc kubenswrapper[4907]: I0226 16:06:34.536102 4907 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5a680379-891d-45b5-bfac-04c44ab3e5d4-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 26 16:06:34 crc kubenswrapper[4907]: I0226 16:06:34.536112 4907 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/secret/5a680379-891d-45b5-bfac-04c44ab3e5d4-config\") on node \"crc\" DevicePath \"\"" 
Feb 26 16:06:34 crc kubenswrapper[4907]: I0226 16:06:34.536120 4907 reconciler_common.go:293] "Volume detached for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/5a680379-891d-45b5-bfac-04c44ab3e5d4-internal-tls-certs\") on node \"crc\" DevicePath \"\"" Feb 26 16:06:34 crc kubenswrapper[4907]: I0226 16:06:34.731693 4907 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-8656797c97-kv5w2" Feb 26 16:06:34 crc kubenswrapper[4907]: I0226 16:06:34.732119 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-8656797c97-kv5w2" event={"ID":"5a680379-891d-45b5-bfac-04c44ab3e5d4","Type":"ContainerDied","Data":"a2760191b1a16549429a54c2da36f05b3029d2e4e7b2805ce240cb3e3102f609"} Feb 26 16:06:34 crc kubenswrapper[4907]: I0226 16:06:34.732164 4907 scope.go:117] "RemoveContainer" containerID="5e0e4bf5b7cdb36b844161a8e7adefce2272f720c39f3b8506c61b906f2a736d" Feb 26 16:06:34 crc kubenswrapper[4907]: I0226 16:06:34.733494 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-db-sync-4b5rh" event={"ID":"4f76f68d-7dee-4f14-9fb1-e943db5b0533","Type":"ContainerStarted","Data":"b2ca0f3286f6aeae72b4c27b418e941b8810b0b4f88bfdc3d64534da2f76ef0a"} Feb 26 16:06:34 crc kubenswrapper[4907]: I0226 16:06:34.736513 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"058a6068-cb3c-42f2-bbe5-7b4dbc71d194","Type":"ContainerStarted","Data":"a455b47ecb275609a8a5fb73a6f44f5f94a6c4f415075a94933e565fa18c2ffd"} Feb 26 16:06:34 crc kubenswrapper[4907]: I0226 16:06:34.764722 4907 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell0-conductor-db-sync-4b5rh" podStartSLOduration=2.904177101 podStartE2EDuration="17.764706374s" podCreationTimestamp="2026-02-26 16:06:17 +0000 UTC" firstStartedPulling="2026-02-26 16:06:18.840271121 +0000 UTC m=+1441.358832970" 
lastFinishedPulling="2026-02-26 16:06:33.700800394 +0000 UTC m=+1456.219362243" observedRunningTime="2026-02-26 16:06:34.7555802 +0000 UTC m=+1457.274142039" watchObservedRunningTime="2026-02-26 16:06:34.764706374 +0000 UTC m=+1457.283268223" Feb 26 16:06:34 crc kubenswrapper[4907]: I0226 16:06:34.787666 4907 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/neutron-8656797c97-kv5w2"] Feb 26 16:06:34 crc kubenswrapper[4907]: I0226 16:06:34.797610 4907 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/neutron-8656797c97-kv5w2"] Feb 26 16:06:34 crc kubenswrapper[4907]: I0226 16:06:34.833004 4907 scope.go:117] "RemoveContainer" containerID="2530bd04fa0ba0bf4cd0cfd6c481e904801ab981339361f5fcbb6a22f455fa21" Feb 26 16:06:35 crc kubenswrapper[4907]: I0226 16:06:35.769624 4907 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-default-internal-api-0" podStartSLOduration=12.76960165 podStartE2EDuration="12.76960165s" podCreationTimestamp="2026-02-26 16:06:23 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-26 16:06:35.766115464 +0000 UTC m=+1458.284677323" watchObservedRunningTime="2026-02-26 16:06:35.76960165 +0000 UTC m=+1458.288163509" Feb 26 16:06:36 crc kubenswrapper[4907]: I0226 16:06:36.137580 4907 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5a680379-891d-45b5-bfac-04c44ab3e5d4" path="/var/lib/kubelet/pods/5a680379-891d-45b5-bfac-04c44ab3e5d4/volumes" Feb 26 16:06:38 crc kubenswrapper[4907]: I0226 16:06:37.749057 4907 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/horizon-6fccfb8496-4tqhr" podUID="911d5df8-d8e2-4552-9c75-33c5ab72646b" containerName="horizon" probeResult="failure" output="Get \"https://10.217.0.153:8443/dashboard/auth/login/?next=/dashboard/\": dial tcp 10.217.0.153:8443: connect: connection refused" Feb 26 16:06:38 crc kubenswrapper[4907]: I0226 
16:06:37.749132 4907 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/horizon-6fccfb8496-4tqhr" Feb 26 16:06:38 crc kubenswrapper[4907]: I0226 16:06:37.749957 4907 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="horizon" containerStatusID={"Type":"cri-o","ID":"0e9ea68de0c1e921e9ed4ee0e299561d11e0b96c063a8d42fd8a0ea1f0193bee"} pod="openstack/horizon-6fccfb8496-4tqhr" containerMessage="Container horizon failed startup probe, will be restarted" Feb 26 16:06:38 crc kubenswrapper[4907]: I0226 16:06:37.750003 4907 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/horizon-6fccfb8496-4tqhr" podUID="911d5df8-d8e2-4552-9c75-33c5ab72646b" containerName="horizon" containerID="cri-o://0e9ea68de0c1e921e9ed4ee0e299561d11e0b96c063a8d42fd8a0ea1f0193bee" gracePeriod=30 Feb 26 16:06:38 crc kubenswrapper[4907]: I0226 16:06:38.155110 4907 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/horizon-76d88967b8-wmzcw" podUID="b35f87c4-e535-4901-8814-0b321b201158" containerName="horizon" probeResult="failure" output="Get \"https://10.217.0.154:8443/dashboard/auth/login/?next=/dashboard/\": dial tcp 10.217.0.154:8443: connect: connection refused" Feb 26 16:06:38 crc kubenswrapper[4907]: I0226 16:06:38.155173 4907 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/horizon-76d88967b8-wmzcw" Feb 26 16:06:38 crc kubenswrapper[4907]: I0226 16:06:38.155846 4907 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="horizon" containerStatusID={"Type":"cri-o","ID":"a09830ab9c067f94a8fe072a6ed8e9195e12c6c572d7b1467cb8afc38542fb22"} pod="openstack/horizon-76d88967b8-wmzcw" containerMessage="Container horizon failed startup probe, will be restarted" Feb 26 16:06:38 crc kubenswrapper[4907]: I0226 16:06:38.155877 4907 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/horizon-76d88967b8-wmzcw" 
podUID="b35f87c4-e535-4901-8814-0b321b201158" containerName="horizon" containerID="cri-o://a09830ab9c067f94a8fe072a6ed8e9195e12c6c572d7b1467cb8afc38542fb22" gracePeriod=30 Feb 26 16:06:38 crc kubenswrapper[4907]: I0226 16:06:38.550326 4907 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Feb 26 16:06:38 crc kubenswrapper[4907]: I0226 16:06:38.712299 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/a114e8dd-3cb1-4b1a-8f49-48b99c39da3b-scripts\") pod \"a114e8dd-3cb1-4b1a-8f49-48b99c39da3b\" (UID: \"a114e8dd-3cb1-4b1a-8f49-48b99c39da3b\") " Feb 26 16:06:38 crc kubenswrapper[4907]: I0226 16:06:38.712695 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-5ksv6\" (UniqueName: \"kubernetes.io/projected/a114e8dd-3cb1-4b1a-8f49-48b99c39da3b-kube-api-access-5ksv6\") pod \"a114e8dd-3cb1-4b1a-8f49-48b99c39da3b\" (UID: \"a114e8dd-3cb1-4b1a-8f49-48b99c39da3b\") " Feb 26 16:06:38 crc kubenswrapper[4907]: I0226 16:06:38.712775 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/a114e8dd-3cb1-4b1a-8f49-48b99c39da3b-log-httpd\") pod \"a114e8dd-3cb1-4b1a-8f49-48b99c39da3b\" (UID: \"a114e8dd-3cb1-4b1a-8f49-48b99c39da3b\") " Feb 26 16:06:38 crc kubenswrapper[4907]: I0226 16:06:38.712803 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a114e8dd-3cb1-4b1a-8f49-48b99c39da3b-config-data\") pod \"a114e8dd-3cb1-4b1a-8f49-48b99c39da3b\" (UID: \"a114e8dd-3cb1-4b1a-8f49-48b99c39da3b\") " Feb 26 16:06:38 crc kubenswrapper[4907]: I0226 16:06:38.712832 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: 
\"kubernetes.io/secret/a114e8dd-3cb1-4b1a-8f49-48b99c39da3b-sg-core-conf-yaml\") pod \"a114e8dd-3cb1-4b1a-8f49-48b99c39da3b\" (UID: \"a114e8dd-3cb1-4b1a-8f49-48b99c39da3b\") " Feb 26 16:06:38 crc kubenswrapper[4907]: I0226 16:06:38.712889 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/a114e8dd-3cb1-4b1a-8f49-48b99c39da3b-run-httpd\") pod \"a114e8dd-3cb1-4b1a-8f49-48b99c39da3b\" (UID: \"a114e8dd-3cb1-4b1a-8f49-48b99c39da3b\") " Feb 26 16:06:38 crc kubenswrapper[4907]: I0226 16:06:38.712988 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a114e8dd-3cb1-4b1a-8f49-48b99c39da3b-combined-ca-bundle\") pod \"a114e8dd-3cb1-4b1a-8f49-48b99c39da3b\" (UID: \"a114e8dd-3cb1-4b1a-8f49-48b99c39da3b\") " Feb 26 16:06:38 crc kubenswrapper[4907]: I0226 16:06:38.713860 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/a114e8dd-3cb1-4b1a-8f49-48b99c39da3b-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "a114e8dd-3cb1-4b1a-8f49-48b99c39da3b" (UID: "a114e8dd-3cb1-4b1a-8f49-48b99c39da3b"). InnerVolumeSpecName "run-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 26 16:06:38 crc kubenswrapper[4907]: I0226 16:06:38.713977 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/a114e8dd-3cb1-4b1a-8f49-48b99c39da3b-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "a114e8dd-3cb1-4b1a-8f49-48b99c39da3b" (UID: "a114e8dd-3cb1-4b1a-8f49-48b99c39da3b"). InnerVolumeSpecName "log-httpd". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 26 16:06:38 crc kubenswrapper[4907]: I0226 16:06:38.722347 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a114e8dd-3cb1-4b1a-8f49-48b99c39da3b-scripts" (OuterVolumeSpecName: "scripts") pod "a114e8dd-3cb1-4b1a-8f49-48b99c39da3b" (UID: "a114e8dd-3cb1-4b1a-8f49-48b99c39da3b"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 26 16:06:38 crc kubenswrapper[4907]: I0226 16:06:38.728927 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a114e8dd-3cb1-4b1a-8f49-48b99c39da3b-kube-api-access-5ksv6" (OuterVolumeSpecName: "kube-api-access-5ksv6") pod "a114e8dd-3cb1-4b1a-8f49-48b99c39da3b" (UID: "a114e8dd-3cb1-4b1a-8f49-48b99c39da3b"). InnerVolumeSpecName "kube-api-access-5ksv6". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 26 16:06:38 crc kubenswrapper[4907]: I0226 16:06:38.799034 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a114e8dd-3cb1-4b1a-8f49-48b99c39da3b-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "a114e8dd-3cb1-4b1a-8f49-48b99c39da3b" (UID: "a114e8dd-3cb1-4b1a-8f49-48b99c39da3b"). InnerVolumeSpecName "sg-core-conf-yaml". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 26 16:06:38 crc kubenswrapper[4907]: I0226 16:06:38.804521 4907 generic.go:334] "Generic (PLEG): container finished" podID="a114e8dd-3cb1-4b1a-8f49-48b99c39da3b" containerID="c59ed0d9b2419de1e69d5cb40eb488a6e7a9eeffe8226fc39da6e0a48790911f" exitCode=137 Feb 26 16:06:38 crc kubenswrapper[4907]: I0226 16:06:38.804801 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"a114e8dd-3cb1-4b1a-8f49-48b99c39da3b","Type":"ContainerDied","Data":"c59ed0d9b2419de1e69d5cb40eb488a6e7a9eeffe8226fc39da6e0a48790911f"} Feb 26 16:06:38 crc kubenswrapper[4907]: I0226 16:06:38.804884 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"a114e8dd-3cb1-4b1a-8f49-48b99c39da3b","Type":"ContainerDied","Data":"4f6a71d6fd6a3e58ce80a4d756b4969f440aaec2e5fa2d4cc31613127f2b96b4"} Feb 26 16:06:38 crc kubenswrapper[4907]: I0226 16:06:38.804953 4907 scope.go:117] "RemoveContainer" containerID="c59ed0d9b2419de1e69d5cb40eb488a6e7a9eeffe8226fc39da6e0a48790911f" Feb 26 16:06:38 crc kubenswrapper[4907]: I0226 16:06:38.805157 4907 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Feb 26 16:06:38 crc kubenswrapper[4907]: I0226 16:06:38.816449 4907 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/a114e8dd-3cb1-4b1a-8f49-48b99c39da3b-scripts\") on node \"crc\" DevicePath \"\"" Feb 26 16:06:38 crc kubenswrapper[4907]: I0226 16:06:38.816479 4907 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-5ksv6\" (UniqueName: \"kubernetes.io/projected/a114e8dd-3cb1-4b1a-8f49-48b99c39da3b-kube-api-access-5ksv6\") on node \"crc\" DevicePath \"\"" Feb 26 16:06:38 crc kubenswrapper[4907]: I0226 16:06:38.816490 4907 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/a114e8dd-3cb1-4b1a-8f49-48b99c39da3b-log-httpd\") on node \"crc\" DevicePath \"\"" Feb 26 16:06:38 crc kubenswrapper[4907]: I0226 16:06:38.816498 4907 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/a114e8dd-3cb1-4b1a-8f49-48b99c39da3b-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\"" Feb 26 16:06:38 crc kubenswrapper[4907]: I0226 16:06:38.816512 4907 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/a114e8dd-3cb1-4b1a-8f49-48b99c39da3b-run-httpd\") on node \"crc\" DevicePath \"\"" Feb 26 16:06:38 crc kubenswrapper[4907]: I0226 16:06:38.874776 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a114e8dd-3cb1-4b1a-8f49-48b99c39da3b-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "a114e8dd-3cb1-4b1a-8f49-48b99c39da3b" (UID: "a114e8dd-3cb1-4b1a-8f49-48b99c39da3b"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 26 16:06:38 crc kubenswrapper[4907]: I0226 16:06:38.888774 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a114e8dd-3cb1-4b1a-8f49-48b99c39da3b-config-data" (OuterVolumeSpecName: "config-data") pod "a114e8dd-3cb1-4b1a-8f49-48b99c39da3b" (UID: "a114e8dd-3cb1-4b1a-8f49-48b99c39da3b"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 26 16:06:38 crc kubenswrapper[4907]: I0226 16:06:38.894381 4907 scope.go:117] "RemoveContainer" containerID="e52ae1e1152905e752807f8c44657b795a66f16e7ed728b7d70f51708d7de6b2" Feb 26 16:06:38 crc kubenswrapper[4907]: I0226 16:06:38.921288 4907 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a114e8dd-3cb1-4b1a-8f49-48b99c39da3b-config-data\") on node \"crc\" DevicePath \"\"" Feb 26 16:06:38 crc kubenswrapper[4907]: I0226 16:06:38.921324 4907 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a114e8dd-3cb1-4b1a-8f49-48b99c39da3b-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 26 16:06:38 crc kubenswrapper[4907]: I0226 16:06:38.945558 4907 scope.go:117] "RemoveContainer" containerID="4fe7393b6c3f68c93befaf6b19d674b51c8a99443a2fa24cad7b655b7d3b5849" Feb 26 16:06:38 crc kubenswrapper[4907]: I0226 16:06:38.978277 4907 scope.go:117] "RemoveContainer" containerID="191f82e8d71c55d97bd6521f30700a2a6dd03522e595c49eb8512eaeed42b5f3" Feb 26 16:06:39 crc kubenswrapper[4907]: I0226 16:06:39.001745 4907 scope.go:117] "RemoveContainer" containerID="c59ed0d9b2419de1e69d5cb40eb488a6e7a9eeffe8226fc39da6e0a48790911f" Feb 26 16:06:39 crc kubenswrapper[4907]: E0226 16:06:39.002641 4907 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c59ed0d9b2419de1e69d5cb40eb488a6e7a9eeffe8226fc39da6e0a48790911f\": container 
with ID starting with c59ed0d9b2419de1e69d5cb40eb488a6e7a9eeffe8226fc39da6e0a48790911f not found: ID does not exist" containerID="c59ed0d9b2419de1e69d5cb40eb488a6e7a9eeffe8226fc39da6e0a48790911f" Feb 26 16:06:39 crc kubenswrapper[4907]: I0226 16:06:39.002697 4907 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c59ed0d9b2419de1e69d5cb40eb488a6e7a9eeffe8226fc39da6e0a48790911f"} err="failed to get container status \"c59ed0d9b2419de1e69d5cb40eb488a6e7a9eeffe8226fc39da6e0a48790911f\": rpc error: code = NotFound desc = could not find container \"c59ed0d9b2419de1e69d5cb40eb488a6e7a9eeffe8226fc39da6e0a48790911f\": container with ID starting with c59ed0d9b2419de1e69d5cb40eb488a6e7a9eeffe8226fc39da6e0a48790911f not found: ID does not exist" Feb 26 16:06:39 crc kubenswrapper[4907]: I0226 16:06:39.002725 4907 scope.go:117] "RemoveContainer" containerID="e52ae1e1152905e752807f8c44657b795a66f16e7ed728b7d70f51708d7de6b2" Feb 26 16:06:39 crc kubenswrapper[4907]: E0226 16:06:39.003077 4907 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e52ae1e1152905e752807f8c44657b795a66f16e7ed728b7d70f51708d7de6b2\": container with ID starting with e52ae1e1152905e752807f8c44657b795a66f16e7ed728b7d70f51708d7de6b2 not found: ID does not exist" containerID="e52ae1e1152905e752807f8c44657b795a66f16e7ed728b7d70f51708d7de6b2" Feb 26 16:06:39 crc kubenswrapper[4907]: I0226 16:06:39.003128 4907 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e52ae1e1152905e752807f8c44657b795a66f16e7ed728b7d70f51708d7de6b2"} err="failed to get container status \"e52ae1e1152905e752807f8c44657b795a66f16e7ed728b7d70f51708d7de6b2\": rpc error: code = NotFound desc = could not find container \"e52ae1e1152905e752807f8c44657b795a66f16e7ed728b7d70f51708d7de6b2\": container with ID starting with e52ae1e1152905e752807f8c44657b795a66f16e7ed728b7d70f51708d7de6b2 not 
found: ID does not exist" Feb 26 16:06:39 crc kubenswrapper[4907]: I0226 16:06:39.003146 4907 scope.go:117] "RemoveContainer" containerID="4fe7393b6c3f68c93befaf6b19d674b51c8a99443a2fa24cad7b655b7d3b5849" Feb 26 16:06:39 crc kubenswrapper[4907]: E0226 16:06:39.003457 4907 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"4fe7393b6c3f68c93befaf6b19d674b51c8a99443a2fa24cad7b655b7d3b5849\": container with ID starting with 4fe7393b6c3f68c93befaf6b19d674b51c8a99443a2fa24cad7b655b7d3b5849 not found: ID does not exist" containerID="4fe7393b6c3f68c93befaf6b19d674b51c8a99443a2fa24cad7b655b7d3b5849" Feb 26 16:06:39 crc kubenswrapper[4907]: I0226 16:06:39.003602 4907 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4fe7393b6c3f68c93befaf6b19d674b51c8a99443a2fa24cad7b655b7d3b5849"} err="failed to get container status \"4fe7393b6c3f68c93befaf6b19d674b51c8a99443a2fa24cad7b655b7d3b5849\": rpc error: code = NotFound desc = could not find container \"4fe7393b6c3f68c93befaf6b19d674b51c8a99443a2fa24cad7b655b7d3b5849\": container with ID starting with 4fe7393b6c3f68c93befaf6b19d674b51c8a99443a2fa24cad7b655b7d3b5849 not found: ID does not exist" Feb 26 16:06:39 crc kubenswrapper[4907]: I0226 16:06:39.003626 4907 scope.go:117] "RemoveContainer" containerID="191f82e8d71c55d97bd6521f30700a2a6dd03522e595c49eb8512eaeed42b5f3" Feb 26 16:06:39 crc kubenswrapper[4907]: E0226 16:06:39.004032 4907 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"191f82e8d71c55d97bd6521f30700a2a6dd03522e595c49eb8512eaeed42b5f3\": container with ID starting with 191f82e8d71c55d97bd6521f30700a2a6dd03522e595c49eb8512eaeed42b5f3 not found: ID does not exist" containerID="191f82e8d71c55d97bd6521f30700a2a6dd03522e595c49eb8512eaeed42b5f3" Feb 26 16:06:39 crc kubenswrapper[4907]: I0226 16:06:39.004071 4907 
pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"191f82e8d71c55d97bd6521f30700a2a6dd03522e595c49eb8512eaeed42b5f3"} err="failed to get container status \"191f82e8d71c55d97bd6521f30700a2a6dd03522e595c49eb8512eaeed42b5f3\": rpc error: code = NotFound desc = could not find container \"191f82e8d71c55d97bd6521f30700a2a6dd03522e595c49eb8512eaeed42b5f3\": container with ID starting with 191f82e8d71c55d97bd6521f30700a2a6dd03522e595c49eb8512eaeed42b5f3 not found: ID does not exist" Feb 26 16:06:39 crc kubenswrapper[4907]: I0226 16:06:39.145880 4907 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Feb 26 16:06:39 crc kubenswrapper[4907]: I0226 16:06:39.155606 4907 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ceilometer-0"] Feb 26 16:06:39 crc kubenswrapper[4907]: I0226 16:06:39.176025 4907 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"] Feb 26 16:06:39 crc kubenswrapper[4907]: E0226 16:06:39.176459 4907 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a114e8dd-3cb1-4b1a-8f49-48b99c39da3b" containerName="sg-core" Feb 26 16:06:39 crc kubenswrapper[4907]: I0226 16:06:39.176481 4907 state_mem.go:107] "Deleted CPUSet assignment" podUID="a114e8dd-3cb1-4b1a-8f49-48b99c39da3b" containerName="sg-core" Feb 26 16:06:39 crc kubenswrapper[4907]: E0226 16:06:39.176504 4907 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a114e8dd-3cb1-4b1a-8f49-48b99c39da3b" containerName="ceilometer-notification-agent" Feb 26 16:06:39 crc kubenswrapper[4907]: I0226 16:06:39.176513 4907 state_mem.go:107] "Deleted CPUSet assignment" podUID="a114e8dd-3cb1-4b1a-8f49-48b99c39da3b" containerName="ceilometer-notification-agent" Feb 26 16:06:39 crc kubenswrapper[4907]: E0226 16:06:39.176532 4907 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a114e8dd-3cb1-4b1a-8f49-48b99c39da3b" containerName="ceilometer-central-agent" Feb 26 
16:06:39 crc kubenswrapper[4907]: I0226 16:06:39.176540 4907 state_mem.go:107] "Deleted CPUSet assignment" podUID="a114e8dd-3cb1-4b1a-8f49-48b99c39da3b" containerName="ceilometer-central-agent" Feb 26 16:06:39 crc kubenswrapper[4907]: E0226 16:06:39.176577 4907 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a114e8dd-3cb1-4b1a-8f49-48b99c39da3b" containerName="proxy-httpd" Feb 26 16:06:39 crc kubenswrapper[4907]: I0226 16:06:39.176601 4907 state_mem.go:107] "Deleted CPUSet assignment" podUID="a114e8dd-3cb1-4b1a-8f49-48b99c39da3b" containerName="proxy-httpd" Feb 26 16:06:39 crc kubenswrapper[4907]: E0226 16:06:39.176633 4907 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5a680379-891d-45b5-bfac-04c44ab3e5d4" containerName="neutron-httpd" Feb 26 16:06:39 crc kubenswrapper[4907]: I0226 16:06:39.176642 4907 state_mem.go:107] "Deleted CPUSet assignment" podUID="5a680379-891d-45b5-bfac-04c44ab3e5d4" containerName="neutron-httpd" Feb 26 16:06:39 crc kubenswrapper[4907]: E0226 16:06:39.176651 4907 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5a680379-891d-45b5-bfac-04c44ab3e5d4" containerName="neutron-api" Feb 26 16:06:39 crc kubenswrapper[4907]: I0226 16:06:39.176658 4907 state_mem.go:107] "Deleted CPUSet assignment" podUID="5a680379-891d-45b5-bfac-04c44ab3e5d4" containerName="neutron-api" Feb 26 16:06:39 crc kubenswrapper[4907]: I0226 16:06:39.176868 4907 memory_manager.go:354] "RemoveStaleState removing state" podUID="a114e8dd-3cb1-4b1a-8f49-48b99c39da3b" containerName="ceilometer-central-agent" Feb 26 16:06:39 crc kubenswrapper[4907]: I0226 16:06:39.176891 4907 memory_manager.go:354] "RemoveStaleState removing state" podUID="a114e8dd-3cb1-4b1a-8f49-48b99c39da3b" containerName="ceilometer-notification-agent" Feb 26 16:06:39 crc kubenswrapper[4907]: I0226 16:06:39.176903 4907 memory_manager.go:354] "RemoveStaleState removing state" podUID="a114e8dd-3cb1-4b1a-8f49-48b99c39da3b" containerName="sg-core" Feb 26 
16:06:39 crc kubenswrapper[4907]: I0226 16:06:39.176918 4907 memory_manager.go:354] "RemoveStaleState removing state" podUID="5a680379-891d-45b5-bfac-04c44ab3e5d4" containerName="neutron-api" Feb 26 16:06:39 crc kubenswrapper[4907]: I0226 16:06:39.176928 4907 memory_manager.go:354] "RemoveStaleState removing state" podUID="a114e8dd-3cb1-4b1a-8f49-48b99c39da3b" containerName="proxy-httpd" Feb 26 16:06:39 crc kubenswrapper[4907]: I0226 16:06:39.176947 4907 memory_manager.go:354] "RemoveStaleState removing state" podUID="5a680379-891d-45b5-bfac-04c44ab3e5d4" containerName="neutron-httpd" Feb 26 16:06:39 crc kubenswrapper[4907]: I0226 16:06:39.180509 4907 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Feb 26 16:06:39 crc kubenswrapper[4907]: I0226 16:06:39.187820 4907 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data" Feb 26 16:06:39 crc kubenswrapper[4907]: I0226 16:06:39.188112 4907 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts" Feb 26 16:06:39 crc kubenswrapper[4907]: I0226 16:06:39.212153 4907 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Feb 26 16:06:39 crc kubenswrapper[4907]: I0226 16:06:39.331439 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cbkwz\" (UniqueName: \"kubernetes.io/projected/38d2aff1-77e3-43d1-b41b-f5f41346d3ec-kube-api-access-cbkwz\") pod \"ceilometer-0\" (UID: \"38d2aff1-77e3-43d1-b41b-f5f41346d3ec\") " pod="openstack/ceilometer-0" Feb 26 16:06:39 crc kubenswrapper[4907]: I0226 16:06:39.331824 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/38d2aff1-77e3-43d1-b41b-f5f41346d3ec-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"38d2aff1-77e3-43d1-b41b-f5f41346d3ec\") " 
pod="openstack/ceilometer-0" Feb 26 16:06:39 crc kubenswrapper[4907]: I0226 16:06:39.331963 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/38d2aff1-77e3-43d1-b41b-f5f41346d3ec-log-httpd\") pod \"ceilometer-0\" (UID: \"38d2aff1-77e3-43d1-b41b-f5f41346d3ec\") " pod="openstack/ceilometer-0" Feb 26 16:06:39 crc kubenswrapper[4907]: I0226 16:06:39.332188 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/38d2aff1-77e3-43d1-b41b-f5f41346d3ec-run-httpd\") pod \"ceilometer-0\" (UID: \"38d2aff1-77e3-43d1-b41b-f5f41346d3ec\") " pod="openstack/ceilometer-0" Feb 26 16:06:39 crc kubenswrapper[4907]: I0226 16:06:39.332327 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/38d2aff1-77e3-43d1-b41b-f5f41346d3ec-config-data\") pod \"ceilometer-0\" (UID: \"38d2aff1-77e3-43d1-b41b-f5f41346d3ec\") " pod="openstack/ceilometer-0" Feb 26 16:06:39 crc kubenswrapper[4907]: I0226 16:06:39.332437 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/38d2aff1-77e3-43d1-b41b-f5f41346d3ec-scripts\") pod \"ceilometer-0\" (UID: \"38d2aff1-77e3-43d1-b41b-f5f41346d3ec\") " pod="openstack/ceilometer-0" Feb 26 16:06:39 crc kubenswrapper[4907]: I0226 16:06:39.332583 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/38d2aff1-77e3-43d1-b41b-f5f41346d3ec-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"38d2aff1-77e3-43d1-b41b-f5f41346d3ec\") " pod="openstack/ceilometer-0" Feb 26 16:06:39 crc kubenswrapper[4907]: I0226 16:06:39.434441 4907 reconciler_common.go:218] "operationExecutor.MountVolume started 
for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/38d2aff1-77e3-43d1-b41b-f5f41346d3ec-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"38d2aff1-77e3-43d1-b41b-f5f41346d3ec\") " pod="openstack/ceilometer-0" Feb 26 16:06:39 crc kubenswrapper[4907]: I0226 16:06:39.434499 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/38d2aff1-77e3-43d1-b41b-f5f41346d3ec-log-httpd\") pod \"ceilometer-0\" (UID: \"38d2aff1-77e3-43d1-b41b-f5f41346d3ec\") " pod="openstack/ceilometer-0" Feb 26 16:06:39 crc kubenswrapper[4907]: I0226 16:06:39.434626 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/38d2aff1-77e3-43d1-b41b-f5f41346d3ec-run-httpd\") pod \"ceilometer-0\" (UID: \"38d2aff1-77e3-43d1-b41b-f5f41346d3ec\") " pod="openstack/ceilometer-0" Feb 26 16:06:39 crc kubenswrapper[4907]: I0226 16:06:39.434678 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/38d2aff1-77e3-43d1-b41b-f5f41346d3ec-config-data\") pod \"ceilometer-0\" (UID: \"38d2aff1-77e3-43d1-b41b-f5f41346d3ec\") " pod="openstack/ceilometer-0" Feb 26 16:06:39 crc kubenswrapper[4907]: I0226 16:06:39.434703 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/38d2aff1-77e3-43d1-b41b-f5f41346d3ec-scripts\") pod \"ceilometer-0\" (UID: \"38d2aff1-77e3-43d1-b41b-f5f41346d3ec\") " pod="openstack/ceilometer-0" Feb 26 16:06:39 crc kubenswrapper[4907]: I0226 16:06:39.434733 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/38d2aff1-77e3-43d1-b41b-f5f41346d3ec-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"38d2aff1-77e3-43d1-b41b-f5f41346d3ec\") " pod="openstack/ceilometer-0" Feb 26 16:06:39 crc 
kubenswrapper[4907]: I0226 16:06:39.434796 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cbkwz\" (UniqueName: \"kubernetes.io/projected/38d2aff1-77e3-43d1-b41b-f5f41346d3ec-kube-api-access-cbkwz\") pod \"ceilometer-0\" (UID: \"38d2aff1-77e3-43d1-b41b-f5f41346d3ec\") " pod="openstack/ceilometer-0" Feb 26 16:06:39 crc kubenswrapper[4907]: I0226 16:06:39.435744 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/38d2aff1-77e3-43d1-b41b-f5f41346d3ec-log-httpd\") pod \"ceilometer-0\" (UID: \"38d2aff1-77e3-43d1-b41b-f5f41346d3ec\") " pod="openstack/ceilometer-0" Feb 26 16:06:39 crc kubenswrapper[4907]: I0226 16:06:39.435906 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/38d2aff1-77e3-43d1-b41b-f5f41346d3ec-run-httpd\") pod \"ceilometer-0\" (UID: \"38d2aff1-77e3-43d1-b41b-f5f41346d3ec\") " pod="openstack/ceilometer-0" Feb 26 16:06:39 crc kubenswrapper[4907]: I0226 16:06:39.439555 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/38d2aff1-77e3-43d1-b41b-f5f41346d3ec-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"38d2aff1-77e3-43d1-b41b-f5f41346d3ec\") " pod="openstack/ceilometer-0" Feb 26 16:06:39 crc kubenswrapper[4907]: I0226 16:06:39.440416 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/38d2aff1-77e3-43d1-b41b-f5f41346d3ec-config-data\") pod \"ceilometer-0\" (UID: \"38d2aff1-77e3-43d1-b41b-f5f41346d3ec\") " pod="openstack/ceilometer-0" Feb 26 16:06:39 crc kubenswrapper[4907]: I0226 16:06:39.459364 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/38d2aff1-77e3-43d1-b41b-f5f41346d3ec-scripts\") pod \"ceilometer-0\" (UID: 
\"38d2aff1-77e3-43d1-b41b-f5f41346d3ec\") " pod="openstack/ceilometer-0" Feb 26 16:06:39 crc kubenswrapper[4907]: I0226 16:06:39.461979 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/38d2aff1-77e3-43d1-b41b-f5f41346d3ec-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"38d2aff1-77e3-43d1-b41b-f5f41346d3ec\") " pod="openstack/ceilometer-0" Feb 26 16:06:39 crc kubenswrapper[4907]: I0226 16:06:39.463716 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cbkwz\" (UniqueName: \"kubernetes.io/projected/38d2aff1-77e3-43d1-b41b-f5f41346d3ec-kube-api-access-cbkwz\") pod \"ceilometer-0\" (UID: \"38d2aff1-77e3-43d1-b41b-f5f41346d3ec\") " pod="openstack/ceilometer-0" Feb 26 16:06:39 crc kubenswrapper[4907]: I0226 16:06:39.565690 4907 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Feb 26 16:06:40 crc kubenswrapper[4907]: I0226 16:06:40.048497 4907 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Feb 26 16:06:40 crc kubenswrapper[4907]: I0226 16:06:40.137341 4907 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a114e8dd-3cb1-4b1a-8f49-48b99c39da3b" path="/var/lib/kubelet/pods/a114e8dd-3cb1-4b1a-8f49-48b99c39da3b/volumes" Feb 26 16:06:40 crc kubenswrapper[4907]: I0226 16:06:40.827286 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"38d2aff1-77e3-43d1-b41b-f5f41346d3ec","Type":"ContainerStarted","Data":"2a7c93ea11857b9884eaff9517b046bb0214f6d849f4cc7dcfc949f9e7b57956"} Feb 26 16:06:41 crc kubenswrapper[4907]: I0226 16:06:41.837226 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"38d2aff1-77e3-43d1-b41b-f5f41346d3ec","Type":"ContainerStarted","Data":"21c6bfd6fefe36774c5c076a4d4b11d2448f496f74161be795e552e78856e231"} Feb 26 16:06:41 crc kubenswrapper[4907]: I0226 
16:06:41.837268 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"38d2aff1-77e3-43d1-b41b-f5f41346d3ec","Type":"ContainerStarted","Data":"d6b69445e3c5e21b6865665d9f215325fd461469837023d46f1a59f7b55b98bd"} Feb 26 16:06:43 crc kubenswrapper[4907]: I0226 16:06:43.858749 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"38d2aff1-77e3-43d1-b41b-f5f41346d3ec","Type":"ContainerStarted","Data":"506f880971c91bbe11b1b9dd4adcd4d072d7857c5fb27e6198ada595656dc3e2"} Feb 26 16:06:44 crc kubenswrapper[4907]: I0226 16:06:44.307715 4907 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-internal-api-0" Feb 26 16:06:44 crc kubenswrapper[4907]: I0226 16:06:44.308111 4907 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-internal-api-0" Feb 26 16:06:44 crc kubenswrapper[4907]: I0226 16:06:44.349570 4907 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-internal-api-0" Feb 26 16:06:44 crc kubenswrapper[4907]: I0226 16:06:44.379519 4907 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-internal-api-0" Feb 26 16:06:44 crc kubenswrapper[4907]: I0226 16:06:44.867395 4907 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-internal-api-0" Feb 26 16:06:44 crc kubenswrapper[4907]: I0226 16:06:44.867702 4907 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-internal-api-0" Feb 26 16:06:45 crc kubenswrapper[4907]: I0226 16:06:45.878717 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"38d2aff1-77e3-43d1-b41b-f5f41346d3ec","Type":"ContainerStarted","Data":"9c55be8865acf469e9565c8555ecf67b7f1be61d092324b8fda3e9e976a1f867"} Feb 26 16:06:45 crc kubenswrapper[4907]: I0226 
16:06:45.879181 4907 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0" Feb 26 16:06:45 crc kubenswrapper[4907]: I0226 16:06:45.905564 4907 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceilometer-0" podStartSLOduration=2.182660569 podStartE2EDuration="6.90554508s" podCreationTimestamp="2026-02-26 16:06:39 +0000 UTC" firstStartedPulling="2026-02-26 16:06:40.049167105 +0000 UTC m=+1462.567728954" lastFinishedPulling="2026-02-26 16:06:44.772051606 +0000 UTC m=+1467.290613465" observedRunningTime="2026-02-26 16:06:45.900083066 +0000 UTC m=+1468.418644945" watchObservedRunningTime="2026-02-26 16:06:45.90554508 +0000 UTC m=+1468.424106929" Feb 26 16:06:46 crc kubenswrapper[4907]: I0226 16:06:46.887209 4907 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Feb 26 16:06:46 crc kubenswrapper[4907]: I0226 16:06:46.887548 4907 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Feb 26 16:06:47 crc kubenswrapper[4907]: I0226 16:06:47.948244 4907 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-internal-api-0" Feb 26 16:06:47 crc kubenswrapper[4907]: I0226 16:06:47.948607 4907 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Feb 26 16:06:47 crc kubenswrapper[4907]: I0226 16:06:47.971638 4907 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-internal-api-0" Feb 26 16:06:49 crc kubenswrapper[4907]: I0226 16:06:49.515974 4907 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Feb 26 16:06:49 crc kubenswrapper[4907]: I0226 16:06:49.516407 4907 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="38d2aff1-77e3-43d1-b41b-f5f41346d3ec" containerName="proxy-httpd" containerID="cri-o://9c55be8865acf469e9565c8555ecf67b7f1be61d092324b8fda3e9e976a1f867" 
gracePeriod=30 Feb 26 16:06:49 crc kubenswrapper[4907]: I0226 16:06:49.516570 4907 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="38d2aff1-77e3-43d1-b41b-f5f41346d3ec" containerName="sg-core" containerID="cri-o://506f880971c91bbe11b1b9dd4adcd4d072d7857c5fb27e6198ada595656dc3e2" gracePeriod=30 Feb 26 16:06:49 crc kubenswrapper[4907]: I0226 16:06:49.516377 4907 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="38d2aff1-77e3-43d1-b41b-f5f41346d3ec" containerName="ceilometer-central-agent" containerID="cri-o://21c6bfd6fefe36774c5c076a4d4b11d2448f496f74161be795e552e78856e231" gracePeriod=30 Feb 26 16:06:49 crc kubenswrapper[4907]: I0226 16:06:49.516637 4907 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="38d2aff1-77e3-43d1-b41b-f5f41346d3ec" containerName="ceilometer-notification-agent" containerID="cri-o://d6b69445e3c5e21b6865665d9f215325fd461469837023d46f1a59f7b55b98bd" gracePeriod=30 Feb 26 16:06:49 crc kubenswrapper[4907]: I0226 16:06:49.914183 4907 generic.go:334] "Generic (PLEG): container finished" podID="38d2aff1-77e3-43d1-b41b-f5f41346d3ec" containerID="9c55be8865acf469e9565c8555ecf67b7f1be61d092324b8fda3e9e976a1f867" exitCode=0 Feb 26 16:06:49 crc kubenswrapper[4907]: I0226 16:06:49.914527 4907 generic.go:334] "Generic (PLEG): container finished" podID="38d2aff1-77e3-43d1-b41b-f5f41346d3ec" containerID="506f880971c91bbe11b1b9dd4adcd4d072d7857c5fb27e6198ada595656dc3e2" exitCode=2 Feb 26 16:06:49 crc kubenswrapper[4907]: I0226 16:06:49.914356 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"38d2aff1-77e3-43d1-b41b-f5f41346d3ec","Type":"ContainerDied","Data":"9c55be8865acf469e9565c8555ecf67b7f1be61d092324b8fda3e9e976a1f867"} Feb 26 16:06:49 crc kubenswrapper[4907]: I0226 16:06:49.914573 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/ceilometer-0" event={"ID":"38d2aff1-77e3-43d1-b41b-f5f41346d3ec","Type":"ContainerDied","Data":"506f880971c91bbe11b1b9dd4adcd4d072d7857c5fb27e6198ada595656dc3e2"} Feb 26 16:06:50 crc kubenswrapper[4907]: I0226 16:06:50.927992 4907 generic.go:334] "Generic (PLEG): container finished" podID="38d2aff1-77e3-43d1-b41b-f5f41346d3ec" containerID="d6b69445e3c5e21b6865665d9f215325fd461469837023d46f1a59f7b55b98bd" exitCode=0 Feb 26 16:06:50 crc kubenswrapper[4907]: I0226 16:06:50.928041 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"38d2aff1-77e3-43d1-b41b-f5f41346d3ec","Type":"ContainerDied","Data":"d6b69445e3c5e21b6865665d9f215325fd461469837023d46f1a59f7b55b98bd"} Feb 26 16:06:52 crc kubenswrapper[4907]: I0226 16:06:52.515958 4907 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Feb 26 16:06:52 crc kubenswrapper[4907]: I0226 16:06:52.634114 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/38d2aff1-77e3-43d1-b41b-f5f41346d3ec-combined-ca-bundle\") pod \"38d2aff1-77e3-43d1-b41b-f5f41346d3ec\" (UID: \"38d2aff1-77e3-43d1-b41b-f5f41346d3ec\") " Feb 26 16:06:52 crc kubenswrapper[4907]: I0226 16:06:52.634204 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/38d2aff1-77e3-43d1-b41b-f5f41346d3ec-config-data\") pod \"38d2aff1-77e3-43d1-b41b-f5f41346d3ec\" (UID: \"38d2aff1-77e3-43d1-b41b-f5f41346d3ec\") " Feb 26 16:06:52 crc kubenswrapper[4907]: I0226 16:06:52.634380 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/38d2aff1-77e3-43d1-b41b-f5f41346d3ec-log-httpd\") pod \"38d2aff1-77e3-43d1-b41b-f5f41346d3ec\" (UID: \"38d2aff1-77e3-43d1-b41b-f5f41346d3ec\") " Feb 26 16:06:52 crc kubenswrapper[4907]: I0226 
16:06:52.634459 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/38d2aff1-77e3-43d1-b41b-f5f41346d3ec-run-httpd\") pod \"38d2aff1-77e3-43d1-b41b-f5f41346d3ec\" (UID: \"38d2aff1-77e3-43d1-b41b-f5f41346d3ec\") " Feb 26 16:06:52 crc kubenswrapper[4907]: I0226 16:06:52.634485 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/38d2aff1-77e3-43d1-b41b-f5f41346d3ec-sg-core-conf-yaml\") pod \"38d2aff1-77e3-43d1-b41b-f5f41346d3ec\" (UID: \"38d2aff1-77e3-43d1-b41b-f5f41346d3ec\") " Feb 26 16:06:52 crc kubenswrapper[4907]: I0226 16:06:52.634512 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/38d2aff1-77e3-43d1-b41b-f5f41346d3ec-scripts\") pod \"38d2aff1-77e3-43d1-b41b-f5f41346d3ec\" (UID: \"38d2aff1-77e3-43d1-b41b-f5f41346d3ec\") " Feb 26 16:06:52 crc kubenswrapper[4907]: I0226 16:06:52.634550 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-cbkwz\" (UniqueName: \"kubernetes.io/projected/38d2aff1-77e3-43d1-b41b-f5f41346d3ec-kube-api-access-cbkwz\") pod \"38d2aff1-77e3-43d1-b41b-f5f41346d3ec\" (UID: \"38d2aff1-77e3-43d1-b41b-f5f41346d3ec\") " Feb 26 16:06:52 crc kubenswrapper[4907]: I0226 16:06:52.634794 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/38d2aff1-77e3-43d1-b41b-f5f41346d3ec-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "38d2aff1-77e3-43d1-b41b-f5f41346d3ec" (UID: "38d2aff1-77e3-43d1-b41b-f5f41346d3ec"). InnerVolumeSpecName "run-httpd". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 26 16:06:52 crc kubenswrapper[4907]: I0226 16:06:52.634907 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/38d2aff1-77e3-43d1-b41b-f5f41346d3ec-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "38d2aff1-77e3-43d1-b41b-f5f41346d3ec" (UID: "38d2aff1-77e3-43d1-b41b-f5f41346d3ec"). InnerVolumeSpecName "log-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 26 16:06:52 crc kubenswrapper[4907]: I0226 16:06:52.635245 4907 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/38d2aff1-77e3-43d1-b41b-f5f41346d3ec-log-httpd\") on node \"crc\" DevicePath \"\"" Feb 26 16:06:52 crc kubenswrapper[4907]: I0226 16:06:52.635265 4907 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/38d2aff1-77e3-43d1-b41b-f5f41346d3ec-run-httpd\") on node \"crc\" DevicePath \"\"" Feb 26 16:06:52 crc kubenswrapper[4907]: I0226 16:06:52.648192 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/38d2aff1-77e3-43d1-b41b-f5f41346d3ec-kube-api-access-cbkwz" (OuterVolumeSpecName: "kube-api-access-cbkwz") pod "38d2aff1-77e3-43d1-b41b-f5f41346d3ec" (UID: "38d2aff1-77e3-43d1-b41b-f5f41346d3ec"). InnerVolumeSpecName "kube-api-access-cbkwz". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 26 16:06:52 crc kubenswrapper[4907]: I0226 16:06:52.657386 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/38d2aff1-77e3-43d1-b41b-f5f41346d3ec-scripts" (OuterVolumeSpecName: "scripts") pod "38d2aff1-77e3-43d1-b41b-f5f41346d3ec" (UID: "38d2aff1-77e3-43d1-b41b-f5f41346d3ec"). InnerVolumeSpecName "scripts". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 26 16:06:52 crc kubenswrapper[4907]: I0226 16:06:52.672153 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/38d2aff1-77e3-43d1-b41b-f5f41346d3ec-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "38d2aff1-77e3-43d1-b41b-f5f41346d3ec" (UID: "38d2aff1-77e3-43d1-b41b-f5f41346d3ec"). InnerVolumeSpecName "sg-core-conf-yaml". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 26 16:06:52 crc kubenswrapper[4907]: I0226 16:06:52.723968 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/38d2aff1-77e3-43d1-b41b-f5f41346d3ec-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "38d2aff1-77e3-43d1-b41b-f5f41346d3ec" (UID: "38d2aff1-77e3-43d1-b41b-f5f41346d3ec"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 26 16:06:52 crc kubenswrapper[4907]: I0226 16:06:52.737062 4907 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/38d2aff1-77e3-43d1-b41b-f5f41346d3ec-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\"" Feb 26 16:06:52 crc kubenswrapper[4907]: I0226 16:06:52.737098 4907 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/38d2aff1-77e3-43d1-b41b-f5f41346d3ec-scripts\") on node \"crc\" DevicePath \"\"" Feb 26 16:06:52 crc kubenswrapper[4907]: I0226 16:06:52.737109 4907 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-cbkwz\" (UniqueName: \"kubernetes.io/projected/38d2aff1-77e3-43d1-b41b-f5f41346d3ec-kube-api-access-cbkwz\") on node \"crc\" DevicePath \"\"" Feb 26 16:06:52 crc kubenswrapper[4907]: I0226 16:06:52.737121 4907 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/38d2aff1-77e3-43d1-b41b-f5f41346d3ec-combined-ca-bundle\") on node 
\"crc\" DevicePath \"\"" Feb 26 16:06:52 crc kubenswrapper[4907]: I0226 16:06:52.742620 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/38d2aff1-77e3-43d1-b41b-f5f41346d3ec-config-data" (OuterVolumeSpecName: "config-data") pod "38d2aff1-77e3-43d1-b41b-f5f41346d3ec" (UID: "38d2aff1-77e3-43d1-b41b-f5f41346d3ec"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 26 16:06:52 crc kubenswrapper[4907]: I0226 16:06:52.838898 4907 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/38d2aff1-77e3-43d1-b41b-f5f41346d3ec-config-data\") on node \"crc\" DevicePath \"\"" Feb 26 16:06:52 crc kubenswrapper[4907]: I0226 16:06:52.953802 4907 generic.go:334] "Generic (PLEG): container finished" podID="38d2aff1-77e3-43d1-b41b-f5f41346d3ec" containerID="21c6bfd6fefe36774c5c076a4d4b11d2448f496f74161be795e552e78856e231" exitCode=0 Feb 26 16:06:52 crc kubenswrapper[4907]: I0226 16:06:52.953865 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"38d2aff1-77e3-43d1-b41b-f5f41346d3ec","Type":"ContainerDied","Data":"21c6bfd6fefe36774c5c076a4d4b11d2448f496f74161be795e552e78856e231"} Feb 26 16:06:52 crc kubenswrapper[4907]: I0226 16:06:52.953902 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"38d2aff1-77e3-43d1-b41b-f5f41346d3ec","Type":"ContainerDied","Data":"2a7c93ea11857b9884eaff9517b046bb0214f6d849f4cc7dcfc949f9e7b57956"} Feb 26 16:06:52 crc kubenswrapper[4907]: I0226 16:06:52.953918 4907 scope.go:117] "RemoveContainer" containerID="9c55be8865acf469e9565c8555ecf67b7f1be61d092324b8fda3e9e976a1f867" Feb 26 16:06:52 crc kubenswrapper[4907]: I0226 16:06:52.954029 4907 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Feb 26 16:06:52 crc kubenswrapper[4907]: I0226 16:06:52.958574 4907 generic.go:334] "Generic (PLEG): container finished" podID="4f76f68d-7dee-4f14-9fb1-e943db5b0533" containerID="b2ca0f3286f6aeae72b4c27b418e941b8810b0b4f88bfdc3d64534da2f76ef0a" exitCode=0 Feb 26 16:06:52 crc kubenswrapper[4907]: I0226 16:06:52.958626 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-db-sync-4b5rh" event={"ID":"4f76f68d-7dee-4f14-9fb1-e943db5b0533","Type":"ContainerDied","Data":"b2ca0f3286f6aeae72b4c27b418e941b8810b0b4f88bfdc3d64534da2f76ef0a"} Feb 26 16:06:53 crc kubenswrapper[4907]: I0226 16:06:53.000690 4907 scope.go:117] "RemoveContainer" containerID="506f880971c91bbe11b1b9dd4adcd4d072d7857c5fb27e6198ada595656dc3e2" Feb 26 16:06:53 crc kubenswrapper[4907]: I0226 16:06:53.005570 4907 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Feb 26 16:06:53 crc kubenswrapper[4907]: I0226 16:06:53.022801 4907 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ceilometer-0"] Feb 26 16:06:53 crc kubenswrapper[4907]: I0226 16:06:53.030580 4907 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"] Feb 26 16:06:53 crc kubenswrapper[4907]: E0226 16:06:53.032380 4907 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="38d2aff1-77e3-43d1-b41b-f5f41346d3ec" containerName="ceilometer-central-agent" Feb 26 16:06:53 crc kubenswrapper[4907]: I0226 16:06:53.032519 4907 state_mem.go:107] "Deleted CPUSet assignment" podUID="38d2aff1-77e3-43d1-b41b-f5f41346d3ec" containerName="ceilometer-central-agent" Feb 26 16:06:53 crc kubenswrapper[4907]: E0226 16:06:53.032637 4907 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="38d2aff1-77e3-43d1-b41b-f5f41346d3ec" containerName="ceilometer-notification-agent" Feb 26 16:06:53 crc kubenswrapper[4907]: I0226 16:06:53.032722 4907 state_mem.go:107] "Deleted CPUSet assignment" 
podUID="38d2aff1-77e3-43d1-b41b-f5f41346d3ec" containerName="ceilometer-notification-agent" Feb 26 16:06:53 crc kubenswrapper[4907]: E0226 16:06:53.032810 4907 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="38d2aff1-77e3-43d1-b41b-f5f41346d3ec" containerName="sg-core" Feb 26 16:06:53 crc kubenswrapper[4907]: I0226 16:06:53.032887 4907 state_mem.go:107] "Deleted CPUSet assignment" podUID="38d2aff1-77e3-43d1-b41b-f5f41346d3ec" containerName="sg-core" Feb 26 16:06:53 crc kubenswrapper[4907]: E0226 16:06:53.032994 4907 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="38d2aff1-77e3-43d1-b41b-f5f41346d3ec" containerName="proxy-httpd" Feb 26 16:06:53 crc kubenswrapper[4907]: I0226 16:06:53.033073 4907 state_mem.go:107] "Deleted CPUSet assignment" podUID="38d2aff1-77e3-43d1-b41b-f5f41346d3ec" containerName="proxy-httpd" Feb 26 16:06:53 crc kubenswrapper[4907]: I0226 16:06:53.033378 4907 memory_manager.go:354] "RemoveStaleState removing state" podUID="38d2aff1-77e3-43d1-b41b-f5f41346d3ec" containerName="ceilometer-central-agent" Feb 26 16:06:53 crc kubenswrapper[4907]: I0226 16:06:53.033474 4907 memory_manager.go:354] "RemoveStaleState removing state" podUID="38d2aff1-77e3-43d1-b41b-f5f41346d3ec" containerName="sg-core" Feb 26 16:06:53 crc kubenswrapper[4907]: I0226 16:06:53.033568 4907 memory_manager.go:354] "RemoveStaleState removing state" podUID="38d2aff1-77e3-43d1-b41b-f5f41346d3ec" containerName="ceilometer-notification-agent" Feb 26 16:06:53 crc kubenswrapper[4907]: I0226 16:06:53.033663 4907 memory_manager.go:354] "RemoveStaleState removing state" podUID="38d2aff1-77e3-43d1-b41b-f5f41346d3ec" containerName="proxy-httpd" Feb 26 16:06:53 crc kubenswrapper[4907]: I0226 16:06:53.036437 4907 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Feb 26 16:06:53 crc kubenswrapper[4907]: I0226 16:06:53.041717 4907 scope.go:117] "RemoveContainer" containerID="d6b69445e3c5e21b6865665d9f215325fd461469837023d46f1a59f7b55b98bd" Feb 26 16:06:53 crc kubenswrapper[4907]: I0226 16:06:53.044272 4907 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts" Feb 26 16:06:53 crc kubenswrapper[4907]: I0226 16:06:53.044374 4907 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data" Feb 26 16:06:53 crc kubenswrapper[4907]: I0226 16:06:53.048071 4907 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Feb 26 16:06:53 crc kubenswrapper[4907]: I0226 16:06:53.075071 4907 scope.go:117] "RemoveContainer" containerID="21c6bfd6fefe36774c5c076a4d4b11d2448f496f74161be795e552e78856e231" Feb 26 16:06:53 crc kubenswrapper[4907]: I0226 16:06:53.093262 4907 scope.go:117] "RemoveContainer" containerID="9c55be8865acf469e9565c8555ecf67b7f1be61d092324b8fda3e9e976a1f867" Feb 26 16:06:53 crc kubenswrapper[4907]: E0226 16:06:53.093929 4907 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"9c55be8865acf469e9565c8555ecf67b7f1be61d092324b8fda3e9e976a1f867\": container with ID starting with 9c55be8865acf469e9565c8555ecf67b7f1be61d092324b8fda3e9e976a1f867 not found: ID does not exist" containerID="9c55be8865acf469e9565c8555ecf67b7f1be61d092324b8fda3e9e976a1f867" Feb 26 16:06:53 crc kubenswrapper[4907]: I0226 16:06:53.093967 4907 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9c55be8865acf469e9565c8555ecf67b7f1be61d092324b8fda3e9e976a1f867"} err="failed to get container status \"9c55be8865acf469e9565c8555ecf67b7f1be61d092324b8fda3e9e976a1f867\": rpc error: code = NotFound desc = could not find container \"9c55be8865acf469e9565c8555ecf67b7f1be61d092324b8fda3e9e976a1f867\": 
container with ID starting with 9c55be8865acf469e9565c8555ecf67b7f1be61d092324b8fda3e9e976a1f867 not found: ID does not exist" Feb 26 16:06:53 crc kubenswrapper[4907]: I0226 16:06:53.093993 4907 scope.go:117] "RemoveContainer" containerID="506f880971c91bbe11b1b9dd4adcd4d072d7857c5fb27e6198ada595656dc3e2" Feb 26 16:06:53 crc kubenswrapper[4907]: E0226 16:06:53.094192 4907 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"506f880971c91bbe11b1b9dd4adcd4d072d7857c5fb27e6198ada595656dc3e2\": container with ID starting with 506f880971c91bbe11b1b9dd4adcd4d072d7857c5fb27e6198ada595656dc3e2 not found: ID does not exist" containerID="506f880971c91bbe11b1b9dd4adcd4d072d7857c5fb27e6198ada595656dc3e2" Feb 26 16:06:53 crc kubenswrapper[4907]: I0226 16:06:53.094223 4907 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"506f880971c91bbe11b1b9dd4adcd4d072d7857c5fb27e6198ada595656dc3e2"} err="failed to get container status \"506f880971c91bbe11b1b9dd4adcd4d072d7857c5fb27e6198ada595656dc3e2\": rpc error: code = NotFound desc = could not find container \"506f880971c91bbe11b1b9dd4adcd4d072d7857c5fb27e6198ada595656dc3e2\": container with ID starting with 506f880971c91bbe11b1b9dd4adcd4d072d7857c5fb27e6198ada595656dc3e2 not found: ID does not exist" Feb 26 16:06:53 crc kubenswrapper[4907]: I0226 16:06:53.094239 4907 scope.go:117] "RemoveContainer" containerID="d6b69445e3c5e21b6865665d9f215325fd461469837023d46f1a59f7b55b98bd" Feb 26 16:06:53 crc kubenswrapper[4907]: E0226 16:06:53.094553 4907 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d6b69445e3c5e21b6865665d9f215325fd461469837023d46f1a59f7b55b98bd\": container with ID starting with d6b69445e3c5e21b6865665d9f215325fd461469837023d46f1a59f7b55b98bd not found: ID does not exist" 
containerID="d6b69445e3c5e21b6865665d9f215325fd461469837023d46f1a59f7b55b98bd" Feb 26 16:06:53 crc kubenswrapper[4907]: I0226 16:06:53.094579 4907 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d6b69445e3c5e21b6865665d9f215325fd461469837023d46f1a59f7b55b98bd"} err="failed to get container status \"d6b69445e3c5e21b6865665d9f215325fd461469837023d46f1a59f7b55b98bd\": rpc error: code = NotFound desc = could not find container \"d6b69445e3c5e21b6865665d9f215325fd461469837023d46f1a59f7b55b98bd\": container with ID starting with d6b69445e3c5e21b6865665d9f215325fd461469837023d46f1a59f7b55b98bd not found: ID does not exist" Feb 26 16:06:53 crc kubenswrapper[4907]: I0226 16:06:53.094613 4907 scope.go:117] "RemoveContainer" containerID="21c6bfd6fefe36774c5c076a4d4b11d2448f496f74161be795e552e78856e231" Feb 26 16:06:53 crc kubenswrapper[4907]: E0226 16:06:53.094969 4907 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"21c6bfd6fefe36774c5c076a4d4b11d2448f496f74161be795e552e78856e231\": container with ID starting with 21c6bfd6fefe36774c5c076a4d4b11d2448f496f74161be795e552e78856e231 not found: ID does not exist" containerID="21c6bfd6fefe36774c5c076a4d4b11d2448f496f74161be795e552e78856e231" Feb 26 16:06:53 crc kubenswrapper[4907]: I0226 16:06:53.095008 4907 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"21c6bfd6fefe36774c5c076a4d4b11d2448f496f74161be795e552e78856e231"} err="failed to get container status \"21c6bfd6fefe36774c5c076a4d4b11d2448f496f74161be795e552e78856e231\": rpc error: code = NotFound desc = could not find container \"21c6bfd6fefe36774c5c076a4d4b11d2448f496f74161be795e552e78856e231\": container with ID starting with 21c6bfd6fefe36774c5c076a4d4b11d2448f496f74161be795e552e78856e231 not found: ID does not exist" Feb 26 16:06:53 crc kubenswrapper[4907]: I0226 16:06:53.144821 4907 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/614d4398-61e7-4159-bcf2-a75e8c2c91fb-config-data\") pod \"ceilometer-0\" (UID: \"614d4398-61e7-4159-bcf2-a75e8c2c91fb\") " pod="openstack/ceilometer-0" Feb 26 16:06:53 crc kubenswrapper[4907]: I0226 16:06:53.144900 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-j9g89\" (UniqueName: \"kubernetes.io/projected/614d4398-61e7-4159-bcf2-a75e8c2c91fb-kube-api-access-j9g89\") pod \"ceilometer-0\" (UID: \"614d4398-61e7-4159-bcf2-a75e8c2c91fb\") " pod="openstack/ceilometer-0" Feb 26 16:06:53 crc kubenswrapper[4907]: I0226 16:06:53.145006 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/614d4398-61e7-4159-bcf2-a75e8c2c91fb-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"614d4398-61e7-4159-bcf2-a75e8c2c91fb\") " pod="openstack/ceilometer-0" Feb 26 16:06:53 crc kubenswrapper[4907]: I0226 16:06:53.145077 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/614d4398-61e7-4159-bcf2-a75e8c2c91fb-run-httpd\") pod \"ceilometer-0\" (UID: \"614d4398-61e7-4159-bcf2-a75e8c2c91fb\") " pod="openstack/ceilometer-0" Feb 26 16:06:53 crc kubenswrapper[4907]: I0226 16:06:53.145100 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/614d4398-61e7-4159-bcf2-a75e8c2c91fb-scripts\") pod \"ceilometer-0\" (UID: \"614d4398-61e7-4159-bcf2-a75e8c2c91fb\") " pod="openstack/ceilometer-0" Feb 26 16:06:53 crc kubenswrapper[4907]: I0226 16:06:53.145120 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: 
\"kubernetes.io/empty-dir/614d4398-61e7-4159-bcf2-a75e8c2c91fb-log-httpd\") pod \"ceilometer-0\" (UID: \"614d4398-61e7-4159-bcf2-a75e8c2c91fb\") " pod="openstack/ceilometer-0" Feb 26 16:06:53 crc kubenswrapper[4907]: I0226 16:06:53.145176 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/614d4398-61e7-4159-bcf2-a75e8c2c91fb-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"614d4398-61e7-4159-bcf2-a75e8c2c91fb\") " pod="openstack/ceilometer-0" Feb 26 16:06:53 crc kubenswrapper[4907]: I0226 16:06:53.247207 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/614d4398-61e7-4159-bcf2-a75e8c2c91fb-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"614d4398-61e7-4159-bcf2-a75e8c2c91fb\") " pod="openstack/ceilometer-0" Feb 26 16:06:53 crc kubenswrapper[4907]: I0226 16:06:53.247286 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/614d4398-61e7-4159-bcf2-a75e8c2c91fb-run-httpd\") pod \"ceilometer-0\" (UID: \"614d4398-61e7-4159-bcf2-a75e8c2c91fb\") " pod="openstack/ceilometer-0" Feb 26 16:06:53 crc kubenswrapper[4907]: I0226 16:06:53.247306 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/614d4398-61e7-4159-bcf2-a75e8c2c91fb-log-httpd\") pod \"ceilometer-0\" (UID: \"614d4398-61e7-4159-bcf2-a75e8c2c91fb\") " pod="openstack/ceilometer-0" Feb 26 16:06:53 crc kubenswrapper[4907]: I0226 16:06:53.247772 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/614d4398-61e7-4159-bcf2-a75e8c2c91fb-run-httpd\") pod \"ceilometer-0\" (UID: \"614d4398-61e7-4159-bcf2-a75e8c2c91fb\") " pod="openstack/ceilometer-0" Feb 26 16:06:53 crc kubenswrapper[4907]: I0226 
16:06:53.248085 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/614d4398-61e7-4159-bcf2-a75e8c2c91fb-scripts\") pod \"ceilometer-0\" (UID: \"614d4398-61e7-4159-bcf2-a75e8c2c91fb\") " pod="openstack/ceilometer-0" Feb 26 16:06:53 crc kubenswrapper[4907]: I0226 16:06:53.248280 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/614d4398-61e7-4159-bcf2-a75e8c2c91fb-log-httpd\") pod \"ceilometer-0\" (UID: \"614d4398-61e7-4159-bcf2-a75e8c2c91fb\") " pod="openstack/ceilometer-0" Feb 26 16:06:53 crc kubenswrapper[4907]: I0226 16:06:53.248294 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/614d4398-61e7-4159-bcf2-a75e8c2c91fb-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"614d4398-61e7-4159-bcf2-a75e8c2c91fb\") " pod="openstack/ceilometer-0" Feb 26 16:06:53 crc kubenswrapper[4907]: I0226 16:06:53.248655 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/614d4398-61e7-4159-bcf2-a75e8c2c91fb-config-data\") pod \"ceilometer-0\" (UID: \"614d4398-61e7-4159-bcf2-a75e8c2c91fb\") " pod="openstack/ceilometer-0" Feb 26 16:06:53 crc kubenswrapper[4907]: I0226 16:06:53.248772 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-j9g89\" (UniqueName: \"kubernetes.io/projected/614d4398-61e7-4159-bcf2-a75e8c2c91fb-kube-api-access-j9g89\") pod \"ceilometer-0\" (UID: \"614d4398-61e7-4159-bcf2-a75e8c2c91fb\") " pod="openstack/ceilometer-0" Feb 26 16:06:53 crc kubenswrapper[4907]: I0226 16:06:53.252941 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/614d4398-61e7-4159-bcf2-a75e8c2c91fb-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: 
\"614d4398-61e7-4159-bcf2-a75e8c2c91fb\") " pod="openstack/ceilometer-0" Feb 26 16:06:53 crc kubenswrapper[4907]: I0226 16:06:53.253920 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/614d4398-61e7-4159-bcf2-a75e8c2c91fb-scripts\") pod \"ceilometer-0\" (UID: \"614d4398-61e7-4159-bcf2-a75e8c2c91fb\") " pod="openstack/ceilometer-0" Feb 26 16:06:53 crc kubenswrapper[4907]: I0226 16:06:53.254550 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/614d4398-61e7-4159-bcf2-a75e8c2c91fb-config-data\") pod \"ceilometer-0\" (UID: \"614d4398-61e7-4159-bcf2-a75e8c2c91fb\") " pod="openstack/ceilometer-0" Feb 26 16:06:53 crc kubenswrapper[4907]: I0226 16:06:53.254836 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/614d4398-61e7-4159-bcf2-a75e8c2c91fb-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"614d4398-61e7-4159-bcf2-a75e8c2c91fb\") " pod="openstack/ceilometer-0" Feb 26 16:06:53 crc kubenswrapper[4907]: I0226 16:06:53.255461 4907 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/glance-default-internal-api-0" podUID="2b1253ca-7753-4742-afc4-e786e4dcc6e0" containerName="glance-httpd" probeResult="failure" output="Get \"https://10.217.0.156:9292/healthcheck\": dial tcp 10.217.0.156:9292: i/o timeout" Feb 26 16:06:53 crc kubenswrapper[4907]: I0226 16:06:53.256566 4907 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/glance-default-internal-api-0" podUID="2b1253ca-7753-4742-afc4-e786e4dcc6e0" containerName="glance-log" probeResult="failure" output="Get \"https://10.217.0.156:9292/healthcheck\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" Feb 26 16:06:53 crc kubenswrapper[4907]: I0226 16:06:53.271722 4907 operation_generator.go:637] "MountVolume.SetUp 
succeeded for volume \"kube-api-access-j9g89\" (UniqueName: \"kubernetes.io/projected/614d4398-61e7-4159-bcf2-a75e8c2c91fb-kube-api-access-j9g89\") pod \"ceilometer-0\" (UID: \"614d4398-61e7-4159-bcf2-a75e8c2c91fb\") " pod="openstack/ceilometer-0" Feb 26 16:06:53 crc kubenswrapper[4907]: I0226 16:06:53.353333 4907 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Feb 26 16:06:53 crc kubenswrapper[4907]: I0226 16:06:53.827678 4907 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Feb 26 16:06:53 crc kubenswrapper[4907]: W0226 16:06:53.830165 4907 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod614d4398_61e7_4159_bcf2_a75e8c2c91fb.slice/crio-4d66ce7f890508fb72c7a79a35a2b2720ae4f7bb0d238b1256d1366905f7ed00 WatchSource:0}: Error finding container 4d66ce7f890508fb72c7a79a35a2b2720ae4f7bb0d238b1256d1366905f7ed00: Status 404 returned error can't find the container with id 4d66ce7f890508fb72c7a79a35a2b2720ae4f7bb0d238b1256d1366905f7ed00 Feb 26 16:06:53 crc kubenswrapper[4907]: I0226 16:06:53.970954 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"614d4398-61e7-4159-bcf2-a75e8c2c91fb","Type":"ContainerStarted","Data":"4d66ce7f890508fb72c7a79a35a2b2720ae4f7bb0d238b1256d1366905f7ed00"} Feb 26 16:06:54 crc kubenswrapper[4907]: I0226 16:06:54.146609 4907 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="38d2aff1-77e3-43d1-b41b-f5f41346d3ec" path="/var/lib/kubelet/pods/38d2aff1-77e3-43d1-b41b-f5f41346d3ec/volumes" Feb 26 16:06:54 crc kubenswrapper[4907]: I0226 16:06:54.491465 4907 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-conductor-db-sync-4b5rh" Feb 26 16:06:54 crc kubenswrapper[4907]: I0226 16:06:54.581159 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4f76f68d-7dee-4f14-9fb1-e943db5b0533-config-data\") pod \"4f76f68d-7dee-4f14-9fb1-e943db5b0533\" (UID: \"4f76f68d-7dee-4f14-9fb1-e943db5b0533\") " Feb 26 16:06:54 crc kubenswrapper[4907]: I0226 16:06:54.581216 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4f76f68d-7dee-4f14-9fb1-e943db5b0533-combined-ca-bundle\") pod \"4f76f68d-7dee-4f14-9fb1-e943db5b0533\" (UID: \"4f76f68d-7dee-4f14-9fb1-e943db5b0533\") " Feb 26 16:06:54 crc kubenswrapper[4907]: I0226 16:06:54.581347 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/4f76f68d-7dee-4f14-9fb1-e943db5b0533-scripts\") pod \"4f76f68d-7dee-4f14-9fb1-e943db5b0533\" (UID: \"4f76f68d-7dee-4f14-9fb1-e943db5b0533\") " Feb 26 16:06:54 crc kubenswrapper[4907]: I0226 16:06:54.581414 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qj92t\" (UniqueName: \"kubernetes.io/projected/4f76f68d-7dee-4f14-9fb1-e943db5b0533-kube-api-access-qj92t\") pod \"4f76f68d-7dee-4f14-9fb1-e943db5b0533\" (UID: \"4f76f68d-7dee-4f14-9fb1-e943db5b0533\") " Feb 26 16:06:54 crc kubenswrapper[4907]: I0226 16:06:54.589721 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4f76f68d-7dee-4f14-9fb1-e943db5b0533-scripts" (OuterVolumeSpecName: "scripts") pod "4f76f68d-7dee-4f14-9fb1-e943db5b0533" (UID: "4f76f68d-7dee-4f14-9fb1-e943db5b0533"). InnerVolumeSpecName "scripts". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 26 16:06:54 crc kubenswrapper[4907]: I0226 16:06:54.612449 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4f76f68d-7dee-4f14-9fb1-e943db5b0533-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "4f76f68d-7dee-4f14-9fb1-e943db5b0533" (UID: "4f76f68d-7dee-4f14-9fb1-e943db5b0533"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 26 16:06:54 crc kubenswrapper[4907]: I0226 16:06:54.612897 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4f76f68d-7dee-4f14-9fb1-e943db5b0533-kube-api-access-qj92t" (OuterVolumeSpecName: "kube-api-access-qj92t") pod "4f76f68d-7dee-4f14-9fb1-e943db5b0533" (UID: "4f76f68d-7dee-4f14-9fb1-e943db5b0533"). InnerVolumeSpecName "kube-api-access-qj92t". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 26 16:06:54 crc kubenswrapper[4907]: I0226 16:06:54.617848 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4f76f68d-7dee-4f14-9fb1-e943db5b0533-config-data" (OuterVolumeSpecName: "config-data") pod "4f76f68d-7dee-4f14-9fb1-e943db5b0533" (UID: "4f76f68d-7dee-4f14-9fb1-e943db5b0533"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 26 16:06:54 crc kubenswrapper[4907]: I0226 16:06:54.683717 4907 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4f76f68d-7dee-4f14-9fb1-e943db5b0533-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 26 16:06:54 crc kubenswrapper[4907]: I0226 16:06:54.684111 4907 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/4f76f68d-7dee-4f14-9fb1-e943db5b0533-scripts\") on node \"crc\" DevicePath \"\"" Feb 26 16:06:54 crc kubenswrapper[4907]: I0226 16:06:54.684259 4907 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qj92t\" (UniqueName: \"kubernetes.io/projected/4f76f68d-7dee-4f14-9fb1-e943db5b0533-kube-api-access-qj92t\") on node \"crc\" DevicePath \"\"" Feb 26 16:06:54 crc kubenswrapper[4907]: I0226 16:06:54.684442 4907 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4f76f68d-7dee-4f14-9fb1-e943db5b0533-config-data\") on node \"crc\" DevicePath \"\"" Feb 26 16:06:54 crc kubenswrapper[4907]: I0226 16:06:54.996614 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"614d4398-61e7-4159-bcf2-a75e8c2c91fb","Type":"ContainerStarted","Data":"9220e6ca6e5e2e1f96dc9d7844d9da87559289e407da3cae6f6f0dcf53af0469"} Feb 26 16:06:55 crc kubenswrapper[4907]: I0226 16:06:55.005541 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-db-sync-4b5rh" event={"ID":"4f76f68d-7dee-4f14-9fb1-e943db5b0533","Type":"ContainerDied","Data":"353545cfed0d6a4d3617d794343ff084cbc9348562ba93fc7fa92ec250807d4b"} Feb 26 16:06:55 crc kubenswrapper[4907]: I0226 16:06:55.005576 4907 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="353545cfed0d6a4d3617d794343ff084cbc9348562ba93fc7fa92ec250807d4b" Feb 26 16:06:55 crc kubenswrapper[4907]: I0226 
16:06:55.006003 4907 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-conductor-db-sync-4b5rh" Feb 26 16:06:55 crc kubenswrapper[4907]: I0226 16:06:55.144040 4907 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell0-conductor-0"] Feb 26 16:06:55 crc kubenswrapper[4907]: E0226 16:06:55.145027 4907 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4f76f68d-7dee-4f14-9fb1-e943db5b0533" containerName="nova-cell0-conductor-db-sync" Feb 26 16:06:55 crc kubenswrapper[4907]: I0226 16:06:55.145071 4907 state_mem.go:107] "Deleted CPUSet assignment" podUID="4f76f68d-7dee-4f14-9fb1-e943db5b0533" containerName="nova-cell0-conductor-db-sync" Feb 26 16:06:55 crc kubenswrapper[4907]: I0226 16:06:55.145514 4907 memory_manager.go:354] "RemoveStaleState removing state" podUID="4f76f68d-7dee-4f14-9fb1-e943db5b0533" containerName="nova-cell0-conductor-db-sync" Feb 26 16:06:55 crc kubenswrapper[4907]: I0226 16:06:55.146571 4907 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-conductor-0" Feb 26 16:06:55 crc kubenswrapper[4907]: I0226 16:06:55.163835 4907 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell0-conductor-config-data" Feb 26 16:06:55 crc kubenswrapper[4907]: I0226 16:06:55.164154 4907 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-nova-dockercfg-bhfw2" Feb 26 16:06:55 crc kubenswrapper[4907]: I0226 16:06:55.184851 4907 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-conductor-0"] Feb 26 16:06:55 crc kubenswrapper[4907]: I0226 16:06:55.300461 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-znd7q\" (UniqueName: \"kubernetes.io/projected/ada06759-c75a-49d4-9bbc-ef11e888b457-kube-api-access-znd7q\") pod \"nova-cell0-conductor-0\" (UID: \"ada06759-c75a-49d4-9bbc-ef11e888b457\") " pod="openstack/nova-cell0-conductor-0" Feb 26 16:06:55 crc kubenswrapper[4907]: I0226 16:06:55.300537 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ada06759-c75a-49d4-9bbc-ef11e888b457-config-data\") pod \"nova-cell0-conductor-0\" (UID: \"ada06759-c75a-49d4-9bbc-ef11e888b457\") " pod="openstack/nova-cell0-conductor-0" Feb 26 16:06:55 crc kubenswrapper[4907]: I0226 16:06:55.300687 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ada06759-c75a-49d4-9bbc-ef11e888b457-combined-ca-bundle\") pod \"nova-cell0-conductor-0\" (UID: \"ada06759-c75a-49d4-9bbc-ef11e888b457\") " pod="openstack/nova-cell0-conductor-0" Feb 26 16:06:55 crc kubenswrapper[4907]: I0226 16:06:55.402738 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: 
\"kubernetes.io/secret/ada06759-c75a-49d4-9bbc-ef11e888b457-config-data\") pod \"nova-cell0-conductor-0\" (UID: \"ada06759-c75a-49d4-9bbc-ef11e888b457\") " pod="openstack/nova-cell0-conductor-0" Feb 26 16:06:55 crc kubenswrapper[4907]: I0226 16:06:55.402862 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ada06759-c75a-49d4-9bbc-ef11e888b457-combined-ca-bundle\") pod \"nova-cell0-conductor-0\" (UID: \"ada06759-c75a-49d4-9bbc-ef11e888b457\") " pod="openstack/nova-cell0-conductor-0" Feb 26 16:06:55 crc kubenswrapper[4907]: I0226 16:06:55.402975 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-znd7q\" (UniqueName: \"kubernetes.io/projected/ada06759-c75a-49d4-9bbc-ef11e888b457-kube-api-access-znd7q\") pod \"nova-cell0-conductor-0\" (UID: \"ada06759-c75a-49d4-9bbc-ef11e888b457\") " pod="openstack/nova-cell0-conductor-0" Feb 26 16:06:55 crc kubenswrapper[4907]: I0226 16:06:55.406947 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ada06759-c75a-49d4-9bbc-ef11e888b457-combined-ca-bundle\") pod \"nova-cell0-conductor-0\" (UID: \"ada06759-c75a-49d4-9bbc-ef11e888b457\") " pod="openstack/nova-cell0-conductor-0" Feb 26 16:06:55 crc kubenswrapper[4907]: I0226 16:06:55.410256 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ada06759-c75a-49d4-9bbc-ef11e888b457-config-data\") pod \"nova-cell0-conductor-0\" (UID: \"ada06759-c75a-49d4-9bbc-ef11e888b457\") " pod="openstack/nova-cell0-conductor-0" Feb 26 16:06:55 crc kubenswrapper[4907]: I0226 16:06:55.422379 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-znd7q\" (UniqueName: \"kubernetes.io/projected/ada06759-c75a-49d4-9bbc-ef11e888b457-kube-api-access-znd7q\") pod \"nova-cell0-conductor-0\" 
(UID: \"ada06759-c75a-49d4-9bbc-ef11e888b457\") " pod="openstack/nova-cell0-conductor-0" Feb 26 16:06:55 crc kubenswrapper[4907]: I0226 16:06:55.527122 4907 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-conductor-0" Feb 26 16:06:56 crc kubenswrapper[4907]: I0226 16:06:56.029056 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"614d4398-61e7-4159-bcf2-a75e8c2c91fb","Type":"ContainerStarted","Data":"c857c5f82d8c70c13bf2c5eb1d3d2caed95365a3f1d0ed75a49e357e7fcb708c"} Feb 26 16:06:56 crc kubenswrapper[4907]: I0226 16:06:56.029447 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"614d4398-61e7-4159-bcf2-a75e8c2c91fb","Type":"ContainerStarted","Data":"fb844999d61fcc35ddd9efca649f4b958c4bffe574f9a722d14b83e7486203ab"} Feb 26 16:06:56 crc kubenswrapper[4907]: W0226 16:06:56.049323 4907 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podada06759_c75a_49d4_9bbc_ef11e888b457.slice/crio-21e336f5643748b8228711c9994bfbce227b85abebe3fdc8968ed10de0a9619e WatchSource:0}: Error finding container 21e336f5643748b8228711c9994bfbce227b85abebe3fdc8968ed10de0a9619e: Status 404 returned error can't find the container with id 21e336f5643748b8228711c9994bfbce227b85abebe3fdc8968ed10de0a9619e Feb 26 16:06:56 crc kubenswrapper[4907]: I0226 16:06:56.063003 4907 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-conductor-0"] Feb 26 16:06:57 crc kubenswrapper[4907]: I0226 16:06:57.044244 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-0" event={"ID":"ada06759-c75a-49d4-9bbc-ef11e888b457","Type":"ContainerStarted","Data":"c096e3fc74fd07c98289f8e9826a7e11206e1cefd432a4464af8d59a8be7006a"} Feb 26 16:06:57 crc kubenswrapper[4907]: I0226 16:06:57.044861 4907 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" 
status="" pod="openstack/nova-cell0-conductor-0" Feb 26 16:06:57 crc kubenswrapper[4907]: I0226 16:06:57.044877 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-0" event={"ID":"ada06759-c75a-49d4-9bbc-ef11e888b457","Type":"ContainerStarted","Data":"21e336f5643748b8228711c9994bfbce227b85abebe3fdc8968ed10de0a9619e"} Feb 26 16:06:57 crc kubenswrapper[4907]: I0226 16:06:57.073256 4907 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell0-conductor-0" podStartSLOduration=2.073237814 podStartE2EDuration="2.073237814s" podCreationTimestamp="2026-02-26 16:06:55 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-26 16:06:57.060804809 +0000 UTC m=+1479.579366668" watchObservedRunningTime="2026-02-26 16:06:57.073237814 +0000 UTC m=+1479.591799663" Feb 26 16:06:58 crc kubenswrapper[4907]: I0226 16:06:58.060183 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"614d4398-61e7-4159-bcf2-a75e8c2c91fb","Type":"ContainerStarted","Data":"263981f6f729e5e0643daa3a3a6cdb7958182b643e5936300173d35ab977bb96"} Feb 26 16:06:58 crc kubenswrapper[4907]: I0226 16:06:58.091549 4907 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceilometer-0" podStartSLOduration=1.454447522 podStartE2EDuration="5.091520486s" podCreationTimestamp="2026-02-26 16:06:53 +0000 UTC" firstStartedPulling="2026-02-26 16:06:53.832561105 +0000 UTC m=+1476.351122954" lastFinishedPulling="2026-02-26 16:06:57.469634069 +0000 UTC m=+1479.988195918" observedRunningTime="2026-02-26 16:06:58.084204107 +0000 UTC m=+1480.602765976" watchObservedRunningTime="2026-02-26 16:06:58.091520486 +0000 UTC m=+1480.610082335" Feb 26 16:06:59 crc kubenswrapper[4907]: I0226 16:06:59.070352 4907 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0" Feb 26 
16:07:05 crc kubenswrapper[4907]: I0226 16:07:05.560008 4907 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-cell0-conductor-0" Feb 26 16:07:06 crc kubenswrapper[4907]: I0226 16:07:06.106005 4907 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell0-cell-mapping-6bjsz"] Feb 26 16:07:06 crc kubenswrapper[4907]: I0226 16:07:06.110958 4907 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-cell-mapping-6bjsz" Feb 26 16:07:06 crc kubenswrapper[4907]: I0226 16:07:06.120899 4907 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell0-manage-scripts" Feb 26 16:07:06 crc kubenswrapper[4907]: I0226 16:07:06.121375 4907 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell0-manage-config-data" Feb 26 16:07:06 crc kubenswrapper[4907]: I0226 16:07:06.165024 4907 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-cell-mapping-6bjsz"] Feb 26 16:07:06 crc kubenswrapper[4907]: I0226 16:07:06.233821 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-krjcf\" (UniqueName: \"kubernetes.io/projected/495e976f-e5c3-4fe4-9a08-12e01970b48d-kube-api-access-krjcf\") pod \"nova-cell0-cell-mapping-6bjsz\" (UID: \"495e976f-e5c3-4fe4-9a08-12e01970b48d\") " pod="openstack/nova-cell0-cell-mapping-6bjsz" Feb 26 16:07:06 crc kubenswrapper[4907]: I0226 16:07:06.233912 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/495e976f-e5c3-4fe4-9a08-12e01970b48d-config-data\") pod \"nova-cell0-cell-mapping-6bjsz\" (UID: \"495e976f-e5c3-4fe4-9a08-12e01970b48d\") " pod="openstack/nova-cell0-cell-mapping-6bjsz" Feb 26 16:07:06 crc kubenswrapper[4907]: I0226 16:07:06.233995 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume 
started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/495e976f-e5c3-4fe4-9a08-12e01970b48d-combined-ca-bundle\") pod \"nova-cell0-cell-mapping-6bjsz\" (UID: \"495e976f-e5c3-4fe4-9a08-12e01970b48d\") " pod="openstack/nova-cell0-cell-mapping-6bjsz" Feb 26 16:07:06 crc kubenswrapper[4907]: I0226 16:07:06.234128 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/495e976f-e5c3-4fe4-9a08-12e01970b48d-scripts\") pod \"nova-cell0-cell-mapping-6bjsz\" (UID: \"495e976f-e5c3-4fe4-9a08-12e01970b48d\") " pod="openstack/nova-cell0-cell-mapping-6bjsz" Feb 26 16:07:06 crc kubenswrapper[4907]: I0226 16:07:06.335718 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/495e976f-e5c3-4fe4-9a08-12e01970b48d-scripts\") pod \"nova-cell0-cell-mapping-6bjsz\" (UID: \"495e976f-e5c3-4fe4-9a08-12e01970b48d\") " pod="openstack/nova-cell0-cell-mapping-6bjsz" Feb 26 16:07:06 crc kubenswrapper[4907]: I0226 16:07:06.335790 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-krjcf\" (UniqueName: \"kubernetes.io/projected/495e976f-e5c3-4fe4-9a08-12e01970b48d-kube-api-access-krjcf\") pod \"nova-cell0-cell-mapping-6bjsz\" (UID: \"495e976f-e5c3-4fe4-9a08-12e01970b48d\") " pod="openstack/nova-cell0-cell-mapping-6bjsz" Feb 26 16:07:06 crc kubenswrapper[4907]: I0226 16:07:06.335836 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/495e976f-e5c3-4fe4-9a08-12e01970b48d-config-data\") pod \"nova-cell0-cell-mapping-6bjsz\" (UID: \"495e976f-e5c3-4fe4-9a08-12e01970b48d\") " pod="openstack/nova-cell0-cell-mapping-6bjsz" Feb 26 16:07:06 crc kubenswrapper[4907]: I0226 16:07:06.335881 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" 
(UniqueName: \"kubernetes.io/secret/495e976f-e5c3-4fe4-9a08-12e01970b48d-combined-ca-bundle\") pod \"nova-cell0-cell-mapping-6bjsz\" (UID: \"495e976f-e5c3-4fe4-9a08-12e01970b48d\") " pod="openstack/nova-cell0-cell-mapping-6bjsz" Feb 26 16:07:06 crc kubenswrapper[4907]: I0226 16:07:06.342477 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/495e976f-e5c3-4fe4-9a08-12e01970b48d-config-data\") pod \"nova-cell0-cell-mapping-6bjsz\" (UID: \"495e976f-e5c3-4fe4-9a08-12e01970b48d\") " pod="openstack/nova-cell0-cell-mapping-6bjsz" Feb 26 16:07:06 crc kubenswrapper[4907]: I0226 16:07:06.344817 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/495e976f-e5c3-4fe4-9a08-12e01970b48d-scripts\") pod \"nova-cell0-cell-mapping-6bjsz\" (UID: \"495e976f-e5c3-4fe4-9a08-12e01970b48d\") " pod="openstack/nova-cell0-cell-mapping-6bjsz" Feb 26 16:07:06 crc kubenswrapper[4907]: I0226 16:07:06.354365 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/495e976f-e5c3-4fe4-9a08-12e01970b48d-combined-ca-bundle\") pod \"nova-cell0-cell-mapping-6bjsz\" (UID: \"495e976f-e5c3-4fe4-9a08-12e01970b48d\") " pod="openstack/nova-cell0-cell-mapping-6bjsz" Feb 26 16:07:06 crc kubenswrapper[4907]: I0226 16:07:06.365482 4907 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-api-0"] Feb 26 16:07:06 crc kubenswrapper[4907]: I0226 16:07:06.370028 4907 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-0" Feb 26 16:07:06 crc kubenswrapper[4907]: I0226 16:07:06.377087 4907 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-api-config-data" Feb 26 16:07:06 crc kubenswrapper[4907]: I0226 16:07:06.394641 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-krjcf\" (UniqueName: \"kubernetes.io/projected/495e976f-e5c3-4fe4-9a08-12e01970b48d-kube-api-access-krjcf\") pod \"nova-cell0-cell-mapping-6bjsz\" (UID: \"495e976f-e5c3-4fe4-9a08-12e01970b48d\") " pod="openstack/nova-cell0-cell-mapping-6bjsz" Feb 26 16:07:06 crc kubenswrapper[4907]: I0226 16:07:06.416651 4907 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Feb 26 16:07:06 crc kubenswrapper[4907]: I0226 16:07:06.434981 4907 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-scheduler-0"] Feb 26 16:07:06 crc kubenswrapper[4907]: I0226 16:07:06.436127 4907 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-scheduler-0" Feb 26 16:07:06 crc kubenswrapper[4907]: I0226 16:07:06.437391 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/dc33a845-b7a2-4d10-93a7-23f788917d59-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"dc33a845-b7a2-4d10-93a7-23f788917d59\") " pod="openstack/nova-api-0" Feb 26 16:07:06 crc kubenswrapper[4907]: I0226 16:07:06.437441 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/dc33a845-b7a2-4d10-93a7-23f788917d59-logs\") pod \"nova-api-0\" (UID: \"dc33a845-b7a2-4d10-93a7-23f788917d59\") " pod="openstack/nova-api-0" Feb 26 16:07:06 crc kubenswrapper[4907]: I0226 16:07:06.437513 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2f2sm\" (UniqueName: \"kubernetes.io/projected/dc33a845-b7a2-4d10-93a7-23f788917d59-kube-api-access-2f2sm\") pod \"nova-api-0\" (UID: \"dc33a845-b7a2-4d10-93a7-23f788917d59\") " pod="openstack/nova-api-0" Feb 26 16:07:06 crc kubenswrapper[4907]: I0226 16:07:06.437551 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/dc33a845-b7a2-4d10-93a7-23f788917d59-config-data\") pod \"nova-api-0\" (UID: \"dc33a845-b7a2-4d10-93a7-23f788917d59\") " pod="openstack/nova-api-0" Feb 26 16:07:06 crc kubenswrapper[4907]: I0226 16:07:06.464693 4907 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-scheduler-0"] Feb 26 16:07:06 crc kubenswrapper[4907]: I0226 16:07:06.470238 4907 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-cell-mapping-6bjsz" Feb 26 16:07:06 crc kubenswrapper[4907]: I0226 16:07:06.489972 4907 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-scheduler-config-data" Feb 26 16:07:06 crc kubenswrapper[4907]: I0226 16:07:06.500960 4907 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Feb 26 16:07:06 crc kubenswrapper[4907]: I0226 16:07:06.502288 4907 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-novncproxy-0" Feb 26 16:07:06 crc kubenswrapper[4907]: I0226 16:07:06.572829 4907 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-novncproxy-config-data" Feb 26 16:07:06 crc kubenswrapper[4907]: I0226 16:07:06.656519 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/640a222e-f92b-468e-bb3e-83de9bae97d4-config-data\") pod \"nova-scheduler-0\" (UID: \"640a222e-f92b-468e-bb3e-83de9bae97d4\") " pod="openstack/nova-scheduler-0" Feb 26 16:07:06 crc kubenswrapper[4907]: I0226 16:07:06.656624 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2f2sm\" (UniqueName: \"kubernetes.io/projected/dc33a845-b7a2-4d10-93a7-23f788917d59-kube-api-access-2f2sm\") pod \"nova-api-0\" (UID: \"dc33a845-b7a2-4d10-93a7-23f788917d59\") " pod="openstack/nova-api-0" Feb 26 16:07:06 crc kubenswrapper[4907]: I0226 16:07:06.656672 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bksrm\" (UniqueName: \"kubernetes.io/projected/640a222e-f92b-468e-bb3e-83de9bae97d4-kube-api-access-bksrm\") pod \"nova-scheduler-0\" (UID: \"640a222e-f92b-468e-bb3e-83de9bae97d4\") " pod="openstack/nova-scheduler-0" Feb 26 16:07:06 crc kubenswrapper[4907]: I0226 16:07:06.656778 4907 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/dc33a845-b7a2-4d10-93a7-23f788917d59-config-data\") pod \"nova-api-0\" (UID: \"dc33a845-b7a2-4d10-93a7-23f788917d59\") " pod="openstack/nova-api-0" Feb 26 16:07:06 crc kubenswrapper[4907]: I0226 16:07:06.656884 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/640a222e-f92b-468e-bb3e-83de9bae97d4-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"640a222e-f92b-468e-bb3e-83de9bae97d4\") " pod="openstack/nova-scheduler-0" Feb 26 16:07:06 crc kubenswrapper[4907]: I0226 16:07:06.656940 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/dc33a845-b7a2-4d10-93a7-23f788917d59-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"dc33a845-b7a2-4d10-93a7-23f788917d59\") " pod="openstack/nova-api-0" Feb 26 16:07:06 crc kubenswrapper[4907]: I0226 16:07:06.657012 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/dc33a845-b7a2-4d10-93a7-23f788917d59-logs\") pod \"nova-api-0\" (UID: \"dc33a845-b7a2-4d10-93a7-23f788917d59\") " pod="openstack/nova-api-0" Feb 26 16:07:06 crc kubenswrapper[4907]: I0226 16:07:06.674091 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/dc33a845-b7a2-4d10-93a7-23f788917d59-logs\") pod \"nova-api-0\" (UID: \"dc33a845-b7a2-4d10-93a7-23f788917d59\") " pod="openstack/nova-api-0" Feb 26 16:07:06 crc kubenswrapper[4907]: I0226 16:07:06.689458 4907 scope.go:117] "RemoveContainer" containerID="22bd2d9f71b46a4b332ba0298c4dd9f15469626f8dd9e9a0f20e7e2952e083f9" Feb 26 16:07:06 crc kubenswrapper[4907]: I0226 16:07:06.720159 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: 
\"kubernetes.io/secret/dc33a845-b7a2-4d10-93a7-23f788917d59-config-data\") pod \"nova-api-0\" (UID: \"dc33a845-b7a2-4d10-93a7-23f788917d59\") " pod="openstack/nova-api-0" Feb 26 16:07:06 crc kubenswrapper[4907]: I0226 16:07:06.747046 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/dc33a845-b7a2-4d10-93a7-23f788917d59-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"dc33a845-b7a2-4d10-93a7-23f788917d59\") " pod="openstack/nova-api-0" Feb 26 16:07:06 crc kubenswrapper[4907]: I0226 16:07:06.757833 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2f2sm\" (UniqueName: \"kubernetes.io/projected/dc33a845-b7a2-4d10-93a7-23f788917d59-kube-api-access-2f2sm\") pod \"nova-api-0\" (UID: \"dc33a845-b7a2-4d10-93a7-23f788917d59\") " pod="openstack/nova-api-0" Feb 26 16:07:06 crc kubenswrapper[4907]: I0226 16:07:06.760437 4907 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Feb 26 16:07:06 crc kubenswrapper[4907]: I0226 16:07:06.761400 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/640a222e-f92b-468e-bb3e-83de9bae97d4-config-data\") pod \"nova-scheduler-0\" (UID: \"640a222e-f92b-468e-bb3e-83de9bae97d4\") " pod="openstack/nova-scheduler-0" Feb 26 16:07:06 crc kubenswrapper[4907]: I0226 16:07:06.761459 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a4a6fb6f-387d-4765-8360-1570fa74a16e-combined-ca-bundle\") pod \"nova-cell1-novncproxy-0\" (UID: \"a4a6fb6f-387d-4765-8360-1570fa74a16e\") " pod="openstack/nova-cell1-novncproxy-0" Feb 26 16:07:06 crc kubenswrapper[4907]: I0226 16:07:06.761554 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bksrm\" (UniqueName: 
\"kubernetes.io/projected/640a222e-f92b-468e-bb3e-83de9bae97d4-kube-api-access-bksrm\") pod \"nova-scheduler-0\" (UID: \"640a222e-f92b-468e-bb3e-83de9bae97d4\") " pod="openstack/nova-scheduler-0" Feb 26 16:07:06 crc kubenswrapper[4907]: I0226 16:07:06.761709 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a4a6fb6f-387d-4765-8360-1570fa74a16e-config-data\") pod \"nova-cell1-novncproxy-0\" (UID: \"a4a6fb6f-387d-4765-8360-1570fa74a16e\") " pod="openstack/nova-cell1-novncproxy-0" Feb 26 16:07:06 crc kubenswrapper[4907]: I0226 16:07:06.761804 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/640a222e-f92b-468e-bb3e-83de9bae97d4-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"640a222e-f92b-468e-bb3e-83de9bae97d4\") " pod="openstack/nova-scheduler-0" Feb 26 16:07:06 crc kubenswrapper[4907]: I0226 16:07:06.761861 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-j7qlk\" (UniqueName: \"kubernetes.io/projected/a4a6fb6f-387d-4765-8360-1570fa74a16e-kube-api-access-j7qlk\") pod \"nova-cell1-novncproxy-0\" (UID: \"a4a6fb6f-387d-4765-8360-1570fa74a16e\") " pod="openstack/nova-cell1-novncproxy-0" Feb 26 16:07:06 crc kubenswrapper[4907]: I0226 16:07:06.788306 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/640a222e-f92b-468e-bb3e-83de9bae97d4-config-data\") pod \"nova-scheduler-0\" (UID: \"640a222e-f92b-468e-bb3e-83de9bae97d4\") " pod="openstack/nova-scheduler-0" Feb 26 16:07:06 crc kubenswrapper[4907]: I0226 16:07:06.790489 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/640a222e-f92b-468e-bb3e-83de9bae97d4-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: 
\"640a222e-f92b-468e-bb3e-83de9bae97d4\") " pod="openstack/nova-scheduler-0" Feb 26 16:07:06 crc kubenswrapper[4907]: I0226 16:07:06.810753 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bksrm\" (UniqueName: \"kubernetes.io/projected/640a222e-f92b-468e-bb3e-83de9bae97d4-kube-api-access-bksrm\") pod \"nova-scheduler-0\" (UID: \"640a222e-f92b-468e-bb3e-83de9bae97d4\") " pod="openstack/nova-scheduler-0" Feb 26 16:07:06 crc kubenswrapper[4907]: I0226 16:07:06.863867 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a4a6fb6f-387d-4765-8360-1570fa74a16e-config-data\") pod \"nova-cell1-novncproxy-0\" (UID: \"a4a6fb6f-387d-4765-8360-1570fa74a16e\") " pod="openstack/nova-cell1-novncproxy-0" Feb 26 16:07:06 crc kubenswrapper[4907]: I0226 16:07:06.863960 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-j7qlk\" (UniqueName: \"kubernetes.io/projected/a4a6fb6f-387d-4765-8360-1570fa74a16e-kube-api-access-j7qlk\") pod \"nova-cell1-novncproxy-0\" (UID: \"a4a6fb6f-387d-4765-8360-1570fa74a16e\") " pod="openstack/nova-cell1-novncproxy-0" Feb 26 16:07:06 crc kubenswrapper[4907]: I0226 16:07:06.864099 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a4a6fb6f-387d-4765-8360-1570fa74a16e-combined-ca-bundle\") pod \"nova-cell1-novncproxy-0\" (UID: \"a4a6fb6f-387d-4765-8360-1570fa74a16e\") " pod="openstack/nova-cell1-novncproxy-0" Feb 26 16:07:06 crc kubenswrapper[4907]: I0226 16:07:06.905283 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a4a6fb6f-387d-4765-8360-1570fa74a16e-config-data\") pod \"nova-cell1-novncproxy-0\" (UID: \"a4a6fb6f-387d-4765-8360-1570fa74a16e\") " pod="openstack/nova-cell1-novncproxy-0" Feb 26 16:07:06 crc kubenswrapper[4907]: 
I0226 16:07:06.909199 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-j7qlk\" (UniqueName: \"kubernetes.io/projected/a4a6fb6f-387d-4765-8360-1570fa74a16e-kube-api-access-j7qlk\") pod \"nova-cell1-novncproxy-0\" (UID: \"a4a6fb6f-387d-4765-8360-1570fa74a16e\") " pod="openstack/nova-cell1-novncproxy-0" Feb 26 16:07:06 crc kubenswrapper[4907]: I0226 16:07:06.921215 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a4a6fb6f-387d-4765-8360-1570fa74a16e-combined-ca-bundle\") pod \"nova-cell1-novncproxy-0\" (UID: \"a4a6fb6f-387d-4765-8360-1570fa74a16e\") " pod="openstack/nova-cell1-novncproxy-0" Feb 26 16:07:06 crc kubenswrapper[4907]: I0226 16:07:06.926093 4907 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-metadata-0"] Feb 26 16:07:06 crc kubenswrapper[4907]: I0226 16:07:06.928119 4907 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Feb 26 16:07:06 crc kubenswrapper[4907]: I0226 16:07:06.941695 4907 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-metadata-config-data" Feb 26 16:07:06 crc kubenswrapper[4907]: I0226 16:07:06.961840 4907 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Feb 26 16:07:06 crc kubenswrapper[4907]: I0226 16:07:06.984650 4907 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-bccf8f775-zlwx8"] Feb 26 16:07:06 crc kubenswrapper[4907]: I0226 16:07:06.986477 4907 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-bccf8f775-zlwx8" Feb 26 16:07:07 crc kubenswrapper[4907]: I0226 16:07:06.996432 4907 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-bccf8f775-zlwx8"] Feb 26 16:07:07 crc kubenswrapper[4907]: I0226 16:07:07.060714 4907 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-0" Feb 26 16:07:07 crc kubenswrapper[4907]: I0226 16:07:07.065293 4907 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0" Feb 26 16:07:07 crc kubenswrapper[4907]: I0226 16:07:07.077015 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/afe4d1a5-bb07-4b94-8e61-fb4602942978-config-data\") pod \"nova-metadata-0\" (UID: \"afe4d1a5-bb07-4b94-8e61-fb4602942978\") " pod="openstack/nova-metadata-0" Feb 26 16:07:07 crc kubenswrapper[4907]: I0226 16:07:07.077089 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-869tx\" (UniqueName: \"kubernetes.io/projected/afe4d1a5-bb07-4b94-8e61-fb4602942978-kube-api-access-869tx\") pod \"nova-metadata-0\" (UID: \"afe4d1a5-bb07-4b94-8e61-fb4602942978\") " pod="openstack/nova-metadata-0" Feb 26 16:07:07 crc kubenswrapper[4907]: I0226 16:07:07.077148 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/9d796571-e8ef-42f7-bf25-96d758b8b32b-dns-svc\") pod \"dnsmasq-dns-bccf8f775-zlwx8\" (UID: \"9d796571-e8ef-42f7-bf25-96d758b8b32b\") " pod="openstack/dnsmasq-dns-bccf8f775-zlwx8" Feb 26 16:07:07 crc kubenswrapper[4907]: I0226 16:07:07.077177 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-p5tf4\" (UniqueName: \"kubernetes.io/projected/9d796571-e8ef-42f7-bf25-96d758b8b32b-kube-api-access-p5tf4\") pod \"dnsmasq-dns-bccf8f775-zlwx8\" (UID: \"9d796571-e8ef-42f7-bf25-96d758b8b32b\") " pod="openstack/dnsmasq-dns-bccf8f775-zlwx8" Feb 26 16:07:07 crc kubenswrapper[4907]: I0226 16:07:07.077197 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: 
\"kubernetes.io/configmap/9d796571-e8ef-42f7-bf25-96d758b8b32b-ovsdbserver-nb\") pod \"dnsmasq-dns-bccf8f775-zlwx8\" (UID: \"9d796571-e8ef-42f7-bf25-96d758b8b32b\") " pod="openstack/dnsmasq-dns-bccf8f775-zlwx8" Feb 26 16:07:07 crc kubenswrapper[4907]: I0226 16:07:07.077249 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/afe4d1a5-bb07-4b94-8e61-fb4602942978-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"afe4d1a5-bb07-4b94-8e61-fb4602942978\") " pod="openstack/nova-metadata-0" Feb 26 16:07:07 crc kubenswrapper[4907]: I0226 16:07:07.077273 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/afe4d1a5-bb07-4b94-8e61-fb4602942978-logs\") pod \"nova-metadata-0\" (UID: \"afe4d1a5-bb07-4b94-8e61-fb4602942978\") " pod="openstack/nova-metadata-0" Feb 26 16:07:07 crc kubenswrapper[4907]: I0226 16:07:07.077297 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/9d796571-e8ef-42f7-bf25-96d758b8b32b-dns-swift-storage-0\") pod \"dnsmasq-dns-bccf8f775-zlwx8\" (UID: \"9d796571-e8ef-42f7-bf25-96d758b8b32b\") " pod="openstack/dnsmasq-dns-bccf8f775-zlwx8" Feb 26 16:07:07 crc kubenswrapper[4907]: I0226 16:07:07.077317 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9d796571-e8ef-42f7-bf25-96d758b8b32b-config\") pod \"dnsmasq-dns-bccf8f775-zlwx8\" (UID: \"9d796571-e8ef-42f7-bf25-96d758b8b32b\") " pod="openstack/dnsmasq-dns-bccf8f775-zlwx8" Feb 26 16:07:07 crc kubenswrapper[4907]: I0226 16:07:07.077345 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: 
\"kubernetes.io/configmap/9d796571-e8ef-42f7-bf25-96d758b8b32b-ovsdbserver-sb\") pod \"dnsmasq-dns-bccf8f775-zlwx8\" (UID: \"9d796571-e8ef-42f7-bf25-96d758b8b32b\") " pod="openstack/dnsmasq-dns-bccf8f775-zlwx8" Feb 26 16:07:07 crc kubenswrapper[4907]: I0226 16:07:07.090280 4907 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-novncproxy-0" Feb 26 16:07:07 crc kubenswrapper[4907]: I0226 16:07:07.179365 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/9d796571-e8ef-42f7-bf25-96d758b8b32b-dns-svc\") pod \"dnsmasq-dns-bccf8f775-zlwx8\" (UID: \"9d796571-e8ef-42f7-bf25-96d758b8b32b\") " pod="openstack/dnsmasq-dns-bccf8f775-zlwx8" Feb 26 16:07:07 crc kubenswrapper[4907]: I0226 16:07:07.179638 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-p5tf4\" (UniqueName: \"kubernetes.io/projected/9d796571-e8ef-42f7-bf25-96d758b8b32b-kube-api-access-p5tf4\") pod \"dnsmasq-dns-bccf8f775-zlwx8\" (UID: \"9d796571-e8ef-42f7-bf25-96d758b8b32b\") " pod="openstack/dnsmasq-dns-bccf8f775-zlwx8" Feb 26 16:07:07 crc kubenswrapper[4907]: I0226 16:07:07.179658 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/9d796571-e8ef-42f7-bf25-96d758b8b32b-ovsdbserver-nb\") pod \"dnsmasq-dns-bccf8f775-zlwx8\" (UID: \"9d796571-e8ef-42f7-bf25-96d758b8b32b\") " pod="openstack/dnsmasq-dns-bccf8f775-zlwx8" Feb 26 16:07:07 crc kubenswrapper[4907]: I0226 16:07:07.179696 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/afe4d1a5-bb07-4b94-8e61-fb4602942978-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"afe4d1a5-bb07-4b94-8e61-fb4602942978\") " pod="openstack/nova-metadata-0" Feb 26 16:07:07 crc kubenswrapper[4907]: I0226 16:07:07.179716 4907 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/afe4d1a5-bb07-4b94-8e61-fb4602942978-logs\") pod \"nova-metadata-0\" (UID: \"afe4d1a5-bb07-4b94-8e61-fb4602942978\") " pod="openstack/nova-metadata-0" Feb 26 16:07:07 crc kubenswrapper[4907]: I0226 16:07:07.179732 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/9d796571-e8ef-42f7-bf25-96d758b8b32b-dns-swift-storage-0\") pod \"dnsmasq-dns-bccf8f775-zlwx8\" (UID: \"9d796571-e8ef-42f7-bf25-96d758b8b32b\") " pod="openstack/dnsmasq-dns-bccf8f775-zlwx8" Feb 26 16:07:07 crc kubenswrapper[4907]: I0226 16:07:07.179747 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9d796571-e8ef-42f7-bf25-96d758b8b32b-config\") pod \"dnsmasq-dns-bccf8f775-zlwx8\" (UID: \"9d796571-e8ef-42f7-bf25-96d758b8b32b\") " pod="openstack/dnsmasq-dns-bccf8f775-zlwx8" Feb 26 16:07:07 crc kubenswrapper[4907]: I0226 16:07:07.179772 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/9d796571-e8ef-42f7-bf25-96d758b8b32b-ovsdbserver-sb\") pod \"dnsmasq-dns-bccf8f775-zlwx8\" (UID: \"9d796571-e8ef-42f7-bf25-96d758b8b32b\") " pod="openstack/dnsmasq-dns-bccf8f775-zlwx8" Feb 26 16:07:07 crc kubenswrapper[4907]: I0226 16:07:07.179843 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/afe4d1a5-bb07-4b94-8e61-fb4602942978-config-data\") pod \"nova-metadata-0\" (UID: \"afe4d1a5-bb07-4b94-8e61-fb4602942978\") " pod="openstack/nova-metadata-0" Feb 26 16:07:07 crc kubenswrapper[4907]: I0226 16:07:07.179880 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-869tx\" (UniqueName: 
\"kubernetes.io/projected/afe4d1a5-bb07-4b94-8e61-fb4602942978-kube-api-access-869tx\") pod \"nova-metadata-0\" (UID: \"afe4d1a5-bb07-4b94-8e61-fb4602942978\") " pod="openstack/nova-metadata-0" Feb 26 16:07:07 crc kubenswrapper[4907]: I0226 16:07:07.180875 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/9d796571-e8ef-42f7-bf25-96d758b8b32b-dns-svc\") pod \"dnsmasq-dns-bccf8f775-zlwx8\" (UID: \"9d796571-e8ef-42f7-bf25-96d758b8b32b\") " pod="openstack/dnsmasq-dns-bccf8f775-zlwx8" Feb 26 16:07:07 crc kubenswrapper[4907]: I0226 16:07:07.181499 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/9d796571-e8ef-42f7-bf25-96d758b8b32b-ovsdbserver-nb\") pod \"dnsmasq-dns-bccf8f775-zlwx8\" (UID: \"9d796571-e8ef-42f7-bf25-96d758b8b32b\") " pod="openstack/dnsmasq-dns-bccf8f775-zlwx8" Feb 26 16:07:07 crc kubenswrapper[4907]: I0226 16:07:07.184763 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/9d796571-e8ef-42f7-bf25-96d758b8b32b-ovsdbserver-sb\") pod \"dnsmasq-dns-bccf8f775-zlwx8\" (UID: \"9d796571-e8ef-42f7-bf25-96d758b8b32b\") " pod="openstack/dnsmasq-dns-bccf8f775-zlwx8" Feb 26 16:07:07 crc kubenswrapper[4907]: I0226 16:07:07.187541 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/9d796571-e8ef-42f7-bf25-96d758b8b32b-dns-swift-storage-0\") pod \"dnsmasq-dns-bccf8f775-zlwx8\" (UID: \"9d796571-e8ef-42f7-bf25-96d758b8b32b\") " pod="openstack/dnsmasq-dns-bccf8f775-zlwx8" Feb 26 16:07:07 crc kubenswrapper[4907]: I0226 16:07:07.193119 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9d796571-e8ef-42f7-bf25-96d758b8b32b-config\") pod \"dnsmasq-dns-bccf8f775-zlwx8\" (UID: 
\"9d796571-e8ef-42f7-bf25-96d758b8b32b\") " pod="openstack/dnsmasq-dns-bccf8f775-zlwx8" Feb 26 16:07:07 crc kubenswrapper[4907]: I0226 16:07:07.203280 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/afe4d1a5-bb07-4b94-8e61-fb4602942978-logs\") pod \"nova-metadata-0\" (UID: \"afe4d1a5-bb07-4b94-8e61-fb4602942978\") " pod="openstack/nova-metadata-0" Feb 26 16:07:07 crc kubenswrapper[4907]: I0226 16:07:07.223299 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-869tx\" (UniqueName: \"kubernetes.io/projected/afe4d1a5-bb07-4b94-8e61-fb4602942978-kube-api-access-869tx\") pod \"nova-metadata-0\" (UID: \"afe4d1a5-bb07-4b94-8e61-fb4602942978\") " pod="openstack/nova-metadata-0" Feb 26 16:07:07 crc kubenswrapper[4907]: I0226 16:07:07.225513 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/afe4d1a5-bb07-4b94-8e61-fb4602942978-config-data\") pod \"nova-metadata-0\" (UID: \"afe4d1a5-bb07-4b94-8e61-fb4602942978\") " pod="openstack/nova-metadata-0" Feb 26 16:07:07 crc kubenswrapper[4907]: I0226 16:07:07.232809 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-p5tf4\" (UniqueName: \"kubernetes.io/projected/9d796571-e8ef-42f7-bf25-96d758b8b32b-kube-api-access-p5tf4\") pod \"dnsmasq-dns-bccf8f775-zlwx8\" (UID: \"9d796571-e8ef-42f7-bf25-96d758b8b32b\") " pod="openstack/dnsmasq-dns-bccf8f775-zlwx8" Feb 26 16:07:07 crc kubenswrapper[4907]: I0226 16:07:07.233484 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/afe4d1a5-bb07-4b94-8e61-fb4602942978-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"afe4d1a5-bb07-4b94-8e61-fb4602942978\") " pod="openstack/nova-metadata-0" Feb 26 16:07:07 crc kubenswrapper[4907]: I0226 16:07:07.268792 4907 util.go:30] "No sandbox for pod can be 
found. Need to start a new one" pod="openstack/nova-metadata-0" Feb 26 16:07:07 crc kubenswrapper[4907]: I0226 16:07:07.326195 4907 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-bccf8f775-zlwx8" Feb 26 16:07:07 crc kubenswrapper[4907]: I0226 16:07:07.543392 4907 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-cell-mapping-6bjsz"] Feb 26 16:07:07 crc kubenswrapper[4907]: I0226 16:07:07.876365 4907 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Feb 26 16:07:08 crc kubenswrapper[4907]: I0226 16:07:08.096966 4907 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-conductor-db-sync-l2rsl"] Feb 26 16:07:08 crc kubenswrapper[4907]: I0226 16:07:08.098497 4907 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-conductor-db-sync-l2rsl" Feb 26 16:07:08 crc kubenswrapper[4907]: I0226 16:07:08.101871 4907 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-conductor-config-data" Feb 26 16:07:08 crc kubenswrapper[4907]: I0226 16:07:08.102404 4907 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-conductor-scripts" Feb 26 16:07:08 crc kubenswrapper[4907]: I0226 16:07:08.124317 4907 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-conductor-db-sync-l2rsl"] Feb 26 16:07:08 crc kubenswrapper[4907]: I0226 16:07:08.182504 4907 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-scheduler-0"] Feb 26 16:07:08 crc kubenswrapper[4907]: I0226 16:07:08.208390 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mj2xx\" (UniqueName: \"kubernetes.io/projected/e85bd7a8-59b8-4eca-a1d6-2824c3e44dd2-kube-api-access-mj2xx\") pod \"nova-cell1-conductor-db-sync-l2rsl\" (UID: \"e85bd7a8-59b8-4eca-a1d6-2824c3e44dd2\") " 
pod="openstack/nova-cell1-conductor-db-sync-l2rsl" Feb 26 16:07:08 crc kubenswrapper[4907]: I0226 16:07:08.208535 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e85bd7a8-59b8-4eca-a1d6-2824c3e44dd2-config-data\") pod \"nova-cell1-conductor-db-sync-l2rsl\" (UID: \"e85bd7a8-59b8-4eca-a1d6-2824c3e44dd2\") " pod="openstack/nova-cell1-conductor-db-sync-l2rsl" Feb 26 16:07:08 crc kubenswrapper[4907]: I0226 16:07:08.208767 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e85bd7a8-59b8-4eca-a1d6-2824c3e44dd2-scripts\") pod \"nova-cell1-conductor-db-sync-l2rsl\" (UID: \"e85bd7a8-59b8-4eca-a1d6-2824c3e44dd2\") " pod="openstack/nova-cell1-conductor-db-sync-l2rsl" Feb 26 16:07:08 crc kubenswrapper[4907]: I0226 16:07:08.208820 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e85bd7a8-59b8-4eca-a1d6-2824c3e44dd2-combined-ca-bundle\") pod \"nova-cell1-conductor-db-sync-l2rsl\" (UID: \"e85bd7a8-59b8-4eca-a1d6-2824c3e44dd2\") " pod="openstack/nova-cell1-conductor-db-sync-l2rsl" Feb 26 16:07:08 crc kubenswrapper[4907]: I0226 16:07:08.209580 4907 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Feb 26 16:07:08 crc kubenswrapper[4907]: I0226 16:07:08.307872 4907 generic.go:334] "Generic (PLEG): container finished" podID="911d5df8-d8e2-4552-9c75-33c5ab72646b" containerID="0e9ea68de0c1e921e9ed4ee0e299561d11e0b96c063a8d42fd8a0ea1f0193bee" exitCode=137 Feb 26 16:07:08 crc kubenswrapper[4907]: I0226 16:07:08.308222 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-6fccfb8496-4tqhr" 
event={"ID":"911d5df8-d8e2-4552-9c75-33c5ab72646b","Type":"ContainerDied","Data":"0e9ea68de0c1e921e9ed4ee0e299561d11e0b96c063a8d42fd8a0ea1f0193bee"} Feb 26 16:07:08 crc kubenswrapper[4907]: I0226 16:07:08.308264 4907 scope.go:117] "RemoveContainer" containerID="5f606b9ab89532e105117c7cf76e6d48e275002733a615d726e58c1777c18aad" Feb 26 16:07:08 crc kubenswrapper[4907]: I0226 16:07:08.315895 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-novncproxy-0" event={"ID":"a4a6fb6f-387d-4765-8360-1570fa74a16e","Type":"ContainerStarted","Data":"d48724b39b980e13296deadb81130fe0ac074c16990e2177f92168b64ed10d00"} Feb 26 16:07:08 crc kubenswrapper[4907]: I0226 16:07:08.323036 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e85bd7a8-59b8-4eca-a1d6-2824c3e44dd2-scripts\") pod \"nova-cell1-conductor-db-sync-l2rsl\" (UID: \"e85bd7a8-59b8-4eca-a1d6-2824c3e44dd2\") " pod="openstack/nova-cell1-conductor-db-sync-l2rsl" Feb 26 16:07:08 crc kubenswrapper[4907]: I0226 16:07:08.323098 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e85bd7a8-59b8-4eca-a1d6-2824c3e44dd2-combined-ca-bundle\") pod \"nova-cell1-conductor-db-sync-l2rsl\" (UID: \"e85bd7a8-59b8-4eca-a1d6-2824c3e44dd2\") " pod="openstack/nova-cell1-conductor-db-sync-l2rsl" Feb 26 16:07:08 crc kubenswrapper[4907]: I0226 16:07:08.323219 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mj2xx\" (UniqueName: \"kubernetes.io/projected/e85bd7a8-59b8-4eca-a1d6-2824c3e44dd2-kube-api-access-mj2xx\") pod \"nova-cell1-conductor-db-sync-l2rsl\" (UID: \"e85bd7a8-59b8-4eca-a1d6-2824c3e44dd2\") " pod="openstack/nova-cell1-conductor-db-sync-l2rsl" Feb 26 16:07:08 crc kubenswrapper[4907]: I0226 16:07:08.323034 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/nova-cell0-cell-mapping-6bjsz" event={"ID":"495e976f-e5c3-4fe4-9a08-12e01970b48d","Type":"ContainerStarted","Data":"fe36a4a7da824268ac681e7da109253b6e256198fd7296219b2d6cc025d3f6ab"} Feb 26 16:07:08 crc kubenswrapper[4907]: I0226 16:07:08.323631 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-cell-mapping-6bjsz" event={"ID":"495e976f-e5c3-4fe4-9a08-12e01970b48d","Type":"ContainerStarted","Data":"1e0b09de6070c353a9c4a642f8c3f32d92b14797e88ee73a084bfae081c1a537"} Feb 26 16:07:08 crc kubenswrapper[4907]: I0226 16:07:08.325052 4907 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Feb 26 16:07:08 crc kubenswrapper[4907]: I0226 16:07:08.326583 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e85bd7a8-59b8-4eca-a1d6-2824c3e44dd2-config-data\") pod \"nova-cell1-conductor-db-sync-l2rsl\" (UID: \"e85bd7a8-59b8-4eca-a1d6-2824c3e44dd2\") " pod="openstack/nova-cell1-conductor-db-sync-l2rsl" Feb 26 16:07:08 crc kubenswrapper[4907]: I0226 16:07:08.327700 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"dc33a845-b7a2-4d10-93a7-23f788917d59","Type":"ContainerStarted","Data":"3252640550de7702d53f928157cf520bf9d1b8ae332a5e9486116155d80a7dc6"} Feb 26 16:07:08 crc kubenswrapper[4907]: I0226 16:07:08.348032 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e85bd7a8-59b8-4eca-a1d6-2824c3e44dd2-combined-ca-bundle\") pod \"nova-cell1-conductor-db-sync-l2rsl\" (UID: \"e85bd7a8-59b8-4eca-a1d6-2824c3e44dd2\") " pod="openstack/nova-cell1-conductor-db-sync-l2rsl" Feb 26 16:07:08 crc kubenswrapper[4907]: I0226 16:07:08.348776 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e85bd7a8-59b8-4eca-a1d6-2824c3e44dd2-scripts\") pod 
\"nova-cell1-conductor-db-sync-l2rsl\" (UID: \"e85bd7a8-59b8-4eca-a1d6-2824c3e44dd2\") " pod="openstack/nova-cell1-conductor-db-sync-l2rsl" Feb 26 16:07:08 crc kubenswrapper[4907]: I0226 16:07:08.349820 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e85bd7a8-59b8-4eca-a1d6-2824c3e44dd2-config-data\") pod \"nova-cell1-conductor-db-sync-l2rsl\" (UID: \"e85bd7a8-59b8-4eca-a1d6-2824c3e44dd2\") " pod="openstack/nova-cell1-conductor-db-sync-l2rsl" Feb 26 16:07:08 crc kubenswrapper[4907]: I0226 16:07:08.353069 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"640a222e-f92b-468e-bb3e-83de9bae97d4","Type":"ContainerStarted","Data":"2c8be294b62b65221d7f2aaa5f5bc97e9db87135b4c67fa553704b9337907d32"} Feb 26 16:07:08 crc kubenswrapper[4907]: I0226 16:07:08.378110 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mj2xx\" (UniqueName: \"kubernetes.io/projected/e85bd7a8-59b8-4eca-a1d6-2824c3e44dd2-kube-api-access-mj2xx\") pod \"nova-cell1-conductor-db-sync-l2rsl\" (UID: \"e85bd7a8-59b8-4eca-a1d6-2824c3e44dd2\") " pod="openstack/nova-cell1-conductor-db-sync-l2rsl" Feb 26 16:07:08 crc kubenswrapper[4907]: I0226 16:07:08.388027 4907 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-bccf8f775-zlwx8"] Feb 26 16:07:08 crc kubenswrapper[4907]: I0226 16:07:08.401216 4907 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell0-cell-mapping-6bjsz" podStartSLOduration=2.401196551 podStartE2EDuration="2.401196551s" podCreationTimestamp="2026-02-26 16:07:06 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-26 16:07:08.371784691 +0000 UTC m=+1490.890346540" watchObservedRunningTime="2026-02-26 16:07:08.401196551 +0000 UTC m=+1490.919758400" Feb 26 16:07:08 crc 
kubenswrapper[4907]: I0226 16:07:08.439866 4907 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-conductor-db-sync-l2rsl" Feb 26 16:07:09 crc kubenswrapper[4907]: I0226 16:07:09.023239 4907 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-conductor-db-sync-l2rsl"] Feb 26 16:07:09 crc kubenswrapper[4907]: I0226 16:07:09.410925 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-db-sync-l2rsl" event={"ID":"e85bd7a8-59b8-4eca-a1d6-2824c3e44dd2","Type":"ContainerStarted","Data":"c3b1f7c4476fdadc4251d9974969d30ed70f9f124e466d21df11ac454b81ccc0"} Feb 26 16:07:09 crc kubenswrapper[4907]: I0226 16:07:09.411378 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-db-sync-l2rsl" event={"ID":"e85bd7a8-59b8-4eca-a1d6-2824c3e44dd2","Type":"ContainerStarted","Data":"92dd714eaeef7bb8bb57879c4401ab45f6c136d0363d7c1819ac5880ac8af2da"} Feb 26 16:07:09 crc kubenswrapper[4907]: I0226 16:07:09.417809 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-6fccfb8496-4tqhr" event={"ID":"911d5df8-d8e2-4552-9c75-33c5ab72646b","Type":"ContainerStarted","Data":"9e39d9243d4cdfe57e174b7503dc46aaf0fae6d591c7d87a7c2c19a92a84a500"} Feb 26 16:07:09 crc kubenswrapper[4907]: I0226 16:07:09.429925 4907 generic.go:334] "Generic (PLEG): container finished" podID="b35f87c4-e535-4901-8814-0b321b201158" containerID="a09830ab9c067f94a8fe072a6ed8e9195e12c6c572d7b1467cb8afc38542fb22" exitCode=137 Feb 26 16:07:09 crc kubenswrapper[4907]: I0226 16:07:09.430005 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-76d88967b8-wmzcw" event={"ID":"b35f87c4-e535-4901-8814-0b321b201158","Type":"ContainerDied","Data":"a09830ab9c067f94a8fe072a6ed8e9195e12c6c572d7b1467cb8afc38542fb22"} Feb 26 16:07:09 crc kubenswrapper[4907]: I0226 16:07:09.430047 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/horizon-76d88967b8-wmzcw" event={"ID":"b35f87c4-e535-4901-8814-0b321b201158","Type":"ContainerStarted","Data":"0790d289682a2d4589b5dad2459ef0ce236c592cc21a39f107819ffb2cf86603"} Feb 26 16:07:09 crc kubenswrapper[4907]: I0226 16:07:09.430071 4907 scope.go:117] "RemoveContainer" containerID="c2b6ec3e96a2871e49421792b819e7d8811902b2acc4ebf5cb6213f4794ef38f" Feb 26 16:07:09 crc kubenswrapper[4907]: I0226 16:07:09.438452 4907 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell1-conductor-db-sync-l2rsl" podStartSLOduration=1.438429547 podStartE2EDuration="1.438429547s" podCreationTimestamp="2026-02-26 16:07:08 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-26 16:07:09.437933546 +0000 UTC m=+1491.956495405" watchObservedRunningTime="2026-02-26 16:07:09.438429547 +0000 UTC m=+1491.956991396" Feb 26 16:07:09 crc kubenswrapper[4907]: I0226 16:07:09.439839 4907 generic.go:334] "Generic (PLEG): container finished" podID="9d796571-e8ef-42f7-bf25-96d758b8b32b" containerID="f283ce6a8dffca39a5e8e4d939c2703aa4d4bff4a834d31701a6f570571b2bbd" exitCode=0 Feb 26 16:07:09 crc kubenswrapper[4907]: I0226 16:07:09.439920 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-bccf8f775-zlwx8" event={"ID":"9d796571-e8ef-42f7-bf25-96d758b8b32b","Type":"ContainerDied","Data":"f283ce6a8dffca39a5e8e4d939c2703aa4d4bff4a834d31701a6f570571b2bbd"} Feb 26 16:07:09 crc kubenswrapper[4907]: I0226 16:07:09.439949 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-bccf8f775-zlwx8" event={"ID":"9d796571-e8ef-42f7-bf25-96d758b8b32b","Type":"ContainerStarted","Data":"e7c726956f42c01947a2cf697394976416b6538e6ccdb8386b1d31067bc48d5f"} Feb 26 16:07:09 crc kubenswrapper[4907]: I0226 16:07:09.442556 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" 
event={"ID":"afe4d1a5-bb07-4b94-8e61-fb4602942978","Type":"ContainerStarted","Data":"acab2779c57bfdd69e6b7ccebc72e1c91aaaab4c092f29366939ef5738a747eb"} Feb 26 16:07:10 crc kubenswrapper[4907]: I0226 16:07:10.410951 4907 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-metadata-0"] Feb 26 16:07:10 crc kubenswrapper[4907]: I0226 16:07:10.430345 4907 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Feb 26 16:07:10 crc kubenswrapper[4907]: I0226 16:07:10.466222 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-bccf8f775-zlwx8" event={"ID":"9d796571-e8ef-42f7-bf25-96d758b8b32b","Type":"ContainerStarted","Data":"e45c0b511780f5b74c3a191d46b53e40644b3c1baf4e3abd973fb26d216b0927"} Feb 26 16:07:10 crc kubenswrapper[4907]: I0226 16:07:10.466630 4907 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-bccf8f775-zlwx8" Feb 26 16:07:10 crc kubenswrapper[4907]: I0226 16:07:10.504375 4907 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-bccf8f775-zlwx8" podStartSLOduration=4.5043501070000005 podStartE2EDuration="4.504350107s" podCreationTimestamp="2026-02-26 16:07:06 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-26 16:07:10.495219483 +0000 UTC m=+1493.013781332" watchObservedRunningTime="2026-02-26 16:07:10.504350107 +0000 UTC m=+1493.022911956" Feb 26 16:07:13 crc kubenswrapper[4907]: I0226 16:07:13.497405 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"dc33a845-b7a2-4d10-93a7-23f788917d59","Type":"ContainerStarted","Data":"8ce86f98a1d69388fb62a3c379e1b52de4a2e2c49b6a2ea15088086fb1fa04ae"} Feb 26 16:07:13 crc kubenswrapper[4907]: I0226 16:07:13.499425 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" 
event={"ID":"dc33a845-b7a2-4d10-93a7-23f788917d59","Type":"ContainerStarted","Data":"3410f43c13155ceb2981ac835807e838becfac6350df365251ec9a2e774891ee"} Feb 26 16:07:13 crc kubenswrapper[4907]: I0226 16:07:13.499922 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"640a222e-f92b-468e-bb3e-83de9bae97d4","Type":"ContainerStarted","Data":"beb456441b61dff5a9777e674ba40f7ddd6b823185de13da3ea816a528968588"} Feb 26 16:07:13 crc kubenswrapper[4907]: I0226 16:07:13.508418 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-novncproxy-0" event={"ID":"a4a6fb6f-387d-4765-8360-1570fa74a16e","Type":"ContainerStarted","Data":"11ce89255fbbde7ffd3fa59b11b8d8d9480560084ad8c285e5b289210fdb92d2"} Feb 26 16:07:13 crc kubenswrapper[4907]: I0226 16:07:13.508543 4907 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-cell1-novncproxy-0" podUID="a4a6fb6f-387d-4765-8360-1570fa74a16e" containerName="nova-cell1-novncproxy-novncproxy" containerID="cri-o://11ce89255fbbde7ffd3fa59b11b8d8d9480560084ad8c285e5b289210fdb92d2" gracePeriod=30 Feb 26 16:07:13 crc kubenswrapper[4907]: I0226 16:07:13.516781 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"afe4d1a5-bb07-4b94-8e61-fb4602942978","Type":"ContainerStarted","Data":"c07743488b8a21e120c453295150af1e5df5a73a15b8ffdd967ae9d411224fff"} Feb 26 16:07:13 crc kubenswrapper[4907]: I0226 16:07:13.516840 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"afe4d1a5-bb07-4b94-8e61-fb4602942978","Type":"ContainerStarted","Data":"0677f508eb3fd1352ee4b5d4c60b44d7e5b624361396714bf9907d5d16475ae1"} Feb 26 16:07:13 crc kubenswrapper[4907]: I0226 16:07:13.516993 4907 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-metadata-0" podUID="afe4d1a5-bb07-4b94-8e61-fb4602942978" containerName="nova-metadata-log" 
containerID="cri-o://0677f508eb3fd1352ee4b5d4c60b44d7e5b624361396714bf9907d5d16475ae1" gracePeriod=30 Feb 26 16:07:13 crc kubenswrapper[4907]: I0226 16:07:13.517317 4907 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-metadata-0" podUID="afe4d1a5-bb07-4b94-8e61-fb4602942978" containerName="nova-metadata-metadata" containerID="cri-o://c07743488b8a21e120c453295150af1e5df5a73a15b8ffdd967ae9d411224fff" gracePeriod=30 Feb 26 16:07:13 crc kubenswrapper[4907]: I0226 16:07:13.557018 4907 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell1-novncproxy-0" podStartSLOduration=3.07423538 podStartE2EDuration="7.556999302s" podCreationTimestamp="2026-02-26 16:07:06 +0000 UTC" firstStartedPulling="2026-02-26 16:07:08.224208277 +0000 UTC m=+1490.742770126" lastFinishedPulling="2026-02-26 16:07:12.706972199 +0000 UTC m=+1495.225534048" observedRunningTime="2026-02-26 16:07:13.544150737 +0000 UTC m=+1496.062712586" watchObservedRunningTime="2026-02-26 16:07:13.556999302 +0000 UTC m=+1496.075561151" Feb 26 16:07:13 crc kubenswrapper[4907]: I0226 16:07:13.560704 4907 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-api-0" podStartSLOduration=2.742232841 podStartE2EDuration="7.560695163s" podCreationTimestamp="2026-02-26 16:07:06 +0000 UTC" firstStartedPulling="2026-02-26 16:07:07.894146425 +0000 UTC m=+1490.412708274" lastFinishedPulling="2026-02-26 16:07:12.712608747 +0000 UTC m=+1495.231170596" observedRunningTime="2026-02-26 16:07:13.524190458 +0000 UTC m=+1496.042752307" watchObservedRunningTime="2026-02-26 16:07:13.560695163 +0000 UTC m=+1496.079257012" Feb 26 16:07:13 crc kubenswrapper[4907]: I0226 16:07:13.576410 4907 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-metadata-0" podStartSLOduration=3.292548386 podStartE2EDuration="7.576388937s" podCreationTimestamp="2026-02-26 16:07:06 +0000 UTC" 
firstStartedPulling="2026-02-26 16:07:08.423130208 +0000 UTC m=+1490.941692047" lastFinishedPulling="2026-02-26 16:07:12.706970749 +0000 UTC m=+1495.225532598" observedRunningTime="2026-02-26 16:07:13.56387952 +0000 UTC m=+1496.082441369" watchObservedRunningTime="2026-02-26 16:07:13.576388937 +0000 UTC m=+1496.094950776" Feb 26 16:07:13 crc kubenswrapper[4907]: I0226 16:07:13.592786 4907 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-scheduler-0" podStartSLOduration=3.107765692 podStartE2EDuration="7.592763307s" podCreationTimestamp="2026-02-26 16:07:06 +0000 UTC" firstStartedPulling="2026-02-26 16:07:08.224729501 +0000 UTC m=+1490.743291360" lastFinishedPulling="2026-02-26 16:07:12.709727136 +0000 UTC m=+1495.228288975" observedRunningTime="2026-02-26 16:07:13.582637399 +0000 UTC m=+1496.101199268" watchObservedRunningTime="2026-02-26 16:07:13.592763307 +0000 UTC m=+1496.111325156" Feb 26 16:07:14 crc kubenswrapper[4907]: I0226 16:07:14.450255 4907 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-metadata-0" Feb 26 16:07:14 crc kubenswrapper[4907]: I0226 16:07:14.527520 4907 generic.go:334] "Generic (PLEG): container finished" podID="afe4d1a5-bb07-4b94-8e61-fb4602942978" containerID="c07743488b8a21e120c453295150af1e5df5a73a15b8ffdd967ae9d411224fff" exitCode=0 Feb 26 16:07:14 crc kubenswrapper[4907]: I0226 16:07:14.527559 4907 generic.go:334] "Generic (PLEG): container finished" podID="afe4d1a5-bb07-4b94-8e61-fb4602942978" containerID="0677f508eb3fd1352ee4b5d4c60b44d7e5b624361396714bf9907d5d16475ae1" exitCode=143 Feb 26 16:07:14 crc kubenswrapper[4907]: I0226 16:07:14.528527 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"afe4d1a5-bb07-4b94-8e61-fb4602942978","Type":"ContainerDied","Data":"c07743488b8a21e120c453295150af1e5df5a73a15b8ffdd967ae9d411224fff"} Feb 26 16:07:14 crc kubenswrapper[4907]: I0226 16:07:14.528574 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"afe4d1a5-bb07-4b94-8e61-fb4602942978","Type":"ContainerDied","Data":"0677f508eb3fd1352ee4b5d4c60b44d7e5b624361396714bf9907d5d16475ae1"} Feb 26 16:07:14 crc kubenswrapper[4907]: I0226 16:07:14.528606 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"afe4d1a5-bb07-4b94-8e61-fb4602942978","Type":"ContainerDied","Data":"acab2779c57bfdd69e6b7ccebc72e1c91aaaab4c092f29366939ef5738a747eb"} Feb 26 16:07:14 crc kubenswrapper[4907]: I0226 16:07:14.528625 4907 scope.go:117] "RemoveContainer" containerID="c07743488b8a21e120c453295150af1e5df5a73a15b8ffdd967ae9d411224fff" Feb 26 16:07:14 crc kubenswrapper[4907]: I0226 16:07:14.528867 4907 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-metadata-0" Feb 26 16:07:14 crc kubenswrapper[4907]: I0226 16:07:14.549707 4907 scope.go:117] "RemoveContainer" containerID="0677f508eb3fd1352ee4b5d4c60b44d7e5b624361396714bf9907d5d16475ae1" Feb 26 16:07:14 crc kubenswrapper[4907]: I0226 16:07:14.568273 4907 scope.go:117] "RemoveContainer" containerID="c07743488b8a21e120c453295150af1e5df5a73a15b8ffdd967ae9d411224fff" Feb 26 16:07:14 crc kubenswrapper[4907]: E0226 16:07:14.568738 4907 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c07743488b8a21e120c453295150af1e5df5a73a15b8ffdd967ae9d411224fff\": container with ID starting with c07743488b8a21e120c453295150af1e5df5a73a15b8ffdd967ae9d411224fff not found: ID does not exist" containerID="c07743488b8a21e120c453295150af1e5df5a73a15b8ffdd967ae9d411224fff" Feb 26 16:07:14 crc kubenswrapper[4907]: I0226 16:07:14.568778 4907 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c07743488b8a21e120c453295150af1e5df5a73a15b8ffdd967ae9d411224fff"} err="failed to get container status \"c07743488b8a21e120c453295150af1e5df5a73a15b8ffdd967ae9d411224fff\": rpc error: code = NotFound desc = could not find container \"c07743488b8a21e120c453295150af1e5df5a73a15b8ffdd967ae9d411224fff\": container with ID starting with c07743488b8a21e120c453295150af1e5df5a73a15b8ffdd967ae9d411224fff not found: ID does not exist" Feb 26 16:07:14 crc kubenswrapper[4907]: I0226 16:07:14.568804 4907 scope.go:117] "RemoveContainer" containerID="0677f508eb3fd1352ee4b5d4c60b44d7e5b624361396714bf9907d5d16475ae1" Feb 26 16:07:14 crc kubenswrapper[4907]: E0226 16:07:14.569124 4907 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"0677f508eb3fd1352ee4b5d4c60b44d7e5b624361396714bf9907d5d16475ae1\": container with ID starting with 
0677f508eb3fd1352ee4b5d4c60b44d7e5b624361396714bf9907d5d16475ae1 not found: ID does not exist" containerID="0677f508eb3fd1352ee4b5d4c60b44d7e5b624361396714bf9907d5d16475ae1" Feb 26 16:07:14 crc kubenswrapper[4907]: I0226 16:07:14.569160 4907 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0677f508eb3fd1352ee4b5d4c60b44d7e5b624361396714bf9907d5d16475ae1"} err="failed to get container status \"0677f508eb3fd1352ee4b5d4c60b44d7e5b624361396714bf9907d5d16475ae1\": rpc error: code = NotFound desc = could not find container \"0677f508eb3fd1352ee4b5d4c60b44d7e5b624361396714bf9907d5d16475ae1\": container with ID starting with 0677f508eb3fd1352ee4b5d4c60b44d7e5b624361396714bf9907d5d16475ae1 not found: ID does not exist" Feb 26 16:07:14 crc kubenswrapper[4907]: I0226 16:07:14.569180 4907 scope.go:117] "RemoveContainer" containerID="c07743488b8a21e120c453295150af1e5df5a73a15b8ffdd967ae9d411224fff" Feb 26 16:07:14 crc kubenswrapper[4907]: I0226 16:07:14.569503 4907 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c07743488b8a21e120c453295150af1e5df5a73a15b8ffdd967ae9d411224fff"} err="failed to get container status \"c07743488b8a21e120c453295150af1e5df5a73a15b8ffdd967ae9d411224fff\": rpc error: code = NotFound desc = could not find container \"c07743488b8a21e120c453295150af1e5df5a73a15b8ffdd967ae9d411224fff\": container with ID starting with c07743488b8a21e120c453295150af1e5df5a73a15b8ffdd967ae9d411224fff not found: ID does not exist" Feb 26 16:07:14 crc kubenswrapper[4907]: I0226 16:07:14.569548 4907 scope.go:117] "RemoveContainer" containerID="0677f508eb3fd1352ee4b5d4c60b44d7e5b624361396714bf9907d5d16475ae1" Feb 26 16:07:14 crc kubenswrapper[4907]: I0226 16:07:14.569878 4907 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0677f508eb3fd1352ee4b5d4c60b44d7e5b624361396714bf9907d5d16475ae1"} err="failed to get container status 
\"0677f508eb3fd1352ee4b5d4c60b44d7e5b624361396714bf9907d5d16475ae1\": rpc error: code = NotFound desc = could not find container \"0677f508eb3fd1352ee4b5d4c60b44d7e5b624361396714bf9907d5d16475ae1\": container with ID starting with 0677f508eb3fd1352ee4b5d4c60b44d7e5b624361396714bf9907d5d16475ae1 not found: ID does not exist" Feb 26 16:07:14 crc kubenswrapper[4907]: I0226 16:07:14.572112 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/afe4d1a5-bb07-4b94-8e61-fb4602942978-config-data\") pod \"afe4d1a5-bb07-4b94-8e61-fb4602942978\" (UID: \"afe4d1a5-bb07-4b94-8e61-fb4602942978\") " Feb 26 16:07:14 crc kubenswrapper[4907]: I0226 16:07:14.572223 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/afe4d1a5-bb07-4b94-8e61-fb4602942978-combined-ca-bundle\") pod \"afe4d1a5-bb07-4b94-8e61-fb4602942978\" (UID: \"afe4d1a5-bb07-4b94-8e61-fb4602942978\") " Feb 26 16:07:14 crc kubenswrapper[4907]: I0226 16:07:14.572447 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-869tx\" (UniqueName: \"kubernetes.io/projected/afe4d1a5-bb07-4b94-8e61-fb4602942978-kube-api-access-869tx\") pod \"afe4d1a5-bb07-4b94-8e61-fb4602942978\" (UID: \"afe4d1a5-bb07-4b94-8e61-fb4602942978\") " Feb 26 16:07:14 crc kubenswrapper[4907]: I0226 16:07:14.572477 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/afe4d1a5-bb07-4b94-8e61-fb4602942978-logs\") pod \"afe4d1a5-bb07-4b94-8e61-fb4602942978\" (UID: \"afe4d1a5-bb07-4b94-8e61-fb4602942978\") " Feb 26 16:07:14 crc kubenswrapper[4907]: I0226 16:07:14.573246 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/afe4d1a5-bb07-4b94-8e61-fb4602942978-logs" (OuterVolumeSpecName: "logs") pod 
"afe4d1a5-bb07-4b94-8e61-fb4602942978" (UID: "afe4d1a5-bb07-4b94-8e61-fb4602942978"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 26 16:07:14 crc kubenswrapper[4907]: I0226 16:07:14.591900 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/afe4d1a5-bb07-4b94-8e61-fb4602942978-kube-api-access-869tx" (OuterVolumeSpecName: "kube-api-access-869tx") pod "afe4d1a5-bb07-4b94-8e61-fb4602942978" (UID: "afe4d1a5-bb07-4b94-8e61-fb4602942978"). InnerVolumeSpecName "kube-api-access-869tx". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 26 16:07:14 crc kubenswrapper[4907]: I0226 16:07:14.622744 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/afe4d1a5-bb07-4b94-8e61-fb4602942978-config-data" (OuterVolumeSpecName: "config-data") pod "afe4d1a5-bb07-4b94-8e61-fb4602942978" (UID: "afe4d1a5-bb07-4b94-8e61-fb4602942978"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 26 16:07:14 crc kubenswrapper[4907]: I0226 16:07:14.630177 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/afe4d1a5-bb07-4b94-8e61-fb4602942978-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "afe4d1a5-bb07-4b94-8e61-fb4602942978" (UID: "afe4d1a5-bb07-4b94-8e61-fb4602942978"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 26 16:07:14 crc kubenswrapper[4907]: I0226 16:07:14.674967 4907 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/afe4d1a5-bb07-4b94-8e61-fb4602942978-config-data\") on node \"crc\" DevicePath \"\"" Feb 26 16:07:14 crc kubenswrapper[4907]: I0226 16:07:14.675017 4907 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/afe4d1a5-bb07-4b94-8e61-fb4602942978-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 26 16:07:14 crc kubenswrapper[4907]: I0226 16:07:14.675032 4907 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-869tx\" (UniqueName: \"kubernetes.io/projected/afe4d1a5-bb07-4b94-8e61-fb4602942978-kube-api-access-869tx\") on node \"crc\" DevicePath \"\"" Feb 26 16:07:14 crc kubenswrapper[4907]: I0226 16:07:14.675046 4907 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/afe4d1a5-bb07-4b94-8e61-fb4602942978-logs\") on node \"crc\" DevicePath \"\"" Feb 26 16:07:14 crc kubenswrapper[4907]: I0226 16:07:14.861941 4907 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-metadata-0"] Feb 26 16:07:14 crc kubenswrapper[4907]: I0226 16:07:14.876569 4907 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-metadata-0"] Feb 26 16:07:14 crc kubenswrapper[4907]: I0226 16:07:14.895559 4907 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-metadata-0"] Feb 26 16:07:14 crc kubenswrapper[4907]: E0226 16:07:14.896098 4907 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="afe4d1a5-bb07-4b94-8e61-fb4602942978" containerName="nova-metadata-metadata" Feb 26 16:07:14 crc kubenswrapper[4907]: I0226 16:07:14.896156 4907 state_mem.go:107] "Deleted CPUSet assignment" podUID="afe4d1a5-bb07-4b94-8e61-fb4602942978" containerName="nova-metadata-metadata" Feb 26 16:07:14 crc 
kubenswrapper[4907]: E0226 16:07:14.896239 4907 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="afe4d1a5-bb07-4b94-8e61-fb4602942978" containerName="nova-metadata-log" Feb 26 16:07:14 crc kubenswrapper[4907]: I0226 16:07:14.896291 4907 state_mem.go:107] "Deleted CPUSet assignment" podUID="afe4d1a5-bb07-4b94-8e61-fb4602942978" containerName="nova-metadata-log" Feb 26 16:07:14 crc kubenswrapper[4907]: I0226 16:07:14.896815 4907 memory_manager.go:354] "RemoveStaleState removing state" podUID="afe4d1a5-bb07-4b94-8e61-fb4602942978" containerName="nova-metadata-metadata" Feb 26 16:07:14 crc kubenswrapper[4907]: I0226 16:07:14.896887 4907 memory_manager.go:354] "RemoveStaleState removing state" podUID="afe4d1a5-bb07-4b94-8e61-fb4602942978" containerName="nova-metadata-log" Feb 26 16:07:14 crc kubenswrapper[4907]: I0226 16:07:14.897819 4907 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Feb 26 16:07:14 crc kubenswrapper[4907]: I0226 16:07:14.900336 4907 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-metadata-internal-svc" Feb 26 16:07:14 crc kubenswrapper[4907]: I0226 16:07:14.900585 4907 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-metadata-config-data" Feb 26 16:07:14 crc kubenswrapper[4907]: I0226 16:07:14.936510 4907 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Feb 26 16:07:14 crc kubenswrapper[4907]: I0226 16:07:14.984017 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d033b981-89c7-485a-a29a-4dc76ee69f43-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"d033b981-89c7-485a-a29a-4dc76ee69f43\") " pod="openstack/nova-metadata-0" Feb 26 16:07:14 crc kubenswrapper[4907]: I0226 16:07:14.984359 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume 
started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d033b981-89c7-485a-a29a-4dc76ee69f43-config-data\") pod \"nova-metadata-0\" (UID: \"d033b981-89c7-485a-a29a-4dc76ee69f43\") " pod="openstack/nova-metadata-0" Feb 26 16:07:14 crc kubenswrapper[4907]: I0226 16:07:14.984512 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/d033b981-89c7-485a-a29a-4dc76ee69f43-logs\") pod \"nova-metadata-0\" (UID: \"d033b981-89c7-485a-a29a-4dc76ee69f43\") " pod="openstack/nova-metadata-0" Feb 26 16:07:14 crc kubenswrapper[4907]: I0226 16:07:14.984794 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qqzgm\" (UniqueName: \"kubernetes.io/projected/d033b981-89c7-485a-a29a-4dc76ee69f43-kube-api-access-qqzgm\") pod \"nova-metadata-0\" (UID: \"d033b981-89c7-485a-a29a-4dc76ee69f43\") " pod="openstack/nova-metadata-0" Feb 26 16:07:14 crc kubenswrapper[4907]: I0226 16:07:14.984927 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/d033b981-89c7-485a-a29a-4dc76ee69f43-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"d033b981-89c7-485a-a29a-4dc76ee69f43\") " pod="openstack/nova-metadata-0" Feb 26 16:07:15 crc kubenswrapper[4907]: I0226 16:07:15.087184 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qqzgm\" (UniqueName: \"kubernetes.io/projected/d033b981-89c7-485a-a29a-4dc76ee69f43-kube-api-access-qqzgm\") pod \"nova-metadata-0\" (UID: \"d033b981-89c7-485a-a29a-4dc76ee69f43\") " pod="openstack/nova-metadata-0" Feb 26 16:07:15 crc kubenswrapper[4907]: I0226 16:07:15.087246 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: 
\"kubernetes.io/secret/d033b981-89c7-485a-a29a-4dc76ee69f43-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"d033b981-89c7-485a-a29a-4dc76ee69f43\") " pod="openstack/nova-metadata-0" Feb 26 16:07:15 crc kubenswrapper[4907]: I0226 16:07:15.087282 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d033b981-89c7-485a-a29a-4dc76ee69f43-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"d033b981-89c7-485a-a29a-4dc76ee69f43\") " pod="openstack/nova-metadata-0" Feb 26 16:07:15 crc kubenswrapper[4907]: I0226 16:07:15.087350 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d033b981-89c7-485a-a29a-4dc76ee69f43-config-data\") pod \"nova-metadata-0\" (UID: \"d033b981-89c7-485a-a29a-4dc76ee69f43\") " pod="openstack/nova-metadata-0" Feb 26 16:07:15 crc kubenswrapper[4907]: I0226 16:07:15.087370 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/d033b981-89c7-485a-a29a-4dc76ee69f43-logs\") pod \"nova-metadata-0\" (UID: \"d033b981-89c7-485a-a29a-4dc76ee69f43\") " pod="openstack/nova-metadata-0" Feb 26 16:07:15 crc kubenswrapper[4907]: I0226 16:07:15.101386 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/d033b981-89c7-485a-a29a-4dc76ee69f43-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"d033b981-89c7-485a-a29a-4dc76ee69f43\") " pod="openstack/nova-metadata-0" Feb 26 16:07:15 crc kubenswrapper[4907]: I0226 16:07:15.108537 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d033b981-89c7-485a-a29a-4dc76ee69f43-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"d033b981-89c7-485a-a29a-4dc76ee69f43\") " pod="openstack/nova-metadata-0" Feb 26 
16:07:15 crc kubenswrapper[4907]: I0226 16:07:15.109219 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/d033b981-89c7-485a-a29a-4dc76ee69f43-logs\") pod \"nova-metadata-0\" (UID: \"d033b981-89c7-485a-a29a-4dc76ee69f43\") " pod="openstack/nova-metadata-0" Feb 26 16:07:15 crc kubenswrapper[4907]: I0226 16:07:15.110089 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qqzgm\" (UniqueName: \"kubernetes.io/projected/d033b981-89c7-485a-a29a-4dc76ee69f43-kube-api-access-qqzgm\") pod \"nova-metadata-0\" (UID: \"d033b981-89c7-485a-a29a-4dc76ee69f43\") " pod="openstack/nova-metadata-0" Feb 26 16:07:15 crc kubenswrapper[4907]: I0226 16:07:15.117914 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d033b981-89c7-485a-a29a-4dc76ee69f43-config-data\") pod \"nova-metadata-0\" (UID: \"d033b981-89c7-485a-a29a-4dc76ee69f43\") " pod="openstack/nova-metadata-0" Feb 26 16:07:15 crc kubenswrapper[4907]: I0226 16:07:15.246964 4907 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-metadata-0" Feb 26 16:07:15 crc kubenswrapper[4907]: W0226 16:07:15.756988 4907 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podd033b981_89c7_485a_a29a_4dc76ee69f43.slice/crio-90ab7b4ab9cefc46f4bcae74e99db069e70fca6a7d33ec7e14193f1b3945ec96 WatchSource:0}: Error finding container 90ab7b4ab9cefc46f4bcae74e99db069e70fca6a7d33ec7e14193f1b3945ec96: Status 404 returned error can't find the container with id 90ab7b4ab9cefc46f4bcae74e99db069e70fca6a7d33ec7e14193f1b3945ec96 Feb 26 16:07:15 crc kubenswrapper[4907]: I0226 16:07:15.766036 4907 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Feb 26 16:07:16 crc kubenswrapper[4907]: I0226 16:07:16.143244 4907 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="afe4d1a5-bb07-4b94-8e61-fb4602942978" path="/var/lib/kubelet/pods/afe4d1a5-bb07-4b94-8e61-fb4602942978/volumes" Feb 26 16:07:16 crc kubenswrapper[4907]: I0226 16:07:16.567758 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"d033b981-89c7-485a-a29a-4dc76ee69f43","Type":"ContainerStarted","Data":"f52ceeba1e8cb335e0b60ab307d3eff9143e5d1b0572dfb26d405d1b0dc1edb1"} Feb 26 16:07:16 crc kubenswrapper[4907]: I0226 16:07:16.567817 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"d033b981-89c7-485a-a29a-4dc76ee69f43","Type":"ContainerStarted","Data":"9ac98c8302673bb5d2e10a6e79f982a217a74c7a50a94ff32bbc497d26ea6bf7"} Feb 26 16:07:16 crc kubenswrapper[4907]: I0226 16:07:16.567836 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"d033b981-89c7-485a-a29a-4dc76ee69f43","Type":"ContainerStarted","Data":"90ab7b4ab9cefc46f4bcae74e99db069e70fca6a7d33ec7e14193f1b3945ec96"} Feb 26 16:07:16 crc kubenswrapper[4907]: I0226 16:07:16.597132 4907 
pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-metadata-0" podStartSLOduration=2.596998067 podStartE2EDuration="2.596998067s" podCreationTimestamp="2026-02-26 16:07:14 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-26 16:07:16.583330922 +0000 UTC m=+1499.101892771" watchObservedRunningTime="2026-02-26 16:07:16.596998067 +0000 UTC m=+1499.115559916" Feb 26 16:07:17 crc kubenswrapper[4907]: I0226 16:07:17.064157 4907 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0" Feb 26 16:07:17 crc kubenswrapper[4907]: I0226 16:07:17.064462 4907 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0" Feb 26 16:07:17 crc kubenswrapper[4907]: I0226 16:07:17.066165 4907 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-scheduler-0" Feb 26 16:07:17 crc kubenswrapper[4907]: I0226 16:07:17.066342 4907 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-scheduler-0" Feb 26 16:07:17 crc kubenswrapper[4907]: I0226 16:07:17.091187 4907 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-cell1-novncproxy-0" Feb 26 16:07:17 crc kubenswrapper[4907]: I0226 16:07:17.106933 4907 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-scheduler-0" Feb 26 16:07:17 crc kubenswrapper[4907]: I0226 16:07:17.327772 4907 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-bccf8f775-zlwx8" Feb 26 16:07:17 crc kubenswrapper[4907]: I0226 16:07:17.413456 4907 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-6578955fd5-khgm9"] Feb 26 16:07:17 crc kubenswrapper[4907]: I0226 16:07:17.413752 4907 kuberuntime_container.go:808] "Killing container with a grace period" 
pod="openstack/dnsmasq-dns-6578955fd5-khgm9" podUID="aa93ec18-05b5-4814-989a-ec50a85bba83" containerName="dnsmasq-dns" containerID="cri-o://7f740691bec51bde462269589c8cfd003c8166fcb69420f9e56b03bbec1c7256" gracePeriod=10 Feb 26 16:07:17 crc kubenswrapper[4907]: I0226 16:07:17.628777 4907 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-scheduler-0" Feb 26 16:07:17 crc kubenswrapper[4907]: I0226 16:07:17.749080 4907 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/horizon-6fccfb8496-4tqhr" Feb 26 16:07:17 crc kubenswrapper[4907]: I0226 16:07:17.750262 4907 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/horizon-6fccfb8496-4tqhr" Feb 26 16:07:18 crc kubenswrapper[4907]: I0226 16:07:18.152909 4907 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="dc33a845-b7a2-4d10-93a7-23f788917d59" containerName="nova-api-log" probeResult="failure" output="Get \"http://10.217.0.193:8774/\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Feb 26 16:07:18 crc kubenswrapper[4907]: I0226 16:07:18.153208 4907 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="dc33a845-b7a2-4d10-93a7-23f788917d59" containerName="nova-api-api" probeResult="failure" output="Get \"http://10.217.0.193:8774/\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Feb 26 16:07:18 crc kubenswrapper[4907]: I0226 16:07:18.158746 4907 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/horizon-76d88967b8-wmzcw" podUID="b35f87c4-e535-4901-8814-0b321b201158" containerName="horizon" probeResult="failure" output="Get \"https://10.217.0.154:8443/dashboard/auth/login/?next=/dashboard/\": dial tcp 10.217.0.154:8443: connect: connection refused" Feb 26 16:07:18 crc kubenswrapper[4907]: I0226 16:07:18.190901 4907 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" 
pod="openstack/horizon-76d88967b8-wmzcw" Feb 26 16:07:18 crc kubenswrapper[4907]: I0226 16:07:18.190978 4907 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/horizon-76d88967b8-wmzcw" Feb 26 16:07:18 crc kubenswrapper[4907]: I0226 16:07:18.574816 4907 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-6578955fd5-khgm9" Feb 26 16:07:18 crc kubenswrapper[4907]: I0226 16:07:18.587643 4907 generic.go:334] "Generic (PLEG): container finished" podID="495e976f-e5c3-4fe4-9a08-12e01970b48d" containerID="fe36a4a7da824268ac681e7da109253b6e256198fd7296219b2d6cc025d3f6ab" exitCode=0 Feb 26 16:07:18 crc kubenswrapper[4907]: I0226 16:07:18.587733 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-cell-mapping-6bjsz" event={"ID":"495e976f-e5c3-4fe4-9a08-12e01970b48d","Type":"ContainerDied","Data":"fe36a4a7da824268ac681e7da109253b6e256198fd7296219b2d6cc025d3f6ab"} Feb 26 16:07:18 crc kubenswrapper[4907]: I0226 16:07:18.589825 4907 generic.go:334] "Generic (PLEG): container finished" podID="aa93ec18-05b5-4814-989a-ec50a85bba83" containerID="7f740691bec51bde462269589c8cfd003c8166fcb69420f9e56b03bbec1c7256" exitCode=0 Feb 26 16:07:18 crc kubenswrapper[4907]: I0226 16:07:18.590285 4907 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-6578955fd5-khgm9" Feb 26 16:07:18 crc kubenswrapper[4907]: I0226 16:07:18.590332 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6578955fd5-khgm9" event={"ID":"aa93ec18-05b5-4814-989a-ec50a85bba83","Type":"ContainerDied","Data":"7f740691bec51bde462269589c8cfd003c8166fcb69420f9e56b03bbec1c7256"} Feb 26 16:07:18 crc kubenswrapper[4907]: I0226 16:07:18.590411 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6578955fd5-khgm9" event={"ID":"aa93ec18-05b5-4814-989a-ec50a85bba83","Type":"ContainerDied","Data":"b98a2e97db98747c096fa8cb29eef9699b999b504ca42b9b1797c7464f9a5c94"} Feb 26 16:07:18 crc kubenswrapper[4907]: I0226 16:07:18.590437 4907 scope.go:117] "RemoveContainer" containerID="7f740691bec51bde462269589c8cfd003c8166fcb69420f9e56b03bbec1c7256" Feb 26 16:07:18 crc kubenswrapper[4907]: I0226 16:07:18.626236 4907 scope.go:117] "RemoveContainer" containerID="d14253eb862cfde6eef00be8eaa22a5abbf3f781a7378501c6ee533869c01a6c" Feb 26 16:07:18 crc kubenswrapper[4907]: I0226 16:07:18.716222 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/aa93ec18-05b5-4814-989a-ec50a85bba83-ovsdbserver-nb\") pod \"aa93ec18-05b5-4814-989a-ec50a85bba83\" (UID: \"aa93ec18-05b5-4814-989a-ec50a85bba83\") " Feb 26 16:07:18 crc kubenswrapper[4907]: I0226 16:07:18.716350 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/aa93ec18-05b5-4814-989a-ec50a85bba83-dns-swift-storage-0\") pod \"aa93ec18-05b5-4814-989a-ec50a85bba83\" (UID: \"aa93ec18-05b5-4814-989a-ec50a85bba83\") " Feb 26 16:07:18 crc kubenswrapper[4907]: I0226 16:07:18.716404 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: 
\"kubernetes.io/configmap/aa93ec18-05b5-4814-989a-ec50a85bba83-dns-svc\") pod \"aa93ec18-05b5-4814-989a-ec50a85bba83\" (UID: \"aa93ec18-05b5-4814-989a-ec50a85bba83\") " Feb 26 16:07:18 crc kubenswrapper[4907]: I0226 16:07:18.716429 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-s42sx\" (UniqueName: \"kubernetes.io/projected/aa93ec18-05b5-4814-989a-ec50a85bba83-kube-api-access-s42sx\") pod \"aa93ec18-05b5-4814-989a-ec50a85bba83\" (UID: \"aa93ec18-05b5-4814-989a-ec50a85bba83\") " Feb 26 16:07:18 crc kubenswrapper[4907]: I0226 16:07:18.716497 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/aa93ec18-05b5-4814-989a-ec50a85bba83-ovsdbserver-sb\") pod \"aa93ec18-05b5-4814-989a-ec50a85bba83\" (UID: \"aa93ec18-05b5-4814-989a-ec50a85bba83\") " Feb 26 16:07:18 crc kubenswrapper[4907]: I0226 16:07:18.716518 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/aa93ec18-05b5-4814-989a-ec50a85bba83-config\") pod \"aa93ec18-05b5-4814-989a-ec50a85bba83\" (UID: \"aa93ec18-05b5-4814-989a-ec50a85bba83\") " Feb 26 16:07:18 crc kubenswrapper[4907]: I0226 16:07:18.720729 4907 scope.go:117] "RemoveContainer" containerID="7f740691bec51bde462269589c8cfd003c8166fcb69420f9e56b03bbec1c7256" Feb 26 16:07:18 crc kubenswrapper[4907]: E0226 16:07:18.722638 4907 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"7f740691bec51bde462269589c8cfd003c8166fcb69420f9e56b03bbec1c7256\": container with ID starting with 7f740691bec51bde462269589c8cfd003c8166fcb69420f9e56b03bbec1c7256 not found: ID does not exist" containerID="7f740691bec51bde462269589c8cfd003c8166fcb69420f9e56b03bbec1c7256" Feb 26 16:07:18 crc kubenswrapper[4907]: I0226 16:07:18.722664 4907 pod_container_deletor.go:53] "DeleteContainer returned 
error" containerID={"Type":"cri-o","ID":"7f740691bec51bde462269589c8cfd003c8166fcb69420f9e56b03bbec1c7256"} err="failed to get container status \"7f740691bec51bde462269589c8cfd003c8166fcb69420f9e56b03bbec1c7256\": rpc error: code = NotFound desc = could not find container \"7f740691bec51bde462269589c8cfd003c8166fcb69420f9e56b03bbec1c7256\": container with ID starting with 7f740691bec51bde462269589c8cfd003c8166fcb69420f9e56b03bbec1c7256 not found: ID does not exist" Feb 26 16:07:18 crc kubenswrapper[4907]: I0226 16:07:18.722685 4907 scope.go:117] "RemoveContainer" containerID="d14253eb862cfde6eef00be8eaa22a5abbf3f781a7378501c6ee533869c01a6c" Feb 26 16:07:18 crc kubenswrapper[4907]: E0226 16:07:18.732856 4907 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d14253eb862cfde6eef00be8eaa22a5abbf3f781a7378501c6ee533869c01a6c\": container with ID starting with d14253eb862cfde6eef00be8eaa22a5abbf3f781a7378501c6ee533869c01a6c not found: ID does not exist" containerID="d14253eb862cfde6eef00be8eaa22a5abbf3f781a7378501c6ee533869c01a6c" Feb 26 16:07:18 crc kubenswrapper[4907]: I0226 16:07:18.732903 4907 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d14253eb862cfde6eef00be8eaa22a5abbf3f781a7378501c6ee533869c01a6c"} err="failed to get container status \"d14253eb862cfde6eef00be8eaa22a5abbf3f781a7378501c6ee533869c01a6c\": rpc error: code = NotFound desc = could not find container \"d14253eb862cfde6eef00be8eaa22a5abbf3f781a7378501c6ee533869c01a6c\": container with ID starting with d14253eb862cfde6eef00be8eaa22a5abbf3f781a7378501c6ee533869c01a6c not found: ID does not exist" Feb 26 16:07:18 crc kubenswrapper[4907]: I0226 16:07:18.775628 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/aa93ec18-05b5-4814-989a-ec50a85bba83-kube-api-access-s42sx" (OuterVolumeSpecName: "kube-api-access-s42sx") pod 
"aa93ec18-05b5-4814-989a-ec50a85bba83" (UID: "aa93ec18-05b5-4814-989a-ec50a85bba83"). InnerVolumeSpecName "kube-api-access-s42sx". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 26 16:07:18 crc kubenswrapper[4907]: I0226 16:07:18.822889 4907 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-s42sx\" (UniqueName: \"kubernetes.io/projected/aa93ec18-05b5-4814-989a-ec50a85bba83-kube-api-access-s42sx\") on node \"crc\" DevicePath \"\"" Feb 26 16:07:18 crc kubenswrapper[4907]: I0226 16:07:18.899932 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/aa93ec18-05b5-4814-989a-ec50a85bba83-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "aa93ec18-05b5-4814-989a-ec50a85bba83" (UID: "aa93ec18-05b5-4814-989a-ec50a85bba83"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 26 16:07:18 crc kubenswrapper[4907]: I0226 16:07:18.903796 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/aa93ec18-05b5-4814-989a-ec50a85bba83-config" (OuterVolumeSpecName: "config") pod "aa93ec18-05b5-4814-989a-ec50a85bba83" (UID: "aa93ec18-05b5-4814-989a-ec50a85bba83"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 26 16:07:18 crc kubenswrapper[4907]: I0226 16:07:18.930737 4907 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/aa93ec18-05b5-4814-989a-ec50a85bba83-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Feb 26 16:07:18 crc kubenswrapper[4907]: I0226 16:07:18.930770 4907 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/aa93ec18-05b5-4814-989a-ec50a85bba83-config\") on node \"crc\" DevicePath \"\"" Feb 26 16:07:18 crc kubenswrapper[4907]: I0226 16:07:18.963668 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/aa93ec18-05b5-4814-989a-ec50a85bba83-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "aa93ec18-05b5-4814-989a-ec50a85bba83" (UID: "aa93ec18-05b5-4814-989a-ec50a85bba83"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 26 16:07:18 crc kubenswrapper[4907]: I0226 16:07:18.965334 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/aa93ec18-05b5-4814-989a-ec50a85bba83-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "aa93ec18-05b5-4814-989a-ec50a85bba83" (UID: "aa93ec18-05b5-4814-989a-ec50a85bba83"). InnerVolumeSpecName "dns-swift-storage-0". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 26 16:07:18 crc kubenswrapper[4907]: I0226 16:07:18.969692 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/aa93ec18-05b5-4814-989a-ec50a85bba83-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "aa93ec18-05b5-4814-989a-ec50a85bba83" (UID: "aa93ec18-05b5-4814-989a-ec50a85bba83"). InnerVolumeSpecName "dns-svc". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 26 16:07:19 crc kubenswrapper[4907]: I0226 16:07:19.031903 4907 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/aa93ec18-05b5-4814-989a-ec50a85bba83-dns-svc\") on node \"crc\" DevicePath \"\"" Feb 26 16:07:19 crc kubenswrapper[4907]: I0226 16:07:19.032083 4907 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/aa93ec18-05b5-4814-989a-ec50a85bba83-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Feb 26 16:07:19 crc kubenswrapper[4907]: I0226 16:07:19.032144 4907 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/aa93ec18-05b5-4814-989a-ec50a85bba83-dns-swift-storage-0\") on node \"crc\" DevicePath \"\"" Feb 26 16:07:19 crc kubenswrapper[4907]: I0226 16:07:19.221147 4907 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-6578955fd5-khgm9"] Feb 26 16:07:19 crc kubenswrapper[4907]: I0226 16:07:19.233204 4907 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-6578955fd5-khgm9"] Feb 26 16:07:20 crc kubenswrapper[4907]: I0226 16:07:20.017563 4907 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-cell-mapping-6bjsz" Feb 26 16:07:20 crc kubenswrapper[4907]: I0226 16:07:20.135886 4907 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="aa93ec18-05b5-4814-989a-ec50a85bba83" path="/var/lib/kubelet/pods/aa93ec18-05b5-4814-989a-ec50a85bba83/volumes" Feb 26 16:07:20 crc kubenswrapper[4907]: I0226 16:07:20.155948 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/495e976f-e5c3-4fe4-9a08-12e01970b48d-combined-ca-bundle\") pod \"495e976f-e5c3-4fe4-9a08-12e01970b48d\" (UID: \"495e976f-e5c3-4fe4-9a08-12e01970b48d\") " Feb 26 16:07:20 crc kubenswrapper[4907]: I0226 16:07:20.156177 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/495e976f-e5c3-4fe4-9a08-12e01970b48d-config-data\") pod \"495e976f-e5c3-4fe4-9a08-12e01970b48d\" (UID: \"495e976f-e5c3-4fe4-9a08-12e01970b48d\") " Feb 26 16:07:20 crc kubenswrapper[4907]: I0226 16:07:20.156257 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/495e976f-e5c3-4fe4-9a08-12e01970b48d-scripts\") pod \"495e976f-e5c3-4fe4-9a08-12e01970b48d\" (UID: \"495e976f-e5c3-4fe4-9a08-12e01970b48d\") " Feb 26 16:07:20 crc kubenswrapper[4907]: I0226 16:07:20.156376 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-krjcf\" (UniqueName: \"kubernetes.io/projected/495e976f-e5c3-4fe4-9a08-12e01970b48d-kube-api-access-krjcf\") pod \"495e976f-e5c3-4fe4-9a08-12e01970b48d\" (UID: \"495e976f-e5c3-4fe4-9a08-12e01970b48d\") " Feb 26 16:07:20 crc kubenswrapper[4907]: I0226 16:07:20.162280 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/495e976f-e5c3-4fe4-9a08-12e01970b48d-scripts" (OuterVolumeSpecName: "scripts") pod 
"495e976f-e5c3-4fe4-9a08-12e01970b48d" (UID: "495e976f-e5c3-4fe4-9a08-12e01970b48d"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 26 16:07:20 crc kubenswrapper[4907]: I0226 16:07:20.169277 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/495e976f-e5c3-4fe4-9a08-12e01970b48d-kube-api-access-krjcf" (OuterVolumeSpecName: "kube-api-access-krjcf") pod "495e976f-e5c3-4fe4-9a08-12e01970b48d" (UID: "495e976f-e5c3-4fe4-9a08-12e01970b48d"). InnerVolumeSpecName "kube-api-access-krjcf". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 26 16:07:20 crc kubenswrapper[4907]: I0226 16:07:20.221277 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/495e976f-e5c3-4fe4-9a08-12e01970b48d-config-data" (OuterVolumeSpecName: "config-data") pod "495e976f-e5c3-4fe4-9a08-12e01970b48d" (UID: "495e976f-e5c3-4fe4-9a08-12e01970b48d"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 26 16:07:20 crc kubenswrapper[4907]: I0226 16:07:20.249028 4907 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-metadata-0" Feb 26 16:07:20 crc kubenswrapper[4907]: I0226 16:07:20.249472 4907 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-metadata-0" Feb 26 16:07:20 crc kubenswrapper[4907]: I0226 16:07:20.260703 4907 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/495e976f-e5c3-4fe4-9a08-12e01970b48d-config-data\") on node \"crc\" DevicePath \"\"" Feb 26 16:07:20 crc kubenswrapper[4907]: I0226 16:07:20.260739 4907 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/495e976f-e5c3-4fe4-9a08-12e01970b48d-scripts\") on node \"crc\" DevicePath \"\"" Feb 26 16:07:20 crc kubenswrapper[4907]: I0226 16:07:20.260753 4907 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-krjcf\" (UniqueName: \"kubernetes.io/projected/495e976f-e5c3-4fe4-9a08-12e01970b48d-kube-api-access-krjcf\") on node \"crc\" DevicePath \"\"" Feb 26 16:07:20 crc kubenswrapper[4907]: I0226 16:07:20.280703 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/495e976f-e5c3-4fe4-9a08-12e01970b48d-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "495e976f-e5c3-4fe4-9a08-12e01970b48d" (UID: "495e976f-e5c3-4fe4-9a08-12e01970b48d"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 26 16:07:20 crc kubenswrapper[4907]: I0226 16:07:20.362422 4907 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/495e976f-e5c3-4fe4-9a08-12e01970b48d-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 26 16:07:20 crc kubenswrapper[4907]: I0226 16:07:20.609715 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-cell-mapping-6bjsz" event={"ID":"495e976f-e5c3-4fe4-9a08-12e01970b48d","Type":"ContainerDied","Data":"1e0b09de6070c353a9c4a642f8c3f32d92b14797e88ee73a084bfae081c1a537"} Feb 26 16:07:20 crc kubenswrapper[4907]: I0226 16:07:20.610096 4907 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="1e0b09de6070c353a9c4a642f8c3f32d92b14797e88ee73a084bfae081c1a537" Feb 26 16:07:20 crc kubenswrapper[4907]: I0226 16:07:20.609777 4907 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-cell-mapping-6bjsz" Feb 26 16:07:20 crc kubenswrapper[4907]: I0226 16:07:20.907723 4907 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0"] Feb 26 16:07:20 crc kubenswrapper[4907]: I0226 16:07:20.908160 4907 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-api-0" podUID="dc33a845-b7a2-4d10-93a7-23f788917d59" containerName="nova-api-log" containerID="cri-o://3410f43c13155ceb2981ac835807e838becfac6350df365251ec9a2e774891ee" gracePeriod=30 Feb 26 16:07:20 crc kubenswrapper[4907]: I0226 16:07:20.908310 4907 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-api-0" podUID="dc33a845-b7a2-4d10-93a7-23f788917d59" containerName="nova-api-api" containerID="cri-o://8ce86f98a1d69388fb62a3c379e1b52de4a2e2c49b6a2ea15088086fb1fa04ae" gracePeriod=30 Feb 26 16:07:20 crc kubenswrapper[4907]: I0226 16:07:20.915910 4907 kubelet.go:2437] "SyncLoop DELETE" source="api" 
pods=["openstack/nova-scheduler-0"] Feb 26 16:07:20 crc kubenswrapper[4907]: I0226 16:07:20.916118 4907 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-scheduler-0" podUID="640a222e-f92b-468e-bb3e-83de9bae97d4" containerName="nova-scheduler-scheduler" containerID="cri-o://beb456441b61dff5a9777e674ba40f7ddd6b823185de13da3ea816a528968588" gracePeriod=30 Feb 26 16:07:20 crc kubenswrapper[4907]: I0226 16:07:20.940816 4907 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-metadata-0"] Feb 26 16:07:21 crc kubenswrapper[4907]: I0226 16:07:21.627518 4907 generic.go:334] "Generic (PLEG): container finished" podID="dc33a845-b7a2-4d10-93a7-23f788917d59" containerID="3410f43c13155ceb2981ac835807e838becfac6350df365251ec9a2e774891ee" exitCode=143 Feb 26 16:07:21 crc kubenswrapper[4907]: I0226 16:07:21.628333 4907 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-metadata-0" podUID="d033b981-89c7-485a-a29a-4dc76ee69f43" containerName="nova-metadata-log" containerID="cri-o://9ac98c8302673bb5d2e10a6e79f982a217a74c7a50a94ff32bbc497d26ea6bf7" gracePeriod=30 Feb 26 16:07:21 crc kubenswrapper[4907]: I0226 16:07:21.628055 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"dc33a845-b7a2-4d10-93a7-23f788917d59","Type":"ContainerDied","Data":"3410f43c13155ceb2981ac835807e838becfac6350df365251ec9a2e774891ee"} Feb 26 16:07:21 crc kubenswrapper[4907]: I0226 16:07:21.628928 4907 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-metadata-0" podUID="d033b981-89c7-485a-a29a-4dc76ee69f43" containerName="nova-metadata-metadata" containerID="cri-o://f52ceeba1e8cb335e0b60ab307d3eff9143e5d1b0572dfb26d405d1b0dc1edb1" gracePeriod=30 Feb 26 16:07:22 crc kubenswrapper[4907]: E0226 16:07:22.075754 4907 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register 
an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="beb456441b61dff5a9777e674ba40f7ddd6b823185de13da3ea816a528968588" cmd=["/usr/bin/pgrep","-r","DRST","nova-scheduler"] Feb 26 16:07:22 crc kubenswrapper[4907]: E0226 16:07:22.079006 4907 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="beb456441b61dff5a9777e674ba40f7ddd6b823185de13da3ea816a528968588" cmd=["/usr/bin/pgrep","-r","DRST","nova-scheduler"] Feb 26 16:07:22 crc kubenswrapper[4907]: E0226 16:07:22.080634 4907 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="beb456441b61dff5a9777e674ba40f7ddd6b823185de13da3ea816a528968588" cmd=["/usr/bin/pgrep","-r","DRST","nova-scheduler"] Feb 26 16:07:22 crc kubenswrapper[4907]: E0226 16:07:22.080784 4907 prober.go:104] "Probe errored" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" probeType="Readiness" pod="openstack/nova-scheduler-0" podUID="640a222e-f92b-468e-bb3e-83de9bae97d4" containerName="nova-scheduler-scheduler" Feb 26 16:07:22 crc kubenswrapper[4907]: I0226 16:07:22.354004 4907 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-metadata-0" Feb 26 16:07:22 crc kubenswrapper[4907]: I0226 16:07:22.518078 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/d033b981-89c7-485a-a29a-4dc76ee69f43-logs\") pod \"d033b981-89c7-485a-a29a-4dc76ee69f43\" (UID: \"d033b981-89c7-485a-a29a-4dc76ee69f43\") " Feb 26 16:07:22 crc kubenswrapper[4907]: I0226 16:07:22.518160 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d033b981-89c7-485a-a29a-4dc76ee69f43-combined-ca-bundle\") pod \"d033b981-89c7-485a-a29a-4dc76ee69f43\" (UID: \"d033b981-89c7-485a-a29a-4dc76ee69f43\") " Feb 26 16:07:22 crc kubenswrapper[4907]: I0226 16:07:22.518531 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/d033b981-89c7-485a-a29a-4dc76ee69f43-logs" (OuterVolumeSpecName: "logs") pod "d033b981-89c7-485a-a29a-4dc76ee69f43" (UID: "d033b981-89c7-485a-a29a-4dc76ee69f43"). InnerVolumeSpecName "logs". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 26 16:07:22 crc kubenswrapper[4907]: I0226 16:07:22.518628 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/d033b981-89c7-485a-a29a-4dc76ee69f43-nova-metadata-tls-certs\") pod \"d033b981-89c7-485a-a29a-4dc76ee69f43\" (UID: \"d033b981-89c7-485a-a29a-4dc76ee69f43\") " Feb 26 16:07:22 crc kubenswrapper[4907]: I0226 16:07:22.518855 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d033b981-89c7-485a-a29a-4dc76ee69f43-config-data\") pod \"d033b981-89c7-485a-a29a-4dc76ee69f43\" (UID: \"d033b981-89c7-485a-a29a-4dc76ee69f43\") " Feb 26 16:07:22 crc kubenswrapper[4907]: I0226 16:07:22.519401 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qqzgm\" (UniqueName: \"kubernetes.io/projected/d033b981-89c7-485a-a29a-4dc76ee69f43-kube-api-access-qqzgm\") pod \"d033b981-89c7-485a-a29a-4dc76ee69f43\" (UID: \"d033b981-89c7-485a-a29a-4dc76ee69f43\") " Feb 26 16:07:22 crc kubenswrapper[4907]: I0226 16:07:22.521109 4907 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/d033b981-89c7-485a-a29a-4dc76ee69f43-logs\") on node \"crc\" DevicePath \"\"" Feb 26 16:07:22 crc kubenswrapper[4907]: I0226 16:07:22.541889 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d033b981-89c7-485a-a29a-4dc76ee69f43-kube-api-access-qqzgm" (OuterVolumeSpecName: "kube-api-access-qqzgm") pod "d033b981-89c7-485a-a29a-4dc76ee69f43" (UID: "d033b981-89c7-485a-a29a-4dc76ee69f43"). InnerVolumeSpecName "kube-api-access-qqzgm". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 26 16:07:22 crc kubenswrapper[4907]: I0226 16:07:22.556030 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d033b981-89c7-485a-a29a-4dc76ee69f43-config-data" (OuterVolumeSpecName: "config-data") pod "d033b981-89c7-485a-a29a-4dc76ee69f43" (UID: "d033b981-89c7-485a-a29a-4dc76ee69f43"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 26 16:07:22 crc kubenswrapper[4907]: I0226 16:07:22.580766 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d033b981-89c7-485a-a29a-4dc76ee69f43-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "d033b981-89c7-485a-a29a-4dc76ee69f43" (UID: "d033b981-89c7-485a-a29a-4dc76ee69f43"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 26 16:07:22 crc kubenswrapper[4907]: I0226 16:07:22.595862 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d033b981-89c7-485a-a29a-4dc76ee69f43-nova-metadata-tls-certs" (OuterVolumeSpecName: "nova-metadata-tls-certs") pod "d033b981-89c7-485a-a29a-4dc76ee69f43" (UID: "d033b981-89c7-485a-a29a-4dc76ee69f43"). InnerVolumeSpecName "nova-metadata-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 26 16:07:22 crc kubenswrapper[4907]: I0226 16:07:22.622848 4907 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d033b981-89c7-485a-a29a-4dc76ee69f43-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 26 16:07:22 crc kubenswrapper[4907]: I0226 16:07:22.622879 4907 reconciler_common.go:293] "Volume detached for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/d033b981-89c7-485a-a29a-4dc76ee69f43-nova-metadata-tls-certs\") on node \"crc\" DevicePath \"\"" Feb 26 16:07:22 crc kubenswrapper[4907]: I0226 16:07:22.622889 4907 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d033b981-89c7-485a-a29a-4dc76ee69f43-config-data\") on node \"crc\" DevicePath \"\"" Feb 26 16:07:22 crc kubenswrapper[4907]: I0226 16:07:22.622897 4907 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qqzgm\" (UniqueName: \"kubernetes.io/projected/d033b981-89c7-485a-a29a-4dc76ee69f43-kube-api-access-qqzgm\") on node \"crc\" DevicePath \"\"" Feb 26 16:07:22 crc kubenswrapper[4907]: I0226 16:07:22.639922 4907 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-metadata-0" Feb 26 16:07:22 crc kubenswrapper[4907]: I0226 16:07:22.639886 4907 generic.go:334] "Generic (PLEG): container finished" podID="d033b981-89c7-485a-a29a-4dc76ee69f43" containerID="f52ceeba1e8cb335e0b60ab307d3eff9143e5d1b0572dfb26d405d1b0dc1edb1" exitCode=0 Feb 26 16:07:22 crc kubenswrapper[4907]: I0226 16:07:22.639958 4907 generic.go:334] "Generic (PLEG): container finished" podID="d033b981-89c7-485a-a29a-4dc76ee69f43" containerID="9ac98c8302673bb5d2e10a6e79f982a217a74c7a50a94ff32bbc497d26ea6bf7" exitCode=143 Feb 26 16:07:22 crc kubenswrapper[4907]: I0226 16:07:22.639922 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"d033b981-89c7-485a-a29a-4dc76ee69f43","Type":"ContainerDied","Data":"f52ceeba1e8cb335e0b60ab307d3eff9143e5d1b0572dfb26d405d1b0dc1edb1"} Feb 26 16:07:22 crc kubenswrapper[4907]: I0226 16:07:22.640008 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"d033b981-89c7-485a-a29a-4dc76ee69f43","Type":"ContainerDied","Data":"9ac98c8302673bb5d2e10a6e79f982a217a74c7a50a94ff32bbc497d26ea6bf7"} Feb 26 16:07:22 crc kubenswrapper[4907]: I0226 16:07:22.640020 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"d033b981-89c7-485a-a29a-4dc76ee69f43","Type":"ContainerDied","Data":"90ab7b4ab9cefc46f4bcae74e99db069e70fca6a7d33ec7e14193f1b3945ec96"} Feb 26 16:07:22 crc kubenswrapper[4907]: I0226 16:07:22.640035 4907 scope.go:117] "RemoveContainer" containerID="f52ceeba1e8cb335e0b60ab307d3eff9143e5d1b0572dfb26d405d1b0dc1edb1" Feb 26 16:07:22 crc kubenswrapper[4907]: I0226 16:07:22.690668 4907 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-metadata-0"] Feb 26 16:07:22 crc kubenswrapper[4907]: I0226 16:07:22.708445 4907 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-metadata-0"] Feb 26 16:07:22 crc kubenswrapper[4907]: I0226 
16:07:22.709543 4907 scope.go:117] "RemoveContainer" containerID="9ac98c8302673bb5d2e10a6e79f982a217a74c7a50a94ff32bbc497d26ea6bf7" Feb 26 16:07:22 crc kubenswrapper[4907]: I0226 16:07:22.729149 4907 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-metadata-0"] Feb 26 16:07:22 crc kubenswrapper[4907]: E0226 16:07:22.729538 4907 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d033b981-89c7-485a-a29a-4dc76ee69f43" containerName="nova-metadata-metadata" Feb 26 16:07:22 crc kubenswrapper[4907]: I0226 16:07:22.729555 4907 state_mem.go:107] "Deleted CPUSet assignment" podUID="d033b981-89c7-485a-a29a-4dc76ee69f43" containerName="nova-metadata-metadata" Feb 26 16:07:22 crc kubenswrapper[4907]: E0226 16:07:22.729573 4907 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="aa93ec18-05b5-4814-989a-ec50a85bba83" containerName="init" Feb 26 16:07:22 crc kubenswrapper[4907]: I0226 16:07:22.729579 4907 state_mem.go:107] "Deleted CPUSet assignment" podUID="aa93ec18-05b5-4814-989a-ec50a85bba83" containerName="init" Feb 26 16:07:22 crc kubenswrapper[4907]: E0226 16:07:22.729611 4907 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d033b981-89c7-485a-a29a-4dc76ee69f43" containerName="nova-metadata-log" Feb 26 16:07:22 crc kubenswrapper[4907]: I0226 16:07:22.729618 4907 state_mem.go:107] "Deleted CPUSet assignment" podUID="d033b981-89c7-485a-a29a-4dc76ee69f43" containerName="nova-metadata-log" Feb 26 16:07:22 crc kubenswrapper[4907]: E0226 16:07:22.729631 4907 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="495e976f-e5c3-4fe4-9a08-12e01970b48d" containerName="nova-manage" Feb 26 16:07:22 crc kubenswrapper[4907]: I0226 16:07:22.729636 4907 state_mem.go:107] "Deleted CPUSet assignment" podUID="495e976f-e5c3-4fe4-9a08-12e01970b48d" containerName="nova-manage" Feb 26 16:07:22 crc kubenswrapper[4907]: E0226 16:07:22.729646 4907 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="aa93ec18-05b5-4814-989a-ec50a85bba83" containerName="dnsmasq-dns" Feb 26 16:07:22 crc kubenswrapper[4907]: I0226 16:07:22.729651 4907 state_mem.go:107] "Deleted CPUSet assignment" podUID="aa93ec18-05b5-4814-989a-ec50a85bba83" containerName="dnsmasq-dns" Feb 26 16:07:22 crc kubenswrapper[4907]: I0226 16:07:22.729812 4907 memory_manager.go:354] "RemoveStaleState removing state" podUID="d033b981-89c7-485a-a29a-4dc76ee69f43" containerName="nova-metadata-log" Feb 26 16:07:22 crc kubenswrapper[4907]: I0226 16:07:22.729825 4907 memory_manager.go:354] "RemoveStaleState removing state" podUID="d033b981-89c7-485a-a29a-4dc76ee69f43" containerName="nova-metadata-metadata" Feb 26 16:07:22 crc kubenswrapper[4907]: I0226 16:07:22.729836 4907 memory_manager.go:354] "RemoveStaleState removing state" podUID="aa93ec18-05b5-4814-989a-ec50a85bba83" containerName="dnsmasq-dns" Feb 26 16:07:22 crc kubenswrapper[4907]: I0226 16:07:22.729845 4907 memory_manager.go:354] "RemoveStaleState removing state" podUID="495e976f-e5c3-4fe4-9a08-12e01970b48d" containerName="nova-manage" Feb 26 16:07:22 crc kubenswrapper[4907]: I0226 16:07:22.730760 4907 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-metadata-0" Feb 26 16:07:22 crc kubenswrapper[4907]: I0226 16:07:22.732796 4907 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-metadata-config-data" Feb 26 16:07:22 crc kubenswrapper[4907]: I0226 16:07:22.734583 4907 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-metadata-internal-svc" Feb 26 16:07:22 crc kubenswrapper[4907]: I0226 16:07:22.752133 4907 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Feb 26 16:07:22 crc kubenswrapper[4907]: I0226 16:07:22.797775 4907 scope.go:117] "RemoveContainer" containerID="f52ceeba1e8cb335e0b60ab307d3eff9143e5d1b0572dfb26d405d1b0dc1edb1" Feb 26 16:07:22 crc kubenswrapper[4907]: E0226 16:07:22.798377 4907 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f52ceeba1e8cb335e0b60ab307d3eff9143e5d1b0572dfb26d405d1b0dc1edb1\": container with ID starting with f52ceeba1e8cb335e0b60ab307d3eff9143e5d1b0572dfb26d405d1b0dc1edb1 not found: ID does not exist" containerID="f52ceeba1e8cb335e0b60ab307d3eff9143e5d1b0572dfb26d405d1b0dc1edb1" Feb 26 16:07:22 crc kubenswrapper[4907]: I0226 16:07:22.798464 4907 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f52ceeba1e8cb335e0b60ab307d3eff9143e5d1b0572dfb26d405d1b0dc1edb1"} err="failed to get container status \"f52ceeba1e8cb335e0b60ab307d3eff9143e5d1b0572dfb26d405d1b0dc1edb1\": rpc error: code = NotFound desc = could not find container \"f52ceeba1e8cb335e0b60ab307d3eff9143e5d1b0572dfb26d405d1b0dc1edb1\": container with ID starting with f52ceeba1e8cb335e0b60ab307d3eff9143e5d1b0572dfb26d405d1b0dc1edb1 not found: ID does not exist" Feb 26 16:07:22 crc kubenswrapper[4907]: I0226 16:07:22.798532 4907 scope.go:117] "RemoveContainer" containerID="9ac98c8302673bb5d2e10a6e79f982a217a74c7a50a94ff32bbc497d26ea6bf7" Feb 26 16:07:22 crc 
kubenswrapper[4907]: E0226 16:07:22.799074 4907 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"9ac98c8302673bb5d2e10a6e79f982a217a74c7a50a94ff32bbc497d26ea6bf7\": container with ID starting with 9ac98c8302673bb5d2e10a6e79f982a217a74c7a50a94ff32bbc497d26ea6bf7 not found: ID does not exist" containerID="9ac98c8302673bb5d2e10a6e79f982a217a74c7a50a94ff32bbc497d26ea6bf7" Feb 26 16:07:22 crc kubenswrapper[4907]: I0226 16:07:22.799121 4907 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9ac98c8302673bb5d2e10a6e79f982a217a74c7a50a94ff32bbc497d26ea6bf7"} err="failed to get container status \"9ac98c8302673bb5d2e10a6e79f982a217a74c7a50a94ff32bbc497d26ea6bf7\": rpc error: code = NotFound desc = could not find container \"9ac98c8302673bb5d2e10a6e79f982a217a74c7a50a94ff32bbc497d26ea6bf7\": container with ID starting with 9ac98c8302673bb5d2e10a6e79f982a217a74c7a50a94ff32bbc497d26ea6bf7 not found: ID does not exist" Feb 26 16:07:22 crc kubenswrapper[4907]: I0226 16:07:22.799141 4907 scope.go:117] "RemoveContainer" containerID="f52ceeba1e8cb335e0b60ab307d3eff9143e5d1b0572dfb26d405d1b0dc1edb1" Feb 26 16:07:22 crc kubenswrapper[4907]: I0226 16:07:22.799953 4907 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f52ceeba1e8cb335e0b60ab307d3eff9143e5d1b0572dfb26d405d1b0dc1edb1"} err="failed to get container status \"f52ceeba1e8cb335e0b60ab307d3eff9143e5d1b0572dfb26d405d1b0dc1edb1\": rpc error: code = NotFound desc = could not find container \"f52ceeba1e8cb335e0b60ab307d3eff9143e5d1b0572dfb26d405d1b0dc1edb1\": container with ID starting with f52ceeba1e8cb335e0b60ab307d3eff9143e5d1b0572dfb26d405d1b0dc1edb1 not found: ID does not exist" Feb 26 16:07:22 crc kubenswrapper[4907]: I0226 16:07:22.799973 4907 scope.go:117] "RemoveContainer" containerID="9ac98c8302673bb5d2e10a6e79f982a217a74c7a50a94ff32bbc497d26ea6bf7" Feb 26 
16:07:22 crc kubenswrapper[4907]: I0226 16:07:22.800219 4907 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9ac98c8302673bb5d2e10a6e79f982a217a74c7a50a94ff32bbc497d26ea6bf7"} err="failed to get container status \"9ac98c8302673bb5d2e10a6e79f982a217a74c7a50a94ff32bbc497d26ea6bf7\": rpc error: code = NotFound desc = could not find container \"9ac98c8302673bb5d2e10a6e79f982a217a74c7a50a94ff32bbc497d26ea6bf7\": container with ID starting with 9ac98c8302673bb5d2e10a6e79f982a217a74c7a50a94ff32bbc497d26ea6bf7 not found: ID does not exist" Feb 26 16:07:22 crc kubenswrapper[4907]: I0226 16:07:22.826506 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/3bfed0ca-af76-4ba2-8be4-84716902175b-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"3bfed0ca-af76-4ba2-8be4-84716902175b\") " pod="openstack/nova-metadata-0" Feb 26 16:07:22 crc kubenswrapper[4907]: I0226 16:07:22.827063 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/3bfed0ca-af76-4ba2-8be4-84716902175b-logs\") pod \"nova-metadata-0\" (UID: \"3bfed0ca-af76-4ba2-8be4-84716902175b\") " pod="openstack/nova-metadata-0" Feb 26 16:07:22 crc kubenswrapper[4907]: I0226 16:07:22.827105 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gwvf9\" (UniqueName: \"kubernetes.io/projected/3bfed0ca-af76-4ba2-8be4-84716902175b-kube-api-access-gwvf9\") pod \"nova-metadata-0\" (UID: \"3bfed0ca-af76-4ba2-8be4-84716902175b\") " pod="openstack/nova-metadata-0" Feb 26 16:07:22 crc kubenswrapper[4907]: I0226 16:07:22.827179 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/3bfed0ca-af76-4ba2-8be4-84716902175b-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"3bfed0ca-af76-4ba2-8be4-84716902175b\") " pod="openstack/nova-metadata-0" Feb 26 16:07:22 crc kubenswrapper[4907]: I0226 16:07:22.827916 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3bfed0ca-af76-4ba2-8be4-84716902175b-config-data\") pod \"nova-metadata-0\" (UID: \"3bfed0ca-af76-4ba2-8be4-84716902175b\") " pod="openstack/nova-metadata-0" Feb 26 16:07:22 crc kubenswrapper[4907]: I0226 16:07:22.930055 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/3bfed0ca-af76-4ba2-8be4-84716902175b-logs\") pod \"nova-metadata-0\" (UID: \"3bfed0ca-af76-4ba2-8be4-84716902175b\") " pod="openstack/nova-metadata-0" Feb 26 16:07:22 crc kubenswrapper[4907]: I0226 16:07:22.930266 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gwvf9\" (UniqueName: \"kubernetes.io/projected/3bfed0ca-af76-4ba2-8be4-84716902175b-kube-api-access-gwvf9\") pod \"nova-metadata-0\" (UID: \"3bfed0ca-af76-4ba2-8be4-84716902175b\") " pod="openstack/nova-metadata-0" Feb 26 16:07:22 crc kubenswrapper[4907]: I0226 16:07:22.930394 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3bfed0ca-af76-4ba2-8be4-84716902175b-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"3bfed0ca-af76-4ba2-8be4-84716902175b\") " pod="openstack/nova-metadata-0" Feb 26 16:07:22 crc kubenswrapper[4907]: I0226 16:07:22.930495 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3bfed0ca-af76-4ba2-8be4-84716902175b-config-data\") pod \"nova-metadata-0\" (UID: \"3bfed0ca-af76-4ba2-8be4-84716902175b\") " 
pod="openstack/nova-metadata-0" Feb 26 16:07:22 crc kubenswrapper[4907]: I0226 16:07:22.930558 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/3bfed0ca-af76-4ba2-8be4-84716902175b-logs\") pod \"nova-metadata-0\" (UID: \"3bfed0ca-af76-4ba2-8be4-84716902175b\") " pod="openstack/nova-metadata-0" Feb 26 16:07:22 crc kubenswrapper[4907]: I0226 16:07:22.930574 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/3bfed0ca-af76-4ba2-8be4-84716902175b-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"3bfed0ca-af76-4ba2-8be4-84716902175b\") " pod="openstack/nova-metadata-0" Feb 26 16:07:22 crc kubenswrapper[4907]: I0226 16:07:22.935778 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3bfed0ca-af76-4ba2-8be4-84716902175b-config-data\") pod \"nova-metadata-0\" (UID: \"3bfed0ca-af76-4ba2-8be4-84716902175b\") " pod="openstack/nova-metadata-0" Feb 26 16:07:22 crc kubenswrapper[4907]: I0226 16:07:22.936065 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/3bfed0ca-af76-4ba2-8be4-84716902175b-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"3bfed0ca-af76-4ba2-8be4-84716902175b\") " pod="openstack/nova-metadata-0" Feb 26 16:07:22 crc kubenswrapper[4907]: I0226 16:07:22.936452 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3bfed0ca-af76-4ba2-8be4-84716902175b-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"3bfed0ca-af76-4ba2-8be4-84716902175b\") " pod="openstack/nova-metadata-0" Feb 26 16:07:22 crc kubenswrapper[4907]: I0226 16:07:22.948968 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gwvf9\" (UniqueName: 
\"kubernetes.io/projected/3bfed0ca-af76-4ba2-8be4-84716902175b-kube-api-access-gwvf9\") pod \"nova-metadata-0\" (UID: \"3bfed0ca-af76-4ba2-8be4-84716902175b\") " pod="openstack/nova-metadata-0" Feb 26 16:07:23 crc kubenswrapper[4907]: I0226 16:07:23.119988 4907 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Feb 26 16:07:23 crc kubenswrapper[4907]: I0226 16:07:23.373953 4907 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ceilometer-0" Feb 26 16:07:23 crc kubenswrapper[4907]: I0226 16:07:23.574520 4907 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Feb 26 16:07:23 crc kubenswrapper[4907]: I0226 16:07:23.657492 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"3bfed0ca-af76-4ba2-8be4-84716902175b","Type":"ContainerStarted","Data":"81f061cda32d1af272950b98a178ad698e358a6c9bffed3c34c646d2cfbbbe68"} Feb 26 16:07:23 crc kubenswrapper[4907]: I0226 16:07:23.661024 4907 generic.go:334] "Generic (PLEG): container finished" podID="e85bd7a8-59b8-4eca-a1d6-2824c3e44dd2" containerID="c3b1f7c4476fdadc4251d9974969d30ed70f9f124e466d21df11ac454b81ccc0" exitCode=0 Feb 26 16:07:23 crc kubenswrapper[4907]: I0226 16:07:23.661073 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-db-sync-l2rsl" event={"ID":"e85bd7a8-59b8-4eca-a1d6-2824c3e44dd2","Type":"ContainerDied","Data":"c3b1f7c4476fdadc4251d9974969d30ed70f9f124e466d21df11ac454b81ccc0"} Feb 26 16:07:24 crc kubenswrapper[4907]: I0226 16:07:24.138040 4907 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d033b981-89c7-485a-a29a-4dc76ee69f43" path="/var/lib/kubelet/pods/d033b981-89c7-485a-a29a-4dc76ee69f43/volumes" Feb 26 16:07:24 crc kubenswrapper[4907]: I0226 16:07:24.684910 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" 
event={"ID":"3bfed0ca-af76-4ba2-8be4-84716902175b","Type":"ContainerStarted","Data":"d2efd4ff7763d97659db228608c15cc4b06a951aed230441f4daf616e6caf66a"} Feb 26 16:07:24 crc kubenswrapper[4907]: I0226 16:07:24.685274 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"3bfed0ca-af76-4ba2-8be4-84716902175b","Type":"ContainerStarted","Data":"230493b78013cd174cf17a272666a1c37a2d864e3a716f3e4fa3c2dc70f27245"} Feb 26 16:07:24 crc kubenswrapper[4907]: I0226 16:07:24.709205 4907 generic.go:334] "Generic (PLEG): container finished" podID="dc33a845-b7a2-4d10-93a7-23f788917d59" containerID="8ce86f98a1d69388fb62a3c379e1b52de4a2e2c49b6a2ea15088086fb1fa04ae" exitCode=0 Feb 26 16:07:24 crc kubenswrapper[4907]: I0226 16:07:24.709418 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"dc33a845-b7a2-4d10-93a7-23f788917d59","Type":"ContainerDied","Data":"8ce86f98a1d69388fb62a3c379e1b52de4a2e2c49b6a2ea15088086fb1fa04ae"} Feb 26 16:07:24 crc kubenswrapper[4907]: I0226 16:07:24.715010 4907 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-metadata-0" podStartSLOduration=2.714990527 podStartE2EDuration="2.714990527s" podCreationTimestamp="2026-02-26 16:07:22 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-26 16:07:24.706001108 +0000 UTC m=+1507.224562967" watchObservedRunningTime="2026-02-26 16:07:24.714990527 +0000 UTC m=+1507.233552376" Feb 26 16:07:24 crc kubenswrapper[4907]: I0226 16:07:24.905193 4907 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-0" Feb 26 16:07:25 crc kubenswrapper[4907]: I0226 16:07:25.074803 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/dc33a845-b7a2-4d10-93a7-23f788917d59-logs\") pod \"dc33a845-b7a2-4d10-93a7-23f788917d59\" (UID: \"dc33a845-b7a2-4d10-93a7-23f788917d59\") " Feb 26 16:07:25 crc kubenswrapper[4907]: I0226 16:07:25.075019 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2f2sm\" (UniqueName: \"kubernetes.io/projected/dc33a845-b7a2-4d10-93a7-23f788917d59-kube-api-access-2f2sm\") pod \"dc33a845-b7a2-4d10-93a7-23f788917d59\" (UID: \"dc33a845-b7a2-4d10-93a7-23f788917d59\") " Feb 26 16:07:25 crc kubenswrapper[4907]: I0226 16:07:25.075136 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/dc33a845-b7a2-4d10-93a7-23f788917d59-combined-ca-bundle\") pod \"dc33a845-b7a2-4d10-93a7-23f788917d59\" (UID: \"dc33a845-b7a2-4d10-93a7-23f788917d59\") " Feb 26 16:07:25 crc kubenswrapper[4907]: I0226 16:07:25.075203 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/dc33a845-b7a2-4d10-93a7-23f788917d59-config-data\") pod \"dc33a845-b7a2-4d10-93a7-23f788917d59\" (UID: \"dc33a845-b7a2-4d10-93a7-23f788917d59\") " Feb 26 16:07:25 crc kubenswrapper[4907]: I0226 16:07:25.076115 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/dc33a845-b7a2-4d10-93a7-23f788917d59-logs" (OuterVolumeSpecName: "logs") pod "dc33a845-b7a2-4d10-93a7-23f788917d59" (UID: "dc33a845-b7a2-4d10-93a7-23f788917d59"). InnerVolumeSpecName "logs". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 26 16:07:25 crc kubenswrapper[4907]: I0226 16:07:25.077254 4907 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/dc33a845-b7a2-4d10-93a7-23f788917d59-logs\") on node \"crc\" DevicePath \"\"" Feb 26 16:07:25 crc kubenswrapper[4907]: I0226 16:07:25.089838 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/dc33a845-b7a2-4d10-93a7-23f788917d59-kube-api-access-2f2sm" (OuterVolumeSpecName: "kube-api-access-2f2sm") pod "dc33a845-b7a2-4d10-93a7-23f788917d59" (UID: "dc33a845-b7a2-4d10-93a7-23f788917d59"). InnerVolumeSpecName "kube-api-access-2f2sm". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 26 16:07:25 crc kubenswrapper[4907]: I0226 16:07:25.105824 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/dc33a845-b7a2-4d10-93a7-23f788917d59-config-data" (OuterVolumeSpecName: "config-data") pod "dc33a845-b7a2-4d10-93a7-23f788917d59" (UID: "dc33a845-b7a2-4d10-93a7-23f788917d59"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 26 16:07:25 crc kubenswrapper[4907]: I0226 16:07:25.106004 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/dc33a845-b7a2-4d10-93a7-23f788917d59-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "dc33a845-b7a2-4d10-93a7-23f788917d59" (UID: "dc33a845-b7a2-4d10-93a7-23f788917d59"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 26 16:07:25 crc kubenswrapper[4907]: I0226 16:07:25.178750 4907 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/dc33a845-b7a2-4d10-93a7-23f788917d59-config-data\") on node \"crc\" DevicePath \"\"" Feb 26 16:07:25 crc kubenswrapper[4907]: I0226 16:07:25.179000 4907 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2f2sm\" (UniqueName: \"kubernetes.io/projected/dc33a845-b7a2-4d10-93a7-23f788917d59-kube-api-access-2f2sm\") on node \"crc\" DevicePath \"\"" Feb 26 16:07:25 crc kubenswrapper[4907]: I0226 16:07:25.179011 4907 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/dc33a845-b7a2-4d10-93a7-23f788917d59-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 26 16:07:25 crc kubenswrapper[4907]: I0226 16:07:25.180335 4907 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-conductor-db-sync-l2rsl" Feb 26 16:07:25 crc kubenswrapper[4907]: I0226 16:07:25.280185 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e85bd7a8-59b8-4eca-a1d6-2824c3e44dd2-combined-ca-bundle\") pod \"e85bd7a8-59b8-4eca-a1d6-2824c3e44dd2\" (UID: \"e85bd7a8-59b8-4eca-a1d6-2824c3e44dd2\") " Feb 26 16:07:25 crc kubenswrapper[4907]: I0226 16:07:25.280446 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e85bd7a8-59b8-4eca-a1d6-2824c3e44dd2-config-data\") pod \"e85bd7a8-59b8-4eca-a1d6-2824c3e44dd2\" (UID: \"e85bd7a8-59b8-4eca-a1d6-2824c3e44dd2\") " Feb 26 16:07:25 crc kubenswrapper[4907]: I0226 16:07:25.280485 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: 
\"kubernetes.io/secret/e85bd7a8-59b8-4eca-a1d6-2824c3e44dd2-scripts\") pod \"e85bd7a8-59b8-4eca-a1d6-2824c3e44dd2\" (UID: \"e85bd7a8-59b8-4eca-a1d6-2824c3e44dd2\") " Feb 26 16:07:25 crc kubenswrapper[4907]: I0226 16:07:25.280550 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mj2xx\" (UniqueName: \"kubernetes.io/projected/e85bd7a8-59b8-4eca-a1d6-2824c3e44dd2-kube-api-access-mj2xx\") pod \"e85bd7a8-59b8-4eca-a1d6-2824c3e44dd2\" (UID: \"e85bd7a8-59b8-4eca-a1d6-2824c3e44dd2\") " Feb 26 16:07:25 crc kubenswrapper[4907]: I0226 16:07:25.283887 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e85bd7a8-59b8-4eca-a1d6-2824c3e44dd2-scripts" (OuterVolumeSpecName: "scripts") pod "e85bd7a8-59b8-4eca-a1d6-2824c3e44dd2" (UID: "e85bd7a8-59b8-4eca-a1d6-2824c3e44dd2"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 26 16:07:25 crc kubenswrapper[4907]: I0226 16:07:25.293951 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e85bd7a8-59b8-4eca-a1d6-2824c3e44dd2-kube-api-access-mj2xx" (OuterVolumeSpecName: "kube-api-access-mj2xx") pod "e85bd7a8-59b8-4eca-a1d6-2824c3e44dd2" (UID: "e85bd7a8-59b8-4eca-a1d6-2824c3e44dd2"). InnerVolumeSpecName "kube-api-access-mj2xx". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 26 16:07:25 crc kubenswrapper[4907]: I0226 16:07:25.313301 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e85bd7a8-59b8-4eca-a1d6-2824c3e44dd2-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "e85bd7a8-59b8-4eca-a1d6-2824c3e44dd2" (UID: "e85bd7a8-59b8-4eca-a1d6-2824c3e44dd2"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 26 16:07:25 crc kubenswrapper[4907]: I0226 16:07:25.317362 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e85bd7a8-59b8-4eca-a1d6-2824c3e44dd2-config-data" (OuterVolumeSpecName: "config-data") pod "e85bd7a8-59b8-4eca-a1d6-2824c3e44dd2" (UID: "e85bd7a8-59b8-4eca-a1d6-2824c3e44dd2"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 26 16:07:25 crc kubenswrapper[4907]: I0226 16:07:25.382622 4907 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e85bd7a8-59b8-4eca-a1d6-2824c3e44dd2-config-data\") on node \"crc\" DevicePath \"\"" Feb 26 16:07:25 crc kubenswrapper[4907]: I0226 16:07:25.382660 4907 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e85bd7a8-59b8-4eca-a1d6-2824c3e44dd2-scripts\") on node \"crc\" DevicePath \"\"" Feb 26 16:07:25 crc kubenswrapper[4907]: I0226 16:07:25.382672 4907 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mj2xx\" (UniqueName: \"kubernetes.io/projected/e85bd7a8-59b8-4eca-a1d6-2824c3e44dd2-kube-api-access-mj2xx\") on node \"crc\" DevicePath \"\"" Feb 26 16:07:25 crc kubenswrapper[4907]: I0226 16:07:25.382684 4907 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e85bd7a8-59b8-4eca-a1d6-2824c3e44dd2-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 26 16:07:25 crc kubenswrapper[4907]: I0226 16:07:25.538748 4907 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-scheduler-0"
Feb 26 16:07:25 crc kubenswrapper[4907]: I0226 16:07:25.687770 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-bksrm\" (UniqueName: \"kubernetes.io/projected/640a222e-f92b-468e-bb3e-83de9bae97d4-kube-api-access-bksrm\") pod \"640a222e-f92b-468e-bb3e-83de9bae97d4\" (UID: \"640a222e-f92b-468e-bb3e-83de9bae97d4\") "
Feb 26 16:07:25 crc kubenswrapper[4907]: I0226 16:07:25.687832 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/640a222e-f92b-468e-bb3e-83de9bae97d4-config-data\") pod \"640a222e-f92b-468e-bb3e-83de9bae97d4\" (UID: \"640a222e-f92b-468e-bb3e-83de9bae97d4\") "
Feb 26 16:07:25 crc kubenswrapper[4907]: I0226 16:07:25.687908 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/640a222e-f92b-468e-bb3e-83de9bae97d4-combined-ca-bundle\") pod \"640a222e-f92b-468e-bb3e-83de9bae97d4\" (UID: \"640a222e-f92b-468e-bb3e-83de9bae97d4\") "
Feb 26 16:07:25 crc kubenswrapper[4907]: I0226 16:07:25.695177 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/640a222e-f92b-468e-bb3e-83de9bae97d4-kube-api-access-bksrm" (OuterVolumeSpecName: "kube-api-access-bksrm") pod "640a222e-f92b-468e-bb3e-83de9bae97d4" (UID: "640a222e-f92b-468e-bb3e-83de9bae97d4"). InnerVolumeSpecName "kube-api-access-bksrm". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 26 16:07:25 crc kubenswrapper[4907]: I0226 16:07:25.747183 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/640a222e-f92b-468e-bb3e-83de9bae97d4-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "640a222e-f92b-468e-bb3e-83de9bae97d4" (UID: "640a222e-f92b-468e-bb3e-83de9bae97d4"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 26 16:07:25 crc kubenswrapper[4907]: I0226 16:07:25.752147 4907 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-conductor-db-sync-l2rsl"
Feb 26 16:07:25 crc kubenswrapper[4907]: I0226 16:07:25.753397 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-db-sync-l2rsl" event={"ID":"e85bd7a8-59b8-4eca-a1d6-2824c3e44dd2","Type":"ContainerDied","Data":"92dd714eaeef7bb8bb57879c4401ab45f6c136d0363d7c1819ac5880ac8af2da"}
Feb 26 16:07:25 crc kubenswrapper[4907]: I0226 16:07:25.753440 4907 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="92dd714eaeef7bb8bb57879c4401ab45f6c136d0363d7c1819ac5880ac8af2da"
Feb 26 16:07:25 crc kubenswrapper[4907]: I0226 16:07:25.770363 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"dc33a845-b7a2-4d10-93a7-23f788917d59","Type":"ContainerDied","Data":"3252640550de7702d53f928157cf520bf9d1b8ae332a5e9486116155d80a7dc6"}
Feb 26 16:07:25 crc kubenswrapper[4907]: I0226 16:07:25.770422 4907 scope.go:117] "RemoveContainer" containerID="8ce86f98a1d69388fb62a3c379e1b52de4a2e2c49b6a2ea15088086fb1fa04ae"
Feb 26 16:07:25 crc kubenswrapper[4907]: I0226 16:07:25.770873 4907 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0"
Feb 26 16:07:25 crc kubenswrapper[4907]: I0226 16:07:25.771347 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/640a222e-f92b-468e-bb3e-83de9bae97d4-config-data" (OuterVolumeSpecName: "config-data") pod "640a222e-f92b-468e-bb3e-83de9bae97d4" (UID: "640a222e-f92b-468e-bb3e-83de9bae97d4"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 26 16:07:25 crc kubenswrapper[4907]: I0226 16:07:25.775746 4907 generic.go:334] "Generic (PLEG): container finished" podID="640a222e-f92b-468e-bb3e-83de9bae97d4" containerID="beb456441b61dff5a9777e674ba40f7ddd6b823185de13da3ea816a528968588" exitCode=0
Feb 26 16:07:25 crc kubenswrapper[4907]: I0226 16:07:25.776784 4907 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0"
Feb 26 16:07:25 crc kubenswrapper[4907]: I0226 16:07:25.778353 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"640a222e-f92b-468e-bb3e-83de9bae97d4","Type":"ContainerDied","Data":"beb456441b61dff5a9777e674ba40f7ddd6b823185de13da3ea816a528968588"}
Feb 26 16:07:25 crc kubenswrapper[4907]: I0226 16:07:25.778426 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"640a222e-f92b-468e-bb3e-83de9bae97d4","Type":"ContainerDied","Data":"2c8be294b62b65221d7f2aaa5f5bc97e9db87135b4c67fa553704b9337907d32"}
Feb 26 16:07:25 crc kubenswrapper[4907]: I0226 16:07:25.789761 4907 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-bksrm\" (UniqueName: \"kubernetes.io/projected/640a222e-f92b-468e-bb3e-83de9bae97d4-kube-api-access-bksrm\") on node \"crc\" DevicePath \"\""
Feb 26 16:07:25 crc kubenswrapper[4907]: I0226 16:07:25.789795 4907 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/640a222e-f92b-468e-bb3e-83de9bae97d4-config-data\") on node \"crc\" DevicePath \"\""
Feb 26 16:07:25 crc kubenswrapper[4907]: I0226 16:07:25.789806 4907 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/640a222e-f92b-468e-bb3e-83de9bae97d4-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Feb 26 16:07:25 crc kubenswrapper[4907]: I0226 16:07:25.798726 4907 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-conductor-0"]
Feb 26 16:07:25 crc kubenswrapper[4907]: E0226 16:07:25.799178 4907 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="dc33a845-b7a2-4d10-93a7-23f788917d59" containerName="nova-api-log"
Feb 26 16:07:25 crc kubenswrapper[4907]: I0226 16:07:25.799201 4907 state_mem.go:107] "Deleted CPUSet assignment" podUID="dc33a845-b7a2-4d10-93a7-23f788917d59" containerName="nova-api-log"
Feb 26 16:07:25 crc kubenswrapper[4907]: E0226 16:07:25.799230 4907 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e85bd7a8-59b8-4eca-a1d6-2824c3e44dd2" containerName="nova-cell1-conductor-db-sync"
Feb 26 16:07:25 crc kubenswrapper[4907]: I0226 16:07:25.799238 4907 state_mem.go:107] "Deleted CPUSet assignment" podUID="e85bd7a8-59b8-4eca-a1d6-2824c3e44dd2" containerName="nova-cell1-conductor-db-sync"
Feb 26 16:07:25 crc kubenswrapper[4907]: E0226 16:07:25.799253 4907 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="640a222e-f92b-468e-bb3e-83de9bae97d4" containerName="nova-scheduler-scheduler"
Feb 26 16:07:25 crc kubenswrapper[4907]: I0226 16:07:25.799260 4907 state_mem.go:107] "Deleted CPUSet assignment" podUID="640a222e-f92b-468e-bb3e-83de9bae97d4" containerName="nova-scheduler-scheduler"
Feb 26 16:07:25 crc kubenswrapper[4907]: E0226 16:07:25.799289 4907 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="dc33a845-b7a2-4d10-93a7-23f788917d59" containerName="nova-api-api"
Feb 26 16:07:25 crc kubenswrapper[4907]: I0226 16:07:25.799297 4907 state_mem.go:107] "Deleted CPUSet assignment" podUID="dc33a845-b7a2-4d10-93a7-23f788917d59" containerName="nova-api-api"
Feb 26 16:07:25 crc kubenswrapper[4907]: I0226 16:07:25.799567 4907 memory_manager.go:354] "RemoveStaleState removing state" podUID="dc33a845-b7a2-4d10-93a7-23f788917d59" containerName="nova-api-api"
Feb 26 16:07:25 crc kubenswrapper[4907]: I0226 16:07:25.799578 4907 memory_manager.go:354] "RemoveStaleState removing state" podUID="640a222e-f92b-468e-bb3e-83de9bae97d4" containerName="nova-scheduler-scheduler"
Feb 26 16:07:25 crc kubenswrapper[4907]: I0226 16:07:25.799612 4907 memory_manager.go:354] "RemoveStaleState removing state" podUID="e85bd7a8-59b8-4eca-a1d6-2824c3e44dd2" containerName="nova-cell1-conductor-db-sync"
Feb 26 16:07:25 crc kubenswrapper[4907]: I0226 16:07:25.799639 4907 memory_manager.go:354] "RemoveStaleState removing state" podUID="dc33a845-b7a2-4d10-93a7-23f788917d59" containerName="nova-api-log"
Feb 26 16:07:25 crc kubenswrapper[4907]: I0226 16:07:25.800458 4907 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-conductor-0"
Feb 26 16:07:25 crc kubenswrapper[4907]: I0226 16:07:25.806162 4907 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-conductor-config-data"
Feb 26 16:07:25 crc kubenswrapper[4907]: I0226 16:07:25.846098 4907 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-conductor-0"]
Feb 26 16:07:25 crc kubenswrapper[4907]: I0226 16:07:25.881670 4907 scope.go:117] "RemoveContainer" containerID="3410f43c13155ceb2981ac835807e838becfac6350df365251ec9a2e774891ee"
Feb 26 16:07:25 crc kubenswrapper[4907]: I0226 16:07:25.921320 4907 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-scheduler-0"]
Feb 26 16:07:25 crc kubenswrapper[4907]: I0226 16:07:25.934804 4907 scope.go:117] "RemoveContainer" containerID="beb456441b61dff5a9777e674ba40f7ddd6b823185de13da3ea816a528968588"
Feb 26 16:07:25 crc kubenswrapper[4907]: I0226 16:07:25.942071 4907 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-scheduler-0"]
Feb 26 16:07:25 crc kubenswrapper[4907]: I0226 16:07:25.957808 4907 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0"]
Feb 26 16:07:25 crc kubenswrapper[4907]: I0226 16:07:25.967701 4907 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-scheduler-0"]
Feb 26 16:07:25 crc kubenswrapper[4907]: I0226 16:07:25.969456 4907 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0"
Feb 26 16:07:25 crc kubenswrapper[4907]: I0226 16:07:25.986526 4907 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-scheduler-config-data"
Feb 26 16:07:25 crc kubenswrapper[4907]: I0226 16:07:25.994157 4907 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-api-0"]
Feb 26 16:07:25 crc kubenswrapper[4907]: I0226 16:07:25.995268 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a123e787-8e80-495d-86f2-717a9c43353c-config-data\") pod \"nova-cell1-conductor-0\" (UID: \"a123e787-8e80-495d-86f2-717a9c43353c\") " pod="openstack/nova-cell1-conductor-0"
Feb 26 16:07:25 crc kubenswrapper[4907]: I0226 16:07:25.995390 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-k8l6b\" (UniqueName: \"kubernetes.io/projected/a123e787-8e80-495d-86f2-717a9c43353c-kube-api-access-k8l6b\") pod \"nova-cell1-conductor-0\" (UID: \"a123e787-8e80-495d-86f2-717a9c43353c\") " pod="openstack/nova-cell1-conductor-0"
Feb 26 16:07:25 crc kubenswrapper[4907]: I0226 16:07:25.995499 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a123e787-8e80-495d-86f2-717a9c43353c-combined-ca-bundle\") pod \"nova-cell1-conductor-0\" (UID: \"a123e787-8e80-495d-86f2-717a9c43353c\") " pod="openstack/nova-cell1-conductor-0"
Feb 26 16:07:26 crc kubenswrapper[4907]: I0226 16:07:26.014894 4907 scope.go:117] "RemoveContainer" containerID="beb456441b61dff5a9777e674ba40f7ddd6b823185de13da3ea816a528968588"
Feb 26 16:07:26 crc kubenswrapper[4907]: E0226 16:07:26.020221 4907 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"beb456441b61dff5a9777e674ba40f7ddd6b823185de13da3ea816a528968588\": container with ID starting with beb456441b61dff5a9777e674ba40f7ddd6b823185de13da3ea816a528968588 not found: ID does not exist" containerID="beb456441b61dff5a9777e674ba40f7ddd6b823185de13da3ea816a528968588"
Feb 26 16:07:26 crc kubenswrapper[4907]: I0226 16:07:26.020278 4907 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"beb456441b61dff5a9777e674ba40f7ddd6b823185de13da3ea816a528968588"} err="failed to get container status \"beb456441b61dff5a9777e674ba40f7ddd6b823185de13da3ea816a528968588\": rpc error: code = NotFound desc = could not find container \"beb456441b61dff5a9777e674ba40f7ddd6b823185de13da3ea816a528968588\": container with ID starting with beb456441b61dff5a9777e674ba40f7ddd6b823185de13da3ea816a528968588 not found: ID does not exist"
Feb 26 16:07:26 crc kubenswrapper[4907]: I0226 16:07:26.034918 4907 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-scheduler-0"]
Feb 26 16:07:26 crc kubenswrapper[4907]: I0226 16:07:26.051830 4907 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-api-0"]
Feb 26 16:07:26 crc kubenswrapper[4907]: I0226 16:07:26.058276 4907 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0"
Feb 26 16:07:26 crc kubenswrapper[4907]: I0226 16:07:26.062363 4907 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-api-config-data"
Feb 26 16:07:26 crc kubenswrapper[4907]: I0226 16:07:26.079852 4907 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"]
Feb 26 16:07:26 crc kubenswrapper[4907]: I0226 16:07:26.096861 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/162c5aed-9a98-49ed-a628-efc7c67b82a4-config-data\") pod \"nova-scheduler-0\" (UID: \"162c5aed-9a98-49ed-a628-efc7c67b82a4\") " pod="openstack/nova-scheduler-0"
Feb 26 16:07:26 crc kubenswrapper[4907]: I0226 16:07:26.096910 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/162c5aed-9a98-49ed-a628-efc7c67b82a4-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"162c5aed-9a98-49ed-a628-efc7c67b82a4\") " pod="openstack/nova-scheduler-0"
Feb 26 16:07:26 crc kubenswrapper[4907]: I0226 16:07:26.096966 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a123e787-8e80-495d-86f2-717a9c43353c-config-data\") pod \"nova-cell1-conductor-0\" (UID: \"a123e787-8e80-495d-86f2-717a9c43353c\") " pod="openstack/nova-cell1-conductor-0"
Feb 26 16:07:26 crc kubenswrapper[4907]: I0226 16:07:26.097086 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-k8l6b\" (UniqueName: \"kubernetes.io/projected/a123e787-8e80-495d-86f2-717a9c43353c-kube-api-access-k8l6b\") pod \"nova-cell1-conductor-0\" (UID: \"a123e787-8e80-495d-86f2-717a9c43353c\") " pod="openstack/nova-cell1-conductor-0"
Feb 26 16:07:26 crc kubenswrapper[4907]: I0226 16:07:26.097158 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a123e787-8e80-495d-86f2-717a9c43353c-combined-ca-bundle\") pod \"nova-cell1-conductor-0\" (UID: \"a123e787-8e80-495d-86f2-717a9c43353c\") " pod="openstack/nova-cell1-conductor-0"
Feb 26 16:07:26 crc kubenswrapper[4907]: I0226 16:07:26.097227 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bt5z5\" (UniqueName: \"kubernetes.io/projected/162c5aed-9a98-49ed-a628-efc7c67b82a4-kube-api-access-bt5z5\") pod \"nova-scheduler-0\" (UID: \"162c5aed-9a98-49ed-a628-efc7c67b82a4\") " pod="openstack/nova-scheduler-0"
Feb 26 16:07:26 crc kubenswrapper[4907]: I0226 16:07:26.117463 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a123e787-8e80-495d-86f2-717a9c43353c-config-data\") pod \"nova-cell1-conductor-0\" (UID: \"a123e787-8e80-495d-86f2-717a9c43353c\") " pod="openstack/nova-cell1-conductor-0"
Feb 26 16:07:26 crc kubenswrapper[4907]: I0226 16:07:26.128424 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a123e787-8e80-495d-86f2-717a9c43353c-combined-ca-bundle\") pod \"nova-cell1-conductor-0\" (UID: \"a123e787-8e80-495d-86f2-717a9c43353c\") " pod="openstack/nova-cell1-conductor-0"
Feb 26 16:07:26 crc kubenswrapper[4907]: I0226 16:07:26.136928 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-k8l6b\" (UniqueName: \"kubernetes.io/projected/a123e787-8e80-495d-86f2-717a9c43353c-kube-api-access-k8l6b\") pod \"nova-cell1-conductor-0\" (UID: \"a123e787-8e80-495d-86f2-717a9c43353c\") " pod="openstack/nova-cell1-conductor-0"
Feb 26 16:07:26 crc kubenswrapper[4907]: I0226 16:07:26.173831 4907 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="640a222e-f92b-468e-bb3e-83de9bae97d4" path="/var/lib/kubelet/pods/640a222e-f92b-468e-bb3e-83de9bae97d4/volumes"
Feb 26 16:07:26 crc kubenswrapper[4907]: I0226 16:07:26.174791 4907 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="dc33a845-b7a2-4d10-93a7-23f788917d59" path="/var/lib/kubelet/pods/dc33a845-b7a2-4d10-93a7-23f788917d59/volumes"
Feb 26 16:07:26 crc kubenswrapper[4907]: I0226 16:07:26.198833 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/97a6cbcf-5a97-4fe9-b47f-a738fb4dc887-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"97a6cbcf-5a97-4fe9-b47f-a738fb4dc887\") " pod="openstack/nova-api-0"
Feb 26 16:07:26 crc kubenswrapper[4907]: I0226 16:07:26.198898 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bt5z5\" (UniqueName: \"kubernetes.io/projected/162c5aed-9a98-49ed-a628-efc7c67b82a4-kube-api-access-bt5z5\") pod \"nova-scheduler-0\" (UID: \"162c5aed-9a98-49ed-a628-efc7c67b82a4\") " pod="openstack/nova-scheduler-0"
Feb 26 16:07:26 crc kubenswrapper[4907]: I0226 16:07:26.198918 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9kqv9\" (UniqueName: \"kubernetes.io/projected/97a6cbcf-5a97-4fe9-b47f-a738fb4dc887-kube-api-access-9kqv9\") pod \"nova-api-0\" (UID: \"97a6cbcf-5a97-4fe9-b47f-a738fb4dc887\") " pod="openstack/nova-api-0"
Feb 26 16:07:26 crc kubenswrapper[4907]: I0226 16:07:26.198964 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/162c5aed-9a98-49ed-a628-efc7c67b82a4-config-data\") pod \"nova-scheduler-0\" (UID: \"162c5aed-9a98-49ed-a628-efc7c67b82a4\") " pod="openstack/nova-scheduler-0"
Feb 26 16:07:26 crc kubenswrapper[4907]: I0226 16:07:26.198979 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/162c5aed-9a98-49ed-a628-efc7c67b82a4-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"162c5aed-9a98-49ed-a628-efc7c67b82a4\") " pod="openstack/nova-scheduler-0"
Feb 26 16:07:26 crc kubenswrapper[4907]: I0226 16:07:26.199016 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/97a6cbcf-5a97-4fe9-b47f-a738fb4dc887-config-data\") pod \"nova-api-0\" (UID: \"97a6cbcf-5a97-4fe9-b47f-a738fb4dc887\") " pod="openstack/nova-api-0"
Feb 26 16:07:26 crc kubenswrapper[4907]: I0226 16:07:26.199053 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/97a6cbcf-5a97-4fe9-b47f-a738fb4dc887-logs\") pod \"nova-api-0\" (UID: \"97a6cbcf-5a97-4fe9-b47f-a738fb4dc887\") " pod="openstack/nova-api-0"
Feb 26 16:07:26 crc kubenswrapper[4907]: I0226 16:07:26.205540 4907 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-conductor-0"
Feb 26 16:07:26 crc kubenswrapper[4907]: I0226 16:07:26.207202 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/162c5aed-9a98-49ed-a628-efc7c67b82a4-config-data\") pod \"nova-scheduler-0\" (UID: \"162c5aed-9a98-49ed-a628-efc7c67b82a4\") " pod="openstack/nova-scheduler-0"
Feb 26 16:07:26 crc kubenswrapper[4907]: I0226 16:07:26.214401 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/162c5aed-9a98-49ed-a628-efc7c67b82a4-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"162c5aed-9a98-49ed-a628-efc7c67b82a4\") " pod="openstack/nova-scheduler-0"
Feb 26 16:07:26 crc kubenswrapper[4907]: I0226 16:07:26.218293 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bt5z5\" (UniqueName: \"kubernetes.io/projected/162c5aed-9a98-49ed-a628-efc7c67b82a4-kube-api-access-bt5z5\") pod \"nova-scheduler-0\" (UID: \"162c5aed-9a98-49ed-a628-efc7c67b82a4\") " pod="openstack/nova-scheduler-0"
Feb 26 16:07:26 crc kubenswrapper[4907]: I0226 16:07:26.301212 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/97a6cbcf-5a97-4fe9-b47f-a738fb4dc887-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"97a6cbcf-5a97-4fe9-b47f-a738fb4dc887\") " pod="openstack/nova-api-0"
Feb 26 16:07:26 crc kubenswrapper[4907]: I0226 16:07:26.301636 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9kqv9\" (UniqueName: \"kubernetes.io/projected/97a6cbcf-5a97-4fe9-b47f-a738fb4dc887-kube-api-access-9kqv9\") pod \"nova-api-0\" (UID: \"97a6cbcf-5a97-4fe9-b47f-a738fb4dc887\") " pod="openstack/nova-api-0"
Feb 26 16:07:26 crc kubenswrapper[4907]: I0226 16:07:26.301738 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/97a6cbcf-5a97-4fe9-b47f-a738fb4dc887-config-data\") pod \"nova-api-0\" (UID: \"97a6cbcf-5a97-4fe9-b47f-a738fb4dc887\") " pod="openstack/nova-api-0"
Feb 26 16:07:26 crc kubenswrapper[4907]: I0226 16:07:26.301827 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/97a6cbcf-5a97-4fe9-b47f-a738fb4dc887-logs\") pod \"nova-api-0\" (UID: \"97a6cbcf-5a97-4fe9-b47f-a738fb4dc887\") " pod="openstack/nova-api-0"
Feb 26 16:07:26 crc kubenswrapper[4907]: I0226 16:07:26.302203 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/97a6cbcf-5a97-4fe9-b47f-a738fb4dc887-logs\") pod \"nova-api-0\" (UID: \"97a6cbcf-5a97-4fe9-b47f-a738fb4dc887\") " pod="openstack/nova-api-0"
Feb 26 16:07:26 crc kubenswrapper[4907]: I0226 16:07:26.305214 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/97a6cbcf-5a97-4fe9-b47f-a738fb4dc887-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"97a6cbcf-5a97-4fe9-b47f-a738fb4dc887\") " pod="openstack/nova-api-0"
Feb 26 16:07:26 crc kubenswrapper[4907]: I0226 16:07:26.306291 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/97a6cbcf-5a97-4fe9-b47f-a738fb4dc887-config-data\") pod \"nova-api-0\" (UID: \"97a6cbcf-5a97-4fe9-b47f-a738fb4dc887\") " pod="openstack/nova-api-0"
Feb 26 16:07:26 crc kubenswrapper[4907]: I0226 16:07:26.314083 4907 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0"
Feb 26 16:07:26 crc kubenswrapper[4907]: I0226 16:07:26.323406 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9kqv9\" (UniqueName: \"kubernetes.io/projected/97a6cbcf-5a97-4fe9-b47f-a738fb4dc887-kube-api-access-9kqv9\") pod \"nova-api-0\" (UID: \"97a6cbcf-5a97-4fe9-b47f-a738fb4dc887\") " pod="openstack/nova-api-0"
Feb 26 16:07:26 crc kubenswrapper[4907]: I0226 16:07:26.382307 4907 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0"
Feb 26 16:07:26 crc kubenswrapper[4907]: I0226 16:07:26.812332 4907 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-conductor-0"]
Feb 26 16:07:26 crc kubenswrapper[4907]: W0226 16:07:26.812442 4907 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-poda123e787_8e80_495d_86f2_717a9c43353c.slice/crio-6cdcc2554cce40a6ae5c5053cac74f40299822e101c06102651027cc0e5cdd4c WatchSource:0}: Error finding container 6cdcc2554cce40a6ae5c5053cac74f40299822e101c06102651027cc0e5cdd4c: Status 404 returned error can't find the container with id 6cdcc2554cce40a6ae5c5053cac74f40299822e101c06102651027cc0e5cdd4c
Feb 26 16:07:26 crc kubenswrapper[4907]: I0226 16:07:26.943037 4907 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"]
Feb 26 16:07:26 crc kubenswrapper[4907]: I0226 16:07:26.986501 4907 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-scheduler-0"]
Feb 26 16:07:27 crc kubenswrapper[4907]: I0226 16:07:27.750368 4907 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/horizon-6fccfb8496-4tqhr" podUID="911d5df8-d8e2-4552-9c75-33c5ab72646b" containerName="horizon" probeResult="failure" output="Get \"https://10.217.0.153:8443/dashboard/auth/login/?next=/dashboard/\": dial tcp 10.217.0.153:8443: connect: connection refused"
Feb 26 16:07:27 crc kubenswrapper[4907]: I0226 16:07:27.799372 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-0" event={"ID":"a123e787-8e80-495d-86f2-717a9c43353c","Type":"ContainerStarted","Data":"54b51a308cc4899f791c1fe33262be12f38e9d7d0a6a785470638298cde3e1c1"}
Feb 26 16:07:27 crc kubenswrapper[4907]: I0226 16:07:27.799420 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-0" event={"ID":"a123e787-8e80-495d-86f2-717a9c43353c","Type":"ContainerStarted","Data":"6cdcc2554cce40a6ae5c5053cac74f40299822e101c06102651027cc0e5cdd4c"}
Feb 26 16:07:27 crc kubenswrapper[4907]: I0226 16:07:27.799467 4907 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-cell1-conductor-0"
Feb 26 16:07:27 crc kubenswrapper[4907]: I0226 16:07:27.801701 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"162c5aed-9a98-49ed-a628-efc7c67b82a4","Type":"ContainerStarted","Data":"52389a7c75f19210a7ed4d721f7bfea7bf6605f9f9f2bece8685c11e13dc35a5"}
Feb 26 16:07:27 crc kubenswrapper[4907]: I0226 16:07:27.801734 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"162c5aed-9a98-49ed-a628-efc7c67b82a4","Type":"ContainerStarted","Data":"46e008a70d3de709d410ef6bc0acbcd17f73ac5fd21ce54e7dd6fc3d2b5088e8"}
Feb 26 16:07:27 crc kubenswrapper[4907]: I0226 16:07:27.804211 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"97a6cbcf-5a97-4fe9-b47f-a738fb4dc887","Type":"ContainerStarted","Data":"8cea64b2dc3a02bf0080e80c8901cc18401c4df5c498bdcd3603cb228113f61f"}
Feb 26 16:07:27 crc kubenswrapper[4907]: I0226 16:07:27.804245 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"97a6cbcf-5a97-4fe9-b47f-a738fb4dc887","Type":"ContainerStarted","Data":"6cafbc58224f1c877a57f32b79a66009154fdec3e1944c18c1590f05fed67834"}
Feb 26 16:07:27 crc kubenswrapper[4907]: I0226 16:07:27.804260 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"97a6cbcf-5a97-4fe9-b47f-a738fb4dc887","Type":"ContainerStarted","Data":"d0d59c7247ddc9cd23b73f76b6f17ceb39c24defd837b8fe2f10ba40ce05c641"}
Feb 26 16:07:27 crc kubenswrapper[4907]: I0226 16:07:27.815183 4907 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell1-conductor-0" podStartSLOduration=2.815166455 podStartE2EDuration="2.815166455s" podCreationTimestamp="2026-02-26 16:07:25 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-26 16:07:27.813973486 +0000 UTC m=+1510.332535335" watchObservedRunningTime="2026-02-26 16:07:27.815166455 +0000 UTC m=+1510.333728304"
Feb 26 16:07:27 crc kubenswrapper[4907]: I0226 16:07:27.835296 4907 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-api-0" podStartSLOduration=2.835279688 podStartE2EDuration="2.835279688s" podCreationTimestamp="2026-02-26 16:07:25 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-26 16:07:27.831402183 +0000 UTC m=+1510.349964032" watchObservedRunningTime="2026-02-26 16:07:27.835279688 +0000 UTC m=+1510.353841537"
Feb 26 16:07:27 crc kubenswrapper[4907]: I0226 16:07:27.855061 4907 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-scheduler-0" podStartSLOduration=2.855043222 podStartE2EDuration="2.855043222s" podCreationTimestamp="2026-02-26 16:07:25 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-26 16:07:27.851665599 +0000 UTC m=+1510.370227448" watchObservedRunningTime="2026-02-26 16:07:27.855043222 +0000 UTC m=+1510.373605071"
Feb 26 16:07:28 crc kubenswrapper[4907]: I0226 16:07:28.121063 4907 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-metadata-0"
Feb 26 16:07:28 crc kubenswrapper[4907]: I0226 16:07:28.121375 4907 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-metadata-0"
Feb 26 16:07:28 crc kubenswrapper[4907]: I0226 16:07:28.155046 4907 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/horizon-76d88967b8-wmzcw" podUID="b35f87c4-e535-4901-8814-0b321b201158" containerName="horizon" probeResult="failure" output="Get \"https://10.217.0.154:8443/dashboard/auth/login/?next=/dashboard/\": dial tcp 10.217.0.154:8443: connect: connection refused"
Feb 26 16:07:29 crc kubenswrapper[4907]: I0226 16:07:29.222705 4907 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/kube-state-metrics-0"]
Feb 26 16:07:29 crc kubenswrapper[4907]: I0226 16:07:29.223123 4907 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/kube-state-metrics-0" podUID="3623ea59-40fb-48f6-943d-1ea5fe3ad253" containerName="kube-state-metrics" containerID="cri-o://c2645f300acb4d621d5efc3ab5b644ad14a8657511b9aad9f6b2ce977f97baef" gracePeriod=30
Feb 26 16:07:29 crc kubenswrapper[4907]: I0226 16:07:29.787497 4907 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/kube-state-metrics-0"
Feb 26 16:07:29 crc kubenswrapper[4907]: I0226 16:07:29.789950 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zp248\" (UniqueName: \"kubernetes.io/projected/3623ea59-40fb-48f6-943d-1ea5fe3ad253-kube-api-access-zp248\") pod \"3623ea59-40fb-48f6-943d-1ea5fe3ad253\" (UID: \"3623ea59-40fb-48f6-943d-1ea5fe3ad253\") "
Feb 26 16:07:29 crc kubenswrapper[4907]: I0226 16:07:29.798414 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3623ea59-40fb-48f6-943d-1ea5fe3ad253-kube-api-access-zp248" (OuterVolumeSpecName: "kube-api-access-zp248") pod "3623ea59-40fb-48f6-943d-1ea5fe3ad253" (UID: "3623ea59-40fb-48f6-943d-1ea5fe3ad253"). InnerVolumeSpecName "kube-api-access-zp248". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 26 16:07:29 crc kubenswrapper[4907]: I0226 16:07:29.851002 4907 generic.go:334] "Generic (PLEG): container finished" podID="3623ea59-40fb-48f6-943d-1ea5fe3ad253" containerID="c2645f300acb4d621d5efc3ab5b644ad14a8657511b9aad9f6b2ce977f97baef" exitCode=2
Feb 26 16:07:29 crc kubenswrapper[4907]: I0226 16:07:29.851085 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/kube-state-metrics-0" event={"ID":"3623ea59-40fb-48f6-943d-1ea5fe3ad253","Type":"ContainerDied","Data":"c2645f300acb4d621d5efc3ab5b644ad14a8657511b9aad9f6b2ce977f97baef"}
Feb 26 16:07:29 crc kubenswrapper[4907]: I0226 16:07:29.851146 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/kube-state-metrics-0" event={"ID":"3623ea59-40fb-48f6-943d-1ea5fe3ad253","Type":"ContainerDied","Data":"6be5f9675dde560a3293f4798ed38ecd5c23502d491fb1ae059567b705653f3c"}
Feb 26 16:07:29 crc kubenswrapper[4907]: I0226 16:07:29.851217 4907 scope.go:117] "RemoveContainer" containerID="c2645f300acb4d621d5efc3ab5b644ad14a8657511b9aad9f6b2ce977f97baef"
Feb 26 16:07:29 crc kubenswrapper[4907]: I0226 16:07:29.851503 4907 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/kube-state-metrics-0"
Feb 26 16:07:29 crc kubenswrapper[4907]: I0226 16:07:29.913077 4907 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zp248\" (UniqueName: \"kubernetes.io/projected/3623ea59-40fb-48f6-943d-1ea5fe3ad253-kube-api-access-zp248\") on node \"crc\" DevicePath \"\""
Feb 26 16:07:29 crc kubenswrapper[4907]: I0226 16:07:29.920823 4907 scope.go:117] "RemoveContainer" containerID="c2645f300acb4d621d5efc3ab5b644ad14a8657511b9aad9f6b2ce977f97baef"
Feb 26 16:07:29 crc kubenswrapper[4907]: I0226 16:07:29.921872 4907 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/kube-state-metrics-0"]
Feb 26 16:07:29 crc kubenswrapper[4907]: E0226 16:07:29.921913 4907 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c2645f300acb4d621d5efc3ab5b644ad14a8657511b9aad9f6b2ce977f97baef\": container with ID starting with c2645f300acb4d621d5efc3ab5b644ad14a8657511b9aad9f6b2ce977f97baef not found: ID does not exist" containerID="c2645f300acb4d621d5efc3ab5b644ad14a8657511b9aad9f6b2ce977f97baef"
Feb 26 16:07:29 crc kubenswrapper[4907]: I0226 16:07:29.921955 4907 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c2645f300acb4d621d5efc3ab5b644ad14a8657511b9aad9f6b2ce977f97baef"} err="failed to get container status \"c2645f300acb4d621d5efc3ab5b644ad14a8657511b9aad9f6b2ce977f97baef\": rpc error: code = NotFound desc = could not find container \"c2645f300acb4d621d5efc3ab5b644ad14a8657511b9aad9f6b2ce977f97baef\": container with ID starting with c2645f300acb4d621d5efc3ab5b644ad14a8657511b9aad9f6b2ce977f97baef not found: ID does not exist"
Feb 26 16:07:29 crc kubenswrapper[4907]: I0226 16:07:29.934527 4907 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/kube-state-metrics-0"]
Feb 26 16:07:29 crc kubenswrapper[4907]: I0226 16:07:29.944315 4907 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/kube-state-metrics-0"]
Feb 26 16:07:29 crc kubenswrapper[4907]: E0226 16:07:29.945264 4907 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3623ea59-40fb-48f6-943d-1ea5fe3ad253" containerName="kube-state-metrics"
Feb 26 16:07:29 crc kubenswrapper[4907]: I0226 16:07:29.945284 4907 state_mem.go:107] "Deleted CPUSet assignment" podUID="3623ea59-40fb-48f6-943d-1ea5fe3ad253" containerName="kube-state-metrics"
Feb 26 16:07:29 crc kubenswrapper[4907]: I0226 16:07:29.945826 4907 memory_manager.go:354] "RemoveStaleState removing state" podUID="3623ea59-40fb-48f6-943d-1ea5fe3ad253" containerName="kube-state-metrics"
Feb 26 16:07:29 crc kubenswrapper[4907]: I0226 16:07:29.946429 4907 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/kube-state-metrics-0"
Feb 26 16:07:29 crc kubenswrapper[4907]: I0226 16:07:29.949978 4907 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-kube-state-metrics-svc"
Feb 26 16:07:29 crc kubenswrapper[4907]: I0226 16:07:29.951379 4907 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/kube-state-metrics-0"]
Feb 26 16:07:29 crc kubenswrapper[4907]: I0226 16:07:29.952021 4907 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"kube-state-metrics-tls-config"
Feb 26 16:07:30 crc kubenswrapper[4907]: I0226 16:07:30.014210 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-npdhg\" (UniqueName: \"kubernetes.io/projected/f7394cd4-d14c-450e-8865-7c7509c5021b-kube-api-access-npdhg\") pod \"kube-state-metrics-0\" (UID: \"f7394cd4-d14c-450e-8865-7c7509c5021b\") " pod="openstack/kube-state-metrics-0"
Feb 26 16:07:30 crc kubenswrapper[4907]: I0226 16:07:30.014278 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-state-metrics-tls-certs\" (UniqueName: \"kubernetes.io/secret/f7394cd4-d14c-450e-8865-7c7509c5021b-kube-state-metrics-tls-certs\") pod \"kube-state-metrics-0\" (UID: \"f7394cd4-d14c-450e-8865-7c7509c5021b\") " pod="openstack/kube-state-metrics-0"
Feb 26 16:07:30 crc kubenswrapper[4907]: I0226 16:07:30.014299 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-state-metrics-tls-config\" (UniqueName: \"kubernetes.io/secret/f7394cd4-d14c-450e-8865-7c7509c5021b-kube-state-metrics-tls-config\") pod \"kube-state-metrics-0\" (UID: \"f7394cd4-d14c-450e-8865-7c7509c5021b\") " pod="openstack/kube-state-metrics-0"
Feb 26 16:07:30 crc kubenswrapper[4907]: I0226 16:07:30.014329 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f7394cd4-d14c-450e-8865-7c7509c5021b-combined-ca-bundle\") pod \"kube-state-metrics-0\" (UID: \"f7394cd4-d14c-450e-8865-7c7509c5021b\") " pod="openstack/kube-state-metrics-0"
Feb 26 16:07:30 crc kubenswrapper[4907]: I0226 16:07:30.115736 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-npdhg\" (UniqueName: \"kubernetes.io/projected/f7394cd4-d14c-450e-8865-7c7509c5021b-kube-api-access-npdhg\") pod \"kube-state-metrics-0\" (UID: \"f7394cd4-d14c-450e-8865-7c7509c5021b\") " pod="openstack/kube-state-metrics-0"
Feb 26 16:07:30 crc kubenswrapper[4907]: I0226 16:07:30.115818 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-state-metrics-tls-certs\" (UniqueName: \"kubernetes.io/secret/f7394cd4-d14c-450e-8865-7c7509c5021b-kube-state-metrics-tls-certs\") pod \"kube-state-metrics-0\" (UID: \"f7394cd4-d14c-450e-8865-7c7509c5021b\") " pod="openstack/kube-state-metrics-0"
Feb 26 16:07:30 crc kubenswrapper[4907]: I0226 16:07:30.115842 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-state-metrics-tls-config\"
(UniqueName: \"kubernetes.io/secret/f7394cd4-d14c-450e-8865-7c7509c5021b-kube-state-metrics-tls-config\") pod \"kube-state-metrics-0\" (UID: \"f7394cd4-d14c-450e-8865-7c7509c5021b\") " pod="openstack/kube-state-metrics-0" Feb 26 16:07:30 crc kubenswrapper[4907]: I0226 16:07:30.115884 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f7394cd4-d14c-450e-8865-7c7509c5021b-combined-ca-bundle\") pod \"kube-state-metrics-0\" (UID: \"f7394cd4-d14c-450e-8865-7c7509c5021b\") " pod="openstack/kube-state-metrics-0" Feb 26 16:07:30 crc kubenswrapper[4907]: I0226 16:07:30.137708 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-npdhg\" (UniqueName: \"kubernetes.io/projected/f7394cd4-d14c-450e-8865-7c7509c5021b-kube-api-access-npdhg\") pod \"kube-state-metrics-0\" (UID: \"f7394cd4-d14c-450e-8865-7c7509c5021b\") " pod="openstack/kube-state-metrics-0" Feb 26 16:07:30 crc kubenswrapper[4907]: I0226 16:07:30.137756 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f7394cd4-d14c-450e-8865-7c7509c5021b-combined-ca-bundle\") pod \"kube-state-metrics-0\" (UID: \"f7394cd4-d14c-450e-8865-7c7509c5021b\") " pod="openstack/kube-state-metrics-0" Feb 26 16:07:30 crc kubenswrapper[4907]: I0226 16:07:30.138974 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-state-metrics-tls-certs\" (UniqueName: \"kubernetes.io/secret/f7394cd4-d14c-450e-8865-7c7509c5021b-kube-state-metrics-tls-certs\") pod \"kube-state-metrics-0\" (UID: \"f7394cd4-d14c-450e-8865-7c7509c5021b\") " pod="openstack/kube-state-metrics-0" Feb 26 16:07:30 crc kubenswrapper[4907]: I0226 16:07:30.139577 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-state-metrics-tls-config\" (UniqueName: 
\"kubernetes.io/secret/f7394cd4-d14c-450e-8865-7c7509c5021b-kube-state-metrics-tls-config\") pod \"kube-state-metrics-0\" (UID: \"f7394cd4-d14c-450e-8865-7c7509c5021b\") " pod="openstack/kube-state-metrics-0" Feb 26 16:07:30 crc kubenswrapper[4907]: I0226 16:07:30.147414 4907 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3623ea59-40fb-48f6-943d-1ea5fe3ad253" path="/var/lib/kubelet/pods/3623ea59-40fb-48f6-943d-1ea5fe3ad253/volumes" Feb 26 16:07:30 crc kubenswrapper[4907]: I0226 16:07:30.270399 4907 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/kube-state-metrics-0" Feb 26 16:07:30 crc kubenswrapper[4907]: I0226 16:07:30.766842 4907 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/kube-state-metrics-0"] Feb 26 16:07:30 crc kubenswrapper[4907]: I0226 16:07:30.776865 4907 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Feb 26 16:07:30 crc kubenswrapper[4907]: I0226 16:07:30.865118 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/kube-state-metrics-0" event={"ID":"f7394cd4-d14c-450e-8865-7c7509c5021b","Type":"ContainerStarted","Data":"a3cb0fecfb78fa1677a57a07cef43cd3f9d00096ed6e47a8c27f81ec32c301e5"} Feb 26 16:07:31 crc kubenswrapper[4907]: I0226 16:07:31.315140 4907 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-scheduler-0" Feb 26 16:07:31 crc kubenswrapper[4907]: I0226 16:07:31.598495 4907 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Feb 26 16:07:31 crc kubenswrapper[4907]: I0226 16:07:31.598980 4907 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="614d4398-61e7-4159-bcf2-a75e8c2c91fb" containerName="proxy-httpd" containerID="cri-o://263981f6f729e5e0643daa3a3a6cdb7958182b643e5936300173d35ab977bb96" gracePeriod=30 Feb 26 16:07:31 crc kubenswrapper[4907]: I0226 16:07:31.599061 4907 
kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="614d4398-61e7-4159-bcf2-a75e8c2c91fb" containerName="ceilometer-notification-agent" containerID="cri-o://fb844999d61fcc35ddd9efca649f4b958c4bffe574f9a722d14b83e7486203ab" gracePeriod=30 Feb 26 16:07:31 crc kubenswrapper[4907]: I0226 16:07:31.599041 4907 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="614d4398-61e7-4159-bcf2-a75e8c2c91fb" containerName="sg-core" containerID="cri-o://c857c5f82d8c70c13bf2c5eb1d3d2caed95365a3f1d0ed75a49e357e7fcb708c" gracePeriod=30 Feb 26 16:07:31 crc kubenswrapper[4907]: I0226 16:07:31.598955 4907 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="614d4398-61e7-4159-bcf2-a75e8c2c91fb" containerName="ceilometer-central-agent" containerID="cri-o://9220e6ca6e5e2e1f96dc9d7844d9da87559289e407da3cae6f6f0dcf53af0469" gracePeriod=30 Feb 26 16:07:31 crc kubenswrapper[4907]: I0226 16:07:31.875122 4907 generic.go:334] "Generic (PLEG): container finished" podID="614d4398-61e7-4159-bcf2-a75e8c2c91fb" containerID="263981f6f729e5e0643daa3a3a6cdb7958182b643e5936300173d35ab977bb96" exitCode=0 Feb 26 16:07:31 crc kubenswrapper[4907]: I0226 16:07:31.875425 4907 generic.go:334] "Generic (PLEG): container finished" podID="614d4398-61e7-4159-bcf2-a75e8c2c91fb" containerID="c857c5f82d8c70c13bf2c5eb1d3d2caed95365a3f1d0ed75a49e357e7fcb708c" exitCode=2 Feb 26 16:07:31 crc kubenswrapper[4907]: I0226 16:07:31.875183 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"614d4398-61e7-4159-bcf2-a75e8c2c91fb","Type":"ContainerDied","Data":"263981f6f729e5e0643daa3a3a6cdb7958182b643e5936300173d35ab977bb96"} Feb 26 16:07:31 crc kubenswrapper[4907]: I0226 16:07:31.875486 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" 
event={"ID":"614d4398-61e7-4159-bcf2-a75e8c2c91fb","Type":"ContainerDied","Data":"c857c5f82d8c70c13bf2c5eb1d3d2caed95365a3f1d0ed75a49e357e7fcb708c"} Feb 26 16:07:31 crc kubenswrapper[4907]: I0226 16:07:31.877347 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/kube-state-metrics-0" event={"ID":"f7394cd4-d14c-450e-8865-7c7509c5021b","Type":"ContainerStarted","Data":"976b83090d0bd5588b7a2f1623a989b8655b1b1b8d6f685413d61384ac025a4a"} Feb 26 16:07:31 crc kubenswrapper[4907]: I0226 16:07:31.877474 4907 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/kube-state-metrics-0" Feb 26 16:07:31 crc kubenswrapper[4907]: I0226 16:07:31.902914 4907 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/kube-state-metrics-0" podStartSLOduration=2.484714796 podStartE2EDuration="2.902896635s" podCreationTimestamp="2026-02-26 16:07:29 +0000 UTC" firstStartedPulling="2026-02-26 16:07:30.776656178 +0000 UTC m=+1513.295218027" lastFinishedPulling="2026-02-26 16:07:31.194838017 +0000 UTC m=+1513.713399866" observedRunningTime="2026-02-26 16:07:31.893858503 +0000 UTC m=+1514.412420352" watchObservedRunningTime="2026-02-26 16:07:31.902896635 +0000 UTC m=+1514.421458484" Feb 26 16:07:32 crc kubenswrapper[4907]: I0226 16:07:32.890177 4907 generic.go:334] "Generic (PLEG): container finished" podID="614d4398-61e7-4159-bcf2-a75e8c2c91fb" containerID="fb844999d61fcc35ddd9efca649f4b958c4bffe574f9a722d14b83e7486203ab" exitCode=0 Feb 26 16:07:32 crc kubenswrapper[4907]: I0226 16:07:32.890497 4907 generic.go:334] "Generic (PLEG): container finished" podID="614d4398-61e7-4159-bcf2-a75e8c2c91fb" containerID="9220e6ca6e5e2e1f96dc9d7844d9da87559289e407da3cae6f6f0dcf53af0469" exitCode=0 Feb 26 16:07:32 crc kubenswrapper[4907]: I0226 16:07:32.890265 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" 
event={"ID":"614d4398-61e7-4159-bcf2-a75e8c2c91fb","Type":"ContainerDied","Data":"fb844999d61fcc35ddd9efca649f4b958c4bffe574f9a722d14b83e7486203ab"} Feb 26 16:07:32 crc kubenswrapper[4907]: I0226 16:07:32.890569 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"614d4398-61e7-4159-bcf2-a75e8c2c91fb","Type":"ContainerDied","Data":"9220e6ca6e5e2e1f96dc9d7844d9da87559289e407da3cae6f6f0dcf53af0469"} Feb 26 16:07:33 crc kubenswrapper[4907]: I0226 16:07:33.121068 4907 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-metadata-0" Feb 26 16:07:33 crc kubenswrapper[4907]: I0226 16:07:33.121117 4907 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-metadata-0" Feb 26 16:07:33 crc kubenswrapper[4907]: I0226 16:07:33.208314 4907 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Feb 26 16:07:33 crc kubenswrapper[4907]: I0226 16:07:33.278220 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/614d4398-61e7-4159-bcf2-a75e8c2c91fb-config-data\") pod \"614d4398-61e7-4159-bcf2-a75e8c2c91fb\" (UID: \"614d4398-61e7-4159-bcf2-a75e8c2c91fb\") " Feb 26 16:07:33 crc kubenswrapper[4907]: I0226 16:07:33.278265 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/614d4398-61e7-4159-bcf2-a75e8c2c91fb-log-httpd\") pod \"614d4398-61e7-4159-bcf2-a75e8c2c91fb\" (UID: \"614d4398-61e7-4159-bcf2-a75e8c2c91fb\") " Feb 26 16:07:33 crc kubenswrapper[4907]: I0226 16:07:33.278291 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/614d4398-61e7-4159-bcf2-a75e8c2c91fb-scripts\") pod \"614d4398-61e7-4159-bcf2-a75e8c2c91fb\" (UID: \"614d4398-61e7-4159-bcf2-a75e8c2c91fb\") " Feb 26 
16:07:33 crc kubenswrapper[4907]: I0226 16:07:33.278352 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/614d4398-61e7-4159-bcf2-a75e8c2c91fb-combined-ca-bundle\") pod \"614d4398-61e7-4159-bcf2-a75e8c2c91fb\" (UID: \"614d4398-61e7-4159-bcf2-a75e8c2c91fb\") " Feb 26 16:07:33 crc kubenswrapper[4907]: I0226 16:07:33.278376 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/614d4398-61e7-4159-bcf2-a75e8c2c91fb-run-httpd\") pod \"614d4398-61e7-4159-bcf2-a75e8c2c91fb\" (UID: \"614d4398-61e7-4159-bcf2-a75e8c2c91fb\") " Feb 26 16:07:33 crc kubenswrapper[4907]: I0226 16:07:33.278390 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/614d4398-61e7-4159-bcf2-a75e8c2c91fb-sg-core-conf-yaml\") pod \"614d4398-61e7-4159-bcf2-a75e8c2c91fb\" (UID: \"614d4398-61e7-4159-bcf2-a75e8c2c91fb\") " Feb 26 16:07:33 crc kubenswrapper[4907]: I0226 16:07:33.278465 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-j9g89\" (UniqueName: \"kubernetes.io/projected/614d4398-61e7-4159-bcf2-a75e8c2c91fb-kube-api-access-j9g89\") pod \"614d4398-61e7-4159-bcf2-a75e8c2c91fb\" (UID: \"614d4398-61e7-4159-bcf2-a75e8c2c91fb\") " Feb 26 16:07:33 crc kubenswrapper[4907]: I0226 16:07:33.279144 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/614d4398-61e7-4159-bcf2-a75e8c2c91fb-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "614d4398-61e7-4159-bcf2-a75e8c2c91fb" (UID: "614d4398-61e7-4159-bcf2-a75e8c2c91fb"). InnerVolumeSpecName "run-httpd". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 26 16:07:33 crc kubenswrapper[4907]: I0226 16:07:33.279366 4907 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/614d4398-61e7-4159-bcf2-a75e8c2c91fb-run-httpd\") on node \"crc\" DevicePath \"\"" Feb 26 16:07:33 crc kubenswrapper[4907]: I0226 16:07:33.279655 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/614d4398-61e7-4159-bcf2-a75e8c2c91fb-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "614d4398-61e7-4159-bcf2-a75e8c2c91fb" (UID: "614d4398-61e7-4159-bcf2-a75e8c2c91fb"). InnerVolumeSpecName "log-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 26 16:07:33 crc kubenswrapper[4907]: I0226 16:07:33.285340 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/614d4398-61e7-4159-bcf2-a75e8c2c91fb-scripts" (OuterVolumeSpecName: "scripts") pod "614d4398-61e7-4159-bcf2-a75e8c2c91fb" (UID: "614d4398-61e7-4159-bcf2-a75e8c2c91fb"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 26 16:07:33 crc kubenswrapper[4907]: I0226 16:07:33.308838 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/614d4398-61e7-4159-bcf2-a75e8c2c91fb-kube-api-access-j9g89" (OuterVolumeSpecName: "kube-api-access-j9g89") pod "614d4398-61e7-4159-bcf2-a75e8c2c91fb" (UID: "614d4398-61e7-4159-bcf2-a75e8c2c91fb"). InnerVolumeSpecName "kube-api-access-j9g89". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 26 16:07:33 crc kubenswrapper[4907]: I0226 16:07:33.380888 4907 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-j9g89\" (UniqueName: \"kubernetes.io/projected/614d4398-61e7-4159-bcf2-a75e8c2c91fb-kube-api-access-j9g89\") on node \"crc\" DevicePath \"\"" Feb 26 16:07:33 crc kubenswrapper[4907]: I0226 16:07:33.381168 4907 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/614d4398-61e7-4159-bcf2-a75e8c2c91fb-log-httpd\") on node \"crc\" DevicePath \"\"" Feb 26 16:07:33 crc kubenswrapper[4907]: I0226 16:07:33.381272 4907 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/614d4398-61e7-4159-bcf2-a75e8c2c91fb-scripts\") on node \"crc\" DevicePath \"\"" Feb 26 16:07:33 crc kubenswrapper[4907]: I0226 16:07:33.396790 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/614d4398-61e7-4159-bcf2-a75e8c2c91fb-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "614d4398-61e7-4159-bcf2-a75e8c2c91fb" (UID: "614d4398-61e7-4159-bcf2-a75e8c2c91fb"). InnerVolumeSpecName "sg-core-conf-yaml". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 26 16:07:33 crc kubenswrapper[4907]: I0226 16:07:33.413265 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/614d4398-61e7-4159-bcf2-a75e8c2c91fb-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "614d4398-61e7-4159-bcf2-a75e8c2c91fb" (UID: "614d4398-61e7-4159-bcf2-a75e8c2c91fb"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 26 16:07:33 crc kubenswrapper[4907]: I0226 16:07:33.471218 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/614d4398-61e7-4159-bcf2-a75e8c2c91fb-config-data" (OuterVolumeSpecName: "config-data") pod "614d4398-61e7-4159-bcf2-a75e8c2c91fb" (UID: "614d4398-61e7-4159-bcf2-a75e8c2c91fb"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 26 16:07:33 crc kubenswrapper[4907]: I0226 16:07:33.484096 4907 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/614d4398-61e7-4159-bcf2-a75e8c2c91fb-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 26 16:07:33 crc kubenswrapper[4907]: I0226 16:07:33.484140 4907 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/614d4398-61e7-4159-bcf2-a75e8c2c91fb-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\"" Feb 26 16:07:33 crc kubenswrapper[4907]: I0226 16:07:33.484152 4907 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/614d4398-61e7-4159-bcf2-a75e8c2c91fb-config-data\") on node \"crc\" DevicePath \"\"" Feb 26 16:07:33 crc kubenswrapper[4907]: I0226 16:07:33.901270 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"614d4398-61e7-4159-bcf2-a75e8c2c91fb","Type":"ContainerDied","Data":"4d66ce7f890508fb72c7a79a35a2b2720ae4f7bb0d238b1256d1366905f7ed00"} Feb 26 16:07:33 crc kubenswrapper[4907]: I0226 16:07:33.901344 4907 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Feb 26 16:07:33 crc kubenswrapper[4907]: I0226 16:07:33.901657 4907 scope.go:117] "RemoveContainer" containerID="263981f6f729e5e0643daa3a3a6cdb7958182b643e5936300173d35ab977bb96" Feb 26 16:07:33 crc kubenswrapper[4907]: I0226 16:07:33.932644 4907 scope.go:117] "RemoveContainer" containerID="c857c5f82d8c70c13bf2c5eb1d3d2caed95365a3f1d0ed75a49e357e7fcb708c" Feb 26 16:07:33 crc kubenswrapper[4907]: I0226 16:07:33.956243 4907 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Feb 26 16:07:33 crc kubenswrapper[4907]: I0226 16:07:33.966686 4907 scope.go:117] "RemoveContainer" containerID="fb844999d61fcc35ddd9efca649f4b958c4bffe574f9a722d14b83e7486203ab" Feb 26 16:07:33 crc kubenswrapper[4907]: I0226 16:07:33.979643 4907 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ceilometer-0"] Feb 26 16:07:33 crc kubenswrapper[4907]: I0226 16:07:33.989653 4907 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"] Feb 26 16:07:33 crc kubenswrapper[4907]: E0226 16:07:33.990119 4907 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="614d4398-61e7-4159-bcf2-a75e8c2c91fb" containerName="ceilometer-central-agent" Feb 26 16:07:33 crc kubenswrapper[4907]: I0226 16:07:33.990141 4907 state_mem.go:107] "Deleted CPUSet assignment" podUID="614d4398-61e7-4159-bcf2-a75e8c2c91fb" containerName="ceilometer-central-agent" Feb 26 16:07:33 crc kubenswrapper[4907]: E0226 16:07:33.990168 4907 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="614d4398-61e7-4159-bcf2-a75e8c2c91fb" containerName="ceilometer-notification-agent" Feb 26 16:07:33 crc kubenswrapper[4907]: I0226 16:07:33.990177 4907 state_mem.go:107] "Deleted CPUSet assignment" podUID="614d4398-61e7-4159-bcf2-a75e8c2c91fb" containerName="ceilometer-notification-agent" Feb 26 16:07:33 crc kubenswrapper[4907]: E0226 16:07:33.990195 4907 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="614d4398-61e7-4159-bcf2-a75e8c2c91fb" containerName="proxy-httpd" Feb 26 16:07:33 crc kubenswrapper[4907]: I0226 16:07:33.990203 4907 state_mem.go:107] "Deleted CPUSet assignment" podUID="614d4398-61e7-4159-bcf2-a75e8c2c91fb" containerName="proxy-httpd" Feb 26 16:07:33 crc kubenswrapper[4907]: E0226 16:07:33.990228 4907 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="614d4398-61e7-4159-bcf2-a75e8c2c91fb" containerName="sg-core" Feb 26 16:07:33 crc kubenswrapper[4907]: I0226 16:07:33.990237 4907 state_mem.go:107] "Deleted CPUSet assignment" podUID="614d4398-61e7-4159-bcf2-a75e8c2c91fb" containerName="sg-core" Feb 26 16:07:33 crc kubenswrapper[4907]: I0226 16:07:33.990400 4907 memory_manager.go:354] "RemoveStaleState removing state" podUID="614d4398-61e7-4159-bcf2-a75e8c2c91fb" containerName="proxy-httpd" Feb 26 16:07:33 crc kubenswrapper[4907]: I0226 16:07:33.990431 4907 memory_manager.go:354] "RemoveStaleState removing state" podUID="614d4398-61e7-4159-bcf2-a75e8c2c91fb" containerName="ceilometer-central-agent" Feb 26 16:07:33 crc kubenswrapper[4907]: I0226 16:07:33.990441 4907 memory_manager.go:354] "RemoveStaleState removing state" podUID="614d4398-61e7-4159-bcf2-a75e8c2c91fb" containerName="ceilometer-notification-agent" Feb 26 16:07:33 crc kubenswrapper[4907]: I0226 16:07:33.990453 4907 memory_manager.go:354] "RemoveStaleState removing state" podUID="614d4398-61e7-4159-bcf2-a75e8c2c91fb" containerName="sg-core" Feb 26 16:07:33 crc kubenswrapper[4907]: I0226 16:07:33.992056 4907 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Feb 26 16:07:33 crc kubenswrapper[4907]: I0226 16:07:33.994434 4907 scope.go:117] "RemoveContainer" containerID="9220e6ca6e5e2e1f96dc9d7844d9da87559289e407da3cae6f6f0dcf53af0469" Feb 26 16:07:33 crc kubenswrapper[4907]: I0226 16:07:33.998067 4907 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts" Feb 26 16:07:33 crc kubenswrapper[4907]: I0226 16:07:33.998185 4907 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ceilometer-internal-svc" Feb 26 16:07:33 crc kubenswrapper[4907]: I0226 16:07:33.998395 4907 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data" Feb 26 16:07:34 crc kubenswrapper[4907]: I0226 16:07:34.006568 4907 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Feb 26 16:07:34 crc kubenswrapper[4907]: I0226 16:07:34.098080 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/8cf1cab1-8021-4ea6-9688-93259f692624-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"8cf1cab1-8021-4ea6-9688-93259f692624\") " pod="openstack/ceilometer-0" Feb 26 16:07:34 crc kubenswrapper[4907]: I0226 16:07:34.098150 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/8cf1cab1-8021-4ea6-9688-93259f692624-run-httpd\") pod \"ceilometer-0\" (UID: \"8cf1cab1-8021-4ea6-9688-93259f692624\") " pod="openstack/ceilometer-0" Feb 26 16:07:34 crc kubenswrapper[4907]: I0226 16:07:34.098292 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/8cf1cab1-8021-4ea6-9688-93259f692624-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"8cf1cab1-8021-4ea6-9688-93259f692624\") " 
pod="openstack/ceilometer-0" Feb 26 16:07:34 crc kubenswrapper[4907]: I0226 16:07:34.098340 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/8cf1cab1-8021-4ea6-9688-93259f692624-log-httpd\") pod \"ceilometer-0\" (UID: \"8cf1cab1-8021-4ea6-9688-93259f692624\") " pod="openstack/ceilometer-0" Feb 26 16:07:34 crc kubenswrapper[4907]: I0226 16:07:34.098370 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8cf1cab1-8021-4ea6-9688-93259f692624-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"8cf1cab1-8021-4ea6-9688-93259f692624\") " pod="openstack/ceilometer-0" Feb 26 16:07:34 crc kubenswrapper[4907]: I0226 16:07:34.098450 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8cf1cab1-8021-4ea6-9688-93259f692624-config-data\") pod \"ceilometer-0\" (UID: \"8cf1cab1-8021-4ea6-9688-93259f692624\") " pod="openstack/ceilometer-0" Feb 26 16:07:34 crc kubenswrapper[4907]: I0226 16:07:34.098508 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bvs86\" (UniqueName: \"kubernetes.io/projected/8cf1cab1-8021-4ea6-9688-93259f692624-kube-api-access-bvs86\") pod \"ceilometer-0\" (UID: \"8cf1cab1-8021-4ea6-9688-93259f692624\") " pod="openstack/ceilometer-0" Feb 26 16:07:34 crc kubenswrapper[4907]: I0226 16:07:34.098535 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/8cf1cab1-8021-4ea6-9688-93259f692624-scripts\") pod \"ceilometer-0\" (UID: \"8cf1cab1-8021-4ea6-9688-93259f692624\") " pod="openstack/ceilometer-0" Feb 26 16:07:34 crc kubenswrapper[4907]: I0226 16:07:34.136739 4907 kubelet_volumes.go:163] "Cleaned up 
orphaned pod volumes dir" podUID="614d4398-61e7-4159-bcf2-a75e8c2c91fb" path="/var/lib/kubelet/pods/614d4398-61e7-4159-bcf2-a75e8c2c91fb/volumes" Feb 26 16:07:34 crc kubenswrapper[4907]: I0226 16:07:34.168737 4907 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-metadata-0" podUID="3bfed0ca-af76-4ba2-8be4-84716902175b" containerName="nova-metadata-metadata" probeResult="failure" output="Get \"https://10.217.0.200:8775/\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Feb 26 16:07:34 crc kubenswrapper[4907]: I0226 16:07:34.169112 4907 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-metadata-0" podUID="3bfed0ca-af76-4ba2-8be4-84716902175b" containerName="nova-metadata-log" probeResult="failure" output="Get \"https://10.217.0.200:8775/\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Feb 26 16:07:34 crc kubenswrapper[4907]: I0226 16:07:34.200095 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/8cf1cab1-8021-4ea6-9688-93259f692624-scripts\") pod \"ceilometer-0\" (UID: \"8cf1cab1-8021-4ea6-9688-93259f692624\") " pod="openstack/ceilometer-0" Feb 26 16:07:34 crc kubenswrapper[4907]: I0226 16:07:34.200176 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/8cf1cab1-8021-4ea6-9688-93259f692624-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"8cf1cab1-8021-4ea6-9688-93259f692624\") " pod="openstack/ceilometer-0" Feb 26 16:07:34 crc kubenswrapper[4907]: I0226 16:07:34.200200 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/8cf1cab1-8021-4ea6-9688-93259f692624-run-httpd\") pod \"ceilometer-0\" (UID: \"8cf1cab1-8021-4ea6-9688-93259f692624\") " pod="openstack/ceilometer-0" Feb 26 16:07:34 crc kubenswrapper[4907]: I0226 
16:07:34.200231 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/8cf1cab1-8021-4ea6-9688-93259f692624-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"8cf1cab1-8021-4ea6-9688-93259f692624\") " pod="openstack/ceilometer-0" Feb 26 16:07:34 crc kubenswrapper[4907]: I0226 16:07:34.200256 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/8cf1cab1-8021-4ea6-9688-93259f692624-log-httpd\") pod \"ceilometer-0\" (UID: \"8cf1cab1-8021-4ea6-9688-93259f692624\") " pod="openstack/ceilometer-0" Feb 26 16:07:34 crc kubenswrapper[4907]: I0226 16:07:34.200288 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8cf1cab1-8021-4ea6-9688-93259f692624-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"8cf1cab1-8021-4ea6-9688-93259f692624\") " pod="openstack/ceilometer-0" Feb 26 16:07:34 crc kubenswrapper[4907]: I0226 16:07:34.200347 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8cf1cab1-8021-4ea6-9688-93259f692624-config-data\") pod \"ceilometer-0\" (UID: \"8cf1cab1-8021-4ea6-9688-93259f692624\") " pod="openstack/ceilometer-0" Feb 26 16:07:34 crc kubenswrapper[4907]: I0226 16:07:34.200385 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bvs86\" (UniqueName: \"kubernetes.io/projected/8cf1cab1-8021-4ea6-9688-93259f692624-kube-api-access-bvs86\") pod \"ceilometer-0\" (UID: \"8cf1cab1-8021-4ea6-9688-93259f692624\") " pod="openstack/ceilometer-0" Feb 26 16:07:34 crc kubenswrapper[4907]: I0226 16:07:34.200980 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/8cf1cab1-8021-4ea6-9688-93259f692624-run-httpd\") pod 
\"ceilometer-0\" (UID: \"8cf1cab1-8021-4ea6-9688-93259f692624\") " pod="openstack/ceilometer-0" Feb 26 16:07:34 crc kubenswrapper[4907]: I0226 16:07:34.201085 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/8cf1cab1-8021-4ea6-9688-93259f692624-log-httpd\") pod \"ceilometer-0\" (UID: \"8cf1cab1-8021-4ea6-9688-93259f692624\") " pod="openstack/ceilometer-0" Feb 26 16:07:34 crc kubenswrapper[4907]: I0226 16:07:34.206980 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/8cf1cab1-8021-4ea6-9688-93259f692624-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"8cf1cab1-8021-4ea6-9688-93259f692624\") " pod="openstack/ceilometer-0" Feb 26 16:07:34 crc kubenswrapper[4907]: I0226 16:07:34.207963 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/8cf1cab1-8021-4ea6-9688-93259f692624-scripts\") pod \"ceilometer-0\" (UID: \"8cf1cab1-8021-4ea6-9688-93259f692624\") " pod="openstack/ceilometer-0" Feb 26 16:07:34 crc kubenswrapper[4907]: I0226 16:07:34.208912 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8cf1cab1-8021-4ea6-9688-93259f692624-config-data\") pod \"ceilometer-0\" (UID: \"8cf1cab1-8021-4ea6-9688-93259f692624\") " pod="openstack/ceilometer-0" Feb 26 16:07:34 crc kubenswrapper[4907]: I0226 16:07:34.211160 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/8cf1cab1-8021-4ea6-9688-93259f692624-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"8cf1cab1-8021-4ea6-9688-93259f692624\") " pod="openstack/ceilometer-0" Feb 26 16:07:34 crc kubenswrapper[4907]: I0226 16:07:34.211669 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/8cf1cab1-8021-4ea6-9688-93259f692624-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"8cf1cab1-8021-4ea6-9688-93259f692624\") " pod="openstack/ceilometer-0" Feb 26 16:07:34 crc kubenswrapper[4907]: I0226 16:07:34.231275 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bvs86\" (UniqueName: \"kubernetes.io/projected/8cf1cab1-8021-4ea6-9688-93259f692624-kube-api-access-bvs86\") pod \"ceilometer-0\" (UID: \"8cf1cab1-8021-4ea6-9688-93259f692624\") " pod="openstack/ceilometer-0" Feb 26 16:07:34 crc kubenswrapper[4907]: I0226 16:07:34.311461 4907 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Feb 26 16:07:34 crc kubenswrapper[4907]: I0226 16:07:34.861664 4907 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Feb 26 16:07:34 crc kubenswrapper[4907]: I0226 16:07:34.927483 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"8cf1cab1-8021-4ea6-9688-93259f692624","Type":"ContainerStarted","Data":"605d38c2446d4ec9c8c8f033ce8cb3566623d201b0f11dac6191667371fc43c2"} Feb 26 16:07:36 crc kubenswrapper[4907]: I0226 16:07:36.239837 4907 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-cell1-conductor-0" Feb 26 16:07:36 crc kubenswrapper[4907]: I0226 16:07:36.315491 4907 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-scheduler-0" Feb 26 16:07:36 crc kubenswrapper[4907]: I0226 16:07:36.348004 4907 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-scheduler-0" Feb 26 16:07:36 crc kubenswrapper[4907]: I0226 16:07:36.383856 4907 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0" Feb 26 16:07:36 crc kubenswrapper[4907]: I0226 16:07:36.383906 4907 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" 
pod="openstack/nova-api-0" Feb 26 16:07:36 crc kubenswrapper[4907]: I0226 16:07:36.947569 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"8cf1cab1-8021-4ea6-9688-93259f692624","Type":"ContainerStarted","Data":"56c71d665d99c1752ebace4b0e674b38b3ab4d8daaed18c0f4bbfebaea520de1"} Feb 26 16:07:36 crc kubenswrapper[4907]: I0226 16:07:36.947882 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"8cf1cab1-8021-4ea6-9688-93259f692624","Type":"ContainerStarted","Data":"6cb3984c9d3326ced22adb9698477c60db88c6495ef9c37e852b17be4ad7248d"} Feb 26 16:07:36 crc kubenswrapper[4907]: I0226 16:07:36.979342 4907 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-scheduler-0" Feb 26 16:07:37 crc kubenswrapper[4907]: I0226 16:07:37.465757 4907 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="97a6cbcf-5a97-4fe9-b47f-a738fb4dc887" containerName="nova-api-api" probeResult="failure" output="Get \"http://10.217.0.203:8774/\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Feb 26 16:07:37 crc kubenswrapper[4907]: I0226 16:07:37.466003 4907 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="97a6cbcf-5a97-4fe9-b47f-a738fb4dc887" containerName="nova-api-log" probeResult="failure" output="Get \"http://10.217.0.203:8774/\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Feb 26 16:07:37 crc kubenswrapper[4907]: I0226 16:07:37.749417 4907 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/horizon-6fccfb8496-4tqhr" podUID="911d5df8-d8e2-4552-9c75-33c5ab72646b" containerName="horizon" probeResult="failure" output="Get \"https://10.217.0.153:8443/dashboard/auth/login/?next=/dashboard/\": dial tcp 10.217.0.153:8443: connect: connection refused" Feb 26 16:07:37 crc kubenswrapper[4907]: I0226 16:07:37.959123 4907 kubelet.go:2453] 
"SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"8cf1cab1-8021-4ea6-9688-93259f692624","Type":"ContainerStarted","Data":"c47aec62f1ef99a7c489702f9f3cb298196db0729bb5d415c26747371ecd776e"} Feb 26 16:07:40 crc kubenswrapper[4907]: I0226 16:07:40.009920 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"8cf1cab1-8021-4ea6-9688-93259f692624","Type":"ContainerStarted","Data":"6e7f088ab98e65eb3a338ca9092de26a8fa5a9e2ea620605768cf07e895a33ff"} Feb 26 16:07:40 crc kubenswrapper[4907]: I0226 16:07:40.010634 4907 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0" Feb 26 16:07:40 crc kubenswrapper[4907]: I0226 16:07:40.036555 4907 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceilometer-0" podStartSLOduration=2.723239736 podStartE2EDuration="7.036535088s" podCreationTimestamp="2026-02-26 16:07:33 +0000 UTC" firstStartedPulling="2026-02-26 16:07:34.887668437 +0000 UTC m=+1517.406230286" lastFinishedPulling="2026-02-26 16:07:39.200963799 +0000 UTC m=+1521.719525638" observedRunningTime="2026-02-26 16:07:40.030372148 +0000 UTC m=+1522.548933997" watchObservedRunningTime="2026-02-26 16:07:40.036535088 +0000 UTC m=+1522.555096937" Feb 26 16:07:40 crc kubenswrapper[4907]: I0226 16:07:40.286973 4907 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/kube-state-metrics-0" Feb 26 16:07:42 crc kubenswrapper[4907]: I0226 16:07:42.204084 4907 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/horizon-76d88967b8-wmzcw" Feb 26 16:07:43 crc kubenswrapper[4907]: I0226 16:07:43.126520 4907 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-metadata-0" Feb 26 16:07:43 crc kubenswrapper[4907]: I0226 16:07:43.126890 4907 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-metadata-0" Feb 26 16:07:43 crc 
kubenswrapper[4907]: I0226 16:07:43.131280 4907 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-metadata-0" Feb 26 16:07:43 crc kubenswrapper[4907]: I0226 16:07:43.133500 4907 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-metadata-0" Feb 26 16:07:43 crc kubenswrapper[4907]: I0226 16:07:43.926124 4907 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-novncproxy-0" Feb 26 16:07:43 crc kubenswrapper[4907]: I0226 16:07:43.981437 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a4a6fb6f-387d-4765-8360-1570fa74a16e-combined-ca-bundle\") pod \"a4a6fb6f-387d-4765-8360-1570fa74a16e\" (UID: \"a4a6fb6f-387d-4765-8360-1570fa74a16e\") " Feb 26 16:07:43 crc kubenswrapper[4907]: I0226 16:07:43.981620 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a4a6fb6f-387d-4765-8360-1570fa74a16e-config-data\") pod \"a4a6fb6f-387d-4765-8360-1570fa74a16e\" (UID: \"a4a6fb6f-387d-4765-8360-1570fa74a16e\") " Feb 26 16:07:43 crc kubenswrapper[4907]: I0226 16:07:43.981643 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-j7qlk\" (UniqueName: \"kubernetes.io/projected/a4a6fb6f-387d-4765-8360-1570fa74a16e-kube-api-access-j7qlk\") pod \"a4a6fb6f-387d-4765-8360-1570fa74a16e\" (UID: \"a4a6fb6f-387d-4765-8360-1570fa74a16e\") " Feb 26 16:07:43 crc kubenswrapper[4907]: I0226 16:07:43.988615 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a4a6fb6f-387d-4765-8360-1570fa74a16e-kube-api-access-j7qlk" (OuterVolumeSpecName: "kube-api-access-j7qlk") pod "a4a6fb6f-387d-4765-8360-1570fa74a16e" (UID: "a4a6fb6f-387d-4765-8360-1570fa74a16e"). InnerVolumeSpecName "kube-api-access-j7qlk". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 26 16:07:44 crc kubenswrapper[4907]: I0226 16:07:44.011718 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a4a6fb6f-387d-4765-8360-1570fa74a16e-config-data" (OuterVolumeSpecName: "config-data") pod "a4a6fb6f-387d-4765-8360-1570fa74a16e" (UID: "a4a6fb6f-387d-4765-8360-1570fa74a16e"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 26 16:07:44 crc kubenswrapper[4907]: I0226 16:07:44.050326 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a4a6fb6f-387d-4765-8360-1570fa74a16e-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "a4a6fb6f-387d-4765-8360-1570fa74a16e" (UID: "a4a6fb6f-387d-4765-8360-1570fa74a16e"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 26 16:07:44 crc kubenswrapper[4907]: I0226 16:07:44.057631 4907 generic.go:334] "Generic (PLEG): container finished" podID="a4a6fb6f-387d-4765-8360-1570fa74a16e" containerID="11ce89255fbbde7ffd3fa59b11b8d8d9480560084ad8c285e5b289210fdb92d2" exitCode=137 Feb 26 16:07:44 crc kubenswrapper[4907]: I0226 16:07:44.058332 4907 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-novncproxy-0" Feb 26 16:07:44 crc kubenswrapper[4907]: I0226 16:07:44.058392 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-novncproxy-0" event={"ID":"a4a6fb6f-387d-4765-8360-1570fa74a16e","Type":"ContainerDied","Data":"11ce89255fbbde7ffd3fa59b11b8d8d9480560084ad8c285e5b289210fdb92d2"} Feb 26 16:07:44 crc kubenswrapper[4907]: I0226 16:07:44.058466 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-novncproxy-0" event={"ID":"a4a6fb6f-387d-4765-8360-1570fa74a16e","Type":"ContainerDied","Data":"d48724b39b980e13296deadb81130fe0ac074c16990e2177f92168b64ed10d00"} Feb 26 16:07:44 crc kubenswrapper[4907]: I0226 16:07:44.058488 4907 scope.go:117] "RemoveContainer" containerID="11ce89255fbbde7ffd3fa59b11b8d8d9480560084ad8c285e5b289210fdb92d2" Feb 26 16:07:44 crc kubenswrapper[4907]: I0226 16:07:44.084428 4907 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a4a6fb6f-387d-4765-8360-1570fa74a16e-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 26 16:07:44 crc kubenswrapper[4907]: I0226 16:07:44.084481 4907 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a4a6fb6f-387d-4765-8360-1570fa74a16e-config-data\") on node \"crc\" DevicePath \"\"" Feb 26 16:07:44 crc kubenswrapper[4907]: I0226 16:07:44.084492 4907 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-j7qlk\" (UniqueName: \"kubernetes.io/projected/a4a6fb6f-387d-4765-8360-1570fa74a16e-kube-api-access-j7qlk\") on node \"crc\" DevicePath \"\"" Feb 26 16:07:44 crc kubenswrapper[4907]: I0226 16:07:44.112662 4907 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Feb 26 16:07:44 crc kubenswrapper[4907]: I0226 16:07:44.129482 4907 scope.go:117] "RemoveContainer" 
containerID="11ce89255fbbde7ffd3fa59b11b8d8d9480560084ad8c285e5b289210fdb92d2" Feb 26 16:07:44 crc kubenswrapper[4907]: E0226 16:07:44.135946 4907 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"11ce89255fbbde7ffd3fa59b11b8d8d9480560084ad8c285e5b289210fdb92d2\": container with ID starting with 11ce89255fbbde7ffd3fa59b11b8d8d9480560084ad8c285e5b289210fdb92d2 not found: ID does not exist" containerID="11ce89255fbbde7ffd3fa59b11b8d8d9480560084ad8c285e5b289210fdb92d2" Feb 26 16:07:44 crc kubenswrapper[4907]: I0226 16:07:44.136003 4907 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"11ce89255fbbde7ffd3fa59b11b8d8d9480560084ad8c285e5b289210fdb92d2"} err="failed to get container status \"11ce89255fbbde7ffd3fa59b11b8d8d9480560084ad8c285e5b289210fdb92d2\": rpc error: code = NotFound desc = could not find container \"11ce89255fbbde7ffd3fa59b11b8d8d9480560084ad8c285e5b289210fdb92d2\": container with ID starting with 11ce89255fbbde7ffd3fa59b11b8d8d9480560084ad8c285e5b289210fdb92d2 not found: ID does not exist" Feb 26 16:07:44 crc kubenswrapper[4907]: I0226 16:07:44.158183 4907 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Feb 26 16:07:44 crc kubenswrapper[4907]: I0226 16:07:44.177518 4907 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Feb 26 16:07:44 crc kubenswrapper[4907]: E0226 16:07:44.177949 4907 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a4a6fb6f-387d-4765-8360-1570fa74a16e" containerName="nova-cell1-novncproxy-novncproxy" Feb 26 16:07:44 crc kubenswrapper[4907]: I0226 16:07:44.177967 4907 state_mem.go:107] "Deleted CPUSet assignment" podUID="a4a6fb6f-387d-4765-8360-1570fa74a16e" containerName="nova-cell1-novncproxy-novncproxy" Feb 26 16:07:44 crc kubenswrapper[4907]: I0226 16:07:44.178174 4907 memory_manager.go:354] "RemoveStaleState 
removing state" podUID="a4a6fb6f-387d-4765-8360-1570fa74a16e" containerName="nova-cell1-novncproxy-novncproxy" Feb 26 16:07:44 crc kubenswrapper[4907]: I0226 16:07:44.178791 4907 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-novncproxy-0" Feb 26 16:07:44 crc kubenswrapper[4907]: I0226 16:07:44.182322 4907 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-novncproxy-config-data" Feb 26 16:07:44 crc kubenswrapper[4907]: I0226 16:07:44.182670 4907 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-novncproxy-cell1-public-svc" Feb 26 16:07:44 crc kubenswrapper[4907]: I0226 16:07:44.182848 4907 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-novncproxy-cell1-vencrypt" Feb 26 16:07:44 crc kubenswrapper[4907]: I0226 16:07:44.195814 4907 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Feb 26 16:07:44 crc kubenswrapper[4907]: I0226 16:07:44.288119 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f166b819-6d86-432a-a806-764338fb2687-config-data\") pod \"nova-cell1-novncproxy-0\" (UID: \"f166b819-6d86-432a-a806-764338fb2687\") " pod="openstack/nova-cell1-novncproxy-0" Feb 26 16:07:44 crc kubenswrapper[4907]: I0226 16:07:44.288182 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"vencrypt-tls-certs\" (UniqueName: \"kubernetes.io/secret/f166b819-6d86-432a-a806-764338fb2687-vencrypt-tls-certs\") pod \"nova-cell1-novncproxy-0\" (UID: \"f166b819-6d86-432a-a806-764338fb2687\") " pod="openstack/nova-cell1-novncproxy-0" Feb 26 16:07:44 crc kubenswrapper[4907]: I0226 16:07:44.288249 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-novncproxy-tls-certs\" (UniqueName: 
\"kubernetes.io/secret/f166b819-6d86-432a-a806-764338fb2687-nova-novncproxy-tls-certs\") pod \"nova-cell1-novncproxy-0\" (UID: \"f166b819-6d86-432a-a806-764338fb2687\") " pod="openstack/nova-cell1-novncproxy-0" Feb 26 16:07:44 crc kubenswrapper[4907]: I0226 16:07:44.288418 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rfxx6\" (UniqueName: \"kubernetes.io/projected/f166b819-6d86-432a-a806-764338fb2687-kube-api-access-rfxx6\") pod \"nova-cell1-novncproxy-0\" (UID: \"f166b819-6d86-432a-a806-764338fb2687\") " pod="openstack/nova-cell1-novncproxy-0" Feb 26 16:07:44 crc kubenswrapper[4907]: I0226 16:07:44.288483 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f166b819-6d86-432a-a806-764338fb2687-combined-ca-bundle\") pod \"nova-cell1-novncproxy-0\" (UID: \"f166b819-6d86-432a-a806-764338fb2687\") " pod="openstack/nova-cell1-novncproxy-0" Feb 26 16:07:44 crc kubenswrapper[4907]: I0226 16:07:44.390234 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rfxx6\" (UniqueName: \"kubernetes.io/projected/f166b819-6d86-432a-a806-764338fb2687-kube-api-access-rfxx6\") pod \"nova-cell1-novncproxy-0\" (UID: \"f166b819-6d86-432a-a806-764338fb2687\") " pod="openstack/nova-cell1-novncproxy-0" Feb 26 16:07:44 crc kubenswrapper[4907]: I0226 16:07:44.390367 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f166b819-6d86-432a-a806-764338fb2687-combined-ca-bundle\") pod \"nova-cell1-novncproxy-0\" (UID: \"f166b819-6d86-432a-a806-764338fb2687\") " pod="openstack/nova-cell1-novncproxy-0" Feb 26 16:07:44 crc kubenswrapper[4907]: I0226 16:07:44.390406 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: 
\"kubernetes.io/secret/f166b819-6d86-432a-a806-764338fb2687-config-data\") pod \"nova-cell1-novncproxy-0\" (UID: \"f166b819-6d86-432a-a806-764338fb2687\") " pod="openstack/nova-cell1-novncproxy-0" Feb 26 16:07:44 crc kubenswrapper[4907]: I0226 16:07:44.390429 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"vencrypt-tls-certs\" (UniqueName: \"kubernetes.io/secret/f166b819-6d86-432a-a806-764338fb2687-vencrypt-tls-certs\") pod \"nova-cell1-novncproxy-0\" (UID: \"f166b819-6d86-432a-a806-764338fb2687\") " pod="openstack/nova-cell1-novncproxy-0" Feb 26 16:07:44 crc kubenswrapper[4907]: I0226 16:07:44.390484 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-novncproxy-tls-certs\" (UniqueName: \"kubernetes.io/secret/f166b819-6d86-432a-a806-764338fb2687-nova-novncproxy-tls-certs\") pod \"nova-cell1-novncproxy-0\" (UID: \"f166b819-6d86-432a-a806-764338fb2687\") " pod="openstack/nova-cell1-novncproxy-0" Feb 26 16:07:44 crc kubenswrapper[4907]: I0226 16:07:44.395766 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-novncproxy-tls-certs\" (UniqueName: \"kubernetes.io/secret/f166b819-6d86-432a-a806-764338fb2687-nova-novncproxy-tls-certs\") pod \"nova-cell1-novncproxy-0\" (UID: \"f166b819-6d86-432a-a806-764338fb2687\") " pod="openstack/nova-cell1-novncproxy-0" Feb 26 16:07:44 crc kubenswrapper[4907]: I0226 16:07:44.395819 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f166b819-6d86-432a-a806-764338fb2687-combined-ca-bundle\") pod \"nova-cell1-novncproxy-0\" (UID: \"f166b819-6d86-432a-a806-764338fb2687\") " pod="openstack/nova-cell1-novncproxy-0" Feb 26 16:07:44 crc kubenswrapper[4907]: I0226 16:07:44.397529 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f166b819-6d86-432a-a806-764338fb2687-config-data\") pod 
\"nova-cell1-novncproxy-0\" (UID: \"f166b819-6d86-432a-a806-764338fb2687\") " pod="openstack/nova-cell1-novncproxy-0" Feb 26 16:07:44 crc kubenswrapper[4907]: I0226 16:07:44.398651 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"vencrypt-tls-certs\" (UniqueName: \"kubernetes.io/secret/f166b819-6d86-432a-a806-764338fb2687-vencrypt-tls-certs\") pod \"nova-cell1-novncproxy-0\" (UID: \"f166b819-6d86-432a-a806-764338fb2687\") " pod="openstack/nova-cell1-novncproxy-0" Feb 26 16:07:44 crc kubenswrapper[4907]: I0226 16:07:44.413092 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rfxx6\" (UniqueName: \"kubernetes.io/projected/f166b819-6d86-432a-a806-764338fb2687-kube-api-access-rfxx6\") pod \"nova-cell1-novncproxy-0\" (UID: \"f166b819-6d86-432a-a806-764338fb2687\") " pod="openstack/nova-cell1-novncproxy-0" Feb 26 16:07:44 crc kubenswrapper[4907]: I0226 16:07:44.507829 4907 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-novncproxy-0" Feb 26 16:07:44 crc kubenswrapper[4907]: I0226 16:07:44.980422 4907 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Feb 26 16:07:45 crc kubenswrapper[4907]: I0226 16:07:45.077311 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-novncproxy-0" event={"ID":"f166b819-6d86-432a-a806-764338fb2687","Type":"ContainerStarted","Data":"e31894c45773c07f818f49bb5b52d84fd1609d050c2dc680d595129a1f1f8b76"} Feb 26 16:07:45 crc kubenswrapper[4907]: I0226 16:07:45.146067 4907 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/horizon-76d88967b8-wmzcw" Feb 26 16:07:45 crc kubenswrapper[4907]: I0226 16:07:45.238886 4907 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/horizon-6fccfb8496-4tqhr"] Feb 26 16:07:45 crc kubenswrapper[4907]: I0226 16:07:45.243125 4907 kuberuntime_container.go:808] "Killing container with a grace 
period" pod="openstack/horizon-6fccfb8496-4tqhr" podUID="911d5df8-d8e2-4552-9c75-33c5ab72646b" containerName="horizon" containerID="cri-o://9e39d9243d4cdfe57e174b7503dc46aaf0fae6d591c7d87a7c2c19a92a84a500" gracePeriod=30 Feb 26 16:07:45 crc kubenswrapper[4907]: I0226 16:07:45.239468 4907 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/horizon-6fccfb8496-4tqhr" podUID="911d5df8-d8e2-4552-9c75-33c5ab72646b" containerName="horizon-log" containerID="cri-o://3f95094dd73a53aa831d3c7f002970271a280a470ee37d101788fd1290991f04" gracePeriod=30 Feb 26 16:07:46 crc kubenswrapper[4907]: I0226 16:07:46.092340 4907 generic.go:334] "Generic (PLEG): container finished" podID="911d5df8-d8e2-4552-9c75-33c5ab72646b" containerID="9e39d9243d4cdfe57e174b7503dc46aaf0fae6d591c7d87a7c2c19a92a84a500" exitCode=0 Feb 26 16:07:46 crc kubenswrapper[4907]: I0226 16:07:46.092540 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-6fccfb8496-4tqhr" event={"ID":"911d5df8-d8e2-4552-9c75-33c5ab72646b","Type":"ContainerDied","Data":"9e39d9243d4cdfe57e174b7503dc46aaf0fae6d591c7d87a7c2c19a92a84a500"} Feb 26 16:07:46 crc kubenswrapper[4907]: I0226 16:07:46.092888 4907 scope.go:117] "RemoveContainer" containerID="0e9ea68de0c1e921e9ed4ee0e299561d11e0b96c063a8d42fd8a0ea1f0193bee" Feb 26 16:07:46 crc kubenswrapper[4907]: I0226 16:07:46.097652 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-novncproxy-0" event={"ID":"f166b819-6d86-432a-a806-764338fb2687","Type":"ContainerStarted","Data":"14105acd22a8834d0752a16f358262444badf2728275a1731d0cbd5770a803de"} Feb 26 16:07:46 crc kubenswrapper[4907]: I0226 16:07:46.134681 4907 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell1-novncproxy-0" podStartSLOduration=2.134658412 podStartE2EDuration="2.134658412s" podCreationTimestamp="2026-02-26 16:07:44 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" 
lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-26 16:07:46.121145721 +0000 UTC m=+1528.639707600" watchObservedRunningTime="2026-02-26 16:07:46.134658412 +0000 UTC m=+1528.653220261" Feb 26 16:07:46 crc kubenswrapper[4907]: I0226 16:07:46.143672 4907 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a4a6fb6f-387d-4765-8360-1570fa74a16e" path="/var/lib/kubelet/pods/a4a6fb6f-387d-4765-8360-1570fa74a16e/volumes" Feb 26 16:07:46 crc kubenswrapper[4907]: I0226 16:07:46.388445 4907 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-api-0" Feb 26 16:07:46 crc kubenswrapper[4907]: I0226 16:07:46.388991 4907 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-api-0" Feb 26 16:07:46 crc kubenswrapper[4907]: I0226 16:07:46.392848 4907 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-api-0" Feb 26 16:07:46 crc kubenswrapper[4907]: I0226 16:07:46.393240 4907 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-api-0" Feb 26 16:07:47 crc kubenswrapper[4907]: I0226 16:07:47.110817 4907 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-api-0" Feb 26 16:07:47 crc kubenswrapper[4907]: I0226 16:07:47.115311 4907 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-api-0" Feb 26 16:07:47 crc kubenswrapper[4907]: I0226 16:07:47.332700 4907 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-cd5cbd7b9-chsg6"] Feb 26 16:07:47 crc kubenswrapper[4907]: I0226 16:07:47.334297 4907 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-cd5cbd7b9-chsg6" Feb 26 16:07:47 crc kubenswrapper[4907]: I0226 16:07:47.356143 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/0ac59eb1-73ee-4a73-90bd-2273f03c9498-config\") pod \"dnsmasq-dns-cd5cbd7b9-chsg6\" (UID: \"0ac59eb1-73ee-4a73-90bd-2273f03c9498\") " pod="openstack/dnsmasq-dns-cd5cbd7b9-chsg6" Feb 26 16:07:47 crc kubenswrapper[4907]: I0226 16:07:47.356273 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/0ac59eb1-73ee-4a73-90bd-2273f03c9498-dns-svc\") pod \"dnsmasq-dns-cd5cbd7b9-chsg6\" (UID: \"0ac59eb1-73ee-4a73-90bd-2273f03c9498\") " pod="openstack/dnsmasq-dns-cd5cbd7b9-chsg6" Feb 26 16:07:47 crc kubenswrapper[4907]: I0226 16:07:47.356310 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hpqx5\" (UniqueName: \"kubernetes.io/projected/0ac59eb1-73ee-4a73-90bd-2273f03c9498-kube-api-access-hpqx5\") pod \"dnsmasq-dns-cd5cbd7b9-chsg6\" (UID: \"0ac59eb1-73ee-4a73-90bd-2273f03c9498\") " pod="openstack/dnsmasq-dns-cd5cbd7b9-chsg6" Feb 26 16:07:47 crc kubenswrapper[4907]: I0226 16:07:47.356370 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/0ac59eb1-73ee-4a73-90bd-2273f03c9498-dns-swift-storage-0\") pod \"dnsmasq-dns-cd5cbd7b9-chsg6\" (UID: \"0ac59eb1-73ee-4a73-90bd-2273f03c9498\") " pod="openstack/dnsmasq-dns-cd5cbd7b9-chsg6" Feb 26 16:07:47 crc kubenswrapper[4907]: I0226 16:07:47.356428 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/0ac59eb1-73ee-4a73-90bd-2273f03c9498-ovsdbserver-nb\") pod \"dnsmasq-dns-cd5cbd7b9-chsg6\" 
(UID: \"0ac59eb1-73ee-4a73-90bd-2273f03c9498\") " pod="openstack/dnsmasq-dns-cd5cbd7b9-chsg6" Feb 26 16:07:47 crc kubenswrapper[4907]: I0226 16:07:47.356463 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/0ac59eb1-73ee-4a73-90bd-2273f03c9498-ovsdbserver-sb\") pod \"dnsmasq-dns-cd5cbd7b9-chsg6\" (UID: \"0ac59eb1-73ee-4a73-90bd-2273f03c9498\") " pod="openstack/dnsmasq-dns-cd5cbd7b9-chsg6" Feb 26 16:07:47 crc kubenswrapper[4907]: I0226 16:07:47.360001 4907 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-cd5cbd7b9-chsg6"] Feb 26 16:07:47 crc kubenswrapper[4907]: I0226 16:07:47.458552 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/0ac59eb1-73ee-4a73-90bd-2273f03c9498-config\") pod \"dnsmasq-dns-cd5cbd7b9-chsg6\" (UID: \"0ac59eb1-73ee-4a73-90bd-2273f03c9498\") " pod="openstack/dnsmasq-dns-cd5cbd7b9-chsg6" Feb 26 16:07:47 crc kubenswrapper[4907]: I0226 16:07:47.458793 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/0ac59eb1-73ee-4a73-90bd-2273f03c9498-dns-svc\") pod \"dnsmasq-dns-cd5cbd7b9-chsg6\" (UID: \"0ac59eb1-73ee-4a73-90bd-2273f03c9498\") " pod="openstack/dnsmasq-dns-cd5cbd7b9-chsg6" Feb 26 16:07:47 crc kubenswrapper[4907]: I0226 16:07:47.458836 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hpqx5\" (UniqueName: \"kubernetes.io/projected/0ac59eb1-73ee-4a73-90bd-2273f03c9498-kube-api-access-hpqx5\") pod \"dnsmasq-dns-cd5cbd7b9-chsg6\" (UID: \"0ac59eb1-73ee-4a73-90bd-2273f03c9498\") " pod="openstack/dnsmasq-dns-cd5cbd7b9-chsg6" Feb 26 16:07:47 crc kubenswrapper[4907]: I0226 16:07:47.458882 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: 
\"kubernetes.io/configmap/0ac59eb1-73ee-4a73-90bd-2273f03c9498-dns-swift-storage-0\") pod \"dnsmasq-dns-cd5cbd7b9-chsg6\" (UID: \"0ac59eb1-73ee-4a73-90bd-2273f03c9498\") " pod="openstack/dnsmasq-dns-cd5cbd7b9-chsg6" Feb 26 16:07:47 crc kubenswrapper[4907]: I0226 16:07:47.458928 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/0ac59eb1-73ee-4a73-90bd-2273f03c9498-ovsdbserver-nb\") pod \"dnsmasq-dns-cd5cbd7b9-chsg6\" (UID: \"0ac59eb1-73ee-4a73-90bd-2273f03c9498\") " pod="openstack/dnsmasq-dns-cd5cbd7b9-chsg6" Feb 26 16:07:47 crc kubenswrapper[4907]: I0226 16:07:47.458958 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/0ac59eb1-73ee-4a73-90bd-2273f03c9498-ovsdbserver-sb\") pod \"dnsmasq-dns-cd5cbd7b9-chsg6\" (UID: \"0ac59eb1-73ee-4a73-90bd-2273f03c9498\") " pod="openstack/dnsmasq-dns-cd5cbd7b9-chsg6" Feb 26 16:07:47 crc kubenswrapper[4907]: I0226 16:07:47.459553 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/0ac59eb1-73ee-4a73-90bd-2273f03c9498-dns-svc\") pod \"dnsmasq-dns-cd5cbd7b9-chsg6\" (UID: \"0ac59eb1-73ee-4a73-90bd-2273f03c9498\") " pod="openstack/dnsmasq-dns-cd5cbd7b9-chsg6" Feb 26 16:07:47 crc kubenswrapper[4907]: I0226 16:07:47.459708 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/0ac59eb1-73ee-4a73-90bd-2273f03c9498-dns-swift-storage-0\") pod \"dnsmasq-dns-cd5cbd7b9-chsg6\" (UID: \"0ac59eb1-73ee-4a73-90bd-2273f03c9498\") " pod="openstack/dnsmasq-dns-cd5cbd7b9-chsg6" Feb 26 16:07:47 crc kubenswrapper[4907]: I0226 16:07:47.459806 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/0ac59eb1-73ee-4a73-90bd-2273f03c9498-ovsdbserver-nb\") pod 
\"dnsmasq-dns-cd5cbd7b9-chsg6\" (UID: \"0ac59eb1-73ee-4a73-90bd-2273f03c9498\") " pod="openstack/dnsmasq-dns-cd5cbd7b9-chsg6" Feb 26 16:07:47 crc kubenswrapper[4907]: I0226 16:07:47.459909 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/0ac59eb1-73ee-4a73-90bd-2273f03c9498-ovsdbserver-sb\") pod \"dnsmasq-dns-cd5cbd7b9-chsg6\" (UID: \"0ac59eb1-73ee-4a73-90bd-2273f03c9498\") " pod="openstack/dnsmasq-dns-cd5cbd7b9-chsg6" Feb 26 16:07:47 crc kubenswrapper[4907]: I0226 16:07:47.460114 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/0ac59eb1-73ee-4a73-90bd-2273f03c9498-config\") pod \"dnsmasq-dns-cd5cbd7b9-chsg6\" (UID: \"0ac59eb1-73ee-4a73-90bd-2273f03c9498\") " pod="openstack/dnsmasq-dns-cd5cbd7b9-chsg6" Feb 26 16:07:47 crc kubenswrapper[4907]: I0226 16:07:47.499316 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hpqx5\" (UniqueName: \"kubernetes.io/projected/0ac59eb1-73ee-4a73-90bd-2273f03c9498-kube-api-access-hpqx5\") pod \"dnsmasq-dns-cd5cbd7b9-chsg6\" (UID: \"0ac59eb1-73ee-4a73-90bd-2273f03c9498\") " pod="openstack/dnsmasq-dns-cd5cbd7b9-chsg6" Feb 26 16:07:47 crc kubenswrapper[4907]: I0226 16:07:47.653865 4907 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-cd5cbd7b9-chsg6" Feb 26 16:07:48 crc kubenswrapper[4907]: I0226 16:07:48.178292 4907 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-cd5cbd7b9-chsg6"] Feb 26 16:07:48 crc kubenswrapper[4907]: W0226 16:07:48.191572 4907 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod0ac59eb1_73ee_4a73_90bd_2273f03c9498.slice/crio-2db045074b31745129cb37d5ea0feebdf082577d83dccdfc2f1d94015c18aa37 WatchSource:0}: Error finding container 2db045074b31745129cb37d5ea0feebdf082577d83dccdfc2f1d94015c18aa37: Status 404 returned error can't find the container with id 2db045074b31745129cb37d5ea0feebdf082577d83dccdfc2f1d94015c18aa37 Feb 26 16:07:48 crc kubenswrapper[4907]: I0226 16:07:48.530760 4907 patch_prober.go:28] interesting pod/machine-config-daemon-v5ng6 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 26 16:07:48 crc kubenswrapper[4907]: I0226 16:07:48.531092 4907 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-v5ng6" podUID="917eebf3-db36-47b8-af0a-b80d042fddab" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 26 16:07:49 crc kubenswrapper[4907]: I0226 16:07:49.152000 4907 generic.go:334] "Generic (PLEG): container finished" podID="0ac59eb1-73ee-4a73-90bd-2273f03c9498" containerID="32a8271fcc6373ab8b6b5d29e8bd868bdc62ced4a2431c88da81b21b00a0e2ce" exitCode=0 Feb 26 16:07:49 crc kubenswrapper[4907]: I0226 16:07:49.154611 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-cd5cbd7b9-chsg6" 
event={"ID":"0ac59eb1-73ee-4a73-90bd-2273f03c9498","Type":"ContainerDied","Data":"32a8271fcc6373ab8b6b5d29e8bd868bdc62ced4a2431c88da81b21b00a0e2ce"} Feb 26 16:07:49 crc kubenswrapper[4907]: I0226 16:07:49.154654 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-cd5cbd7b9-chsg6" event={"ID":"0ac59eb1-73ee-4a73-90bd-2273f03c9498","Type":"ContainerStarted","Data":"2db045074b31745129cb37d5ea0feebdf082577d83dccdfc2f1d94015c18aa37"} Feb 26 16:07:49 crc kubenswrapper[4907]: I0226 16:07:49.508486 4907 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-cell1-novncproxy-0" Feb 26 16:07:49 crc kubenswrapper[4907]: I0226 16:07:49.808231 4907 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0"] Feb 26 16:07:49 crc kubenswrapper[4907]: I0226 16:07:49.932880 4907 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Feb 26 16:07:49 crc kubenswrapper[4907]: I0226 16:07:49.933230 4907 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="8cf1cab1-8021-4ea6-9688-93259f692624" containerName="ceilometer-central-agent" containerID="cri-o://6cb3984c9d3326ced22adb9698477c60db88c6495ef9c37e852b17be4ad7248d" gracePeriod=30 Feb 26 16:07:49 crc kubenswrapper[4907]: I0226 16:07:49.933294 4907 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="8cf1cab1-8021-4ea6-9688-93259f692624" containerName="sg-core" containerID="cri-o://c47aec62f1ef99a7c489702f9f3cb298196db0729bb5d415c26747371ecd776e" gracePeriod=30 Feb 26 16:07:49 crc kubenswrapper[4907]: I0226 16:07:49.933339 4907 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="8cf1cab1-8021-4ea6-9688-93259f692624" containerName="ceilometer-notification-agent" containerID="cri-o://56c71d665d99c1752ebace4b0e674b38b3ab4d8daaed18c0f4bbfebaea520de1" gracePeriod=30 Feb 26 
16:07:49 crc kubenswrapper[4907]: I0226 16:07:49.933370 4907 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="8cf1cab1-8021-4ea6-9688-93259f692624" containerName="proxy-httpd" containerID="cri-o://6e7f088ab98e65eb3a338ca9092de26a8fa5a9e2ea620605768cf07e895a33ff" gracePeriod=30 Feb 26 16:07:49 crc kubenswrapper[4907]: I0226 16:07:49.946556 4907 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/ceilometer-0" podUID="8cf1cab1-8021-4ea6-9688-93259f692624" containerName="proxy-httpd" probeResult="failure" output="Get \"https://10.217.0.205:3000/\": EOF" Feb 26 16:07:50 crc kubenswrapper[4907]: I0226 16:07:50.163415 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-cd5cbd7b9-chsg6" event={"ID":"0ac59eb1-73ee-4a73-90bd-2273f03c9498","Type":"ContainerStarted","Data":"ced4fc2d497ca439acbf014da0f32226afff2cd2aedba7c0f85f2022046ae4ce"} Feb 26 16:07:50 crc kubenswrapper[4907]: I0226 16:07:50.163723 4907 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-cd5cbd7b9-chsg6" Feb 26 16:07:50 crc kubenswrapper[4907]: I0226 16:07:50.168989 4907 generic.go:334] "Generic (PLEG): container finished" podID="8cf1cab1-8021-4ea6-9688-93259f692624" containerID="6e7f088ab98e65eb3a338ca9092de26a8fa5a9e2ea620605768cf07e895a33ff" exitCode=0 Feb 26 16:07:50 crc kubenswrapper[4907]: I0226 16:07:50.169197 4907 generic.go:334] "Generic (PLEG): container finished" podID="8cf1cab1-8021-4ea6-9688-93259f692624" containerID="c47aec62f1ef99a7c489702f9f3cb298196db0729bb5d415c26747371ecd776e" exitCode=2 Feb 26 16:07:50 crc kubenswrapper[4907]: I0226 16:07:50.169215 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"8cf1cab1-8021-4ea6-9688-93259f692624","Type":"ContainerDied","Data":"6e7f088ab98e65eb3a338ca9092de26a8fa5a9e2ea620605768cf07e895a33ff"} Feb 26 16:07:50 crc kubenswrapper[4907]: I0226 16:07:50.169486 4907 
kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-api-0" podUID="97a6cbcf-5a97-4fe9-b47f-a738fb4dc887" containerName="nova-api-log" containerID="cri-o://6cafbc58224f1c877a57f32b79a66009154fdec3e1944c18c1590f05fed67834" gracePeriod=30 Feb 26 16:07:50 crc kubenswrapper[4907]: I0226 16:07:50.169776 4907 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-api-0" podUID="97a6cbcf-5a97-4fe9-b47f-a738fb4dc887" containerName="nova-api-api" containerID="cri-o://8cea64b2dc3a02bf0080e80c8901cc18401c4df5c498bdcd3603cb228113f61f" gracePeriod=30 Feb 26 16:07:50 crc kubenswrapper[4907]: I0226 16:07:50.170124 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"8cf1cab1-8021-4ea6-9688-93259f692624","Type":"ContainerDied","Data":"c47aec62f1ef99a7c489702f9f3cb298196db0729bb5d415c26747371ecd776e"} Feb 26 16:07:50 crc kubenswrapper[4907]: I0226 16:07:50.198317 4907 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-cd5cbd7b9-chsg6" podStartSLOduration=3.198300761 podStartE2EDuration="3.198300761s" podCreationTimestamp="2026-02-26 16:07:47 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-26 16:07:50.192360875 +0000 UTC m=+1532.710922724" watchObservedRunningTime="2026-02-26 16:07:50.198300761 +0000 UTC m=+1532.716862610" Feb 26 16:07:51 crc kubenswrapper[4907]: I0226 16:07:51.196142 4907 generic.go:334] "Generic (PLEG): container finished" podID="97a6cbcf-5a97-4fe9-b47f-a738fb4dc887" containerID="6cafbc58224f1c877a57f32b79a66009154fdec3e1944c18c1590f05fed67834" exitCode=143 Feb 26 16:07:51 crc kubenswrapper[4907]: I0226 16:07:51.196841 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" 
event={"ID":"97a6cbcf-5a97-4fe9-b47f-a738fb4dc887","Type":"ContainerDied","Data":"6cafbc58224f1c877a57f32b79a66009154fdec3e1944c18c1590f05fed67834"} Feb 26 16:07:51 crc kubenswrapper[4907]: I0226 16:07:51.200111 4907 generic.go:334] "Generic (PLEG): container finished" podID="8cf1cab1-8021-4ea6-9688-93259f692624" containerID="56c71d665d99c1752ebace4b0e674b38b3ab4d8daaed18c0f4bbfebaea520de1" exitCode=0 Feb 26 16:07:51 crc kubenswrapper[4907]: I0226 16:07:51.200147 4907 generic.go:334] "Generic (PLEG): container finished" podID="8cf1cab1-8021-4ea6-9688-93259f692624" containerID="6cb3984c9d3326ced22adb9698477c60db88c6495ef9c37e852b17be4ad7248d" exitCode=0 Feb 26 16:07:51 crc kubenswrapper[4907]: I0226 16:07:51.201119 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"8cf1cab1-8021-4ea6-9688-93259f692624","Type":"ContainerDied","Data":"56c71d665d99c1752ebace4b0e674b38b3ab4d8daaed18c0f4bbfebaea520de1"} Feb 26 16:07:51 crc kubenswrapper[4907]: I0226 16:07:51.201150 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"8cf1cab1-8021-4ea6-9688-93259f692624","Type":"ContainerDied","Data":"6cb3984c9d3326ced22adb9698477c60db88c6495ef9c37e852b17be4ad7248d"} Feb 26 16:07:51 crc kubenswrapper[4907]: I0226 16:07:51.315649 4907 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Feb 26 16:07:51 crc kubenswrapper[4907]: I0226 16:07:51.452549 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/8cf1cab1-8021-4ea6-9688-93259f692624-ceilometer-tls-certs\") pod \"8cf1cab1-8021-4ea6-9688-93259f692624\" (UID: \"8cf1cab1-8021-4ea6-9688-93259f692624\") " Feb 26 16:07:51 crc kubenswrapper[4907]: I0226 16:07:51.452641 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8cf1cab1-8021-4ea6-9688-93259f692624-config-data\") pod \"8cf1cab1-8021-4ea6-9688-93259f692624\" (UID: \"8cf1cab1-8021-4ea6-9688-93259f692624\") " Feb 26 16:07:51 crc kubenswrapper[4907]: I0226 16:07:51.452733 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/8cf1cab1-8021-4ea6-9688-93259f692624-log-httpd\") pod \"8cf1cab1-8021-4ea6-9688-93259f692624\" (UID: \"8cf1cab1-8021-4ea6-9688-93259f692624\") " Feb 26 16:07:51 crc kubenswrapper[4907]: I0226 16:07:51.453150 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/8cf1cab1-8021-4ea6-9688-93259f692624-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "8cf1cab1-8021-4ea6-9688-93259f692624" (UID: "8cf1cab1-8021-4ea6-9688-93259f692624"). InnerVolumeSpecName "log-httpd". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 26 16:07:51 crc kubenswrapper[4907]: I0226 16:07:51.453203 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8cf1cab1-8021-4ea6-9688-93259f692624-combined-ca-bundle\") pod \"8cf1cab1-8021-4ea6-9688-93259f692624\" (UID: \"8cf1cab1-8021-4ea6-9688-93259f692624\") " Feb 26 16:07:51 crc kubenswrapper[4907]: I0226 16:07:51.453313 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-bvs86\" (UniqueName: \"kubernetes.io/projected/8cf1cab1-8021-4ea6-9688-93259f692624-kube-api-access-bvs86\") pod \"8cf1cab1-8021-4ea6-9688-93259f692624\" (UID: \"8cf1cab1-8021-4ea6-9688-93259f692624\") " Feb 26 16:07:51 crc kubenswrapper[4907]: I0226 16:07:51.453836 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/8cf1cab1-8021-4ea6-9688-93259f692624-run-httpd\") pod \"8cf1cab1-8021-4ea6-9688-93259f692624\" (UID: \"8cf1cab1-8021-4ea6-9688-93259f692624\") " Feb 26 16:07:51 crc kubenswrapper[4907]: I0226 16:07:51.453920 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/8cf1cab1-8021-4ea6-9688-93259f692624-scripts\") pod \"8cf1cab1-8021-4ea6-9688-93259f692624\" (UID: \"8cf1cab1-8021-4ea6-9688-93259f692624\") " Feb 26 16:07:51 crc kubenswrapper[4907]: I0226 16:07:51.454139 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/8cf1cab1-8021-4ea6-9688-93259f692624-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "8cf1cab1-8021-4ea6-9688-93259f692624" (UID: "8cf1cab1-8021-4ea6-9688-93259f692624"). InnerVolumeSpecName "run-httpd". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 26 16:07:51 crc kubenswrapper[4907]: I0226 16:07:51.454017 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/8cf1cab1-8021-4ea6-9688-93259f692624-sg-core-conf-yaml\") pod \"8cf1cab1-8021-4ea6-9688-93259f692624\" (UID: \"8cf1cab1-8021-4ea6-9688-93259f692624\") " Feb 26 16:07:51 crc kubenswrapper[4907]: I0226 16:07:51.455103 4907 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/8cf1cab1-8021-4ea6-9688-93259f692624-run-httpd\") on node \"crc\" DevicePath \"\"" Feb 26 16:07:51 crc kubenswrapper[4907]: I0226 16:07:51.455128 4907 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/8cf1cab1-8021-4ea6-9688-93259f692624-log-httpd\") on node \"crc\" DevicePath \"\"" Feb 26 16:07:51 crc kubenswrapper[4907]: I0226 16:07:51.460850 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8cf1cab1-8021-4ea6-9688-93259f692624-scripts" (OuterVolumeSpecName: "scripts") pod "8cf1cab1-8021-4ea6-9688-93259f692624" (UID: "8cf1cab1-8021-4ea6-9688-93259f692624"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 26 16:07:51 crc kubenswrapper[4907]: I0226 16:07:51.491809 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8cf1cab1-8021-4ea6-9688-93259f692624-kube-api-access-bvs86" (OuterVolumeSpecName: "kube-api-access-bvs86") pod "8cf1cab1-8021-4ea6-9688-93259f692624" (UID: "8cf1cab1-8021-4ea6-9688-93259f692624"). InnerVolumeSpecName "kube-api-access-bvs86". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 26 16:07:51 crc kubenswrapper[4907]: I0226 16:07:51.499077 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8cf1cab1-8021-4ea6-9688-93259f692624-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "8cf1cab1-8021-4ea6-9688-93259f692624" (UID: "8cf1cab1-8021-4ea6-9688-93259f692624"). InnerVolumeSpecName "sg-core-conf-yaml". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 26 16:07:51 crc kubenswrapper[4907]: I0226 16:07:51.509093 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8cf1cab1-8021-4ea6-9688-93259f692624-ceilometer-tls-certs" (OuterVolumeSpecName: "ceilometer-tls-certs") pod "8cf1cab1-8021-4ea6-9688-93259f692624" (UID: "8cf1cab1-8021-4ea6-9688-93259f692624"). InnerVolumeSpecName "ceilometer-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 26 16:07:51 crc kubenswrapper[4907]: I0226 16:07:51.556826 4907 reconciler_common.go:293] "Volume detached for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/8cf1cab1-8021-4ea6-9688-93259f692624-ceilometer-tls-certs\") on node \"crc\" DevicePath \"\"" Feb 26 16:07:51 crc kubenswrapper[4907]: I0226 16:07:51.556856 4907 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-bvs86\" (UniqueName: \"kubernetes.io/projected/8cf1cab1-8021-4ea6-9688-93259f692624-kube-api-access-bvs86\") on node \"crc\" DevicePath \"\"" Feb 26 16:07:51 crc kubenswrapper[4907]: I0226 16:07:51.556866 4907 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/8cf1cab1-8021-4ea6-9688-93259f692624-scripts\") on node \"crc\" DevicePath \"\"" Feb 26 16:07:51 crc kubenswrapper[4907]: I0226 16:07:51.556875 4907 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: 
\"kubernetes.io/secret/8cf1cab1-8021-4ea6-9688-93259f692624-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\"" Feb 26 16:07:51 crc kubenswrapper[4907]: I0226 16:07:51.570829 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8cf1cab1-8021-4ea6-9688-93259f692624-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "8cf1cab1-8021-4ea6-9688-93259f692624" (UID: "8cf1cab1-8021-4ea6-9688-93259f692624"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 26 16:07:51 crc kubenswrapper[4907]: I0226 16:07:51.575083 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8cf1cab1-8021-4ea6-9688-93259f692624-config-data" (OuterVolumeSpecName: "config-data") pod "8cf1cab1-8021-4ea6-9688-93259f692624" (UID: "8cf1cab1-8021-4ea6-9688-93259f692624"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 26 16:07:51 crc kubenswrapper[4907]: I0226 16:07:51.658529 4907 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8cf1cab1-8021-4ea6-9688-93259f692624-config-data\") on node \"crc\" DevicePath \"\"" Feb 26 16:07:51 crc kubenswrapper[4907]: I0226 16:07:51.658578 4907 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8cf1cab1-8021-4ea6-9688-93259f692624-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 26 16:07:52 crc kubenswrapper[4907]: I0226 16:07:52.251148 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"8cf1cab1-8021-4ea6-9688-93259f692624","Type":"ContainerDied","Data":"605d38c2446d4ec9c8c8f033ce8cb3566623d201b0f11dac6191667371fc43c2"} Feb 26 16:07:52 crc kubenswrapper[4907]: I0226 16:07:52.251207 4907 scope.go:117] "RemoveContainer" containerID="6e7f088ab98e65eb3a338ca9092de26a8fa5a9e2ea620605768cf07e895a33ff" Feb 
26 16:07:52 crc kubenswrapper[4907]: I0226 16:07:52.251257 4907 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Feb 26 16:07:52 crc kubenswrapper[4907]: I0226 16:07:52.292632 4907 scope.go:117] "RemoveContainer" containerID="c47aec62f1ef99a7c489702f9f3cb298196db0729bb5d415c26747371ecd776e" Feb 26 16:07:52 crc kubenswrapper[4907]: I0226 16:07:52.298877 4907 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Feb 26 16:07:52 crc kubenswrapper[4907]: I0226 16:07:52.313090 4907 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ceilometer-0"] Feb 26 16:07:52 crc kubenswrapper[4907]: I0226 16:07:52.321174 4907 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"] Feb 26 16:07:52 crc kubenswrapper[4907]: E0226 16:07:52.321728 4907 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8cf1cab1-8021-4ea6-9688-93259f692624" containerName="ceilometer-central-agent" Feb 26 16:07:52 crc kubenswrapper[4907]: I0226 16:07:52.321805 4907 state_mem.go:107] "Deleted CPUSet assignment" podUID="8cf1cab1-8021-4ea6-9688-93259f692624" containerName="ceilometer-central-agent" Feb 26 16:07:52 crc kubenswrapper[4907]: E0226 16:07:52.321865 4907 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8cf1cab1-8021-4ea6-9688-93259f692624" containerName="sg-core" Feb 26 16:07:52 crc kubenswrapper[4907]: I0226 16:07:52.321912 4907 state_mem.go:107] "Deleted CPUSet assignment" podUID="8cf1cab1-8021-4ea6-9688-93259f692624" containerName="sg-core" Feb 26 16:07:52 crc kubenswrapper[4907]: E0226 16:07:52.322002 4907 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8cf1cab1-8021-4ea6-9688-93259f692624" containerName="ceilometer-notification-agent" Feb 26 16:07:52 crc kubenswrapper[4907]: I0226 16:07:52.322050 4907 state_mem.go:107] "Deleted CPUSet assignment" podUID="8cf1cab1-8021-4ea6-9688-93259f692624" 
containerName="ceilometer-notification-agent" Feb 26 16:07:52 crc kubenswrapper[4907]: E0226 16:07:52.322110 4907 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8cf1cab1-8021-4ea6-9688-93259f692624" containerName="proxy-httpd" Feb 26 16:07:52 crc kubenswrapper[4907]: I0226 16:07:52.322158 4907 state_mem.go:107] "Deleted CPUSet assignment" podUID="8cf1cab1-8021-4ea6-9688-93259f692624" containerName="proxy-httpd" Feb 26 16:07:52 crc kubenswrapper[4907]: I0226 16:07:52.322382 4907 memory_manager.go:354] "RemoveStaleState removing state" podUID="8cf1cab1-8021-4ea6-9688-93259f692624" containerName="proxy-httpd" Feb 26 16:07:52 crc kubenswrapper[4907]: I0226 16:07:52.322445 4907 memory_manager.go:354] "RemoveStaleState removing state" podUID="8cf1cab1-8021-4ea6-9688-93259f692624" containerName="ceilometer-central-agent" Feb 26 16:07:52 crc kubenswrapper[4907]: I0226 16:07:52.322501 4907 memory_manager.go:354] "RemoveStaleState removing state" podUID="8cf1cab1-8021-4ea6-9688-93259f692624" containerName="sg-core" Feb 26 16:07:52 crc kubenswrapper[4907]: I0226 16:07:52.322552 4907 memory_manager.go:354] "RemoveStaleState removing state" podUID="8cf1cab1-8021-4ea6-9688-93259f692624" containerName="ceilometer-notification-agent" Feb 26 16:07:52 crc kubenswrapper[4907]: I0226 16:07:52.324169 4907 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Feb 26 16:07:52 crc kubenswrapper[4907]: I0226 16:07:52.327021 4907 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts" Feb 26 16:07:52 crc kubenswrapper[4907]: I0226 16:07:52.327211 4907 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data" Feb 26 16:07:52 crc kubenswrapper[4907]: I0226 16:07:52.328866 4907 scope.go:117] "RemoveContainer" containerID="56c71d665d99c1752ebace4b0e674b38b3ab4d8daaed18c0f4bbfebaea520de1" Feb 26 16:07:52 crc kubenswrapper[4907]: I0226 16:07:52.329073 4907 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ceilometer-internal-svc" Feb 26 16:07:52 crc kubenswrapper[4907]: I0226 16:07:52.355738 4907 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Feb 26 16:07:52 crc kubenswrapper[4907]: I0226 16:07:52.362737 4907 scope.go:117] "RemoveContainer" containerID="6cb3984c9d3326ced22adb9698477c60db88c6495ef9c37e852b17be4ad7248d" Feb 26 16:07:52 crc kubenswrapper[4907]: I0226 16:07:52.481795 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/6a9c347b-8497-423b-b352-a554ace86315-run-httpd\") pod \"ceilometer-0\" (UID: \"6a9c347b-8497-423b-b352-a554ace86315\") " pod="openstack/ceilometer-0" Feb 26 16:07:52 crc kubenswrapper[4907]: I0226 16:07:52.482054 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6a9c347b-8497-423b-b352-a554ace86315-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"6a9c347b-8497-423b-b352-a554ace86315\") " pod="openstack/ceilometer-0" Feb 26 16:07:52 crc kubenswrapper[4907]: I0226 16:07:52.482123 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" 
(UniqueName: \"kubernetes.io/secret/6a9c347b-8497-423b-b352-a554ace86315-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"6a9c347b-8497-423b-b352-a554ace86315\") " pod="openstack/ceilometer-0" Feb 26 16:07:52 crc kubenswrapper[4907]: I0226 16:07:52.482174 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/6a9c347b-8497-423b-b352-a554ace86315-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"6a9c347b-8497-423b-b352-a554ace86315\") " pod="openstack/ceilometer-0" Feb 26 16:07:52 crc kubenswrapper[4907]: I0226 16:07:52.482198 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/6a9c347b-8497-423b-b352-a554ace86315-log-httpd\") pod \"ceilometer-0\" (UID: \"6a9c347b-8497-423b-b352-a554ace86315\") " pod="openstack/ceilometer-0" Feb 26 16:07:52 crc kubenswrapper[4907]: I0226 16:07:52.482241 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4mf6c\" (UniqueName: \"kubernetes.io/projected/6a9c347b-8497-423b-b352-a554ace86315-kube-api-access-4mf6c\") pod \"ceilometer-0\" (UID: \"6a9c347b-8497-423b-b352-a554ace86315\") " pod="openstack/ceilometer-0" Feb 26 16:07:52 crc kubenswrapper[4907]: I0226 16:07:52.482392 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/6a9c347b-8497-423b-b352-a554ace86315-scripts\") pod \"ceilometer-0\" (UID: \"6a9c347b-8497-423b-b352-a554ace86315\") " pod="openstack/ceilometer-0" Feb 26 16:07:52 crc kubenswrapper[4907]: I0226 16:07:52.482444 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6a9c347b-8497-423b-b352-a554ace86315-config-data\") pod \"ceilometer-0\" (UID: 
\"6a9c347b-8497-423b-b352-a554ace86315\") " pod="openstack/ceilometer-0" Feb 26 16:07:52 crc kubenswrapper[4907]: I0226 16:07:52.559547 4907 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Feb 26 16:07:52 crc kubenswrapper[4907]: E0226 16:07:52.560463 4907 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[ceilometer-tls-certs combined-ca-bundle config-data kube-api-access-4mf6c log-httpd run-httpd scripts sg-core-conf-yaml], unattached volumes=[], failed to process volumes=[]: context canceled" pod="openstack/ceilometer-0" podUID="6a9c347b-8497-423b-b352-a554ace86315" Feb 26 16:07:52 crc kubenswrapper[4907]: I0226 16:07:52.584111 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/6a9c347b-8497-423b-b352-a554ace86315-scripts\") pod \"ceilometer-0\" (UID: \"6a9c347b-8497-423b-b352-a554ace86315\") " pod="openstack/ceilometer-0" Feb 26 16:07:52 crc kubenswrapper[4907]: I0226 16:07:52.584221 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6a9c347b-8497-423b-b352-a554ace86315-config-data\") pod \"ceilometer-0\" (UID: \"6a9c347b-8497-423b-b352-a554ace86315\") " pod="openstack/ceilometer-0" Feb 26 16:07:52 crc kubenswrapper[4907]: I0226 16:07:52.584261 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/6a9c347b-8497-423b-b352-a554ace86315-run-httpd\") pod \"ceilometer-0\" (UID: \"6a9c347b-8497-423b-b352-a554ace86315\") " pod="openstack/ceilometer-0" Feb 26 16:07:52 crc kubenswrapper[4907]: I0226 16:07:52.584280 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6a9c347b-8497-423b-b352-a554ace86315-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"6a9c347b-8497-423b-b352-a554ace86315\") " 
pod="openstack/ceilometer-0" Feb 26 16:07:52 crc kubenswrapper[4907]: I0226 16:07:52.584347 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/6a9c347b-8497-423b-b352-a554ace86315-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"6a9c347b-8497-423b-b352-a554ace86315\") " pod="openstack/ceilometer-0" Feb 26 16:07:52 crc kubenswrapper[4907]: I0226 16:07:52.584377 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/6a9c347b-8497-423b-b352-a554ace86315-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"6a9c347b-8497-423b-b352-a554ace86315\") " pod="openstack/ceilometer-0" Feb 26 16:07:52 crc kubenswrapper[4907]: I0226 16:07:52.584403 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/6a9c347b-8497-423b-b352-a554ace86315-log-httpd\") pod \"ceilometer-0\" (UID: \"6a9c347b-8497-423b-b352-a554ace86315\") " pod="openstack/ceilometer-0" Feb 26 16:07:52 crc kubenswrapper[4907]: I0226 16:07:52.584486 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4mf6c\" (UniqueName: \"kubernetes.io/projected/6a9c347b-8497-423b-b352-a554ace86315-kube-api-access-4mf6c\") pod \"ceilometer-0\" (UID: \"6a9c347b-8497-423b-b352-a554ace86315\") " pod="openstack/ceilometer-0" Feb 26 16:07:52 crc kubenswrapper[4907]: I0226 16:07:52.585413 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/6a9c347b-8497-423b-b352-a554ace86315-log-httpd\") pod \"ceilometer-0\" (UID: \"6a9c347b-8497-423b-b352-a554ace86315\") " pod="openstack/ceilometer-0" Feb 26 16:07:52 crc kubenswrapper[4907]: I0226 16:07:52.585533 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: 
\"kubernetes.io/empty-dir/6a9c347b-8497-423b-b352-a554ace86315-run-httpd\") pod \"ceilometer-0\" (UID: \"6a9c347b-8497-423b-b352-a554ace86315\") " pod="openstack/ceilometer-0" Feb 26 16:07:52 crc kubenswrapper[4907]: I0226 16:07:52.591865 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/6a9c347b-8497-423b-b352-a554ace86315-scripts\") pod \"ceilometer-0\" (UID: \"6a9c347b-8497-423b-b352-a554ace86315\") " pod="openstack/ceilometer-0" Feb 26 16:07:52 crc kubenswrapper[4907]: I0226 16:07:52.591866 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/6a9c347b-8497-423b-b352-a554ace86315-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"6a9c347b-8497-423b-b352-a554ace86315\") " pod="openstack/ceilometer-0" Feb 26 16:07:52 crc kubenswrapper[4907]: I0226 16:07:52.592134 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6a9c347b-8497-423b-b352-a554ace86315-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"6a9c347b-8497-423b-b352-a554ace86315\") " pod="openstack/ceilometer-0" Feb 26 16:07:52 crc kubenswrapper[4907]: I0226 16:07:52.592333 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/6a9c347b-8497-423b-b352-a554ace86315-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"6a9c347b-8497-423b-b352-a554ace86315\") " pod="openstack/ceilometer-0" Feb 26 16:07:52 crc kubenswrapper[4907]: I0226 16:07:52.592856 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6a9c347b-8497-423b-b352-a554ace86315-config-data\") pod \"ceilometer-0\" (UID: \"6a9c347b-8497-423b-b352-a554ace86315\") " pod="openstack/ceilometer-0" Feb 26 16:07:52 crc kubenswrapper[4907]: I0226 16:07:52.605878 4907 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4mf6c\" (UniqueName: \"kubernetes.io/projected/6a9c347b-8497-423b-b352-a554ace86315-kube-api-access-4mf6c\") pod \"ceilometer-0\" (UID: \"6a9c347b-8497-423b-b352-a554ace86315\") " pod="openstack/ceilometer-0" Feb 26 16:07:53 crc kubenswrapper[4907]: I0226 16:07:53.264409 4907 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Feb 26 16:07:53 crc kubenswrapper[4907]: I0226 16:07:53.281400 4907 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Feb 26 16:07:53 crc kubenswrapper[4907]: I0226 16:07:53.398432 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/6a9c347b-8497-423b-b352-a554ace86315-ceilometer-tls-certs\") pod \"6a9c347b-8497-423b-b352-a554ace86315\" (UID: \"6a9c347b-8497-423b-b352-a554ace86315\") " Feb 26 16:07:53 crc kubenswrapper[4907]: I0226 16:07:53.398480 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/6a9c347b-8497-423b-b352-a554ace86315-sg-core-conf-yaml\") pod \"6a9c347b-8497-423b-b352-a554ace86315\" (UID: \"6a9c347b-8497-423b-b352-a554ace86315\") " Feb 26 16:07:53 crc kubenswrapper[4907]: I0226 16:07:53.398503 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4mf6c\" (UniqueName: \"kubernetes.io/projected/6a9c347b-8497-423b-b352-a554ace86315-kube-api-access-4mf6c\") pod \"6a9c347b-8497-423b-b352-a554ace86315\" (UID: \"6a9c347b-8497-423b-b352-a554ace86315\") " Feb 26 16:07:53 crc kubenswrapper[4907]: I0226 16:07:53.398617 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6a9c347b-8497-423b-b352-a554ace86315-config-data\") pod 
\"6a9c347b-8497-423b-b352-a554ace86315\" (UID: \"6a9c347b-8497-423b-b352-a554ace86315\") " Feb 26 16:07:53 crc kubenswrapper[4907]: I0226 16:07:53.398706 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/6a9c347b-8497-423b-b352-a554ace86315-run-httpd\") pod \"6a9c347b-8497-423b-b352-a554ace86315\" (UID: \"6a9c347b-8497-423b-b352-a554ace86315\") " Feb 26 16:07:53 crc kubenswrapper[4907]: I0226 16:07:53.398732 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/6a9c347b-8497-423b-b352-a554ace86315-scripts\") pod \"6a9c347b-8497-423b-b352-a554ace86315\" (UID: \"6a9c347b-8497-423b-b352-a554ace86315\") " Feb 26 16:07:53 crc kubenswrapper[4907]: I0226 16:07:53.398843 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/6a9c347b-8497-423b-b352-a554ace86315-log-httpd\") pod \"6a9c347b-8497-423b-b352-a554ace86315\" (UID: \"6a9c347b-8497-423b-b352-a554ace86315\") " Feb 26 16:07:53 crc kubenswrapper[4907]: I0226 16:07:53.398882 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6a9c347b-8497-423b-b352-a554ace86315-combined-ca-bundle\") pod \"6a9c347b-8497-423b-b352-a554ace86315\" (UID: \"6a9c347b-8497-423b-b352-a554ace86315\") " Feb 26 16:07:53 crc kubenswrapper[4907]: I0226 16:07:53.399314 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/6a9c347b-8497-423b-b352-a554ace86315-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "6a9c347b-8497-423b-b352-a554ace86315" (UID: "6a9c347b-8497-423b-b352-a554ace86315"). InnerVolumeSpecName "run-httpd". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 26 16:07:53 crc kubenswrapper[4907]: I0226 16:07:53.399631 4907 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/6a9c347b-8497-423b-b352-a554ace86315-run-httpd\") on node \"crc\" DevicePath \"\"" Feb 26 16:07:53 crc kubenswrapper[4907]: I0226 16:07:53.400249 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/6a9c347b-8497-423b-b352-a554ace86315-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "6a9c347b-8497-423b-b352-a554ace86315" (UID: "6a9c347b-8497-423b-b352-a554ace86315"). InnerVolumeSpecName "log-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 26 16:07:53 crc kubenswrapper[4907]: I0226 16:07:53.404074 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6a9c347b-8497-423b-b352-a554ace86315-config-data" (OuterVolumeSpecName: "config-data") pod "6a9c347b-8497-423b-b352-a554ace86315" (UID: "6a9c347b-8497-423b-b352-a554ace86315"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 26 16:07:53 crc kubenswrapper[4907]: I0226 16:07:53.404839 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6a9c347b-8497-423b-b352-a554ace86315-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "6a9c347b-8497-423b-b352-a554ace86315" (UID: "6a9c347b-8497-423b-b352-a554ace86315"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 26 16:07:53 crc kubenswrapper[4907]: I0226 16:07:53.405373 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6a9c347b-8497-423b-b352-a554ace86315-scripts" (OuterVolumeSpecName: "scripts") pod "6a9c347b-8497-423b-b352-a554ace86315" (UID: "6a9c347b-8497-423b-b352-a554ace86315"). InnerVolumeSpecName "scripts". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 26 16:07:53 crc kubenswrapper[4907]: I0226 16:07:53.407248 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6a9c347b-8497-423b-b352-a554ace86315-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "6a9c347b-8497-423b-b352-a554ace86315" (UID: "6a9c347b-8497-423b-b352-a554ace86315"). InnerVolumeSpecName "sg-core-conf-yaml". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 26 16:07:53 crc kubenswrapper[4907]: I0226 16:07:53.407294 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6a9c347b-8497-423b-b352-a554ace86315-ceilometer-tls-certs" (OuterVolumeSpecName: "ceilometer-tls-certs") pod "6a9c347b-8497-423b-b352-a554ace86315" (UID: "6a9c347b-8497-423b-b352-a554ace86315"). InnerVolumeSpecName "ceilometer-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 26 16:07:53 crc kubenswrapper[4907]: I0226 16:07:53.409604 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6a9c347b-8497-423b-b352-a554ace86315-kube-api-access-4mf6c" (OuterVolumeSpecName: "kube-api-access-4mf6c") pod "6a9c347b-8497-423b-b352-a554ace86315" (UID: "6a9c347b-8497-423b-b352-a554ace86315"). InnerVolumeSpecName "kube-api-access-4mf6c". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 26 16:07:53 crc kubenswrapper[4907]: I0226 16:07:53.501625 4907 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/6a9c347b-8497-423b-b352-a554ace86315-scripts\") on node \"crc\" DevicePath \"\"" Feb 26 16:07:53 crc kubenswrapper[4907]: I0226 16:07:53.501655 4907 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/6a9c347b-8497-423b-b352-a554ace86315-log-httpd\") on node \"crc\" DevicePath \"\"" Feb 26 16:07:53 crc kubenswrapper[4907]: I0226 16:07:53.501665 4907 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6a9c347b-8497-423b-b352-a554ace86315-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 26 16:07:53 crc kubenswrapper[4907]: I0226 16:07:53.501675 4907 reconciler_common.go:293] "Volume detached for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/6a9c347b-8497-423b-b352-a554ace86315-ceilometer-tls-certs\") on node \"crc\" DevicePath \"\"" Feb 26 16:07:53 crc kubenswrapper[4907]: I0226 16:07:53.501686 4907 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/6a9c347b-8497-423b-b352-a554ace86315-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\"" Feb 26 16:07:53 crc kubenswrapper[4907]: I0226 16:07:53.501695 4907 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4mf6c\" (UniqueName: \"kubernetes.io/projected/6a9c347b-8497-423b-b352-a554ace86315-kube-api-access-4mf6c\") on node \"crc\" DevicePath \"\"" Feb 26 16:07:53 crc kubenswrapper[4907]: I0226 16:07:53.501703 4907 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6a9c347b-8497-423b-b352-a554ace86315-config-data\") on node \"crc\" DevicePath \"\"" Feb 26 16:07:54 crc kubenswrapper[4907]: I0226 16:07:54.069360 4907 
util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Feb 26 16:07:54 crc kubenswrapper[4907]: I0226 16:07:54.153426 4907 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8cf1cab1-8021-4ea6-9688-93259f692624" path="/var/lib/kubelet/pods/8cf1cab1-8021-4ea6-9688-93259f692624/volumes" Feb 26 16:07:54 crc kubenswrapper[4907]: I0226 16:07:54.214481 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9kqv9\" (UniqueName: \"kubernetes.io/projected/97a6cbcf-5a97-4fe9-b47f-a738fb4dc887-kube-api-access-9kqv9\") pod \"97a6cbcf-5a97-4fe9-b47f-a738fb4dc887\" (UID: \"97a6cbcf-5a97-4fe9-b47f-a738fb4dc887\") " Feb 26 16:07:54 crc kubenswrapper[4907]: I0226 16:07:54.214540 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/97a6cbcf-5a97-4fe9-b47f-a738fb4dc887-combined-ca-bundle\") pod \"97a6cbcf-5a97-4fe9-b47f-a738fb4dc887\" (UID: \"97a6cbcf-5a97-4fe9-b47f-a738fb4dc887\") " Feb 26 16:07:54 crc kubenswrapper[4907]: I0226 16:07:54.214826 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/97a6cbcf-5a97-4fe9-b47f-a738fb4dc887-config-data\") pod \"97a6cbcf-5a97-4fe9-b47f-a738fb4dc887\" (UID: \"97a6cbcf-5a97-4fe9-b47f-a738fb4dc887\") " Feb 26 16:07:54 crc kubenswrapper[4907]: I0226 16:07:54.215067 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/97a6cbcf-5a97-4fe9-b47f-a738fb4dc887-logs\") pod \"97a6cbcf-5a97-4fe9-b47f-a738fb4dc887\" (UID: \"97a6cbcf-5a97-4fe9-b47f-a738fb4dc887\") " Feb 26 16:07:54 crc kubenswrapper[4907]: I0226 16:07:54.216337 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/97a6cbcf-5a97-4fe9-b47f-a738fb4dc887-logs" (OuterVolumeSpecName: "logs") 
pod "97a6cbcf-5a97-4fe9-b47f-a738fb4dc887" (UID: "97a6cbcf-5a97-4fe9-b47f-a738fb4dc887"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 26 16:07:54 crc kubenswrapper[4907]: I0226 16:07:54.219291 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/97a6cbcf-5a97-4fe9-b47f-a738fb4dc887-kube-api-access-9kqv9" (OuterVolumeSpecName: "kube-api-access-9kqv9") pod "97a6cbcf-5a97-4fe9-b47f-a738fb4dc887" (UID: "97a6cbcf-5a97-4fe9-b47f-a738fb4dc887"). InnerVolumeSpecName "kube-api-access-9kqv9". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 26 16:07:54 crc kubenswrapper[4907]: I0226 16:07:54.254957 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/97a6cbcf-5a97-4fe9-b47f-a738fb4dc887-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "97a6cbcf-5a97-4fe9-b47f-a738fb4dc887" (UID: "97a6cbcf-5a97-4fe9-b47f-a738fb4dc887"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 26 16:07:54 crc kubenswrapper[4907]: I0226 16:07:54.266105 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/97a6cbcf-5a97-4fe9-b47f-a738fb4dc887-config-data" (OuterVolumeSpecName: "config-data") pod "97a6cbcf-5a97-4fe9-b47f-a738fb4dc887" (UID: "97a6cbcf-5a97-4fe9-b47f-a738fb4dc887"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 26 16:07:54 crc kubenswrapper[4907]: I0226 16:07:54.280217 4907 generic.go:334] "Generic (PLEG): container finished" podID="97a6cbcf-5a97-4fe9-b47f-a738fb4dc887" containerID="8cea64b2dc3a02bf0080e80c8901cc18401c4df5c498bdcd3603cb228113f61f" exitCode=0 Feb 26 16:07:54 crc kubenswrapper[4907]: I0226 16:07:54.280336 4907 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Feb 26 16:07:54 crc kubenswrapper[4907]: I0226 16:07:54.280798 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"97a6cbcf-5a97-4fe9-b47f-a738fb4dc887","Type":"ContainerDied","Data":"8cea64b2dc3a02bf0080e80c8901cc18401c4df5c498bdcd3603cb228113f61f"} Feb 26 16:07:54 crc kubenswrapper[4907]: I0226 16:07:54.280943 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"97a6cbcf-5a97-4fe9-b47f-a738fb4dc887","Type":"ContainerDied","Data":"d0d59c7247ddc9cd23b73f76b6f17ceb39c24defd837b8fe2f10ba40ce05c641"} Feb 26 16:07:54 crc kubenswrapper[4907]: I0226 16:07:54.281003 4907 scope.go:117] "RemoveContainer" containerID="8cea64b2dc3a02bf0080e80c8901cc18401c4df5c498bdcd3603cb228113f61f" Feb 26 16:07:54 crc kubenswrapper[4907]: I0226 16:07:54.281417 4907 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Feb 26 16:07:54 crc kubenswrapper[4907]: I0226 16:07:54.318351 4907 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/97a6cbcf-5a97-4fe9-b47f-a738fb4dc887-config-data\") on node \"crc\" DevicePath \"\"" Feb 26 16:07:54 crc kubenswrapper[4907]: I0226 16:07:54.318379 4907 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/97a6cbcf-5a97-4fe9-b47f-a738fb4dc887-logs\") on node \"crc\" DevicePath \"\"" Feb 26 16:07:54 crc kubenswrapper[4907]: I0226 16:07:54.318388 4907 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9kqv9\" (UniqueName: \"kubernetes.io/projected/97a6cbcf-5a97-4fe9-b47f-a738fb4dc887-kube-api-access-9kqv9\") on node \"crc\" DevicePath \"\"" Feb 26 16:07:54 crc kubenswrapper[4907]: I0226 16:07:54.318397 4907 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/97a6cbcf-5a97-4fe9-b47f-a738fb4dc887-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 26 16:07:54 crc kubenswrapper[4907]: I0226 16:07:54.324445 4907 scope.go:117] "RemoveContainer" containerID="6cafbc58224f1c877a57f32b79a66009154fdec3e1944c18c1590f05fed67834" Feb 26 16:07:54 crc kubenswrapper[4907]: I0226 16:07:54.362725 4907 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Feb 26 16:07:54 crc kubenswrapper[4907]: I0226 16:07:54.368762 4907 scope.go:117] "RemoveContainer" containerID="8cea64b2dc3a02bf0080e80c8901cc18401c4df5c498bdcd3603cb228113f61f" Feb 26 16:07:54 crc kubenswrapper[4907]: E0226 16:07:54.376301 4907 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"8cea64b2dc3a02bf0080e80c8901cc18401c4df5c498bdcd3603cb228113f61f\": container with ID starting with 8cea64b2dc3a02bf0080e80c8901cc18401c4df5c498bdcd3603cb228113f61f not found: ID does not exist" containerID="8cea64b2dc3a02bf0080e80c8901cc18401c4df5c498bdcd3603cb228113f61f" Feb 26 16:07:54 crc kubenswrapper[4907]: I0226 16:07:54.376360 4907 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8cea64b2dc3a02bf0080e80c8901cc18401c4df5c498bdcd3603cb228113f61f"} err="failed to get container status \"8cea64b2dc3a02bf0080e80c8901cc18401c4df5c498bdcd3603cb228113f61f\": rpc error: code = NotFound desc = could not find container \"8cea64b2dc3a02bf0080e80c8901cc18401c4df5c498bdcd3603cb228113f61f\": container with ID starting with 8cea64b2dc3a02bf0080e80c8901cc18401c4df5c498bdcd3603cb228113f61f not found: ID does not exist" Feb 26 16:07:54 crc kubenswrapper[4907]: I0226 16:07:54.376402 4907 scope.go:117] "RemoveContainer" containerID="6cafbc58224f1c877a57f32b79a66009154fdec3e1944c18c1590f05fed67834" Feb 26 16:07:54 crc kubenswrapper[4907]: E0226 16:07:54.376800 4907 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: 
code = NotFound desc = could not find container \"6cafbc58224f1c877a57f32b79a66009154fdec3e1944c18c1590f05fed67834\": container with ID starting with 6cafbc58224f1c877a57f32b79a66009154fdec3e1944c18c1590f05fed67834 not found: ID does not exist" containerID="6cafbc58224f1c877a57f32b79a66009154fdec3e1944c18c1590f05fed67834" Feb 26 16:07:54 crc kubenswrapper[4907]: I0226 16:07:54.376823 4907 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6cafbc58224f1c877a57f32b79a66009154fdec3e1944c18c1590f05fed67834"} err="failed to get container status \"6cafbc58224f1c877a57f32b79a66009154fdec3e1944c18c1590f05fed67834\": rpc error: code = NotFound desc = could not find container \"6cafbc58224f1c877a57f32b79a66009154fdec3e1944c18c1590f05fed67834\": container with ID starting with 6cafbc58224f1c877a57f32b79a66009154fdec3e1944c18c1590f05fed67834 not found: ID does not exist" Feb 26 16:07:54 crc kubenswrapper[4907]: I0226 16:07:54.385491 4907 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ceilometer-0"] Feb 26 16:07:54 crc kubenswrapper[4907]: I0226 16:07:54.422076 4907 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0"] Feb 26 16:07:54 crc kubenswrapper[4907]: I0226 16:07:54.437287 4907 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-api-0"] Feb 26 16:07:54 crc kubenswrapper[4907]: I0226 16:07:54.450743 4907 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"] Feb 26 16:07:54 crc kubenswrapper[4907]: E0226 16:07:54.451173 4907 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="97a6cbcf-5a97-4fe9-b47f-a738fb4dc887" containerName="nova-api-api" Feb 26 16:07:54 crc kubenswrapper[4907]: I0226 16:07:54.451185 4907 state_mem.go:107] "Deleted CPUSet assignment" podUID="97a6cbcf-5a97-4fe9-b47f-a738fb4dc887" containerName="nova-api-api" Feb 26 16:07:54 crc kubenswrapper[4907]: E0226 16:07:54.451206 4907 cpu_manager.go:410] "RemoveStaleState: 
removing container" podUID="97a6cbcf-5a97-4fe9-b47f-a738fb4dc887" containerName="nova-api-log" Feb 26 16:07:54 crc kubenswrapper[4907]: I0226 16:07:54.451212 4907 state_mem.go:107] "Deleted CPUSet assignment" podUID="97a6cbcf-5a97-4fe9-b47f-a738fb4dc887" containerName="nova-api-log" Feb 26 16:07:54 crc kubenswrapper[4907]: I0226 16:07:54.451501 4907 memory_manager.go:354] "RemoveStaleState removing state" podUID="97a6cbcf-5a97-4fe9-b47f-a738fb4dc887" containerName="nova-api-api" Feb 26 16:07:54 crc kubenswrapper[4907]: I0226 16:07:54.451511 4907 memory_manager.go:354] "RemoveStaleState removing state" podUID="97a6cbcf-5a97-4fe9-b47f-a738fb4dc887" containerName="nova-api-log" Feb 26 16:07:54 crc kubenswrapper[4907]: I0226 16:07:54.453190 4907 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Feb 26 16:07:54 crc kubenswrapper[4907]: I0226 16:07:54.455984 4907 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts" Feb 26 16:07:54 crc kubenswrapper[4907]: I0226 16:07:54.456366 4907 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data" Feb 26 16:07:54 crc kubenswrapper[4907]: I0226 16:07:54.456676 4907 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ceilometer-internal-svc" Feb 26 16:07:54 crc kubenswrapper[4907]: I0226 16:07:54.458575 4907 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-api-0"] Feb 26 16:07:54 crc kubenswrapper[4907]: I0226 16:07:54.460631 4907 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-0" Feb 26 16:07:54 crc kubenswrapper[4907]: I0226 16:07:54.462157 4907 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-public-svc" Feb 26 16:07:54 crc kubenswrapper[4907]: I0226 16:07:54.462779 4907 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-api-config-data" Feb 26 16:07:54 crc kubenswrapper[4907]: I0226 16:07:54.463089 4907 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-internal-svc" Feb 26 16:07:54 crc kubenswrapper[4907]: I0226 16:07:54.503486 4907 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Feb 26 16:07:54 crc kubenswrapper[4907]: I0226 16:07:54.508982 4907 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-cell1-novncproxy-0" Feb 26 16:07:54 crc kubenswrapper[4907]: I0226 16:07:54.516987 4907 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Feb 26 16:07:54 crc kubenswrapper[4907]: I0226 16:07:54.541468 4907 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-cell1-novncproxy-0" Feb 26 16:07:54 crc kubenswrapper[4907]: I0226 16:07:54.626813 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cvjrg\" (UniqueName: \"kubernetes.io/projected/31a81b99-dba6-4f2e-95eb-09f66cdd28df-kube-api-access-cvjrg\") pod \"ceilometer-0\" (UID: \"31a81b99-dba6-4f2e-95eb-09f66cdd28df\") " pod="openstack/ceilometer-0" Feb 26 16:07:54 crc kubenswrapper[4907]: I0226 16:07:54.626856 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/31a81b99-dba6-4f2e-95eb-09f66cdd28df-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"31a81b99-dba6-4f2e-95eb-09f66cdd28df\") " pod="openstack/ceilometer-0" Feb 26 16:07:54 crc 
kubenswrapper[4907]: I0226 16:07:54.626877 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/31a81b99-dba6-4f2e-95eb-09f66cdd28df-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"31a81b99-dba6-4f2e-95eb-09f66cdd28df\") " pod="openstack/ceilometer-0" Feb 26 16:07:54 crc kubenswrapper[4907]: I0226 16:07:54.626892 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/31a81b99-dba6-4f2e-95eb-09f66cdd28df-config-data\") pod \"ceilometer-0\" (UID: \"31a81b99-dba6-4f2e-95eb-09f66cdd28df\") " pod="openstack/ceilometer-0" Feb 26 16:07:54 crc kubenswrapper[4907]: I0226 16:07:54.626936 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/31a81b99-dba6-4f2e-95eb-09f66cdd28df-run-httpd\") pod \"ceilometer-0\" (UID: \"31a81b99-dba6-4f2e-95eb-09f66cdd28df\") " pod="openstack/ceilometer-0" Feb 26 16:07:54 crc kubenswrapper[4907]: I0226 16:07:54.626951 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/a3dc5070-d012-48d8-b12b-2064ebeab515-logs\") pod \"nova-api-0\" (UID: \"a3dc5070-d012-48d8-b12b-2064ebeab515\") " pod="openstack/nova-api-0" Feb 26 16:07:54 crc kubenswrapper[4907]: I0226 16:07:54.627221 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/31a81b99-dba6-4f2e-95eb-09f66cdd28df-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"31a81b99-dba6-4f2e-95eb-09f66cdd28df\") " pod="openstack/ceilometer-0" Feb 26 16:07:54 crc kubenswrapper[4907]: I0226 16:07:54.627345 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"scripts\" (UniqueName: \"kubernetes.io/secret/31a81b99-dba6-4f2e-95eb-09f66cdd28df-scripts\") pod \"ceilometer-0\" (UID: \"31a81b99-dba6-4f2e-95eb-09f66cdd28df\") " pod="openstack/ceilometer-0" Feb 26 16:07:54 crc kubenswrapper[4907]: I0226 16:07:54.627408 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-w9fgr\" (UniqueName: \"kubernetes.io/projected/a3dc5070-d012-48d8-b12b-2064ebeab515-kube-api-access-w9fgr\") pod \"nova-api-0\" (UID: \"a3dc5070-d012-48d8-b12b-2064ebeab515\") " pod="openstack/nova-api-0" Feb 26 16:07:54 crc kubenswrapper[4907]: I0226 16:07:54.627446 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/a3dc5070-d012-48d8-b12b-2064ebeab515-internal-tls-certs\") pod \"nova-api-0\" (UID: \"a3dc5070-d012-48d8-b12b-2064ebeab515\") " pod="openstack/nova-api-0" Feb 26 16:07:54 crc kubenswrapper[4907]: I0226 16:07:54.627515 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/a3dc5070-d012-48d8-b12b-2064ebeab515-public-tls-certs\") pod \"nova-api-0\" (UID: \"a3dc5070-d012-48d8-b12b-2064ebeab515\") " pod="openstack/nova-api-0" Feb 26 16:07:54 crc kubenswrapper[4907]: I0226 16:07:54.627614 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/31a81b99-dba6-4f2e-95eb-09f66cdd28df-log-httpd\") pod \"ceilometer-0\" (UID: \"31a81b99-dba6-4f2e-95eb-09f66cdd28df\") " pod="openstack/ceilometer-0" Feb 26 16:07:54 crc kubenswrapper[4907]: I0226 16:07:54.627641 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a3dc5070-d012-48d8-b12b-2064ebeab515-config-data\") pod \"nova-api-0\" (UID: 
\"a3dc5070-d012-48d8-b12b-2064ebeab515\") " pod="openstack/nova-api-0" Feb 26 16:07:54 crc kubenswrapper[4907]: I0226 16:07:54.627675 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a3dc5070-d012-48d8-b12b-2064ebeab515-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"a3dc5070-d012-48d8-b12b-2064ebeab515\") " pod="openstack/nova-api-0" Feb 26 16:07:54 crc kubenswrapper[4907]: I0226 16:07:54.729047 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/31a81b99-dba6-4f2e-95eb-09f66cdd28df-run-httpd\") pod \"ceilometer-0\" (UID: \"31a81b99-dba6-4f2e-95eb-09f66cdd28df\") " pod="openstack/ceilometer-0" Feb 26 16:07:54 crc kubenswrapper[4907]: I0226 16:07:54.729106 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/a3dc5070-d012-48d8-b12b-2064ebeab515-logs\") pod \"nova-api-0\" (UID: \"a3dc5070-d012-48d8-b12b-2064ebeab515\") " pod="openstack/nova-api-0" Feb 26 16:07:54 crc kubenswrapper[4907]: I0226 16:07:54.729163 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/31a81b99-dba6-4f2e-95eb-09f66cdd28df-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"31a81b99-dba6-4f2e-95eb-09f66cdd28df\") " pod="openstack/ceilometer-0" Feb 26 16:07:54 crc kubenswrapper[4907]: I0226 16:07:54.729210 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/31a81b99-dba6-4f2e-95eb-09f66cdd28df-scripts\") pod \"ceilometer-0\" (UID: \"31a81b99-dba6-4f2e-95eb-09f66cdd28df\") " pod="openstack/ceilometer-0" Feb 26 16:07:54 crc kubenswrapper[4907]: I0226 16:07:54.729236 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"kube-api-access-w9fgr\" (UniqueName: \"kubernetes.io/projected/a3dc5070-d012-48d8-b12b-2064ebeab515-kube-api-access-w9fgr\") pod \"nova-api-0\" (UID: \"a3dc5070-d012-48d8-b12b-2064ebeab515\") " pod="openstack/nova-api-0" Feb 26 16:07:54 crc kubenswrapper[4907]: I0226 16:07:54.729258 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/a3dc5070-d012-48d8-b12b-2064ebeab515-internal-tls-certs\") pod \"nova-api-0\" (UID: \"a3dc5070-d012-48d8-b12b-2064ebeab515\") " pod="openstack/nova-api-0" Feb 26 16:07:54 crc kubenswrapper[4907]: I0226 16:07:54.729286 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/a3dc5070-d012-48d8-b12b-2064ebeab515-public-tls-certs\") pod \"nova-api-0\" (UID: \"a3dc5070-d012-48d8-b12b-2064ebeab515\") " pod="openstack/nova-api-0" Feb 26 16:07:54 crc kubenswrapper[4907]: I0226 16:07:54.729336 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/31a81b99-dba6-4f2e-95eb-09f66cdd28df-log-httpd\") pod \"ceilometer-0\" (UID: \"31a81b99-dba6-4f2e-95eb-09f66cdd28df\") " pod="openstack/ceilometer-0" Feb 26 16:07:54 crc kubenswrapper[4907]: I0226 16:07:54.729352 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a3dc5070-d012-48d8-b12b-2064ebeab515-config-data\") pod \"nova-api-0\" (UID: \"a3dc5070-d012-48d8-b12b-2064ebeab515\") " pod="openstack/nova-api-0" Feb 26 16:07:54 crc kubenswrapper[4907]: I0226 16:07:54.729374 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a3dc5070-d012-48d8-b12b-2064ebeab515-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"a3dc5070-d012-48d8-b12b-2064ebeab515\") " pod="openstack/nova-api-0" Feb 26 16:07:54 
crc kubenswrapper[4907]: I0226 16:07:54.729394 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cvjrg\" (UniqueName: \"kubernetes.io/projected/31a81b99-dba6-4f2e-95eb-09f66cdd28df-kube-api-access-cvjrg\") pod \"ceilometer-0\" (UID: \"31a81b99-dba6-4f2e-95eb-09f66cdd28df\") " pod="openstack/ceilometer-0" Feb 26 16:07:54 crc kubenswrapper[4907]: I0226 16:07:54.729409 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/31a81b99-dba6-4f2e-95eb-09f66cdd28df-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"31a81b99-dba6-4f2e-95eb-09f66cdd28df\") " pod="openstack/ceilometer-0" Feb 26 16:07:54 crc kubenswrapper[4907]: I0226 16:07:54.729425 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/31a81b99-dba6-4f2e-95eb-09f66cdd28df-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"31a81b99-dba6-4f2e-95eb-09f66cdd28df\") " pod="openstack/ceilometer-0" Feb 26 16:07:54 crc kubenswrapper[4907]: I0226 16:07:54.729438 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/31a81b99-dba6-4f2e-95eb-09f66cdd28df-config-data\") pod \"ceilometer-0\" (UID: \"31a81b99-dba6-4f2e-95eb-09f66cdd28df\") " pod="openstack/ceilometer-0" Feb 26 16:07:54 crc kubenswrapper[4907]: I0226 16:07:54.729651 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/31a81b99-dba6-4f2e-95eb-09f66cdd28df-run-httpd\") pod \"ceilometer-0\" (UID: \"31a81b99-dba6-4f2e-95eb-09f66cdd28df\") " pod="openstack/ceilometer-0" Feb 26 16:07:54 crc kubenswrapper[4907]: I0226 16:07:54.731111 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: 
\"kubernetes.io/empty-dir/31a81b99-dba6-4f2e-95eb-09f66cdd28df-log-httpd\") pod \"ceilometer-0\" (UID: \"31a81b99-dba6-4f2e-95eb-09f66cdd28df\") " pod="openstack/ceilometer-0" Feb 26 16:07:54 crc kubenswrapper[4907]: I0226 16:07:54.731458 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/a3dc5070-d012-48d8-b12b-2064ebeab515-logs\") pod \"nova-api-0\" (UID: \"a3dc5070-d012-48d8-b12b-2064ebeab515\") " pod="openstack/nova-api-0" Feb 26 16:07:54 crc kubenswrapper[4907]: I0226 16:07:54.735790 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/31a81b99-dba6-4f2e-95eb-09f66cdd28df-config-data\") pod \"ceilometer-0\" (UID: \"31a81b99-dba6-4f2e-95eb-09f66cdd28df\") " pod="openstack/ceilometer-0" Feb 26 16:07:54 crc kubenswrapper[4907]: I0226 16:07:54.736382 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/31a81b99-dba6-4f2e-95eb-09f66cdd28df-scripts\") pod \"ceilometer-0\" (UID: \"31a81b99-dba6-4f2e-95eb-09f66cdd28df\") " pod="openstack/ceilometer-0" Feb 26 16:07:54 crc kubenswrapper[4907]: I0226 16:07:54.736518 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a3dc5070-d012-48d8-b12b-2064ebeab515-config-data\") pod \"nova-api-0\" (UID: \"a3dc5070-d012-48d8-b12b-2064ebeab515\") " pod="openstack/nova-api-0" Feb 26 16:07:54 crc kubenswrapper[4907]: I0226 16:07:54.738305 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/a3dc5070-d012-48d8-b12b-2064ebeab515-public-tls-certs\") pod \"nova-api-0\" (UID: \"a3dc5070-d012-48d8-b12b-2064ebeab515\") " pod="openstack/nova-api-0" Feb 26 16:07:54 crc kubenswrapper[4907]: I0226 16:07:54.738333 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/a3dc5070-d012-48d8-b12b-2064ebeab515-internal-tls-certs\") pod \"nova-api-0\" (UID: \"a3dc5070-d012-48d8-b12b-2064ebeab515\") " pod="openstack/nova-api-0" Feb 26 16:07:54 crc kubenswrapper[4907]: I0226 16:07:54.738844 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/31a81b99-dba6-4f2e-95eb-09f66cdd28df-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"31a81b99-dba6-4f2e-95eb-09f66cdd28df\") " pod="openstack/ceilometer-0" Feb 26 16:07:54 crc kubenswrapper[4907]: I0226 16:07:54.740818 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/31a81b99-dba6-4f2e-95eb-09f66cdd28df-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"31a81b99-dba6-4f2e-95eb-09f66cdd28df\") " pod="openstack/ceilometer-0" Feb 26 16:07:54 crc kubenswrapper[4907]: I0226 16:07:54.743377 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/31a81b99-dba6-4f2e-95eb-09f66cdd28df-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"31a81b99-dba6-4f2e-95eb-09f66cdd28df\") " pod="openstack/ceilometer-0" Feb 26 16:07:54 crc kubenswrapper[4907]: I0226 16:07:54.744535 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a3dc5070-d012-48d8-b12b-2064ebeab515-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"a3dc5070-d012-48d8-b12b-2064ebeab515\") " pod="openstack/nova-api-0" Feb 26 16:07:54 crc kubenswrapper[4907]: I0226 16:07:54.747478 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cvjrg\" (UniqueName: \"kubernetes.io/projected/31a81b99-dba6-4f2e-95eb-09f66cdd28df-kube-api-access-cvjrg\") pod \"ceilometer-0\" (UID: \"31a81b99-dba6-4f2e-95eb-09f66cdd28df\") " pod="openstack/ceilometer-0" Feb 26 
16:07:54 crc kubenswrapper[4907]: I0226 16:07:54.754292 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-w9fgr\" (UniqueName: \"kubernetes.io/projected/a3dc5070-d012-48d8-b12b-2064ebeab515-kube-api-access-w9fgr\") pod \"nova-api-0\" (UID: \"a3dc5070-d012-48d8-b12b-2064ebeab515\") " pod="openstack/nova-api-0" Feb 26 16:07:54 crc kubenswrapper[4907]: I0226 16:07:54.772963 4907 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Feb 26 16:07:54 crc kubenswrapper[4907]: I0226 16:07:54.802068 4907 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Feb 26 16:07:55 crc kubenswrapper[4907]: I0226 16:07:55.314561 4907 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-cell1-novncproxy-0" Feb 26 16:07:55 crc kubenswrapper[4907]: I0226 16:07:55.448024 4907 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Feb 26 16:07:55 crc kubenswrapper[4907]: I0226 16:07:55.480077 4907 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Feb 26 16:07:55 crc kubenswrapper[4907]: I0226 16:07:55.556895 4907 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-cell-mapping-bck8j"] Feb 26 16:07:55 crc kubenswrapper[4907]: I0226 16:07:55.558790 4907 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-cell-mapping-bck8j" Feb 26 16:07:55 crc kubenswrapper[4907]: I0226 16:07:55.571330 4907 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-manage-scripts" Feb 26 16:07:55 crc kubenswrapper[4907]: I0226 16:07:55.571490 4907 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-manage-config-data" Feb 26 16:07:55 crc kubenswrapper[4907]: I0226 16:07:55.595751 4907 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-cell-mapping-bck8j"] Feb 26 16:07:55 crc kubenswrapper[4907]: I0226 16:07:55.650189 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7hgcg\" (UniqueName: \"kubernetes.io/projected/f495d649-0f7a-4520-84ad-7703ad452593-kube-api-access-7hgcg\") pod \"nova-cell1-cell-mapping-bck8j\" (UID: \"f495d649-0f7a-4520-84ad-7703ad452593\") " pod="openstack/nova-cell1-cell-mapping-bck8j" Feb 26 16:07:55 crc kubenswrapper[4907]: I0226 16:07:55.650266 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f495d649-0f7a-4520-84ad-7703ad452593-config-data\") pod \"nova-cell1-cell-mapping-bck8j\" (UID: \"f495d649-0f7a-4520-84ad-7703ad452593\") " pod="openstack/nova-cell1-cell-mapping-bck8j" Feb 26 16:07:55 crc kubenswrapper[4907]: I0226 16:07:55.650316 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f495d649-0f7a-4520-84ad-7703ad452593-combined-ca-bundle\") pod \"nova-cell1-cell-mapping-bck8j\" (UID: \"f495d649-0f7a-4520-84ad-7703ad452593\") " pod="openstack/nova-cell1-cell-mapping-bck8j" Feb 26 16:07:55 crc kubenswrapper[4907]: I0226 16:07:55.650377 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" 
(UniqueName: \"kubernetes.io/secret/f495d649-0f7a-4520-84ad-7703ad452593-scripts\") pod \"nova-cell1-cell-mapping-bck8j\" (UID: \"f495d649-0f7a-4520-84ad-7703ad452593\") " pod="openstack/nova-cell1-cell-mapping-bck8j" Feb 26 16:07:55 crc kubenswrapper[4907]: I0226 16:07:55.751956 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7hgcg\" (UniqueName: \"kubernetes.io/projected/f495d649-0f7a-4520-84ad-7703ad452593-kube-api-access-7hgcg\") pod \"nova-cell1-cell-mapping-bck8j\" (UID: \"f495d649-0f7a-4520-84ad-7703ad452593\") " pod="openstack/nova-cell1-cell-mapping-bck8j" Feb 26 16:07:55 crc kubenswrapper[4907]: I0226 16:07:55.752012 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f495d649-0f7a-4520-84ad-7703ad452593-config-data\") pod \"nova-cell1-cell-mapping-bck8j\" (UID: \"f495d649-0f7a-4520-84ad-7703ad452593\") " pod="openstack/nova-cell1-cell-mapping-bck8j" Feb 26 16:07:55 crc kubenswrapper[4907]: I0226 16:07:55.752079 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f495d649-0f7a-4520-84ad-7703ad452593-combined-ca-bundle\") pod \"nova-cell1-cell-mapping-bck8j\" (UID: \"f495d649-0f7a-4520-84ad-7703ad452593\") " pod="openstack/nova-cell1-cell-mapping-bck8j" Feb 26 16:07:55 crc kubenswrapper[4907]: I0226 16:07:55.752130 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/f495d649-0f7a-4520-84ad-7703ad452593-scripts\") pod \"nova-cell1-cell-mapping-bck8j\" (UID: \"f495d649-0f7a-4520-84ad-7703ad452593\") " pod="openstack/nova-cell1-cell-mapping-bck8j" Feb 26 16:07:55 crc kubenswrapper[4907]: I0226 16:07:55.755866 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: 
\"kubernetes.io/secret/f495d649-0f7a-4520-84ad-7703ad452593-config-data\") pod \"nova-cell1-cell-mapping-bck8j\" (UID: \"f495d649-0f7a-4520-84ad-7703ad452593\") " pod="openstack/nova-cell1-cell-mapping-bck8j" Feb 26 16:07:55 crc kubenswrapper[4907]: I0226 16:07:55.756159 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f495d649-0f7a-4520-84ad-7703ad452593-combined-ca-bundle\") pod \"nova-cell1-cell-mapping-bck8j\" (UID: \"f495d649-0f7a-4520-84ad-7703ad452593\") " pod="openstack/nova-cell1-cell-mapping-bck8j" Feb 26 16:07:55 crc kubenswrapper[4907]: I0226 16:07:55.760246 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/f495d649-0f7a-4520-84ad-7703ad452593-scripts\") pod \"nova-cell1-cell-mapping-bck8j\" (UID: \"f495d649-0f7a-4520-84ad-7703ad452593\") " pod="openstack/nova-cell1-cell-mapping-bck8j" Feb 26 16:07:55 crc kubenswrapper[4907]: I0226 16:07:55.770386 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7hgcg\" (UniqueName: \"kubernetes.io/projected/f495d649-0f7a-4520-84ad-7703ad452593-kube-api-access-7hgcg\") pod \"nova-cell1-cell-mapping-bck8j\" (UID: \"f495d649-0f7a-4520-84ad-7703ad452593\") " pod="openstack/nova-cell1-cell-mapping-bck8j" Feb 26 16:07:55 crc kubenswrapper[4907]: I0226 16:07:55.905740 4907 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-cell-mapping-bck8j" Feb 26 16:07:56 crc kubenswrapper[4907]: I0226 16:07:56.169419 4907 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6a9c347b-8497-423b-b352-a554ace86315" path="/var/lib/kubelet/pods/6a9c347b-8497-423b-b352-a554ace86315/volumes" Feb 26 16:07:56 crc kubenswrapper[4907]: I0226 16:07:56.170143 4907 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="97a6cbcf-5a97-4fe9-b47f-a738fb4dc887" path="/var/lib/kubelet/pods/97a6cbcf-5a97-4fe9-b47f-a738fb4dc887/volumes" Feb 26 16:07:56 crc kubenswrapper[4907]: I0226 16:07:56.314779 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"a3dc5070-d012-48d8-b12b-2064ebeab515","Type":"ContainerStarted","Data":"8fb9e3887f82d1689589f4f1e765a4e10f20c9f3bc9c9d6a3ce74fd06a7748e7"} Feb 26 16:07:56 crc kubenswrapper[4907]: I0226 16:07:56.315146 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"a3dc5070-d012-48d8-b12b-2064ebeab515","Type":"ContainerStarted","Data":"be21cf892883dee07ecafe5ad17d28a3ec2865f1d72f47aaffd1357676fd7d5b"} Feb 26 16:07:56 crc kubenswrapper[4907]: I0226 16:07:56.315166 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"a3dc5070-d012-48d8-b12b-2064ebeab515","Type":"ContainerStarted","Data":"e35d11b063f9349a8dd0432ff0f90f036a966d4216dd6053a97938971a0bb856"} Feb 26 16:07:56 crc kubenswrapper[4907]: I0226 16:07:56.322634 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"31a81b99-dba6-4f2e-95eb-09f66cdd28df","Type":"ContainerStarted","Data":"cc9fd15667bf9c99d87c960e27f01d0870d284e7c9dfe2a7dba287971ad587d6"} Feb 26 16:07:56 crc kubenswrapper[4907]: I0226 16:07:56.322680 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" 
event={"ID":"31a81b99-dba6-4f2e-95eb-09f66cdd28df","Type":"ContainerStarted","Data":"267ef8659c2ecc7e56f76f75053eeafc86eb97eafce0b49cc8f2b5133ec12aa0"} Feb 26 16:07:56 crc kubenswrapper[4907]: I0226 16:07:56.346328 4907 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-api-0" podStartSLOduration=2.346306406 podStartE2EDuration="2.346306406s" podCreationTimestamp="2026-02-26 16:07:54 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-26 16:07:56.344130203 +0000 UTC m=+1538.862692052" watchObservedRunningTime="2026-02-26 16:07:56.346306406 +0000 UTC m=+1538.864868255" Feb 26 16:07:56 crc kubenswrapper[4907]: I0226 16:07:56.450795 4907 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-cell-mapping-bck8j"] Feb 26 16:07:56 crc kubenswrapper[4907]: W0226 16:07:56.458486 4907 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podf495d649_0f7a_4520_84ad_7703ad452593.slice/crio-a6bd020bdc2e890cf0609312f1f0d15c8f13808acd1e8209e36f1e6ee22e8fd0 WatchSource:0}: Error finding container a6bd020bdc2e890cf0609312f1f0d15c8f13808acd1e8209e36f1e6ee22e8fd0: Status 404 returned error can't find the container with id a6bd020bdc2e890cf0609312f1f0d15c8f13808acd1e8209e36f1e6ee22e8fd0 Feb 26 16:07:57 crc kubenswrapper[4907]: I0226 16:07:57.333050 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-cell-mapping-bck8j" event={"ID":"f495d649-0f7a-4520-84ad-7703ad452593","Type":"ContainerStarted","Data":"14b832f80d0c811e1ff07311ab483873a14459b4798c7d0d6f09f8e79617e33d"} Feb 26 16:07:57 crc kubenswrapper[4907]: I0226 16:07:57.333416 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-cell-mapping-bck8j" 
event={"ID":"f495d649-0f7a-4520-84ad-7703ad452593","Type":"ContainerStarted","Data":"a6bd020bdc2e890cf0609312f1f0d15c8f13808acd1e8209e36f1e6ee22e8fd0"} Feb 26 16:07:57 crc kubenswrapper[4907]: I0226 16:07:57.335658 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"31a81b99-dba6-4f2e-95eb-09f66cdd28df","Type":"ContainerStarted","Data":"5460a0a4f5b1c42acce9b3aadcc5d5f79e9c1cde36536b07e4f15908f0b6d60b"} Feb 26 16:07:57 crc kubenswrapper[4907]: I0226 16:07:57.656866 4907 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-cd5cbd7b9-chsg6" Feb 26 16:07:57 crc kubenswrapper[4907]: I0226 16:07:57.685109 4907 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell1-cell-mapping-bck8j" podStartSLOduration=2.685094837 podStartE2EDuration="2.685094837s" podCreationTimestamp="2026-02-26 16:07:55 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-26 16:07:57.354858941 +0000 UTC m=+1539.873420800" watchObservedRunningTime="2026-02-26 16:07:57.685094837 +0000 UTC m=+1540.203656686" Feb 26 16:07:57 crc kubenswrapper[4907]: I0226 16:07:57.745962 4907 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-bccf8f775-zlwx8"] Feb 26 16:07:57 crc kubenswrapper[4907]: I0226 16:07:57.746199 4907 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-bccf8f775-zlwx8" podUID="9d796571-e8ef-42f7-bf25-96d758b8b32b" containerName="dnsmasq-dns" containerID="cri-o://e45c0b511780f5b74c3a191d46b53e40644b3c1baf4e3abd973fb26d216b0927" gracePeriod=10 Feb 26 16:07:58 crc kubenswrapper[4907]: I0226 16:07:58.324722 4907 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-bccf8f775-zlwx8" Feb 26 16:07:58 crc kubenswrapper[4907]: I0226 16:07:58.344678 4907 generic.go:334] "Generic (PLEG): container finished" podID="9d796571-e8ef-42f7-bf25-96d758b8b32b" containerID="e45c0b511780f5b74c3a191d46b53e40644b3c1baf4e3abd973fb26d216b0927" exitCode=0 Feb 26 16:07:58 crc kubenswrapper[4907]: I0226 16:07:58.344770 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-bccf8f775-zlwx8" event={"ID":"9d796571-e8ef-42f7-bf25-96d758b8b32b","Type":"ContainerDied","Data":"e45c0b511780f5b74c3a191d46b53e40644b3c1baf4e3abd973fb26d216b0927"} Feb 26 16:07:58 crc kubenswrapper[4907]: I0226 16:07:58.344800 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-bccf8f775-zlwx8" event={"ID":"9d796571-e8ef-42f7-bf25-96d758b8b32b","Type":"ContainerDied","Data":"e7c726956f42c01947a2cf697394976416b6538e6ccdb8386b1d31067bc48d5f"} Feb 26 16:07:58 crc kubenswrapper[4907]: I0226 16:07:58.344815 4907 scope.go:117] "RemoveContainer" containerID="e45c0b511780f5b74c3a191d46b53e40644b3c1baf4e3abd973fb26d216b0927" Feb 26 16:07:58 crc kubenswrapper[4907]: I0226 16:07:58.345763 4907 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-bccf8f775-zlwx8" Feb 26 16:07:58 crc kubenswrapper[4907]: I0226 16:07:58.353710 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"31a81b99-dba6-4f2e-95eb-09f66cdd28df","Type":"ContainerStarted","Data":"ccde7093323efc8a9b570fb441d938de2e62d1f4a044e27a68a296ba422fab25"} Feb 26 16:07:58 crc kubenswrapper[4907]: I0226 16:07:58.403458 4907 scope.go:117] "RemoveContainer" containerID="f283ce6a8dffca39a5e8e4d939c2703aa4d4bff4a834d31701a6f570571b2bbd" Feb 26 16:07:58 crc kubenswrapper[4907]: I0226 16:07:58.410418 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/9d796571-e8ef-42f7-bf25-96d758b8b32b-ovsdbserver-sb\") pod \"9d796571-e8ef-42f7-bf25-96d758b8b32b\" (UID: \"9d796571-e8ef-42f7-bf25-96d758b8b32b\") " Feb 26 16:07:58 crc kubenswrapper[4907]: I0226 16:07:58.410451 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9d796571-e8ef-42f7-bf25-96d758b8b32b-config\") pod \"9d796571-e8ef-42f7-bf25-96d758b8b32b\" (UID: \"9d796571-e8ef-42f7-bf25-96d758b8b32b\") " Feb 26 16:07:58 crc kubenswrapper[4907]: I0226 16:07:58.410572 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-p5tf4\" (UniqueName: \"kubernetes.io/projected/9d796571-e8ef-42f7-bf25-96d758b8b32b-kube-api-access-p5tf4\") pod \"9d796571-e8ef-42f7-bf25-96d758b8b32b\" (UID: \"9d796571-e8ef-42f7-bf25-96d758b8b32b\") " Feb 26 16:07:58 crc kubenswrapper[4907]: I0226 16:07:58.410641 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/9d796571-e8ef-42f7-bf25-96d758b8b32b-ovsdbserver-nb\") pod \"9d796571-e8ef-42f7-bf25-96d758b8b32b\" (UID: \"9d796571-e8ef-42f7-bf25-96d758b8b32b\") " Feb 26 16:07:58 crc 
kubenswrapper[4907]: I0226 16:07:58.410717 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/9d796571-e8ef-42f7-bf25-96d758b8b32b-dns-svc\") pod \"9d796571-e8ef-42f7-bf25-96d758b8b32b\" (UID: \"9d796571-e8ef-42f7-bf25-96d758b8b32b\") " Feb 26 16:07:58 crc kubenswrapper[4907]: I0226 16:07:58.410750 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/9d796571-e8ef-42f7-bf25-96d758b8b32b-dns-swift-storage-0\") pod \"9d796571-e8ef-42f7-bf25-96d758b8b32b\" (UID: \"9d796571-e8ef-42f7-bf25-96d758b8b32b\") " Feb 26 16:07:58 crc kubenswrapper[4907]: I0226 16:07:58.428290 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9d796571-e8ef-42f7-bf25-96d758b8b32b-kube-api-access-p5tf4" (OuterVolumeSpecName: "kube-api-access-p5tf4") pod "9d796571-e8ef-42f7-bf25-96d758b8b32b" (UID: "9d796571-e8ef-42f7-bf25-96d758b8b32b"). InnerVolumeSpecName "kube-api-access-p5tf4". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 26 16:07:58 crc kubenswrapper[4907]: I0226 16:07:58.472874 4907 scope.go:117] "RemoveContainer" containerID="e45c0b511780f5b74c3a191d46b53e40644b3c1baf4e3abd973fb26d216b0927" Feb 26 16:07:58 crc kubenswrapper[4907]: E0226 16:07:58.482006 4907 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e45c0b511780f5b74c3a191d46b53e40644b3c1baf4e3abd973fb26d216b0927\": container with ID starting with e45c0b511780f5b74c3a191d46b53e40644b3c1baf4e3abd973fb26d216b0927 not found: ID does not exist" containerID="e45c0b511780f5b74c3a191d46b53e40644b3c1baf4e3abd973fb26d216b0927" Feb 26 16:07:58 crc kubenswrapper[4907]: I0226 16:07:58.482057 4907 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e45c0b511780f5b74c3a191d46b53e40644b3c1baf4e3abd973fb26d216b0927"} err="failed to get container status \"e45c0b511780f5b74c3a191d46b53e40644b3c1baf4e3abd973fb26d216b0927\": rpc error: code = NotFound desc = could not find container \"e45c0b511780f5b74c3a191d46b53e40644b3c1baf4e3abd973fb26d216b0927\": container with ID starting with e45c0b511780f5b74c3a191d46b53e40644b3c1baf4e3abd973fb26d216b0927 not found: ID does not exist" Feb 26 16:07:58 crc kubenswrapper[4907]: I0226 16:07:58.482086 4907 scope.go:117] "RemoveContainer" containerID="f283ce6a8dffca39a5e8e4d939c2703aa4d4bff4a834d31701a6f570571b2bbd" Feb 26 16:07:58 crc kubenswrapper[4907]: E0226 16:07:58.482391 4907 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f283ce6a8dffca39a5e8e4d939c2703aa4d4bff4a834d31701a6f570571b2bbd\": container with ID starting with f283ce6a8dffca39a5e8e4d939c2703aa4d4bff4a834d31701a6f570571b2bbd not found: ID does not exist" containerID="f283ce6a8dffca39a5e8e4d939c2703aa4d4bff4a834d31701a6f570571b2bbd" Feb 26 16:07:58 crc kubenswrapper[4907]: I0226 16:07:58.482417 
4907 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f283ce6a8dffca39a5e8e4d939c2703aa4d4bff4a834d31701a6f570571b2bbd"} err="failed to get container status \"f283ce6a8dffca39a5e8e4d939c2703aa4d4bff4a834d31701a6f570571b2bbd\": rpc error: code = NotFound desc = could not find container \"f283ce6a8dffca39a5e8e4d939c2703aa4d4bff4a834d31701a6f570571b2bbd\": container with ID starting with f283ce6a8dffca39a5e8e4d939c2703aa4d4bff4a834d31701a6f570571b2bbd not found: ID does not exist" Feb 26 16:07:58 crc kubenswrapper[4907]: I0226 16:07:58.513608 4907 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-p5tf4\" (UniqueName: \"kubernetes.io/projected/9d796571-e8ef-42f7-bf25-96d758b8b32b-kube-api-access-p5tf4\") on node \"crc\" DevicePath \"\"" Feb 26 16:07:58 crc kubenswrapper[4907]: I0226 16:07:58.531878 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9d796571-e8ef-42f7-bf25-96d758b8b32b-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "9d796571-e8ef-42f7-bf25-96d758b8b32b" (UID: "9d796571-e8ef-42f7-bf25-96d758b8b32b"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 26 16:07:58 crc kubenswrapper[4907]: I0226 16:07:58.573417 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9d796571-e8ef-42f7-bf25-96d758b8b32b-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "9d796571-e8ef-42f7-bf25-96d758b8b32b" (UID: "9d796571-e8ef-42f7-bf25-96d758b8b32b"). InnerVolumeSpecName "dns-svc". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 26 16:07:58 crc kubenswrapper[4907]: I0226 16:07:58.589570 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9d796571-e8ef-42f7-bf25-96d758b8b32b-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "9d796571-e8ef-42f7-bf25-96d758b8b32b" (UID: "9d796571-e8ef-42f7-bf25-96d758b8b32b"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 26 16:07:58 crc kubenswrapper[4907]: I0226 16:07:58.605019 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9d796571-e8ef-42f7-bf25-96d758b8b32b-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "9d796571-e8ef-42f7-bf25-96d758b8b32b" (UID: "9d796571-e8ef-42f7-bf25-96d758b8b32b"). InnerVolumeSpecName "dns-swift-storage-0". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 26 16:07:58 crc kubenswrapper[4907]: I0226 16:07:58.611051 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9d796571-e8ef-42f7-bf25-96d758b8b32b-config" (OuterVolumeSpecName: "config") pod "9d796571-e8ef-42f7-bf25-96d758b8b32b" (UID: "9d796571-e8ef-42f7-bf25-96d758b8b32b"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 26 16:07:58 crc kubenswrapper[4907]: I0226 16:07:58.615287 4907 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/9d796571-e8ef-42f7-bf25-96d758b8b32b-dns-swift-storage-0\") on node \"crc\" DevicePath \"\"" Feb 26 16:07:58 crc kubenswrapper[4907]: I0226 16:07:58.615324 4907 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/9d796571-e8ef-42f7-bf25-96d758b8b32b-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Feb 26 16:07:58 crc kubenswrapper[4907]: I0226 16:07:58.615338 4907 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9d796571-e8ef-42f7-bf25-96d758b8b32b-config\") on node \"crc\" DevicePath \"\"" Feb 26 16:07:58 crc kubenswrapper[4907]: I0226 16:07:58.615353 4907 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/9d796571-e8ef-42f7-bf25-96d758b8b32b-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Feb 26 16:07:58 crc kubenswrapper[4907]: I0226 16:07:58.615364 4907 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/9d796571-e8ef-42f7-bf25-96d758b8b32b-dns-svc\") on node \"crc\" DevicePath \"\"" Feb 26 16:07:58 crc kubenswrapper[4907]: I0226 16:07:58.721119 4907 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-bccf8f775-zlwx8"] Feb 26 16:07:58 crc kubenswrapper[4907]: I0226 16:07:58.735448 4907 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-bccf8f775-zlwx8"] Feb 26 16:08:00 crc kubenswrapper[4907]: I0226 16:08:00.140020 4907 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9d796571-e8ef-42f7-bf25-96d758b8b32b" path="/var/lib/kubelet/pods/9d796571-e8ef-42f7-bf25-96d758b8b32b/volumes" Feb 26 16:08:00 crc kubenswrapper[4907]: I0226 
16:08:00.141232 4907 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29535368-lf7wr"] Feb 26 16:08:00 crc kubenswrapper[4907]: E0226 16:08:00.141663 4907 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9d796571-e8ef-42f7-bf25-96d758b8b32b" containerName="init" Feb 26 16:08:00 crc kubenswrapper[4907]: I0226 16:08:00.141687 4907 state_mem.go:107] "Deleted CPUSet assignment" podUID="9d796571-e8ef-42f7-bf25-96d758b8b32b" containerName="init" Feb 26 16:08:00 crc kubenswrapper[4907]: E0226 16:08:00.141729 4907 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9d796571-e8ef-42f7-bf25-96d758b8b32b" containerName="dnsmasq-dns" Feb 26 16:08:00 crc kubenswrapper[4907]: I0226 16:08:00.141740 4907 state_mem.go:107] "Deleted CPUSet assignment" podUID="9d796571-e8ef-42f7-bf25-96d758b8b32b" containerName="dnsmasq-dns" Feb 26 16:08:00 crc kubenswrapper[4907]: I0226 16:08:00.141968 4907 memory_manager.go:354] "RemoveStaleState removing state" podUID="9d796571-e8ef-42f7-bf25-96d758b8b32b" containerName="dnsmasq-dns" Feb 26 16:08:00 crc kubenswrapper[4907]: I0226 16:08:00.147001 4907 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29535368-lf7wr" Feb 26 16:08:00 crc kubenswrapper[4907]: I0226 16:08:00.150025 4907 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Feb 26 16:08:00 crc kubenswrapper[4907]: I0226 16:08:00.150441 4907 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Feb 26 16:08:00 crc kubenswrapper[4907]: I0226 16:08:00.150642 4907 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-n2mrp" Feb 26 16:08:00 crc kubenswrapper[4907]: I0226 16:08:00.153080 4907 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29535368-lf7wr"] Feb 26 16:08:00 crc kubenswrapper[4907]: I0226 16:08:00.254167 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jw54h\" (UniqueName: \"kubernetes.io/projected/d9dc4728-d6d1-46da-a355-56e615849c42-kube-api-access-jw54h\") pod \"auto-csr-approver-29535368-lf7wr\" (UID: \"d9dc4728-d6d1-46da-a355-56e615849c42\") " pod="openshift-infra/auto-csr-approver-29535368-lf7wr" Feb 26 16:08:00 crc kubenswrapper[4907]: I0226 16:08:00.356577 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jw54h\" (UniqueName: \"kubernetes.io/projected/d9dc4728-d6d1-46da-a355-56e615849c42-kube-api-access-jw54h\") pod \"auto-csr-approver-29535368-lf7wr\" (UID: \"d9dc4728-d6d1-46da-a355-56e615849c42\") " pod="openshift-infra/auto-csr-approver-29535368-lf7wr" Feb 26 16:08:00 crc kubenswrapper[4907]: I0226 16:08:00.373855 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"31a81b99-dba6-4f2e-95eb-09f66cdd28df","Type":"ContainerStarted","Data":"d2425a626b27ec6af96a5fa693fc8e34180ad4f32efc0f163cf243bf34b702d4"} Feb 26 16:08:00 crc kubenswrapper[4907]: I0226 16:08:00.374379 
4907 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0" Feb 26 16:08:00 crc kubenswrapper[4907]: I0226 16:08:00.386785 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jw54h\" (UniqueName: \"kubernetes.io/projected/d9dc4728-d6d1-46da-a355-56e615849c42-kube-api-access-jw54h\") pod \"auto-csr-approver-29535368-lf7wr\" (UID: \"d9dc4728-d6d1-46da-a355-56e615849c42\") " pod="openshift-infra/auto-csr-approver-29535368-lf7wr" Feb 26 16:08:00 crc kubenswrapper[4907]: I0226 16:08:00.401468 4907 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceilometer-0" podStartSLOduration=2.167431406 podStartE2EDuration="6.401450047s" podCreationTimestamp="2026-02-26 16:07:54 +0000 UTC" firstStartedPulling="2026-02-26 16:07:55.439487212 +0000 UTC m=+1537.958049061" lastFinishedPulling="2026-02-26 16:07:59.673505853 +0000 UTC m=+1542.192067702" observedRunningTime="2026-02-26 16:08:00.395502162 +0000 UTC m=+1542.914064021" watchObservedRunningTime="2026-02-26 16:08:00.401450047 +0000 UTC m=+1542.920011896" Feb 26 16:08:00 crc kubenswrapper[4907]: I0226 16:08:00.516027 4907 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29535368-lf7wr" Feb 26 16:08:00 crc kubenswrapper[4907]: I0226 16:08:00.993034 4907 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29535368-lf7wr"] Feb 26 16:08:00 crc kubenswrapper[4907]: W0226 16:08:00.997585 4907 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podd9dc4728_d6d1_46da_a355_56e615849c42.slice/crio-f0c64b55fb761afb2454a5c7e9bd1041cbd1fa5832934347d042ef56735ba9cb WatchSource:0}: Error finding container f0c64b55fb761afb2454a5c7e9bd1041cbd1fa5832934347d042ef56735ba9cb: Status 404 returned error can't find the container with id f0c64b55fb761afb2454a5c7e9bd1041cbd1fa5832934347d042ef56735ba9cb Feb 26 16:08:01 crc kubenswrapper[4907]: I0226 16:08:01.386548 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29535368-lf7wr" event={"ID":"d9dc4728-d6d1-46da-a355-56e615849c42","Type":"ContainerStarted","Data":"f0c64b55fb761afb2454a5c7e9bd1041cbd1fa5832934347d042ef56735ba9cb"} Feb 26 16:08:02 crc kubenswrapper[4907]: I0226 16:08:02.399748 4907 generic.go:334] "Generic (PLEG): container finished" podID="f495d649-0f7a-4520-84ad-7703ad452593" containerID="14b832f80d0c811e1ff07311ab483873a14459b4798c7d0d6f09f8e79617e33d" exitCode=0 Feb 26 16:08:02 crc kubenswrapper[4907]: I0226 16:08:02.399819 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-cell-mapping-bck8j" event={"ID":"f495d649-0f7a-4520-84ad-7703ad452593","Type":"ContainerDied","Data":"14b832f80d0c811e1ff07311ab483873a14459b4798c7d0d6f09f8e79617e33d"} Feb 26 16:08:03 crc kubenswrapper[4907]: I0226 16:08:03.414502 4907 generic.go:334] "Generic (PLEG): container finished" podID="d9dc4728-d6d1-46da-a355-56e615849c42" containerID="0edc709be2aa57fb78580ed2af925f93ac7a5217c56cc184072271a2659984f1" exitCode=0 Feb 26 16:08:03 crc kubenswrapper[4907]: I0226 
16:08:03.414694 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29535368-lf7wr" event={"ID":"d9dc4728-d6d1-46da-a355-56e615849c42","Type":"ContainerDied","Data":"0edc709be2aa57fb78580ed2af925f93ac7a5217c56cc184072271a2659984f1"} Feb 26 16:08:03 crc kubenswrapper[4907]: I0226 16:08:03.826632 4907 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-cell-mapping-bck8j" Feb 26 16:08:03 crc kubenswrapper[4907]: I0226 16:08:03.954215 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f495d649-0f7a-4520-84ad-7703ad452593-config-data\") pod \"f495d649-0f7a-4520-84ad-7703ad452593\" (UID: \"f495d649-0f7a-4520-84ad-7703ad452593\") " Feb 26 16:08:03 crc kubenswrapper[4907]: I0226 16:08:03.955060 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/f495d649-0f7a-4520-84ad-7703ad452593-scripts\") pod \"f495d649-0f7a-4520-84ad-7703ad452593\" (UID: \"f495d649-0f7a-4520-84ad-7703ad452593\") " Feb 26 16:08:03 crc kubenswrapper[4907]: I0226 16:08:03.955297 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f495d649-0f7a-4520-84ad-7703ad452593-combined-ca-bundle\") pod \"f495d649-0f7a-4520-84ad-7703ad452593\" (UID: \"f495d649-0f7a-4520-84ad-7703ad452593\") " Feb 26 16:08:03 crc kubenswrapper[4907]: I0226 16:08:03.955962 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7hgcg\" (UniqueName: \"kubernetes.io/projected/f495d649-0f7a-4520-84ad-7703ad452593-kube-api-access-7hgcg\") pod \"f495d649-0f7a-4520-84ad-7703ad452593\" (UID: \"f495d649-0f7a-4520-84ad-7703ad452593\") " Feb 26 16:08:03 crc kubenswrapper[4907]: I0226 16:08:03.960358 4907 operation_generator.go:803] UnmountVolume.TearDown 
succeeded for volume "kubernetes.io/projected/f495d649-0f7a-4520-84ad-7703ad452593-kube-api-access-7hgcg" (OuterVolumeSpecName: "kube-api-access-7hgcg") pod "f495d649-0f7a-4520-84ad-7703ad452593" (UID: "f495d649-0f7a-4520-84ad-7703ad452593"). InnerVolumeSpecName "kube-api-access-7hgcg". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 26 16:08:03 crc kubenswrapper[4907]: I0226 16:08:03.964206 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f495d649-0f7a-4520-84ad-7703ad452593-scripts" (OuterVolumeSpecName: "scripts") pod "f495d649-0f7a-4520-84ad-7703ad452593" (UID: "f495d649-0f7a-4520-84ad-7703ad452593"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 26 16:08:03 crc kubenswrapper[4907]: I0226 16:08:03.990410 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f495d649-0f7a-4520-84ad-7703ad452593-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "f495d649-0f7a-4520-84ad-7703ad452593" (UID: "f495d649-0f7a-4520-84ad-7703ad452593"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 26 16:08:03 crc kubenswrapper[4907]: I0226 16:08:03.990972 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f495d649-0f7a-4520-84ad-7703ad452593-config-data" (OuterVolumeSpecName: "config-data") pod "f495d649-0f7a-4520-84ad-7703ad452593" (UID: "f495d649-0f7a-4520-84ad-7703ad452593"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 26 16:08:04 crc kubenswrapper[4907]: I0226 16:08:04.058670 4907 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/f495d649-0f7a-4520-84ad-7703ad452593-scripts\") on node \"crc\" DevicePath \"\"" Feb 26 16:08:04 crc kubenswrapper[4907]: I0226 16:08:04.058713 4907 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f495d649-0f7a-4520-84ad-7703ad452593-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 26 16:08:04 crc kubenswrapper[4907]: I0226 16:08:04.058724 4907 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7hgcg\" (UniqueName: \"kubernetes.io/projected/f495d649-0f7a-4520-84ad-7703ad452593-kube-api-access-7hgcg\") on node \"crc\" DevicePath \"\"" Feb 26 16:08:04 crc kubenswrapper[4907]: I0226 16:08:04.058733 4907 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f495d649-0f7a-4520-84ad-7703ad452593-config-data\") on node \"crc\" DevicePath \"\"" Feb 26 16:08:04 crc kubenswrapper[4907]: I0226 16:08:04.426220 4907 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-cell-mapping-bck8j" Feb 26 16:08:04 crc kubenswrapper[4907]: I0226 16:08:04.427315 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-cell-mapping-bck8j" event={"ID":"f495d649-0f7a-4520-84ad-7703ad452593","Type":"ContainerDied","Data":"a6bd020bdc2e890cf0609312f1f0d15c8f13808acd1e8209e36f1e6ee22e8fd0"} Feb 26 16:08:04 crc kubenswrapper[4907]: I0226 16:08:04.427339 4907 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="a6bd020bdc2e890cf0609312f1f0d15c8f13808acd1e8209e36f1e6ee22e8fd0" Feb 26 16:08:04 crc kubenswrapper[4907]: I0226 16:08:04.652869 4907 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-scheduler-0"] Feb 26 16:08:04 crc kubenswrapper[4907]: I0226 16:08:04.653070 4907 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-scheduler-0" podUID="162c5aed-9a98-49ed-a628-efc7c67b82a4" containerName="nova-scheduler-scheduler" containerID="cri-o://52389a7c75f19210a7ed4d721f7bfea7bf6605f9f9f2bece8685c11e13dc35a5" gracePeriod=30 Feb 26 16:08:04 crc kubenswrapper[4907]: I0226 16:08:04.730658 4907 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0"] Feb 26 16:08:04 crc kubenswrapper[4907]: I0226 16:08:04.730891 4907 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-api-0" podUID="a3dc5070-d012-48d8-b12b-2064ebeab515" containerName="nova-api-log" containerID="cri-o://be21cf892883dee07ecafe5ad17d28a3ec2865f1d72f47aaffd1357676fd7d5b" gracePeriod=30 Feb 26 16:08:04 crc kubenswrapper[4907]: I0226 16:08:04.731363 4907 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-api-0" podUID="a3dc5070-d012-48d8-b12b-2064ebeab515" containerName="nova-api-api" containerID="cri-o://8fb9e3887f82d1689589f4f1e765a4e10f20c9f3bc9c9d6a3ce74fd06a7748e7" gracePeriod=30 Feb 26 16:08:04 crc kubenswrapper[4907]: I0226 
16:08:04.769247 4907 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-metadata-0"] Feb 26 16:08:04 crc kubenswrapper[4907]: I0226 16:08:04.769499 4907 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-metadata-0" podUID="3bfed0ca-af76-4ba2-8be4-84716902175b" containerName="nova-metadata-log" containerID="cri-o://230493b78013cd174cf17a272666a1c37a2d864e3a716f3e4fa3c2dc70f27245" gracePeriod=30 Feb 26 16:08:04 crc kubenswrapper[4907]: I0226 16:08:04.769723 4907 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-metadata-0" podUID="3bfed0ca-af76-4ba2-8be4-84716902175b" containerName="nova-metadata-metadata" containerID="cri-o://d2efd4ff7763d97659db228608c15cc4b06a951aed230441f4daf616e6caf66a" gracePeriod=30 Feb 26 16:08:04 crc kubenswrapper[4907]: I0226 16:08:04.802608 4907 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29535368-lf7wr" Feb 26 16:08:04 crc kubenswrapper[4907]: I0226 16:08:04.882395 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jw54h\" (UniqueName: \"kubernetes.io/projected/d9dc4728-d6d1-46da-a355-56e615849c42-kube-api-access-jw54h\") pod \"d9dc4728-d6d1-46da-a355-56e615849c42\" (UID: \"d9dc4728-d6d1-46da-a355-56e615849c42\") " Feb 26 16:08:04 crc kubenswrapper[4907]: I0226 16:08:04.900052 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d9dc4728-d6d1-46da-a355-56e615849c42-kube-api-access-jw54h" (OuterVolumeSpecName: "kube-api-access-jw54h") pod "d9dc4728-d6d1-46da-a355-56e615849c42" (UID: "d9dc4728-d6d1-46da-a355-56e615849c42"). InnerVolumeSpecName "kube-api-access-jw54h". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 26 16:08:04 crc kubenswrapper[4907]: I0226 16:08:04.989979 4907 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jw54h\" (UniqueName: \"kubernetes.io/projected/d9dc4728-d6d1-46da-a355-56e615849c42-kube-api-access-jw54h\") on node \"crc\" DevicePath \"\"" Feb 26 16:08:05 crc kubenswrapper[4907]: I0226 16:08:05.442712 4907 generic.go:334] "Generic (PLEG): container finished" podID="a3dc5070-d012-48d8-b12b-2064ebeab515" containerID="8fb9e3887f82d1689589f4f1e765a4e10f20c9f3bc9c9d6a3ce74fd06a7748e7" exitCode=0 Feb 26 16:08:05 crc kubenswrapper[4907]: I0226 16:08:05.442771 4907 generic.go:334] "Generic (PLEG): container finished" podID="a3dc5070-d012-48d8-b12b-2064ebeab515" containerID="be21cf892883dee07ecafe5ad17d28a3ec2865f1d72f47aaffd1357676fd7d5b" exitCode=143 Feb 26 16:08:05 crc kubenswrapper[4907]: I0226 16:08:05.442809 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"a3dc5070-d012-48d8-b12b-2064ebeab515","Type":"ContainerDied","Data":"8fb9e3887f82d1689589f4f1e765a4e10f20c9f3bc9c9d6a3ce74fd06a7748e7"} Feb 26 16:08:05 crc kubenswrapper[4907]: I0226 16:08:05.442845 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"a3dc5070-d012-48d8-b12b-2064ebeab515","Type":"ContainerDied","Data":"be21cf892883dee07ecafe5ad17d28a3ec2865f1d72f47aaffd1357676fd7d5b"} Feb 26 16:08:05 crc kubenswrapper[4907]: I0226 16:08:05.444691 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29535368-lf7wr" event={"ID":"d9dc4728-d6d1-46da-a355-56e615849c42","Type":"ContainerDied","Data":"f0c64b55fb761afb2454a5c7e9bd1041cbd1fa5832934347d042ef56735ba9cb"} Feb 26 16:08:05 crc kubenswrapper[4907]: I0226 16:08:05.444711 4907 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="f0c64b55fb761afb2454a5c7e9bd1041cbd1fa5832934347d042ef56735ba9cb" Feb 26 16:08:05 crc 
kubenswrapper[4907]: I0226 16:08:05.444794 4907 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29535368-lf7wr" Feb 26 16:08:05 crc kubenswrapper[4907]: I0226 16:08:05.454252 4907 generic.go:334] "Generic (PLEG): container finished" podID="3bfed0ca-af76-4ba2-8be4-84716902175b" containerID="230493b78013cd174cf17a272666a1c37a2d864e3a716f3e4fa3c2dc70f27245" exitCode=143 Feb 26 16:08:05 crc kubenswrapper[4907]: I0226 16:08:05.454967 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"3bfed0ca-af76-4ba2-8be4-84716902175b","Type":"ContainerDied","Data":"230493b78013cd174cf17a272666a1c37a2d864e3a716f3e4fa3c2dc70f27245"} Feb 26 16:08:05 crc kubenswrapper[4907]: I0226 16:08:05.680744 4907 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Feb 26 16:08:05 crc kubenswrapper[4907]: I0226 16:08:05.809424 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/a3dc5070-d012-48d8-b12b-2064ebeab515-public-tls-certs\") pod \"a3dc5070-d012-48d8-b12b-2064ebeab515\" (UID: \"a3dc5070-d012-48d8-b12b-2064ebeab515\") " Feb 26 16:08:05 crc kubenswrapper[4907]: I0226 16:08:05.809529 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-w9fgr\" (UniqueName: \"kubernetes.io/projected/a3dc5070-d012-48d8-b12b-2064ebeab515-kube-api-access-w9fgr\") pod \"a3dc5070-d012-48d8-b12b-2064ebeab515\" (UID: \"a3dc5070-d012-48d8-b12b-2064ebeab515\") " Feb 26 16:08:05 crc kubenswrapper[4907]: I0226 16:08:05.809724 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/a3dc5070-d012-48d8-b12b-2064ebeab515-logs\") pod \"a3dc5070-d012-48d8-b12b-2064ebeab515\" (UID: \"a3dc5070-d012-48d8-b12b-2064ebeab515\") " Feb 26 16:08:05 crc 
kubenswrapper[4907]: I0226 16:08:05.809817 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/a3dc5070-d012-48d8-b12b-2064ebeab515-internal-tls-certs\") pod \"a3dc5070-d012-48d8-b12b-2064ebeab515\" (UID: \"a3dc5070-d012-48d8-b12b-2064ebeab515\") " Feb 26 16:08:05 crc kubenswrapper[4907]: I0226 16:08:05.809865 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a3dc5070-d012-48d8-b12b-2064ebeab515-combined-ca-bundle\") pod \"a3dc5070-d012-48d8-b12b-2064ebeab515\" (UID: \"a3dc5070-d012-48d8-b12b-2064ebeab515\") " Feb 26 16:08:05 crc kubenswrapper[4907]: I0226 16:08:05.809902 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a3dc5070-d012-48d8-b12b-2064ebeab515-config-data\") pod \"a3dc5070-d012-48d8-b12b-2064ebeab515\" (UID: \"a3dc5070-d012-48d8-b12b-2064ebeab515\") " Feb 26 16:08:05 crc kubenswrapper[4907]: I0226 16:08:05.810718 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/a3dc5070-d012-48d8-b12b-2064ebeab515-logs" (OuterVolumeSpecName: "logs") pod "a3dc5070-d012-48d8-b12b-2064ebeab515" (UID: "a3dc5070-d012-48d8-b12b-2064ebeab515"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 26 16:08:05 crc kubenswrapper[4907]: I0226 16:08:05.839892 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a3dc5070-d012-48d8-b12b-2064ebeab515-kube-api-access-w9fgr" (OuterVolumeSpecName: "kube-api-access-w9fgr") pod "a3dc5070-d012-48d8-b12b-2064ebeab515" (UID: "a3dc5070-d012-48d8-b12b-2064ebeab515"). InnerVolumeSpecName "kube-api-access-w9fgr". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 26 16:08:05 crc kubenswrapper[4907]: I0226 16:08:05.890016 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a3dc5070-d012-48d8-b12b-2064ebeab515-config-data" (OuterVolumeSpecName: "config-data") pod "a3dc5070-d012-48d8-b12b-2064ebeab515" (UID: "a3dc5070-d012-48d8-b12b-2064ebeab515"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 26 16:08:05 crc kubenswrapper[4907]: I0226 16:08:05.903806 4907 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29535362-gch27"] Feb 26 16:08:05 crc kubenswrapper[4907]: I0226 16:08:05.906532 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a3dc5070-d012-48d8-b12b-2064ebeab515-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "a3dc5070-d012-48d8-b12b-2064ebeab515" (UID: "a3dc5070-d012-48d8-b12b-2064ebeab515"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 26 16:08:05 crc kubenswrapper[4907]: I0226 16:08:05.912219 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a3dc5070-d012-48d8-b12b-2064ebeab515-internal-tls-certs" (OuterVolumeSpecName: "internal-tls-certs") pod "a3dc5070-d012-48d8-b12b-2064ebeab515" (UID: "a3dc5070-d012-48d8-b12b-2064ebeab515"). InnerVolumeSpecName "internal-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 26 16:08:05 crc kubenswrapper[4907]: I0226 16:08:05.912440 4907 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a3dc5070-d012-48d8-b12b-2064ebeab515-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 26 16:08:05 crc kubenswrapper[4907]: I0226 16:08:05.912468 4907 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a3dc5070-d012-48d8-b12b-2064ebeab515-config-data\") on node \"crc\" DevicePath \"\"" Feb 26 16:08:05 crc kubenswrapper[4907]: I0226 16:08:05.912480 4907 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-w9fgr\" (UniqueName: \"kubernetes.io/projected/a3dc5070-d012-48d8-b12b-2064ebeab515-kube-api-access-w9fgr\") on node \"crc\" DevicePath \"\"" Feb 26 16:08:05 crc kubenswrapper[4907]: I0226 16:08:05.912493 4907 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/a3dc5070-d012-48d8-b12b-2064ebeab515-logs\") on node \"crc\" DevicePath \"\"" Feb 26 16:08:05 crc kubenswrapper[4907]: I0226 16:08:05.912504 4907 reconciler_common.go:293] "Volume detached for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/a3dc5070-d012-48d8-b12b-2064ebeab515-internal-tls-certs\") on node \"crc\" DevicePath \"\"" Feb 26 16:08:05 crc kubenswrapper[4907]: I0226 16:08:05.916914 4907 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29535362-gch27"] Feb 26 16:08:05 crc kubenswrapper[4907]: I0226 16:08:05.923837 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a3dc5070-d012-48d8-b12b-2064ebeab515-public-tls-certs" (OuterVolumeSpecName: "public-tls-certs") pod "a3dc5070-d012-48d8-b12b-2064ebeab515" (UID: "a3dc5070-d012-48d8-b12b-2064ebeab515"). InnerVolumeSpecName "public-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 26 16:08:06 crc kubenswrapper[4907]: I0226 16:08:06.014664 4907 reconciler_common.go:293] "Volume detached for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/a3dc5070-d012-48d8-b12b-2064ebeab515-public-tls-certs\") on node \"crc\" DevicePath \"\"" Feb 26 16:08:06 crc kubenswrapper[4907]: I0226 16:08:06.143799 4907 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1a0a1c21-7e2d-4053-b478-d6c6387f88d5" path="/var/lib/kubelet/pods/1a0a1c21-7e2d-4053-b478-d6c6387f88d5/volumes" Feb 26 16:08:06 crc kubenswrapper[4907]: E0226 16:08:06.317520 4907 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="52389a7c75f19210a7ed4d721f7bfea7bf6605f9f9f2bece8685c11e13dc35a5" cmd=["/usr/bin/pgrep","-r","DRST","nova-scheduler"] Feb 26 16:08:06 crc kubenswrapper[4907]: E0226 16:08:06.319049 4907 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="52389a7c75f19210a7ed4d721f7bfea7bf6605f9f9f2bece8685c11e13dc35a5" cmd=["/usr/bin/pgrep","-r","DRST","nova-scheduler"] Feb 26 16:08:06 crc kubenswrapper[4907]: E0226 16:08:06.320334 4907 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="52389a7c75f19210a7ed4d721f7bfea7bf6605f9f9f2bece8685c11e13dc35a5" cmd=["/usr/bin/pgrep","-r","DRST","nova-scheduler"] Feb 26 16:08:06 crc kubenswrapper[4907]: E0226 16:08:06.320389 4907 prober.go:104] "Probe errored" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" 
probeType="Readiness" pod="openstack/nova-scheduler-0" podUID="162c5aed-9a98-49ed-a628-efc7c67b82a4" containerName="nova-scheduler-scheduler" Feb 26 16:08:06 crc kubenswrapper[4907]: I0226 16:08:06.466083 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"a3dc5070-d012-48d8-b12b-2064ebeab515","Type":"ContainerDied","Data":"e35d11b063f9349a8dd0432ff0f90f036a966d4216dd6053a97938971a0bb856"} Feb 26 16:08:06 crc kubenswrapper[4907]: I0226 16:08:06.466154 4907 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Feb 26 16:08:06 crc kubenswrapper[4907]: I0226 16:08:06.466164 4907 scope.go:117] "RemoveContainer" containerID="8fb9e3887f82d1689589f4f1e765a4e10f20c9f3bc9c9d6a3ce74fd06a7748e7" Feb 26 16:08:06 crc kubenswrapper[4907]: I0226 16:08:06.603369 4907 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0"] Feb 26 16:08:06 crc kubenswrapper[4907]: I0226 16:08:06.621099 4907 scope.go:117] "RemoveContainer" containerID="be21cf892883dee07ecafe5ad17d28a3ec2865f1d72f47aaffd1357676fd7d5b" Feb 26 16:08:06 crc kubenswrapper[4907]: I0226 16:08:06.628363 4907 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-api-0"] Feb 26 16:08:06 crc kubenswrapper[4907]: I0226 16:08:06.637113 4907 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-api-0"] Feb 26 16:08:06 crc kubenswrapper[4907]: E0226 16:08:06.637498 4907 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a3dc5070-d012-48d8-b12b-2064ebeab515" containerName="nova-api-log" Feb 26 16:08:06 crc kubenswrapper[4907]: I0226 16:08:06.637514 4907 state_mem.go:107] "Deleted CPUSet assignment" podUID="a3dc5070-d012-48d8-b12b-2064ebeab515" containerName="nova-api-log" Feb 26 16:08:06 crc kubenswrapper[4907]: E0226 16:08:06.637537 4907 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f495d649-0f7a-4520-84ad-7703ad452593" containerName="nova-manage" Feb 26 
16:08:06 crc kubenswrapper[4907]: I0226 16:08:06.637543 4907 state_mem.go:107] "Deleted CPUSet assignment" podUID="f495d649-0f7a-4520-84ad-7703ad452593" containerName="nova-manage" Feb 26 16:08:06 crc kubenswrapper[4907]: E0226 16:08:06.637562 4907 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a3dc5070-d012-48d8-b12b-2064ebeab515" containerName="nova-api-api" Feb 26 16:08:06 crc kubenswrapper[4907]: I0226 16:08:06.637569 4907 state_mem.go:107] "Deleted CPUSet assignment" podUID="a3dc5070-d012-48d8-b12b-2064ebeab515" containerName="nova-api-api" Feb 26 16:08:06 crc kubenswrapper[4907]: E0226 16:08:06.637625 4907 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d9dc4728-d6d1-46da-a355-56e615849c42" containerName="oc" Feb 26 16:08:06 crc kubenswrapper[4907]: I0226 16:08:06.637631 4907 state_mem.go:107] "Deleted CPUSet assignment" podUID="d9dc4728-d6d1-46da-a355-56e615849c42" containerName="oc" Feb 26 16:08:06 crc kubenswrapper[4907]: I0226 16:08:06.637787 4907 memory_manager.go:354] "RemoveStaleState removing state" podUID="d9dc4728-d6d1-46da-a355-56e615849c42" containerName="oc" Feb 26 16:08:06 crc kubenswrapper[4907]: I0226 16:08:06.637799 4907 memory_manager.go:354] "RemoveStaleState removing state" podUID="a3dc5070-d012-48d8-b12b-2064ebeab515" containerName="nova-api-api" Feb 26 16:08:06 crc kubenswrapper[4907]: I0226 16:08:06.637806 4907 memory_manager.go:354] "RemoveStaleState removing state" podUID="f495d649-0f7a-4520-84ad-7703ad452593" containerName="nova-manage" Feb 26 16:08:06 crc kubenswrapper[4907]: I0226 16:08:06.637821 4907 memory_manager.go:354] "RemoveStaleState removing state" podUID="a3dc5070-d012-48d8-b12b-2064ebeab515" containerName="nova-api-log" Feb 26 16:08:06 crc kubenswrapper[4907]: I0226 16:08:06.638746 4907 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-0" Feb 26 16:08:06 crc kubenswrapper[4907]: I0226 16:08:06.643175 4907 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-api-config-data" Feb 26 16:08:06 crc kubenswrapper[4907]: I0226 16:08:06.643391 4907 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-public-svc" Feb 26 16:08:06 crc kubenswrapper[4907]: I0226 16:08:06.643442 4907 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-internal-svc" Feb 26 16:08:06 crc kubenswrapper[4907]: I0226 16:08:06.659583 4907 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Feb 26 16:08:06 crc kubenswrapper[4907]: I0226 16:08:06.736730 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/674c61cb-49ef-4710-b83f-0374acf42f6a-logs\") pod \"nova-api-0\" (UID: \"674c61cb-49ef-4710-b83f-0374acf42f6a\") " pod="openstack/nova-api-0" Feb 26 16:08:06 crc kubenswrapper[4907]: I0226 16:08:06.736820 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/674c61cb-49ef-4710-b83f-0374acf42f6a-internal-tls-certs\") pod \"nova-api-0\" (UID: \"674c61cb-49ef-4710-b83f-0374acf42f6a\") " pod="openstack/nova-api-0" Feb 26 16:08:06 crc kubenswrapper[4907]: I0226 16:08:06.736865 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/674c61cb-49ef-4710-b83f-0374acf42f6a-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"674c61cb-49ef-4710-b83f-0374acf42f6a\") " pod="openstack/nova-api-0" Feb 26 16:08:06 crc kubenswrapper[4907]: I0226 16:08:06.736910 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: 
\"kubernetes.io/secret/674c61cb-49ef-4710-b83f-0374acf42f6a-config-data\") pod \"nova-api-0\" (UID: \"674c61cb-49ef-4710-b83f-0374acf42f6a\") " pod="openstack/nova-api-0" Feb 26 16:08:06 crc kubenswrapper[4907]: I0226 16:08:06.736979 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9x2x8\" (UniqueName: \"kubernetes.io/projected/674c61cb-49ef-4710-b83f-0374acf42f6a-kube-api-access-9x2x8\") pod \"nova-api-0\" (UID: \"674c61cb-49ef-4710-b83f-0374acf42f6a\") " pod="openstack/nova-api-0" Feb 26 16:08:06 crc kubenswrapper[4907]: I0226 16:08:06.737007 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/674c61cb-49ef-4710-b83f-0374acf42f6a-public-tls-certs\") pod \"nova-api-0\" (UID: \"674c61cb-49ef-4710-b83f-0374acf42f6a\") " pod="openstack/nova-api-0" Feb 26 16:08:06 crc kubenswrapper[4907]: I0226 16:08:06.748437 4907 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-scheduler-0" Feb 26 16:08:06 crc kubenswrapper[4907]: I0226 16:08:06.840944 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-bt5z5\" (UniqueName: \"kubernetes.io/projected/162c5aed-9a98-49ed-a628-efc7c67b82a4-kube-api-access-bt5z5\") pod \"162c5aed-9a98-49ed-a628-efc7c67b82a4\" (UID: \"162c5aed-9a98-49ed-a628-efc7c67b82a4\") " Feb 26 16:08:06 crc kubenswrapper[4907]: I0226 16:08:06.841185 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/162c5aed-9a98-49ed-a628-efc7c67b82a4-combined-ca-bundle\") pod \"162c5aed-9a98-49ed-a628-efc7c67b82a4\" (UID: \"162c5aed-9a98-49ed-a628-efc7c67b82a4\") " Feb 26 16:08:06 crc kubenswrapper[4907]: I0226 16:08:06.841372 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/162c5aed-9a98-49ed-a628-efc7c67b82a4-config-data\") pod \"162c5aed-9a98-49ed-a628-efc7c67b82a4\" (UID: \"162c5aed-9a98-49ed-a628-efc7c67b82a4\") " Feb 26 16:08:06 crc kubenswrapper[4907]: I0226 16:08:06.841724 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/674c61cb-49ef-4710-b83f-0374acf42f6a-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"674c61cb-49ef-4710-b83f-0374acf42f6a\") " pod="openstack/nova-api-0" Feb 26 16:08:06 crc kubenswrapper[4907]: I0226 16:08:06.841786 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/674c61cb-49ef-4710-b83f-0374acf42f6a-config-data\") pod \"nova-api-0\" (UID: \"674c61cb-49ef-4710-b83f-0374acf42f6a\") " pod="openstack/nova-api-0" Feb 26 16:08:06 crc kubenswrapper[4907]: I0226 16:08:06.841867 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"kube-api-access-9x2x8\" (UniqueName: \"kubernetes.io/projected/674c61cb-49ef-4710-b83f-0374acf42f6a-kube-api-access-9x2x8\") pod \"nova-api-0\" (UID: \"674c61cb-49ef-4710-b83f-0374acf42f6a\") " pod="openstack/nova-api-0" Feb 26 16:08:06 crc kubenswrapper[4907]: I0226 16:08:06.841897 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/674c61cb-49ef-4710-b83f-0374acf42f6a-public-tls-certs\") pod \"nova-api-0\" (UID: \"674c61cb-49ef-4710-b83f-0374acf42f6a\") " pod="openstack/nova-api-0" Feb 26 16:08:06 crc kubenswrapper[4907]: I0226 16:08:06.841941 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/674c61cb-49ef-4710-b83f-0374acf42f6a-logs\") pod \"nova-api-0\" (UID: \"674c61cb-49ef-4710-b83f-0374acf42f6a\") " pod="openstack/nova-api-0" Feb 26 16:08:06 crc kubenswrapper[4907]: I0226 16:08:06.842006 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/674c61cb-49ef-4710-b83f-0374acf42f6a-internal-tls-certs\") pod \"nova-api-0\" (UID: \"674c61cb-49ef-4710-b83f-0374acf42f6a\") " pod="openstack/nova-api-0" Feb 26 16:08:06 crc kubenswrapper[4907]: I0226 16:08:06.844890 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/674c61cb-49ef-4710-b83f-0374acf42f6a-logs\") pod \"nova-api-0\" (UID: \"674c61cb-49ef-4710-b83f-0374acf42f6a\") " pod="openstack/nova-api-0" Feb 26 16:08:06 crc kubenswrapper[4907]: I0226 16:08:06.850279 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/162c5aed-9a98-49ed-a628-efc7c67b82a4-kube-api-access-bt5z5" (OuterVolumeSpecName: "kube-api-access-bt5z5") pod "162c5aed-9a98-49ed-a628-efc7c67b82a4" (UID: "162c5aed-9a98-49ed-a628-efc7c67b82a4"). InnerVolumeSpecName "kube-api-access-bt5z5". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 26 16:08:06 crc kubenswrapper[4907]: I0226 16:08:06.851367 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/674c61cb-49ef-4710-b83f-0374acf42f6a-public-tls-certs\") pod \"nova-api-0\" (UID: \"674c61cb-49ef-4710-b83f-0374acf42f6a\") " pod="openstack/nova-api-0" Feb 26 16:08:06 crc kubenswrapper[4907]: I0226 16:08:06.853686 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/674c61cb-49ef-4710-b83f-0374acf42f6a-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"674c61cb-49ef-4710-b83f-0374acf42f6a\") " pod="openstack/nova-api-0" Feb 26 16:08:06 crc kubenswrapper[4907]: I0226 16:08:06.854245 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/674c61cb-49ef-4710-b83f-0374acf42f6a-internal-tls-certs\") pod \"nova-api-0\" (UID: \"674c61cb-49ef-4710-b83f-0374acf42f6a\") " pod="openstack/nova-api-0" Feb 26 16:08:06 crc kubenswrapper[4907]: I0226 16:08:06.860279 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/674c61cb-49ef-4710-b83f-0374acf42f6a-config-data\") pod \"nova-api-0\" (UID: \"674c61cb-49ef-4710-b83f-0374acf42f6a\") " pod="openstack/nova-api-0" Feb 26 16:08:06 crc kubenswrapper[4907]: I0226 16:08:06.868069 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9x2x8\" (UniqueName: \"kubernetes.io/projected/674c61cb-49ef-4710-b83f-0374acf42f6a-kube-api-access-9x2x8\") pod \"nova-api-0\" (UID: \"674c61cb-49ef-4710-b83f-0374acf42f6a\") " pod="openstack/nova-api-0" Feb 26 16:08:06 crc kubenswrapper[4907]: I0226 16:08:06.875126 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/secret/162c5aed-9a98-49ed-a628-efc7c67b82a4-config-data" (OuterVolumeSpecName: "config-data") pod "162c5aed-9a98-49ed-a628-efc7c67b82a4" (UID: "162c5aed-9a98-49ed-a628-efc7c67b82a4"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 26 16:08:06 crc kubenswrapper[4907]: I0226 16:08:06.886750 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/162c5aed-9a98-49ed-a628-efc7c67b82a4-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "162c5aed-9a98-49ed-a628-efc7c67b82a4" (UID: "162c5aed-9a98-49ed-a628-efc7c67b82a4"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 26 16:08:06 crc kubenswrapper[4907]: I0226 16:08:06.945184 4907 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/162c5aed-9a98-49ed-a628-efc7c67b82a4-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 26 16:08:06 crc kubenswrapper[4907]: I0226 16:08:06.945235 4907 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/162c5aed-9a98-49ed-a628-efc7c67b82a4-config-data\") on node \"crc\" DevicePath \"\"" Feb 26 16:08:06 crc kubenswrapper[4907]: I0226 16:08:06.945247 4907 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-bt5z5\" (UniqueName: \"kubernetes.io/projected/162c5aed-9a98-49ed-a628-efc7c67b82a4-kube-api-access-bt5z5\") on node \"crc\" DevicePath \"\"" Feb 26 16:08:06 crc kubenswrapper[4907]: I0226 16:08:06.975464 4907 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-0" Feb 26 16:08:07 crc kubenswrapper[4907]: W0226 16:08:07.455292 4907 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod674c61cb_49ef_4710_b83f_0374acf42f6a.slice/crio-7e16c2f50909a5982176b65b5e7320aacd4c779153e43e4acdb3946ee1e062aa WatchSource:0}: Error finding container 7e16c2f50909a5982176b65b5e7320aacd4c779153e43e4acdb3946ee1e062aa: Status 404 returned error can't find the container with id 7e16c2f50909a5982176b65b5e7320aacd4c779153e43e4acdb3946ee1e062aa Feb 26 16:08:07 crc kubenswrapper[4907]: I0226 16:08:07.458433 4907 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Feb 26 16:08:07 crc kubenswrapper[4907]: I0226 16:08:07.476944 4907 generic.go:334] "Generic (PLEG): container finished" podID="162c5aed-9a98-49ed-a628-efc7c67b82a4" containerID="52389a7c75f19210a7ed4d721f7bfea7bf6605f9f9f2bece8685c11e13dc35a5" exitCode=0 Feb 26 16:08:07 crc kubenswrapper[4907]: I0226 16:08:07.477016 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"162c5aed-9a98-49ed-a628-efc7c67b82a4","Type":"ContainerDied","Data":"52389a7c75f19210a7ed4d721f7bfea7bf6605f9f9f2bece8685c11e13dc35a5"} Feb 26 16:08:07 crc kubenswrapper[4907]: I0226 16:08:07.477062 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"162c5aed-9a98-49ed-a628-efc7c67b82a4","Type":"ContainerDied","Data":"46e008a70d3de709d410ef6bc0acbcd17f73ac5fd21ce54e7dd6fc3d2b5088e8"} Feb 26 16:08:07 crc kubenswrapper[4907]: I0226 16:08:07.477085 4907 scope.go:117] "RemoveContainer" containerID="52389a7c75f19210a7ed4d721f7bfea7bf6605f9f9f2bece8685c11e13dc35a5" Feb 26 16:08:07 crc kubenswrapper[4907]: I0226 16:08:07.477079 4907 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-scheduler-0" Feb 26 16:08:07 crc kubenswrapper[4907]: I0226 16:08:07.478464 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"674c61cb-49ef-4710-b83f-0374acf42f6a","Type":"ContainerStarted","Data":"7e16c2f50909a5982176b65b5e7320aacd4c779153e43e4acdb3946ee1e062aa"} Feb 26 16:08:07 crc kubenswrapper[4907]: I0226 16:08:07.499123 4907 scope.go:117] "RemoveContainer" containerID="52389a7c75f19210a7ed4d721f7bfea7bf6605f9f9f2bece8685c11e13dc35a5" Feb 26 16:08:07 crc kubenswrapper[4907]: E0226 16:08:07.499444 4907 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"52389a7c75f19210a7ed4d721f7bfea7bf6605f9f9f2bece8685c11e13dc35a5\": container with ID starting with 52389a7c75f19210a7ed4d721f7bfea7bf6605f9f9f2bece8685c11e13dc35a5 not found: ID does not exist" containerID="52389a7c75f19210a7ed4d721f7bfea7bf6605f9f9f2bece8685c11e13dc35a5" Feb 26 16:08:07 crc kubenswrapper[4907]: I0226 16:08:07.499472 4907 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"52389a7c75f19210a7ed4d721f7bfea7bf6605f9f9f2bece8685c11e13dc35a5"} err="failed to get container status \"52389a7c75f19210a7ed4d721f7bfea7bf6605f9f9f2bece8685c11e13dc35a5\": rpc error: code = NotFound desc = could not find container \"52389a7c75f19210a7ed4d721f7bfea7bf6605f9f9f2bece8685c11e13dc35a5\": container with ID starting with 52389a7c75f19210a7ed4d721f7bfea7bf6605f9f9f2bece8685c11e13dc35a5 not found: ID does not exist" Feb 26 16:08:07 crc kubenswrapper[4907]: I0226 16:08:07.523025 4907 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-scheduler-0"] Feb 26 16:08:07 crc kubenswrapper[4907]: I0226 16:08:07.536147 4907 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-scheduler-0"] Feb 26 16:08:07 crc kubenswrapper[4907]: I0226 16:08:07.549901 4907 kubelet.go:2421] "SyncLoop ADD" 
source="api" pods=["openstack/nova-scheduler-0"] Feb 26 16:08:07 crc kubenswrapper[4907]: E0226 16:08:07.550404 4907 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="162c5aed-9a98-49ed-a628-efc7c67b82a4" containerName="nova-scheduler-scheduler" Feb 26 16:08:07 crc kubenswrapper[4907]: I0226 16:08:07.550431 4907 state_mem.go:107] "Deleted CPUSet assignment" podUID="162c5aed-9a98-49ed-a628-efc7c67b82a4" containerName="nova-scheduler-scheduler" Feb 26 16:08:07 crc kubenswrapper[4907]: I0226 16:08:07.550732 4907 memory_manager.go:354] "RemoveStaleState removing state" podUID="162c5aed-9a98-49ed-a628-efc7c67b82a4" containerName="nova-scheduler-scheduler" Feb 26 16:08:07 crc kubenswrapper[4907]: I0226 16:08:07.551421 4907 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0" Feb 26 16:08:07 crc kubenswrapper[4907]: I0226 16:08:07.553344 4907 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-scheduler-config-data" Feb 26 16:08:07 crc kubenswrapper[4907]: I0226 16:08:07.565100 4907 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-scheduler-0"] Feb 26 16:08:07 crc kubenswrapper[4907]: I0226 16:08:07.569325 4907 scope.go:117] "RemoveContainer" containerID="35b2e94e854ed9ebd2c97ada2a337dec8a20b1c6c7318b2961b0aeccfa450544" Feb 26 16:08:07 crc kubenswrapper[4907]: I0226 16:08:07.661441 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c994f627-1f02-468c-9651-19ac6a8728b4-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"c994f627-1f02-468c-9651-19ac6a8728b4\") " pod="openstack/nova-scheduler-0" Feb 26 16:08:07 crc kubenswrapper[4907]: I0226 16:08:07.662002 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7fvmr\" (UniqueName: 
\"kubernetes.io/projected/c994f627-1f02-468c-9651-19ac6a8728b4-kube-api-access-7fvmr\") pod \"nova-scheduler-0\" (UID: \"c994f627-1f02-468c-9651-19ac6a8728b4\") " pod="openstack/nova-scheduler-0" Feb 26 16:08:07 crc kubenswrapper[4907]: I0226 16:08:07.662147 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c994f627-1f02-468c-9651-19ac6a8728b4-config-data\") pod \"nova-scheduler-0\" (UID: \"c994f627-1f02-468c-9651-19ac6a8728b4\") " pod="openstack/nova-scheduler-0" Feb 26 16:08:07 crc kubenswrapper[4907]: I0226 16:08:07.766030 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c994f627-1f02-468c-9651-19ac6a8728b4-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"c994f627-1f02-468c-9651-19ac6a8728b4\") " pod="openstack/nova-scheduler-0" Feb 26 16:08:07 crc kubenswrapper[4907]: I0226 16:08:07.766102 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7fvmr\" (UniqueName: \"kubernetes.io/projected/c994f627-1f02-468c-9651-19ac6a8728b4-kube-api-access-7fvmr\") pod \"nova-scheduler-0\" (UID: \"c994f627-1f02-468c-9651-19ac6a8728b4\") " pod="openstack/nova-scheduler-0" Feb 26 16:08:07 crc kubenswrapper[4907]: I0226 16:08:07.766133 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c994f627-1f02-468c-9651-19ac6a8728b4-config-data\") pod \"nova-scheduler-0\" (UID: \"c994f627-1f02-468c-9651-19ac6a8728b4\") " pod="openstack/nova-scheduler-0" Feb 26 16:08:07 crc kubenswrapper[4907]: I0226 16:08:07.769926 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c994f627-1f02-468c-9651-19ac6a8728b4-config-data\") pod \"nova-scheduler-0\" (UID: \"c994f627-1f02-468c-9651-19ac6a8728b4\") " 
pod="openstack/nova-scheduler-0" Feb 26 16:08:07 crc kubenswrapper[4907]: I0226 16:08:07.772365 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c994f627-1f02-468c-9651-19ac6a8728b4-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"c994f627-1f02-468c-9651-19ac6a8728b4\") " pod="openstack/nova-scheduler-0" Feb 26 16:08:07 crc kubenswrapper[4907]: I0226 16:08:07.786959 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7fvmr\" (UniqueName: \"kubernetes.io/projected/c994f627-1f02-468c-9651-19ac6a8728b4-kube-api-access-7fvmr\") pod \"nova-scheduler-0\" (UID: \"c994f627-1f02-468c-9651-19ac6a8728b4\") " pod="openstack/nova-scheduler-0" Feb 26 16:08:07 crc kubenswrapper[4907]: I0226 16:08:07.896309 4907 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0" Feb 26 16:08:08 crc kubenswrapper[4907]: I0226 16:08:08.122814 4907 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/nova-metadata-0" podUID="3bfed0ca-af76-4ba2-8be4-84716902175b" containerName="nova-metadata-metadata" probeResult="failure" output="Get \"https://10.217.0.200:8775/\": dial tcp 10.217.0.200:8775: connect: connection refused" Feb 26 16:08:08 crc kubenswrapper[4907]: I0226 16:08:08.122814 4907 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/nova-metadata-0" podUID="3bfed0ca-af76-4ba2-8be4-84716902175b" containerName="nova-metadata-log" probeResult="failure" output="Get \"https://10.217.0.200:8775/\": dial tcp 10.217.0.200:8775: connect: connection refused" Feb 26 16:08:08 crc kubenswrapper[4907]: I0226 16:08:08.145795 4907 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="162c5aed-9a98-49ed-a628-efc7c67b82a4" path="/var/lib/kubelet/pods/162c5aed-9a98-49ed-a628-efc7c67b82a4/volumes" Feb 26 16:08:08 crc kubenswrapper[4907]: I0226 16:08:08.146502 4907 kubelet_volumes.go:163] 
"Cleaned up orphaned pod volumes dir" podUID="a3dc5070-d012-48d8-b12b-2064ebeab515" path="/var/lib/kubelet/pods/a3dc5070-d012-48d8-b12b-2064ebeab515/volumes" Feb 26 16:08:08 crc kubenswrapper[4907]: I0226 16:08:08.416128 4907 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-scheduler-0"] Feb 26 16:08:08 crc kubenswrapper[4907]: I0226 16:08:08.481028 4907 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Feb 26 16:08:08 crc kubenswrapper[4907]: I0226 16:08:08.535128 4907 generic.go:334] "Generic (PLEG): container finished" podID="3bfed0ca-af76-4ba2-8be4-84716902175b" containerID="d2efd4ff7763d97659db228608c15cc4b06a951aed230441f4daf616e6caf66a" exitCode=0 Feb 26 16:08:08 crc kubenswrapper[4907]: I0226 16:08:08.535213 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"3bfed0ca-af76-4ba2-8be4-84716902175b","Type":"ContainerDied","Data":"d2efd4ff7763d97659db228608c15cc4b06a951aed230441f4daf616e6caf66a"} Feb 26 16:08:08 crc kubenswrapper[4907]: I0226 16:08:08.535243 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"3bfed0ca-af76-4ba2-8be4-84716902175b","Type":"ContainerDied","Data":"81f061cda32d1af272950b98a178ad698e358a6c9bffed3c34c646d2cfbbbe68"} Feb 26 16:08:08 crc kubenswrapper[4907]: I0226 16:08:08.535272 4907 scope.go:117] "RemoveContainer" containerID="d2efd4ff7763d97659db228608c15cc4b06a951aed230441f4daf616e6caf66a" Feb 26 16:08:08 crc kubenswrapper[4907]: I0226 16:08:08.535846 4907 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-metadata-0" Feb 26 16:08:08 crc kubenswrapper[4907]: I0226 16:08:08.540044 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"674c61cb-49ef-4710-b83f-0374acf42f6a","Type":"ContainerStarted","Data":"c1316fa523fe181589e7d432d393dc4ad8f58c507c7e408f5196a11bea6d3cad"} Feb 26 16:08:08 crc kubenswrapper[4907]: I0226 16:08:08.540082 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"674c61cb-49ef-4710-b83f-0374acf42f6a","Type":"ContainerStarted","Data":"63f994fbf588d37f21ed8ed7104abaa4c0955a2158c3c3ae099c548a2c818ab6"} Feb 26 16:08:08 crc kubenswrapper[4907]: I0226 16:08:08.542286 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"c994f627-1f02-468c-9651-19ac6a8728b4","Type":"ContainerStarted","Data":"785ed3cd35b2afcfd08248d95be422abd7f915c60ae023303202dbb9a23cad12"} Feb 26 16:08:08 crc kubenswrapper[4907]: I0226 16:08:08.565512 4907 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-api-0" podStartSLOduration=2.565498386 podStartE2EDuration="2.565498386s" podCreationTimestamp="2026-02-26 16:08:06 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-26 16:08:08.563622909 +0000 UTC m=+1551.082184868" watchObservedRunningTime="2026-02-26 16:08:08.565498386 +0000 UTC m=+1551.084060235" Feb 26 16:08:08 crc kubenswrapper[4907]: I0226 16:08:08.587397 4907 scope.go:117] "RemoveContainer" containerID="230493b78013cd174cf17a272666a1c37a2d864e3a716f3e4fa3c2dc70f27245" Feb 26 16:08:08 crc kubenswrapper[4907]: I0226 16:08:08.603375 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/3bfed0ca-af76-4ba2-8be4-84716902175b-nova-metadata-tls-certs\") pod 
\"3bfed0ca-af76-4ba2-8be4-84716902175b\" (UID: \"3bfed0ca-af76-4ba2-8be4-84716902175b\") " Feb 26 16:08:08 crc kubenswrapper[4907]: I0226 16:08:08.603789 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-gwvf9\" (UniqueName: \"kubernetes.io/projected/3bfed0ca-af76-4ba2-8be4-84716902175b-kube-api-access-gwvf9\") pod \"3bfed0ca-af76-4ba2-8be4-84716902175b\" (UID: \"3bfed0ca-af76-4ba2-8be4-84716902175b\") " Feb 26 16:08:08 crc kubenswrapper[4907]: I0226 16:08:08.603858 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3bfed0ca-af76-4ba2-8be4-84716902175b-combined-ca-bundle\") pod \"3bfed0ca-af76-4ba2-8be4-84716902175b\" (UID: \"3bfed0ca-af76-4ba2-8be4-84716902175b\") " Feb 26 16:08:08 crc kubenswrapper[4907]: I0226 16:08:08.603902 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3bfed0ca-af76-4ba2-8be4-84716902175b-config-data\") pod \"3bfed0ca-af76-4ba2-8be4-84716902175b\" (UID: \"3bfed0ca-af76-4ba2-8be4-84716902175b\") " Feb 26 16:08:08 crc kubenswrapper[4907]: I0226 16:08:08.603940 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/3bfed0ca-af76-4ba2-8be4-84716902175b-logs\") pod \"3bfed0ca-af76-4ba2-8be4-84716902175b\" (UID: \"3bfed0ca-af76-4ba2-8be4-84716902175b\") " Feb 26 16:08:08 crc kubenswrapper[4907]: I0226 16:08:08.607142 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/3bfed0ca-af76-4ba2-8be4-84716902175b-logs" (OuterVolumeSpecName: "logs") pod "3bfed0ca-af76-4ba2-8be4-84716902175b" (UID: "3bfed0ca-af76-4ba2-8be4-84716902175b"). InnerVolumeSpecName "logs". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 26 16:08:08 crc kubenswrapper[4907]: I0226 16:08:08.607632 4907 scope.go:117] "RemoveContainer" containerID="d2efd4ff7763d97659db228608c15cc4b06a951aed230441f4daf616e6caf66a" Feb 26 16:08:08 crc kubenswrapper[4907]: E0226 16:08:08.610052 4907 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d2efd4ff7763d97659db228608c15cc4b06a951aed230441f4daf616e6caf66a\": container with ID starting with d2efd4ff7763d97659db228608c15cc4b06a951aed230441f4daf616e6caf66a not found: ID does not exist" containerID="d2efd4ff7763d97659db228608c15cc4b06a951aed230441f4daf616e6caf66a" Feb 26 16:08:08 crc kubenswrapper[4907]: I0226 16:08:08.610098 4907 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d2efd4ff7763d97659db228608c15cc4b06a951aed230441f4daf616e6caf66a"} err="failed to get container status \"d2efd4ff7763d97659db228608c15cc4b06a951aed230441f4daf616e6caf66a\": rpc error: code = NotFound desc = could not find container \"d2efd4ff7763d97659db228608c15cc4b06a951aed230441f4daf616e6caf66a\": container with ID starting with d2efd4ff7763d97659db228608c15cc4b06a951aed230441f4daf616e6caf66a not found: ID does not exist" Feb 26 16:08:08 crc kubenswrapper[4907]: I0226 16:08:08.610117 4907 scope.go:117] "RemoveContainer" containerID="230493b78013cd174cf17a272666a1c37a2d864e3a716f3e4fa3c2dc70f27245" Feb 26 16:08:08 crc kubenswrapper[4907]: E0226 16:08:08.610431 4907 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"230493b78013cd174cf17a272666a1c37a2d864e3a716f3e4fa3c2dc70f27245\": container with ID starting with 230493b78013cd174cf17a272666a1c37a2d864e3a716f3e4fa3c2dc70f27245 not found: ID does not exist" containerID="230493b78013cd174cf17a272666a1c37a2d864e3a716f3e4fa3c2dc70f27245" Feb 26 16:08:08 crc kubenswrapper[4907]: I0226 16:08:08.610450 
4907 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"230493b78013cd174cf17a272666a1c37a2d864e3a716f3e4fa3c2dc70f27245"} err="failed to get container status \"230493b78013cd174cf17a272666a1c37a2d864e3a716f3e4fa3c2dc70f27245\": rpc error: code = NotFound desc = could not find container \"230493b78013cd174cf17a272666a1c37a2d864e3a716f3e4fa3c2dc70f27245\": container with ID starting with 230493b78013cd174cf17a272666a1c37a2d864e3a716f3e4fa3c2dc70f27245 not found: ID does not exist" Feb 26 16:08:08 crc kubenswrapper[4907]: I0226 16:08:08.611635 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3bfed0ca-af76-4ba2-8be4-84716902175b-kube-api-access-gwvf9" (OuterVolumeSpecName: "kube-api-access-gwvf9") pod "3bfed0ca-af76-4ba2-8be4-84716902175b" (UID: "3bfed0ca-af76-4ba2-8be4-84716902175b"). InnerVolumeSpecName "kube-api-access-gwvf9". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 26 16:08:08 crc kubenswrapper[4907]: I0226 16:08:08.638328 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3bfed0ca-af76-4ba2-8be4-84716902175b-config-data" (OuterVolumeSpecName: "config-data") pod "3bfed0ca-af76-4ba2-8be4-84716902175b" (UID: "3bfed0ca-af76-4ba2-8be4-84716902175b"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 26 16:08:08 crc kubenswrapper[4907]: I0226 16:08:08.640224 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3bfed0ca-af76-4ba2-8be4-84716902175b-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "3bfed0ca-af76-4ba2-8be4-84716902175b" (UID: "3bfed0ca-af76-4ba2-8be4-84716902175b"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 26 16:08:08 crc kubenswrapper[4907]: I0226 16:08:08.665837 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3bfed0ca-af76-4ba2-8be4-84716902175b-nova-metadata-tls-certs" (OuterVolumeSpecName: "nova-metadata-tls-certs") pod "3bfed0ca-af76-4ba2-8be4-84716902175b" (UID: "3bfed0ca-af76-4ba2-8be4-84716902175b"). InnerVolumeSpecName "nova-metadata-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 26 16:08:08 crc kubenswrapper[4907]: I0226 16:08:08.705844 4907 reconciler_common.go:293] "Volume detached for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/3bfed0ca-af76-4ba2-8be4-84716902175b-nova-metadata-tls-certs\") on node \"crc\" DevicePath \"\"" Feb 26 16:08:08 crc kubenswrapper[4907]: I0226 16:08:08.705873 4907 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-gwvf9\" (UniqueName: \"kubernetes.io/projected/3bfed0ca-af76-4ba2-8be4-84716902175b-kube-api-access-gwvf9\") on node \"crc\" DevicePath \"\"" Feb 26 16:08:08 crc kubenswrapper[4907]: I0226 16:08:08.705883 4907 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3bfed0ca-af76-4ba2-8be4-84716902175b-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 26 16:08:08 crc kubenswrapper[4907]: I0226 16:08:08.705893 4907 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3bfed0ca-af76-4ba2-8be4-84716902175b-config-data\") on node \"crc\" DevicePath \"\"" Feb 26 16:08:08 crc kubenswrapper[4907]: I0226 16:08:08.705902 4907 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/3bfed0ca-af76-4ba2-8be4-84716902175b-logs\") on node \"crc\" DevicePath \"\"" Feb 26 16:08:08 crc kubenswrapper[4907]: I0226 16:08:08.869393 4907 kubelet.go:2437] "SyncLoop DELETE" source="api" 
pods=["openstack/nova-metadata-0"] Feb 26 16:08:08 crc kubenswrapper[4907]: I0226 16:08:08.877604 4907 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-metadata-0"] Feb 26 16:08:08 crc kubenswrapper[4907]: I0226 16:08:08.907832 4907 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-metadata-0"] Feb 26 16:08:08 crc kubenswrapper[4907]: E0226 16:08:08.908235 4907 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3bfed0ca-af76-4ba2-8be4-84716902175b" containerName="nova-metadata-log" Feb 26 16:08:08 crc kubenswrapper[4907]: I0226 16:08:08.908252 4907 state_mem.go:107] "Deleted CPUSet assignment" podUID="3bfed0ca-af76-4ba2-8be4-84716902175b" containerName="nova-metadata-log" Feb 26 16:08:08 crc kubenswrapper[4907]: E0226 16:08:08.908261 4907 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3bfed0ca-af76-4ba2-8be4-84716902175b" containerName="nova-metadata-metadata" Feb 26 16:08:08 crc kubenswrapper[4907]: I0226 16:08:08.908269 4907 state_mem.go:107] "Deleted CPUSet assignment" podUID="3bfed0ca-af76-4ba2-8be4-84716902175b" containerName="nova-metadata-metadata" Feb 26 16:08:08 crc kubenswrapper[4907]: I0226 16:08:08.908472 4907 memory_manager.go:354] "RemoveStaleState removing state" podUID="3bfed0ca-af76-4ba2-8be4-84716902175b" containerName="nova-metadata-log" Feb 26 16:08:08 crc kubenswrapper[4907]: I0226 16:08:08.908492 4907 memory_manager.go:354] "RemoveStaleState removing state" podUID="3bfed0ca-af76-4ba2-8be4-84716902175b" containerName="nova-metadata-metadata" Feb 26 16:08:08 crc kubenswrapper[4907]: I0226 16:08:08.909383 4907 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-metadata-0" Feb 26 16:08:08 crc kubenswrapper[4907]: I0226 16:08:08.914011 4907 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-metadata-config-data" Feb 26 16:08:08 crc kubenswrapper[4907]: I0226 16:08:08.914154 4907 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-metadata-internal-svc" Feb 26 16:08:08 crc kubenswrapper[4907]: I0226 16:08:08.919116 4907 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Feb 26 16:08:09 crc kubenswrapper[4907]: I0226 16:08:09.012548 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/30cde741-a6c4-485b-9ff4-ee2da1ffb88c-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"30cde741-a6c4-485b-9ff4-ee2da1ffb88c\") " pod="openstack/nova-metadata-0" Feb 26 16:08:09 crc kubenswrapper[4907]: I0226 16:08:09.012626 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/30cde741-a6c4-485b-9ff4-ee2da1ffb88c-logs\") pod \"nova-metadata-0\" (UID: \"30cde741-a6c4-485b-9ff4-ee2da1ffb88c\") " pod="openstack/nova-metadata-0" Feb 26 16:08:09 crc kubenswrapper[4907]: I0226 16:08:09.012662 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/30cde741-a6c4-485b-9ff4-ee2da1ffb88c-config-data\") pod \"nova-metadata-0\" (UID: \"30cde741-a6c4-485b-9ff4-ee2da1ffb88c\") " pod="openstack/nova-metadata-0" Feb 26 16:08:09 crc kubenswrapper[4907]: I0226 16:08:09.012688 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/30cde741-a6c4-485b-9ff4-ee2da1ffb88c-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: 
\"30cde741-a6c4-485b-9ff4-ee2da1ffb88c\") " pod="openstack/nova-metadata-0" Feb 26 16:08:09 crc kubenswrapper[4907]: I0226 16:08:09.012712 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mg2q8\" (UniqueName: \"kubernetes.io/projected/30cde741-a6c4-485b-9ff4-ee2da1ffb88c-kube-api-access-mg2q8\") pod \"nova-metadata-0\" (UID: \"30cde741-a6c4-485b-9ff4-ee2da1ffb88c\") " pod="openstack/nova-metadata-0" Feb 26 16:08:09 crc kubenswrapper[4907]: I0226 16:08:09.114896 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/30cde741-a6c4-485b-9ff4-ee2da1ffb88c-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"30cde741-a6c4-485b-9ff4-ee2da1ffb88c\") " pod="openstack/nova-metadata-0" Feb 26 16:08:09 crc kubenswrapper[4907]: I0226 16:08:09.114989 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/30cde741-a6c4-485b-9ff4-ee2da1ffb88c-logs\") pod \"nova-metadata-0\" (UID: \"30cde741-a6c4-485b-9ff4-ee2da1ffb88c\") " pod="openstack/nova-metadata-0" Feb 26 16:08:09 crc kubenswrapper[4907]: I0226 16:08:09.115040 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/30cde741-a6c4-485b-9ff4-ee2da1ffb88c-config-data\") pod \"nova-metadata-0\" (UID: \"30cde741-a6c4-485b-9ff4-ee2da1ffb88c\") " pod="openstack/nova-metadata-0" Feb 26 16:08:09 crc kubenswrapper[4907]: I0226 16:08:09.115079 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/30cde741-a6c4-485b-9ff4-ee2da1ffb88c-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"30cde741-a6c4-485b-9ff4-ee2da1ffb88c\") " pod="openstack/nova-metadata-0" Feb 26 16:08:09 crc kubenswrapper[4907]: I0226 16:08:09.115113 4907 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mg2q8\" (UniqueName: \"kubernetes.io/projected/30cde741-a6c4-485b-9ff4-ee2da1ffb88c-kube-api-access-mg2q8\") pod \"nova-metadata-0\" (UID: \"30cde741-a6c4-485b-9ff4-ee2da1ffb88c\") " pod="openstack/nova-metadata-0" Feb 26 16:08:09 crc kubenswrapper[4907]: I0226 16:08:09.116229 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/30cde741-a6c4-485b-9ff4-ee2da1ffb88c-logs\") pod \"nova-metadata-0\" (UID: \"30cde741-a6c4-485b-9ff4-ee2da1ffb88c\") " pod="openstack/nova-metadata-0" Feb 26 16:08:09 crc kubenswrapper[4907]: I0226 16:08:09.120254 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/30cde741-a6c4-485b-9ff4-ee2da1ffb88c-config-data\") pod \"nova-metadata-0\" (UID: \"30cde741-a6c4-485b-9ff4-ee2da1ffb88c\") " pod="openstack/nova-metadata-0" Feb 26 16:08:09 crc kubenswrapper[4907]: I0226 16:08:09.122216 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/30cde741-a6c4-485b-9ff4-ee2da1ffb88c-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"30cde741-a6c4-485b-9ff4-ee2da1ffb88c\") " pod="openstack/nova-metadata-0" Feb 26 16:08:09 crc kubenswrapper[4907]: I0226 16:08:09.123060 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/30cde741-a6c4-485b-9ff4-ee2da1ffb88c-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"30cde741-a6c4-485b-9ff4-ee2da1ffb88c\") " pod="openstack/nova-metadata-0" Feb 26 16:08:09 crc kubenswrapper[4907]: I0226 16:08:09.150819 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mg2q8\" (UniqueName: \"kubernetes.io/projected/30cde741-a6c4-485b-9ff4-ee2da1ffb88c-kube-api-access-mg2q8\") pod 
\"nova-metadata-0\" (UID: \"30cde741-a6c4-485b-9ff4-ee2da1ffb88c\") " pod="openstack/nova-metadata-0" Feb 26 16:08:09 crc kubenswrapper[4907]: I0226 16:08:09.230268 4907 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Feb 26 16:08:09 crc kubenswrapper[4907]: I0226 16:08:09.556030 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"c994f627-1f02-468c-9651-19ac6a8728b4","Type":"ContainerStarted","Data":"7bf019f1b48994b7cf1dcb612ad72bce98867374ea529a03581ce3eb3a923647"} Feb 26 16:08:09 crc kubenswrapper[4907]: I0226 16:08:09.576670 4907 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-scheduler-0" podStartSLOduration=2.576649774 podStartE2EDuration="2.576649774s" podCreationTimestamp="2026-02-26 16:08:07 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-26 16:08:09.568043793 +0000 UTC m=+1552.086605652" watchObservedRunningTime="2026-02-26 16:08:09.576649774 +0000 UTC m=+1552.095211623" Feb 26 16:08:09 crc kubenswrapper[4907]: I0226 16:08:09.692033 4907 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Feb 26 16:08:09 crc kubenswrapper[4907]: W0226 16:08:09.695536 4907 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod30cde741_a6c4_485b_9ff4_ee2da1ffb88c.slice/crio-596505a8581d3166126adb73ddfcfcad6bb44b083b0d191e087ec5b3a773185e WatchSource:0}: Error finding container 596505a8581d3166126adb73ddfcfcad6bb44b083b0d191e087ec5b3a773185e: Status 404 returned error can't find the container with id 596505a8581d3166126adb73ddfcfcad6bb44b083b0d191e087ec5b3a773185e Feb 26 16:08:10 crc kubenswrapper[4907]: I0226 16:08:10.140480 4907 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" 
podUID="3bfed0ca-af76-4ba2-8be4-84716902175b" path="/var/lib/kubelet/pods/3bfed0ca-af76-4ba2-8be4-84716902175b/volumes" Feb 26 16:08:10 crc kubenswrapper[4907]: I0226 16:08:10.582362 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"30cde741-a6c4-485b-9ff4-ee2da1ffb88c","Type":"ContainerStarted","Data":"ead14d68e187e37085bdb8046fedb763cbe3b650282c01df55358895c2ce8515"} Feb 26 16:08:10 crc kubenswrapper[4907]: I0226 16:08:10.582461 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"30cde741-a6c4-485b-9ff4-ee2da1ffb88c","Type":"ContainerStarted","Data":"ba62bc03b3f550f88920185b64855eddffe8bd30bdcb6c94172978c05410e3bb"} Feb 26 16:08:10 crc kubenswrapper[4907]: I0226 16:08:10.582484 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"30cde741-a6c4-485b-9ff4-ee2da1ffb88c","Type":"ContainerStarted","Data":"596505a8581d3166126adb73ddfcfcad6bb44b083b0d191e087ec5b3a773185e"} Feb 26 16:08:10 crc kubenswrapper[4907]: I0226 16:08:10.613089 4907 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-metadata-0" podStartSLOduration=2.61306956 podStartE2EDuration="2.61306956s" podCreationTimestamp="2026-02-26 16:08:08 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-26 16:08:10.609942054 +0000 UTC m=+1553.128503903" watchObservedRunningTime="2026-02-26 16:08:10.61306956 +0000 UTC m=+1553.131631409" Feb 26 16:08:12 crc kubenswrapper[4907]: I0226 16:08:12.897209 4907 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-scheduler-0" Feb 26 16:08:14 crc kubenswrapper[4907]: I0226 16:08:14.231253 4907 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-metadata-0" Feb 26 16:08:14 crc kubenswrapper[4907]: I0226 16:08:14.231304 4907 kubelet.go:2542] 
"SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-metadata-0" Feb 26 16:08:15 crc kubenswrapper[4907]: I0226 16:08:15.144687 4907 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-sqrbr"] Feb 26 16:08:15 crc kubenswrapper[4907]: I0226 16:08:15.147416 4907 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-sqrbr" Feb 26 16:08:15 crc kubenswrapper[4907]: I0226 16:08:15.161816 4907 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-sqrbr"] Feb 26 16:08:15 crc kubenswrapper[4907]: I0226 16:08:15.242073 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/6b6d9f98-d446-4b48-bd17-1c6c3ab80460-catalog-content\") pod \"redhat-operators-sqrbr\" (UID: \"6b6d9f98-d446-4b48-bd17-1c6c3ab80460\") " pod="openshift-marketplace/redhat-operators-sqrbr" Feb 26 16:08:15 crc kubenswrapper[4907]: I0226 16:08:15.242177 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nj4gx\" (UniqueName: \"kubernetes.io/projected/6b6d9f98-d446-4b48-bd17-1c6c3ab80460-kube-api-access-nj4gx\") pod \"redhat-operators-sqrbr\" (UID: \"6b6d9f98-d446-4b48-bd17-1c6c3ab80460\") " pod="openshift-marketplace/redhat-operators-sqrbr" Feb 26 16:08:15 crc kubenswrapper[4907]: I0226 16:08:15.242340 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/6b6d9f98-d446-4b48-bd17-1c6c3ab80460-utilities\") pod \"redhat-operators-sqrbr\" (UID: \"6b6d9f98-d446-4b48-bd17-1c6c3ab80460\") " pod="openshift-marketplace/redhat-operators-sqrbr" Feb 26 16:08:15 crc kubenswrapper[4907]: I0226 16:08:15.344818 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" 
(UniqueName: \"kubernetes.io/empty-dir/6b6d9f98-d446-4b48-bd17-1c6c3ab80460-catalog-content\") pod \"redhat-operators-sqrbr\" (UID: \"6b6d9f98-d446-4b48-bd17-1c6c3ab80460\") " pod="openshift-marketplace/redhat-operators-sqrbr" Feb 26 16:08:15 crc kubenswrapper[4907]: I0226 16:08:15.344931 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nj4gx\" (UniqueName: \"kubernetes.io/projected/6b6d9f98-d446-4b48-bd17-1c6c3ab80460-kube-api-access-nj4gx\") pod \"redhat-operators-sqrbr\" (UID: \"6b6d9f98-d446-4b48-bd17-1c6c3ab80460\") " pod="openshift-marketplace/redhat-operators-sqrbr" Feb 26 16:08:15 crc kubenswrapper[4907]: I0226 16:08:15.345136 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/6b6d9f98-d446-4b48-bd17-1c6c3ab80460-utilities\") pod \"redhat-operators-sqrbr\" (UID: \"6b6d9f98-d446-4b48-bd17-1c6c3ab80460\") " pod="openshift-marketplace/redhat-operators-sqrbr" Feb 26 16:08:15 crc kubenswrapper[4907]: I0226 16:08:15.345351 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/6b6d9f98-d446-4b48-bd17-1c6c3ab80460-catalog-content\") pod \"redhat-operators-sqrbr\" (UID: \"6b6d9f98-d446-4b48-bd17-1c6c3ab80460\") " pod="openshift-marketplace/redhat-operators-sqrbr" Feb 26 16:08:15 crc kubenswrapper[4907]: I0226 16:08:15.345793 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/6b6d9f98-d446-4b48-bd17-1c6c3ab80460-utilities\") pod \"redhat-operators-sqrbr\" (UID: \"6b6d9f98-d446-4b48-bd17-1c6c3ab80460\") " pod="openshift-marketplace/redhat-operators-sqrbr" Feb 26 16:08:15 crc kubenswrapper[4907]: I0226 16:08:15.370077 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nj4gx\" (UniqueName: 
\"kubernetes.io/projected/6b6d9f98-d446-4b48-bd17-1c6c3ab80460-kube-api-access-nj4gx\") pod \"redhat-operators-sqrbr\" (UID: \"6b6d9f98-d446-4b48-bd17-1c6c3ab80460\") " pod="openshift-marketplace/redhat-operators-sqrbr" Feb 26 16:08:15 crc kubenswrapper[4907]: I0226 16:08:15.465425 4907 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-sqrbr" Feb 26 16:08:15 crc kubenswrapper[4907]: I0226 16:08:15.659791 4907 generic.go:334] "Generic (PLEG): container finished" podID="911d5df8-d8e2-4552-9c75-33c5ab72646b" containerID="3f95094dd73a53aa831d3c7f002970271a280a470ee37d101788fd1290991f04" exitCode=137 Feb 26 16:08:15 crc kubenswrapper[4907]: I0226 16:08:15.659851 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-6fccfb8496-4tqhr" event={"ID":"911d5df8-d8e2-4552-9c75-33c5ab72646b","Type":"ContainerDied","Data":"3f95094dd73a53aa831d3c7f002970271a280a470ee37d101788fd1290991f04"} Feb 26 16:08:15 crc kubenswrapper[4907]: I0226 16:08:15.729471 4907 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/horizon-6fccfb8496-4tqhr" Feb 26 16:08:15 crc kubenswrapper[4907]: I0226 16:08:15.896000 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/911d5df8-d8e2-4552-9c75-33c5ab72646b-horizon-secret-key\") pod \"911d5df8-d8e2-4552-9c75-33c5ab72646b\" (UID: \"911d5df8-d8e2-4552-9c75-33c5ab72646b\") " Feb 26 16:08:15 crc kubenswrapper[4907]: I0226 16:08:15.896141 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/911d5df8-d8e2-4552-9c75-33c5ab72646b-scripts\") pod \"911d5df8-d8e2-4552-9c75-33c5ab72646b\" (UID: \"911d5df8-d8e2-4552-9c75-33c5ab72646b\") " Feb 26 16:08:15 crc kubenswrapper[4907]: I0226 16:08:15.896170 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"horizon-tls-certs\" (UniqueName: \"kubernetes.io/secret/911d5df8-d8e2-4552-9c75-33c5ab72646b-horizon-tls-certs\") pod \"911d5df8-d8e2-4552-9c75-33c5ab72646b\" (UID: \"911d5df8-d8e2-4552-9c75-33c5ab72646b\") " Feb 26 16:08:15 crc kubenswrapper[4907]: I0226 16:08:15.896257 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/911d5df8-d8e2-4552-9c75-33c5ab72646b-config-data\") pod \"911d5df8-d8e2-4552-9c75-33c5ab72646b\" (UID: \"911d5df8-d8e2-4552-9c75-33c5ab72646b\") " Feb 26 16:08:15 crc kubenswrapper[4907]: I0226 16:08:15.896282 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fgd7r\" (UniqueName: \"kubernetes.io/projected/911d5df8-d8e2-4552-9c75-33c5ab72646b-kube-api-access-fgd7r\") pod \"911d5df8-d8e2-4552-9c75-33c5ab72646b\" (UID: \"911d5df8-d8e2-4552-9c75-33c5ab72646b\") " Feb 26 16:08:15 crc kubenswrapper[4907]: I0226 16:08:15.896329 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" 
(UniqueName: \"kubernetes.io/empty-dir/911d5df8-d8e2-4552-9c75-33c5ab72646b-logs\") pod \"911d5df8-d8e2-4552-9c75-33c5ab72646b\" (UID: \"911d5df8-d8e2-4552-9c75-33c5ab72646b\") " Feb 26 16:08:15 crc kubenswrapper[4907]: I0226 16:08:15.896369 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/911d5df8-d8e2-4552-9c75-33c5ab72646b-combined-ca-bundle\") pod \"911d5df8-d8e2-4552-9c75-33c5ab72646b\" (UID: \"911d5df8-d8e2-4552-9c75-33c5ab72646b\") " Feb 26 16:08:15 crc kubenswrapper[4907]: I0226 16:08:15.899299 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/911d5df8-d8e2-4552-9c75-33c5ab72646b-logs" (OuterVolumeSpecName: "logs") pod "911d5df8-d8e2-4552-9c75-33c5ab72646b" (UID: "911d5df8-d8e2-4552-9c75-33c5ab72646b"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 26 16:08:15 crc kubenswrapper[4907]: I0226 16:08:15.902915 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/911d5df8-d8e2-4552-9c75-33c5ab72646b-horizon-secret-key" (OuterVolumeSpecName: "horizon-secret-key") pod "911d5df8-d8e2-4552-9c75-33c5ab72646b" (UID: "911d5df8-d8e2-4552-9c75-33c5ab72646b"). InnerVolumeSpecName "horizon-secret-key". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 26 16:08:15 crc kubenswrapper[4907]: I0226 16:08:15.905191 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/911d5df8-d8e2-4552-9c75-33c5ab72646b-kube-api-access-fgd7r" (OuterVolumeSpecName: "kube-api-access-fgd7r") pod "911d5df8-d8e2-4552-9c75-33c5ab72646b" (UID: "911d5df8-d8e2-4552-9c75-33c5ab72646b"). InnerVolumeSpecName "kube-api-access-fgd7r". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 26 16:08:15 crc kubenswrapper[4907]: I0226 16:08:15.926101 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/911d5df8-d8e2-4552-9c75-33c5ab72646b-scripts" (OuterVolumeSpecName: "scripts") pod "911d5df8-d8e2-4552-9c75-33c5ab72646b" (UID: "911d5df8-d8e2-4552-9c75-33c5ab72646b"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 26 16:08:15 crc kubenswrapper[4907]: I0226 16:08:15.927158 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/911d5df8-d8e2-4552-9c75-33c5ab72646b-config-data" (OuterVolumeSpecName: "config-data") pod "911d5df8-d8e2-4552-9c75-33c5ab72646b" (UID: "911d5df8-d8e2-4552-9c75-33c5ab72646b"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 26 16:08:15 crc kubenswrapper[4907]: I0226 16:08:15.929080 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/911d5df8-d8e2-4552-9c75-33c5ab72646b-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "911d5df8-d8e2-4552-9c75-33c5ab72646b" (UID: "911d5df8-d8e2-4552-9c75-33c5ab72646b"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 26 16:08:15 crc kubenswrapper[4907]: I0226 16:08:15.950511 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/911d5df8-d8e2-4552-9c75-33c5ab72646b-horizon-tls-certs" (OuterVolumeSpecName: "horizon-tls-certs") pod "911d5df8-d8e2-4552-9c75-33c5ab72646b" (UID: "911d5df8-d8e2-4552-9c75-33c5ab72646b"). InnerVolumeSpecName "horizon-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 26 16:08:15 crc kubenswrapper[4907]: I0226 16:08:15.998843 4907 reconciler_common.go:293] "Volume detached for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/911d5df8-d8e2-4552-9c75-33c5ab72646b-horizon-secret-key\") on node \"crc\" DevicePath \"\"" Feb 26 16:08:15 crc kubenswrapper[4907]: I0226 16:08:15.998878 4907 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/911d5df8-d8e2-4552-9c75-33c5ab72646b-scripts\") on node \"crc\" DevicePath \"\"" Feb 26 16:08:15 crc kubenswrapper[4907]: I0226 16:08:15.998890 4907 reconciler_common.go:293] "Volume detached for volume \"horizon-tls-certs\" (UniqueName: \"kubernetes.io/secret/911d5df8-d8e2-4552-9c75-33c5ab72646b-horizon-tls-certs\") on node \"crc\" DevicePath \"\"" Feb 26 16:08:15 crc kubenswrapper[4907]: I0226 16:08:15.998901 4907 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/911d5df8-d8e2-4552-9c75-33c5ab72646b-config-data\") on node \"crc\" DevicePath \"\"" Feb 26 16:08:15 crc kubenswrapper[4907]: I0226 16:08:15.998913 4907 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fgd7r\" (UniqueName: \"kubernetes.io/projected/911d5df8-d8e2-4552-9c75-33c5ab72646b-kube-api-access-fgd7r\") on node \"crc\" DevicePath \"\"" Feb 26 16:08:15 crc kubenswrapper[4907]: I0226 16:08:15.998926 4907 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/911d5df8-d8e2-4552-9c75-33c5ab72646b-logs\") on node \"crc\" DevicePath \"\"" Feb 26 16:08:15 crc kubenswrapper[4907]: I0226 16:08:15.998936 4907 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/911d5df8-d8e2-4552-9c75-33c5ab72646b-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 26 16:08:16 crc kubenswrapper[4907]: I0226 16:08:16.047115 4907 
kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-sqrbr"] Feb 26 16:08:16 crc kubenswrapper[4907]: W0226 16:08:16.048669 4907 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod6b6d9f98_d446_4b48_bd17_1c6c3ab80460.slice/crio-6a2a66f2152967ccdb479707b45fcfad2a13f826d93a3c2d94bb166133ae85f8 WatchSource:0}: Error finding container 6a2a66f2152967ccdb479707b45fcfad2a13f826d93a3c2d94bb166133ae85f8: Status 404 returned error can't find the container with id 6a2a66f2152967ccdb479707b45fcfad2a13f826d93a3c2d94bb166133ae85f8 Feb 26 16:08:16 crc kubenswrapper[4907]: I0226 16:08:16.670572 4907 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/horizon-6fccfb8496-4tqhr" Feb 26 16:08:16 crc kubenswrapper[4907]: I0226 16:08:16.670578 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-6fccfb8496-4tqhr" event={"ID":"911d5df8-d8e2-4552-9c75-33c5ab72646b","Type":"ContainerDied","Data":"a10277d73a2ffb2051463a8d07d910b9357b81428ff49db09862fcbccced53ff"} Feb 26 16:08:16 crc kubenswrapper[4907]: I0226 16:08:16.671154 4907 scope.go:117] "RemoveContainer" containerID="9e39d9243d4cdfe57e174b7503dc46aaf0fae6d591c7d87a7c2c19a92a84a500" Feb 26 16:08:16 crc kubenswrapper[4907]: I0226 16:08:16.672912 4907 generic.go:334] "Generic (PLEG): container finished" podID="6b6d9f98-d446-4b48-bd17-1c6c3ab80460" containerID="9a3ff335f115aa89bac206470ff5fd85c2e144cfbfd8da0f9cb56babc83c6313" exitCode=0 Feb 26 16:08:16 crc kubenswrapper[4907]: I0226 16:08:16.672953 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-sqrbr" event={"ID":"6b6d9f98-d446-4b48-bd17-1c6c3ab80460","Type":"ContainerDied","Data":"9a3ff335f115aa89bac206470ff5fd85c2e144cfbfd8da0f9cb56babc83c6313"} Feb 26 16:08:16 crc kubenswrapper[4907]: I0226 16:08:16.672983 4907 kubelet.go:2453] "SyncLoop (PLEG): event 
for pod" pod="openshift-marketplace/redhat-operators-sqrbr" event={"ID":"6b6d9f98-d446-4b48-bd17-1c6c3ab80460","Type":"ContainerStarted","Data":"6a2a66f2152967ccdb479707b45fcfad2a13f826d93a3c2d94bb166133ae85f8"} Feb 26 16:08:16 crc kubenswrapper[4907]: I0226 16:08:16.704520 4907 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/horizon-6fccfb8496-4tqhr"] Feb 26 16:08:16 crc kubenswrapper[4907]: I0226 16:08:16.713671 4907 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/horizon-6fccfb8496-4tqhr"] Feb 26 16:08:16 crc kubenswrapper[4907]: I0226 16:08:16.886542 4907 scope.go:117] "RemoveContainer" containerID="3f95094dd73a53aa831d3c7f002970271a280a470ee37d101788fd1290991f04" Feb 26 16:08:16 crc kubenswrapper[4907]: I0226 16:08:16.976102 4907 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0" Feb 26 16:08:16 crc kubenswrapper[4907]: I0226 16:08:16.976147 4907 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0" Feb 26 16:08:17 crc kubenswrapper[4907]: I0226 16:08:17.685894 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-sqrbr" event={"ID":"6b6d9f98-d446-4b48-bd17-1c6c3ab80460","Type":"ContainerStarted","Data":"e60c064b910639f18922aacb712f4365fc44c1c697d5c006112793fc9afd7562"} Feb 26 16:08:17 crc kubenswrapper[4907]: I0226 16:08:17.897260 4907 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-scheduler-0" Feb 26 16:08:17 crc kubenswrapper[4907]: I0226 16:08:17.925347 4907 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-scheduler-0" Feb 26 16:08:17 crc kubenswrapper[4907]: I0226 16:08:17.989810 4907 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="674c61cb-49ef-4710-b83f-0374acf42f6a" containerName="nova-api-api" probeResult="failure" output="Get \"https://10.217.0.213:8774/\": 
net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Feb 26 16:08:17 crc kubenswrapper[4907]: I0226 16:08:17.990103 4907 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="674c61cb-49ef-4710-b83f-0374acf42f6a" containerName="nova-api-log" probeResult="failure" output="Get \"https://10.217.0.213:8774/\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Feb 26 16:08:18 crc kubenswrapper[4907]: I0226 16:08:18.139513 4907 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="911d5df8-d8e2-4552-9c75-33c5ab72646b" path="/var/lib/kubelet/pods/911d5df8-d8e2-4552-9c75-33c5ab72646b/volumes" Feb 26 16:08:18 crc kubenswrapper[4907]: I0226 16:08:18.530711 4907 patch_prober.go:28] interesting pod/machine-config-daemon-v5ng6 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 26 16:08:18 crc kubenswrapper[4907]: I0226 16:08:18.531114 4907 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-v5ng6" podUID="917eebf3-db36-47b8-af0a-b80d042fddab" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 26 16:08:18 crc kubenswrapper[4907]: I0226 16:08:18.738153 4907 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-scheduler-0" Feb 26 16:08:19 crc kubenswrapper[4907]: I0226 16:08:19.230639 4907 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-metadata-0" Feb 26 16:08:19 crc kubenswrapper[4907]: I0226 16:08:19.231053 4907 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-metadata-0" Feb 26 16:08:20 crc kubenswrapper[4907]: I0226 
16:08:20.243767 4907 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-metadata-0" podUID="30cde741-a6c4-485b-9ff4-ee2da1ffb88c" containerName="nova-metadata-metadata" probeResult="failure" output="Get \"https://10.217.0.215:8775/\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Feb 26 16:08:20 crc kubenswrapper[4907]: I0226 16:08:20.243791 4907 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-metadata-0" podUID="30cde741-a6c4-485b-9ff4-ee2da1ffb88c" containerName="nova-metadata-log" probeResult="failure" output="Get \"https://10.217.0.215:8775/\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Feb 26 16:08:24 crc kubenswrapper[4907]: I0226 16:08:24.920230 4907 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ceilometer-0" Feb 26 16:08:26 crc kubenswrapper[4907]: I0226 16:08:26.781536 4907 generic.go:334] "Generic (PLEG): container finished" podID="6b6d9f98-d446-4b48-bd17-1c6c3ab80460" containerID="e60c064b910639f18922aacb712f4365fc44c1c697d5c006112793fc9afd7562" exitCode=0 Feb 26 16:08:26 crc kubenswrapper[4907]: I0226 16:08:26.781685 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-sqrbr" event={"ID":"6b6d9f98-d446-4b48-bd17-1c6c3ab80460","Type":"ContainerDied","Data":"e60c064b910639f18922aacb712f4365fc44c1c697d5c006112793fc9afd7562"} Feb 26 16:08:26 crc kubenswrapper[4907]: I0226 16:08:26.984654 4907 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-api-0" Feb 26 16:08:26 crc kubenswrapper[4907]: I0226 16:08:26.985273 4907 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-api-0" Feb 26 16:08:26 crc kubenswrapper[4907]: I0226 16:08:26.985979 4907 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-api-0" Feb 26 16:08:26 crc kubenswrapper[4907]: I0226 16:08:26.999938 4907 
kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-api-0" Feb 26 16:08:27 crc kubenswrapper[4907]: I0226 16:08:27.796791 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-sqrbr" event={"ID":"6b6d9f98-d446-4b48-bd17-1c6c3ab80460","Type":"ContainerStarted","Data":"664df51b4d2bd362c44747d65125cc915512f4154cc72e9608ea2ff884f601c8"} Feb 26 16:08:27 crc kubenswrapper[4907]: I0226 16:08:27.797195 4907 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-api-0" Feb 26 16:08:27 crc kubenswrapper[4907]: I0226 16:08:27.825000 4907 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-sqrbr" podStartSLOduration=2.318348983 podStartE2EDuration="12.824981633s" podCreationTimestamp="2026-02-26 16:08:15 +0000 UTC" firstStartedPulling="2026-02-26 16:08:16.674486924 +0000 UTC m=+1559.193048813" lastFinishedPulling="2026-02-26 16:08:27.181119614 +0000 UTC m=+1569.699681463" observedRunningTime="2026-02-26 16:08:27.816801742 +0000 UTC m=+1570.335363591" watchObservedRunningTime="2026-02-26 16:08:27.824981633 +0000 UTC m=+1570.343543482" Feb 26 16:08:27 crc kubenswrapper[4907]: I0226 16:08:27.826317 4907 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-api-0" Feb 26 16:08:29 crc kubenswrapper[4907]: I0226 16:08:29.237345 4907 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-metadata-0" Feb 26 16:08:29 crc kubenswrapper[4907]: I0226 16:08:29.238647 4907 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-metadata-0" Feb 26 16:08:29 crc kubenswrapper[4907]: I0226 16:08:29.243621 4907 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-metadata-0" Feb 26 16:08:29 crc kubenswrapper[4907]: I0226 16:08:29.248214 4907 kubelet.go:2542] "SyncLoop (probe)" 
probe="readiness" status="ready" pod="openstack/nova-metadata-0" Feb 26 16:08:35 crc kubenswrapper[4907]: I0226 16:08:35.467493 4907 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-sqrbr" Feb 26 16:08:35 crc kubenswrapper[4907]: I0226 16:08:35.468550 4907 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-sqrbr" Feb 26 16:08:37 crc kubenswrapper[4907]: I0226 16:08:37.384956 4907 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-sqrbr" podUID="6b6d9f98-d446-4b48-bd17-1c6c3ab80460" containerName="registry-server" probeResult="failure" output=< Feb 26 16:08:37 crc kubenswrapper[4907]: timeout: failed to connect service ":50051" within 1s Feb 26 16:08:37 crc kubenswrapper[4907]: > Feb 26 16:08:37 crc kubenswrapper[4907]: I0226 16:08:37.788207 4907 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/rabbitmq-server-0"] Feb 26 16:08:38 crc kubenswrapper[4907]: I0226 16:08:38.915496 4907 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Feb 26 16:08:42 crc kubenswrapper[4907]: I0226 16:08:42.974157 4907 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/rabbitmq-server-0" podUID="96ba881c-449c-4300-b67f-8a1e952af508" containerName="rabbitmq" containerID="cri-o://1acc9ffabff45e7a23fbd242599d1693e106acd3557c6aa619db091bb41fc243" gracePeriod=604795 Feb 26 16:08:43 crc kubenswrapper[4907]: I0226 16:08:43.754700 4907 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/rabbitmq-cell1-server-0" podUID="cca4ff23-cabb-466c-80a0-dbcc1f005123" containerName="rabbitmq" containerID="cri-o://e0d0ba16cf9a0991250fbe1f4375f1fae0b7663840e5aacdc81be3fdd3afd8e3" gracePeriod=604796 Feb 26 16:08:46 crc kubenswrapper[4907]: I0226 16:08:46.514136 4907 prober.go:107] "Probe failed" probeType="Startup" 
pod="openshift-marketplace/redhat-operators-sqrbr" podUID="6b6d9f98-d446-4b48-bd17-1c6c3ab80460" containerName="registry-server" probeResult="failure" output=< Feb 26 16:08:46 crc kubenswrapper[4907]: timeout: failed to connect service ":50051" within 1s Feb 26 16:08:46 crc kubenswrapper[4907]: > Feb 26 16:08:48 crc kubenswrapper[4907]: I0226 16:08:48.530130 4907 patch_prober.go:28] interesting pod/machine-config-daemon-v5ng6 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 26 16:08:48 crc kubenswrapper[4907]: I0226 16:08:48.530456 4907 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-v5ng6" podUID="917eebf3-db36-47b8-af0a-b80d042fddab" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 26 16:08:48 crc kubenswrapper[4907]: I0226 16:08:48.530505 4907 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-v5ng6" Feb 26 16:08:48 crc kubenswrapper[4907]: I0226 16:08:48.531087 4907 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"b46bef3acd92cfa3cb8f5894a729a1bb1795fbc69b7b7c5835186a0b609a6e46"} pod="openshift-machine-config-operator/machine-config-daemon-v5ng6" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Feb 26 16:08:48 crc kubenswrapper[4907]: I0226 16:08:48.531163 4907 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-v5ng6" podUID="917eebf3-db36-47b8-af0a-b80d042fddab" containerName="machine-config-daemon" 
containerID="cri-o://b46bef3acd92cfa3cb8f5894a729a1bb1795fbc69b7b7c5835186a0b609a6e46" gracePeriod=600 Feb 26 16:08:48 crc kubenswrapper[4907]: E0226 16:08:48.657459 4907 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-v5ng6_openshift-machine-config-operator(917eebf3-db36-47b8-af0a-b80d042fddab)\"" pod="openshift-machine-config-operator/machine-config-daemon-v5ng6" podUID="917eebf3-db36-47b8-af0a-b80d042fddab" Feb 26 16:08:49 crc kubenswrapper[4907]: I0226 16:08:49.459843 4907 generic.go:334] "Generic (PLEG): container finished" podID="917eebf3-db36-47b8-af0a-b80d042fddab" containerID="b46bef3acd92cfa3cb8f5894a729a1bb1795fbc69b7b7c5835186a0b609a6e46" exitCode=0 Feb 26 16:08:49 crc kubenswrapper[4907]: I0226 16:08:49.460052 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-v5ng6" event={"ID":"917eebf3-db36-47b8-af0a-b80d042fddab","Type":"ContainerDied","Data":"b46bef3acd92cfa3cb8f5894a729a1bb1795fbc69b7b7c5835186a0b609a6e46"} Feb 26 16:08:49 crc kubenswrapper[4907]: I0226 16:08:49.460309 4907 scope.go:117] "RemoveContainer" containerID="39faa61e9e899f01de0dcddf00d83aac761ca87f8fd53bc6d256f2980199847a" Feb 26 16:08:49 crc kubenswrapper[4907]: I0226 16:08:49.460963 4907 scope.go:117] "RemoveContainer" containerID="b46bef3acd92cfa3cb8f5894a729a1bb1795fbc69b7b7c5835186a0b609a6e46" Feb 26 16:08:49 crc kubenswrapper[4907]: E0226 16:08:49.461321 4907 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-v5ng6_openshift-machine-config-operator(917eebf3-db36-47b8-af0a-b80d042fddab)\"" pod="openshift-machine-config-operator/machine-config-daemon-v5ng6" 
podUID="917eebf3-db36-47b8-af0a-b80d042fddab" Feb 26 16:08:49 crc kubenswrapper[4907]: I0226 16:08:49.465491 4907 generic.go:334] "Generic (PLEG): container finished" podID="96ba881c-449c-4300-b67f-8a1e952af508" containerID="1acc9ffabff45e7a23fbd242599d1693e106acd3557c6aa619db091bb41fc243" exitCode=0 Feb 26 16:08:49 crc kubenswrapper[4907]: I0226 16:08:49.465529 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"96ba881c-449c-4300-b67f-8a1e952af508","Type":"ContainerDied","Data":"1acc9ffabff45e7a23fbd242599d1693e106acd3557c6aa619db091bb41fc243"} Feb 26 16:08:49 crc kubenswrapper[4907]: I0226 16:08:49.632519 4907 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-server-0" Feb 26 16:08:49 crc kubenswrapper[4907]: I0226 16:08:49.757561 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/96ba881c-449c-4300-b67f-8a1e952af508-plugins-conf\") pod \"96ba881c-449c-4300-b67f-8a1e952af508\" (UID: \"96ba881c-449c-4300-b67f-8a1e952af508\") " Feb 26 16:08:49 crc kubenswrapper[4907]: I0226 16:08:49.757616 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/96ba881c-449c-4300-b67f-8a1e952af508-server-conf\") pod \"96ba881c-449c-4300-b67f-8a1e952af508\" (UID: \"96ba881c-449c-4300-b67f-8a1e952af508\") " Feb 26 16:08:49 crc kubenswrapper[4907]: I0226 16:08:49.757668 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"persistence\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") pod \"96ba881c-449c-4300-b67f-8a1e952af508\" (UID: \"96ba881c-449c-4300-b67f-8a1e952af508\") " Feb 26 16:08:49 crc kubenswrapper[4907]: I0226 16:08:49.757711 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pod-info\" (UniqueName: 
\"kubernetes.io/downward-api/96ba881c-449c-4300-b67f-8a1e952af508-pod-info\") pod \"96ba881c-449c-4300-b67f-8a1e952af508\" (UID: \"96ba881c-449c-4300-b67f-8a1e952af508\") " Feb 26 16:08:49 crc kubenswrapper[4907]: I0226 16:08:49.757778 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/96ba881c-449c-4300-b67f-8a1e952af508-rabbitmq-tls\") pod \"96ba881c-449c-4300-b67f-8a1e952af508\" (UID: \"96ba881c-449c-4300-b67f-8a1e952af508\") " Feb 26 16:08:49 crc kubenswrapper[4907]: I0226 16:08:49.757794 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/96ba881c-449c-4300-b67f-8a1e952af508-config-data\") pod \"96ba881c-449c-4300-b67f-8a1e952af508\" (UID: \"96ba881c-449c-4300-b67f-8a1e952af508\") " Feb 26 16:08:49 crc kubenswrapper[4907]: I0226 16:08:49.757816 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/96ba881c-449c-4300-b67f-8a1e952af508-rabbitmq-erlang-cookie\") pod \"96ba881c-449c-4300-b67f-8a1e952af508\" (UID: \"96ba881c-449c-4300-b67f-8a1e952af508\") " Feb 26 16:08:49 crc kubenswrapper[4907]: I0226 16:08:49.757863 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/96ba881c-449c-4300-b67f-8a1e952af508-erlang-cookie-secret\") pod \"96ba881c-449c-4300-b67f-8a1e952af508\" (UID: \"96ba881c-449c-4300-b67f-8a1e952af508\") " Feb 26 16:08:49 crc kubenswrapper[4907]: I0226 16:08:49.757881 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/96ba881c-449c-4300-b67f-8a1e952af508-rabbitmq-plugins\") pod \"96ba881c-449c-4300-b67f-8a1e952af508\" (UID: \"96ba881c-449c-4300-b67f-8a1e952af508\") " Feb 26 16:08:49 crc 
kubenswrapper[4907]: I0226 16:08:49.757959 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/96ba881c-449c-4300-b67f-8a1e952af508-rabbitmq-confd\") pod \"96ba881c-449c-4300-b67f-8a1e952af508\" (UID: \"96ba881c-449c-4300-b67f-8a1e952af508\") " Feb 26 16:08:49 crc kubenswrapper[4907]: I0226 16:08:49.757974 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-98k2n\" (UniqueName: \"kubernetes.io/projected/96ba881c-449c-4300-b67f-8a1e952af508-kube-api-access-98k2n\") pod \"96ba881c-449c-4300-b67f-8a1e952af508\" (UID: \"96ba881c-449c-4300-b67f-8a1e952af508\") " Feb 26 16:08:49 crc kubenswrapper[4907]: I0226 16:08:49.758878 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/96ba881c-449c-4300-b67f-8a1e952af508-plugins-conf" (OuterVolumeSpecName: "plugins-conf") pod "96ba881c-449c-4300-b67f-8a1e952af508" (UID: "96ba881c-449c-4300-b67f-8a1e952af508"). InnerVolumeSpecName "plugins-conf". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 26 16:08:49 crc kubenswrapper[4907]: I0226 16:08:49.771724 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/96ba881c-449c-4300-b67f-8a1e952af508-erlang-cookie-secret" (OuterVolumeSpecName: "erlang-cookie-secret") pod "96ba881c-449c-4300-b67f-8a1e952af508" (UID: "96ba881c-449c-4300-b67f-8a1e952af508"). InnerVolumeSpecName "erlang-cookie-secret". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 26 16:08:49 crc kubenswrapper[4907]: I0226 16:08:49.795058 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/96ba881c-449c-4300-b67f-8a1e952af508-rabbitmq-tls" (OuterVolumeSpecName: "rabbitmq-tls") pod "96ba881c-449c-4300-b67f-8a1e952af508" (UID: "96ba881c-449c-4300-b67f-8a1e952af508"). InnerVolumeSpecName "rabbitmq-tls". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 26 16:08:49 crc kubenswrapper[4907]: I0226 16:08:49.796298 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage03-crc" (OuterVolumeSpecName: "persistence") pod "96ba881c-449c-4300-b67f-8a1e952af508" (UID: "96ba881c-449c-4300-b67f-8a1e952af508"). InnerVolumeSpecName "local-storage03-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue "" Feb 26 16:08:49 crc kubenswrapper[4907]: I0226 16:08:49.796534 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/96ba881c-449c-4300-b67f-8a1e952af508-rabbitmq-plugins" (OuterVolumeSpecName: "rabbitmq-plugins") pod "96ba881c-449c-4300-b67f-8a1e952af508" (UID: "96ba881c-449c-4300-b67f-8a1e952af508"). InnerVolumeSpecName "rabbitmq-plugins". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 26 16:08:49 crc kubenswrapper[4907]: I0226 16:08:49.796813 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/96ba881c-449c-4300-b67f-8a1e952af508-kube-api-access-98k2n" (OuterVolumeSpecName: "kube-api-access-98k2n") pod "96ba881c-449c-4300-b67f-8a1e952af508" (UID: "96ba881c-449c-4300-b67f-8a1e952af508"). InnerVolumeSpecName "kube-api-access-98k2n". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 26 16:08:49 crc kubenswrapper[4907]: I0226 16:08:49.797482 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/96ba881c-449c-4300-b67f-8a1e952af508-rabbitmq-erlang-cookie" (OuterVolumeSpecName: "rabbitmq-erlang-cookie") pod "96ba881c-449c-4300-b67f-8a1e952af508" (UID: "96ba881c-449c-4300-b67f-8a1e952af508"). InnerVolumeSpecName "rabbitmq-erlang-cookie". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 26 16:08:49 crc kubenswrapper[4907]: I0226 16:08:49.811977 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/downward-api/96ba881c-449c-4300-b67f-8a1e952af508-pod-info" (OuterVolumeSpecName: "pod-info") pod "96ba881c-449c-4300-b67f-8a1e952af508" (UID: "96ba881c-449c-4300-b67f-8a1e952af508"). InnerVolumeSpecName "pod-info". PluginName "kubernetes.io/downward-api", VolumeGidValue "" Feb 26 16:08:49 crc kubenswrapper[4907]: I0226 16:08:49.869198 4907 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage03-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") on node \"crc\" " Feb 26 16:08:49 crc kubenswrapper[4907]: I0226 16:08:49.869244 4907 reconciler_common.go:293] "Volume detached for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/96ba881c-449c-4300-b67f-8a1e952af508-pod-info\") on node \"crc\" DevicePath \"\"" Feb 26 16:08:49 crc kubenswrapper[4907]: I0226 16:08:49.869257 4907 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/96ba881c-449c-4300-b67f-8a1e952af508-rabbitmq-tls\") on node \"crc\" DevicePath \"\"" Feb 26 16:08:49 crc kubenswrapper[4907]: I0226 16:08:49.869270 4907 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/96ba881c-449c-4300-b67f-8a1e952af508-rabbitmq-erlang-cookie\") on node \"crc\" DevicePath \"\"" Feb 26 16:08:49 crc kubenswrapper[4907]: I0226 16:08:49.869283 4907 reconciler_common.go:293] "Volume detached for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/96ba881c-449c-4300-b67f-8a1e952af508-erlang-cookie-secret\") on node \"crc\" DevicePath \"\"" Feb 26 16:08:49 crc kubenswrapper[4907]: I0226 16:08:49.869293 4907 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-plugins\" (UniqueName: 
\"kubernetes.io/empty-dir/96ba881c-449c-4300-b67f-8a1e952af508-rabbitmq-plugins\") on node \"crc\" DevicePath \"\"" Feb 26 16:08:49 crc kubenswrapper[4907]: I0226 16:08:49.869306 4907 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-98k2n\" (UniqueName: \"kubernetes.io/projected/96ba881c-449c-4300-b67f-8a1e952af508-kube-api-access-98k2n\") on node \"crc\" DevicePath \"\"" Feb 26 16:08:49 crc kubenswrapper[4907]: I0226 16:08:49.869317 4907 reconciler_common.go:293] "Volume detached for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/96ba881c-449c-4300-b67f-8a1e952af508-plugins-conf\") on node \"crc\" DevicePath \"\"" Feb 26 16:08:49 crc kubenswrapper[4907]: I0226 16:08:49.873027 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/96ba881c-449c-4300-b67f-8a1e952af508-config-data" (OuterVolumeSpecName: "config-data") pod "96ba881c-449c-4300-b67f-8a1e952af508" (UID: "96ba881c-449c-4300-b67f-8a1e952af508"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 26 16:08:49 crc kubenswrapper[4907]: I0226 16:08:49.898317 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/96ba881c-449c-4300-b67f-8a1e952af508-server-conf" (OuterVolumeSpecName: "server-conf") pod "96ba881c-449c-4300-b67f-8a1e952af508" (UID: "96ba881c-449c-4300-b67f-8a1e952af508"). InnerVolumeSpecName "server-conf". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 26 16:08:49 crc kubenswrapper[4907]: I0226 16:08:49.926075 4907 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage03-crc" (UniqueName: "kubernetes.io/local-volume/local-storage03-crc") on node "crc" Feb 26 16:08:49 crc kubenswrapper[4907]: I0226 16:08:49.973335 4907 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/96ba881c-449c-4300-b67f-8a1e952af508-config-data\") on node \"crc\" DevicePath \"\"" Feb 26 16:08:49 crc kubenswrapper[4907]: I0226 16:08:49.973368 4907 reconciler_common.go:293] "Volume detached for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/96ba881c-449c-4300-b67f-8a1e952af508-server-conf\") on node \"crc\" DevicePath \"\"" Feb 26 16:08:49 crc kubenswrapper[4907]: I0226 16:08:49.973377 4907 reconciler_common.go:293] "Volume detached for volume \"local-storage03-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") on node \"crc\" DevicePath \"\"" Feb 26 16:08:50 crc kubenswrapper[4907]: I0226 16:08:50.044880 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/96ba881c-449c-4300-b67f-8a1e952af508-rabbitmq-confd" (OuterVolumeSpecName: "rabbitmq-confd") pod "96ba881c-449c-4300-b67f-8a1e952af508" (UID: "96ba881c-449c-4300-b67f-8a1e952af508"). InnerVolumeSpecName "rabbitmq-confd". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 26 16:08:50 crc kubenswrapper[4907]: I0226 16:08:50.075066 4907 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/96ba881c-449c-4300-b67f-8a1e952af508-rabbitmq-confd\") on node \"crc\" DevicePath \"\"" Feb 26 16:08:50 crc kubenswrapper[4907]: I0226 16:08:50.297928 4907 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/rabbitmq-cell1-server-0" Feb 26 16:08:50 crc kubenswrapper[4907]: I0226 16:08:50.379515 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/cca4ff23-cabb-466c-80a0-dbcc1f005123-erlang-cookie-secret\") pod \"cca4ff23-cabb-466c-80a0-dbcc1f005123\" (UID: \"cca4ff23-cabb-466c-80a0-dbcc1f005123\") " Feb 26 16:08:50 crc kubenswrapper[4907]: I0226 16:08:50.379576 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/cca4ff23-cabb-466c-80a0-dbcc1f005123-rabbitmq-plugins\") pod \"cca4ff23-cabb-466c-80a0-dbcc1f005123\" (UID: \"cca4ff23-cabb-466c-80a0-dbcc1f005123\") " Feb 26 16:08:50 crc kubenswrapper[4907]: I0226 16:08:50.379638 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/cca4ff23-cabb-466c-80a0-dbcc1f005123-config-data\") pod \"cca4ff23-cabb-466c-80a0-dbcc1f005123\" (UID: \"cca4ff23-cabb-466c-80a0-dbcc1f005123\") " Feb 26 16:08:50 crc kubenswrapper[4907]: I0226 16:08:50.379680 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/cca4ff23-cabb-466c-80a0-dbcc1f005123-server-conf\") pod \"cca4ff23-cabb-466c-80a0-dbcc1f005123\" (UID: \"cca4ff23-cabb-466c-80a0-dbcc1f005123\") " Feb 26 16:08:50 crc kubenswrapper[4907]: I0226 16:08:50.379741 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/cca4ff23-cabb-466c-80a0-dbcc1f005123-pod-info\") pod \"cca4ff23-cabb-466c-80a0-dbcc1f005123\" (UID: \"cca4ff23-cabb-466c-80a0-dbcc1f005123\") " Feb 26 16:08:50 crc kubenswrapper[4907]: I0226 16:08:50.379780 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-erlang-cookie\" 
(UniqueName: \"kubernetes.io/empty-dir/cca4ff23-cabb-466c-80a0-dbcc1f005123-rabbitmq-erlang-cookie\") pod \"cca4ff23-cabb-466c-80a0-dbcc1f005123\" (UID: \"cca4ff23-cabb-466c-80a0-dbcc1f005123\") " Feb 26 16:08:50 crc kubenswrapper[4907]: I0226 16:08:50.379824 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/cca4ff23-cabb-466c-80a0-dbcc1f005123-rabbitmq-confd\") pod \"cca4ff23-cabb-466c-80a0-dbcc1f005123\" (UID: \"cca4ff23-cabb-466c-80a0-dbcc1f005123\") " Feb 26 16:08:50 crc kubenswrapper[4907]: I0226 16:08:50.379847 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"persistence\" (UniqueName: \"kubernetes.io/local-volume/local-storage08-crc\") pod \"cca4ff23-cabb-466c-80a0-dbcc1f005123\" (UID: \"cca4ff23-cabb-466c-80a0-dbcc1f005123\") " Feb 26 16:08:50 crc kubenswrapper[4907]: I0226 16:08:50.379881 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-k6lgm\" (UniqueName: \"kubernetes.io/projected/cca4ff23-cabb-466c-80a0-dbcc1f005123-kube-api-access-k6lgm\") pod \"cca4ff23-cabb-466c-80a0-dbcc1f005123\" (UID: \"cca4ff23-cabb-466c-80a0-dbcc1f005123\") " Feb 26 16:08:50 crc kubenswrapper[4907]: I0226 16:08:50.379944 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/cca4ff23-cabb-466c-80a0-dbcc1f005123-plugins-conf\") pod \"cca4ff23-cabb-466c-80a0-dbcc1f005123\" (UID: \"cca4ff23-cabb-466c-80a0-dbcc1f005123\") " Feb 26 16:08:50 crc kubenswrapper[4907]: I0226 16:08:50.379969 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/cca4ff23-cabb-466c-80a0-dbcc1f005123-rabbitmq-tls\") pod \"cca4ff23-cabb-466c-80a0-dbcc1f005123\" (UID: \"cca4ff23-cabb-466c-80a0-dbcc1f005123\") " Feb 26 16:08:50 crc kubenswrapper[4907]: I0226 
16:08:50.384991 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/cca4ff23-cabb-466c-80a0-dbcc1f005123-rabbitmq-tls" (OuterVolumeSpecName: "rabbitmq-tls") pod "cca4ff23-cabb-466c-80a0-dbcc1f005123" (UID: "cca4ff23-cabb-466c-80a0-dbcc1f005123"). InnerVolumeSpecName "rabbitmq-tls". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 26 16:08:50 crc kubenswrapper[4907]: I0226 16:08:50.386745 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/cca4ff23-cabb-466c-80a0-dbcc1f005123-plugins-conf" (OuterVolumeSpecName: "plugins-conf") pod "cca4ff23-cabb-466c-80a0-dbcc1f005123" (UID: "cca4ff23-cabb-466c-80a0-dbcc1f005123"). InnerVolumeSpecName "plugins-conf". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 26 16:08:50 crc kubenswrapper[4907]: I0226 16:08:50.388311 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/cca4ff23-cabb-466c-80a0-dbcc1f005123-rabbitmq-plugins" (OuterVolumeSpecName: "rabbitmq-plugins") pod "cca4ff23-cabb-466c-80a0-dbcc1f005123" (UID: "cca4ff23-cabb-466c-80a0-dbcc1f005123"). InnerVolumeSpecName "rabbitmq-plugins". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 26 16:08:50 crc kubenswrapper[4907]: I0226 16:08:50.394015 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/cca4ff23-cabb-466c-80a0-dbcc1f005123-rabbitmq-erlang-cookie" (OuterVolumeSpecName: "rabbitmq-erlang-cookie") pod "cca4ff23-cabb-466c-80a0-dbcc1f005123" (UID: "cca4ff23-cabb-466c-80a0-dbcc1f005123"). InnerVolumeSpecName "rabbitmq-erlang-cookie". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 26 16:08:50 crc kubenswrapper[4907]: I0226 16:08:50.403018 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/downward-api/cca4ff23-cabb-466c-80a0-dbcc1f005123-pod-info" (OuterVolumeSpecName: "pod-info") pod "cca4ff23-cabb-466c-80a0-dbcc1f005123" (UID: "cca4ff23-cabb-466c-80a0-dbcc1f005123"). InnerVolumeSpecName "pod-info". PluginName "kubernetes.io/downward-api", VolumeGidValue "" Feb 26 16:08:50 crc kubenswrapper[4907]: I0226 16:08:50.403284 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage08-crc" (OuterVolumeSpecName: "persistence") pod "cca4ff23-cabb-466c-80a0-dbcc1f005123" (UID: "cca4ff23-cabb-466c-80a0-dbcc1f005123"). InnerVolumeSpecName "local-storage08-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue "" Feb 26 16:08:50 crc kubenswrapper[4907]: I0226 16:08:50.409692 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/cca4ff23-cabb-466c-80a0-dbcc1f005123-kube-api-access-k6lgm" (OuterVolumeSpecName: "kube-api-access-k6lgm") pod "cca4ff23-cabb-466c-80a0-dbcc1f005123" (UID: "cca4ff23-cabb-466c-80a0-dbcc1f005123"). InnerVolumeSpecName "kube-api-access-k6lgm". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 26 16:08:50 crc kubenswrapper[4907]: I0226 16:08:50.409873 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/cca4ff23-cabb-466c-80a0-dbcc1f005123-erlang-cookie-secret" (OuterVolumeSpecName: "erlang-cookie-secret") pod "cca4ff23-cabb-466c-80a0-dbcc1f005123" (UID: "cca4ff23-cabb-466c-80a0-dbcc1f005123"). InnerVolumeSpecName "erlang-cookie-secret". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 26 16:08:50 crc kubenswrapper[4907]: I0226 16:08:50.484904 4907 reconciler_common.go:293] "Volume detached for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/cca4ff23-cabb-466c-80a0-dbcc1f005123-erlang-cookie-secret\") on node \"crc\" DevicePath \"\"" Feb 26 16:08:50 crc kubenswrapper[4907]: I0226 16:08:50.484935 4907 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/cca4ff23-cabb-466c-80a0-dbcc1f005123-rabbitmq-plugins\") on node \"crc\" DevicePath \"\"" Feb 26 16:08:50 crc kubenswrapper[4907]: I0226 16:08:50.484947 4907 reconciler_common.go:293] "Volume detached for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/cca4ff23-cabb-466c-80a0-dbcc1f005123-pod-info\") on node \"crc\" DevicePath \"\"" Feb 26 16:08:50 crc kubenswrapper[4907]: I0226 16:08:50.484966 4907 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/cca4ff23-cabb-466c-80a0-dbcc1f005123-rabbitmq-erlang-cookie\") on node \"crc\" DevicePath \"\"" Feb 26 16:08:50 crc kubenswrapper[4907]: I0226 16:08:50.484997 4907 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage08-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage08-crc\") on node \"crc\" " Feb 26 16:08:50 crc kubenswrapper[4907]: I0226 16:08:50.485010 4907 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-k6lgm\" (UniqueName: \"kubernetes.io/projected/cca4ff23-cabb-466c-80a0-dbcc1f005123-kube-api-access-k6lgm\") on node \"crc\" DevicePath \"\"" Feb 26 16:08:50 crc kubenswrapper[4907]: I0226 16:08:50.485024 4907 reconciler_common.go:293] "Volume detached for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/cca4ff23-cabb-466c-80a0-dbcc1f005123-plugins-conf\") on node \"crc\" DevicePath \"\"" Feb 26 16:08:50 crc kubenswrapper[4907]: I0226 
16:08:50.485032 4907 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/cca4ff23-cabb-466c-80a0-dbcc1f005123-rabbitmq-tls\") on node \"crc\" DevicePath \"\"" Feb 26 16:08:50 crc kubenswrapper[4907]: I0226 16:08:50.522885 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/cca4ff23-cabb-466c-80a0-dbcc1f005123-config-data" (OuterVolumeSpecName: "config-data") pod "cca4ff23-cabb-466c-80a0-dbcc1f005123" (UID: "cca4ff23-cabb-466c-80a0-dbcc1f005123"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 26 16:08:50 crc kubenswrapper[4907]: I0226 16:08:50.523904 4907 generic.go:334] "Generic (PLEG): container finished" podID="cca4ff23-cabb-466c-80a0-dbcc1f005123" containerID="e0d0ba16cf9a0991250fbe1f4375f1fae0b7663840e5aacdc81be3fdd3afd8e3" exitCode=0 Feb 26 16:08:50 crc kubenswrapper[4907]: I0226 16:08:50.523961 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"cca4ff23-cabb-466c-80a0-dbcc1f005123","Type":"ContainerDied","Data":"e0d0ba16cf9a0991250fbe1f4375f1fae0b7663840e5aacdc81be3fdd3afd8e3"} Feb 26 16:08:50 crc kubenswrapper[4907]: I0226 16:08:50.523987 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"cca4ff23-cabb-466c-80a0-dbcc1f005123","Type":"ContainerDied","Data":"5558e16d18eb38160b895fd9b45060360bf46597d980c8545363832abe43461f"} Feb 26 16:08:50 crc kubenswrapper[4907]: I0226 16:08:50.524004 4907 scope.go:117] "RemoveContainer" containerID="e0d0ba16cf9a0991250fbe1f4375f1fae0b7663840e5aacdc81be3fdd3afd8e3" Feb 26 16:08:50 crc kubenswrapper[4907]: I0226 16:08:50.524097 4907 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/rabbitmq-cell1-server-0" Feb 26 16:08:50 crc kubenswrapper[4907]: I0226 16:08:50.530416 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/cca4ff23-cabb-466c-80a0-dbcc1f005123-server-conf" (OuterVolumeSpecName: "server-conf") pod "cca4ff23-cabb-466c-80a0-dbcc1f005123" (UID: "cca4ff23-cabb-466c-80a0-dbcc1f005123"). InnerVolumeSpecName "server-conf". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 26 16:08:50 crc kubenswrapper[4907]: I0226 16:08:50.551870 4907 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage08-crc" (UniqueName: "kubernetes.io/local-volume/local-storage08-crc") on node "crc" Feb 26 16:08:50 crc kubenswrapper[4907]: I0226 16:08:50.575982 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"96ba881c-449c-4300-b67f-8a1e952af508","Type":"ContainerDied","Data":"f30b4ab8c4da28e28cac478e136dd20082245886273226ca75977b1b06a3ebe1"} Feb 26 16:08:50 crc kubenswrapper[4907]: I0226 16:08:50.576071 4907 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/rabbitmq-server-0" Feb 26 16:08:50 crc kubenswrapper[4907]: I0226 16:08:50.596335 4907 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/cca4ff23-cabb-466c-80a0-dbcc1f005123-config-data\") on node \"crc\" DevicePath \"\"" Feb 26 16:08:50 crc kubenswrapper[4907]: I0226 16:08:50.596375 4907 reconciler_common.go:293] "Volume detached for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/cca4ff23-cabb-466c-80a0-dbcc1f005123-server-conf\") on node \"crc\" DevicePath \"\"" Feb 26 16:08:50 crc kubenswrapper[4907]: I0226 16:08:50.596389 4907 reconciler_common.go:293] "Volume detached for volume \"local-storage08-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage08-crc\") on node \"crc\" DevicePath \"\"" Feb 26 16:08:50 crc kubenswrapper[4907]: I0226 16:08:50.602561 4907 scope.go:117] "RemoveContainer" containerID="ebbb6bfe7182e9cd90b17c87be7c4962d1f1d25ab0a2722c9407b04029ac9d77" Feb 26 16:08:50 crc kubenswrapper[4907]: I0226 16:08:50.635849 4907 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/rabbitmq-server-0"] Feb 26 16:08:50 crc kubenswrapper[4907]: I0226 16:08:50.658279 4907 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/rabbitmq-server-0"] Feb 26 16:08:50 crc kubenswrapper[4907]: I0226 16:08:50.666026 4907 scope.go:117] "RemoveContainer" containerID="e0d0ba16cf9a0991250fbe1f4375f1fae0b7663840e5aacdc81be3fdd3afd8e3" Feb 26 16:08:50 crc kubenswrapper[4907]: I0226 16:08:50.666069 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/cca4ff23-cabb-466c-80a0-dbcc1f005123-rabbitmq-confd" (OuterVolumeSpecName: "rabbitmq-confd") pod "cca4ff23-cabb-466c-80a0-dbcc1f005123" (UID: "cca4ff23-cabb-466c-80a0-dbcc1f005123"). InnerVolumeSpecName "rabbitmq-confd". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 26 16:08:50 crc kubenswrapper[4907]: E0226 16:08:50.669762 4907 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e0d0ba16cf9a0991250fbe1f4375f1fae0b7663840e5aacdc81be3fdd3afd8e3\": container with ID starting with e0d0ba16cf9a0991250fbe1f4375f1fae0b7663840e5aacdc81be3fdd3afd8e3 not found: ID does not exist" containerID="e0d0ba16cf9a0991250fbe1f4375f1fae0b7663840e5aacdc81be3fdd3afd8e3" Feb 26 16:08:50 crc kubenswrapper[4907]: I0226 16:08:50.669828 4907 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e0d0ba16cf9a0991250fbe1f4375f1fae0b7663840e5aacdc81be3fdd3afd8e3"} err="failed to get container status \"e0d0ba16cf9a0991250fbe1f4375f1fae0b7663840e5aacdc81be3fdd3afd8e3\": rpc error: code = NotFound desc = could not find container \"e0d0ba16cf9a0991250fbe1f4375f1fae0b7663840e5aacdc81be3fdd3afd8e3\": container with ID starting with e0d0ba16cf9a0991250fbe1f4375f1fae0b7663840e5aacdc81be3fdd3afd8e3 not found: ID does not exist" Feb 26 16:08:50 crc kubenswrapper[4907]: I0226 16:08:50.669852 4907 scope.go:117] "RemoveContainer" containerID="ebbb6bfe7182e9cd90b17c87be7c4962d1f1d25ab0a2722c9407b04029ac9d77" Feb 26 16:08:50 crc kubenswrapper[4907]: E0226 16:08:50.672220 4907 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ebbb6bfe7182e9cd90b17c87be7c4962d1f1d25ab0a2722c9407b04029ac9d77\": container with ID starting with ebbb6bfe7182e9cd90b17c87be7c4962d1f1d25ab0a2722c9407b04029ac9d77 not found: ID does not exist" containerID="ebbb6bfe7182e9cd90b17c87be7c4962d1f1d25ab0a2722c9407b04029ac9d77" Feb 26 16:08:50 crc kubenswrapper[4907]: I0226 16:08:50.672254 4907 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ebbb6bfe7182e9cd90b17c87be7c4962d1f1d25ab0a2722c9407b04029ac9d77"} 
err="failed to get container status \"ebbb6bfe7182e9cd90b17c87be7c4962d1f1d25ab0a2722c9407b04029ac9d77\": rpc error: code = NotFound desc = could not find container \"ebbb6bfe7182e9cd90b17c87be7c4962d1f1d25ab0a2722c9407b04029ac9d77\": container with ID starting with ebbb6bfe7182e9cd90b17c87be7c4962d1f1d25ab0a2722c9407b04029ac9d77 not found: ID does not exist" Feb 26 16:08:50 crc kubenswrapper[4907]: I0226 16:08:50.672275 4907 scope.go:117] "RemoveContainer" containerID="1acc9ffabff45e7a23fbd242599d1693e106acd3557c6aa619db091bb41fc243" Feb 26 16:08:50 crc kubenswrapper[4907]: I0226 16:08:50.681188 4907 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/rabbitmq-server-0"] Feb 26 16:08:50 crc kubenswrapper[4907]: E0226 16:08:50.681544 4907 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="96ba881c-449c-4300-b67f-8a1e952af508" containerName="rabbitmq" Feb 26 16:08:50 crc kubenswrapper[4907]: I0226 16:08:50.681559 4907 state_mem.go:107] "Deleted CPUSet assignment" podUID="96ba881c-449c-4300-b67f-8a1e952af508" containerName="rabbitmq" Feb 26 16:08:50 crc kubenswrapper[4907]: E0226 16:08:50.689615 4907 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="911d5df8-d8e2-4552-9c75-33c5ab72646b" containerName="horizon" Feb 26 16:08:50 crc kubenswrapper[4907]: I0226 16:08:50.689661 4907 state_mem.go:107] "Deleted CPUSet assignment" podUID="911d5df8-d8e2-4552-9c75-33c5ab72646b" containerName="horizon" Feb 26 16:08:50 crc kubenswrapper[4907]: E0226 16:08:50.689685 4907 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="cca4ff23-cabb-466c-80a0-dbcc1f005123" containerName="setup-container" Feb 26 16:08:50 crc kubenswrapper[4907]: I0226 16:08:50.689692 4907 state_mem.go:107] "Deleted CPUSet assignment" podUID="cca4ff23-cabb-466c-80a0-dbcc1f005123" containerName="setup-container" Feb 26 16:08:50 crc kubenswrapper[4907]: E0226 16:08:50.689736 4907 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="96ba881c-449c-4300-b67f-8a1e952af508" containerName="setup-container" Feb 26 16:08:50 crc kubenswrapper[4907]: I0226 16:08:50.689742 4907 state_mem.go:107] "Deleted CPUSet assignment" podUID="96ba881c-449c-4300-b67f-8a1e952af508" containerName="setup-container" Feb 26 16:08:50 crc kubenswrapper[4907]: E0226 16:08:50.689761 4907 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="911d5df8-d8e2-4552-9c75-33c5ab72646b" containerName="horizon-log" Feb 26 16:08:50 crc kubenswrapper[4907]: I0226 16:08:50.689768 4907 state_mem.go:107] "Deleted CPUSet assignment" podUID="911d5df8-d8e2-4552-9c75-33c5ab72646b" containerName="horizon-log" Feb 26 16:08:50 crc kubenswrapper[4907]: E0226 16:08:50.689784 4907 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="cca4ff23-cabb-466c-80a0-dbcc1f005123" containerName="rabbitmq" Feb 26 16:08:50 crc kubenswrapper[4907]: I0226 16:08:50.689792 4907 state_mem.go:107] "Deleted CPUSet assignment" podUID="cca4ff23-cabb-466c-80a0-dbcc1f005123" containerName="rabbitmq" Feb 26 16:08:50 crc kubenswrapper[4907]: E0226 16:08:50.689816 4907 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="911d5df8-d8e2-4552-9c75-33c5ab72646b" containerName="horizon" Feb 26 16:08:50 crc kubenswrapper[4907]: I0226 16:08:50.689823 4907 state_mem.go:107] "Deleted CPUSet assignment" podUID="911d5df8-d8e2-4552-9c75-33c5ab72646b" containerName="horizon" Feb 26 16:08:50 crc kubenswrapper[4907]: I0226 16:08:50.690099 4907 memory_manager.go:354] "RemoveStaleState removing state" podUID="cca4ff23-cabb-466c-80a0-dbcc1f005123" containerName="rabbitmq" Feb 26 16:08:50 crc kubenswrapper[4907]: I0226 16:08:50.690113 4907 memory_manager.go:354] "RemoveStaleState removing state" podUID="96ba881c-449c-4300-b67f-8a1e952af508" containerName="rabbitmq" Feb 26 16:08:50 crc kubenswrapper[4907]: I0226 16:08:50.690125 4907 memory_manager.go:354] "RemoveStaleState removing state" podUID="911d5df8-d8e2-4552-9c75-33c5ab72646b" 
containerName="horizon" Feb 26 16:08:50 crc kubenswrapper[4907]: I0226 16:08:50.690137 4907 memory_manager.go:354] "RemoveStaleState removing state" podUID="911d5df8-d8e2-4552-9c75-33c5ab72646b" containerName="horizon" Feb 26 16:08:50 crc kubenswrapper[4907]: I0226 16:08:50.690147 4907 memory_manager.go:354] "RemoveStaleState removing state" podUID="911d5df8-d8e2-4552-9c75-33c5ab72646b" containerName="horizon" Feb 26 16:08:50 crc kubenswrapper[4907]: I0226 16:08:50.690158 4907 memory_manager.go:354] "RemoveStaleState removing state" podUID="911d5df8-d8e2-4552-9c75-33c5ab72646b" containerName="horizon-log" Feb 26 16:08:50 crc kubenswrapper[4907]: E0226 16:08:50.690310 4907 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="911d5df8-d8e2-4552-9c75-33c5ab72646b" containerName="horizon" Feb 26 16:08:50 crc kubenswrapper[4907]: I0226 16:08:50.690322 4907 state_mem.go:107] "Deleted CPUSet assignment" podUID="911d5df8-d8e2-4552-9c75-33c5ab72646b" containerName="horizon" Feb 26 16:08:50 crc kubenswrapper[4907]: I0226 16:08:50.691068 4907 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/rabbitmq-server-0" Feb 26 16:08:50 crc kubenswrapper[4907]: I0226 16:08:50.704813 4907 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-rabbitmq-svc" Feb 26 16:08:50 crc kubenswrapper[4907]: I0226 16:08:50.705277 4907 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-server-conf" Feb 26 16:08:50 crc kubenswrapper[4907]: I0226 16:08:50.705399 4907 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-server-dockercfg-ptp6b" Feb 26 16:08:50 crc kubenswrapper[4907]: I0226 16:08:50.705994 4907 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-plugins-conf" Feb 26 16:08:50 crc kubenswrapper[4907]: I0226 16:08:50.706146 4907 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-erlang-cookie" Feb 26 16:08:50 crc kubenswrapper[4907]: I0226 16:08:50.706474 4907 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-default-user" Feb 26 16:08:50 crc kubenswrapper[4907]: I0226 16:08:50.719206 4907 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/cca4ff23-cabb-466c-80a0-dbcc1f005123-rabbitmq-confd\") on node \"crc\" DevicePath \"\"" Feb 26 16:08:50 crc kubenswrapper[4907]: I0226 16:08:50.726924 4907 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-config-data" Feb 26 16:08:50 crc kubenswrapper[4907]: I0226 16:08:50.732476 4907 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-server-0"] Feb 26 16:08:50 crc kubenswrapper[4907]: I0226 16:08:50.751574 4907 scope.go:117] "RemoveContainer" containerID="e28a3b8c761243a769d04d190d2ae365641bcbb802321434379486662fc95053" Feb 26 16:08:50 crc kubenswrapper[4907]: I0226 16:08:50.925548 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"config-data\" (UniqueName: \"kubernetes.io/configmap/20078d55-ee5c-4818-9ff9-4089683c9729-config-data\") pod \"rabbitmq-server-0\" (UID: \"20078d55-ee5c-4818-9ff9-4089683c9729\") " pod="openstack/rabbitmq-server-0" Feb 26 16:08:50 crc kubenswrapper[4907]: I0226 16:08:50.925606 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/20078d55-ee5c-4818-9ff9-4089683c9729-rabbitmq-confd\") pod \"rabbitmq-server-0\" (UID: \"20078d55-ee5c-4818-9ff9-4089683c9729\") " pod="openstack/rabbitmq-server-0" Feb 26 16:08:50 crc kubenswrapper[4907]: I0226 16:08:50.925659 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/20078d55-ee5c-4818-9ff9-4089683c9729-plugins-conf\") pod \"rabbitmq-server-0\" (UID: \"20078d55-ee5c-4818-9ff9-4089683c9729\") " pod="openstack/rabbitmq-server-0" Feb 26 16:08:50 crc kubenswrapper[4907]: I0226 16:08:50.925730 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/20078d55-ee5c-4818-9ff9-4089683c9729-rabbitmq-plugins\") pod \"rabbitmq-server-0\" (UID: \"20078d55-ee5c-4818-9ff9-4089683c9729\") " pod="openstack/rabbitmq-server-0" Feb 26 16:08:50 crc kubenswrapper[4907]: I0226 16:08:50.925790 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/20078d55-ee5c-4818-9ff9-4089683c9729-rabbitmq-erlang-cookie\") pod \"rabbitmq-server-0\" (UID: \"20078d55-ee5c-4818-9ff9-4089683c9729\") " pod="openstack/rabbitmq-server-0" Feb 26 16:08:50 crc kubenswrapper[4907]: I0226 16:08:50.925824 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"server-conf\" (UniqueName: 
\"kubernetes.io/configmap/20078d55-ee5c-4818-9ff9-4089683c9729-server-conf\") pod \"rabbitmq-server-0\" (UID: \"20078d55-ee5c-4818-9ff9-4089683c9729\") " pod="openstack/rabbitmq-server-0" Feb 26 16:08:50 crc kubenswrapper[4907]: I0226 16:08:50.925856 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-86db8\" (UniqueName: \"kubernetes.io/projected/20078d55-ee5c-4818-9ff9-4089683c9729-kube-api-access-86db8\") pod \"rabbitmq-server-0\" (UID: \"20078d55-ee5c-4818-9ff9-4089683c9729\") " pod="openstack/rabbitmq-server-0" Feb 26 16:08:50 crc kubenswrapper[4907]: I0226 16:08:50.925877 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage03-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") pod \"rabbitmq-server-0\" (UID: \"20078d55-ee5c-4818-9ff9-4089683c9729\") " pod="openstack/rabbitmq-server-0" Feb 26 16:08:50 crc kubenswrapper[4907]: I0226 16:08:50.925970 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/20078d55-ee5c-4818-9ff9-4089683c9729-rabbitmq-tls\") pod \"rabbitmq-server-0\" (UID: \"20078d55-ee5c-4818-9ff9-4089683c9729\") " pod="openstack/rabbitmq-server-0" Feb 26 16:08:50 crc kubenswrapper[4907]: I0226 16:08:50.926038 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/20078d55-ee5c-4818-9ff9-4089683c9729-pod-info\") pod \"rabbitmq-server-0\" (UID: \"20078d55-ee5c-4818-9ff9-4089683c9729\") " pod="openstack/rabbitmq-server-0" Feb 26 16:08:50 crc kubenswrapper[4907]: I0226 16:08:50.926064 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/20078d55-ee5c-4818-9ff9-4089683c9729-erlang-cookie-secret\") pod 
\"rabbitmq-server-0\" (UID: \"20078d55-ee5c-4818-9ff9-4089683c9729\") " pod="openstack/rabbitmq-server-0" Feb 26 16:08:50 crc kubenswrapper[4907]: I0226 16:08:50.950644 4907 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Feb 26 16:08:50 crc kubenswrapper[4907]: I0226 16:08:50.957268 4907 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Feb 26 16:08:50 crc kubenswrapper[4907]: I0226 16:08:50.973482 4907 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Feb 26 16:08:50 crc kubenswrapper[4907]: I0226 16:08:50.975241 4907 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-cell1-server-0" Feb 26 16:08:50 crc kubenswrapper[4907]: I0226 16:08:50.979311 4907 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-cell1-plugins-conf" Feb 26 16:08:50 crc kubenswrapper[4907]: I0226 16:08:50.979510 4907 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-cell1-config-data" Feb 26 16:08:50 crc kubenswrapper[4907]: I0226 16:08:50.979669 4907 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-cell1-server-dockercfg-kqxnc" Feb 26 16:08:50 crc kubenswrapper[4907]: I0226 16:08:50.979776 4907 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-cell1-erlang-cookie" Feb 26 16:08:50 crc kubenswrapper[4907]: I0226 16:08:50.979825 4907 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-rabbitmq-cell1-svc" Feb 26 16:08:50 crc kubenswrapper[4907]: I0226 16:08:50.980052 4907 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-cell1-server-conf" Feb 26 16:08:50 crc kubenswrapper[4907]: I0226 16:08:50.985106 4907 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-cell1-default-user" Feb 26 16:08:50 crc 
kubenswrapper[4907]: I0226 16:08:50.990948 4907 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Feb 26 16:08:51 crc kubenswrapper[4907]: I0226 16:08:51.027428 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/20078d55-ee5c-4818-9ff9-4089683c9729-plugins-conf\") pod \"rabbitmq-server-0\" (UID: \"20078d55-ee5c-4818-9ff9-4089683c9729\") " pod="openstack/rabbitmq-server-0" Feb 26 16:08:51 crc kubenswrapper[4907]: I0226 16:08:51.027505 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/20078d55-ee5c-4818-9ff9-4089683c9729-rabbitmq-plugins\") pod \"rabbitmq-server-0\" (UID: \"20078d55-ee5c-4818-9ff9-4089683c9729\") " pod="openstack/rabbitmq-server-0" Feb 26 16:08:51 crc kubenswrapper[4907]: I0226 16:08:51.027550 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/20078d55-ee5c-4818-9ff9-4089683c9729-rabbitmq-erlang-cookie\") pod \"rabbitmq-server-0\" (UID: \"20078d55-ee5c-4818-9ff9-4089683c9729\") " pod="openstack/rabbitmq-server-0" Feb 26 16:08:51 crc kubenswrapper[4907]: I0226 16:08:51.027581 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/20078d55-ee5c-4818-9ff9-4089683c9729-server-conf\") pod \"rabbitmq-server-0\" (UID: \"20078d55-ee5c-4818-9ff9-4089683c9729\") " pod="openstack/rabbitmq-server-0" Feb 26 16:08:51 crc kubenswrapper[4907]: I0226 16:08:51.027631 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-86db8\" (UniqueName: \"kubernetes.io/projected/20078d55-ee5c-4818-9ff9-4089683c9729-kube-api-access-86db8\") pod \"rabbitmq-server-0\" (UID: \"20078d55-ee5c-4818-9ff9-4089683c9729\") " pod="openstack/rabbitmq-server-0" 
Feb 26 16:08:51 crc kubenswrapper[4907]: I0226 16:08:51.027651 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage03-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") pod \"rabbitmq-server-0\" (UID: \"20078d55-ee5c-4818-9ff9-4089683c9729\") " pod="openstack/rabbitmq-server-0" Feb 26 16:08:51 crc kubenswrapper[4907]: I0226 16:08:51.027712 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/20078d55-ee5c-4818-9ff9-4089683c9729-rabbitmq-tls\") pod \"rabbitmq-server-0\" (UID: \"20078d55-ee5c-4818-9ff9-4089683c9729\") " pod="openstack/rabbitmq-server-0" Feb 26 16:08:51 crc kubenswrapper[4907]: I0226 16:08:51.027738 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/20078d55-ee5c-4818-9ff9-4089683c9729-pod-info\") pod \"rabbitmq-server-0\" (UID: \"20078d55-ee5c-4818-9ff9-4089683c9729\") " pod="openstack/rabbitmq-server-0" Feb 26 16:08:51 crc kubenswrapper[4907]: I0226 16:08:51.027761 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/20078d55-ee5c-4818-9ff9-4089683c9729-erlang-cookie-secret\") pod \"rabbitmq-server-0\" (UID: \"20078d55-ee5c-4818-9ff9-4089683c9729\") " pod="openstack/rabbitmq-server-0" Feb 26 16:08:51 crc kubenswrapper[4907]: I0226 16:08:51.027780 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/20078d55-ee5c-4818-9ff9-4089683c9729-config-data\") pod \"rabbitmq-server-0\" (UID: \"20078d55-ee5c-4818-9ff9-4089683c9729\") " pod="openstack/rabbitmq-server-0" Feb 26 16:08:51 crc kubenswrapper[4907]: I0226 16:08:51.027802 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-confd\" (UniqueName: 
\"kubernetes.io/projected/20078d55-ee5c-4818-9ff9-4089683c9729-rabbitmq-confd\") pod \"rabbitmq-server-0\" (UID: \"20078d55-ee5c-4818-9ff9-4089683c9729\") " pod="openstack/rabbitmq-server-0" Feb 26 16:08:51 crc kubenswrapper[4907]: I0226 16:08:51.028877 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/20078d55-ee5c-4818-9ff9-4089683c9729-rabbitmq-plugins\") pod \"rabbitmq-server-0\" (UID: \"20078d55-ee5c-4818-9ff9-4089683c9729\") " pod="openstack/rabbitmq-server-0" Feb 26 16:08:51 crc kubenswrapper[4907]: I0226 16:08:51.030788 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/20078d55-ee5c-4818-9ff9-4089683c9729-config-data\") pod \"rabbitmq-server-0\" (UID: \"20078d55-ee5c-4818-9ff9-4089683c9729\") " pod="openstack/rabbitmq-server-0" Feb 26 16:08:51 crc kubenswrapper[4907]: I0226 16:08:51.031283 4907 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage03-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") pod \"rabbitmq-server-0\" (UID: \"20078d55-ee5c-4818-9ff9-4089683c9729\") device mount path \"/mnt/openstack/pv03\"" pod="openstack/rabbitmq-server-0" Feb 26 16:08:51 crc kubenswrapper[4907]: I0226 16:08:51.033725 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/20078d55-ee5c-4818-9ff9-4089683c9729-erlang-cookie-secret\") pod \"rabbitmq-server-0\" (UID: \"20078d55-ee5c-4818-9ff9-4089683c9729\") " pod="openstack/rabbitmq-server-0" Feb 26 16:08:51 crc kubenswrapper[4907]: I0226 16:08:51.035134 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/20078d55-ee5c-4818-9ff9-4089683c9729-rabbitmq-erlang-cookie\") pod \"rabbitmq-server-0\" (UID: \"20078d55-ee5c-4818-9ff9-4089683c9729\") " 
pod="openstack/rabbitmq-server-0" Feb 26 16:08:51 crc kubenswrapper[4907]: I0226 16:08:51.035377 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/20078d55-ee5c-4818-9ff9-4089683c9729-plugins-conf\") pod \"rabbitmq-server-0\" (UID: \"20078d55-ee5c-4818-9ff9-4089683c9729\") " pod="openstack/rabbitmq-server-0" Feb 26 16:08:51 crc kubenswrapper[4907]: I0226 16:08:51.048474 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/20078d55-ee5c-4818-9ff9-4089683c9729-server-conf\") pod \"rabbitmq-server-0\" (UID: \"20078d55-ee5c-4818-9ff9-4089683c9729\") " pod="openstack/rabbitmq-server-0" Feb 26 16:08:51 crc kubenswrapper[4907]: I0226 16:08:51.050190 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/20078d55-ee5c-4818-9ff9-4089683c9729-rabbitmq-tls\") pod \"rabbitmq-server-0\" (UID: \"20078d55-ee5c-4818-9ff9-4089683c9729\") " pod="openstack/rabbitmq-server-0" Feb 26 16:08:51 crc kubenswrapper[4907]: I0226 16:08:51.074000 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/20078d55-ee5c-4818-9ff9-4089683c9729-pod-info\") pod \"rabbitmq-server-0\" (UID: \"20078d55-ee5c-4818-9ff9-4089683c9729\") " pod="openstack/rabbitmq-server-0" Feb 26 16:08:51 crc kubenswrapper[4907]: I0226 16:08:51.078283 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/20078d55-ee5c-4818-9ff9-4089683c9729-rabbitmq-confd\") pod \"rabbitmq-server-0\" (UID: \"20078d55-ee5c-4818-9ff9-4089683c9729\") " pod="openstack/rabbitmq-server-0" Feb 26 16:08:51 crc kubenswrapper[4907]: I0226 16:08:51.084238 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-86db8\" (UniqueName: 
\"kubernetes.io/projected/20078d55-ee5c-4818-9ff9-4089683c9729-kube-api-access-86db8\") pod \"rabbitmq-server-0\" (UID: \"20078d55-ee5c-4818-9ff9-4089683c9729\") " pod="openstack/rabbitmq-server-0" Feb 26 16:08:51 crc kubenswrapper[4907]: I0226 16:08:51.114250 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage03-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") pod \"rabbitmq-server-0\" (UID: \"20078d55-ee5c-4818-9ff9-4089683c9729\") " pod="openstack/rabbitmq-server-0" Feb 26 16:08:51 crc kubenswrapper[4907]: I0226 16:08:51.128860 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/cbc69627-1691-43df-a77a-ca3e26e67aaa-rabbitmq-plugins\") pod \"rabbitmq-cell1-server-0\" (UID: \"cbc69627-1691-43df-a77a-ca3e26e67aaa\") " pod="openstack/rabbitmq-cell1-server-0" Feb 26 16:08:51 crc kubenswrapper[4907]: I0226 16:08:51.128913 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/cbc69627-1691-43df-a77a-ca3e26e67aaa-pod-info\") pod \"rabbitmq-cell1-server-0\" (UID: \"cbc69627-1691-43df-a77a-ca3e26e67aaa\") " pod="openstack/rabbitmq-cell1-server-0" Feb 26 16:08:51 crc kubenswrapper[4907]: I0226 16:08:51.128950 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage08-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage08-crc\") pod \"rabbitmq-cell1-server-0\" (UID: \"cbc69627-1691-43df-a77a-ca3e26e67aaa\") " pod="openstack/rabbitmq-cell1-server-0" Feb 26 16:08:51 crc kubenswrapper[4907]: I0226 16:08:51.128966 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/cbc69627-1691-43df-a77a-ca3e26e67aaa-rabbitmq-tls\") pod 
\"rabbitmq-cell1-server-0\" (UID: \"cbc69627-1691-43df-a77a-ca3e26e67aaa\") " pod="openstack/rabbitmq-cell1-server-0" Feb 26 16:08:51 crc kubenswrapper[4907]: I0226 16:08:51.129206 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/cbc69627-1691-43df-a77a-ca3e26e67aaa-config-data\") pod \"rabbitmq-cell1-server-0\" (UID: \"cbc69627-1691-43df-a77a-ca3e26e67aaa\") " pod="openstack/rabbitmq-cell1-server-0" Feb 26 16:08:51 crc kubenswrapper[4907]: I0226 16:08:51.129226 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/cbc69627-1691-43df-a77a-ca3e26e67aaa-plugins-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"cbc69627-1691-43df-a77a-ca3e26e67aaa\") " pod="openstack/rabbitmq-cell1-server-0" Feb 26 16:08:51 crc kubenswrapper[4907]: I0226 16:08:51.129279 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/cbc69627-1691-43df-a77a-ca3e26e67aaa-rabbitmq-confd\") pod \"rabbitmq-cell1-server-0\" (UID: \"cbc69627-1691-43df-a77a-ca3e26e67aaa\") " pod="openstack/rabbitmq-cell1-server-0" Feb 26 16:08:51 crc kubenswrapper[4907]: I0226 16:08:51.129295 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-b44d2\" (UniqueName: \"kubernetes.io/projected/cbc69627-1691-43df-a77a-ca3e26e67aaa-kube-api-access-b44d2\") pod \"rabbitmq-cell1-server-0\" (UID: \"cbc69627-1691-43df-a77a-ca3e26e67aaa\") " pod="openstack/rabbitmq-cell1-server-0" Feb 26 16:08:51 crc kubenswrapper[4907]: I0226 16:08:51.129346 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/cbc69627-1691-43df-a77a-ca3e26e67aaa-erlang-cookie-secret\") 
pod \"rabbitmq-cell1-server-0\" (UID: \"cbc69627-1691-43df-a77a-ca3e26e67aaa\") " pod="openstack/rabbitmq-cell1-server-0" Feb 26 16:08:51 crc kubenswrapper[4907]: I0226 16:08:51.129367 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/cbc69627-1691-43df-a77a-ca3e26e67aaa-rabbitmq-erlang-cookie\") pod \"rabbitmq-cell1-server-0\" (UID: \"cbc69627-1691-43df-a77a-ca3e26e67aaa\") " pod="openstack/rabbitmq-cell1-server-0" Feb 26 16:08:51 crc kubenswrapper[4907]: I0226 16:08:51.129439 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/cbc69627-1691-43df-a77a-ca3e26e67aaa-server-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"cbc69627-1691-43df-a77a-ca3e26e67aaa\") " pod="openstack/rabbitmq-cell1-server-0" Feb 26 16:08:51 crc kubenswrapper[4907]: I0226 16:08:51.231435 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/cbc69627-1691-43df-a77a-ca3e26e67aaa-rabbitmq-confd\") pod \"rabbitmq-cell1-server-0\" (UID: \"cbc69627-1691-43df-a77a-ca3e26e67aaa\") " pod="openstack/rabbitmq-cell1-server-0" Feb 26 16:08:51 crc kubenswrapper[4907]: I0226 16:08:51.231723 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-b44d2\" (UniqueName: \"kubernetes.io/projected/cbc69627-1691-43df-a77a-ca3e26e67aaa-kube-api-access-b44d2\") pod \"rabbitmq-cell1-server-0\" (UID: \"cbc69627-1691-43df-a77a-ca3e26e67aaa\") " pod="openstack/rabbitmq-cell1-server-0" Feb 26 16:08:51 crc kubenswrapper[4907]: I0226 16:08:51.231745 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/cbc69627-1691-43df-a77a-ca3e26e67aaa-erlang-cookie-secret\") pod \"rabbitmq-cell1-server-0\" 
(UID: \"cbc69627-1691-43df-a77a-ca3e26e67aaa\") " pod="openstack/rabbitmq-cell1-server-0" Feb 26 16:08:51 crc kubenswrapper[4907]: I0226 16:08:51.231773 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/cbc69627-1691-43df-a77a-ca3e26e67aaa-rabbitmq-erlang-cookie\") pod \"rabbitmq-cell1-server-0\" (UID: \"cbc69627-1691-43df-a77a-ca3e26e67aaa\") " pod="openstack/rabbitmq-cell1-server-0" Feb 26 16:08:51 crc kubenswrapper[4907]: I0226 16:08:51.231837 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/cbc69627-1691-43df-a77a-ca3e26e67aaa-server-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"cbc69627-1691-43df-a77a-ca3e26e67aaa\") " pod="openstack/rabbitmq-cell1-server-0" Feb 26 16:08:51 crc kubenswrapper[4907]: I0226 16:08:51.231876 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/cbc69627-1691-43df-a77a-ca3e26e67aaa-rabbitmq-plugins\") pod \"rabbitmq-cell1-server-0\" (UID: \"cbc69627-1691-43df-a77a-ca3e26e67aaa\") " pod="openstack/rabbitmq-cell1-server-0" Feb 26 16:08:51 crc kubenswrapper[4907]: I0226 16:08:51.231951 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/cbc69627-1691-43df-a77a-ca3e26e67aaa-pod-info\") pod \"rabbitmq-cell1-server-0\" (UID: \"cbc69627-1691-43df-a77a-ca3e26e67aaa\") " pod="openstack/rabbitmq-cell1-server-0" Feb 26 16:08:51 crc kubenswrapper[4907]: I0226 16:08:51.231966 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/cbc69627-1691-43df-a77a-ca3e26e67aaa-rabbitmq-tls\") pod \"rabbitmq-cell1-server-0\" (UID: \"cbc69627-1691-43df-a77a-ca3e26e67aaa\") " pod="openstack/rabbitmq-cell1-server-0" Feb 26 
16:08:51 crc kubenswrapper[4907]: I0226 16:08:51.231981 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage08-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage08-crc\") pod \"rabbitmq-cell1-server-0\" (UID: \"cbc69627-1691-43df-a77a-ca3e26e67aaa\") " pod="openstack/rabbitmq-cell1-server-0" Feb 26 16:08:51 crc kubenswrapper[4907]: I0226 16:08:51.232060 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/cbc69627-1691-43df-a77a-ca3e26e67aaa-config-data\") pod \"rabbitmq-cell1-server-0\" (UID: \"cbc69627-1691-43df-a77a-ca3e26e67aaa\") " pod="openstack/rabbitmq-cell1-server-0" Feb 26 16:08:51 crc kubenswrapper[4907]: I0226 16:08:51.232077 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/cbc69627-1691-43df-a77a-ca3e26e67aaa-plugins-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"cbc69627-1691-43df-a77a-ca3e26e67aaa\") " pod="openstack/rabbitmq-cell1-server-0" Feb 26 16:08:51 crc kubenswrapper[4907]: I0226 16:08:51.232495 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/cbc69627-1691-43df-a77a-ca3e26e67aaa-rabbitmq-plugins\") pod \"rabbitmq-cell1-server-0\" (UID: \"cbc69627-1691-43df-a77a-ca3e26e67aaa\") " pod="openstack/rabbitmq-cell1-server-0" Feb 26 16:08:51 crc kubenswrapper[4907]: I0226 16:08:51.233633 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/cbc69627-1691-43df-a77a-ca3e26e67aaa-rabbitmq-erlang-cookie\") pod \"rabbitmq-cell1-server-0\" (UID: \"cbc69627-1691-43df-a77a-ca3e26e67aaa\") " pod="openstack/rabbitmq-cell1-server-0" Feb 26 16:08:51 crc kubenswrapper[4907]: I0226 16:08:51.234510 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"server-conf\" (UniqueName: \"kubernetes.io/configmap/cbc69627-1691-43df-a77a-ca3e26e67aaa-server-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"cbc69627-1691-43df-a77a-ca3e26e67aaa\") " pod="openstack/rabbitmq-cell1-server-0" Feb 26 16:08:51 crc kubenswrapper[4907]: I0226 16:08:51.234795 4907 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage08-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage08-crc\") pod \"rabbitmq-cell1-server-0\" (UID: \"cbc69627-1691-43df-a77a-ca3e26e67aaa\") device mount path \"/mnt/openstack/pv08\"" pod="openstack/rabbitmq-cell1-server-0" Feb 26 16:08:51 crc kubenswrapper[4907]: I0226 16:08:51.235065 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/cbc69627-1691-43df-a77a-ca3e26e67aaa-config-data\") pod \"rabbitmq-cell1-server-0\" (UID: \"cbc69627-1691-43df-a77a-ca3e26e67aaa\") " pod="openstack/rabbitmq-cell1-server-0" Feb 26 16:08:51 crc kubenswrapper[4907]: I0226 16:08:51.235582 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/cbc69627-1691-43df-a77a-ca3e26e67aaa-rabbitmq-confd\") pod \"rabbitmq-cell1-server-0\" (UID: \"cbc69627-1691-43df-a77a-ca3e26e67aaa\") " pod="openstack/rabbitmq-cell1-server-0" Feb 26 16:08:51 crc kubenswrapper[4907]: I0226 16:08:51.236007 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/cbc69627-1691-43df-a77a-ca3e26e67aaa-plugins-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"cbc69627-1691-43df-a77a-ca3e26e67aaa\") " pod="openstack/rabbitmq-cell1-server-0" Feb 26 16:08:51 crc kubenswrapper[4907]: I0226 16:08:51.239412 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/cbc69627-1691-43df-a77a-ca3e26e67aaa-rabbitmq-tls\") pod 
\"rabbitmq-cell1-server-0\" (UID: \"cbc69627-1691-43df-a77a-ca3e26e67aaa\") " pod="openstack/rabbitmq-cell1-server-0" Feb 26 16:08:51 crc kubenswrapper[4907]: I0226 16:08:51.240412 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/cbc69627-1691-43df-a77a-ca3e26e67aaa-erlang-cookie-secret\") pod \"rabbitmq-cell1-server-0\" (UID: \"cbc69627-1691-43df-a77a-ca3e26e67aaa\") " pod="openstack/rabbitmq-cell1-server-0" Feb 26 16:08:51 crc kubenswrapper[4907]: I0226 16:08:51.241287 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/cbc69627-1691-43df-a77a-ca3e26e67aaa-pod-info\") pod \"rabbitmq-cell1-server-0\" (UID: \"cbc69627-1691-43df-a77a-ca3e26e67aaa\") " pod="openstack/rabbitmq-cell1-server-0" Feb 26 16:08:51 crc kubenswrapper[4907]: I0226 16:08:51.253694 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-b44d2\" (UniqueName: \"kubernetes.io/projected/cbc69627-1691-43df-a77a-ca3e26e67aaa-kube-api-access-b44d2\") pod \"rabbitmq-cell1-server-0\" (UID: \"cbc69627-1691-43df-a77a-ca3e26e67aaa\") " pod="openstack/rabbitmq-cell1-server-0" Feb 26 16:08:51 crc kubenswrapper[4907]: I0226 16:08:51.272305 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage08-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage08-crc\") pod \"rabbitmq-cell1-server-0\" (UID: \"cbc69627-1691-43df-a77a-ca3e26e67aaa\") " pod="openstack/rabbitmq-cell1-server-0" Feb 26 16:08:51 crc kubenswrapper[4907]: I0226 16:08:51.295244 4907 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-cell1-server-0" Feb 26 16:08:51 crc kubenswrapper[4907]: I0226 16:08:51.342854 4907 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/rabbitmq-server-0"
Feb 26 16:08:51 crc kubenswrapper[4907]: I0226 16:08:51.379888 4907 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-d558885bc-zxcrp"]
Feb 26 16:08:51 crc kubenswrapper[4907]: I0226 16:08:51.381494 4907 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-d558885bc-zxcrp"
Feb 26 16:08:51 crc kubenswrapper[4907]: I0226 16:08:51.385228 4907 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-edpm-ipam"
Feb 26 16:08:51 crc kubenswrapper[4907]: I0226 16:08:51.396780 4907 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-d558885bc-zxcrp"]
Feb 26 16:08:51 crc kubenswrapper[4907]: I0226 16:08:51.449820 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d7461168-b46b-48da-ace3-8f02feb49468-config\") pod \"dnsmasq-dns-d558885bc-zxcrp\" (UID: \"d7461168-b46b-48da-ace3-8f02feb49468\") " pod="openstack/dnsmasq-dns-d558885bc-zxcrp"
Feb 26 16:08:51 crc kubenswrapper[4907]: I0226 16:08:51.449862 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/d7461168-b46b-48da-ace3-8f02feb49468-dns-svc\") pod \"dnsmasq-dns-d558885bc-zxcrp\" (UID: \"d7461168-b46b-48da-ace3-8f02feb49468\") " pod="openstack/dnsmasq-dns-d558885bc-zxcrp"
Feb 26 16:08:51 crc kubenswrapper[4907]: I0226 16:08:51.449907 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-w9zpq\" (UniqueName: \"kubernetes.io/projected/d7461168-b46b-48da-ace3-8f02feb49468-kube-api-access-w9zpq\") pod \"dnsmasq-dns-d558885bc-zxcrp\" (UID: \"d7461168-b46b-48da-ace3-8f02feb49468\") " pod="openstack/dnsmasq-dns-d558885bc-zxcrp"
Feb 26 16:08:51 crc kubenswrapper[4907]: I0226 16:08:51.449958 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/d7461168-b46b-48da-ace3-8f02feb49468-ovsdbserver-nb\") pod \"dnsmasq-dns-d558885bc-zxcrp\" (UID: \"d7461168-b46b-48da-ace3-8f02feb49468\") " pod="openstack/dnsmasq-dns-d558885bc-zxcrp"
Feb 26 16:08:51 crc kubenswrapper[4907]: I0226 16:08:51.449981 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/d7461168-b46b-48da-ace3-8f02feb49468-dns-swift-storage-0\") pod \"dnsmasq-dns-d558885bc-zxcrp\" (UID: \"d7461168-b46b-48da-ace3-8f02feb49468\") " pod="openstack/dnsmasq-dns-d558885bc-zxcrp"
Feb 26 16:08:51 crc kubenswrapper[4907]: I0226 16:08:51.450028 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/configmap/d7461168-b46b-48da-ace3-8f02feb49468-openstack-edpm-ipam\") pod \"dnsmasq-dns-d558885bc-zxcrp\" (UID: \"d7461168-b46b-48da-ace3-8f02feb49468\") " pod="openstack/dnsmasq-dns-d558885bc-zxcrp"
Feb 26 16:08:51 crc kubenswrapper[4907]: I0226 16:08:51.450047 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/d7461168-b46b-48da-ace3-8f02feb49468-ovsdbserver-sb\") pod \"dnsmasq-dns-d558885bc-zxcrp\" (UID: \"d7461168-b46b-48da-ace3-8f02feb49468\") " pod="openstack/dnsmasq-dns-d558885bc-zxcrp"
Feb 26 16:08:51 crc kubenswrapper[4907]: I0226 16:08:51.553006 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-w9zpq\" (UniqueName: \"kubernetes.io/projected/d7461168-b46b-48da-ace3-8f02feb49468-kube-api-access-w9zpq\") pod \"dnsmasq-dns-d558885bc-zxcrp\" (UID: \"d7461168-b46b-48da-ace3-8f02feb49468\") " pod="openstack/dnsmasq-dns-d558885bc-zxcrp"
Feb 26 16:08:51 crc kubenswrapper[4907]: I0226 16:08:51.553367 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/d7461168-b46b-48da-ace3-8f02feb49468-ovsdbserver-nb\") pod \"dnsmasq-dns-d558885bc-zxcrp\" (UID: \"d7461168-b46b-48da-ace3-8f02feb49468\") " pod="openstack/dnsmasq-dns-d558885bc-zxcrp"
Feb 26 16:08:51 crc kubenswrapper[4907]: I0226 16:08:51.553394 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/d7461168-b46b-48da-ace3-8f02feb49468-dns-swift-storage-0\") pod \"dnsmasq-dns-d558885bc-zxcrp\" (UID: \"d7461168-b46b-48da-ace3-8f02feb49468\") " pod="openstack/dnsmasq-dns-d558885bc-zxcrp"
Feb 26 16:08:51 crc kubenswrapper[4907]: I0226 16:08:51.553442 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/configmap/d7461168-b46b-48da-ace3-8f02feb49468-openstack-edpm-ipam\") pod \"dnsmasq-dns-d558885bc-zxcrp\" (UID: \"d7461168-b46b-48da-ace3-8f02feb49468\") " pod="openstack/dnsmasq-dns-d558885bc-zxcrp"
Feb 26 16:08:51 crc kubenswrapper[4907]: I0226 16:08:51.553467 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/d7461168-b46b-48da-ace3-8f02feb49468-ovsdbserver-sb\") pod \"dnsmasq-dns-d558885bc-zxcrp\" (UID: \"d7461168-b46b-48da-ace3-8f02feb49468\") " pod="openstack/dnsmasq-dns-d558885bc-zxcrp"
Feb 26 16:08:51 crc kubenswrapper[4907]: I0226 16:08:51.553551 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d7461168-b46b-48da-ace3-8f02feb49468-config\") pod \"dnsmasq-dns-d558885bc-zxcrp\" (UID: \"d7461168-b46b-48da-ace3-8f02feb49468\") " pod="openstack/dnsmasq-dns-d558885bc-zxcrp"
Feb 26 16:08:51 crc kubenswrapper[4907]: I0226 16:08:51.553582 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/d7461168-b46b-48da-ace3-8f02feb49468-dns-svc\") pod \"dnsmasq-dns-d558885bc-zxcrp\" (UID: \"d7461168-b46b-48da-ace3-8f02feb49468\") " pod="openstack/dnsmasq-dns-d558885bc-zxcrp"
Feb 26 16:08:51 crc kubenswrapper[4907]: I0226 16:08:51.554472 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/configmap/d7461168-b46b-48da-ace3-8f02feb49468-openstack-edpm-ipam\") pod \"dnsmasq-dns-d558885bc-zxcrp\" (UID: \"d7461168-b46b-48da-ace3-8f02feb49468\") " pod="openstack/dnsmasq-dns-d558885bc-zxcrp"
Feb 26 16:08:51 crc kubenswrapper[4907]: I0226 16:08:51.554651 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/d7461168-b46b-48da-ace3-8f02feb49468-dns-svc\") pod \"dnsmasq-dns-d558885bc-zxcrp\" (UID: \"d7461168-b46b-48da-ace3-8f02feb49468\") " pod="openstack/dnsmasq-dns-d558885bc-zxcrp"
Feb 26 16:08:51 crc kubenswrapper[4907]: I0226 16:08:51.555088 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d7461168-b46b-48da-ace3-8f02feb49468-config\") pod \"dnsmasq-dns-d558885bc-zxcrp\" (UID: \"d7461168-b46b-48da-ace3-8f02feb49468\") " pod="openstack/dnsmasq-dns-d558885bc-zxcrp"
Feb 26 16:08:51 crc kubenswrapper[4907]: I0226 16:08:51.555496 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/d7461168-b46b-48da-ace3-8f02feb49468-ovsdbserver-nb\") pod \"dnsmasq-dns-d558885bc-zxcrp\" (UID: \"d7461168-b46b-48da-ace3-8f02feb49468\") " pod="openstack/dnsmasq-dns-d558885bc-zxcrp"
Feb 26 16:08:51 crc kubenswrapper[4907]: I0226 16:08:51.555887 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/d7461168-b46b-48da-ace3-8f02feb49468-ovsdbserver-sb\") pod \"dnsmasq-dns-d558885bc-zxcrp\" (UID: \"d7461168-b46b-48da-ace3-8f02feb49468\") " pod="openstack/dnsmasq-dns-d558885bc-zxcrp"
Feb 26 16:08:51 crc kubenswrapper[4907]: I0226 16:08:51.556392 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/d7461168-b46b-48da-ace3-8f02feb49468-dns-swift-storage-0\") pod \"dnsmasq-dns-d558885bc-zxcrp\" (UID: \"d7461168-b46b-48da-ace3-8f02feb49468\") " pod="openstack/dnsmasq-dns-d558885bc-zxcrp"
Feb 26 16:08:51 crc kubenswrapper[4907]: I0226 16:08:51.588522 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-w9zpq\" (UniqueName: \"kubernetes.io/projected/d7461168-b46b-48da-ace3-8f02feb49468-kube-api-access-w9zpq\") pod \"dnsmasq-dns-d558885bc-zxcrp\" (UID: \"d7461168-b46b-48da-ace3-8f02feb49468\") " pod="openstack/dnsmasq-dns-d558885bc-zxcrp"
Feb 26 16:08:51 crc kubenswrapper[4907]: I0226 16:08:51.705820 4907 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-d558885bc-zxcrp"
Feb 26 16:08:51 crc kubenswrapper[4907]: I0226 16:08:51.939430 4907 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-cell1-server-0"]
Feb 26 16:08:52 crc kubenswrapper[4907]: I0226 16:08:52.009840 4907 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-server-0"]
Feb 26 16:08:52 crc kubenswrapper[4907]: I0226 16:08:52.188020 4907 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="96ba881c-449c-4300-b67f-8a1e952af508" path="/var/lib/kubelet/pods/96ba881c-449c-4300-b67f-8a1e952af508/volumes"
Feb 26 16:08:52 crc kubenswrapper[4907]: I0226 16:08:52.190899 4907 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="cca4ff23-cabb-466c-80a0-dbcc1f005123" path="/var/lib/kubelet/pods/cca4ff23-cabb-466c-80a0-dbcc1f005123/volumes"
Feb 26 16:08:52 crc kubenswrapper[4907]: I0226 16:08:52.253190 4907 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-d558885bc-zxcrp"]
Feb 26 16:08:52 crc kubenswrapper[4907]: W0226 16:08:52.269833 4907 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podd7461168_b46b_48da_ace3_8f02feb49468.slice/crio-a7e840b2b4f85fa4123e00a0e223ba92219273353e329177960b7060899c0f8f WatchSource:0}: Error finding container a7e840b2b4f85fa4123e00a0e223ba92219273353e329177960b7060899c0f8f: Status 404 returned error can't find the container with id a7e840b2b4f85fa4123e00a0e223ba92219273353e329177960b7060899c0f8f
Feb 26 16:08:52 crc kubenswrapper[4907]: I0226 16:08:52.606422 4907 generic.go:334] "Generic (PLEG): container finished" podID="d7461168-b46b-48da-ace3-8f02feb49468" containerID="0004545745de3dce89970c2b01267a014286255432ec4d67bba839a93cd33713" exitCode=0
Feb 26 16:08:52 crc kubenswrapper[4907]: I0226 16:08:52.606517 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-d558885bc-zxcrp" event={"ID":"d7461168-b46b-48da-ace3-8f02feb49468","Type":"ContainerDied","Data":"0004545745de3dce89970c2b01267a014286255432ec4d67bba839a93cd33713"}
Feb 26 16:08:52 crc kubenswrapper[4907]: I0226 16:08:52.606548 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-d558885bc-zxcrp" event={"ID":"d7461168-b46b-48da-ace3-8f02feb49468","Type":"ContainerStarted","Data":"a7e840b2b4f85fa4123e00a0e223ba92219273353e329177960b7060899c0f8f"}
Feb 26 16:08:52 crc kubenswrapper[4907]: I0226 16:08:52.607565 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"20078d55-ee5c-4818-9ff9-4089683c9729","Type":"ContainerStarted","Data":"f5fa97bee12727f0b2ceb44759d2070cdc8e1c34bb1a6beb6b96c6d00107f2d5"}
Feb 26 16:08:52 crc kubenswrapper[4907]: I0226 16:08:52.613289 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"cbc69627-1691-43df-a77a-ca3e26e67aaa","Type":"ContainerStarted","Data":"f2bd4947c532186ffe3eb0ba582165784f6307d22ae63886d56d548ddf75c5b6"}
Feb 26 16:08:53 crc kubenswrapper[4907]: I0226 16:08:53.622659 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"20078d55-ee5c-4818-9ff9-4089683c9729","Type":"ContainerStarted","Data":"9db1180a5b788d55ed4b83bc22fc255fd7ef0c4fe20b53a1c4cddd51d73850d5"}
Feb 26 16:08:53 crc kubenswrapper[4907]: I0226 16:08:53.625477 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"cbc69627-1691-43df-a77a-ca3e26e67aaa","Type":"ContainerStarted","Data":"cfa426cc7a88874ef8ecfe8ba9e28030e4c01f362780915f57f7773545098985"}
Feb 26 16:08:53 crc kubenswrapper[4907]: I0226 16:08:53.627561 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-d558885bc-zxcrp" event={"ID":"d7461168-b46b-48da-ace3-8f02feb49468","Type":"ContainerStarted","Data":"2f7bc85953cd691616e1b468caa7e0aed58aae1cd556230ba4f51f62939c364c"}
Feb 26 16:08:53 crc kubenswrapper[4907]: I0226 16:08:53.628063 4907 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-d558885bc-zxcrp"
Feb 26 16:08:53 crc kubenswrapper[4907]: I0226 16:08:53.745346 4907 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-d558885bc-zxcrp" podStartSLOduration=2.745329273 podStartE2EDuration="2.745329273s" podCreationTimestamp="2026-02-26 16:08:51 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-26 16:08:53.741476368 +0000 UTC m=+1596.260038217" watchObservedRunningTime="2026-02-26 16:08:53.745329273 +0000 UTC m=+1596.263891122"
Feb 26 16:08:56 crc kubenswrapper[4907]: I0226 16:08:56.517569 4907 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-sqrbr" podUID="6b6d9f98-d446-4b48-bd17-1c6c3ab80460" containerName="registry-server" probeResult="failure" output=<
Feb 26 16:08:56 crc kubenswrapper[4907]: timeout: failed to connect service ":50051" within 1s
Feb 26 16:08:56 crc kubenswrapper[4907]: >
Feb 26 16:09:00 crc kubenswrapper[4907]: I0226 16:09:00.126470 4907 scope.go:117] "RemoveContainer" containerID="b46bef3acd92cfa3cb8f5894a729a1bb1795fbc69b7b7c5835186a0b609a6e46"
Feb 26 16:09:00 crc kubenswrapper[4907]: E0226 16:09:00.127059 4907 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-v5ng6_openshift-machine-config-operator(917eebf3-db36-47b8-af0a-b80d042fddab)\"" pod="openshift-machine-config-operator/machine-config-daemon-v5ng6" podUID="917eebf3-db36-47b8-af0a-b80d042fddab"
Feb 26 16:09:01 crc kubenswrapper[4907]: I0226 16:09:01.708752 4907 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-d558885bc-zxcrp"
Feb 26 16:09:01 crc kubenswrapper[4907]: I0226 16:09:01.789341 4907 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-cd5cbd7b9-chsg6"]
Feb 26 16:09:01 crc kubenswrapper[4907]: I0226 16:09:01.789575 4907 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-cd5cbd7b9-chsg6" podUID="0ac59eb1-73ee-4a73-90bd-2273f03c9498" containerName="dnsmasq-dns" containerID="cri-o://ced4fc2d497ca439acbf014da0f32226afff2cd2aedba7c0f85f2022046ae4ce" gracePeriod=10
Feb 26 16:09:01 crc kubenswrapper[4907]: I0226 16:09:01.995021 4907 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-67cb876dc9-thjms"]
Feb 26 16:09:01 crc kubenswrapper[4907]: I0226 16:09:01.996759 4907 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-67cb876dc9-thjms"
Feb 26 16:09:02 crc kubenswrapper[4907]: I0226 16:09:02.004538 4907 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-67cb876dc9-thjms"]
Feb 26 16:09:02 crc kubenswrapper[4907]: I0226 16:09:02.068106 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/c0ee4ec2-b0e1-4927-9258-df237432c628-dns-swift-storage-0\") pod \"dnsmasq-dns-67cb876dc9-thjms\" (UID: \"c0ee4ec2-b0e1-4927-9258-df237432c628\") " pod="openstack/dnsmasq-dns-67cb876dc9-thjms"
Feb 26 16:09:02 crc kubenswrapper[4907]: I0226 16:09:02.068462 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/configmap/c0ee4ec2-b0e1-4927-9258-df237432c628-openstack-edpm-ipam\") pod \"dnsmasq-dns-67cb876dc9-thjms\" (UID: \"c0ee4ec2-b0e1-4927-9258-df237432c628\") " pod="openstack/dnsmasq-dns-67cb876dc9-thjms"
Feb 26 16:09:02 crc kubenswrapper[4907]: I0226 16:09:02.069168 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mlb8n\" (UniqueName: \"kubernetes.io/projected/c0ee4ec2-b0e1-4927-9258-df237432c628-kube-api-access-mlb8n\") pod \"dnsmasq-dns-67cb876dc9-thjms\" (UID: \"c0ee4ec2-b0e1-4927-9258-df237432c628\") " pod="openstack/dnsmasq-dns-67cb876dc9-thjms"
Feb 26 16:09:02 crc kubenswrapper[4907]: I0226 16:09:02.069233 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/c0ee4ec2-b0e1-4927-9258-df237432c628-ovsdbserver-sb\") pod \"dnsmasq-dns-67cb876dc9-thjms\" (UID: \"c0ee4ec2-b0e1-4927-9258-df237432c628\") " pod="openstack/dnsmasq-dns-67cb876dc9-thjms"
Feb 26 16:09:02 crc kubenswrapper[4907]: I0226 16:09:02.069325 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c0ee4ec2-b0e1-4927-9258-df237432c628-config\") pod \"dnsmasq-dns-67cb876dc9-thjms\" (UID: \"c0ee4ec2-b0e1-4927-9258-df237432c628\") " pod="openstack/dnsmasq-dns-67cb876dc9-thjms"
Feb 26 16:09:02 crc kubenswrapper[4907]: I0226 16:09:02.069387 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/c0ee4ec2-b0e1-4927-9258-df237432c628-dns-svc\") pod \"dnsmasq-dns-67cb876dc9-thjms\" (UID: \"c0ee4ec2-b0e1-4927-9258-df237432c628\") " pod="openstack/dnsmasq-dns-67cb876dc9-thjms"
Feb 26 16:09:02 crc kubenswrapper[4907]: I0226 16:09:02.069565 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/c0ee4ec2-b0e1-4927-9258-df237432c628-ovsdbserver-nb\") pod \"dnsmasq-dns-67cb876dc9-thjms\" (UID: \"c0ee4ec2-b0e1-4927-9258-df237432c628\") " pod="openstack/dnsmasq-dns-67cb876dc9-thjms"
Feb 26 16:09:02 crc kubenswrapper[4907]: I0226 16:09:02.172639 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/c0ee4ec2-b0e1-4927-9258-df237432c628-ovsdbserver-sb\") pod \"dnsmasq-dns-67cb876dc9-thjms\" (UID: \"c0ee4ec2-b0e1-4927-9258-df237432c628\") " pod="openstack/dnsmasq-dns-67cb876dc9-thjms"
Feb 26 16:09:02 crc kubenswrapper[4907]: I0226 16:09:02.175171 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c0ee4ec2-b0e1-4927-9258-df237432c628-config\") pod \"dnsmasq-dns-67cb876dc9-thjms\" (UID: \"c0ee4ec2-b0e1-4927-9258-df237432c628\") " pod="openstack/dnsmasq-dns-67cb876dc9-thjms"
Feb 26 16:09:02 crc kubenswrapper[4907]: I0226 16:09:02.175062 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/c0ee4ec2-b0e1-4927-9258-df237432c628-ovsdbserver-sb\") pod \"dnsmasq-dns-67cb876dc9-thjms\" (UID: \"c0ee4ec2-b0e1-4927-9258-df237432c628\") " pod="openstack/dnsmasq-dns-67cb876dc9-thjms"
Feb 26 16:09:02 crc kubenswrapper[4907]: I0226 16:09:02.176232 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/c0ee4ec2-b0e1-4927-9258-df237432c628-dns-svc\") pod \"dnsmasq-dns-67cb876dc9-thjms\" (UID: \"c0ee4ec2-b0e1-4927-9258-df237432c628\") " pod="openstack/dnsmasq-dns-67cb876dc9-thjms"
Feb 26 16:09:02 crc kubenswrapper[4907]: I0226 16:09:02.176494 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c0ee4ec2-b0e1-4927-9258-df237432c628-config\") pod \"dnsmasq-dns-67cb876dc9-thjms\" (UID: \"c0ee4ec2-b0e1-4927-9258-df237432c628\") " pod="openstack/dnsmasq-dns-67cb876dc9-thjms"
Feb 26 16:09:02 crc kubenswrapper[4907]: I0226 16:09:02.177056 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/c0ee4ec2-b0e1-4927-9258-df237432c628-dns-svc\") pod \"dnsmasq-dns-67cb876dc9-thjms\" (UID: \"c0ee4ec2-b0e1-4927-9258-df237432c628\") " pod="openstack/dnsmasq-dns-67cb876dc9-thjms"
Feb 26 16:09:02 crc kubenswrapper[4907]: I0226 16:09:02.178278 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/c0ee4ec2-b0e1-4927-9258-df237432c628-ovsdbserver-nb\") pod \"dnsmasq-dns-67cb876dc9-thjms\" (UID: \"c0ee4ec2-b0e1-4927-9258-df237432c628\") " pod="openstack/dnsmasq-dns-67cb876dc9-thjms"
Feb 26 16:09:02 crc kubenswrapper[4907]: I0226 16:09:02.178381 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/c0ee4ec2-b0e1-4927-9258-df237432c628-dns-swift-storage-0\") pod \"dnsmasq-dns-67cb876dc9-thjms\" (UID: \"c0ee4ec2-b0e1-4927-9258-df237432c628\") " pod="openstack/dnsmasq-dns-67cb876dc9-thjms"
Feb 26 16:09:02 crc kubenswrapper[4907]: I0226 16:09:02.178445 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/configmap/c0ee4ec2-b0e1-4927-9258-df237432c628-openstack-edpm-ipam\") pod \"dnsmasq-dns-67cb876dc9-thjms\" (UID: \"c0ee4ec2-b0e1-4927-9258-df237432c628\") " pod="openstack/dnsmasq-dns-67cb876dc9-thjms"
Feb 26 16:09:02 crc kubenswrapper[4907]: I0226 16:09:02.178560 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mlb8n\" (UniqueName: \"kubernetes.io/projected/c0ee4ec2-b0e1-4927-9258-df237432c628-kube-api-access-mlb8n\") pod \"dnsmasq-dns-67cb876dc9-thjms\" (UID: \"c0ee4ec2-b0e1-4927-9258-df237432c628\") " pod="openstack/dnsmasq-dns-67cb876dc9-thjms"
Feb 26 16:09:02 crc kubenswrapper[4907]: I0226 16:09:02.179738 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/configmap/c0ee4ec2-b0e1-4927-9258-df237432c628-openstack-edpm-ipam\") pod \"dnsmasq-dns-67cb876dc9-thjms\" (UID: \"c0ee4ec2-b0e1-4927-9258-df237432c628\") " pod="openstack/dnsmasq-dns-67cb876dc9-thjms"
Feb 26 16:09:02 crc kubenswrapper[4907]: I0226 16:09:02.181213 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/c0ee4ec2-b0e1-4927-9258-df237432c628-dns-swift-storage-0\") pod \"dnsmasq-dns-67cb876dc9-thjms\" (UID: \"c0ee4ec2-b0e1-4927-9258-df237432c628\") " pod="openstack/dnsmasq-dns-67cb876dc9-thjms"
Feb 26 16:09:02 crc kubenswrapper[4907]: I0226 16:09:02.181881 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/c0ee4ec2-b0e1-4927-9258-df237432c628-ovsdbserver-nb\") pod \"dnsmasq-dns-67cb876dc9-thjms\" (UID: \"c0ee4ec2-b0e1-4927-9258-df237432c628\") " pod="openstack/dnsmasq-dns-67cb876dc9-thjms"
Feb 26 16:09:02 crc kubenswrapper[4907]: I0226 16:09:02.654639 4907 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/dnsmasq-dns-cd5cbd7b9-chsg6" podUID="0ac59eb1-73ee-4a73-90bd-2273f03c9498" containerName="dnsmasq-dns" probeResult="failure" output="dial tcp 10.217.0.207:5353: connect: connection refused"
Feb 26 16:09:02 crc kubenswrapper[4907]: I0226 16:09:02.659628 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mlb8n\" (UniqueName: \"kubernetes.io/projected/c0ee4ec2-b0e1-4927-9258-df237432c628-kube-api-access-mlb8n\") pod \"dnsmasq-dns-67cb876dc9-thjms\" (UID: \"c0ee4ec2-b0e1-4927-9258-df237432c628\") " pod="openstack/dnsmasq-dns-67cb876dc9-thjms"
Feb 26 16:09:02 crc kubenswrapper[4907]: I0226 16:09:02.709239 4907 generic.go:334] "Generic (PLEG): container finished" podID="0ac59eb1-73ee-4a73-90bd-2273f03c9498" containerID="ced4fc2d497ca439acbf014da0f32226afff2cd2aedba7c0f85f2022046ae4ce" exitCode=0
Feb 26 16:09:02 crc kubenswrapper[4907]: I0226 16:09:02.709280 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-cd5cbd7b9-chsg6" event={"ID":"0ac59eb1-73ee-4a73-90bd-2273f03c9498","Type":"ContainerDied","Data":"ced4fc2d497ca439acbf014da0f32226afff2cd2aedba7c0f85f2022046ae4ce"}
Feb 26 16:09:02 crc kubenswrapper[4907]: I0226 16:09:02.917764 4907 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-67cb876dc9-thjms"
Feb 26 16:09:03 crc kubenswrapper[4907]: I0226 16:09:03.519521 4907 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-67cb876dc9-thjms"]
Feb 26 16:09:03 crc kubenswrapper[4907]: I0226 16:09:03.717474 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-67cb876dc9-thjms" event={"ID":"c0ee4ec2-b0e1-4927-9258-df237432c628","Type":"ContainerStarted","Data":"90687f9628ab058c3c989ab599e2760eac17a12f3899358596bd72543e2c7030"}
Feb 26 16:09:03 crc kubenswrapper[4907]: I0226 16:09:03.719710 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-cd5cbd7b9-chsg6" event={"ID":"0ac59eb1-73ee-4a73-90bd-2273f03c9498","Type":"ContainerDied","Data":"2db045074b31745129cb37d5ea0feebdf082577d83dccdfc2f1d94015c18aa37"}
Feb 26 16:09:03 crc kubenswrapper[4907]: I0226 16:09:03.719739 4907 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="2db045074b31745129cb37d5ea0feebdf082577d83dccdfc2f1d94015c18aa37"
Feb 26 16:09:03 crc kubenswrapper[4907]: I0226 16:09:03.727430 4907 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-cd5cbd7b9-chsg6"
Feb 26 16:09:03 crc kubenswrapper[4907]: I0226 16:09:03.811704 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-hpqx5\" (UniqueName: \"kubernetes.io/projected/0ac59eb1-73ee-4a73-90bd-2273f03c9498-kube-api-access-hpqx5\") pod \"0ac59eb1-73ee-4a73-90bd-2273f03c9498\" (UID: \"0ac59eb1-73ee-4a73-90bd-2273f03c9498\") "
Feb 26 16:09:03 crc kubenswrapper[4907]: I0226 16:09:03.812053 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/0ac59eb1-73ee-4a73-90bd-2273f03c9498-ovsdbserver-nb\") pod \"0ac59eb1-73ee-4a73-90bd-2273f03c9498\" (UID: \"0ac59eb1-73ee-4a73-90bd-2273f03c9498\") "
Feb 26 16:09:03 crc kubenswrapper[4907]: I0226 16:09:03.812211 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/0ac59eb1-73ee-4a73-90bd-2273f03c9498-config\") pod \"0ac59eb1-73ee-4a73-90bd-2273f03c9498\" (UID: \"0ac59eb1-73ee-4a73-90bd-2273f03c9498\") "
Feb 26 16:09:03 crc kubenswrapper[4907]: I0226 16:09:03.812236 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/0ac59eb1-73ee-4a73-90bd-2273f03c9498-dns-swift-storage-0\") pod \"0ac59eb1-73ee-4a73-90bd-2273f03c9498\" (UID: \"0ac59eb1-73ee-4a73-90bd-2273f03c9498\") "
Feb 26 16:09:03 crc kubenswrapper[4907]: I0226 16:09:03.812560 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/0ac59eb1-73ee-4a73-90bd-2273f03c9498-dns-svc\") pod \"0ac59eb1-73ee-4a73-90bd-2273f03c9498\" (UID: \"0ac59eb1-73ee-4a73-90bd-2273f03c9498\") "
Feb 26 16:09:03 crc kubenswrapper[4907]: I0226 16:09:03.812716 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/0ac59eb1-73ee-4a73-90bd-2273f03c9498-ovsdbserver-sb\") pod \"0ac59eb1-73ee-4a73-90bd-2273f03c9498\" (UID: \"0ac59eb1-73ee-4a73-90bd-2273f03c9498\") "
Feb 26 16:09:03 crc kubenswrapper[4907]: I0226 16:09:03.823521 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0ac59eb1-73ee-4a73-90bd-2273f03c9498-kube-api-access-hpqx5" (OuterVolumeSpecName: "kube-api-access-hpqx5") pod "0ac59eb1-73ee-4a73-90bd-2273f03c9498" (UID: "0ac59eb1-73ee-4a73-90bd-2273f03c9498"). InnerVolumeSpecName "kube-api-access-hpqx5". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 26 16:09:03 crc kubenswrapper[4907]: I0226 16:09:03.878492 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0ac59eb1-73ee-4a73-90bd-2273f03c9498-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "0ac59eb1-73ee-4a73-90bd-2273f03c9498" (UID: "0ac59eb1-73ee-4a73-90bd-2273f03c9498"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Feb 26 16:09:03 crc kubenswrapper[4907]: I0226 16:09:03.879699 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0ac59eb1-73ee-4a73-90bd-2273f03c9498-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "0ac59eb1-73ee-4a73-90bd-2273f03c9498" (UID: "0ac59eb1-73ee-4a73-90bd-2273f03c9498"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Feb 26 16:09:03 crc kubenswrapper[4907]: I0226 16:09:03.882334 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0ac59eb1-73ee-4a73-90bd-2273f03c9498-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "0ac59eb1-73ee-4a73-90bd-2273f03c9498" (UID: "0ac59eb1-73ee-4a73-90bd-2273f03c9498"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Feb 26 16:09:03 crc kubenswrapper[4907]: I0226 16:09:03.885141 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0ac59eb1-73ee-4a73-90bd-2273f03c9498-config" (OuterVolumeSpecName: "config") pod "0ac59eb1-73ee-4a73-90bd-2273f03c9498" (UID: "0ac59eb1-73ee-4a73-90bd-2273f03c9498"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Feb 26 16:09:03 crc kubenswrapper[4907]: I0226 16:09:03.890174 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0ac59eb1-73ee-4a73-90bd-2273f03c9498-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "0ac59eb1-73ee-4a73-90bd-2273f03c9498" (UID: "0ac59eb1-73ee-4a73-90bd-2273f03c9498"). InnerVolumeSpecName "dns-swift-storage-0". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Feb 26 16:09:03 crc kubenswrapper[4907]: I0226 16:09:03.915660 4907 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-hpqx5\" (UniqueName: \"kubernetes.io/projected/0ac59eb1-73ee-4a73-90bd-2273f03c9498-kube-api-access-hpqx5\") on node \"crc\" DevicePath \"\""
Feb 26 16:09:03 crc kubenswrapper[4907]: I0226 16:09:03.915694 4907 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/0ac59eb1-73ee-4a73-90bd-2273f03c9498-ovsdbserver-nb\") on node \"crc\" DevicePath \"\""
Feb 26 16:09:03 crc kubenswrapper[4907]: I0226 16:09:03.915705 4907 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/0ac59eb1-73ee-4a73-90bd-2273f03c9498-config\") on node \"crc\" DevicePath \"\""
Feb 26 16:09:03 crc kubenswrapper[4907]: I0226 16:09:03.915714 4907 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/0ac59eb1-73ee-4a73-90bd-2273f03c9498-dns-swift-storage-0\") on node \"crc\" DevicePath \"\""
Feb 26 16:09:03 crc kubenswrapper[4907]: I0226 16:09:03.915722 4907 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/0ac59eb1-73ee-4a73-90bd-2273f03c9498-dns-svc\") on node \"crc\" DevicePath \"\""
Feb 26 16:09:03 crc kubenswrapper[4907]: I0226 16:09:03.915729 4907 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/0ac59eb1-73ee-4a73-90bd-2273f03c9498-ovsdbserver-sb\") on node \"crc\" DevicePath \"\""
Feb 26 16:09:04 crc kubenswrapper[4907]: I0226 16:09:04.729709 4907 generic.go:334] "Generic (PLEG): container finished" podID="c0ee4ec2-b0e1-4927-9258-df237432c628" containerID="de42b5b9d11cebb78c3c19e97e042fb751a968a036244a1593f886e7b539834f" exitCode=0
Feb 26 16:09:04 crc kubenswrapper[4907]: I0226 16:09:04.729755 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-67cb876dc9-thjms" event={"ID":"c0ee4ec2-b0e1-4927-9258-df237432c628","Type":"ContainerDied","Data":"de42b5b9d11cebb78c3c19e97e042fb751a968a036244a1593f886e7b539834f"}
Feb 26 16:09:04 crc kubenswrapper[4907]: I0226 16:09:04.729865 4907 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-cd5cbd7b9-chsg6"
Feb 26 16:09:04 crc kubenswrapper[4907]: I0226 16:09:04.828146 4907 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-cd5cbd7b9-chsg6"]
Feb 26 16:09:04 crc kubenswrapper[4907]: I0226 16:09:04.839721 4907 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-cd5cbd7b9-chsg6"]
Feb 26 16:09:05 crc kubenswrapper[4907]: I0226 16:09:05.741881 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-67cb876dc9-thjms" event={"ID":"c0ee4ec2-b0e1-4927-9258-df237432c628","Type":"ContainerStarted","Data":"a9cff4d4541096ccb8e586cf27141b02e3d77dfc3d0c1394c103e17f17515288"}
Feb 26 16:09:05 crc kubenswrapper[4907]: I0226 16:09:05.742232 4907 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-67cb876dc9-thjms"
Feb 26 16:09:05 crc kubenswrapper[4907]: I0226 16:09:05.774516 4907 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-67cb876dc9-thjms" podStartSLOduration=4.774491003 podStartE2EDuration="4.774491003s" podCreationTimestamp="2026-02-26 16:09:01 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-26 16:09:05.767184394 +0000 UTC m=+1608.285746253" watchObservedRunningTime="2026-02-26 16:09:05.774491003 +0000 UTC m=+1608.293052872"
Feb 26 16:09:06 crc kubenswrapper[4907]: I0226 16:09:06.136125 4907 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0ac59eb1-73ee-4a73-90bd-2273f03c9498" path="/var/lib/kubelet/pods/0ac59eb1-73ee-4a73-90bd-2273f03c9498/volumes"
Feb 26 16:09:06 crc kubenswrapper[4907]: I0226 16:09:06.518837 4907 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-sqrbr" podUID="6b6d9f98-d446-4b48-bd17-1c6c3ab80460" containerName="registry-server" probeResult="failure" output=<
Feb 26 16:09:06 crc kubenswrapper[4907]: timeout: failed to connect service ":50051" within 1s
Feb 26 16:09:06 crc kubenswrapper[4907]: >
Feb 26 16:09:12 crc kubenswrapper[4907]: I0226 16:09:12.919882 4907 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-67cb876dc9-thjms"
Feb 26 16:09:12 crc kubenswrapper[4907]: I0226 16:09:12.990994 4907 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-d558885bc-zxcrp"]
Feb 26 16:09:12 crc kubenswrapper[4907]: I0226 16:09:12.991502 4907 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-d558885bc-zxcrp" podUID="d7461168-b46b-48da-ace3-8f02feb49468" containerName="dnsmasq-dns" containerID="cri-o://2f7bc85953cd691616e1b468caa7e0aed58aae1cd556230ba4f51f62939c364c" gracePeriod=10
Feb 26 16:09:13 crc kubenswrapper[4907]: I0226 16:09:13.509431 4907 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-d558885bc-zxcrp"
Feb 26 16:09:13 crc kubenswrapper[4907]: I0226 16:09:13.607965 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/d7461168-b46b-48da-ace3-8f02feb49468-dns-svc\") pod \"d7461168-b46b-48da-ace3-8f02feb49468\" (UID: \"d7461168-b46b-48da-ace3-8f02feb49468\") "
Feb 26 16:09:13 crc kubenswrapper[4907]: I0226 16:09:13.608038 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/d7461168-b46b-48da-ace3-8f02feb49468-ovsdbserver-nb\") pod \"d7461168-b46b-48da-ace3-8f02feb49468\" (UID: \"d7461168-b46b-48da-ace3-8f02feb49468\") "
Feb 26 16:09:13 crc kubenswrapper[4907]: I0226 16:09:13.608066 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/configmap/d7461168-b46b-48da-ace3-8f02feb49468-openstack-edpm-ipam\") pod \"d7461168-b46b-48da-ace3-8f02feb49468\" (UID: \"d7461168-b46b-48da-ace3-8f02feb49468\") "
Feb 26 16:09:13 crc kubenswrapper[4907]: I0226 16:09:13.608133 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/d7461168-b46b-48da-ace3-8f02feb49468-dns-swift-storage-0\") pod \"d7461168-b46b-48da-ace3-8f02feb49468\" (UID: \"d7461168-b46b-48da-ace3-8f02feb49468\") "
Feb 26 16:09:13 crc kubenswrapper[4907]: I0226 16:09:13.608185 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-w9zpq\" (UniqueName: \"kubernetes.io/projected/d7461168-b46b-48da-ace3-8f02feb49468-kube-api-access-w9zpq\") pod \"d7461168-b46b-48da-ace3-8f02feb49468\" (UID: \"d7461168-b46b-48da-ace3-8f02feb49468\") "
Feb 26 16:09:13 crc kubenswrapper[4907]: I0226 16:09:13.608238 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d7461168-b46b-48da-ace3-8f02feb49468-config\") pod \"d7461168-b46b-48da-ace3-8f02feb49468\" (UID: \"d7461168-b46b-48da-ace3-8f02feb49468\") "
Feb 26 16:09:13 crc kubenswrapper[4907]: I0226 16:09:13.608281 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/d7461168-b46b-48da-ace3-8f02feb49468-ovsdbserver-sb\") pod \"d7461168-b46b-48da-ace3-8f02feb49468\" (UID: \"d7461168-b46b-48da-ace3-8f02feb49468\") "
Feb 26 16:09:13 crc kubenswrapper[4907]: I0226 16:09:13.621862 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d7461168-b46b-48da-ace3-8f02feb49468-kube-api-access-w9zpq" (OuterVolumeSpecName: "kube-api-access-w9zpq") pod "d7461168-b46b-48da-ace3-8f02feb49468" (UID: "d7461168-b46b-48da-ace3-8f02feb49468"). InnerVolumeSpecName "kube-api-access-w9zpq". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 26 16:09:13 crc kubenswrapper[4907]: I0226 16:09:13.715925 4907 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-w9zpq\" (UniqueName: \"kubernetes.io/projected/d7461168-b46b-48da-ace3-8f02feb49468-kube-api-access-w9zpq\") on node \"crc\" DevicePath \"\""
Feb 26 16:09:13 crc kubenswrapper[4907]: I0226 16:09:13.753724 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d7461168-b46b-48da-ace3-8f02feb49468-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "d7461168-b46b-48da-ace3-8f02feb49468" (UID: "d7461168-b46b-48da-ace3-8f02feb49468"). InnerVolumeSpecName "dns-swift-storage-0". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Feb 26 16:09:13 crc kubenswrapper[4907]: I0226 16:09:13.758326 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d7461168-b46b-48da-ace3-8f02feb49468-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "d7461168-b46b-48da-ace3-8f02feb49468" (UID: "d7461168-b46b-48da-ace3-8f02feb49468"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Feb 26 16:09:13 crc kubenswrapper[4907]: I0226 16:09:13.758730 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d7461168-b46b-48da-ace3-8f02feb49468-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "d7461168-b46b-48da-ace3-8f02feb49468" (UID: "d7461168-b46b-48da-ace3-8f02feb49468"). InnerVolumeSpecName "ovsdbserver-sb".
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 26 16:09:13 crc kubenswrapper[4907]: I0226 16:09:13.764815 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d7461168-b46b-48da-ace3-8f02feb49468-openstack-edpm-ipam" (OuterVolumeSpecName: "openstack-edpm-ipam") pod "d7461168-b46b-48da-ace3-8f02feb49468" (UID: "d7461168-b46b-48da-ace3-8f02feb49468"). InnerVolumeSpecName "openstack-edpm-ipam". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 26 16:09:13 crc kubenswrapper[4907]: I0226 16:09:13.781211 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d7461168-b46b-48da-ace3-8f02feb49468-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "d7461168-b46b-48da-ace3-8f02feb49468" (UID: "d7461168-b46b-48da-ace3-8f02feb49468"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 26 16:09:13 crc kubenswrapper[4907]: I0226 16:09:13.794439 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d7461168-b46b-48da-ace3-8f02feb49468-config" (OuterVolumeSpecName: "config") pod "d7461168-b46b-48da-ace3-8f02feb49468" (UID: "d7461168-b46b-48da-ace3-8f02feb49468"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 26 16:09:13 crc kubenswrapper[4907]: I0226 16:09:13.814826 4907 generic.go:334] "Generic (PLEG): container finished" podID="d7461168-b46b-48da-ace3-8f02feb49468" containerID="2f7bc85953cd691616e1b468caa7e0aed58aae1cd556230ba4f51f62939c364c" exitCode=0 Feb 26 16:09:13 crc kubenswrapper[4907]: I0226 16:09:13.814863 4907 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-d558885bc-zxcrp" Feb 26 16:09:13 crc kubenswrapper[4907]: I0226 16:09:13.814873 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-d558885bc-zxcrp" event={"ID":"d7461168-b46b-48da-ace3-8f02feb49468","Type":"ContainerDied","Data":"2f7bc85953cd691616e1b468caa7e0aed58aae1cd556230ba4f51f62939c364c"} Feb 26 16:09:13 crc kubenswrapper[4907]: I0226 16:09:13.814948 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-d558885bc-zxcrp" event={"ID":"d7461168-b46b-48da-ace3-8f02feb49468","Type":"ContainerDied","Data":"a7e840b2b4f85fa4123e00a0e223ba92219273353e329177960b7060899c0f8f"} Feb 26 16:09:13 crc kubenswrapper[4907]: I0226 16:09:13.814976 4907 scope.go:117] "RemoveContainer" containerID="2f7bc85953cd691616e1b468caa7e0aed58aae1cd556230ba4f51f62939c364c" Feb 26 16:09:13 crc kubenswrapper[4907]: I0226 16:09:13.816909 4907 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/d7461168-b46b-48da-ace3-8f02feb49468-dns-svc\") on node \"crc\" DevicePath \"\"" Feb 26 16:09:13 crc kubenswrapper[4907]: I0226 16:09:13.816931 4907 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/d7461168-b46b-48da-ace3-8f02feb49468-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Feb 26 16:09:13 crc kubenswrapper[4907]: I0226 16:09:13.816941 4907 reconciler_common.go:293] "Volume detached for volume \"openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/configmap/d7461168-b46b-48da-ace3-8f02feb49468-openstack-edpm-ipam\") on node \"crc\" DevicePath \"\"" Feb 26 16:09:13 crc kubenswrapper[4907]: I0226 16:09:13.816952 4907 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/d7461168-b46b-48da-ace3-8f02feb49468-dns-swift-storage-0\") on node \"crc\" DevicePath \"\"" Feb 26 16:09:13 crc kubenswrapper[4907]: I0226 
16:09:13.816960 4907 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d7461168-b46b-48da-ace3-8f02feb49468-config\") on node \"crc\" DevicePath \"\"" Feb 26 16:09:13 crc kubenswrapper[4907]: I0226 16:09:13.816968 4907 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/d7461168-b46b-48da-ace3-8f02feb49468-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Feb 26 16:09:13 crc kubenswrapper[4907]: I0226 16:09:13.836662 4907 scope.go:117] "RemoveContainer" containerID="0004545745de3dce89970c2b01267a014286255432ec4d67bba839a93cd33713" Feb 26 16:09:13 crc kubenswrapper[4907]: I0226 16:09:13.848229 4907 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-d558885bc-zxcrp"] Feb 26 16:09:13 crc kubenswrapper[4907]: I0226 16:09:13.857047 4907 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-d558885bc-zxcrp"] Feb 26 16:09:13 crc kubenswrapper[4907]: I0226 16:09:13.859027 4907 scope.go:117] "RemoveContainer" containerID="2f7bc85953cd691616e1b468caa7e0aed58aae1cd556230ba4f51f62939c364c" Feb 26 16:09:13 crc kubenswrapper[4907]: E0226 16:09:13.859508 4907 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"2f7bc85953cd691616e1b468caa7e0aed58aae1cd556230ba4f51f62939c364c\": container with ID starting with 2f7bc85953cd691616e1b468caa7e0aed58aae1cd556230ba4f51f62939c364c not found: ID does not exist" containerID="2f7bc85953cd691616e1b468caa7e0aed58aae1cd556230ba4f51f62939c364c" Feb 26 16:09:13 crc kubenswrapper[4907]: I0226 16:09:13.859539 4907 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2f7bc85953cd691616e1b468caa7e0aed58aae1cd556230ba4f51f62939c364c"} err="failed to get container status \"2f7bc85953cd691616e1b468caa7e0aed58aae1cd556230ba4f51f62939c364c\": rpc error: code = NotFound desc = could 
not find container \"2f7bc85953cd691616e1b468caa7e0aed58aae1cd556230ba4f51f62939c364c\": container with ID starting with 2f7bc85953cd691616e1b468caa7e0aed58aae1cd556230ba4f51f62939c364c not found: ID does not exist" Feb 26 16:09:13 crc kubenswrapper[4907]: I0226 16:09:13.859568 4907 scope.go:117] "RemoveContainer" containerID="0004545745de3dce89970c2b01267a014286255432ec4d67bba839a93cd33713" Feb 26 16:09:13 crc kubenswrapper[4907]: E0226 16:09:13.859824 4907 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"0004545745de3dce89970c2b01267a014286255432ec4d67bba839a93cd33713\": container with ID starting with 0004545745de3dce89970c2b01267a014286255432ec4d67bba839a93cd33713 not found: ID does not exist" containerID="0004545745de3dce89970c2b01267a014286255432ec4d67bba839a93cd33713" Feb 26 16:09:13 crc kubenswrapper[4907]: I0226 16:09:13.859857 4907 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0004545745de3dce89970c2b01267a014286255432ec4d67bba839a93cd33713"} err="failed to get container status \"0004545745de3dce89970c2b01267a014286255432ec4d67bba839a93cd33713\": rpc error: code = NotFound desc = could not find container \"0004545745de3dce89970c2b01267a014286255432ec4d67bba839a93cd33713\": container with ID starting with 0004545745de3dce89970c2b01267a014286255432ec4d67bba839a93cd33713 not found: ID does not exist" Feb 26 16:09:14 crc kubenswrapper[4907]: I0226 16:09:14.128060 4907 scope.go:117] "RemoveContainer" containerID="b46bef3acd92cfa3cb8f5894a729a1bb1795fbc69b7b7c5835186a0b609a6e46" Feb 26 16:09:14 crc kubenswrapper[4907]: E0226 16:09:14.128679 4907 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon 
pod=machine-config-daemon-v5ng6_openshift-machine-config-operator(917eebf3-db36-47b8-af0a-b80d042fddab)\"" pod="openshift-machine-config-operator/machine-config-daemon-v5ng6" podUID="917eebf3-db36-47b8-af0a-b80d042fddab" Feb 26 16:09:14 crc kubenswrapper[4907]: I0226 16:09:14.139976 4907 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d7461168-b46b-48da-ace3-8f02feb49468" path="/var/lib/kubelet/pods/d7461168-b46b-48da-ace3-8f02feb49468/volumes" Feb 26 16:09:15 crc kubenswrapper[4907]: I0226 16:09:15.519816 4907 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-sqrbr" Feb 26 16:09:15 crc kubenswrapper[4907]: I0226 16:09:15.581518 4907 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-sqrbr" Feb 26 16:09:16 crc kubenswrapper[4907]: I0226 16:09:16.377928 4907 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-sqrbr"] Feb 26 16:09:16 crc kubenswrapper[4907]: I0226 16:09:16.847390 4907 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-sqrbr" podUID="6b6d9f98-d446-4b48-bd17-1c6c3ab80460" containerName="registry-server" containerID="cri-o://664df51b4d2bd362c44747d65125cc915512f4154cc72e9608ea2ff884f601c8" gracePeriod=2 Feb 26 16:09:17 crc kubenswrapper[4907]: I0226 16:09:17.306408 4907 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-sqrbr" Feb 26 16:09:17 crc kubenswrapper[4907]: I0226 16:09:17.386611 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/6b6d9f98-d446-4b48-bd17-1c6c3ab80460-utilities\") pod \"6b6d9f98-d446-4b48-bd17-1c6c3ab80460\" (UID: \"6b6d9f98-d446-4b48-bd17-1c6c3ab80460\") " Feb 26 16:09:17 crc kubenswrapper[4907]: I0226 16:09:17.386707 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-nj4gx\" (UniqueName: \"kubernetes.io/projected/6b6d9f98-d446-4b48-bd17-1c6c3ab80460-kube-api-access-nj4gx\") pod \"6b6d9f98-d446-4b48-bd17-1c6c3ab80460\" (UID: \"6b6d9f98-d446-4b48-bd17-1c6c3ab80460\") " Feb 26 16:09:17 crc kubenswrapper[4907]: I0226 16:09:17.386739 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/6b6d9f98-d446-4b48-bd17-1c6c3ab80460-catalog-content\") pod \"6b6d9f98-d446-4b48-bd17-1c6c3ab80460\" (UID: \"6b6d9f98-d446-4b48-bd17-1c6c3ab80460\") " Feb 26 16:09:17 crc kubenswrapper[4907]: I0226 16:09:17.387699 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/6b6d9f98-d446-4b48-bd17-1c6c3ab80460-utilities" (OuterVolumeSpecName: "utilities") pod "6b6d9f98-d446-4b48-bd17-1c6c3ab80460" (UID: "6b6d9f98-d446-4b48-bd17-1c6c3ab80460"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 26 16:09:17 crc kubenswrapper[4907]: I0226 16:09:17.393093 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6b6d9f98-d446-4b48-bd17-1c6c3ab80460-kube-api-access-nj4gx" (OuterVolumeSpecName: "kube-api-access-nj4gx") pod "6b6d9f98-d446-4b48-bd17-1c6c3ab80460" (UID: "6b6d9f98-d446-4b48-bd17-1c6c3ab80460"). InnerVolumeSpecName "kube-api-access-nj4gx". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 26 16:09:17 crc kubenswrapper[4907]: I0226 16:09:17.488782 4907 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/6b6d9f98-d446-4b48-bd17-1c6c3ab80460-utilities\") on node \"crc\" DevicePath \"\"" Feb 26 16:09:17 crc kubenswrapper[4907]: I0226 16:09:17.488817 4907 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-nj4gx\" (UniqueName: \"kubernetes.io/projected/6b6d9f98-d446-4b48-bd17-1c6c3ab80460-kube-api-access-nj4gx\") on node \"crc\" DevicePath \"\"" Feb 26 16:09:17 crc kubenswrapper[4907]: I0226 16:09:17.515006 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/6b6d9f98-d446-4b48-bd17-1c6c3ab80460-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "6b6d9f98-d446-4b48-bd17-1c6c3ab80460" (UID: "6b6d9f98-d446-4b48-bd17-1c6c3ab80460"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 26 16:09:17 crc kubenswrapper[4907]: I0226 16:09:17.590626 4907 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/6b6d9f98-d446-4b48-bd17-1c6c3ab80460-catalog-content\") on node \"crc\" DevicePath \"\"" Feb 26 16:09:17 crc kubenswrapper[4907]: I0226 16:09:17.856115 4907 generic.go:334] "Generic (PLEG): container finished" podID="6b6d9f98-d446-4b48-bd17-1c6c3ab80460" containerID="664df51b4d2bd362c44747d65125cc915512f4154cc72e9608ea2ff884f601c8" exitCode=0 Feb 26 16:09:17 crc kubenswrapper[4907]: I0226 16:09:17.856157 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-sqrbr" event={"ID":"6b6d9f98-d446-4b48-bd17-1c6c3ab80460","Type":"ContainerDied","Data":"664df51b4d2bd362c44747d65125cc915512f4154cc72e9608ea2ff884f601c8"} Feb 26 16:09:17 crc kubenswrapper[4907]: I0226 16:09:17.856181 4907 kubelet.go:2453] "SyncLoop (PLEG): event for 
pod" pod="openshift-marketplace/redhat-operators-sqrbr" event={"ID":"6b6d9f98-d446-4b48-bd17-1c6c3ab80460","Type":"ContainerDied","Data":"6a2a66f2152967ccdb479707b45fcfad2a13f826d93a3c2d94bb166133ae85f8"} Feb 26 16:09:17 crc kubenswrapper[4907]: I0226 16:09:17.856196 4907 scope.go:117] "RemoveContainer" containerID="664df51b4d2bd362c44747d65125cc915512f4154cc72e9608ea2ff884f601c8" Feb 26 16:09:17 crc kubenswrapper[4907]: I0226 16:09:17.856319 4907 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-sqrbr" Feb 26 16:09:17 crc kubenswrapper[4907]: I0226 16:09:17.917246 4907 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-sqrbr"] Feb 26 16:09:17 crc kubenswrapper[4907]: I0226 16:09:17.927189 4907 scope.go:117] "RemoveContainer" containerID="e60c064b910639f18922aacb712f4365fc44c1c697d5c006112793fc9afd7562" Feb 26 16:09:17 crc kubenswrapper[4907]: I0226 16:09:17.939240 4907 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-sqrbr"] Feb 26 16:09:17 crc kubenswrapper[4907]: I0226 16:09:17.966281 4907 scope.go:117] "RemoveContainer" containerID="9a3ff335f115aa89bac206470ff5fd85c2e144cfbfd8da0f9cb56babc83c6313" Feb 26 16:09:18 crc kubenswrapper[4907]: I0226 16:09:18.021441 4907 scope.go:117] "RemoveContainer" containerID="664df51b4d2bd362c44747d65125cc915512f4154cc72e9608ea2ff884f601c8" Feb 26 16:09:18 crc kubenswrapper[4907]: E0226 16:09:18.021942 4907 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"664df51b4d2bd362c44747d65125cc915512f4154cc72e9608ea2ff884f601c8\": container with ID starting with 664df51b4d2bd362c44747d65125cc915512f4154cc72e9608ea2ff884f601c8 not found: ID does not exist" containerID="664df51b4d2bd362c44747d65125cc915512f4154cc72e9608ea2ff884f601c8" Feb 26 16:09:18 crc kubenswrapper[4907]: I0226 16:09:18.021971 4907 
pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"664df51b4d2bd362c44747d65125cc915512f4154cc72e9608ea2ff884f601c8"} err="failed to get container status \"664df51b4d2bd362c44747d65125cc915512f4154cc72e9608ea2ff884f601c8\": rpc error: code = NotFound desc = could not find container \"664df51b4d2bd362c44747d65125cc915512f4154cc72e9608ea2ff884f601c8\": container with ID starting with 664df51b4d2bd362c44747d65125cc915512f4154cc72e9608ea2ff884f601c8 not found: ID does not exist" Feb 26 16:09:18 crc kubenswrapper[4907]: I0226 16:09:18.021989 4907 scope.go:117] "RemoveContainer" containerID="e60c064b910639f18922aacb712f4365fc44c1c697d5c006112793fc9afd7562" Feb 26 16:09:18 crc kubenswrapper[4907]: E0226 16:09:18.022391 4907 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e60c064b910639f18922aacb712f4365fc44c1c697d5c006112793fc9afd7562\": container with ID starting with e60c064b910639f18922aacb712f4365fc44c1c697d5c006112793fc9afd7562 not found: ID does not exist" containerID="e60c064b910639f18922aacb712f4365fc44c1c697d5c006112793fc9afd7562" Feb 26 16:09:18 crc kubenswrapper[4907]: I0226 16:09:18.022408 4907 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e60c064b910639f18922aacb712f4365fc44c1c697d5c006112793fc9afd7562"} err="failed to get container status \"e60c064b910639f18922aacb712f4365fc44c1c697d5c006112793fc9afd7562\": rpc error: code = NotFound desc = could not find container \"e60c064b910639f18922aacb712f4365fc44c1c697d5c006112793fc9afd7562\": container with ID starting with e60c064b910639f18922aacb712f4365fc44c1c697d5c006112793fc9afd7562 not found: ID does not exist" Feb 26 16:09:18 crc kubenswrapper[4907]: I0226 16:09:18.022421 4907 scope.go:117] "RemoveContainer" containerID="9a3ff335f115aa89bac206470ff5fd85c2e144cfbfd8da0f9cb56babc83c6313" Feb 26 16:09:18 crc kubenswrapper[4907]: E0226 
16:09:18.022751 4907 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"9a3ff335f115aa89bac206470ff5fd85c2e144cfbfd8da0f9cb56babc83c6313\": container with ID starting with 9a3ff335f115aa89bac206470ff5fd85c2e144cfbfd8da0f9cb56babc83c6313 not found: ID does not exist" containerID="9a3ff335f115aa89bac206470ff5fd85c2e144cfbfd8da0f9cb56babc83c6313" Feb 26 16:09:18 crc kubenswrapper[4907]: I0226 16:09:18.022806 4907 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9a3ff335f115aa89bac206470ff5fd85c2e144cfbfd8da0f9cb56babc83c6313"} err="failed to get container status \"9a3ff335f115aa89bac206470ff5fd85c2e144cfbfd8da0f9cb56babc83c6313\": rpc error: code = NotFound desc = could not find container \"9a3ff335f115aa89bac206470ff5fd85c2e144cfbfd8da0f9cb56babc83c6313\": container with ID starting with 9a3ff335f115aa89bac206470ff5fd85c2e144cfbfd8da0f9cb56babc83c6313 not found: ID does not exist" Feb 26 16:09:18 crc kubenswrapper[4907]: I0226 16:09:18.138127 4907 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6b6d9f98-d446-4b48-bd17-1c6c3ab80460" path="/var/lib/kubelet/pods/6b6d9f98-d446-4b48-bd17-1c6c3ab80460/volumes" Feb 26 16:09:24 crc kubenswrapper[4907]: I0226 16:09:24.682858 4907 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-hp77p"] Feb 26 16:09:24 crc kubenswrapper[4907]: E0226 16:09:24.683851 4907 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0ac59eb1-73ee-4a73-90bd-2273f03c9498" containerName="dnsmasq-dns" Feb 26 16:09:24 crc kubenswrapper[4907]: I0226 16:09:24.683870 4907 state_mem.go:107] "Deleted CPUSet assignment" podUID="0ac59eb1-73ee-4a73-90bd-2273f03c9498" containerName="dnsmasq-dns" Feb 26 16:09:24 crc kubenswrapper[4907]: E0226 16:09:24.683891 4907 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6b6d9f98-d446-4b48-bd17-1c6c3ab80460" 
containerName="extract-utilities" Feb 26 16:09:24 crc kubenswrapper[4907]: I0226 16:09:24.683897 4907 state_mem.go:107] "Deleted CPUSet assignment" podUID="6b6d9f98-d446-4b48-bd17-1c6c3ab80460" containerName="extract-utilities" Feb 26 16:09:24 crc kubenswrapper[4907]: E0226 16:09:24.683911 4907 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6b6d9f98-d446-4b48-bd17-1c6c3ab80460" containerName="registry-server" Feb 26 16:09:24 crc kubenswrapper[4907]: I0226 16:09:24.683918 4907 state_mem.go:107] "Deleted CPUSet assignment" podUID="6b6d9f98-d446-4b48-bd17-1c6c3ab80460" containerName="registry-server" Feb 26 16:09:24 crc kubenswrapper[4907]: E0226 16:09:24.683936 4907 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d7461168-b46b-48da-ace3-8f02feb49468" containerName="dnsmasq-dns" Feb 26 16:09:24 crc kubenswrapper[4907]: I0226 16:09:24.683942 4907 state_mem.go:107] "Deleted CPUSet assignment" podUID="d7461168-b46b-48da-ace3-8f02feb49468" containerName="dnsmasq-dns" Feb 26 16:09:24 crc kubenswrapper[4907]: E0226 16:09:24.683954 4907 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0ac59eb1-73ee-4a73-90bd-2273f03c9498" containerName="init" Feb 26 16:09:24 crc kubenswrapper[4907]: I0226 16:09:24.683959 4907 state_mem.go:107] "Deleted CPUSet assignment" podUID="0ac59eb1-73ee-4a73-90bd-2273f03c9498" containerName="init" Feb 26 16:09:24 crc kubenswrapper[4907]: E0226 16:09:24.683972 4907 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d7461168-b46b-48da-ace3-8f02feb49468" containerName="init" Feb 26 16:09:24 crc kubenswrapper[4907]: I0226 16:09:24.683978 4907 state_mem.go:107] "Deleted CPUSet assignment" podUID="d7461168-b46b-48da-ace3-8f02feb49468" containerName="init" Feb 26 16:09:24 crc kubenswrapper[4907]: E0226 16:09:24.683984 4907 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6b6d9f98-d446-4b48-bd17-1c6c3ab80460" containerName="extract-content" Feb 26 16:09:24 crc 
kubenswrapper[4907]: I0226 16:09:24.683989 4907 state_mem.go:107] "Deleted CPUSet assignment" podUID="6b6d9f98-d446-4b48-bd17-1c6c3ab80460" containerName="extract-content" Feb 26 16:09:24 crc kubenswrapper[4907]: I0226 16:09:24.684225 4907 memory_manager.go:354] "RemoveStaleState removing state" podUID="6b6d9f98-d446-4b48-bd17-1c6c3ab80460" containerName="registry-server" Feb 26 16:09:24 crc kubenswrapper[4907]: I0226 16:09:24.684275 4907 memory_manager.go:354] "RemoveStaleState removing state" podUID="d7461168-b46b-48da-ace3-8f02feb49468" containerName="dnsmasq-dns" Feb 26 16:09:24 crc kubenswrapper[4907]: I0226 16:09:24.684287 4907 memory_manager.go:354] "RemoveStaleState removing state" podUID="0ac59eb1-73ee-4a73-90bd-2273f03c9498" containerName="dnsmasq-dns" Feb 26 16:09:24 crc kubenswrapper[4907]: I0226 16:09:24.685719 4907 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-hp77p" Feb 26 16:09:24 crc kubenswrapper[4907]: I0226 16:09:24.702620 4907 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-hp77p"] Feb 26 16:09:24 crc kubenswrapper[4907]: I0226 16:09:24.731482 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/63935def-a32a-4fc3-8d27-be330a4021f7-utilities\") pod \"redhat-marketplace-hp77p\" (UID: \"63935def-a32a-4fc3-8d27-be330a4021f7\") " pod="openshift-marketplace/redhat-marketplace-hp77p" Feb 26 16:09:24 crc kubenswrapper[4907]: I0226 16:09:24.731556 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/63935def-a32a-4fc3-8d27-be330a4021f7-catalog-content\") pod \"redhat-marketplace-hp77p\" (UID: \"63935def-a32a-4fc3-8d27-be330a4021f7\") " pod="openshift-marketplace/redhat-marketplace-hp77p" Feb 26 16:09:24 crc kubenswrapper[4907]: I0226 
16:09:24.731644 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nlt5p\" (UniqueName: \"kubernetes.io/projected/63935def-a32a-4fc3-8d27-be330a4021f7-kube-api-access-nlt5p\") pod \"redhat-marketplace-hp77p\" (UID: \"63935def-a32a-4fc3-8d27-be330a4021f7\") " pod="openshift-marketplace/redhat-marketplace-hp77p" Feb 26 16:09:24 crc kubenswrapper[4907]: I0226 16:09:24.833511 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/63935def-a32a-4fc3-8d27-be330a4021f7-utilities\") pod \"redhat-marketplace-hp77p\" (UID: \"63935def-a32a-4fc3-8d27-be330a4021f7\") " pod="openshift-marketplace/redhat-marketplace-hp77p" Feb 26 16:09:24 crc kubenswrapper[4907]: I0226 16:09:24.833581 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/63935def-a32a-4fc3-8d27-be330a4021f7-catalog-content\") pod \"redhat-marketplace-hp77p\" (UID: \"63935def-a32a-4fc3-8d27-be330a4021f7\") " pod="openshift-marketplace/redhat-marketplace-hp77p" Feb 26 16:09:24 crc kubenswrapper[4907]: I0226 16:09:24.833647 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nlt5p\" (UniqueName: \"kubernetes.io/projected/63935def-a32a-4fc3-8d27-be330a4021f7-kube-api-access-nlt5p\") pod \"redhat-marketplace-hp77p\" (UID: \"63935def-a32a-4fc3-8d27-be330a4021f7\") " pod="openshift-marketplace/redhat-marketplace-hp77p" Feb 26 16:09:24 crc kubenswrapper[4907]: I0226 16:09:24.834339 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/63935def-a32a-4fc3-8d27-be330a4021f7-catalog-content\") pod \"redhat-marketplace-hp77p\" (UID: \"63935def-a32a-4fc3-8d27-be330a4021f7\") " pod="openshift-marketplace/redhat-marketplace-hp77p" Feb 26 16:09:24 crc kubenswrapper[4907]: I0226 
16:09:24.834518 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/63935def-a32a-4fc3-8d27-be330a4021f7-utilities\") pod \"redhat-marketplace-hp77p\" (UID: \"63935def-a32a-4fc3-8d27-be330a4021f7\") " pod="openshift-marketplace/redhat-marketplace-hp77p" Feb 26 16:09:24 crc kubenswrapper[4907]: I0226 16:09:24.863688 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nlt5p\" (UniqueName: \"kubernetes.io/projected/63935def-a32a-4fc3-8d27-be330a4021f7-kube-api-access-nlt5p\") pod \"redhat-marketplace-hp77p\" (UID: \"63935def-a32a-4fc3-8d27-be330a4021f7\") " pod="openshift-marketplace/redhat-marketplace-hp77p" Feb 26 16:09:25 crc kubenswrapper[4907]: I0226 16:09:25.007602 4907 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-hp77p" Feb 26 16:09:25 crc kubenswrapper[4907]: I0226 16:09:25.511097 4907 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-hp77p"] Feb 26 16:09:25 crc kubenswrapper[4907]: I0226 16:09:25.928878 4907 generic.go:334] "Generic (PLEG): container finished" podID="cbc69627-1691-43df-a77a-ca3e26e67aaa" containerID="cfa426cc7a88874ef8ecfe8ba9e28030e4c01f362780915f57f7773545098985" exitCode=0 Feb 26 16:09:25 crc kubenswrapper[4907]: I0226 16:09:25.928961 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"cbc69627-1691-43df-a77a-ca3e26e67aaa","Type":"ContainerDied","Data":"cfa426cc7a88874ef8ecfe8ba9e28030e4c01f362780915f57f7773545098985"} Feb 26 16:09:25 crc kubenswrapper[4907]: I0226 16:09:25.932778 4907 generic.go:334] "Generic (PLEG): container finished" podID="63935def-a32a-4fc3-8d27-be330a4021f7" containerID="54362d014aedd3ff2246f99a11130fe3cff763130542fad26e92043f4e727d76" exitCode=0 Feb 26 16:09:25 crc kubenswrapper[4907]: I0226 16:09:25.932844 4907 kubelet.go:2453] 
"SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-hp77p" event={"ID":"63935def-a32a-4fc3-8d27-be330a4021f7","Type":"ContainerDied","Data":"54362d014aedd3ff2246f99a11130fe3cff763130542fad26e92043f4e727d76"} Feb 26 16:09:25 crc kubenswrapper[4907]: I0226 16:09:25.932870 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-hp77p" event={"ID":"63935def-a32a-4fc3-8d27-be330a4021f7","Type":"ContainerStarted","Data":"7f145a25e29046ab2774c8aefacdd77c6720ab5c45148393ae268884035855db"} Feb 26 16:09:25 crc kubenswrapper[4907]: I0226 16:09:25.938198 4907 generic.go:334] "Generic (PLEG): container finished" podID="20078d55-ee5c-4818-9ff9-4089683c9729" containerID="9db1180a5b788d55ed4b83bc22fc255fd7ef0c4fe20b53a1c4cddd51d73850d5" exitCode=0 Feb 26 16:09:25 crc kubenswrapper[4907]: I0226 16:09:25.938252 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"20078d55-ee5c-4818-9ff9-4089683c9729","Type":"ContainerDied","Data":"9db1180a5b788d55ed4b83bc22fc255fd7ef0c4fe20b53a1c4cddd51d73850d5"} Feb 26 16:09:26 crc kubenswrapper[4907]: I0226 16:09:26.130923 4907 scope.go:117] "RemoveContainer" containerID="b46bef3acd92cfa3cb8f5894a729a1bb1795fbc69b7b7c5835186a0b609a6e46" Feb 26 16:09:26 crc kubenswrapper[4907]: E0226 16:09:26.131162 4907 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-v5ng6_openshift-machine-config-operator(917eebf3-db36-47b8-af0a-b80d042fddab)\"" pod="openshift-machine-config-operator/machine-config-daemon-v5ng6" podUID="917eebf3-db36-47b8-af0a-b80d042fddab" Feb 26 16:09:26 crc kubenswrapper[4907]: I0226 16:09:26.949297 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-hp77p" 
event={"ID":"63935def-a32a-4fc3-8d27-be330a4021f7","Type":"ContainerStarted","Data":"d7a64676d1e6d709f27b3cd03a37d05564c7e04c8fc8eeebca0c5b3da6bbc716"} Feb 26 16:09:26 crc kubenswrapper[4907]: I0226 16:09:26.952812 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"20078d55-ee5c-4818-9ff9-4089683c9729","Type":"ContainerStarted","Data":"a942566375aba220dc0473ddfdcb3325119c0e43bf586ea8b6219836bf87b42e"} Feb 26 16:09:26 crc kubenswrapper[4907]: I0226 16:09:26.952992 4907 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/rabbitmq-server-0" Feb 26 16:09:26 crc kubenswrapper[4907]: I0226 16:09:26.956786 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"cbc69627-1691-43df-a77a-ca3e26e67aaa","Type":"ContainerStarted","Data":"652e81f2c9795c240d2dbfea0d1d41f7274ce8e347fb7afa5b4da67a0bd109a3"} Feb 26 16:09:26 crc kubenswrapper[4907]: I0226 16:09:26.957000 4907 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/rabbitmq-cell1-server-0" Feb 26 16:09:27 crc kubenswrapper[4907]: I0226 16:09:27.039299 4907 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/rabbitmq-server-0" podStartSLOduration=37.03928403 podStartE2EDuration="37.03928403s" podCreationTimestamp="2026-02-26 16:08:50 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-26 16:09:27.038499831 +0000 UTC m=+1629.557061680" watchObservedRunningTime="2026-02-26 16:09:27.03928403 +0000 UTC m=+1629.557845879" Feb 26 16:09:27 crc kubenswrapper[4907]: I0226 16:09:27.040704 4907 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/rabbitmq-cell1-server-0" podStartSLOduration=37.040698655 podStartE2EDuration="37.040698655s" podCreationTimestamp="2026-02-26 16:08:50 +0000 UTC" firstStartedPulling="0001-01-01 
00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-26 16:09:27.020685171 +0000 UTC m=+1629.539247020" watchObservedRunningTime="2026-02-26 16:09:27.040698655 +0000 UTC m=+1629.559260504" Feb 26 16:09:27 crc kubenswrapper[4907]: I0226 16:09:27.967860 4907 generic.go:334] "Generic (PLEG): container finished" podID="63935def-a32a-4fc3-8d27-be330a4021f7" containerID="d7a64676d1e6d709f27b3cd03a37d05564c7e04c8fc8eeebca0c5b3da6bbc716" exitCode=0 Feb 26 16:09:27 crc kubenswrapper[4907]: I0226 16:09:27.967970 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-hp77p" event={"ID":"63935def-a32a-4fc3-8d27-be330a4021f7","Type":"ContainerDied","Data":"d7a64676d1e6d709f27b3cd03a37d05564c7e04c8fc8eeebca0c5b3da6bbc716"} Feb 26 16:09:28 crc kubenswrapper[4907]: I0226 16:09:28.979443 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-hp77p" event={"ID":"63935def-a32a-4fc3-8d27-be330a4021f7","Type":"ContainerStarted","Data":"0f2e43df3779ad08a6ee84e51680af0c607db79325c5abc57e8189060a7ab5c2"} Feb 26 16:09:29 crc kubenswrapper[4907]: I0226 16:09:29.004610 4907 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-hp77p" podStartSLOduration=2.301600319 podStartE2EDuration="5.004558747s" podCreationTimestamp="2026-02-26 16:09:24 +0000 UTC" firstStartedPulling="2026-02-26 16:09:25.933924183 +0000 UTC m=+1628.452486032" lastFinishedPulling="2026-02-26 16:09:28.636882611 +0000 UTC m=+1631.155444460" observedRunningTime="2026-02-26 16:09:28.996237232 +0000 UTC m=+1631.514799071" watchObservedRunningTime="2026-02-26 16:09:29.004558747 +0000 UTC m=+1631.523120596" Feb 26 16:09:31 crc kubenswrapper[4907]: I0226 16:09:31.896982 4907 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-cjbr8"] Feb 26 16:09:31 crc kubenswrapper[4907]: 
I0226 16:09:31.899453 4907 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-cjbr8" Feb 26 16:09:31 crc kubenswrapper[4907]: I0226 16:09:31.901731 4907 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-57jxc" Feb 26 16:09:31 crc kubenswrapper[4907]: I0226 16:09:31.901733 4907 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Feb 26 16:09:31 crc kubenswrapper[4907]: I0226 16:09:31.902413 4907 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Feb 26 16:09:31 crc kubenswrapper[4907]: I0226 16:09:31.904701 4907 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Feb 26 16:09:31 crc kubenswrapper[4907]: I0226 16:09:31.921016 4907 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-cjbr8"] Feb 26 16:09:32 crc kubenswrapper[4907]: I0226 16:09:32.072516 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jk2bd\" (UniqueName: \"kubernetes.io/projected/47906d66-a8ce-445d-a71c-63f5bcfb6902-kube-api-access-jk2bd\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-cjbr8\" (UID: \"47906d66-a8ce-445d-a71c-63f5bcfb6902\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-cjbr8" Feb 26 16:09:32 crc kubenswrapper[4907]: I0226 16:09:32.072627 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"repo-setup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/47906d66-a8ce-445d-a71c-63f5bcfb6902-repo-setup-combined-ca-bundle\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-cjbr8\" (UID: \"47906d66-a8ce-445d-a71c-63f5bcfb6902\") " 
pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-cjbr8" Feb 26 16:09:32 crc kubenswrapper[4907]: I0226 16:09:32.073043 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/47906d66-a8ce-445d-a71c-63f5bcfb6902-inventory\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-cjbr8\" (UID: \"47906d66-a8ce-445d-a71c-63f5bcfb6902\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-cjbr8" Feb 26 16:09:32 crc kubenswrapper[4907]: I0226 16:09:32.073188 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/47906d66-a8ce-445d-a71c-63f5bcfb6902-ssh-key-openstack-edpm-ipam\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-cjbr8\" (UID: \"47906d66-a8ce-445d-a71c-63f5bcfb6902\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-cjbr8" Feb 26 16:09:32 crc kubenswrapper[4907]: I0226 16:09:32.174554 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/47906d66-a8ce-445d-a71c-63f5bcfb6902-ssh-key-openstack-edpm-ipam\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-cjbr8\" (UID: \"47906d66-a8ce-445d-a71c-63f5bcfb6902\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-cjbr8" Feb 26 16:09:32 crc kubenswrapper[4907]: I0226 16:09:32.174639 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jk2bd\" (UniqueName: \"kubernetes.io/projected/47906d66-a8ce-445d-a71c-63f5bcfb6902-kube-api-access-jk2bd\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-cjbr8\" (UID: \"47906d66-a8ce-445d-a71c-63f5bcfb6902\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-cjbr8" Feb 26 16:09:32 crc kubenswrapper[4907]: I0226 16:09:32.174674 4907 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"repo-setup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/47906d66-a8ce-445d-a71c-63f5bcfb6902-repo-setup-combined-ca-bundle\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-cjbr8\" (UID: \"47906d66-a8ce-445d-a71c-63f5bcfb6902\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-cjbr8" Feb 26 16:09:32 crc kubenswrapper[4907]: I0226 16:09:32.174770 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/47906d66-a8ce-445d-a71c-63f5bcfb6902-inventory\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-cjbr8\" (UID: \"47906d66-a8ce-445d-a71c-63f5bcfb6902\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-cjbr8" Feb 26 16:09:32 crc kubenswrapper[4907]: I0226 16:09:32.183771 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"repo-setup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/47906d66-a8ce-445d-a71c-63f5bcfb6902-repo-setup-combined-ca-bundle\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-cjbr8\" (UID: \"47906d66-a8ce-445d-a71c-63f5bcfb6902\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-cjbr8" Feb 26 16:09:32 crc kubenswrapper[4907]: I0226 16:09:32.185714 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/47906d66-a8ce-445d-a71c-63f5bcfb6902-inventory\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-cjbr8\" (UID: \"47906d66-a8ce-445d-a71c-63f5bcfb6902\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-cjbr8" Feb 26 16:09:32 crc kubenswrapper[4907]: I0226 16:09:32.191157 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/47906d66-a8ce-445d-a71c-63f5bcfb6902-ssh-key-openstack-edpm-ipam\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-cjbr8\" 
(UID: \"47906d66-a8ce-445d-a71c-63f5bcfb6902\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-cjbr8" Feb 26 16:09:32 crc kubenswrapper[4907]: I0226 16:09:32.209863 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jk2bd\" (UniqueName: \"kubernetes.io/projected/47906d66-a8ce-445d-a71c-63f5bcfb6902-kube-api-access-jk2bd\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-cjbr8\" (UID: \"47906d66-a8ce-445d-a71c-63f5bcfb6902\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-cjbr8" Feb 26 16:09:32 crc kubenswrapper[4907]: I0226 16:09:32.219934 4907 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-cjbr8" Feb 26 16:09:33 crc kubenswrapper[4907]: I0226 16:09:33.108970 4907 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-cjbr8"] Feb 26 16:09:34 crc kubenswrapper[4907]: I0226 16:09:34.031317 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-cjbr8" event={"ID":"47906d66-a8ce-445d-a71c-63f5bcfb6902","Type":"ContainerStarted","Data":"161dab2158fae3ab6549e49ed0fb73f7487d57dd34086f8e509cddaf767b916b"} Feb 26 16:09:35 crc kubenswrapper[4907]: I0226 16:09:35.007852 4907 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-hp77p" Feb 26 16:09:35 crc kubenswrapper[4907]: I0226 16:09:35.007949 4907 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-hp77p" Feb 26 16:09:35 crc kubenswrapper[4907]: I0226 16:09:35.082143 4907 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-hp77p" Feb 26 16:09:36 crc kubenswrapper[4907]: I0226 16:09:36.141378 4907 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" 
status="ready" pod="openshift-marketplace/redhat-marketplace-hp77p" Feb 26 16:09:36 crc kubenswrapper[4907]: I0226 16:09:36.192682 4907 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-hp77p"] Feb 26 16:09:38 crc kubenswrapper[4907]: I0226 16:09:38.088304 4907 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-hp77p" podUID="63935def-a32a-4fc3-8d27-be330a4021f7" containerName="registry-server" containerID="cri-o://0f2e43df3779ad08a6ee84e51680af0c607db79325c5abc57e8189060a7ab5c2" gracePeriod=2 Feb 26 16:09:38 crc kubenswrapper[4907]: I0226 16:09:38.137934 4907 scope.go:117] "RemoveContainer" containerID="b46bef3acd92cfa3cb8f5894a729a1bb1795fbc69b7b7c5835186a0b609a6e46" Feb 26 16:09:38 crc kubenswrapper[4907]: E0226 16:09:38.138214 4907 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-v5ng6_openshift-machine-config-operator(917eebf3-db36-47b8-af0a-b80d042fddab)\"" pod="openshift-machine-config-operator/machine-config-daemon-v5ng6" podUID="917eebf3-db36-47b8-af0a-b80d042fddab" Feb 26 16:09:39 crc kubenswrapper[4907]: I0226 16:09:39.103544 4907 generic.go:334] "Generic (PLEG): container finished" podID="63935def-a32a-4fc3-8d27-be330a4021f7" containerID="0f2e43df3779ad08a6ee84e51680af0c607db79325c5abc57e8189060a7ab5c2" exitCode=0 Feb 26 16:09:39 crc kubenswrapper[4907]: I0226 16:09:39.103605 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-hp77p" event={"ID":"63935def-a32a-4fc3-8d27-be330a4021f7","Type":"ContainerDied","Data":"0f2e43df3779ad08a6ee84e51680af0c607db79325c5abc57e8189060a7ab5c2"} Feb 26 16:09:41 crc kubenswrapper[4907]: I0226 16:09:41.298777 4907 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" 
pod="openstack/rabbitmq-cell1-server-0" Feb 26 16:09:41 crc kubenswrapper[4907]: I0226 16:09:41.345814 4907 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/rabbitmq-server-0" Feb 26 16:09:45 crc kubenswrapper[4907]: E0226 16:09:45.008073 4907 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 0f2e43df3779ad08a6ee84e51680af0c607db79325c5abc57e8189060a7ab5c2 is running failed: container process not found" containerID="0f2e43df3779ad08a6ee84e51680af0c607db79325c5abc57e8189060a7ab5c2" cmd=["grpc_health_probe","-addr=:50051"] Feb 26 16:09:45 crc kubenswrapper[4907]: E0226 16:09:45.010418 4907 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 0f2e43df3779ad08a6ee84e51680af0c607db79325c5abc57e8189060a7ab5c2 is running failed: container process not found" containerID="0f2e43df3779ad08a6ee84e51680af0c607db79325c5abc57e8189060a7ab5c2" cmd=["grpc_health_probe","-addr=:50051"] Feb 26 16:09:45 crc kubenswrapper[4907]: E0226 16:09:45.010819 4907 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 0f2e43df3779ad08a6ee84e51680af0c607db79325c5abc57e8189060a7ab5c2 is running failed: container process not found" containerID="0f2e43df3779ad08a6ee84e51680af0c607db79325c5abc57e8189060a7ab5c2" cmd=["grpc_health_probe","-addr=:50051"] Feb 26 16:09:45 crc kubenswrapper[4907]: E0226 16:09:45.010878 4907 prober.go:104] "Probe errored" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 0f2e43df3779ad08a6ee84e51680af0c607db79325c5abc57e8189060a7ab5c2 is running failed: container process not found" probeType="Readiness" pod="openshift-marketplace/redhat-marketplace-hp77p" podUID="63935def-a32a-4fc3-8d27-be330a4021f7" 
containerName="registry-server" Feb 26 16:09:47 crc kubenswrapper[4907]: I0226 16:09:47.647172 4907 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-hp77p" Feb 26 16:09:47 crc kubenswrapper[4907]: I0226 16:09:47.815288 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/63935def-a32a-4fc3-8d27-be330a4021f7-catalog-content\") pod \"63935def-a32a-4fc3-8d27-be330a4021f7\" (UID: \"63935def-a32a-4fc3-8d27-be330a4021f7\") " Feb 26 16:09:47 crc kubenswrapper[4907]: I0226 16:09:47.815652 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/63935def-a32a-4fc3-8d27-be330a4021f7-utilities\") pod \"63935def-a32a-4fc3-8d27-be330a4021f7\" (UID: \"63935def-a32a-4fc3-8d27-be330a4021f7\") " Feb 26 16:09:47 crc kubenswrapper[4907]: I0226 16:09:47.815700 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-nlt5p\" (UniqueName: \"kubernetes.io/projected/63935def-a32a-4fc3-8d27-be330a4021f7-kube-api-access-nlt5p\") pod \"63935def-a32a-4fc3-8d27-be330a4021f7\" (UID: \"63935def-a32a-4fc3-8d27-be330a4021f7\") " Feb 26 16:09:47 crc kubenswrapper[4907]: I0226 16:09:47.824173 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/63935def-a32a-4fc3-8d27-be330a4021f7-utilities" (OuterVolumeSpecName: "utilities") pod "63935def-a32a-4fc3-8d27-be330a4021f7" (UID: "63935def-a32a-4fc3-8d27-be330a4021f7"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 26 16:09:47 crc kubenswrapper[4907]: I0226 16:09:47.830827 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/63935def-a32a-4fc3-8d27-be330a4021f7-kube-api-access-nlt5p" (OuterVolumeSpecName: "kube-api-access-nlt5p") pod "63935def-a32a-4fc3-8d27-be330a4021f7" (UID: "63935def-a32a-4fc3-8d27-be330a4021f7"). InnerVolumeSpecName "kube-api-access-nlt5p". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 26 16:09:47 crc kubenswrapper[4907]: I0226 16:09:47.858264 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/63935def-a32a-4fc3-8d27-be330a4021f7-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "63935def-a32a-4fc3-8d27-be330a4021f7" (UID: "63935def-a32a-4fc3-8d27-be330a4021f7"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 26 16:09:47 crc kubenswrapper[4907]: I0226 16:09:47.926037 4907 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/63935def-a32a-4fc3-8d27-be330a4021f7-catalog-content\") on node \"crc\" DevicePath \"\"" Feb 26 16:09:47 crc kubenswrapper[4907]: I0226 16:09:47.927210 4907 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/63935def-a32a-4fc3-8d27-be330a4021f7-utilities\") on node \"crc\" DevicePath \"\"" Feb 26 16:09:47 crc kubenswrapper[4907]: I0226 16:09:47.927222 4907 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-nlt5p\" (UniqueName: \"kubernetes.io/projected/63935def-a32a-4fc3-8d27-be330a4021f7-kube-api-access-nlt5p\") on node \"crc\" DevicePath \"\"" Feb 26 16:09:48 crc kubenswrapper[4907]: I0226 16:09:48.209795 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-hp77p" 
event={"ID":"63935def-a32a-4fc3-8d27-be330a4021f7","Type":"ContainerDied","Data":"7f145a25e29046ab2774c8aefacdd77c6720ab5c45148393ae268884035855db"} Feb 26 16:09:48 crc kubenswrapper[4907]: I0226 16:09:48.209897 4907 scope.go:117] "RemoveContainer" containerID="0f2e43df3779ad08a6ee84e51680af0c607db79325c5abc57e8189060a7ab5c2" Feb 26 16:09:48 crc kubenswrapper[4907]: I0226 16:09:48.209828 4907 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-hp77p" Feb 26 16:09:48 crc kubenswrapper[4907]: I0226 16:09:48.212872 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-cjbr8" event={"ID":"47906d66-a8ce-445d-a71c-63f5bcfb6902","Type":"ContainerStarted","Data":"5efa3385242c108002b465bb996d5a41566b975445a0b6bcc8e95cffaf3f797f"} Feb 26 16:09:48 crc kubenswrapper[4907]: I0226 16:09:48.248357 4907 scope.go:117] "RemoveContainer" containerID="d7a64676d1e6d709f27b3cd03a37d05564c7e04c8fc8eeebca0c5b3da6bbc716" Feb 26 16:09:48 crc kubenswrapper[4907]: I0226 16:09:48.250282 4907 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-hp77p"] Feb 26 16:09:48 crc kubenswrapper[4907]: I0226 16:09:48.273723 4907 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-hp77p"] Feb 26 16:09:48 crc kubenswrapper[4907]: I0226 16:09:48.282870 4907 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-cjbr8" podStartSLOduration=2.651119327 podStartE2EDuration="17.282847814s" podCreationTimestamp="2026-02-26 16:09:31 +0000 UTC" firstStartedPulling="2026-02-26 16:09:33.118310803 +0000 UTC m=+1635.636872682" lastFinishedPulling="2026-02-26 16:09:47.75003933 +0000 UTC m=+1650.268601169" observedRunningTime="2026-02-26 16:09:48.254972567 +0000 UTC m=+1650.773534436" watchObservedRunningTime="2026-02-26 
16:09:48.282847814 +0000 UTC m=+1650.801409683" Feb 26 16:09:48 crc kubenswrapper[4907]: I0226 16:09:48.295508 4907 scope.go:117] "RemoveContainer" containerID="54362d014aedd3ff2246f99a11130fe3cff763130542fad26e92043f4e727d76" Feb 26 16:09:50 crc kubenswrapper[4907]: I0226 16:09:50.138635 4907 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="63935def-a32a-4fc3-8d27-be330a4021f7" path="/var/lib/kubelet/pods/63935def-a32a-4fc3-8d27-be330a4021f7/volumes" Feb 26 16:09:52 crc kubenswrapper[4907]: I0226 16:09:52.127197 4907 scope.go:117] "RemoveContainer" containerID="b46bef3acd92cfa3cb8f5894a729a1bb1795fbc69b7b7c5835186a0b609a6e46" Feb 26 16:09:52 crc kubenswrapper[4907]: E0226 16:09:52.127988 4907 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-v5ng6_openshift-machine-config-operator(917eebf3-db36-47b8-af0a-b80d042fddab)\"" pod="openshift-machine-config-operator/machine-config-daemon-v5ng6" podUID="917eebf3-db36-47b8-af0a-b80d042fddab" Feb 26 16:09:59 crc kubenswrapper[4907]: I0226 16:09:59.310290 4907 generic.go:334] "Generic (PLEG): container finished" podID="47906d66-a8ce-445d-a71c-63f5bcfb6902" containerID="5efa3385242c108002b465bb996d5a41566b975445a0b6bcc8e95cffaf3f797f" exitCode=0 Feb 26 16:09:59 crc kubenswrapper[4907]: I0226 16:09:59.310858 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-cjbr8" event={"ID":"47906d66-a8ce-445d-a71c-63f5bcfb6902","Type":"ContainerDied","Data":"5efa3385242c108002b465bb996d5a41566b975445a0b6bcc8e95cffaf3f797f"} Feb 26 16:10:00 crc kubenswrapper[4907]: I0226 16:10:00.153556 4907 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29535370-zccfm"] Feb 26 16:10:00 crc kubenswrapper[4907]: E0226 16:10:00.154341 4907 cpu_manager.go:410] 
"RemoveStaleState: removing container" podUID="63935def-a32a-4fc3-8d27-be330a4021f7" containerName="registry-server" Feb 26 16:10:00 crc kubenswrapper[4907]: I0226 16:10:00.154445 4907 state_mem.go:107] "Deleted CPUSet assignment" podUID="63935def-a32a-4fc3-8d27-be330a4021f7" containerName="registry-server" Feb 26 16:10:00 crc kubenswrapper[4907]: E0226 16:10:00.154527 4907 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="63935def-a32a-4fc3-8d27-be330a4021f7" containerName="extract-utilities" Feb 26 16:10:00 crc kubenswrapper[4907]: I0226 16:10:00.154620 4907 state_mem.go:107] "Deleted CPUSet assignment" podUID="63935def-a32a-4fc3-8d27-be330a4021f7" containerName="extract-utilities" Feb 26 16:10:00 crc kubenswrapper[4907]: E0226 16:10:00.154714 4907 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="63935def-a32a-4fc3-8d27-be330a4021f7" containerName="extract-content" Feb 26 16:10:00 crc kubenswrapper[4907]: I0226 16:10:00.154791 4907 state_mem.go:107] "Deleted CPUSet assignment" podUID="63935def-a32a-4fc3-8d27-be330a4021f7" containerName="extract-content" Feb 26 16:10:00 crc kubenswrapper[4907]: I0226 16:10:00.155092 4907 memory_manager.go:354] "RemoveStaleState removing state" podUID="63935def-a32a-4fc3-8d27-be330a4021f7" containerName="registry-server" Feb 26 16:10:00 crc kubenswrapper[4907]: I0226 16:10:00.155936 4907 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29535370-zccfm" Feb 26 16:10:00 crc kubenswrapper[4907]: I0226 16:10:00.159391 4907 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Feb 26 16:10:00 crc kubenswrapper[4907]: I0226 16:10:00.159522 4907 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Feb 26 16:10:00 crc kubenswrapper[4907]: I0226 16:10:00.159675 4907 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-n2mrp" Feb 26 16:10:00 crc kubenswrapper[4907]: I0226 16:10:00.177924 4907 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29535370-zccfm"] Feb 26 16:10:00 crc kubenswrapper[4907]: I0226 16:10:00.274964 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-f2kz2\" (UniqueName: \"kubernetes.io/projected/5f78c496-8bc4-46c1-921d-1cdd80305a4b-kube-api-access-f2kz2\") pod \"auto-csr-approver-29535370-zccfm\" (UID: \"5f78c496-8bc4-46c1-921d-1cdd80305a4b\") " pod="openshift-infra/auto-csr-approver-29535370-zccfm" Feb 26 16:10:00 crc kubenswrapper[4907]: I0226 16:10:00.378158 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-f2kz2\" (UniqueName: \"kubernetes.io/projected/5f78c496-8bc4-46c1-921d-1cdd80305a4b-kube-api-access-f2kz2\") pod \"auto-csr-approver-29535370-zccfm\" (UID: \"5f78c496-8bc4-46c1-921d-1cdd80305a4b\") " pod="openshift-infra/auto-csr-approver-29535370-zccfm" Feb 26 16:10:00 crc kubenswrapper[4907]: I0226 16:10:00.405487 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-f2kz2\" (UniqueName: \"kubernetes.io/projected/5f78c496-8bc4-46c1-921d-1cdd80305a4b-kube-api-access-f2kz2\") pod \"auto-csr-approver-29535370-zccfm\" (UID: \"5f78c496-8bc4-46c1-921d-1cdd80305a4b\") " 
pod="openshift-infra/auto-csr-approver-29535370-zccfm" Feb 26 16:10:00 crc kubenswrapper[4907]: I0226 16:10:00.489061 4907 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29535370-zccfm" Feb 26 16:10:00 crc kubenswrapper[4907]: I0226 16:10:00.890322 4907 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-cjbr8" Feb 26 16:10:01 crc kubenswrapper[4907]: I0226 16:10:01.007377 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/47906d66-a8ce-445d-a71c-63f5bcfb6902-ssh-key-openstack-edpm-ipam\") pod \"47906d66-a8ce-445d-a71c-63f5bcfb6902\" (UID: \"47906d66-a8ce-445d-a71c-63f5bcfb6902\") " Feb 26 16:10:01 crc kubenswrapper[4907]: I0226 16:10:01.007435 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/47906d66-a8ce-445d-a71c-63f5bcfb6902-inventory\") pod \"47906d66-a8ce-445d-a71c-63f5bcfb6902\" (UID: \"47906d66-a8ce-445d-a71c-63f5bcfb6902\") " Feb 26 16:10:01 crc kubenswrapper[4907]: I0226 16:10:01.007469 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"repo-setup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/47906d66-a8ce-445d-a71c-63f5bcfb6902-repo-setup-combined-ca-bundle\") pod \"47906d66-a8ce-445d-a71c-63f5bcfb6902\" (UID: \"47906d66-a8ce-445d-a71c-63f5bcfb6902\") " Feb 26 16:10:01 crc kubenswrapper[4907]: I0226 16:10:01.007500 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jk2bd\" (UniqueName: \"kubernetes.io/projected/47906d66-a8ce-445d-a71c-63f5bcfb6902-kube-api-access-jk2bd\") pod \"47906d66-a8ce-445d-a71c-63f5bcfb6902\" (UID: \"47906d66-a8ce-445d-a71c-63f5bcfb6902\") " Feb 26 16:10:01 crc kubenswrapper[4907]: I0226 16:10:01.012645 4907 
operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/47906d66-a8ce-445d-a71c-63f5bcfb6902-kube-api-access-jk2bd" (OuterVolumeSpecName: "kube-api-access-jk2bd") pod "47906d66-a8ce-445d-a71c-63f5bcfb6902" (UID: "47906d66-a8ce-445d-a71c-63f5bcfb6902"). InnerVolumeSpecName "kube-api-access-jk2bd". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 26 16:10:01 crc kubenswrapper[4907]: I0226 16:10:01.017472 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/47906d66-a8ce-445d-a71c-63f5bcfb6902-repo-setup-combined-ca-bundle" (OuterVolumeSpecName: "repo-setup-combined-ca-bundle") pod "47906d66-a8ce-445d-a71c-63f5bcfb6902" (UID: "47906d66-a8ce-445d-a71c-63f5bcfb6902"). InnerVolumeSpecName "repo-setup-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 26 16:10:01 crc kubenswrapper[4907]: I0226 16:10:01.037880 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/47906d66-a8ce-445d-a71c-63f5bcfb6902-inventory" (OuterVolumeSpecName: "inventory") pod "47906d66-a8ce-445d-a71c-63f5bcfb6902" (UID: "47906d66-a8ce-445d-a71c-63f5bcfb6902"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 26 16:10:01 crc kubenswrapper[4907]: I0226 16:10:01.047945 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/47906d66-a8ce-445d-a71c-63f5bcfb6902-ssh-key-openstack-edpm-ipam" (OuterVolumeSpecName: "ssh-key-openstack-edpm-ipam") pod "47906d66-a8ce-445d-a71c-63f5bcfb6902" (UID: "47906d66-a8ce-445d-a71c-63f5bcfb6902"). InnerVolumeSpecName "ssh-key-openstack-edpm-ipam". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 26 16:10:01 crc kubenswrapper[4907]: I0226 16:10:01.056819 4907 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29535370-zccfm"] Feb 26 16:10:01 crc kubenswrapper[4907]: W0226 16:10:01.059914 4907 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod5f78c496_8bc4_46c1_921d_1cdd80305a4b.slice/crio-d4df7ae68394b140e2eb2348d1e2038ab12740b2774bc447383049fd50a1595d WatchSource:0}: Error finding container d4df7ae68394b140e2eb2348d1e2038ab12740b2774bc447383049fd50a1595d: Status 404 returned error can't find the container with id d4df7ae68394b140e2eb2348d1e2038ab12740b2774bc447383049fd50a1595d Feb 26 16:10:01 crc kubenswrapper[4907]: I0226 16:10:01.110309 4907 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/47906d66-a8ce-445d-a71c-63f5bcfb6902-ssh-key-openstack-edpm-ipam\") on node \"crc\" DevicePath \"\"" Feb 26 16:10:01 crc kubenswrapper[4907]: I0226 16:10:01.110543 4907 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/47906d66-a8ce-445d-a71c-63f5bcfb6902-inventory\") on node \"crc\" DevicePath \"\"" Feb 26 16:10:01 crc kubenswrapper[4907]: I0226 16:10:01.110648 4907 reconciler_common.go:293] "Volume detached for volume \"repo-setup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/47906d66-a8ce-445d-a71c-63f5bcfb6902-repo-setup-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 26 16:10:01 crc kubenswrapper[4907]: I0226 16:10:01.110708 4907 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jk2bd\" (UniqueName: \"kubernetes.io/projected/47906d66-a8ce-445d-a71c-63f5bcfb6902-kube-api-access-jk2bd\") on node \"crc\" DevicePath \"\"" Feb 26 16:10:01 crc kubenswrapper[4907]: I0226 16:10:01.327975 4907 util.go:48] "No ready sandbox for pod 
can be found. Need to start a new one" pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-cjbr8" Feb 26 16:10:01 crc kubenswrapper[4907]: I0226 16:10:01.327974 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-cjbr8" event={"ID":"47906d66-a8ce-445d-a71c-63f5bcfb6902","Type":"ContainerDied","Data":"161dab2158fae3ab6549e49ed0fb73f7487d57dd34086f8e509cddaf767b916b"} Feb 26 16:10:01 crc kubenswrapper[4907]: I0226 16:10:01.328110 4907 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="161dab2158fae3ab6549e49ed0fb73f7487d57dd34086f8e509cddaf767b916b" Feb 26 16:10:01 crc kubenswrapper[4907]: I0226 16:10:01.329455 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29535370-zccfm" event={"ID":"5f78c496-8bc4-46c1-921d-1cdd80305a4b","Type":"ContainerStarted","Data":"d4df7ae68394b140e2eb2348d1e2038ab12740b2774bc447383049fd50a1595d"} Feb 26 16:10:01 crc kubenswrapper[4907]: I0226 16:10:01.414793 4907 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/redhat-edpm-deployment-openstack-edpm-ipam-nzx6v"] Feb 26 16:10:01 crc kubenswrapper[4907]: E0226 16:10:01.415214 4907 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="47906d66-a8ce-445d-a71c-63f5bcfb6902" containerName="repo-setup-edpm-deployment-openstack-edpm-ipam" Feb 26 16:10:01 crc kubenswrapper[4907]: I0226 16:10:01.415229 4907 state_mem.go:107] "Deleted CPUSet assignment" podUID="47906d66-a8ce-445d-a71c-63f5bcfb6902" containerName="repo-setup-edpm-deployment-openstack-edpm-ipam" Feb 26 16:10:01 crc kubenswrapper[4907]: I0226 16:10:01.416106 4907 memory_manager.go:354] "RemoveStaleState removing state" podUID="47906d66-a8ce-445d-a71c-63f5bcfb6902" containerName="repo-setup-edpm-deployment-openstack-edpm-ipam" Feb 26 16:10:01 crc kubenswrapper[4907]: I0226 16:10:01.416807 4907 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-nzx6v" Feb 26 16:10:01 crc kubenswrapper[4907]: I0226 16:10:01.418488 4907 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Feb 26 16:10:01 crc kubenswrapper[4907]: I0226 16:10:01.418880 4907 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Feb 26 16:10:01 crc kubenswrapper[4907]: I0226 16:10:01.418922 4907 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-57jxc" Feb 26 16:10:01 crc kubenswrapper[4907]: I0226 16:10:01.419312 4907 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Feb 26 16:10:01 crc kubenswrapper[4907]: I0226 16:10:01.443566 4907 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/redhat-edpm-deployment-openstack-edpm-ipam-nzx6v"] Feb 26 16:10:01 crc kubenswrapper[4907]: I0226 16:10:01.518986 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rh7vx\" (UniqueName: \"kubernetes.io/projected/744e4551-7f1b-4a7e-a907-2e2fd05053e1-kube-api-access-rh7vx\") pod \"redhat-edpm-deployment-openstack-edpm-ipam-nzx6v\" (UID: \"744e4551-7f1b-4a7e-a907-2e2fd05053e1\") " pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-nzx6v" Feb 26 16:10:01 crc kubenswrapper[4907]: I0226 16:10:01.519049 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/744e4551-7f1b-4a7e-a907-2e2fd05053e1-ssh-key-openstack-edpm-ipam\") pod \"redhat-edpm-deployment-openstack-edpm-ipam-nzx6v\" (UID: \"744e4551-7f1b-4a7e-a907-2e2fd05053e1\") " pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-nzx6v" Feb 26 16:10:01 crc kubenswrapper[4907]: I0226 16:10:01.519113 4907 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/744e4551-7f1b-4a7e-a907-2e2fd05053e1-inventory\") pod \"redhat-edpm-deployment-openstack-edpm-ipam-nzx6v\" (UID: \"744e4551-7f1b-4a7e-a907-2e2fd05053e1\") " pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-nzx6v" Feb 26 16:10:01 crc kubenswrapper[4907]: I0226 16:10:01.621043 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/744e4551-7f1b-4a7e-a907-2e2fd05053e1-inventory\") pod \"redhat-edpm-deployment-openstack-edpm-ipam-nzx6v\" (UID: \"744e4551-7f1b-4a7e-a907-2e2fd05053e1\") " pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-nzx6v" Feb 26 16:10:01 crc kubenswrapper[4907]: I0226 16:10:01.621175 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rh7vx\" (UniqueName: \"kubernetes.io/projected/744e4551-7f1b-4a7e-a907-2e2fd05053e1-kube-api-access-rh7vx\") pod \"redhat-edpm-deployment-openstack-edpm-ipam-nzx6v\" (UID: \"744e4551-7f1b-4a7e-a907-2e2fd05053e1\") " pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-nzx6v" Feb 26 16:10:01 crc kubenswrapper[4907]: I0226 16:10:01.621214 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/744e4551-7f1b-4a7e-a907-2e2fd05053e1-ssh-key-openstack-edpm-ipam\") pod \"redhat-edpm-deployment-openstack-edpm-ipam-nzx6v\" (UID: \"744e4551-7f1b-4a7e-a907-2e2fd05053e1\") " pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-nzx6v" Feb 26 16:10:01 crc kubenswrapper[4907]: I0226 16:10:01.625805 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/744e4551-7f1b-4a7e-a907-2e2fd05053e1-inventory\") pod \"redhat-edpm-deployment-openstack-edpm-ipam-nzx6v\" (UID: 
\"744e4551-7f1b-4a7e-a907-2e2fd05053e1\") " pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-nzx6v" Feb 26 16:10:01 crc kubenswrapper[4907]: I0226 16:10:01.656278 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/744e4551-7f1b-4a7e-a907-2e2fd05053e1-ssh-key-openstack-edpm-ipam\") pod \"redhat-edpm-deployment-openstack-edpm-ipam-nzx6v\" (UID: \"744e4551-7f1b-4a7e-a907-2e2fd05053e1\") " pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-nzx6v" Feb 26 16:10:01 crc kubenswrapper[4907]: I0226 16:10:01.673317 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rh7vx\" (UniqueName: \"kubernetes.io/projected/744e4551-7f1b-4a7e-a907-2e2fd05053e1-kube-api-access-rh7vx\") pod \"redhat-edpm-deployment-openstack-edpm-ipam-nzx6v\" (UID: \"744e4551-7f1b-4a7e-a907-2e2fd05053e1\") " pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-nzx6v" Feb 26 16:10:01 crc kubenswrapper[4907]: I0226 16:10:01.789775 4907 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-nzx6v" Feb 26 16:10:02 crc kubenswrapper[4907]: W0226 16:10:02.360664 4907 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod744e4551_7f1b_4a7e_a907_2e2fd05053e1.slice/crio-04e213e8539d3b231ed1e8981b667ca7ff81e54999ca84482b247a414d6f6813 WatchSource:0}: Error finding container 04e213e8539d3b231ed1e8981b667ca7ff81e54999ca84482b247a414d6f6813: Status 404 returned error can't find the container with id 04e213e8539d3b231ed1e8981b667ca7ff81e54999ca84482b247a414d6f6813 Feb 26 16:10:02 crc kubenswrapper[4907]: I0226 16:10:02.364308 4907 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/redhat-edpm-deployment-openstack-edpm-ipam-nzx6v"] Feb 26 16:10:03 crc kubenswrapper[4907]: I0226 16:10:03.127195 4907 scope.go:117] "RemoveContainer" containerID="b46bef3acd92cfa3cb8f5894a729a1bb1795fbc69b7b7c5835186a0b609a6e46" Feb 26 16:10:03 crc kubenswrapper[4907]: E0226 16:10:03.127771 4907 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-v5ng6_openshift-machine-config-operator(917eebf3-db36-47b8-af0a-b80d042fddab)\"" pod="openshift-machine-config-operator/machine-config-daemon-v5ng6" podUID="917eebf3-db36-47b8-af0a-b80d042fddab" Feb 26 16:10:03 crc kubenswrapper[4907]: I0226 16:10:03.349499 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-nzx6v" event={"ID":"744e4551-7f1b-4a7e-a907-2e2fd05053e1","Type":"ContainerStarted","Data":"04e213e8539d3b231ed1e8981b667ca7ff81e54999ca84482b247a414d6f6813"} Feb 26 16:10:03 crc kubenswrapper[4907]: I0226 16:10:03.351313 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29535370-zccfm" 
event={"ID":"5f78c496-8bc4-46c1-921d-1cdd80305a4b","Type":"ContainerStarted","Data":"6ebcd10ca8374898bddc79455029b3e3d94bf890a8b3f68c17e5b18a05348826"} Feb 26 16:10:03 crc kubenswrapper[4907]: I0226 16:10:03.365343 4907 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-infra/auto-csr-approver-29535370-zccfm" podStartSLOduration=1.760578296 podStartE2EDuration="3.365321692s" podCreationTimestamp="2026-02-26 16:10:00 +0000 UTC" firstStartedPulling="2026-02-26 16:10:01.062530443 +0000 UTC m=+1663.581092292" lastFinishedPulling="2026-02-26 16:10:02.667273839 +0000 UTC m=+1665.185835688" observedRunningTime="2026-02-26 16:10:03.362345629 +0000 UTC m=+1665.880907488" watchObservedRunningTime="2026-02-26 16:10:03.365321692 +0000 UTC m=+1665.883883541" Feb 26 16:10:04 crc kubenswrapper[4907]: I0226 16:10:04.362952 4907 generic.go:334] "Generic (PLEG): container finished" podID="5f78c496-8bc4-46c1-921d-1cdd80305a4b" containerID="6ebcd10ca8374898bddc79455029b3e3d94bf890a8b3f68c17e5b18a05348826" exitCode=0 Feb 26 16:10:04 crc kubenswrapper[4907]: I0226 16:10:04.363044 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29535370-zccfm" event={"ID":"5f78c496-8bc4-46c1-921d-1cdd80305a4b","Type":"ContainerDied","Data":"6ebcd10ca8374898bddc79455029b3e3d94bf890a8b3f68c17e5b18a05348826"} Feb 26 16:10:05 crc kubenswrapper[4907]: I0226 16:10:05.374255 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-nzx6v" event={"ID":"744e4551-7f1b-4a7e-a907-2e2fd05053e1","Type":"ContainerStarted","Data":"830566899f2ec453041b47b4dc3282ffece059723c49007c1b621fec02a663a9"} Feb 26 16:10:05 crc kubenswrapper[4907]: I0226 16:10:05.447066 4907 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-nzx6v" podStartSLOduration=2.66971334 podStartE2EDuration="4.447039128s" 
podCreationTimestamp="2026-02-26 16:10:01 +0000 UTC" firstStartedPulling="2026-02-26 16:10:02.363526027 +0000 UTC m=+1664.882087876" lastFinishedPulling="2026-02-26 16:10:04.140851815 +0000 UTC m=+1666.659413664" observedRunningTime="2026-02-26 16:10:05.424726628 +0000 UTC m=+1667.943288487" watchObservedRunningTime="2026-02-26 16:10:05.447039128 +0000 UTC m=+1667.965600997" Feb 26 16:10:05 crc kubenswrapper[4907]: I0226 16:10:05.778738 4907 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29535370-zccfm" Feb 26 16:10:05 crc kubenswrapper[4907]: I0226 16:10:05.921336 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-f2kz2\" (UniqueName: \"kubernetes.io/projected/5f78c496-8bc4-46c1-921d-1cdd80305a4b-kube-api-access-f2kz2\") pod \"5f78c496-8bc4-46c1-921d-1cdd80305a4b\" (UID: \"5f78c496-8bc4-46c1-921d-1cdd80305a4b\") " Feb 26 16:10:05 crc kubenswrapper[4907]: I0226 16:10:05.927040 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5f78c496-8bc4-46c1-921d-1cdd80305a4b-kube-api-access-f2kz2" (OuterVolumeSpecName: "kube-api-access-f2kz2") pod "5f78c496-8bc4-46c1-921d-1cdd80305a4b" (UID: "5f78c496-8bc4-46c1-921d-1cdd80305a4b"). InnerVolumeSpecName "kube-api-access-f2kz2". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 26 16:10:06 crc kubenswrapper[4907]: I0226 16:10:06.023710 4907 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-f2kz2\" (UniqueName: \"kubernetes.io/projected/5f78c496-8bc4-46c1-921d-1cdd80305a4b-kube-api-access-f2kz2\") on node \"crc\" DevicePath \"\"" Feb 26 16:10:06 crc kubenswrapper[4907]: I0226 16:10:06.388691 4907 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29535370-zccfm" Feb 26 16:10:06 crc kubenswrapper[4907]: I0226 16:10:06.388703 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29535370-zccfm" event={"ID":"5f78c496-8bc4-46c1-921d-1cdd80305a4b","Type":"ContainerDied","Data":"d4df7ae68394b140e2eb2348d1e2038ab12740b2774bc447383049fd50a1595d"} Feb 26 16:10:06 crc kubenswrapper[4907]: I0226 16:10:06.388750 4907 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="d4df7ae68394b140e2eb2348d1e2038ab12740b2774bc447383049fd50a1595d" Feb 26 16:10:06 crc kubenswrapper[4907]: I0226 16:10:06.434384 4907 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29535364-t8qd8"] Feb 26 16:10:06 crc kubenswrapper[4907]: I0226 16:10:06.443736 4907 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29535364-t8qd8"] Feb 26 16:10:07 crc kubenswrapper[4907]: I0226 16:10:07.399808 4907 generic.go:334] "Generic (PLEG): container finished" podID="744e4551-7f1b-4a7e-a907-2e2fd05053e1" containerID="830566899f2ec453041b47b4dc3282ffece059723c49007c1b621fec02a663a9" exitCode=0 Feb 26 16:10:07 crc kubenswrapper[4907]: I0226 16:10:07.399922 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-nzx6v" event={"ID":"744e4551-7f1b-4a7e-a907-2e2fd05053e1","Type":"ContainerDied","Data":"830566899f2ec453041b47b4dc3282ffece059723c49007c1b621fec02a663a9"} Feb 26 16:10:08 crc kubenswrapper[4907]: I0226 16:10:08.140127 4907 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b2b66b18-ac41-4d84-9ae1-5900c27d0d7d" path="/var/lib/kubelet/pods/b2b66b18-ac41-4d84-9ae1-5900c27d0d7d/volumes" Feb 26 16:10:08 crc kubenswrapper[4907]: I0226 16:10:08.221179 4907 scope.go:117] "RemoveContainer" containerID="30b2bb90b711626ce57caa8880e3ecc1df500c89c700220a73f326eac4fdd679" Feb 26 
16:10:08 crc kubenswrapper[4907]: I0226 16:10:08.263793 4907 scope.go:117] "RemoveContainer" containerID="9cc7bb5362346ca08fbf3ac2c9729be8d3c6896f7dc05d217c7d67f0078f3f7c" Feb 26 16:10:08 crc kubenswrapper[4907]: I0226 16:10:08.298424 4907 scope.go:117] "RemoveContainer" containerID="60555399c60d59b9505adf79bd8540d91978a0ca1c4ac2cf50c79fea6ee3e31d" Feb 26 16:10:08 crc kubenswrapper[4907]: I0226 16:10:08.348602 4907 scope.go:117] "RemoveContainer" containerID="a1c12bb904185e9dd91c784f19d207ab94fd40089de7556c864ea01a85875ce2" Feb 26 16:10:08 crc kubenswrapper[4907]: I0226 16:10:08.440485 4907 scope.go:117] "RemoveContainer" containerID="0a5d6f60f71c3e5324ad26a0af022cce4cf805448af7ede8f99bf4081a825aaa" Feb 26 16:10:08 crc kubenswrapper[4907]: I0226 16:10:08.815566 4907 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-nzx6v" Feb 26 16:10:09 crc kubenswrapper[4907]: I0226 16:10:09.000439 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/744e4551-7f1b-4a7e-a907-2e2fd05053e1-ssh-key-openstack-edpm-ipam\") pod \"744e4551-7f1b-4a7e-a907-2e2fd05053e1\" (UID: \"744e4551-7f1b-4a7e-a907-2e2fd05053e1\") " Feb 26 16:10:09 crc kubenswrapper[4907]: I0226 16:10:09.000625 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/744e4551-7f1b-4a7e-a907-2e2fd05053e1-inventory\") pod \"744e4551-7f1b-4a7e-a907-2e2fd05053e1\" (UID: \"744e4551-7f1b-4a7e-a907-2e2fd05053e1\") " Feb 26 16:10:09 crc kubenswrapper[4907]: I0226 16:10:09.000953 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rh7vx\" (UniqueName: \"kubernetes.io/projected/744e4551-7f1b-4a7e-a907-2e2fd05053e1-kube-api-access-rh7vx\") pod \"744e4551-7f1b-4a7e-a907-2e2fd05053e1\" (UID: 
\"744e4551-7f1b-4a7e-a907-2e2fd05053e1\") " Feb 26 16:10:09 crc kubenswrapper[4907]: I0226 16:10:09.008217 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/744e4551-7f1b-4a7e-a907-2e2fd05053e1-kube-api-access-rh7vx" (OuterVolumeSpecName: "kube-api-access-rh7vx") pod "744e4551-7f1b-4a7e-a907-2e2fd05053e1" (UID: "744e4551-7f1b-4a7e-a907-2e2fd05053e1"). InnerVolumeSpecName "kube-api-access-rh7vx". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 26 16:10:09 crc kubenswrapper[4907]: I0226 16:10:09.033552 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/744e4551-7f1b-4a7e-a907-2e2fd05053e1-ssh-key-openstack-edpm-ipam" (OuterVolumeSpecName: "ssh-key-openstack-edpm-ipam") pod "744e4551-7f1b-4a7e-a907-2e2fd05053e1" (UID: "744e4551-7f1b-4a7e-a907-2e2fd05053e1"). InnerVolumeSpecName "ssh-key-openstack-edpm-ipam". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 26 16:10:09 crc kubenswrapper[4907]: I0226 16:10:09.043002 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/744e4551-7f1b-4a7e-a907-2e2fd05053e1-inventory" (OuterVolumeSpecName: "inventory") pod "744e4551-7f1b-4a7e-a907-2e2fd05053e1" (UID: "744e4551-7f1b-4a7e-a907-2e2fd05053e1"). InnerVolumeSpecName "inventory". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 26 16:10:09 crc kubenswrapper[4907]: I0226 16:10:09.103505 4907 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rh7vx\" (UniqueName: \"kubernetes.io/projected/744e4551-7f1b-4a7e-a907-2e2fd05053e1-kube-api-access-rh7vx\") on node \"crc\" DevicePath \"\"" Feb 26 16:10:09 crc kubenswrapper[4907]: I0226 16:10:09.103533 4907 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/744e4551-7f1b-4a7e-a907-2e2fd05053e1-ssh-key-openstack-edpm-ipam\") on node \"crc\" DevicePath \"\"" Feb 26 16:10:09 crc kubenswrapper[4907]: I0226 16:10:09.103543 4907 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/744e4551-7f1b-4a7e-a907-2e2fd05053e1-inventory\") on node \"crc\" DevicePath \"\"" Feb 26 16:10:09 crc kubenswrapper[4907]: I0226 16:10:09.459716 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-nzx6v" event={"ID":"744e4551-7f1b-4a7e-a907-2e2fd05053e1","Type":"ContainerDied","Data":"04e213e8539d3b231ed1e8981b667ca7ff81e54999ca84482b247a414d6f6813"} Feb 26 16:10:09 crc kubenswrapper[4907]: I0226 16:10:09.460075 4907 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="04e213e8539d3b231ed1e8981b667ca7ff81e54999ca84482b247a414d6f6813" Feb 26 16:10:09 crc kubenswrapper[4907]: I0226 16:10:09.459747 4907 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-nzx6v" Feb 26 16:10:09 crc kubenswrapper[4907]: I0226 16:10:09.659710 4907 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-s8jbj"] Feb 26 16:10:09 crc kubenswrapper[4907]: E0226 16:10:09.660096 4907 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="744e4551-7f1b-4a7e-a907-2e2fd05053e1" containerName="redhat-edpm-deployment-openstack-edpm-ipam" Feb 26 16:10:09 crc kubenswrapper[4907]: I0226 16:10:09.660112 4907 state_mem.go:107] "Deleted CPUSet assignment" podUID="744e4551-7f1b-4a7e-a907-2e2fd05053e1" containerName="redhat-edpm-deployment-openstack-edpm-ipam" Feb 26 16:10:09 crc kubenswrapper[4907]: E0226 16:10:09.660123 4907 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5f78c496-8bc4-46c1-921d-1cdd80305a4b" containerName="oc" Feb 26 16:10:09 crc kubenswrapper[4907]: I0226 16:10:09.660129 4907 state_mem.go:107] "Deleted CPUSet assignment" podUID="5f78c496-8bc4-46c1-921d-1cdd80305a4b" containerName="oc" Feb 26 16:10:09 crc kubenswrapper[4907]: I0226 16:10:09.660289 4907 memory_manager.go:354] "RemoveStaleState removing state" podUID="744e4551-7f1b-4a7e-a907-2e2fd05053e1" containerName="redhat-edpm-deployment-openstack-edpm-ipam" Feb 26 16:10:09 crc kubenswrapper[4907]: I0226 16:10:09.660301 4907 memory_manager.go:354] "RemoveStaleState removing state" podUID="5f78c496-8bc4-46c1-921d-1cdd80305a4b" containerName="oc" Feb 26 16:10:09 crc kubenswrapper[4907]: I0226 16:10:09.660875 4907 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-s8jbj" Feb 26 16:10:09 crc kubenswrapper[4907]: I0226 16:10:09.669363 4907 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-s8jbj"] Feb 26 16:10:09 crc kubenswrapper[4907]: I0226 16:10:09.675383 4907 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Feb 26 16:10:09 crc kubenswrapper[4907]: I0226 16:10:09.675576 4907 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-57jxc" Feb 26 16:10:09 crc kubenswrapper[4907]: I0226 16:10:09.675793 4907 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Feb 26 16:10:09 crc kubenswrapper[4907]: I0226 16:10:09.690151 4907 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Feb 26 16:10:09 crc kubenswrapper[4907]: I0226 16:10:09.817845 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/235c91d9-1679-4ab9-b8a3-87d7fd5f68cf-bootstrap-combined-ca-bundle\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-s8jbj\" (UID: \"235c91d9-1679-4ab9-b8a3-87d7fd5f68cf\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-s8jbj" Feb 26 16:10:09 crc kubenswrapper[4907]: I0226 16:10:09.817928 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/235c91d9-1679-4ab9-b8a3-87d7fd5f68cf-inventory\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-s8jbj\" (UID: \"235c91d9-1679-4ab9-b8a3-87d7fd5f68cf\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-s8jbj" Feb 26 16:10:09 crc kubenswrapper[4907]: I0226 16:10:09.817967 4907 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/235c91d9-1679-4ab9-b8a3-87d7fd5f68cf-ssh-key-openstack-edpm-ipam\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-s8jbj\" (UID: \"235c91d9-1679-4ab9-b8a3-87d7fd5f68cf\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-s8jbj" Feb 26 16:10:09 crc kubenswrapper[4907]: I0226 16:10:09.818005 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ht22h\" (UniqueName: \"kubernetes.io/projected/235c91d9-1679-4ab9-b8a3-87d7fd5f68cf-kube-api-access-ht22h\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-s8jbj\" (UID: \"235c91d9-1679-4ab9-b8a3-87d7fd5f68cf\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-s8jbj" Feb 26 16:10:09 crc kubenswrapper[4907]: I0226 16:10:09.919987 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ht22h\" (UniqueName: \"kubernetes.io/projected/235c91d9-1679-4ab9-b8a3-87d7fd5f68cf-kube-api-access-ht22h\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-s8jbj\" (UID: \"235c91d9-1679-4ab9-b8a3-87d7fd5f68cf\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-s8jbj" Feb 26 16:10:09 crc kubenswrapper[4907]: I0226 16:10:09.920148 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/235c91d9-1679-4ab9-b8a3-87d7fd5f68cf-bootstrap-combined-ca-bundle\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-s8jbj\" (UID: \"235c91d9-1679-4ab9-b8a3-87d7fd5f68cf\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-s8jbj" Feb 26 16:10:09 crc kubenswrapper[4907]: I0226 16:10:09.920222 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: 
\"kubernetes.io/secret/235c91d9-1679-4ab9-b8a3-87d7fd5f68cf-inventory\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-s8jbj\" (UID: \"235c91d9-1679-4ab9-b8a3-87d7fd5f68cf\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-s8jbj" Feb 26 16:10:09 crc kubenswrapper[4907]: I0226 16:10:09.920271 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/235c91d9-1679-4ab9-b8a3-87d7fd5f68cf-ssh-key-openstack-edpm-ipam\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-s8jbj\" (UID: \"235c91d9-1679-4ab9-b8a3-87d7fd5f68cf\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-s8jbj" Feb 26 16:10:09 crc kubenswrapper[4907]: I0226 16:10:09.924248 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/235c91d9-1679-4ab9-b8a3-87d7fd5f68cf-inventory\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-s8jbj\" (UID: \"235c91d9-1679-4ab9-b8a3-87d7fd5f68cf\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-s8jbj" Feb 26 16:10:09 crc kubenswrapper[4907]: I0226 16:10:09.924327 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/235c91d9-1679-4ab9-b8a3-87d7fd5f68cf-ssh-key-openstack-edpm-ipam\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-s8jbj\" (UID: \"235c91d9-1679-4ab9-b8a3-87d7fd5f68cf\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-s8jbj" Feb 26 16:10:09 crc kubenswrapper[4907]: I0226 16:10:09.926147 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/235c91d9-1679-4ab9-b8a3-87d7fd5f68cf-bootstrap-combined-ca-bundle\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-s8jbj\" (UID: \"235c91d9-1679-4ab9-b8a3-87d7fd5f68cf\") " 
pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-s8jbj" Feb 26 16:10:09 crc kubenswrapper[4907]: I0226 16:10:09.946478 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ht22h\" (UniqueName: \"kubernetes.io/projected/235c91d9-1679-4ab9-b8a3-87d7fd5f68cf-kube-api-access-ht22h\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-s8jbj\" (UID: \"235c91d9-1679-4ab9-b8a3-87d7fd5f68cf\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-s8jbj" Feb 26 16:10:09 crc kubenswrapper[4907]: I0226 16:10:09.986916 4907 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-s8jbj" Feb 26 16:10:10 crc kubenswrapper[4907]: I0226 16:10:10.527172 4907 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-s8jbj"] Feb 26 16:10:11 crc kubenswrapper[4907]: I0226 16:10:11.483236 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-s8jbj" event={"ID":"235c91d9-1679-4ab9-b8a3-87d7fd5f68cf","Type":"ContainerStarted","Data":"c5712179185e459f9b6fd5cfeb54cd8ae38b47ad6683527cba4533685502f5a3"} Feb 26 16:10:11 crc kubenswrapper[4907]: I0226 16:10:11.483830 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-s8jbj" event={"ID":"235c91d9-1679-4ab9-b8a3-87d7fd5f68cf","Type":"ContainerStarted","Data":"8e55e0a9d75adf3e67e0199c59c00528dbd044c0a01dc4a49d4c3b62bd7e8e56"} Feb 26 16:10:11 crc kubenswrapper[4907]: I0226 16:10:11.507708 4907 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-s8jbj" podStartSLOduration=1.951777517 podStartE2EDuration="2.507687489s" podCreationTimestamp="2026-02-26 16:10:09 +0000 UTC" firstStartedPulling="2026-02-26 16:10:10.53619092 +0000 UTC m=+1673.054752769" 
lastFinishedPulling="2026-02-26 16:10:11.092100842 +0000 UTC m=+1673.610662741" observedRunningTime="2026-02-26 16:10:11.503384973 +0000 UTC m=+1674.021946822" watchObservedRunningTime="2026-02-26 16:10:11.507687489 +0000 UTC m=+1674.026249338" Feb 26 16:10:18 crc kubenswrapper[4907]: I0226 16:10:18.133554 4907 scope.go:117] "RemoveContainer" containerID="b46bef3acd92cfa3cb8f5894a729a1bb1795fbc69b7b7c5835186a0b609a6e46" Feb 26 16:10:18 crc kubenswrapper[4907]: E0226 16:10:18.135950 4907 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-v5ng6_openshift-machine-config-operator(917eebf3-db36-47b8-af0a-b80d042fddab)\"" pod="openshift-machine-config-operator/machine-config-daemon-v5ng6" podUID="917eebf3-db36-47b8-af0a-b80d042fddab" Feb 26 16:10:30 crc kubenswrapper[4907]: I0226 16:10:30.126403 4907 scope.go:117] "RemoveContainer" containerID="b46bef3acd92cfa3cb8f5894a729a1bb1795fbc69b7b7c5835186a0b609a6e46" Feb 26 16:10:30 crc kubenswrapper[4907]: E0226 16:10:30.127093 4907 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-v5ng6_openshift-machine-config-operator(917eebf3-db36-47b8-af0a-b80d042fddab)\"" pod="openshift-machine-config-operator/machine-config-daemon-v5ng6" podUID="917eebf3-db36-47b8-af0a-b80d042fddab" Feb 26 16:10:41 crc kubenswrapper[4907]: I0226 16:10:41.485852 4907 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-8qm75"] Feb 26 16:10:41 crc kubenswrapper[4907]: I0226 16:10:41.489938 4907 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-8qm75" Feb 26 16:10:41 crc kubenswrapper[4907]: I0226 16:10:41.508616 4907 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-8qm75"] Feb 26 16:10:41 crc kubenswrapper[4907]: I0226 16:10:41.640490 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-b9rgk\" (UniqueName: \"kubernetes.io/projected/18b0a81d-4f4e-4a2e-9ff1-4738ad62950b-kube-api-access-b9rgk\") pod \"certified-operators-8qm75\" (UID: \"18b0a81d-4f4e-4a2e-9ff1-4738ad62950b\") " pod="openshift-marketplace/certified-operators-8qm75" Feb 26 16:10:41 crc kubenswrapper[4907]: I0226 16:10:41.640759 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/18b0a81d-4f4e-4a2e-9ff1-4738ad62950b-catalog-content\") pod \"certified-operators-8qm75\" (UID: \"18b0a81d-4f4e-4a2e-9ff1-4738ad62950b\") " pod="openshift-marketplace/certified-operators-8qm75" Feb 26 16:10:41 crc kubenswrapper[4907]: I0226 16:10:41.640804 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/18b0a81d-4f4e-4a2e-9ff1-4738ad62950b-utilities\") pod \"certified-operators-8qm75\" (UID: \"18b0a81d-4f4e-4a2e-9ff1-4738ad62950b\") " pod="openshift-marketplace/certified-operators-8qm75" Feb 26 16:10:41 crc kubenswrapper[4907]: I0226 16:10:41.742356 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-b9rgk\" (UniqueName: \"kubernetes.io/projected/18b0a81d-4f4e-4a2e-9ff1-4738ad62950b-kube-api-access-b9rgk\") pod \"certified-operators-8qm75\" (UID: \"18b0a81d-4f4e-4a2e-9ff1-4738ad62950b\") " pod="openshift-marketplace/certified-operators-8qm75" Feb 26 16:10:41 crc kubenswrapper[4907]: I0226 16:10:41.742582 4907 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/18b0a81d-4f4e-4a2e-9ff1-4738ad62950b-catalog-content\") pod \"certified-operators-8qm75\" (UID: \"18b0a81d-4f4e-4a2e-9ff1-4738ad62950b\") " pod="openshift-marketplace/certified-operators-8qm75" Feb 26 16:10:41 crc kubenswrapper[4907]: I0226 16:10:41.742647 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/18b0a81d-4f4e-4a2e-9ff1-4738ad62950b-utilities\") pod \"certified-operators-8qm75\" (UID: \"18b0a81d-4f4e-4a2e-9ff1-4738ad62950b\") " pod="openshift-marketplace/certified-operators-8qm75" Feb 26 16:10:41 crc kubenswrapper[4907]: I0226 16:10:41.743251 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/18b0a81d-4f4e-4a2e-9ff1-4738ad62950b-catalog-content\") pod \"certified-operators-8qm75\" (UID: \"18b0a81d-4f4e-4a2e-9ff1-4738ad62950b\") " pod="openshift-marketplace/certified-operators-8qm75" Feb 26 16:10:41 crc kubenswrapper[4907]: I0226 16:10:41.743331 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/18b0a81d-4f4e-4a2e-9ff1-4738ad62950b-utilities\") pod \"certified-operators-8qm75\" (UID: \"18b0a81d-4f4e-4a2e-9ff1-4738ad62950b\") " pod="openshift-marketplace/certified-operators-8qm75" Feb 26 16:10:41 crc kubenswrapper[4907]: I0226 16:10:41.767455 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-b9rgk\" (UniqueName: \"kubernetes.io/projected/18b0a81d-4f4e-4a2e-9ff1-4738ad62950b-kube-api-access-b9rgk\") pod \"certified-operators-8qm75\" (UID: \"18b0a81d-4f4e-4a2e-9ff1-4738ad62950b\") " pod="openshift-marketplace/certified-operators-8qm75" Feb 26 16:10:41 crc kubenswrapper[4907]: I0226 16:10:41.814137 4907 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-8qm75" Feb 26 16:10:42 crc kubenswrapper[4907]: I0226 16:10:42.128066 4907 scope.go:117] "RemoveContainer" containerID="b46bef3acd92cfa3cb8f5894a729a1bb1795fbc69b7b7c5835186a0b609a6e46" Feb 26 16:10:42 crc kubenswrapper[4907]: E0226 16:10:42.128982 4907 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-v5ng6_openshift-machine-config-operator(917eebf3-db36-47b8-af0a-b80d042fddab)\"" pod="openshift-machine-config-operator/machine-config-daemon-v5ng6" podUID="917eebf3-db36-47b8-af0a-b80d042fddab" Feb 26 16:10:42 crc kubenswrapper[4907]: I0226 16:10:42.269362 4907 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-8qm75"] Feb 26 16:10:42 crc kubenswrapper[4907]: I0226 16:10:42.750142 4907 generic.go:334] "Generic (PLEG): container finished" podID="18b0a81d-4f4e-4a2e-9ff1-4738ad62950b" containerID="4b78f14c5cc087a7b6a603e7fc787e9d7e6299cbdf29cdcf4af2399eb401d120" exitCode=0 Feb 26 16:10:42 crc kubenswrapper[4907]: I0226 16:10:42.750227 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-8qm75" event={"ID":"18b0a81d-4f4e-4a2e-9ff1-4738ad62950b","Type":"ContainerDied","Data":"4b78f14c5cc087a7b6a603e7fc787e9d7e6299cbdf29cdcf4af2399eb401d120"} Feb 26 16:10:42 crc kubenswrapper[4907]: I0226 16:10:42.750498 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-8qm75" event={"ID":"18b0a81d-4f4e-4a2e-9ff1-4738ad62950b","Type":"ContainerStarted","Data":"3b4782204cc54202f9b62db4a7e077b2349a662ab527171da7952bc2f9f76a1e"} Feb 26 16:10:44 crc kubenswrapper[4907]: I0226 16:10:44.776420 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-8qm75" 
event={"ID":"18b0a81d-4f4e-4a2e-9ff1-4738ad62950b","Type":"ContainerStarted","Data":"34b99538b9b7536ba1a7624f8bce8ad96b1e26ece2678076567a2d57ad1fbdf7"} Feb 26 16:10:46 crc kubenswrapper[4907]: I0226 16:10:46.795834 4907 generic.go:334] "Generic (PLEG): container finished" podID="18b0a81d-4f4e-4a2e-9ff1-4738ad62950b" containerID="34b99538b9b7536ba1a7624f8bce8ad96b1e26ece2678076567a2d57ad1fbdf7" exitCode=0 Feb 26 16:10:46 crc kubenswrapper[4907]: I0226 16:10:46.795898 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-8qm75" event={"ID":"18b0a81d-4f4e-4a2e-9ff1-4738ad62950b","Type":"ContainerDied","Data":"34b99538b9b7536ba1a7624f8bce8ad96b1e26ece2678076567a2d57ad1fbdf7"} Feb 26 16:10:48 crc kubenswrapper[4907]: I0226 16:10:48.815408 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-8qm75" event={"ID":"18b0a81d-4f4e-4a2e-9ff1-4738ad62950b","Type":"ContainerStarted","Data":"ff80d64f4f3856f1e28715ac5f515244d5b9c69677d1ccb43d69fcbaaaf2b63c"} Feb 26 16:10:48 crc kubenswrapper[4907]: I0226 16:10:48.840429 4907 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-8qm75" podStartSLOduration=2.055403083 podStartE2EDuration="7.840409934s" podCreationTimestamp="2026-02-26 16:10:41 +0000 UTC" firstStartedPulling="2026-02-26 16:10:42.752450708 +0000 UTC m=+1705.271012557" lastFinishedPulling="2026-02-26 16:10:48.537457549 +0000 UTC m=+1711.056019408" observedRunningTime="2026-02-26 16:10:48.830206683 +0000 UTC m=+1711.348768542" watchObservedRunningTime="2026-02-26 16:10:48.840409934 +0000 UTC m=+1711.358971803" Feb 26 16:10:51 crc kubenswrapper[4907]: I0226 16:10:51.815164 4907 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-8qm75" Feb 26 16:10:51 crc kubenswrapper[4907]: I0226 16:10:51.815843 4907 kubelet.go:2542] "SyncLoop (probe)" probe="startup" 
status="unhealthy" pod="openshift-marketplace/certified-operators-8qm75" Feb 26 16:10:52 crc kubenswrapper[4907]: I0226 16:10:52.871158 4907 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/certified-operators-8qm75" podUID="18b0a81d-4f4e-4a2e-9ff1-4738ad62950b" containerName="registry-server" probeResult="failure" output=< Feb 26 16:10:52 crc kubenswrapper[4907]: timeout: failed to connect service ":50051" within 1s Feb 26 16:10:52 crc kubenswrapper[4907]: > Feb 26 16:10:54 crc kubenswrapper[4907]: I0226 16:10:54.126772 4907 scope.go:117] "RemoveContainer" containerID="b46bef3acd92cfa3cb8f5894a729a1bb1795fbc69b7b7c5835186a0b609a6e46" Feb 26 16:10:54 crc kubenswrapper[4907]: E0226 16:10:54.127169 4907 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-v5ng6_openshift-machine-config-operator(917eebf3-db36-47b8-af0a-b80d042fddab)\"" pod="openshift-machine-config-operator/machine-config-daemon-v5ng6" podUID="917eebf3-db36-47b8-af0a-b80d042fddab" Feb 26 16:11:01 crc kubenswrapper[4907]: I0226 16:11:01.902367 4907 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-8qm75" Feb 26 16:11:02 crc kubenswrapper[4907]: I0226 16:11:02.018235 4907 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-8qm75" Feb 26 16:11:02 crc kubenswrapper[4907]: I0226 16:11:02.160286 4907 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-8qm75"] Feb 26 16:11:02 crc kubenswrapper[4907]: I0226 16:11:02.955336 4907 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-8qm75" podUID="18b0a81d-4f4e-4a2e-9ff1-4738ad62950b" containerName="registry-server" 
containerID="cri-o://ff80d64f4f3856f1e28715ac5f515244d5b9c69677d1ccb43d69fcbaaaf2b63c" gracePeriod=2 Feb 26 16:11:03 crc kubenswrapper[4907]: I0226 16:11:03.388768 4907 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-8qm75" Feb 26 16:11:03 crc kubenswrapper[4907]: I0226 16:11:03.578366 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-b9rgk\" (UniqueName: \"kubernetes.io/projected/18b0a81d-4f4e-4a2e-9ff1-4738ad62950b-kube-api-access-b9rgk\") pod \"18b0a81d-4f4e-4a2e-9ff1-4738ad62950b\" (UID: \"18b0a81d-4f4e-4a2e-9ff1-4738ad62950b\") " Feb 26 16:11:03 crc kubenswrapper[4907]: I0226 16:11:03.578763 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/18b0a81d-4f4e-4a2e-9ff1-4738ad62950b-utilities\") pod \"18b0a81d-4f4e-4a2e-9ff1-4738ad62950b\" (UID: \"18b0a81d-4f4e-4a2e-9ff1-4738ad62950b\") " Feb 26 16:11:03 crc kubenswrapper[4907]: I0226 16:11:03.579014 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/18b0a81d-4f4e-4a2e-9ff1-4738ad62950b-catalog-content\") pod \"18b0a81d-4f4e-4a2e-9ff1-4738ad62950b\" (UID: \"18b0a81d-4f4e-4a2e-9ff1-4738ad62950b\") " Feb 26 16:11:03 crc kubenswrapper[4907]: I0226 16:11:03.579373 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/18b0a81d-4f4e-4a2e-9ff1-4738ad62950b-utilities" (OuterVolumeSpecName: "utilities") pod "18b0a81d-4f4e-4a2e-9ff1-4738ad62950b" (UID: "18b0a81d-4f4e-4a2e-9ff1-4738ad62950b"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 26 16:11:03 crc kubenswrapper[4907]: I0226 16:11:03.579826 4907 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/18b0a81d-4f4e-4a2e-9ff1-4738ad62950b-utilities\") on node \"crc\" DevicePath \"\"" Feb 26 16:11:03 crc kubenswrapper[4907]: I0226 16:11:03.592223 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/18b0a81d-4f4e-4a2e-9ff1-4738ad62950b-kube-api-access-b9rgk" (OuterVolumeSpecName: "kube-api-access-b9rgk") pod "18b0a81d-4f4e-4a2e-9ff1-4738ad62950b" (UID: "18b0a81d-4f4e-4a2e-9ff1-4738ad62950b"). InnerVolumeSpecName "kube-api-access-b9rgk". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 26 16:11:03 crc kubenswrapper[4907]: I0226 16:11:03.643288 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/18b0a81d-4f4e-4a2e-9ff1-4738ad62950b-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "18b0a81d-4f4e-4a2e-9ff1-4738ad62950b" (UID: "18b0a81d-4f4e-4a2e-9ff1-4738ad62950b"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 26 16:11:03 crc kubenswrapper[4907]: I0226 16:11:03.682953 4907 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-b9rgk\" (UniqueName: \"kubernetes.io/projected/18b0a81d-4f4e-4a2e-9ff1-4738ad62950b-kube-api-access-b9rgk\") on node \"crc\" DevicePath \"\"" Feb 26 16:11:03 crc kubenswrapper[4907]: I0226 16:11:03.682988 4907 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/18b0a81d-4f4e-4a2e-9ff1-4738ad62950b-catalog-content\") on node \"crc\" DevicePath \"\"" Feb 26 16:11:03 crc kubenswrapper[4907]: I0226 16:11:03.964727 4907 generic.go:334] "Generic (PLEG): container finished" podID="18b0a81d-4f4e-4a2e-9ff1-4738ad62950b" containerID="ff80d64f4f3856f1e28715ac5f515244d5b9c69677d1ccb43d69fcbaaaf2b63c" exitCode=0 Feb 26 16:11:03 crc kubenswrapper[4907]: I0226 16:11:03.964774 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-8qm75" event={"ID":"18b0a81d-4f4e-4a2e-9ff1-4738ad62950b","Type":"ContainerDied","Data":"ff80d64f4f3856f1e28715ac5f515244d5b9c69677d1ccb43d69fcbaaaf2b63c"} Feb 26 16:11:03 crc kubenswrapper[4907]: I0226 16:11:03.964788 4907 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-8qm75" Feb 26 16:11:03 crc kubenswrapper[4907]: I0226 16:11:03.964804 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-8qm75" event={"ID":"18b0a81d-4f4e-4a2e-9ff1-4738ad62950b","Type":"ContainerDied","Data":"3b4782204cc54202f9b62db4a7e077b2349a662ab527171da7952bc2f9f76a1e"} Feb 26 16:11:03 crc kubenswrapper[4907]: I0226 16:11:03.964825 4907 scope.go:117] "RemoveContainer" containerID="ff80d64f4f3856f1e28715ac5f515244d5b9c69677d1ccb43d69fcbaaaf2b63c" Feb 26 16:11:04 crc kubenswrapper[4907]: I0226 16:11:04.003434 4907 scope.go:117] "RemoveContainer" containerID="34b99538b9b7536ba1a7624f8bce8ad96b1e26ece2678076567a2d57ad1fbdf7" Feb 26 16:11:04 crc kubenswrapper[4907]: I0226 16:11:04.004697 4907 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-8qm75"] Feb 26 16:11:04 crc kubenswrapper[4907]: I0226 16:11:04.014112 4907 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-8qm75"] Feb 26 16:11:04 crc kubenswrapper[4907]: I0226 16:11:04.031305 4907 scope.go:117] "RemoveContainer" containerID="4b78f14c5cc087a7b6a603e7fc787e9d7e6299cbdf29cdcf4af2399eb401d120" Feb 26 16:11:04 crc kubenswrapper[4907]: I0226 16:11:04.064808 4907 scope.go:117] "RemoveContainer" containerID="ff80d64f4f3856f1e28715ac5f515244d5b9c69677d1ccb43d69fcbaaaf2b63c" Feb 26 16:11:04 crc kubenswrapper[4907]: E0226 16:11:04.065264 4907 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ff80d64f4f3856f1e28715ac5f515244d5b9c69677d1ccb43d69fcbaaaf2b63c\": container with ID starting with ff80d64f4f3856f1e28715ac5f515244d5b9c69677d1ccb43d69fcbaaaf2b63c not found: ID does not exist" containerID="ff80d64f4f3856f1e28715ac5f515244d5b9c69677d1ccb43d69fcbaaaf2b63c" Feb 26 16:11:04 crc kubenswrapper[4907]: I0226 16:11:04.065294 4907 
pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ff80d64f4f3856f1e28715ac5f515244d5b9c69677d1ccb43d69fcbaaaf2b63c"} err="failed to get container status \"ff80d64f4f3856f1e28715ac5f515244d5b9c69677d1ccb43d69fcbaaaf2b63c\": rpc error: code = NotFound desc = could not find container \"ff80d64f4f3856f1e28715ac5f515244d5b9c69677d1ccb43d69fcbaaaf2b63c\": container with ID starting with ff80d64f4f3856f1e28715ac5f515244d5b9c69677d1ccb43d69fcbaaaf2b63c not found: ID does not exist" Feb 26 16:11:04 crc kubenswrapper[4907]: I0226 16:11:04.065318 4907 scope.go:117] "RemoveContainer" containerID="34b99538b9b7536ba1a7624f8bce8ad96b1e26ece2678076567a2d57ad1fbdf7" Feb 26 16:11:04 crc kubenswrapper[4907]: E0226 16:11:04.065651 4907 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"34b99538b9b7536ba1a7624f8bce8ad96b1e26ece2678076567a2d57ad1fbdf7\": container with ID starting with 34b99538b9b7536ba1a7624f8bce8ad96b1e26ece2678076567a2d57ad1fbdf7 not found: ID does not exist" containerID="34b99538b9b7536ba1a7624f8bce8ad96b1e26ece2678076567a2d57ad1fbdf7" Feb 26 16:11:04 crc kubenswrapper[4907]: I0226 16:11:04.065673 4907 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"34b99538b9b7536ba1a7624f8bce8ad96b1e26ece2678076567a2d57ad1fbdf7"} err="failed to get container status \"34b99538b9b7536ba1a7624f8bce8ad96b1e26ece2678076567a2d57ad1fbdf7\": rpc error: code = NotFound desc = could not find container \"34b99538b9b7536ba1a7624f8bce8ad96b1e26ece2678076567a2d57ad1fbdf7\": container with ID starting with 34b99538b9b7536ba1a7624f8bce8ad96b1e26ece2678076567a2d57ad1fbdf7 not found: ID does not exist" Feb 26 16:11:04 crc kubenswrapper[4907]: I0226 16:11:04.065690 4907 scope.go:117] "RemoveContainer" containerID="4b78f14c5cc087a7b6a603e7fc787e9d7e6299cbdf29cdcf4af2399eb401d120" Feb 26 16:11:04 crc kubenswrapper[4907]: E0226 
16:11:04.065894 4907 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"4b78f14c5cc087a7b6a603e7fc787e9d7e6299cbdf29cdcf4af2399eb401d120\": container with ID starting with 4b78f14c5cc087a7b6a603e7fc787e9d7e6299cbdf29cdcf4af2399eb401d120 not found: ID does not exist" containerID="4b78f14c5cc087a7b6a603e7fc787e9d7e6299cbdf29cdcf4af2399eb401d120" Feb 26 16:11:04 crc kubenswrapper[4907]: I0226 16:11:04.065915 4907 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4b78f14c5cc087a7b6a603e7fc787e9d7e6299cbdf29cdcf4af2399eb401d120"} err="failed to get container status \"4b78f14c5cc087a7b6a603e7fc787e9d7e6299cbdf29cdcf4af2399eb401d120\": rpc error: code = NotFound desc = could not find container \"4b78f14c5cc087a7b6a603e7fc787e9d7e6299cbdf29cdcf4af2399eb401d120\": container with ID starting with 4b78f14c5cc087a7b6a603e7fc787e9d7e6299cbdf29cdcf4af2399eb401d120 not found: ID does not exist" Feb 26 16:11:04 crc kubenswrapper[4907]: I0226 16:11:04.138552 4907 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="18b0a81d-4f4e-4a2e-9ff1-4738ad62950b" path="/var/lib/kubelet/pods/18b0a81d-4f4e-4a2e-9ff1-4738ad62950b/volumes" Feb 26 16:11:07 crc kubenswrapper[4907]: I0226 16:11:07.126416 4907 scope.go:117] "RemoveContainer" containerID="b46bef3acd92cfa3cb8f5894a729a1bb1795fbc69b7b7c5835186a0b609a6e46" Feb 26 16:11:07 crc kubenswrapper[4907]: E0226 16:11:07.128054 4907 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-v5ng6_openshift-machine-config-operator(917eebf3-db36-47b8-af0a-b80d042fddab)\"" pod="openshift-machine-config-operator/machine-config-daemon-v5ng6" podUID="917eebf3-db36-47b8-af0a-b80d042fddab" Feb 26 16:11:22 crc kubenswrapper[4907]: I0226 16:11:22.126752 
4907 scope.go:117] "RemoveContainer" containerID="b46bef3acd92cfa3cb8f5894a729a1bb1795fbc69b7b7c5835186a0b609a6e46" Feb 26 16:11:22 crc kubenswrapper[4907]: E0226 16:11:22.127604 4907 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-v5ng6_openshift-machine-config-operator(917eebf3-db36-47b8-af0a-b80d042fddab)\"" pod="openshift-machine-config-operator/machine-config-daemon-v5ng6" podUID="917eebf3-db36-47b8-af0a-b80d042fddab" Feb 26 16:11:33 crc kubenswrapper[4907]: I0226 16:11:33.127062 4907 scope.go:117] "RemoveContainer" containerID="b46bef3acd92cfa3cb8f5894a729a1bb1795fbc69b7b7c5835186a0b609a6e46" Feb 26 16:11:33 crc kubenswrapper[4907]: E0226 16:11:33.127896 4907 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-v5ng6_openshift-machine-config-operator(917eebf3-db36-47b8-af0a-b80d042fddab)\"" pod="openshift-machine-config-operator/machine-config-daemon-v5ng6" podUID="917eebf3-db36-47b8-af0a-b80d042fddab" Feb 26 16:11:46 crc kubenswrapper[4907]: I0226 16:11:46.130893 4907 scope.go:117] "RemoveContainer" containerID="b46bef3acd92cfa3cb8f5894a729a1bb1795fbc69b7b7c5835186a0b609a6e46" Feb 26 16:11:46 crc kubenswrapper[4907]: E0226 16:11:46.131986 4907 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-v5ng6_openshift-machine-config-operator(917eebf3-db36-47b8-af0a-b80d042fddab)\"" pod="openshift-machine-config-operator/machine-config-daemon-v5ng6" podUID="917eebf3-db36-47b8-af0a-b80d042fddab" Feb 26 16:12:00 crc kubenswrapper[4907]: I0226 
16:12:00.154508 4907 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29535372-742br"] Feb 26 16:12:00 crc kubenswrapper[4907]: E0226 16:12:00.155474 4907 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="18b0a81d-4f4e-4a2e-9ff1-4738ad62950b" containerName="extract-content" Feb 26 16:12:00 crc kubenswrapper[4907]: I0226 16:12:00.155489 4907 state_mem.go:107] "Deleted CPUSet assignment" podUID="18b0a81d-4f4e-4a2e-9ff1-4738ad62950b" containerName="extract-content" Feb 26 16:12:00 crc kubenswrapper[4907]: E0226 16:12:00.155502 4907 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="18b0a81d-4f4e-4a2e-9ff1-4738ad62950b" containerName="extract-utilities" Feb 26 16:12:00 crc kubenswrapper[4907]: I0226 16:12:00.155510 4907 state_mem.go:107] "Deleted CPUSet assignment" podUID="18b0a81d-4f4e-4a2e-9ff1-4738ad62950b" containerName="extract-utilities" Feb 26 16:12:00 crc kubenswrapper[4907]: E0226 16:12:00.155545 4907 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="18b0a81d-4f4e-4a2e-9ff1-4738ad62950b" containerName="registry-server" Feb 26 16:12:00 crc kubenswrapper[4907]: I0226 16:12:00.155555 4907 state_mem.go:107] "Deleted CPUSet assignment" podUID="18b0a81d-4f4e-4a2e-9ff1-4738ad62950b" containerName="registry-server" Feb 26 16:12:00 crc kubenswrapper[4907]: I0226 16:12:00.155780 4907 memory_manager.go:354] "RemoveStaleState removing state" podUID="18b0a81d-4f4e-4a2e-9ff1-4738ad62950b" containerName="registry-server" Feb 26 16:12:00 crc kubenswrapper[4907]: I0226 16:12:00.156516 4907 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29535372-742br" Feb 26 16:12:00 crc kubenswrapper[4907]: I0226 16:12:00.159375 4907 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Feb 26 16:12:00 crc kubenswrapper[4907]: I0226 16:12:00.159377 4907 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-n2mrp" Feb 26 16:12:00 crc kubenswrapper[4907]: I0226 16:12:00.159461 4907 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Feb 26 16:12:00 crc kubenswrapper[4907]: I0226 16:12:00.164871 4907 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29535372-742br"] Feb 26 16:12:00 crc kubenswrapper[4907]: I0226 16:12:00.322091 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9mllv\" (UniqueName: \"kubernetes.io/projected/6496c1fc-cf88-488c-bcf6-5bae57ca88bf-kube-api-access-9mllv\") pod \"auto-csr-approver-29535372-742br\" (UID: \"6496c1fc-cf88-488c-bcf6-5bae57ca88bf\") " pod="openshift-infra/auto-csr-approver-29535372-742br" Feb 26 16:12:00 crc kubenswrapper[4907]: I0226 16:12:00.424494 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9mllv\" (UniqueName: \"kubernetes.io/projected/6496c1fc-cf88-488c-bcf6-5bae57ca88bf-kube-api-access-9mllv\") pod \"auto-csr-approver-29535372-742br\" (UID: \"6496c1fc-cf88-488c-bcf6-5bae57ca88bf\") " pod="openshift-infra/auto-csr-approver-29535372-742br" Feb 26 16:12:00 crc kubenswrapper[4907]: I0226 16:12:00.471176 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9mllv\" (UniqueName: \"kubernetes.io/projected/6496c1fc-cf88-488c-bcf6-5bae57ca88bf-kube-api-access-9mllv\") pod \"auto-csr-approver-29535372-742br\" (UID: \"6496c1fc-cf88-488c-bcf6-5bae57ca88bf\") " 
pod="openshift-infra/auto-csr-approver-29535372-742br" Feb 26 16:12:00 crc kubenswrapper[4907]: I0226 16:12:00.482575 4907 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29535372-742br" Feb 26 16:12:00 crc kubenswrapper[4907]: I0226 16:12:00.925003 4907 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29535372-742br"] Feb 26 16:12:01 crc kubenswrapper[4907]: I0226 16:12:01.126902 4907 scope.go:117] "RemoveContainer" containerID="b46bef3acd92cfa3cb8f5894a729a1bb1795fbc69b7b7c5835186a0b609a6e46" Feb 26 16:12:01 crc kubenswrapper[4907]: E0226 16:12:01.127181 4907 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-v5ng6_openshift-machine-config-operator(917eebf3-db36-47b8-af0a-b80d042fddab)\"" pod="openshift-machine-config-operator/machine-config-daemon-v5ng6" podUID="917eebf3-db36-47b8-af0a-b80d042fddab" Feb 26 16:12:01 crc kubenswrapper[4907]: I0226 16:12:01.544632 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29535372-742br" event={"ID":"6496c1fc-cf88-488c-bcf6-5bae57ca88bf","Type":"ContainerStarted","Data":"abef9ca2b1f334cb4af6319cf65cd8d167fea6253456d73c6b312a153b7188a3"} Feb 26 16:12:02 crc kubenswrapper[4907]: I0226 16:12:02.555647 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29535372-742br" event={"ID":"6496c1fc-cf88-488c-bcf6-5bae57ca88bf","Type":"ContainerStarted","Data":"4bc4e33d37c8a0f3832c0c5a604e6f722245320712bc4bac5d6a6004557b6be8"} Feb 26 16:12:02 crc kubenswrapper[4907]: I0226 16:12:02.572483 4907 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-infra/auto-csr-approver-29535372-742br" podStartSLOduration=1.469689019 
podStartE2EDuration="2.572467206s" podCreationTimestamp="2026-02-26 16:12:00 +0000 UTC" firstStartedPulling="2026-02-26 16:12:00.922868123 +0000 UTC m=+1783.441429972" lastFinishedPulling="2026-02-26 16:12:02.0256463 +0000 UTC m=+1784.544208159" observedRunningTime="2026-02-26 16:12:02.570421026 +0000 UTC m=+1785.088982875" watchObservedRunningTime="2026-02-26 16:12:02.572467206 +0000 UTC m=+1785.091029055" Feb 26 16:12:03 crc kubenswrapper[4907]: I0226 16:12:03.567456 4907 generic.go:334] "Generic (PLEG): container finished" podID="6496c1fc-cf88-488c-bcf6-5bae57ca88bf" containerID="4bc4e33d37c8a0f3832c0c5a604e6f722245320712bc4bac5d6a6004557b6be8" exitCode=0 Feb 26 16:12:03 crc kubenswrapper[4907]: I0226 16:12:03.567517 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29535372-742br" event={"ID":"6496c1fc-cf88-488c-bcf6-5bae57ca88bf","Type":"ContainerDied","Data":"4bc4e33d37c8a0f3832c0c5a604e6f722245320712bc4bac5d6a6004557b6be8"} Feb 26 16:12:04 crc kubenswrapper[4907]: I0226 16:12:04.877133 4907 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29535372-742br" Feb 26 16:12:04 crc kubenswrapper[4907]: I0226 16:12:04.912939 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9mllv\" (UniqueName: \"kubernetes.io/projected/6496c1fc-cf88-488c-bcf6-5bae57ca88bf-kube-api-access-9mllv\") pod \"6496c1fc-cf88-488c-bcf6-5bae57ca88bf\" (UID: \"6496c1fc-cf88-488c-bcf6-5bae57ca88bf\") " Feb 26 16:12:04 crc kubenswrapper[4907]: I0226 16:12:04.937974 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6496c1fc-cf88-488c-bcf6-5bae57ca88bf-kube-api-access-9mllv" (OuterVolumeSpecName: "kube-api-access-9mllv") pod "6496c1fc-cf88-488c-bcf6-5bae57ca88bf" (UID: "6496c1fc-cf88-488c-bcf6-5bae57ca88bf"). InnerVolumeSpecName "kube-api-access-9mllv". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 26 16:12:05 crc kubenswrapper[4907]: I0226 16:12:05.014811 4907 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9mllv\" (UniqueName: \"kubernetes.io/projected/6496c1fc-cf88-488c-bcf6-5bae57ca88bf-kube-api-access-9mllv\") on node \"crc\" DevicePath \"\"" Feb 26 16:12:05 crc kubenswrapper[4907]: I0226 16:12:05.587931 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29535372-742br" event={"ID":"6496c1fc-cf88-488c-bcf6-5bae57ca88bf","Type":"ContainerDied","Data":"abef9ca2b1f334cb4af6319cf65cd8d167fea6253456d73c6b312a153b7188a3"} Feb 26 16:12:05 crc kubenswrapper[4907]: I0226 16:12:05.587973 4907 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29535372-742br" Feb 26 16:12:05 crc kubenswrapper[4907]: I0226 16:12:05.587978 4907 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="abef9ca2b1f334cb4af6319cf65cd8d167fea6253456d73c6b312a153b7188a3" Feb 26 16:12:05 crc kubenswrapper[4907]: I0226 16:12:05.639657 4907 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29535366-dqhh6"] Feb 26 16:12:05 crc kubenswrapper[4907]: I0226 16:12:05.647940 4907 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29535366-dqhh6"] Feb 26 16:12:06 crc kubenswrapper[4907]: I0226 16:12:06.140473 4907 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="023cbc5f-da0e-4a5e-bc63-18385f44d228" path="/var/lib/kubelet/pods/023cbc5f-da0e-4a5e-bc63-18385f44d228/volumes" Feb 26 16:12:13 crc kubenswrapper[4907]: I0226 16:12:13.127067 4907 scope.go:117] "RemoveContainer" containerID="b46bef3acd92cfa3cb8f5894a729a1bb1795fbc69b7b7c5835186a0b609a6e46" Feb 26 16:12:13 crc kubenswrapper[4907]: E0226 16:12:13.127775 4907 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to 
\"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-v5ng6_openshift-machine-config-operator(917eebf3-db36-47b8-af0a-b80d042fddab)\"" pod="openshift-machine-config-operator/machine-config-daemon-v5ng6" podUID="917eebf3-db36-47b8-af0a-b80d042fddab" Feb 26 16:12:26 crc kubenswrapper[4907]: I0226 16:12:26.127329 4907 scope.go:117] "RemoveContainer" containerID="b46bef3acd92cfa3cb8f5894a729a1bb1795fbc69b7b7c5835186a0b609a6e46" Feb 26 16:12:26 crc kubenswrapper[4907]: E0226 16:12:26.129583 4907 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-v5ng6_openshift-machine-config-operator(917eebf3-db36-47b8-af0a-b80d042fddab)\"" pod="openshift-machine-config-operator/machine-config-daemon-v5ng6" podUID="917eebf3-db36-47b8-af0a-b80d042fddab" Feb 26 16:12:41 crc kubenswrapper[4907]: I0226 16:12:41.126869 4907 scope.go:117] "RemoveContainer" containerID="b46bef3acd92cfa3cb8f5894a729a1bb1795fbc69b7b7c5835186a0b609a6e46" Feb 26 16:12:41 crc kubenswrapper[4907]: E0226 16:12:41.127560 4907 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-v5ng6_openshift-machine-config-operator(917eebf3-db36-47b8-af0a-b80d042fddab)\"" pod="openshift-machine-config-operator/machine-config-daemon-v5ng6" podUID="917eebf3-db36-47b8-af0a-b80d042fddab" Feb 26 16:12:52 crc kubenswrapper[4907]: I0226 16:12:52.127292 4907 scope.go:117] "RemoveContainer" containerID="b46bef3acd92cfa3cb8f5894a729a1bb1795fbc69b7b7c5835186a0b609a6e46" Feb 26 16:12:52 crc kubenswrapper[4907]: E0226 16:12:52.128203 4907 pod_workers.go:1301] "Error syncing pod, 
skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-v5ng6_openshift-machine-config-operator(917eebf3-db36-47b8-af0a-b80d042fddab)\"" pod="openshift-machine-config-operator/machine-config-daemon-v5ng6" podUID="917eebf3-db36-47b8-af0a-b80d042fddab" Feb 26 16:13:04 crc kubenswrapper[4907]: I0226 16:13:04.127306 4907 scope.go:117] "RemoveContainer" containerID="b46bef3acd92cfa3cb8f5894a729a1bb1795fbc69b7b7c5835186a0b609a6e46" Feb 26 16:13:04 crc kubenswrapper[4907]: E0226 16:13:04.128045 4907 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-v5ng6_openshift-machine-config-operator(917eebf3-db36-47b8-af0a-b80d042fddab)\"" pod="openshift-machine-config-operator/machine-config-daemon-v5ng6" podUID="917eebf3-db36-47b8-af0a-b80d042fddab" Feb 26 16:13:08 crc kubenswrapper[4907]: I0226 16:13:08.715922 4907 scope.go:117] "RemoveContainer" containerID="8e672af27d5f3faf809d9b7e50e114800481d22f3fccd9f003a26b0133dccd51" Feb 26 16:13:13 crc kubenswrapper[4907]: I0226 16:13:13.047361 4907 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-db-create-8plbn"] Feb 26 16:13:13 crc kubenswrapper[4907]: I0226 16:13:13.056037 4907 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-db-create-8plbn"] Feb 26 16:13:14 crc kubenswrapper[4907]: I0226 16:13:14.039515 4907 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-0813-account-create-update-rz8gs"] Feb 26 16:13:14 crc kubenswrapper[4907]: I0226 16:13:14.051376 4907 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/keystone-db-create-8hgqp"] Feb 26 16:13:14 crc kubenswrapper[4907]: I0226 16:13:14.062196 4907 kubelet.go:2437] "SyncLoop DELETE" 
source="api" pods=["openstack/keystone-f375-account-create-update-rgltk"] Feb 26 16:13:14 crc kubenswrapper[4907]: I0226 16:13:14.077571 4907 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/placement-db-create-4cbgw"] Feb 26 16:13:14 crc kubenswrapper[4907]: I0226 16:13:14.085501 4907 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-0813-account-create-update-rz8gs"] Feb 26 16:13:14 crc kubenswrapper[4907]: I0226 16:13:14.093851 4907 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/placement-d824-account-create-update-x9gtq"] Feb 26 16:13:14 crc kubenswrapper[4907]: I0226 16:13:14.104251 4907 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/placement-db-create-4cbgw"] Feb 26 16:13:14 crc kubenswrapper[4907]: I0226 16:13:14.114469 4907 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/keystone-db-create-8hgqp"] Feb 26 16:13:14 crc kubenswrapper[4907]: I0226 16:13:14.123544 4907 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/keystone-f375-account-create-update-rgltk"] Feb 26 16:13:14 crc kubenswrapper[4907]: I0226 16:13:14.141726 4907 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3ad824a7-419c-443b-8278-a4e806370720" path="/var/lib/kubelet/pods/3ad824a7-419c-443b-8278-a4e806370720/volumes" Feb 26 16:13:14 crc kubenswrapper[4907]: I0226 16:13:14.142415 4907 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3df80d3d-7d86-44dd-a35a-3e9d9d435435" path="/var/lib/kubelet/pods/3df80d3d-7d86-44dd-a35a-3e9d9d435435/volumes" Feb 26 16:13:14 crc kubenswrapper[4907]: I0226 16:13:14.143087 4907 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="56d62e78-4aa2-4ae7-84dd-99e58e0deb68" path="/var/lib/kubelet/pods/56d62e78-4aa2-4ae7-84dd-99e58e0deb68/volumes" Feb 26 16:13:14 crc kubenswrapper[4907]: I0226 16:13:14.145635 4907 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" 
podUID="841c55f4-98a9-44dd-bfc7-018ad4a44528" path="/var/lib/kubelet/pods/841c55f4-98a9-44dd-bfc7-018ad4a44528/volumes" Feb 26 16:13:14 crc kubenswrapper[4907]: I0226 16:13:14.146991 4907 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8aea8ea2-97bd-4315-9335-8fbe73ab8ec2" path="/var/lib/kubelet/pods/8aea8ea2-97bd-4315-9335-8fbe73ab8ec2/volumes" Feb 26 16:13:14 crc kubenswrapper[4907]: I0226 16:13:14.148242 4907 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/placement-d824-account-create-update-x9gtq"] Feb 26 16:13:15 crc kubenswrapper[4907]: I0226 16:13:15.269262 4907 generic.go:334] "Generic (PLEG): container finished" podID="235c91d9-1679-4ab9-b8a3-87d7fd5f68cf" containerID="c5712179185e459f9b6fd5cfeb54cd8ae38b47ad6683527cba4533685502f5a3" exitCode=0 Feb 26 16:13:15 crc kubenswrapper[4907]: I0226 16:13:15.269316 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-s8jbj" event={"ID":"235c91d9-1679-4ab9-b8a3-87d7fd5f68cf","Type":"ContainerDied","Data":"c5712179185e459f9b6fd5cfeb54cd8ae38b47ad6683527cba4533685502f5a3"} Feb 26 16:13:16 crc kubenswrapper[4907]: I0226 16:13:16.146340 4907 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3204495d-0bdb-45bd-b2df-af20221366fd" path="/var/lib/kubelet/pods/3204495d-0bdb-45bd-b2df-af20221366fd/volumes" Feb 26 16:13:16 crc kubenswrapper[4907]: I0226 16:13:16.673493 4907 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-s8jbj" Feb 26 16:13:16 crc kubenswrapper[4907]: I0226 16:13:16.845009 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/235c91d9-1679-4ab9-b8a3-87d7fd5f68cf-inventory\") pod \"235c91d9-1679-4ab9-b8a3-87d7fd5f68cf\" (UID: \"235c91d9-1679-4ab9-b8a3-87d7fd5f68cf\") " Feb 26 16:13:16 crc kubenswrapper[4907]: I0226 16:13:16.845088 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ht22h\" (UniqueName: \"kubernetes.io/projected/235c91d9-1679-4ab9-b8a3-87d7fd5f68cf-kube-api-access-ht22h\") pod \"235c91d9-1679-4ab9-b8a3-87d7fd5f68cf\" (UID: \"235c91d9-1679-4ab9-b8a3-87d7fd5f68cf\") " Feb 26 16:13:16 crc kubenswrapper[4907]: I0226 16:13:16.845138 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/235c91d9-1679-4ab9-b8a3-87d7fd5f68cf-bootstrap-combined-ca-bundle\") pod \"235c91d9-1679-4ab9-b8a3-87d7fd5f68cf\" (UID: \"235c91d9-1679-4ab9-b8a3-87d7fd5f68cf\") " Feb 26 16:13:16 crc kubenswrapper[4907]: I0226 16:13:16.845205 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/235c91d9-1679-4ab9-b8a3-87d7fd5f68cf-ssh-key-openstack-edpm-ipam\") pod \"235c91d9-1679-4ab9-b8a3-87d7fd5f68cf\" (UID: \"235c91d9-1679-4ab9-b8a3-87d7fd5f68cf\") " Feb 26 16:13:16 crc kubenswrapper[4907]: I0226 16:13:16.851468 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/235c91d9-1679-4ab9-b8a3-87d7fd5f68cf-kube-api-access-ht22h" (OuterVolumeSpecName: "kube-api-access-ht22h") pod "235c91d9-1679-4ab9-b8a3-87d7fd5f68cf" (UID: "235c91d9-1679-4ab9-b8a3-87d7fd5f68cf"). InnerVolumeSpecName "kube-api-access-ht22h". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 26 16:13:16 crc kubenswrapper[4907]: I0226 16:13:16.852315 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/235c91d9-1679-4ab9-b8a3-87d7fd5f68cf-bootstrap-combined-ca-bundle" (OuterVolumeSpecName: "bootstrap-combined-ca-bundle") pod "235c91d9-1679-4ab9-b8a3-87d7fd5f68cf" (UID: "235c91d9-1679-4ab9-b8a3-87d7fd5f68cf"). InnerVolumeSpecName "bootstrap-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 26 16:13:16 crc kubenswrapper[4907]: I0226 16:13:16.881822 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/235c91d9-1679-4ab9-b8a3-87d7fd5f68cf-ssh-key-openstack-edpm-ipam" (OuterVolumeSpecName: "ssh-key-openstack-edpm-ipam") pod "235c91d9-1679-4ab9-b8a3-87d7fd5f68cf" (UID: "235c91d9-1679-4ab9-b8a3-87d7fd5f68cf"). InnerVolumeSpecName "ssh-key-openstack-edpm-ipam". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 26 16:13:16 crc kubenswrapper[4907]: I0226 16:13:16.883949 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/235c91d9-1679-4ab9-b8a3-87d7fd5f68cf-inventory" (OuterVolumeSpecName: "inventory") pod "235c91d9-1679-4ab9-b8a3-87d7fd5f68cf" (UID: "235c91d9-1679-4ab9-b8a3-87d7fd5f68cf"). InnerVolumeSpecName "inventory". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 26 16:13:16 crc kubenswrapper[4907]: I0226 16:13:16.947766 4907 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/235c91d9-1679-4ab9-b8a3-87d7fd5f68cf-inventory\") on node \"crc\" DevicePath \"\"" Feb 26 16:13:16 crc kubenswrapper[4907]: I0226 16:13:16.947802 4907 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ht22h\" (UniqueName: \"kubernetes.io/projected/235c91d9-1679-4ab9-b8a3-87d7fd5f68cf-kube-api-access-ht22h\") on node \"crc\" DevicePath \"\"" Feb 26 16:13:16 crc kubenswrapper[4907]: I0226 16:13:16.947814 4907 reconciler_common.go:293] "Volume detached for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/235c91d9-1679-4ab9-b8a3-87d7fd5f68cf-bootstrap-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 26 16:13:16 crc kubenswrapper[4907]: I0226 16:13:16.947828 4907 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/235c91d9-1679-4ab9-b8a3-87d7fd5f68cf-ssh-key-openstack-edpm-ipam\") on node \"crc\" DevicePath \"\"" Feb 26 16:13:17 crc kubenswrapper[4907]: I0226 16:13:17.127348 4907 scope.go:117] "RemoveContainer" containerID="b46bef3acd92cfa3cb8f5894a729a1bb1795fbc69b7b7c5835186a0b609a6e46" Feb 26 16:13:17 crc kubenswrapper[4907]: E0226 16:13:17.127940 4907 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-v5ng6_openshift-machine-config-operator(917eebf3-db36-47b8-af0a-b80d042fddab)\"" pod="openshift-machine-config-operator/machine-config-daemon-v5ng6" podUID="917eebf3-db36-47b8-af0a-b80d042fddab" Feb 26 16:13:17 crc kubenswrapper[4907]: I0226 16:13:17.290901 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-s8jbj" event={"ID":"235c91d9-1679-4ab9-b8a3-87d7fd5f68cf","Type":"ContainerDied","Data":"8e55e0a9d75adf3e67e0199c59c00528dbd044c0a01dc4a49d4c3b62bd7e8e56"} Feb 26 16:13:17 crc kubenswrapper[4907]: I0226 16:13:17.290974 4907 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="8e55e0a9d75adf3e67e0199c59c00528dbd044c0a01dc4a49d4c3b62bd7e8e56" Feb 26 16:13:17 crc kubenswrapper[4907]: I0226 16:13:17.290927 4907 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-s8jbj" Feb 26 16:13:17 crc kubenswrapper[4907]: I0226 16:13:17.412222 4907 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/download-cache-edpm-deployment-openstack-edpm-ipam-kdc4r"] Feb 26 16:13:17 crc kubenswrapper[4907]: E0226 16:13:17.412850 4907 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="235c91d9-1679-4ab9-b8a3-87d7fd5f68cf" containerName="bootstrap-edpm-deployment-openstack-edpm-ipam" Feb 26 16:13:17 crc kubenswrapper[4907]: I0226 16:13:17.412883 4907 state_mem.go:107] "Deleted CPUSet assignment" podUID="235c91d9-1679-4ab9-b8a3-87d7fd5f68cf" containerName="bootstrap-edpm-deployment-openstack-edpm-ipam" Feb 26 16:13:17 crc kubenswrapper[4907]: E0226 16:13:17.412927 4907 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6496c1fc-cf88-488c-bcf6-5bae57ca88bf" containerName="oc" Feb 26 16:13:17 crc kubenswrapper[4907]: I0226 16:13:17.412939 4907 state_mem.go:107] "Deleted CPUSet assignment" podUID="6496c1fc-cf88-488c-bcf6-5bae57ca88bf" containerName="oc" Feb 26 16:13:17 crc kubenswrapper[4907]: I0226 16:13:17.413281 4907 memory_manager.go:354] "RemoveStaleState removing state" podUID="235c91d9-1679-4ab9-b8a3-87d7fd5f68cf" containerName="bootstrap-edpm-deployment-openstack-edpm-ipam" Feb 26 16:13:17 crc kubenswrapper[4907]: I0226 16:13:17.413314 4907 memory_manager.go:354] 
"RemoveStaleState removing state" podUID="6496c1fc-cf88-488c-bcf6-5bae57ca88bf" containerName="oc" Feb 26 16:13:17 crc kubenswrapper[4907]: I0226 16:13:17.414256 4907 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-kdc4r" Feb 26 16:13:17 crc kubenswrapper[4907]: I0226 16:13:17.424377 4907 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-57jxc" Feb 26 16:13:17 crc kubenswrapper[4907]: I0226 16:13:17.424707 4907 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Feb 26 16:13:17 crc kubenswrapper[4907]: I0226 16:13:17.424984 4907 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Feb 26 16:13:17 crc kubenswrapper[4907]: I0226 16:13:17.425190 4907 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Feb 26 16:13:17 crc kubenswrapper[4907]: I0226 16:13:17.427310 4907 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/download-cache-edpm-deployment-openstack-edpm-ipam-kdc4r"] Feb 26 16:13:17 crc kubenswrapper[4907]: I0226 16:13:17.461175 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-md4wj\" (UniqueName: \"kubernetes.io/projected/9c764e34-e690-4b9f-aae5-9ea7ccacd4fc-kube-api-access-md4wj\") pod \"download-cache-edpm-deployment-openstack-edpm-ipam-kdc4r\" (UID: \"9c764e34-e690-4b9f-aae5-9ea7ccacd4fc\") " pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-kdc4r" Feb 26 16:13:17 crc kubenswrapper[4907]: I0226 16:13:17.461238 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/9c764e34-e690-4b9f-aae5-9ea7ccacd4fc-inventory\") pod 
\"download-cache-edpm-deployment-openstack-edpm-ipam-kdc4r\" (UID: \"9c764e34-e690-4b9f-aae5-9ea7ccacd4fc\") " pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-kdc4r" Feb 26 16:13:17 crc kubenswrapper[4907]: I0226 16:13:17.461314 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/9c764e34-e690-4b9f-aae5-9ea7ccacd4fc-ssh-key-openstack-edpm-ipam\") pod \"download-cache-edpm-deployment-openstack-edpm-ipam-kdc4r\" (UID: \"9c764e34-e690-4b9f-aae5-9ea7ccacd4fc\") " pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-kdc4r" Feb 26 16:13:17 crc kubenswrapper[4907]: I0226 16:13:17.563687 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-md4wj\" (UniqueName: \"kubernetes.io/projected/9c764e34-e690-4b9f-aae5-9ea7ccacd4fc-kube-api-access-md4wj\") pod \"download-cache-edpm-deployment-openstack-edpm-ipam-kdc4r\" (UID: \"9c764e34-e690-4b9f-aae5-9ea7ccacd4fc\") " pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-kdc4r" Feb 26 16:13:17 crc kubenswrapper[4907]: I0226 16:13:17.564162 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/9c764e34-e690-4b9f-aae5-9ea7ccacd4fc-inventory\") pod \"download-cache-edpm-deployment-openstack-edpm-ipam-kdc4r\" (UID: \"9c764e34-e690-4b9f-aae5-9ea7ccacd4fc\") " pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-kdc4r" Feb 26 16:13:17 crc kubenswrapper[4907]: I0226 16:13:17.564980 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/9c764e34-e690-4b9f-aae5-9ea7ccacd4fc-ssh-key-openstack-edpm-ipam\") pod \"download-cache-edpm-deployment-openstack-edpm-ipam-kdc4r\" (UID: \"9c764e34-e690-4b9f-aae5-9ea7ccacd4fc\") " 
pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-kdc4r" Feb 26 16:13:17 crc kubenswrapper[4907]: I0226 16:13:17.568087 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/9c764e34-e690-4b9f-aae5-9ea7ccacd4fc-inventory\") pod \"download-cache-edpm-deployment-openstack-edpm-ipam-kdc4r\" (UID: \"9c764e34-e690-4b9f-aae5-9ea7ccacd4fc\") " pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-kdc4r" Feb 26 16:13:17 crc kubenswrapper[4907]: I0226 16:13:17.570222 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/9c764e34-e690-4b9f-aae5-9ea7ccacd4fc-ssh-key-openstack-edpm-ipam\") pod \"download-cache-edpm-deployment-openstack-edpm-ipam-kdc4r\" (UID: \"9c764e34-e690-4b9f-aae5-9ea7ccacd4fc\") " pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-kdc4r" Feb 26 16:13:17 crc kubenswrapper[4907]: I0226 16:13:17.580629 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-md4wj\" (UniqueName: \"kubernetes.io/projected/9c764e34-e690-4b9f-aae5-9ea7ccacd4fc-kube-api-access-md4wj\") pod \"download-cache-edpm-deployment-openstack-edpm-ipam-kdc4r\" (UID: \"9c764e34-e690-4b9f-aae5-9ea7ccacd4fc\") " pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-kdc4r" Feb 26 16:13:17 crc kubenswrapper[4907]: I0226 16:13:17.748479 4907 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-kdc4r" Feb 26 16:13:18 crc kubenswrapper[4907]: I0226 16:13:18.249570 4907 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/download-cache-edpm-deployment-openstack-edpm-ipam-kdc4r"] Feb 26 16:13:18 crc kubenswrapper[4907]: I0226 16:13:18.270377 4907 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Feb 26 16:13:18 crc kubenswrapper[4907]: I0226 16:13:18.302087 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-kdc4r" event={"ID":"9c764e34-e690-4b9f-aae5-9ea7ccacd4fc","Type":"ContainerStarted","Data":"a74e8c90000905bffe8c0a7a2a2cd6b77df7a16d3c2b8c01910958d8455d1c52"} Feb 26 16:13:18 crc kubenswrapper[4907]: I0226 16:13:18.740394 4907 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Feb 26 16:13:19 crc kubenswrapper[4907]: I0226 16:13:19.311222 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-kdc4r" event={"ID":"9c764e34-e690-4b9f-aae5-9ea7ccacd4fc","Type":"ContainerStarted","Data":"5ae3114214b762fe854be97d1bf5ae65b6f7b3c6ee5cfcfbdca658e0534bed16"} Feb 26 16:13:19 crc kubenswrapper[4907]: I0226 16:13:19.334703 4907 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-kdc4r" podStartSLOduration=1.8683157160000001 podStartE2EDuration="2.334683869s" podCreationTimestamp="2026-02-26 16:13:17 +0000 UTC" firstStartedPulling="2026-02-26 16:13:18.270108553 +0000 UTC m=+1860.788670422" lastFinishedPulling="2026-02-26 16:13:18.736476716 +0000 UTC m=+1861.255038575" observedRunningTime="2026-02-26 16:13:19.326087987 +0000 UTC m=+1861.844649836" watchObservedRunningTime="2026-02-26 16:13:19.334683869 +0000 UTC m=+1861.853245718" Feb 26 16:13:30 crc 
kubenswrapper[4907]: I0226 16:13:30.126831 4907 scope.go:117] "RemoveContainer" containerID="b46bef3acd92cfa3cb8f5894a729a1bb1795fbc69b7b7c5835186a0b609a6e46" Feb 26 16:13:30 crc kubenswrapper[4907]: E0226 16:13:30.127861 4907 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-v5ng6_openshift-machine-config-operator(917eebf3-db36-47b8-af0a-b80d042fddab)\"" pod="openshift-machine-config-operator/machine-config-daemon-v5ng6" podUID="917eebf3-db36-47b8-af0a-b80d042fddab" Feb 26 16:13:36 crc kubenswrapper[4907]: I0226 16:13:36.040055 4907 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/root-account-create-update-ck6c5"] Feb 26 16:13:36 crc kubenswrapper[4907]: I0226 16:13:36.052131 4907 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/root-account-create-update-ck6c5"] Feb 26 16:13:36 crc kubenswrapper[4907]: I0226 16:13:36.138336 4907 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="014002bc-6d56-41a9-969b-b6607aa0aa71" path="/var/lib/kubelet/pods/014002bc-6d56-41a9-969b-b6607aa0aa71/volumes" Feb 26 16:13:45 crc kubenswrapper[4907]: I0226 16:13:45.126499 4907 scope.go:117] "RemoveContainer" containerID="b46bef3acd92cfa3cb8f5894a729a1bb1795fbc69b7b7c5835186a0b609a6e46" Feb 26 16:13:45 crc kubenswrapper[4907]: E0226 16:13:45.127342 4907 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-v5ng6_openshift-machine-config-operator(917eebf3-db36-47b8-af0a-b80d042fddab)\"" pod="openshift-machine-config-operator/machine-config-daemon-v5ng6" podUID="917eebf3-db36-47b8-af0a-b80d042fddab" Feb 26 16:13:51 crc kubenswrapper[4907]: I0226 16:13:51.039057 4907 kubelet.go:2437] 
"SyncLoop DELETE" source="api" pods=["openstack/cinder-0936-account-create-update-zlgpv"] Feb 26 16:13:51 crc kubenswrapper[4907]: I0226 16:13:51.051784 4907 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/cinder-0936-account-create-update-zlgpv"] Feb 26 16:13:52 crc kubenswrapper[4907]: I0226 16:13:52.034847 4907 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/neutron-db-create-l68xw"] Feb 26 16:13:52 crc kubenswrapper[4907]: I0226 16:13:52.047303 4907 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/neutron-99c4-account-create-update-6jrmv"] Feb 26 16:13:52 crc kubenswrapper[4907]: I0226 16:13:52.062045 4907 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-db-create-gcc4z"] Feb 26 16:13:52 crc kubenswrapper[4907]: I0226 16:13:52.073882 4907 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-db-create-76hml"] Feb 26 16:13:52 crc kubenswrapper[4907]: I0226 16:13:52.086295 4907 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-158b-account-create-update-7ht2q"] Feb 26 16:13:52 crc kubenswrapper[4907]: I0226 16:13:52.096326 4907 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/neutron-db-create-l68xw"] Feb 26 16:13:52 crc kubenswrapper[4907]: I0226 16:13:52.108216 4907 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/cinder-db-create-gcc4z"] Feb 26 16:13:52 crc kubenswrapper[4907]: I0226 16:13:52.118874 4907 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/barbican-158b-account-create-update-7ht2q"] Feb 26 16:13:52 crc kubenswrapper[4907]: I0226 16:13:52.140123 4907 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5be03d75-755c-40f4-a2f2-db8f9e99b082" path="/var/lib/kubelet/pods/5be03d75-755c-40f4-a2f2-db8f9e99b082/volumes" Feb 26 16:13:52 crc kubenswrapper[4907]: I0226 16:13:52.141968 4907 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" 
podUID="7102918b-1c33-4b66-9767-fcf854b0f666" path="/var/lib/kubelet/pods/7102918b-1c33-4b66-9767-fcf854b0f666/volumes" Feb 26 16:13:52 crc kubenswrapper[4907]: I0226 16:13:52.144564 4907 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b73e0ebd-2208-4fb9-9b3a-215c75b5529d" path="/var/lib/kubelet/pods/b73e0ebd-2208-4fb9-9b3a-215c75b5529d/volumes" Feb 26 16:13:52 crc kubenswrapper[4907]: I0226 16:13:52.148915 4907 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f15ac55d-0e4b-46d0-9f5d-4e0e9b86e8fd" path="/var/lib/kubelet/pods/f15ac55d-0e4b-46d0-9f5d-4e0e9b86e8fd/volumes" Feb 26 16:13:52 crc kubenswrapper[4907]: I0226 16:13:52.149752 4907 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/neutron-99c4-account-create-update-6jrmv"] Feb 26 16:13:52 crc kubenswrapper[4907]: I0226 16:13:52.149787 4907 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/barbican-db-create-76hml"] Feb 26 16:13:53 crc kubenswrapper[4907]: I0226 16:13:53.033668 4907 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-db-sync-hdzvj"] Feb 26 16:13:53 crc kubenswrapper[4907]: I0226 16:13:53.043635 4907 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-db-sync-hdzvj"] Feb 26 16:13:54 crc kubenswrapper[4907]: I0226 16:13:54.137360 4907 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="2395dfd1-7840-4703-a1c9-37c6eff664bd" path="/var/lib/kubelet/pods/2395dfd1-7840-4703-a1c9-37c6eff664bd/volumes" Feb 26 16:13:54 crc kubenswrapper[4907]: I0226 16:13:54.139716 4907 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b004f31d-2432-403e-a862-a640cb1fe5ad" path="/var/lib/kubelet/pods/b004f31d-2432-403e-a862-a640cb1fe5ad/volumes" Feb 26 16:13:54 crc kubenswrapper[4907]: I0226 16:13:54.141275 4907 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c2ad5709-5849-49e7-840d-8af9abef7abd" 
path="/var/lib/kubelet/pods/c2ad5709-5849-49e7-840d-8af9abef7abd/volumes" Feb 26 16:13:57 crc kubenswrapper[4907]: I0226 16:13:57.031119 4907 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/keystone-db-sync-ts667"] Feb 26 16:13:57 crc kubenswrapper[4907]: I0226 16:13:57.047218 4907 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/keystone-db-sync-ts667"] Feb 26 16:13:58 crc kubenswrapper[4907]: I0226 16:13:58.132855 4907 scope.go:117] "RemoveContainer" containerID="b46bef3acd92cfa3cb8f5894a729a1bb1795fbc69b7b7c5835186a0b609a6e46" Feb 26 16:13:58 crc kubenswrapper[4907]: I0226 16:13:58.139056 4907 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a94cb55c-878d-432f-ab95-4d0012359b2f" path="/var/lib/kubelet/pods/a94cb55c-878d-432f-ab95-4d0012359b2f/volumes" Feb 26 16:13:59 crc kubenswrapper[4907]: I0226 16:13:59.006197 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-v5ng6" event={"ID":"917eebf3-db36-47b8-af0a-b80d042fddab","Type":"ContainerStarted","Data":"eeafebf90768294d93b5a754d4be3f7e7e83781774c84e4b268744314a564bb2"} Feb 26 16:14:00 crc kubenswrapper[4907]: I0226 16:14:00.155070 4907 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29535374-xvjqq"] Feb 26 16:14:00 crc kubenswrapper[4907]: I0226 16:14:00.157803 4907 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29535374-xvjqq" Feb 26 16:14:00 crc kubenswrapper[4907]: I0226 16:14:00.161272 4907 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Feb 26 16:14:00 crc kubenswrapper[4907]: I0226 16:14:00.161688 4907 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-n2mrp" Feb 26 16:14:00 crc kubenswrapper[4907]: I0226 16:14:00.161765 4907 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Feb 26 16:14:00 crc kubenswrapper[4907]: I0226 16:14:00.167850 4907 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29535374-xvjqq"] Feb 26 16:14:00 crc kubenswrapper[4907]: I0226 16:14:00.241654 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dl5b2\" (UniqueName: \"kubernetes.io/projected/877489c3-3906-4d08-b2cf-e3245aeeec08-kube-api-access-dl5b2\") pod \"auto-csr-approver-29535374-xvjqq\" (UID: \"877489c3-3906-4d08-b2cf-e3245aeeec08\") " pod="openshift-infra/auto-csr-approver-29535374-xvjqq" Feb 26 16:14:00 crc kubenswrapper[4907]: I0226 16:14:00.342915 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dl5b2\" (UniqueName: \"kubernetes.io/projected/877489c3-3906-4d08-b2cf-e3245aeeec08-kube-api-access-dl5b2\") pod \"auto-csr-approver-29535374-xvjqq\" (UID: \"877489c3-3906-4d08-b2cf-e3245aeeec08\") " pod="openshift-infra/auto-csr-approver-29535374-xvjqq" Feb 26 16:14:00 crc kubenswrapper[4907]: I0226 16:14:00.366859 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dl5b2\" (UniqueName: \"kubernetes.io/projected/877489c3-3906-4d08-b2cf-e3245aeeec08-kube-api-access-dl5b2\") pod \"auto-csr-approver-29535374-xvjqq\" (UID: \"877489c3-3906-4d08-b2cf-e3245aeeec08\") " 
pod="openshift-infra/auto-csr-approver-29535374-xvjqq" Feb 26 16:14:00 crc kubenswrapper[4907]: I0226 16:14:00.480287 4907 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29535374-xvjqq" Feb 26 16:14:00 crc kubenswrapper[4907]: I0226 16:14:00.966849 4907 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29535374-xvjqq"] Feb 26 16:14:00 crc kubenswrapper[4907]: W0226 16:14:00.969727 4907 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod877489c3_3906_4d08_b2cf_e3245aeeec08.slice/crio-3c7819cefb4e84264c1e61fdedee70e4ff32a5bdac292de12b5aac7754d81f21 WatchSource:0}: Error finding container 3c7819cefb4e84264c1e61fdedee70e4ff32a5bdac292de12b5aac7754d81f21: Status 404 returned error can't find the container with id 3c7819cefb4e84264c1e61fdedee70e4ff32a5bdac292de12b5aac7754d81f21 Feb 26 16:14:01 crc kubenswrapper[4907]: I0226 16:14:01.025788 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29535374-xvjqq" event={"ID":"877489c3-3906-4d08-b2cf-e3245aeeec08","Type":"ContainerStarted","Data":"3c7819cefb4e84264c1e61fdedee70e4ff32a5bdac292de12b5aac7754d81f21"} Feb 26 16:14:03 crc kubenswrapper[4907]: I0226 16:14:03.046245 4907 generic.go:334] "Generic (PLEG): container finished" podID="877489c3-3906-4d08-b2cf-e3245aeeec08" containerID="fafd9fbd6f5d5ee7d53958c894d99d720d9b3d52248b7e865eb306bbe5213097" exitCode=0 Feb 26 16:14:03 crc kubenswrapper[4907]: I0226 16:14:03.046422 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29535374-xvjqq" event={"ID":"877489c3-3906-4d08-b2cf-e3245aeeec08","Type":"ContainerDied","Data":"fafd9fbd6f5d5ee7d53958c894d99d720d9b3d52248b7e865eb306bbe5213097"} Feb 26 16:14:04 crc kubenswrapper[4907]: I0226 16:14:04.418422 4907 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29535374-xvjqq" Feb 26 16:14:04 crc kubenswrapper[4907]: I0226 16:14:04.522181 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-dl5b2\" (UniqueName: \"kubernetes.io/projected/877489c3-3906-4d08-b2cf-e3245aeeec08-kube-api-access-dl5b2\") pod \"877489c3-3906-4d08-b2cf-e3245aeeec08\" (UID: \"877489c3-3906-4d08-b2cf-e3245aeeec08\") " Feb 26 16:14:04 crc kubenswrapper[4907]: I0226 16:14:04.538294 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/877489c3-3906-4d08-b2cf-e3245aeeec08-kube-api-access-dl5b2" (OuterVolumeSpecName: "kube-api-access-dl5b2") pod "877489c3-3906-4d08-b2cf-e3245aeeec08" (UID: "877489c3-3906-4d08-b2cf-e3245aeeec08"). InnerVolumeSpecName "kube-api-access-dl5b2". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 26 16:14:04 crc kubenswrapper[4907]: I0226 16:14:04.625390 4907 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-dl5b2\" (UniqueName: \"kubernetes.io/projected/877489c3-3906-4d08-b2cf-e3245aeeec08-kube-api-access-dl5b2\") on node \"crc\" DevicePath \"\"" Feb 26 16:14:05 crc kubenswrapper[4907]: I0226 16:14:05.066197 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29535374-xvjqq" event={"ID":"877489c3-3906-4d08-b2cf-e3245aeeec08","Type":"ContainerDied","Data":"3c7819cefb4e84264c1e61fdedee70e4ff32a5bdac292de12b5aac7754d81f21"} Feb 26 16:14:05 crc kubenswrapper[4907]: I0226 16:14:05.066237 4907 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="3c7819cefb4e84264c1e61fdedee70e4ff32a5bdac292de12b5aac7754d81f21" Feb 26 16:14:05 crc kubenswrapper[4907]: I0226 16:14:05.066256 4907 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29535374-xvjqq"
Feb 26 16:14:05 crc kubenswrapper[4907]: I0226 16:14:05.485017 4907 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29535368-lf7wr"]
Feb 26 16:14:05 crc kubenswrapper[4907]: I0226 16:14:05.493855 4907 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29535368-lf7wr"]
Feb 26 16:14:06 crc kubenswrapper[4907]: I0226 16:14:06.143385 4907 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d9dc4728-d6d1-46da-a355-56e615849c42" path="/var/lib/kubelet/pods/d9dc4728-d6d1-46da-a355-56e615849c42/volumes"
Feb 26 16:14:08 crc kubenswrapper[4907]: I0226 16:14:08.797684 4907 scope.go:117] "RemoveContainer" containerID="63496136caf0de20beb55d60c8d05550ef8b6597390d822d2b0105cdf73169db"
Feb 26 16:14:08 crc kubenswrapper[4907]: I0226 16:14:08.843468 4907 scope.go:117] "RemoveContainer" containerID="e27511ed608bc4ca90dd647d2b05bd19aa57020996a4b02c6fff9a84c4b76f67"
Feb 26 16:14:08 crc kubenswrapper[4907]: I0226 16:14:08.890303 4907 scope.go:117] "RemoveContainer" containerID="0edc709be2aa57fb78580ed2af925f93ac7a5217c56cc184072271a2659984f1"
Feb 26 16:14:08 crc kubenswrapper[4907]: I0226 16:14:08.964315 4907 scope.go:117] "RemoveContainer" containerID="a1a1dd5695278af93e861a3ea7588d623b20b9e11cf6f02ef672ebd6fca0c916"
Feb 26 16:14:08 crc kubenswrapper[4907]: I0226 16:14:08.983772 4907 scope.go:117] "RemoveContainer" containerID="5f933d2fb049836b0e3d0ab4080dcbda877f37d4f2ad6b0b4517870c399eabd7"
Feb 26 16:14:09 crc kubenswrapper[4907]: I0226 16:14:09.027090 4907 scope.go:117] "RemoveContainer" containerID="32a8271fcc6373ab8b6b5d29e8bd868bdc62ced4a2431c88da81b21b00a0e2ce"
Feb 26 16:14:09 crc kubenswrapper[4907]: I0226 16:14:09.064764 4907 scope.go:117] "RemoveContainer" containerID="dbd2eb33a1d2116ef8899a4f58d3fbd0214225f67626da56f54e85f5097397f7"
Feb 26 16:14:09 crc kubenswrapper[4907]: I0226 16:14:09.096239 4907 scope.go:117] "RemoveContainer" containerID="bbf1de5304c8f42ea1279b391b8a34c5bbd6a2869bf612714dc862741f69423b"
Feb 26 16:14:09 crc kubenswrapper[4907]: I0226 16:14:09.123796 4907 scope.go:117] "RemoveContainer" containerID="ced4fc2d497ca439acbf014da0f32226afff2cd2aedba7c0f85f2022046ae4ce"
Feb 26 16:14:09 crc kubenswrapper[4907]: I0226 16:14:09.155685 4907 scope.go:117] "RemoveContainer" containerID="ee695ccda4a1b1f3bc05584ebde571beb2df00482ac669326b469f4ada49196a"
Feb 26 16:14:09 crc kubenswrapper[4907]: I0226 16:14:09.181156 4907 scope.go:117] "RemoveContainer" containerID="1f63ad42f27882a910cb8d3eb08bdc45289834d31a44f751e36a0e633437d378"
Feb 26 16:14:09 crc kubenswrapper[4907]: I0226 16:14:09.210425 4907 scope.go:117] "RemoveContainer" containerID="27a7050b18821bdebe8a05d687a319a520957b7cf8557a85bcfcf81d86e840f9"
Feb 26 16:14:09 crc kubenswrapper[4907]: I0226 16:14:09.230669 4907 scope.go:117] "RemoveContainer" containerID="6766438333efc1a4d8c6775d8147c22732f1b866f44239ba70742d36c98778fd"
Feb 26 16:14:09 crc kubenswrapper[4907]: I0226 16:14:09.254441 4907 scope.go:117] "RemoveContainer" containerID="276138739de3bc24bafc20c64d390be58e812ae9c834c069e28fd935a61d34f1"
Feb 26 16:14:09 crc kubenswrapper[4907]: I0226 16:14:09.274513 4907 scope.go:117] "RemoveContainer" containerID="2fd168ac0f799338bd20e6d5e4ed5ac9c085478239e19b247e17ab967f5d4bf7"
Feb 26 16:14:09 crc kubenswrapper[4907]: I0226 16:14:09.296569 4907 scope.go:117] "RemoveContainer" containerID="08cabaedf40630d23f28757a7e22a68606ac3ac825a025914a7d4c3be249cba6"
Feb 26 16:14:09 crc kubenswrapper[4907]: I0226 16:14:09.316723 4907 scope.go:117] "RemoveContainer" containerID="babb49a6c58bcde03d6ff5c34e9363d88091d1b64393f27bd898d3a62c24de05"
Feb 26 16:14:09 crc kubenswrapper[4907]: I0226 16:14:09.345561 4907 scope.go:117] "RemoveContainer" containerID="7f20ff2fec931032c61472fe98aee531571286b0527d9e808df5612acce2a74f"
Feb 26 16:14:19 crc kubenswrapper[4907]: I0226 16:14:19.603174 4907 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-9c68v"]
Feb 26 16:14:19 crc kubenswrapper[4907]: E0226 16:14:19.604220 4907 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="877489c3-3906-4d08-b2cf-e3245aeeec08" containerName="oc"
Feb 26 16:14:19 crc kubenswrapper[4907]: I0226 16:14:19.604237 4907 state_mem.go:107] "Deleted CPUSet assignment" podUID="877489c3-3906-4d08-b2cf-e3245aeeec08" containerName="oc"
Feb 26 16:14:19 crc kubenswrapper[4907]: I0226 16:14:19.604468 4907 memory_manager.go:354] "RemoveStaleState removing state" podUID="877489c3-3906-4d08-b2cf-e3245aeeec08" containerName="oc"
Feb 26 16:14:19 crc kubenswrapper[4907]: I0226 16:14:19.606204 4907 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-9c68v"
Feb 26 16:14:19 crc kubenswrapper[4907]: I0226 16:14:19.623835 4907 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-9c68v"]
Feb 26 16:14:19 crc kubenswrapper[4907]: I0226 16:14:19.712786 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/7d6e77b4-4814-412f-9536-ce274da693d7-utilities\") pod \"community-operators-9c68v\" (UID: \"7d6e77b4-4814-412f-9536-ce274da693d7\") " pod="openshift-marketplace/community-operators-9c68v"
Feb 26 16:14:19 crc kubenswrapper[4907]: I0226 16:14:19.712829 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4z5jg\" (UniqueName: \"kubernetes.io/projected/7d6e77b4-4814-412f-9536-ce274da693d7-kube-api-access-4z5jg\") pod \"community-operators-9c68v\" (UID: \"7d6e77b4-4814-412f-9536-ce274da693d7\") " pod="openshift-marketplace/community-operators-9c68v"
Feb 26 16:14:19 crc kubenswrapper[4907]: I0226 16:14:19.712858 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/7d6e77b4-4814-412f-9536-ce274da693d7-catalog-content\") pod \"community-operators-9c68v\" (UID: \"7d6e77b4-4814-412f-9536-ce274da693d7\") " pod="openshift-marketplace/community-operators-9c68v"
Feb 26 16:14:19 crc kubenswrapper[4907]: I0226 16:14:19.815058 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/7d6e77b4-4814-412f-9536-ce274da693d7-utilities\") pod \"community-operators-9c68v\" (UID: \"7d6e77b4-4814-412f-9536-ce274da693d7\") " pod="openshift-marketplace/community-operators-9c68v"
Feb 26 16:14:19 crc kubenswrapper[4907]: I0226 16:14:19.815421 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4z5jg\" (UniqueName: \"kubernetes.io/projected/7d6e77b4-4814-412f-9536-ce274da693d7-kube-api-access-4z5jg\") pod \"community-operators-9c68v\" (UID: \"7d6e77b4-4814-412f-9536-ce274da693d7\") " pod="openshift-marketplace/community-operators-9c68v"
Feb 26 16:14:19 crc kubenswrapper[4907]: I0226 16:14:19.815460 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/7d6e77b4-4814-412f-9536-ce274da693d7-catalog-content\") pod \"community-operators-9c68v\" (UID: \"7d6e77b4-4814-412f-9536-ce274da693d7\") " pod="openshift-marketplace/community-operators-9c68v"
Feb 26 16:14:19 crc kubenswrapper[4907]: I0226 16:14:19.815763 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/7d6e77b4-4814-412f-9536-ce274da693d7-utilities\") pod \"community-operators-9c68v\" (UID: \"7d6e77b4-4814-412f-9536-ce274da693d7\") " pod="openshift-marketplace/community-operators-9c68v"
Feb 26 16:14:19 crc kubenswrapper[4907]: I0226 16:14:19.815953 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/7d6e77b4-4814-412f-9536-ce274da693d7-catalog-content\") pod \"community-operators-9c68v\" (UID: \"7d6e77b4-4814-412f-9536-ce274da693d7\") " pod="openshift-marketplace/community-operators-9c68v"
Feb 26 16:14:19 crc kubenswrapper[4907]: I0226 16:14:19.846914 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4z5jg\" (UniqueName: \"kubernetes.io/projected/7d6e77b4-4814-412f-9536-ce274da693d7-kube-api-access-4z5jg\") pod \"community-operators-9c68v\" (UID: \"7d6e77b4-4814-412f-9536-ce274da693d7\") " pod="openshift-marketplace/community-operators-9c68v"
Feb 26 16:14:19 crc kubenswrapper[4907]: I0226 16:14:19.931131 4907 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-9c68v"
Feb 26 16:14:20 crc kubenswrapper[4907]: I0226 16:14:20.692376 4907 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-9c68v"]
Feb 26 16:14:21 crc kubenswrapper[4907]: I0226 16:14:21.238862 4907 generic.go:334] "Generic (PLEG): container finished" podID="7d6e77b4-4814-412f-9536-ce274da693d7" containerID="832216ac667042031962be8e9b85ebc0b630d1d879932221297aad9b722634cb" exitCode=0
Feb 26 16:14:21 crc kubenswrapper[4907]: I0226 16:14:21.239164 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-9c68v" event={"ID":"7d6e77b4-4814-412f-9536-ce274da693d7","Type":"ContainerDied","Data":"832216ac667042031962be8e9b85ebc0b630d1d879932221297aad9b722634cb"}
Feb 26 16:14:21 crc kubenswrapper[4907]: I0226 16:14:21.239194 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-9c68v" event={"ID":"7d6e77b4-4814-412f-9536-ce274da693d7","Type":"ContainerStarted","Data":"94e275c80a49511062d42f94ce117ce53941ac69d1eb51c3db561d820a26eb1b"}
Feb 26 16:14:23 crc kubenswrapper[4907]: I0226 16:14:23.438768 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-9c68v" event={"ID":"7d6e77b4-4814-412f-9536-ce274da693d7","Type":"ContainerStarted","Data":"e6e326fbf79f695e16cbab9a8e7fc0bf11f371d87cd9907e25bb3a579d2b7bcc"}
Feb 26 16:14:27 crc kubenswrapper[4907]: I0226 16:14:27.520340 4907 generic.go:334] "Generic (PLEG): container finished" podID="7d6e77b4-4814-412f-9536-ce274da693d7" containerID="e6e326fbf79f695e16cbab9a8e7fc0bf11f371d87cd9907e25bb3a579d2b7bcc" exitCode=0
Feb 26 16:14:27 crc kubenswrapper[4907]: I0226 16:14:27.520449 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-9c68v" event={"ID":"7d6e77b4-4814-412f-9536-ce274da693d7","Type":"ContainerDied","Data":"e6e326fbf79f695e16cbab9a8e7fc0bf11f371d87cd9907e25bb3a579d2b7bcc"}
Feb 26 16:14:28 crc kubenswrapper[4907]: I0226 16:14:28.532959 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-9c68v" event={"ID":"7d6e77b4-4814-412f-9536-ce274da693d7","Type":"ContainerStarted","Data":"88c4c3820200e724242b561e646a07ae4a13eec0d0088fe07deca1baac20bc64"}
Feb 26 16:14:28 crc kubenswrapper[4907]: I0226 16:14:28.560695 4907 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-9c68v" podStartSLOduration=2.8242675999999998 podStartE2EDuration="9.560670737s" podCreationTimestamp="2026-02-26 16:14:19 +0000 UTC" firstStartedPulling="2026-02-26 16:14:21.245912697 +0000 UTC m=+1923.764474546" lastFinishedPulling="2026-02-26 16:14:27.982315834 +0000 UTC m=+1930.500877683" observedRunningTime="2026-02-26 16:14:28.557993402 +0000 UTC m=+1931.076555261" watchObservedRunningTime="2026-02-26 16:14:28.560670737 +0000 UTC m=+1931.079232626"
Feb 26 16:14:29 crc kubenswrapper[4907]: I0226 16:14:29.932169 4907 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-9c68v"
Feb 26 16:14:29 crc kubenswrapper[4907]: I0226 16:14:29.932511 4907 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-9c68v"
Feb 26 16:14:30 crc kubenswrapper[4907]: I0226 16:14:30.989860 4907 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/community-operators-9c68v" podUID="7d6e77b4-4814-412f-9536-ce274da693d7" containerName="registry-server" probeResult="failure" output=<
Feb 26 16:14:30 crc kubenswrapper[4907]: timeout: failed to connect service ":50051" within 1s
Feb 26 16:14:30 crc kubenswrapper[4907]: >
Feb 26 16:14:40 crc kubenswrapper[4907]: I0226 16:14:40.014428 4907 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-9c68v"
Feb 26 16:14:40 crc kubenswrapper[4907]: I0226 16:14:40.085095 4907 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-9c68v"
Feb 26 16:14:40 crc kubenswrapper[4907]: I0226 16:14:40.266325 4907 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-9c68v"]
Feb 26 16:14:41 crc kubenswrapper[4907]: I0226 16:14:41.651882 4907 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-9c68v" podUID="7d6e77b4-4814-412f-9536-ce274da693d7" containerName="registry-server" containerID="cri-o://88c4c3820200e724242b561e646a07ae4a13eec0d0088fe07deca1baac20bc64" gracePeriod=2
Feb 26 16:14:42 crc kubenswrapper[4907]: I0226 16:14:42.662916 4907 generic.go:334] "Generic (PLEG): container finished" podID="7d6e77b4-4814-412f-9536-ce274da693d7" containerID="88c4c3820200e724242b561e646a07ae4a13eec0d0088fe07deca1baac20bc64" exitCode=0
Feb 26 16:14:42 crc kubenswrapper[4907]: I0226 16:14:42.663234 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-9c68v" event={"ID":"7d6e77b4-4814-412f-9536-ce274da693d7","Type":"ContainerDied","Data":"88c4c3820200e724242b561e646a07ae4a13eec0d0088fe07deca1baac20bc64"}
Feb 26 16:14:42 crc kubenswrapper[4907]: I0226 16:14:42.796225 4907 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-9c68v"
Feb 26 16:14:42 crc kubenswrapper[4907]: I0226 16:14:42.812421 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4z5jg\" (UniqueName: \"kubernetes.io/projected/7d6e77b4-4814-412f-9536-ce274da693d7-kube-api-access-4z5jg\") pod \"7d6e77b4-4814-412f-9536-ce274da693d7\" (UID: \"7d6e77b4-4814-412f-9536-ce274da693d7\") "
Feb 26 16:14:42 crc kubenswrapper[4907]: I0226 16:14:42.812531 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/7d6e77b4-4814-412f-9536-ce274da693d7-catalog-content\") pod \"7d6e77b4-4814-412f-9536-ce274da693d7\" (UID: \"7d6e77b4-4814-412f-9536-ce274da693d7\") "
Feb 26 16:14:42 crc kubenswrapper[4907]: I0226 16:14:42.812648 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/7d6e77b4-4814-412f-9536-ce274da693d7-utilities\") pod \"7d6e77b4-4814-412f-9536-ce274da693d7\" (UID: \"7d6e77b4-4814-412f-9536-ce274da693d7\") "
Feb 26 16:14:42 crc kubenswrapper[4907]: I0226 16:14:42.813850 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/7d6e77b4-4814-412f-9536-ce274da693d7-utilities" (OuterVolumeSpecName: "utilities") pod "7d6e77b4-4814-412f-9536-ce274da693d7" (UID: "7d6e77b4-4814-412f-9536-ce274da693d7"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Feb 26 16:14:42 crc kubenswrapper[4907]: I0226 16:14:42.832044 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7d6e77b4-4814-412f-9536-ce274da693d7-kube-api-access-4z5jg" (OuterVolumeSpecName: "kube-api-access-4z5jg") pod "7d6e77b4-4814-412f-9536-ce274da693d7" (UID: "7d6e77b4-4814-412f-9536-ce274da693d7"). InnerVolumeSpecName "kube-api-access-4z5jg". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 26 16:14:42 crc kubenswrapper[4907]: I0226 16:14:42.915302 4907 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/7d6e77b4-4814-412f-9536-ce274da693d7-utilities\") on node \"crc\" DevicePath \"\""
Feb 26 16:14:42 crc kubenswrapper[4907]: I0226 16:14:42.915417 4907 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4z5jg\" (UniqueName: \"kubernetes.io/projected/7d6e77b4-4814-412f-9536-ce274da693d7-kube-api-access-4z5jg\") on node \"crc\" DevicePath \"\""
Feb 26 16:14:42 crc kubenswrapper[4907]: I0226 16:14:42.916357 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/7d6e77b4-4814-412f-9536-ce274da693d7-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "7d6e77b4-4814-412f-9536-ce274da693d7" (UID: "7d6e77b4-4814-412f-9536-ce274da693d7"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Feb 26 16:14:43 crc kubenswrapper[4907]: I0226 16:14:43.017851 4907 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/7d6e77b4-4814-412f-9536-ce274da693d7-catalog-content\") on node \"crc\" DevicePath \"\""
Feb 26 16:14:43 crc kubenswrapper[4907]: I0226 16:14:43.674435 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-9c68v" event={"ID":"7d6e77b4-4814-412f-9536-ce274da693d7","Type":"ContainerDied","Data":"94e275c80a49511062d42f94ce117ce53941ac69d1eb51c3db561d820a26eb1b"}
Feb 26 16:14:43 crc kubenswrapper[4907]: I0226 16:14:43.674498 4907 scope.go:117] "RemoveContainer" containerID="88c4c3820200e724242b561e646a07ae4a13eec0d0088fe07deca1baac20bc64"
Feb 26 16:14:43 crc kubenswrapper[4907]: I0226 16:14:43.674512 4907 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-9c68v"
Feb 26 16:14:43 crc kubenswrapper[4907]: I0226 16:14:43.695122 4907 scope.go:117] "RemoveContainer" containerID="e6e326fbf79f695e16cbab9a8e7fc0bf11f371d87cd9907e25bb3a579d2b7bcc"
Feb 26 16:14:43 crc kubenswrapper[4907]: I0226 16:14:43.720779 4907 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-9c68v"]
Feb 26 16:14:43 crc kubenswrapper[4907]: I0226 16:14:43.726573 4907 scope.go:117] "RemoveContainer" containerID="832216ac667042031962be8e9b85ebc0b630d1d879932221297aad9b722634cb"
Feb 26 16:14:43 crc kubenswrapper[4907]: I0226 16:14:43.726796 4907 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-9c68v"]
Feb 26 16:14:44 crc kubenswrapper[4907]: I0226 16:14:44.137404 4907 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7d6e77b4-4814-412f-9536-ce274da693d7" path="/var/lib/kubelet/pods/7d6e77b4-4814-412f-9536-ce274da693d7/volumes"
Feb 26 16:14:53 crc kubenswrapper[4907]: I0226 16:14:53.046621 4907 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/neutron-db-sync-sg95t"]
Feb 26 16:14:53 crc kubenswrapper[4907]: I0226 16:14:53.055342 4907 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/neutron-db-sync-sg95t"]
Feb 26 16:14:54 crc kubenswrapper[4907]: I0226 16:14:54.139998 4907 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1ae29e7c-7f4a-492f-b10d-2badd4d606aa" path="/var/lib/kubelet/pods/1ae29e7c-7f4a-492f-b10d-2badd4d606aa/volumes"
Feb 26 16:14:54 crc kubenswrapper[4907]: I0226 16:14:54.772664 4907 generic.go:334] "Generic (PLEG): container finished" podID="9c764e34-e690-4b9f-aae5-9ea7ccacd4fc" containerID="5ae3114214b762fe854be97d1bf5ae65b6f7b3c6ee5cfcfbdca658e0534bed16" exitCode=0
Feb 26 16:14:54 crc kubenswrapper[4907]: I0226 16:14:54.772849 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-kdc4r" event={"ID":"9c764e34-e690-4b9f-aae5-9ea7ccacd4fc","Type":"ContainerDied","Data":"5ae3114214b762fe854be97d1bf5ae65b6f7b3c6ee5cfcfbdca658e0534bed16"}
Feb 26 16:14:56 crc kubenswrapper[4907]: I0226 16:14:56.255872 4907 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-kdc4r"
Feb 26 16:14:56 crc kubenswrapper[4907]: I0226 16:14:56.413980 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/9c764e34-e690-4b9f-aae5-9ea7ccacd4fc-ssh-key-openstack-edpm-ipam\") pod \"9c764e34-e690-4b9f-aae5-9ea7ccacd4fc\" (UID: \"9c764e34-e690-4b9f-aae5-9ea7ccacd4fc\") "
Feb 26 16:14:56 crc kubenswrapper[4907]: I0226 16:14:56.414342 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-md4wj\" (UniqueName: \"kubernetes.io/projected/9c764e34-e690-4b9f-aae5-9ea7ccacd4fc-kube-api-access-md4wj\") pod \"9c764e34-e690-4b9f-aae5-9ea7ccacd4fc\" (UID: \"9c764e34-e690-4b9f-aae5-9ea7ccacd4fc\") "
Feb 26 16:14:56 crc kubenswrapper[4907]: I0226 16:14:56.414509 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/9c764e34-e690-4b9f-aae5-9ea7ccacd4fc-inventory\") pod \"9c764e34-e690-4b9f-aae5-9ea7ccacd4fc\" (UID: \"9c764e34-e690-4b9f-aae5-9ea7ccacd4fc\") "
Feb 26 16:14:56 crc kubenswrapper[4907]: I0226 16:14:56.433419 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9c764e34-e690-4b9f-aae5-9ea7ccacd4fc-kube-api-access-md4wj" (OuterVolumeSpecName: "kube-api-access-md4wj") pod "9c764e34-e690-4b9f-aae5-9ea7ccacd4fc" (UID: "9c764e34-e690-4b9f-aae5-9ea7ccacd4fc"). InnerVolumeSpecName "kube-api-access-md4wj". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 26 16:14:56 crc kubenswrapper[4907]: I0226 16:14:56.441446 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9c764e34-e690-4b9f-aae5-9ea7ccacd4fc-ssh-key-openstack-edpm-ipam" (OuterVolumeSpecName: "ssh-key-openstack-edpm-ipam") pod "9c764e34-e690-4b9f-aae5-9ea7ccacd4fc" (UID: "9c764e34-e690-4b9f-aae5-9ea7ccacd4fc"). InnerVolumeSpecName "ssh-key-openstack-edpm-ipam". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 26 16:14:56 crc kubenswrapper[4907]: I0226 16:14:56.445465 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9c764e34-e690-4b9f-aae5-9ea7ccacd4fc-inventory" (OuterVolumeSpecName: "inventory") pod "9c764e34-e690-4b9f-aae5-9ea7ccacd4fc" (UID: "9c764e34-e690-4b9f-aae5-9ea7ccacd4fc"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 26 16:14:56 crc kubenswrapper[4907]: I0226 16:14:56.516477 4907 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/9c764e34-e690-4b9f-aae5-9ea7ccacd4fc-ssh-key-openstack-edpm-ipam\") on node \"crc\" DevicePath \"\""
Feb 26 16:14:56 crc kubenswrapper[4907]: I0226 16:14:56.516517 4907 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-md4wj\" (UniqueName: \"kubernetes.io/projected/9c764e34-e690-4b9f-aae5-9ea7ccacd4fc-kube-api-access-md4wj\") on node \"crc\" DevicePath \"\""
Feb 26 16:14:56 crc kubenswrapper[4907]: I0226 16:14:56.516529 4907 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/9c764e34-e690-4b9f-aae5-9ea7ccacd4fc-inventory\") on node \"crc\" DevicePath \"\""
Feb 26 16:14:56 crc kubenswrapper[4907]: I0226 16:14:56.794183 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-kdc4r" event={"ID":"9c764e34-e690-4b9f-aae5-9ea7ccacd4fc","Type":"ContainerDied","Data":"a74e8c90000905bffe8c0a7a2a2cd6b77df7a16d3c2b8c01910958d8455d1c52"}
Feb 26 16:14:56 crc kubenswrapper[4907]: I0226 16:14:56.794526 4907 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="a74e8c90000905bffe8c0a7a2a2cd6b77df7a16d3c2b8c01910958d8455d1c52"
Feb 26 16:14:56 crc kubenswrapper[4907]: I0226 16:14:56.794463 4907 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-kdc4r"
Feb 26 16:14:56 crc kubenswrapper[4907]: I0226 16:14:56.898999 4907 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/configure-network-edpm-deployment-openstack-edpm-ipam-kx4lh"]
Feb 26 16:14:56 crc kubenswrapper[4907]: E0226 16:14:56.899488 4907 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7d6e77b4-4814-412f-9536-ce274da693d7" containerName="extract-content"
Feb 26 16:14:56 crc kubenswrapper[4907]: I0226 16:14:56.899503 4907 state_mem.go:107] "Deleted CPUSet assignment" podUID="7d6e77b4-4814-412f-9536-ce274da693d7" containerName="extract-content"
Feb 26 16:14:56 crc kubenswrapper[4907]: E0226 16:14:56.899525 4907 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9c764e34-e690-4b9f-aae5-9ea7ccacd4fc" containerName="download-cache-edpm-deployment-openstack-edpm-ipam"
Feb 26 16:14:56 crc kubenswrapper[4907]: I0226 16:14:56.899534 4907 state_mem.go:107] "Deleted CPUSet assignment" podUID="9c764e34-e690-4b9f-aae5-9ea7ccacd4fc" containerName="download-cache-edpm-deployment-openstack-edpm-ipam"
Feb 26 16:14:56 crc kubenswrapper[4907]: E0226 16:14:56.899551 4907 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7d6e77b4-4814-412f-9536-ce274da693d7" containerName="extract-utilities"
Feb 26 16:14:56 crc kubenswrapper[4907]: I0226 16:14:56.899560 4907 state_mem.go:107] "Deleted CPUSet assignment" podUID="7d6e77b4-4814-412f-9536-ce274da693d7" containerName="extract-utilities"
Feb 26 16:14:56 crc kubenswrapper[4907]: E0226 16:14:56.899568 4907 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7d6e77b4-4814-412f-9536-ce274da693d7" containerName="registry-server"
Feb 26 16:14:56 crc kubenswrapper[4907]: I0226 16:14:56.899575 4907 state_mem.go:107] "Deleted CPUSet assignment" podUID="7d6e77b4-4814-412f-9536-ce274da693d7" containerName="registry-server"
Feb 26 16:14:56 crc kubenswrapper[4907]: I0226 16:14:56.899988 4907 memory_manager.go:354] "RemoveStaleState removing state" podUID="9c764e34-e690-4b9f-aae5-9ea7ccacd4fc" containerName="download-cache-edpm-deployment-openstack-edpm-ipam"
Feb 26 16:14:56 crc kubenswrapper[4907]: I0226 16:14:56.900073 4907 memory_manager.go:354] "RemoveStaleState removing state" podUID="7d6e77b4-4814-412f-9536-ce274da693d7" containerName="registry-server"
Feb 26 16:14:56 crc kubenswrapper[4907]: I0226 16:14:56.903996 4907 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-kx4lh"
Feb 26 16:14:56 crc kubenswrapper[4907]: I0226 16:14:56.907064 4907 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-57jxc"
Feb 26 16:14:56 crc kubenswrapper[4907]: I0226 16:14:56.907278 4907 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret"
Feb 26 16:14:56 crc kubenswrapper[4907]: I0226 16:14:56.907508 4907 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam"
Feb 26 16:14:56 crc kubenswrapper[4907]: I0226 16:14:56.907704 4907 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env"
Feb 26 16:14:56 crc kubenswrapper[4907]: I0226 16:14:56.911962 4907 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/configure-network-edpm-deployment-openstack-edpm-ipam-kx4lh"]
Feb 26 16:14:57 crc kubenswrapper[4907]: I0226 16:14:57.026312 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/f7ab7062-024c-462c-99a7-4c3f6f27e471-ssh-key-openstack-edpm-ipam\") pod \"configure-network-edpm-deployment-openstack-edpm-ipam-kx4lh\" (UID: \"f7ab7062-024c-462c-99a7-4c3f6f27e471\") " pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-kx4lh"
Feb 26 16:14:57 crc kubenswrapper[4907]: I0226 16:14:57.026407 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/f7ab7062-024c-462c-99a7-4c3f6f27e471-inventory\") pod \"configure-network-edpm-deployment-openstack-edpm-ipam-kx4lh\" (UID: \"f7ab7062-024c-462c-99a7-4c3f6f27e471\") " pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-kx4lh"
Feb 26 16:14:57 crc kubenswrapper[4907]: I0226 16:14:57.026539 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nsb8h\" (UniqueName: \"kubernetes.io/projected/f7ab7062-024c-462c-99a7-4c3f6f27e471-kube-api-access-nsb8h\") pod \"configure-network-edpm-deployment-openstack-edpm-ipam-kx4lh\" (UID: \"f7ab7062-024c-462c-99a7-4c3f6f27e471\") " pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-kx4lh"
Feb 26 16:14:57 crc kubenswrapper[4907]: I0226 16:14:57.128248 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nsb8h\" (UniqueName: \"kubernetes.io/projected/f7ab7062-024c-462c-99a7-4c3f6f27e471-kube-api-access-nsb8h\") pod \"configure-network-edpm-deployment-openstack-edpm-ipam-kx4lh\" (UID: \"f7ab7062-024c-462c-99a7-4c3f6f27e471\") " pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-kx4lh"
Feb 26 16:14:57 crc kubenswrapper[4907]: I0226 16:14:57.128899 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/f7ab7062-024c-462c-99a7-4c3f6f27e471-ssh-key-openstack-edpm-ipam\") pod \"configure-network-edpm-deployment-openstack-edpm-ipam-kx4lh\" (UID: \"f7ab7062-024c-462c-99a7-4c3f6f27e471\") " pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-kx4lh"
Feb 26 16:14:57 crc kubenswrapper[4907]: I0226 16:14:57.129024 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/f7ab7062-024c-462c-99a7-4c3f6f27e471-inventory\") pod \"configure-network-edpm-deployment-openstack-edpm-ipam-kx4lh\" (UID: \"f7ab7062-024c-462c-99a7-4c3f6f27e471\") " pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-kx4lh"
Feb 26 16:14:57 crc kubenswrapper[4907]: I0226 16:14:57.134454 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/f7ab7062-024c-462c-99a7-4c3f6f27e471-inventory\") pod \"configure-network-edpm-deployment-openstack-edpm-ipam-kx4lh\" (UID: \"f7ab7062-024c-462c-99a7-4c3f6f27e471\") " pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-kx4lh"
Feb 26 16:14:57 crc kubenswrapper[4907]: I0226 16:14:57.142634 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/f7ab7062-024c-462c-99a7-4c3f6f27e471-ssh-key-openstack-edpm-ipam\") pod \"configure-network-edpm-deployment-openstack-edpm-ipam-kx4lh\" (UID: \"f7ab7062-024c-462c-99a7-4c3f6f27e471\") " pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-kx4lh"
Feb 26 16:14:57 crc kubenswrapper[4907]: I0226 16:14:57.147796 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nsb8h\" (UniqueName: \"kubernetes.io/projected/f7ab7062-024c-462c-99a7-4c3f6f27e471-kube-api-access-nsb8h\") pod \"configure-network-edpm-deployment-openstack-edpm-ipam-kx4lh\" (UID: \"f7ab7062-024c-462c-99a7-4c3f6f27e471\") " pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-kx4lh"
Feb 26 16:14:57 crc kubenswrapper[4907]: I0226 16:14:57.243190 4907 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-kx4lh"
Feb 26 16:14:57 crc kubenswrapper[4907]: I0226 16:14:57.799748 4907 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/configure-network-edpm-deployment-openstack-edpm-ipam-kx4lh"]
Feb 26 16:14:58 crc kubenswrapper[4907]: I0226 16:14:58.813568 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-kx4lh" event={"ID":"f7ab7062-024c-462c-99a7-4c3f6f27e471","Type":"ContainerStarted","Data":"320c25da003d82759352b5dbaae7a0eb645aaabac3422bbfe2fac5c39f12b4e7"}
Feb 26 16:14:58 crc kubenswrapper[4907]: I0226 16:14:58.813630 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-kx4lh" event={"ID":"f7ab7062-024c-462c-99a7-4c3f6f27e471","Type":"ContainerStarted","Data":"a3371cba62ba3f128ea74b0f0193a62eb004c2db77b0990f2bf8cf877715bc57"}
Feb 26 16:14:58 crc kubenswrapper[4907]: I0226 16:14:58.828162 4907 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-kx4lh" podStartSLOduration=2.396782858 podStartE2EDuration="2.828146381s" podCreationTimestamp="2026-02-26 16:14:56 +0000 UTC" firstStartedPulling="2026-02-26 16:14:57.824618931 +0000 UTC m=+1960.343180780" lastFinishedPulling="2026-02-26 16:14:58.255982454 +0000 UTC m=+1960.774544303" observedRunningTime="2026-02-26 16:14:58.82727477 +0000 UTC m=+1961.345836629" watchObservedRunningTime="2026-02-26 16:14:58.828146381 +0000 UTC m=+1961.346708230"
Feb 26 16:15:00 crc kubenswrapper[4907]: I0226 16:15:00.207680 4907 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29535375-ww29z"]
Feb 26 16:15:00 crc kubenswrapper[4907]: I0226 16:15:00.209576 4907 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29535375-ww29z"
Feb 26 16:15:00 crc kubenswrapper[4907]: I0226 16:15:00.215218 4907 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29535375-ww29z"]
Feb 26 16:15:00 crc kubenswrapper[4907]: I0226 16:15:00.222694 4907 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t"
Feb 26 16:15:00 crc kubenswrapper[4907]: I0226 16:15:00.222997 4907 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config"
Feb 26 16:15:00 crc kubenswrapper[4907]: I0226 16:15:00.309020 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/1a45e1eb-d563-4c1a-9b06-ee9d2616d0ae-secret-volume\") pod \"collect-profiles-29535375-ww29z\" (UID: \"1a45e1eb-d563-4c1a-9b06-ee9d2616d0ae\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29535375-ww29z"
Feb 26 16:15:00 crc kubenswrapper[4907]: I0226 16:15:00.309096 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/1a45e1eb-d563-4c1a-9b06-ee9d2616d0ae-config-volume\") pod \"collect-profiles-29535375-ww29z\" (UID: \"1a45e1eb-d563-4c1a-9b06-ee9d2616d0ae\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29535375-ww29z"
Feb 26 16:15:00 crc kubenswrapper[4907]: I0226 16:15:00.309140 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hsn6x\" (UniqueName: \"kubernetes.io/projected/1a45e1eb-d563-4c1a-9b06-ee9d2616d0ae-kube-api-access-hsn6x\") pod \"collect-profiles-29535375-ww29z\" (UID: \"1a45e1eb-d563-4c1a-9b06-ee9d2616d0ae\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29535375-ww29z"
Feb 26 16:15:00 crc kubenswrapper[4907]: I0226 16:15:00.410811 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/1a45e1eb-d563-4c1a-9b06-ee9d2616d0ae-secret-volume\") pod \"collect-profiles-29535375-ww29z\" (UID: \"1a45e1eb-d563-4c1a-9b06-ee9d2616d0ae\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29535375-ww29z"
Feb 26 16:15:00 crc kubenswrapper[4907]: I0226 16:15:00.410879 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/1a45e1eb-d563-4c1a-9b06-ee9d2616d0ae-config-volume\") pod \"collect-profiles-29535375-ww29z\" (UID: \"1a45e1eb-d563-4c1a-9b06-ee9d2616d0ae\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29535375-ww29z"
Feb 26 16:15:00 crc kubenswrapper[4907]: I0226 16:15:00.410922 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hsn6x\" (UniqueName: \"kubernetes.io/projected/1a45e1eb-d563-4c1a-9b06-ee9d2616d0ae-kube-api-access-hsn6x\") pod \"collect-profiles-29535375-ww29z\" (UID: \"1a45e1eb-d563-4c1a-9b06-ee9d2616d0ae\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29535375-ww29z"
Feb 26 16:15:00 crc kubenswrapper[4907]: I0226 16:15:00.411898 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/1a45e1eb-d563-4c1a-9b06-ee9d2616d0ae-config-volume\") pod \"collect-profiles-29535375-ww29z\" (UID: \"1a45e1eb-d563-4c1a-9b06-ee9d2616d0ae\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29535375-ww29z"
Feb 26 16:15:00 crc kubenswrapper[4907]: I0226 16:15:00.416060 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName:
\"kubernetes.io/secret/1a45e1eb-d563-4c1a-9b06-ee9d2616d0ae-secret-volume\") pod \"collect-profiles-29535375-ww29z\" (UID: \"1a45e1eb-d563-4c1a-9b06-ee9d2616d0ae\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29535375-ww29z" Feb 26 16:15:00 crc kubenswrapper[4907]: I0226 16:15:00.439209 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hsn6x\" (UniqueName: \"kubernetes.io/projected/1a45e1eb-d563-4c1a-9b06-ee9d2616d0ae-kube-api-access-hsn6x\") pod \"collect-profiles-29535375-ww29z\" (UID: \"1a45e1eb-d563-4c1a-9b06-ee9d2616d0ae\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29535375-ww29z" Feb 26 16:15:00 crc kubenswrapper[4907]: I0226 16:15:00.570341 4907 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29535375-ww29z" Feb 26 16:15:01 crc kubenswrapper[4907]: I0226 16:15:01.014358 4907 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29535375-ww29z"] Feb 26 16:15:01 crc kubenswrapper[4907]: I0226 16:15:01.075794 4907 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/placement-db-sync-6t72w"] Feb 26 16:15:01 crc kubenswrapper[4907]: I0226 16:15:01.099057 4907 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/keystone-bootstrap-dwb5n"] Feb 26 16:15:01 crc kubenswrapper[4907]: I0226 16:15:01.119109 4907 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/placement-db-sync-6t72w"] Feb 26 16:15:01 crc kubenswrapper[4907]: I0226 16:15:01.126933 4907 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/keystone-bootstrap-dwb5n"] Feb 26 16:15:01 crc kubenswrapper[4907]: I0226 16:15:01.846692 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29535375-ww29z" 
event={"ID":"1a45e1eb-d563-4c1a-9b06-ee9d2616d0ae","Type":"ContainerStarted","Data":"248a272bbd15fa2821cb25ebd97cdc750b0f0a313b4d49bf7e2ea4d0832feff3"} Feb 26 16:15:01 crc kubenswrapper[4907]: I0226 16:15:01.847072 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29535375-ww29z" event={"ID":"1a45e1eb-d563-4c1a-9b06-ee9d2616d0ae","Type":"ContainerStarted","Data":"0790fce9c74324b2b0683b19059d29d5a6732c68388f0b4d10f8ce5f70c6f054"} Feb 26 16:15:01 crc kubenswrapper[4907]: I0226 16:15:01.865809 4907 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/collect-profiles-29535375-ww29z" podStartSLOduration=1.865786356 podStartE2EDuration="1.865786356s" podCreationTimestamp="2026-02-26 16:15:00 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-26 16:15:01.860926748 +0000 UTC m=+1964.379488597" watchObservedRunningTime="2026-02-26 16:15:01.865786356 +0000 UTC m=+1964.384348205" Feb 26 16:15:02 crc kubenswrapper[4907]: I0226 16:15:02.139061 4907 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4cbb7c75-3f73-4181-b214-cdfb8d9ffd9a" path="/var/lib/kubelet/pods/4cbb7c75-3f73-4181-b214-cdfb8d9ffd9a/volumes" Feb 26 16:15:02 crc kubenswrapper[4907]: I0226 16:15:02.142453 4907 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e0a55626-b305-4e22-aec1-24832bec9a9f" path="/var/lib/kubelet/pods/e0a55626-b305-4e22-aec1-24832bec9a9f/volumes" Feb 26 16:15:02 crc kubenswrapper[4907]: I0226 16:15:02.856992 4907 generic.go:334] "Generic (PLEG): container finished" podID="1a45e1eb-d563-4c1a-9b06-ee9d2616d0ae" containerID="248a272bbd15fa2821cb25ebd97cdc750b0f0a313b4d49bf7e2ea4d0832feff3" exitCode=0 Feb 26 16:15:02 crc kubenswrapper[4907]: I0226 16:15:02.857035 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-operator-lifecycle-manager/collect-profiles-29535375-ww29z" event={"ID":"1a45e1eb-d563-4c1a-9b06-ee9d2616d0ae","Type":"ContainerDied","Data":"248a272bbd15fa2821cb25ebd97cdc750b0f0a313b4d49bf7e2ea4d0832feff3"} Feb 26 16:15:04 crc kubenswrapper[4907]: I0226 16:15:04.151512 4907 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29535375-ww29z" Feb 26 16:15:04 crc kubenswrapper[4907]: I0226 16:15:04.289636 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/1a45e1eb-d563-4c1a-9b06-ee9d2616d0ae-config-volume\") pod \"1a45e1eb-d563-4c1a-9b06-ee9d2616d0ae\" (UID: \"1a45e1eb-d563-4c1a-9b06-ee9d2616d0ae\") " Feb 26 16:15:04 crc kubenswrapper[4907]: I0226 16:15:04.289837 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/1a45e1eb-d563-4c1a-9b06-ee9d2616d0ae-secret-volume\") pod \"1a45e1eb-d563-4c1a-9b06-ee9d2616d0ae\" (UID: \"1a45e1eb-d563-4c1a-9b06-ee9d2616d0ae\") " Feb 26 16:15:04 crc kubenswrapper[4907]: I0226 16:15:04.289976 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-hsn6x\" (UniqueName: \"kubernetes.io/projected/1a45e1eb-d563-4c1a-9b06-ee9d2616d0ae-kube-api-access-hsn6x\") pod \"1a45e1eb-d563-4c1a-9b06-ee9d2616d0ae\" (UID: \"1a45e1eb-d563-4c1a-9b06-ee9d2616d0ae\") " Feb 26 16:15:04 crc kubenswrapper[4907]: I0226 16:15:04.291062 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1a45e1eb-d563-4c1a-9b06-ee9d2616d0ae-config-volume" (OuterVolumeSpecName: "config-volume") pod "1a45e1eb-d563-4c1a-9b06-ee9d2616d0ae" (UID: "1a45e1eb-d563-4c1a-9b06-ee9d2616d0ae"). InnerVolumeSpecName "config-volume". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 26 16:15:04 crc kubenswrapper[4907]: I0226 16:15:04.298522 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1a45e1eb-d563-4c1a-9b06-ee9d2616d0ae-kube-api-access-hsn6x" (OuterVolumeSpecName: "kube-api-access-hsn6x") pod "1a45e1eb-d563-4c1a-9b06-ee9d2616d0ae" (UID: "1a45e1eb-d563-4c1a-9b06-ee9d2616d0ae"). InnerVolumeSpecName "kube-api-access-hsn6x". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 26 16:15:04 crc kubenswrapper[4907]: I0226 16:15:04.298936 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1a45e1eb-d563-4c1a-9b06-ee9d2616d0ae-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "1a45e1eb-d563-4c1a-9b06-ee9d2616d0ae" (UID: "1a45e1eb-d563-4c1a-9b06-ee9d2616d0ae"). InnerVolumeSpecName "secret-volume". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 26 16:15:04 crc kubenswrapper[4907]: I0226 16:15:04.392269 4907 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-hsn6x\" (UniqueName: \"kubernetes.io/projected/1a45e1eb-d563-4c1a-9b06-ee9d2616d0ae-kube-api-access-hsn6x\") on node \"crc\" DevicePath \"\"" Feb 26 16:15:04 crc kubenswrapper[4907]: I0226 16:15:04.392316 4907 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/1a45e1eb-d563-4c1a-9b06-ee9d2616d0ae-config-volume\") on node \"crc\" DevicePath \"\"" Feb 26 16:15:04 crc kubenswrapper[4907]: I0226 16:15:04.392333 4907 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/1a45e1eb-d563-4c1a-9b06-ee9d2616d0ae-secret-volume\") on node \"crc\" DevicePath \"\"" Feb 26 16:15:04 crc kubenswrapper[4907]: I0226 16:15:04.887301 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29535375-ww29z" 
event={"ID":"1a45e1eb-d563-4c1a-9b06-ee9d2616d0ae","Type":"ContainerDied","Data":"0790fce9c74324b2b0683b19059d29d5a6732c68388f0b4d10f8ce5f70c6f054"} Feb 26 16:15:04 crc kubenswrapper[4907]: I0226 16:15:04.887351 4907 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="0790fce9c74324b2b0683b19059d29d5a6732c68388f0b4d10f8ce5f70c6f054" Feb 26 16:15:04 crc kubenswrapper[4907]: I0226 16:15:04.887425 4907 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29535375-ww29z" Feb 26 16:15:09 crc kubenswrapper[4907]: I0226 16:15:09.663867 4907 scope.go:117] "RemoveContainer" containerID="7d8384c340f47ec5e939b3ead6a4f8659accead8c6b60732d40de8114e9324a0" Feb 26 16:15:09 crc kubenswrapper[4907]: I0226 16:15:09.698078 4907 scope.go:117] "RemoveContainer" containerID="dab876c894465becdf3105b0fa5d4916964a7cd76e622ecfca3a9597922b7d13" Feb 26 16:15:09 crc kubenswrapper[4907]: I0226 16:15:09.737995 4907 scope.go:117] "RemoveContainer" containerID="a6723ee64be82b655a1b30a04dd34c73206edf2a85f7fc50689b6d1a1d6df10a" Feb 26 16:15:27 crc kubenswrapper[4907]: I0226 16:15:27.044401 4907 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-db-sync-slrvx"] Feb 26 16:15:27 crc kubenswrapper[4907]: I0226 16:15:27.052801 4907 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/barbican-db-sync-slrvx"] Feb 26 16:15:28 crc kubenswrapper[4907]: I0226 16:15:28.138494 4907 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a02d2622-77ed-4949-95b5-4f5ae5f1c47d" path="/var/lib/kubelet/pods/a02d2622-77ed-4949-95b5-4f5ae5f1c47d/volumes" Feb 26 16:15:29 crc kubenswrapper[4907]: I0226 16:15:29.027865 4907 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-db-sync-xvvbl"] Feb 26 16:15:29 crc kubenswrapper[4907]: I0226 16:15:29.038790 4907 kubelet.go:2431] "SyncLoop REMOVE" source="api" 
pods=["openstack/cinder-db-sync-xvvbl"] Feb 26 16:15:30 crc kubenswrapper[4907]: I0226 16:15:30.140099 4907 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c98fd629-273b-4c87-a07c-4a482064a5a3" path="/var/lib/kubelet/pods/c98fd629-273b-4c87-a07c-4a482064a5a3/volumes" Feb 26 16:16:00 crc kubenswrapper[4907]: I0226 16:16:00.146464 4907 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29535376-m6v7w"] Feb 26 16:16:00 crc kubenswrapper[4907]: E0226 16:16:00.148916 4907 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1a45e1eb-d563-4c1a-9b06-ee9d2616d0ae" containerName="collect-profiles" Feb 26 16:16:00 crc kubenswrapper[4907]: I0226 16:16:00.149287 4907 state_mem.go:107] "Deleted CPUSet assignment" podUID="1a45e1eb-d563-4c1a-9b06-ee9d2616d0ae" containerName="collect-profiles" Feb 26 16:16:00 crc kubenswrapper[4907]: I0226 16:16:00.149667 4907 memory_manager.go:354] "RemoveStaleState removing state" podUID="1a45e1eb-d563-4c1a-9b06-ee9d2616d0ae" containerName="collect-profiles" Feb 26 16:16:00 crc kubenswrapper[4907]: I0226 16:16:00.150655 4907 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29535376-m6v7w" Feb 26 16:16:00 crc kubenswrapper[4907]: I0226 16:16:00.155669 4907 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-n2mrp" Feb 26 16:16:00 crc kubenswrapper[4907]: I0226 16:16:00.155794 4907 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Feb 26 16:16:00 crc kubenswrapper[4907]: I0226 16:16:00.158400 4907 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Feb 26 16:16:00 crc kubenswrapper[4907]: I0226 16:16:00.167976 4907 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29535376-m6v7w"] Feb 26 16:16:00 crc kubenswrapper[4907]: I0226 16:16:00.225331 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zcxpd\" (UniqueName: \"kubernetes.io/projected/346c3ab7-df78-4a2e-ae6a-a9cbdcf8bd5a-kube-api-access-zcxpd\") pod \"auto-csr-approver-29535376-m6v7w\" (UID: \"346c3ab7-df78-4a2e-ae6a-a9cbdcf8bd5a\") " pod="openshift-infra/auto-csr-approver-29535376-m6v7w" Feb 26 16:16:00 crc kubenswrapper[4907]: I0226 16:16:00.327062 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zcxpd\" (UniqueName: \"kubernetes.io/projected/346c3ab7-df78-4a2e-ae6a-a9cbdcf8bd5a-kube-api-access-zcxpd\") pod \"auto-csr-approver-29535376-m6v7w\" (UID: \"346c3ab7-df78-4a2e-ae6a-a9cbdcf8bd5a\") " pod="openshift-infra/auto-csr-approver-29535376-m6v7w" Feb 26 16:16:00 crc kubenswrapper[4907]: I0226 16:16:00.353896 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zcxpd\" (UniqueName: \"kubernetes.io/projected/346c3ab7-df78-4a2e-ae6a-a9cbdcf8bd5a-kube-api-access-zcxpd\") pod \"auto-csr-approver-29535376-m6v7w\" (UID: \"346c3ab7-df78-4a2e-ae6a-a9cbdcf8bd5a\") " 
pod="openshift-infra/auto-csr-approver-29535376-m6v7w" Feb 26 16:16:00 crc kubenswrapper[4907]: I0226 16:16:00.474071 4907 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29535376-m6v7w" Feb 26 16:16:01 crc kubenswrapper[4907]: I0226 16:16:01.007975 4907 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29535376-m6v7w"] Feb 26 16:16:01 crc kubenswrapper[4907]: I0226 16:16:01.382087 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29535376-m6v7w" event={"ID":"346c3ab7-df78-4a2e-ae6a-a9cbdcf8bd5a","Type":"ContainerStarted","Data":"f6fcf4713eab2d8ea36b1d6701e8d40f2aeccc91e3e621e24817d3c491bda8e4"} Feb 26 16:16:02 crc kubenswrapper[4907]: I0226 16:16:02.401042 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29535376-m6v7w" event={"ID":"346c3ab7-df78-4a2e-ae6a-a9cbdcf8bd5a","Type":"ContainerStarted","Data":"baddc805566013c4c6da03687ebbcaa1e817bd9c35691d103fa12cefebb69abd"} Feb 26 16:16:03 crc kubenswrapper[4907]: I0226 16:16:03.410675 4907 generic.go:334] "Generic (PLEG): container finished" podID="346c3ab7-df78-4a2e-ae6a-a9cbdcf8bd5a" containerID="baddc805566013c4c6da03687ebbcaa1e817bd9c35691d103fa12cefebb69abd" exitCode=0 Feb 26 16:16:03 crc kubenswrapper[4907]: I0226 16:16:03.410777 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29535376-m6v7w" event={"ID":"346c3ab7-df78-4a2e-ae6a-a9cbdcf8bd5a","Type":"ContainerDied","Data":"baddc805566013c4c6da03687ebbcaa1e817bd9c35691d103fa12cefebb69abd"} Feb 26 16:16:03 crc kubenswrapper[4907]: I0226 16:16:03.735486 4907 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29535376-m6v7w" Feb 26 16:16:03 crc kubenswrapper[4907]: I0226 16:16:03.795728 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zcxpd\" (UniqueName: \"kubernetes.io/projected/346c3ab7-df78-4a2e-ae6a-a9cbdcf8bd5a-kube-api-access-zcxpd\") pod \"346c3ab7-df78-4a2e-ae6a-a9cbdcf8bd5a\" (UID: \"346c3ab7-df78-4a2e-ae6a-a9cbdcf8bd5a\") " Feb 26 16:16:03 crc kubenswrapper[4907]: I0226 16:16:03.805922 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/346c3ab7-df78-4a2e-ae6a-a9cbdcf8bd5a-kube-api-access-zcxpd" (OuterVolumeSpecName: "kube-api-access-zcxpd") pod "346c3ab7-df78-4a2e-ae6a-a9cbdcf8bd5a" (UID: "346c3ab7-df78-4a2e-ae6a-a9cbdcf8bd5a"). InnerVolumeSpecName "kube-api-access-zcxpd". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 26 16:16:03 crc kubenswrapper[4907]: I0226 16:16:03.898308 4907 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zcxpd\" (UniqueName: \"kubernetes.io/projected/346c3ab7-df78-4a2e-ae6a-a9cbdcf8bd5a-kube-api-access-zcxpd\") on node \"crc\" DevicePath \"\"" Feb 26 16:16:04 crc kubenswrapper[4907]: I0226 16:16:04.420937 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29535376-m6v7w" event={"ID":"346c3ab7-df78-4a2e-ae6a-a9cbdcf8bd5a","Type":"ContainerDied","Data":"f6fcf4713eab2d8ea36b1d6701e8d40f2aeccc91e3e621e24817d3c491bda8e4"} Feb 26 16:16:04 crc kubenswrapper[4907]: I0226 16:16:04.420985 4907 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="f6fcf4713eab2d8ea36b1d6701e8d40f2aeccc91e3e621e24817d3c491bda8e4" Feb 26 16:16:04 crc kubenswrapper[4907]: I0226 16:16:04.422238 4907 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29535376-m6v7w" Feb 26 16:16:04 crc kubenswrapper[4907]: I0226 16:16:04.805944 4907 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29535370-zccfm"] Feb 26 16:16:04 crc kubenswrapper[4907]: I0226 16:16:04.813220 4907 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29535370-zccfm"] Feb 26 16:16:06 crc kubenswrapper[4907]: I0226 16:16:06.140163 4907 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5f78c496-8bc4-46c1-921d-1cdd80305a4b" path="/var/lib/kubelet/pods/5f78c496-8bc4-46c1-921d-1cdd80305a4b/volumes" Feb 26 16:16:09 crc kubenswrapper[4907]: I0226 16:16:09.866849 4907 scope.go:117] "RemoveContainer" containerID="0b6026eb615d38ba839f0ba2755147d5e3528ad58900bc626575611dfdbfdd95" Feb 26 16:16:09 crc kubenswrapper[4907]: I0226 16:16:09.898202 4907 scope.go:117] "RemoveContainer" containerID="e0735ae758d881be5c47962c25e0c5beb10fc882e59ecc8f9b0a5ceff33abcb6" Feb 26 16:16:09 crc kubenswrapper[4907]: I0226 16:16:09.948779 4907 scope.go:117] "RemoveContainer" containerID="6ebcd10ca8374898bddc79455029b3e3d94bf890a8b3f68c17e5b18a05348826" Feb 26 16:16:11 crc kubenswrapper[4907]: I0226 16:16:11.475755 4907 generic.go:334] "Generic (PLEG): container finished" podID="f7ab7062-024c-462c-99a7-4c3f6f27e471" containerID="320c25da003d82759352b5dbaae7a0eb645aaabac3422bbfe2fac5c39f12b4e7" exitCode=0 Feb 26 16:16:11 crc kubenswrapper[4907]: I0226 16:16:11.475835 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-kx4lh" event={"ID":"f7ab7062-024c-462c-99a7-4c3f6f27e471","Type":"ContainerDied","Data":"320c25da003d82759352b5dbaae7a0eb645aaabac3422bbfe2fac5c39f12b4e7"} Feb 26 16:16:12 crc kubenswrapper[4907]: I0226 16:16:12.031717 4907 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-db-create-k8vd5"] Feb 26 16:16:12 crc 
kubenswrapper[4907]: I0226 16:16:12.040948 4907 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell0-db-create-pn8lr"] Feb 26 16:16:12 crc kubenswrapper[4907]: I0226 16:16:12.049636 4907 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell0-db-create-pn8lr"] Feb 26 16:16:12 crc kubenswrapper[4907]: I0226 16:16:12.057060 4907 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-api-db-create-k8vd5"] Feb 26 16:16:12 crc kubenswrapper[4907]: I0226 16:16:12.136987 4907 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="693a0231-a18d-4141-a46f-5911644101a4" path="/var/lib/kubelet/pods/693a0231-a18d-4141-a46f-5911644101a4/volumes" Feb 26 16:16:12 crc kubenswrapper[4907]: I0226 16:16:12.137901 4907 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="835cf533-cc08-4ce6-b0e1-ed3a8a2a88ea" path="/var/lib/kubelet/pods/835cf533-cc08-4ce6-b0e1-ed3a8a2a88ea/volumes" Feb 26 16:16:12 crc kubenswrapper[4907]: I0226 16:16:12.904869 4907 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-kx4lh" Feb 26 16:16:12 crc kubenswrapper[4907]: I0226 16:16:12.972494 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-nsb8h\" (UniqueName: \"kubernetes.io/projected/f7ab7062-024c-462c-99a7-4c3f6f27e471-kube-api-access-nsb8h\") pod \"f7ab7062-024c-462c-99a7-4c3f6f27e471\" (UID: \"f7ab7062-024c-462c-99a7-4c3f6f27e471\") " Feb 26 16:16:12 crc kubenswrapper[4907]: I0226 16:16:12.972923 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/f7ab7062-024c-462c-99a7-4c3f6f27e471-ssh-key-openstack-edpm-ipam\") pod \"f7ab7062-024c-462c-99a7-4c3f6f27e471\" (UID: \"f7ab7062-024c-462c-99a7-4c3f6f27e471\") " Feb 26 16:16:12 crc kubenswrapper[4907]: I0226 16:16:12.973211 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/f7ab7062-024c-462c-99a7-4c3f6f27e471-inventory\") pod \"f7ab7062-024c-462c-99a7-4c3f6f27e471\" (UID: \"f7ab7062-024c-462c-99a7-4c3f6f27e471\") " Feb 26 16:16:12 crc kubenswrapper[4907]: I0226 16:16:12.977947 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f7ab7062-024c-462c-99a7-4c3f6f27e471-kube-api-access-nsb8h" (OuterVolumeSpecName: "kube-api-access-nsb8h") pod "f7ab7062-024c-462c-99a7-4c3f6f27e471" (UID: "f7ab7062-024c-462c-99a7-4c3f6f27e471"). InnerVolumeSpecName "kube-api-access-nsb8h". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 26 16:16:12 crc kubenswrapper[4907]: I0226 16:16:12.997118 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f7ab7062-024c-462c-99a7-4c3f6f27e471-ssh-key-openstack-edpm-ipam" (OuterVolumeSpecName: "ssh-key-openstack-edpm-ipam") pod "f7ab7062-024c-462c-99a7-4c3f6f27e471" (UID: "f7ab7062-024c-462c-99a7-4c3f6f27e471"). InnerVolumeSpecName "ssh-key-openstack-edpm-ipam". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 26 16:16:13 crc kubenswrapper[4907]: I0226 16:16:13.005092 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f7ab7062-024c-462c-99a7-4c3f6f27e471-inventory" (OuterVolumeSpecName: "inventory") pod "f7ab7062-024c-462c-99a7-4c3f6f27e471" (UID: "f7ab7062-024c-462c-99a7-4c3f6f27e471"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 26 16:16:13 crc kubenswrapper[4907]: I0226 16:16:13.034086 4907 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-2e48-account-create-update-8mvk9"] Feb 26 16:16:13 crc kubenswrapper[4907]: I0226 16:16:13.047363 4907 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-0586-account-create-update-p76kt"] Feb 26 16:16:13 crc kubenswrapper[4907]: I0226 16:16:13.060162 4907 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell1-0586-account-create-update-p76kt"] Feb 26 16:16:13 crc kubenswrapper[4907]: I0226 16:16:13.075860 4907 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-api-2e48-account-create-update-8mvk9"] Feb 26 16:16:13 crc kubenswrapper[4907]: I0226 16:16:13.075943 4907 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/f7ab7062-024c-462c-99a7-4c3f6f27e471-inventory\") on node \"crc\" DevicePath \"\"" Feb 26 16:16:13 crc kubenswrapper[4907]: I0226 16:16:13.075969 4907 
reconciler_common.go:293] "Volume detached for volume \"kube-api-access-nsb8h\" (UniqueName: \"kubernetes.io/projected/f7ab7062-024c-462c-99a7-4c3f6f27e471-kube-api-access-nsb8h\") on node \"crc\" DevicePath \"\"" Feb 26 16:16:13 crc kubenswrapper[4907]: I0226 16:16:13.075983 4907 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/f7ab7062-024c-462c-99a7-4c3f6f27e471-ssh-key-openstack-edpm-ipam\") on node \"crc\" DevicePath \"\"" Feb 26 16:16:13 crc kubenswrapper[4907]: I0226 16:16:13.492837 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-kx4lh" event={"ID":"f7ab7062-024c-462c-99a7-4c3f6f27e471","Type":"ContainerDied","Data":"a3371cba62ba3f128ea74b0f0193a62eb004c2db77b0990f2bf8cf877715bc57"} Feb 26 16:16:13 crc kubenswrapper[4907]: I0226 16:16:13.492874 4907 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-kx4lh" Feb 26 16:16:13 crc kubenswrapper[4907]: I0226 16:16:13.492876 4907 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="a3371cba62ba3f128ea74b0f0193a62eb004c2db77b0990f2bf8cf877715bc57" Feb 26 16:16:13 crc kubenswrapper[4907]: I0226 16:16:13.578780 4907 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/validate-network-edpm-deployment-openstack-edpm-ipam-blwb6"] Feb 26 16:16:13 crc kubenswrapper[4907]: E0226 16:16:13.579348 4907 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="346c3ab7-df78-4a2e-ae6a-a9cbdcf8bd5a" containerName="oc" Feb 26 16:16:13 crc kubenswrapper[4907]: I0226 16:16:13.579372 4907 state_mem.go:107] "Deleted CPUSet assignment" podUID="346c3ab7-df78-4a2e-ae6a-a9cbdcf8bd5a" containerName="oc" Feb 26 16:16:13 crc kubenswrapper[4907]: E0226 16:16:13.579395 4907 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="f7ab7062-024c-462c-99a7-4c3f6f27e471" containerName="configure-network-edpm-deployment-openstack-edpm-ipam" Feb 26 16:16:13 crc kubenswrapper[4907]: I0226 16:16:13.579405 4907 state_mem.go:107] "Deleted CPUSet assignment" podUID="f7ab7062-024c-462c-99a7-4c3f6f27e471" containerName="configure-network-edpm-deployment-openstack-edpm-ipam" Feb 26 16:16:13 crc kubenswrapper[4907]: I0226 16:16:13.579658 4907 memory_manager.go:354] "RemoveStaleState removing state" podUID="f7ab7062-024c-462c-99a7-4c3f6f27e471" containerName="configure-network-edpm-deployment-openstack-edpm-ipam" Feb 26 16:16:13 crc kubenswrapper[4907]: I0226 16:16:13.579691 4907 memory_manager.go:354] "RemoveStaleState removing state" podUID="346c3ab7-df78-4a2e-ae6a-a9cbdcf8bd5a" containerName="oc" Feb 26 16:16:13 crc kubenswrapper[4907]: I0226 16:16:13.580856 4907 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-blwb6" Feb 26 16:16:13 crc kubenswrapper[4907]: I0226 16:16:13.583961 4907 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-57jxc" Feb 26 16:16:13 crc kubenswrapper[4907]: I0226 16:16:13.584126 4907 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Feb 26 16:16:13 crc kubenswrapper[4907]: I0226 16:16:13.585481 4907 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Feb 26 16:16:13 crc kubenswrapper[4907]: I0226 16:16:13.585606 4907 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Feb 26 16:16:13 crc kubenswrapper[4907]: I0226 16:16:13.591156 4907 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/validate-network-edpm-deployment-openstack-edpm-ipam-blwb6"] Feb 26 16:16:13 crc kubenswrapper[4907]: I0226 16:16:13.687938 4907 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/93c0beb2-fc90-42f2-b4dd-f0f043cc0ede-ssh-key-openstack-edpm-ipam\") pod \"validate-network-edpm-deployment-openstack-edpm-ipam-blwb6\" (UID: \"93c0beb2-fc90-42f2-b4dd-f0f043cc0ede\") " pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-blwb6" Feb 26 16:16:13 crc kubenswrapper[4907]: I0226 16:16:13.688201 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5fx8k\" (UniqueName: \"kubernetes.io/projected/93c0beb2-fc90-42f2-b4dd-f0f043cc0ede-kube-api-access-5fx8k\") pod \"validate-network-edpm-deployment-openstack-edpm-ipam-blwb6\" (UID: \"93c0beb2-fc90-42f2-b4dd-f0f043cc0ede\") " pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-blwb6" Feb 26 16:16:13 crc kubenswrapper[4907]: I0226 16:16:13.688376 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/93c0beb2-fc90-42f2-b4dd-f0f043cc0ede-inventory\") pod \"validate-network-edpm-deployment-openstack-edpm-ipam-blwb6\" (UID: \"93c0beb2-fc90-42f2-b4dd-f0f043cc0ede\") " pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-blwb6" Feb 26 16:16:13 crc kubenswrapper[4907]: I0226 16:16:13.790286 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/93c0beb2-fc90-42f2-b4dd-f0f043cc0ede-inventory\") pod \"validate-network-edpm-deployment-openstack-edpm-ipam-blwb6\" (UID: \"93c0beb2-fc90-42f2-b4dd-f0f043cc0ede\") " pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-blwb6" Feb 26 16:16:13 crc kubenswrapper[4907]: I0226 16:16:13.790410 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: 
\"kubernetes.io/secret/93c0beb2-fc90-42f2-b4dd-f0f043cc0ede-ssh-key-openstack-edpm-ipam\") pod \"validate-network-edpm-deployment-openstack-edpm-ipam-blwb6\" (UID: \"93c0beb2-fc90-42f2-b4dd-f0f043cc0ede\") " pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-blwb6" Feb 26 16:16:13 crc kubenswrapper[4907]: I0226 16:16:13.790441 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5fx8k\" (UniqueName: \"kubernetes.io/projected/93c0beb2-fc90-42f2-b4dd-f0f043cc0ede-kube-api-access-5fx8k\") pod \"validate-network-edpm-deployment-openstack-edpm-ipam-blwb6\" (UID: \"93c0beb2-fc90-42f2-b4dd-f0f043cc0ede\") " pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-blwb6" Feb 26 16:16:13 crc kubenswrapper[4907]: I0226 16:16:13.795966 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/93c0beb2-fc90-42f2-b4dd-f0f043cc0ede-inventory\") pod \"validate-network-edpm-deployment-openstack-edpm-ipam-blwb6\" (UID: \"93c0beb2-fc90-42f2-b4dd-f0f043cc0ede\") " pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-blwb6" Feb 26 16:16:13 crc kubenswrapper[4907]: I0226 16:16:13.801110 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/93c0beb2-fc90-42f2-b4dd-f0f043cc0ede-ssh-key-openstack-edpm-ipam\") pod \"validate-network-edpm-deployment-openstack-edpm-ipam-blwb6\" (UID: \"93c0beb2-fc90-42f2-b4dd-f0f043cc0ede\") " pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-blwb6" Feb 26 16:16:13 crc kubenswrapper[4907]: I0226 16:16:13.815455 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5fx8k\" (UniqueName: \"kubernetes.io/projected/93c0beb2-fc90-42f2-b4dd-f0f043cc0ede-kube-api-access-5fx8k\") pod \"validate-network-edpm-deployment-openstack-edpm-ipam-blwb6\" (UID: 
\"93c0beb2-fc90-42f2-b4dd-f0f043cc0ede\") " pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-blwb6" Feb 26 16:16:13 crc kubenswrapper[4907]: I0226 16:16:13.934924 4907 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-blwb6" Feb 26 16:16:14 crc kubenswrapper[4907]: I0226 16:16:14.042794 4907 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-db-create-5hgql"] Feb 26 16:16:14 crc kubenswrapper[4907]: I0226 16:16:14.059280 4907 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell0-870e-account-create-update-4v7m7"] Feb 26 16:16:14 crc kubenswrapper[4907]: I0226 16:16:14.069725 4907 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell1-db-create-5hgql"] Feb 26 16:16:14 crc kubenswrapper[4907]: I0226 16:16:14.079733 4907 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell0-870e-account-create-update-4v7m7"] Feb 26 16:16:14 crc kubenswrapper[4907]: I0226 16:16:14.139182 4907 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1b61b535-465a-4786-bba7-c33c3c5672a7" path="/var/lib/kubelet/pods/1b61b535-465a-4786-bba7-c33c3c5672a7/volumes" Feb 26 16:16:14 crc kubenswrapper[4907]: I0226 16:16:14.140086 4907 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9022005c-a270-4ad2-b526-10bb125dfff3" path="/var/lib/kubelet/pods/9022005c-a270-4ad2-b526-10bb125dfff3/volumes" Feb 26 16:16:14 crc kubenswrapper[4907]: I0226 16:16:14.140759 4907 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e3b617db-d4f3-448a-b544-0cd38d51728b" path="/var/lib/kubelet/pods/e3b617db-d4f3-448a-b544-0cd38d51728b/volumes" Feb 26 16:16:14 crc kubenswrapper[4907]: I0226 16:16:14.141796 4907 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e97b768b-99a2-4a89-b88e-e5ccbbf8d23f" 
path="/var/lib/kubelet/pods/e97b768b-99a2-4a89-b88e-e5ccbbf8d23f/volumes" Feb 26 16:16:14 crc kubenswrapper[4907]: I0226 16:16:14.468771 4907 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/validate-network-edpm-deployment-openstack-edpm-ipam-blwb6"] Feb 26 16:16:14 crc kubenswrapper[4907]: I0226 16:16:14.512410 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-blwb6" event={"ID":"93c0beb2-fc90-42f2-b4dd-f0f043cc0ede","Type":"ContainerStarted","Data":"307c5a9b61a987fad1fd69d926bb54bc4ce4b85963ac1f3abbfe660b3d7b05f2"} Feb 26 16:16:15 crc kubenswrapper[4907]: I0226 16:16:15.524848 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-blwb6" event={"ID":"93c0beb2-fc90-42f2-b4dd-f0f043cc0ede","Type":"ContainerStarted","Data":"908f0905d9d7844d8bdc52f0285f095680e9fe1daa2db0d452e743bed012d919"} Feb 26 16:16:15 crc kubenswrapper[4907]: I0226 16:16:15.543703 4907 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-blwb6" podStartSLOduration=2.028402575 podStartE2EDuration="2.543686226s" podCreationTimestamp="2026-02-26 16:16:13 +0000 UTC" firstStartedPulling="2026-02-26 16:16:14.477049847 +0000 UTC m=+2036.995611696" lastFinishedPulling="2026-02-26 16:16:14.992333498 +0000 UTC m=+2037.510895347" observedRunningTime="2026-02-26 16:16:15.539071833 +0000 UTC m=+2038.057633682" watchObservedRunningTime="2026-02-26 16:16:15.543686226 +0000 UTC m=+2038.062248075" Feb 26 16:16:18 crc kubenswrapper[4907]: I0226 16:16:18.530239 4907 patch_prober.go:28] interesting pod/machine-config-daemon-v5ng6 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 26 16:16:18 crc 
kubenswrapper[4907]: I0226 16:16:18.530824 4907 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-v5ng6" podUID="917eebf3-db36-47b8-af0a-b80d042fddab" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 26 16:16:19 crc kubenswrapper[4907]: I0226 16:16:19.560062 4907 generic.go:334] "Generic (PLEG): container finished" podID="93c0beb2-fc90-42f2-b4dd-f0f043cc0ede" containerID="908f0905d9d7844d8bdc52f0285f095680e9fe1daa2db0d452e743bed012d919" exitCode=0 Feb 26 16:16:19 crc kubenswrapper[4907]: I0226 16:16:19.560125 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-blwb6" event={"ID":"93c0beb2-fc90-42f2-b4dd-f0f043cc0ede","Type":"ContainerDied","Data":"908f0905d9d7844d8bdc52f0285f095680e9fe1daa2db0d452e743bed012d919"} Feb 26 16:16:20 crc kubenswrapper[4907]: I0226 16:16:20.959480 4907 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-blwb6" Feb 26 16:16:21 crc kubenswrapper[4907]: I0226 16:16:21.030348 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/93c0beb2-fc90-42f2-b4dd-f0f043cc0ede-ssh-key-openstack-edpm-ipam\") pod \"93c0beb2-fc90-42f2-b4dd-f0f043cc0ede\" (UID: \"93c0beb2-fc90-42f2-b4dd-f0f043cc0ede\") " Feb 26 16:16:21 crc kubenswrapper[4907]: I0226 16:16:21.030401 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/93c0beb2-fc90-42f2-b4dd-f0f043cc0ede-inventory\") pod \"93c0beb2-fc90-42f2-b4dd-f0f043cc0ede\" (UID: \"93c0beb2-fc90-42f2-b4dd-f0f043cc0ede\") " Feb 26 16:16:21 crc kubenswrapper[4907]: I0226 16:16:21.030485 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-5fx8k\" (UniqueName: \"kubernetes.io/projected/93c0beb2-fc90-42f2-b4dd-f0f043cc0ede-kube-api-access-5fx8k\") pod \"93c0beb2-fc90-42f2-b4dd-f0f043cc0ede\" (UID: \"93c0beb2-fc90-42f2-b4dd-f0f043cc0ede\") " Feb 26 16:16:21 crc kubenswrapper[4907]: I0226 16:16:21.035615 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/93c0beb2-fc90-42f2-b4dd-f0f043cc0ede-kube-api-access-5fx8k" (OuterVolumeSpecName: "kube-api-access-5fx8k") pod "93c0beb2-fc90-42f2-b4dd-f0f043cc0ede" (UID: "93c0beb2-fc90-42f2-b4dd-f0f043cc0ede"). InnerVolumeSpecName "kube-api-access-5fx8k". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 26 16:16:21 crc kubenswrapper[4907]: I0226 16:16:21.061960 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/93c0beb2-fc90-42f2-b4dd-f0f043cc0ede-inventory" (OuterVolumeSpecName: "inventory") pod "93c0beb2-fc90-42f2-b4dd-f0f043cc0ede" (UID: "93c0beb2-fc90-42f2-b4dd-f0f043cc0ede"). 
InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 26 16:16:21 crc kubenswrapper[4907]: I0226 16:16:21.062813 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/93c0beb2-fc90-42f2-b4dd-f0f043cc0ede-ssh-key-openstack-edpm-ipam" (OuterVolumeSpecName: "ssh-key-openstack-edpm-ipam") pod "93c0beb2-fc90-42f2-b4dd-f0f043cc0ede" (UID: "93c0beb2-fc90-42f2-b4dd-f0f043cc0ede"). InnerVolumeSpecName "ssh-key-openstack-edpm-ipam". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 26 16:16:21 crc kubenswrapper[4907]: I0226 16:16:21.134881 4907 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/93c0beb2-fc90-42f2-b4dd-f0f043cc0ede-ssh-key-openstack-edpm-ipam\") on node \"crc\" DevicePath \"\"" Feb 26 16:16:21 crc kubenswrapper[4907]: I0226 16:16:21.134918 4907 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/93c0beb2-fc90-42f2-b4dd-f0f043cc0ede-inventory\") on node \"crc\" DevicePath \"\"" Feb 26 16:16:21 crc kubenswrapper[4907]: I0226 16:16:21.134927 4907 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-5fx8k\" (UniqueName: \"kubernetes.io/projected/93c0beb2-fc90-42f2-b4dd-f0f043cc0ede-kube-api-access-5fx8k\") on node \"crc\" DevicePath \"\"" Feb 26 16:16:21 crc kubenswrapper[4907]: I0226 16:16:21.578932 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-blwb6" event={"ID":"93c0beb2-fc90-42f2-b4dd-f0f043cc0ede","Type":"ContainerDied","Data":"307c5a9b61a987fad1fd69d926bb54bc4ce4b85963ac1f3abbfe660b3d7b05f2"} Feb 26 16:16:21 crc kubenswrapper[4907]: I0226 16:16:21.578974 4907 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="307c5a9b61a987fad1fd69d926bb54bc4ce4b85963ac1f3abbfe660b3d7b05f2" Feb 26 16:16:21 crc kubenswrapper[4907]: I0226 
16:16:21.579045 4907 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-blwb6" Feb 26 16:16:21 crc kubenswrapper[4907]: I0226 16:16:21.673241 4907 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/install-os-edpm-deployment-openstack-edpm-ipam-tmgdw"] Feb 26 16:16:21 crc kubenswrapper[4907]: E0226 16:16:21.673668 4907 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="93c0beb2-fc90-42f2-b4dd-f0f043cc0ede" containerName="validate-network-edpm-deployment-openstack-edpm-ipam" Feb 26 16:16:21 crc kubenswrapper[4907]: I0226 16:16:21.673686 4907 state_mem.go:107] "Deleted CPUSet assignment" podUID="93c0beb2-fc90-42f2-b4dd-f0f043cc0ede" containerName="validate-network-edpm-deployment-openstack-edpm-ipam" Feb 26 16:16:21 crc kubenswrapper[4907]: I0226 16:16:21.673846 4907 memory_manager.go:354] "RemoveStaleState removing state" podUID="93c0beb2-fc90-42f2-b4dd-f0f043cc0ede" containerName="validate-network-edpm-deployment-openstack-edpm-ipam" Feb 26 16:16:21 crc kubenswrapper[4907]: I0226 16:16:21.674523 4907 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-tmgdw" Feb 26 16:16:21 crc kubenswrapper[4907]: I0226 16:16:21.677186 4907 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Feb 26 16:16:21 crc kubenswrapper[4907]: I0226 16:16:21.677300 4907 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Feb 26 16:16:21 crc kubenswrapper[4907]: I0226 16:16:21.677410 4907 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Feb 26 16:16:21 crc kubenswrapper[4907]: I0226 16:16:21.684382 4907 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-57jxc" Feb 26 16:16:21 crc kubenswrapper[4907]: I0226 16:16:21.689050 4907 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/install-os-edpm-deployment-openstack-edpm-ipam-tmgdw"] Feb 26 16:16:21 crc kubenswrapper[4907]: I0226 16:16:21.747477 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bq6pb\" (UniqueName: \"kubernetes.io/projected/b47b4d79-5f18-4d3d-8263-21fb9b0d31b3-kube-api-access-bq6pb\") pod \"install-os-edpm-deployment-openstack-edpm-ipam-tmgdw\" (UID: \"b47b4d79-5f18-4d3d-8263-21fb9b0d31b3\") " pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-tmgdw" Feb 26 16:16:21 crc kubenswrapper[4907]: I0226 16:16:21.747553 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/b47b4d79-5f18-4d3d-8263-21fb9b0d31b3-inventory\") pod \"install-os-edpm-deployment-openstack-edpm-ipam-tmgdw\" (UID: \"b47b4d79-5f18-4d3d-8263-21fb9b0d31b3\") " pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-tmgdw" Feb 26 16:16:21 crc kubenswrapper[4907]: I0226 16:16:21.747690 4907 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/b47b4d79-5f18-4d3d-8263-21fb9b0d31b3-ssh-key-openstack-edpm-ipam\") pod \"install-os-edpm-deployment-openstack-edpm-ipam-tmgdw\" (UID: \"b47b4d79-5f18-4d3d-8263-21fb9b0d31b3\") " pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-tmgdw" Feb 26 16:16:21 crc kubenswrapper[4907]: I0226 16:16:21.849833 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/b47b4d79-5f18-4d3d-8263-21fb9b0d31b3-ssh-key-openstack-edpm-ipam\") pod \"install-os-edpm-deployment-openstack-edpm-ipam-tmgdw\" (UID: \"b47b4d79-5f18-4d3d-8263-21fb9b0d31b3\") " pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-tmgdw" Feb 26 16:16:21 crc kubenswrapper[4907]: I0226 16:16:21.850000 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bq6pb\" (UniqueName: \"kubernetes.io/projected/b47b4d79-5f18-4d3d-8263-21fb9b0d31b3-kube-api-access-bq6pb\") pod \"install-os-edpm-deployment-openstack-edpm-ipam-tmgdw\" (UID: \"b47b4d79-5f18-4d3d-8263-21fb9b0d31b3\") " pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-tmgdw" Feb 26 16:16:21 crc kubenswrapper[4907]: I0226 16:16:21.850042 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/b47b4d79-5f18-4d3d-8263-21fb9b0d31b3-inventory\") pod \"install-os-edpm-deployment-openstack-edpm-ipam-tmgdw\" (UID: \"b47b4d79-5f18-4d3d-8263-21fb9b0d31b3\") " pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-tmgdw" Feb 26 16:16:21 crc kubenswrapper[4907]: I0226 16:16:21.857162 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/b47b4d79-5f18-4d3d-8263-21fb9b0d31b3-inventory\") pod 
\"install-os-edpm-deployment-openstack-edpm-ipam-tmgdw\" (UID: \"b47b4d79-5f18-4d3d-8263-21fb9b0d31b3\") " pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-tmgdw" Feb 26 16:16:21 crc kubenswrapper[4907]: I0226 16:16:21.857650 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/b47b4d79-5f18-4d3d-8263-21fb9b0d31b3-ssh-key-openstack-edpm-ipam\") pod \"install-os-edpm-deployment-openstack-edpm-ipam-tmgdw\" (UID: \"b47b4d79-5f18-4d3d-8263-21fb9b0d31b3\") " pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-tmgdw" Feb 26 16:16:21 crc kubenswrapper[4907]: I0226 16:16:21.870784 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bq6pb\" (UniqueName: \"kubernetes.io/projected/b47b4d79-5f18-4d3d-8263-21fb9b0d31b3-kube-api-access-bq6pb\") pod \"install-os-edpm-deployment-openstack-edpm-ipam-tmgdw\" (UID: \"b47b4d79-5f18-4d3d-8263-21fb9b0d31b3\") " pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-tmgdw" Feb 26 16:16:22 crc kubenswrapper[4907]: I0226 16:16:22.044537 4907 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-tmgdw" Feb 26 16:16:22 crc kubenswrapper[4907]: I0226 16:16:22.553147 4907 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/install-os-edpm-deployment-openstack-edpm-ipam-tmgdw"] Feb 26 16:16:22 crc kubenswrapper[4907]: I0226 16:16:22.588048 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-tmgdw" event={"ID":"b47b4d79-5f18-4d3d-8263-21fb9b0d31b3","Type":"ContainerStarted","Data":"ce1f50fd0bd28093df7c2bcfbaa61757d50eb02edd9aa7ecb0a9b5d6c24624ea"} Feb 26 16:16:23 crc kubenswrapper[4907]: I0226 16:16:23.603243 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-tmgdw" event={"ID":"b47b4d79-5f18-4d3d-8263-21fb9b0d31b3","Type":"ContainerStarted","Data":"ed53d5a52c2334e03e4e183c65075954bca517ac812fbaf3516322e0506c7d51"} Feb 26 16:16:23 crc kubenswrapper[4907]: I0226 16:16:23.630023 4907 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-tmgdw" podStartSLOduration=2.229317505 podStartE2EDuration="2.630003496s" podCreationTimestamp="2026-02-26 16:16:21 +0000 UTC" firstStartedPulling="2026-02-26 16:16:22.561165592 +0000 UTC m=+2045.079727441" lastFinishedPulling="2026-02-26 16:16:22.961851583 +0000 UTC m=+2045.480413432" observedRunningTime="2026-02-26 16:16:23.621013335 +0000 UTC m=+2046.139575214" watchObservedRunningTime="2026-02-26 16:16:23.630003496 +0000 UTC m=+2046.148565345" Feb 26 16:16:48 crc kubenswrapper[4907]: I0226 16:16:48.529865 4907 patch_prober.go:28] interesting pod/machine-config-daemon-v5ng6 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 26 16:16:48 crc 
kubenswrapper[4907]: I0226 16:16:48.530484 4907 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-v5ng6" podUID="917eebf3-db36-47b8-af0a-b80d042fddab" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 26 16:16:55 crc kubenswrapper[4907]: I0226 16:16:55.048971 4907 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell0-conductor-db-sync-4b5rh"] Feb 26 16:16:55 crc kubenswrapper[4907]: I0226 16:16:55.062475 4907 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell0-conductor-db-sync-4b5rh"] Feb 26 16:16:56 crc kubenswrapper[4907]: I0226 16:16:56.150040 4907 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4f76f68d-7dee-4f14-9fb1-e943db5b0533" path="/var/lib/kubelet/pods/4f76f68d-7dee-4f14-9fb1-e943db5b0533/volumes" Feb 26 16:17:02 crc kubenswrapper[4907]: I0226 16:17:02.005055 4907 generic.go:334] "Generic (PLEG): container finished" podID="b47b4d79-5f18-4d3d-8263-21fb9b0d31b3" containerID="ed53d5a52c2334e03e4e183c65075954bca517ac812fbaf3516322e0506c7d51" exitCode=0 Feb 26 16:17:02 crc kubenswrapper[4907]: I0226 16:17:02.005146 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-tmgdw" event={"ID":"b47b4d79-5f18-4d3d-8263-21fb9b0d31b3","Type":"ContainerDied","Data":"ed53d5a52c2334e03e4e183c65075954bca517ac812fbaf3516322e0506c7d51"} Feb 26 16:17:03 crc kubenswrapper[4907]: I0226 16:17:03.459467 4907 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-tmgdw" Feb 26 16:17:03 crc kubenswrapper[4907]: I0226 16:17:03.640431 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/b47b4d79-5f18-4d3d-8263-21fb9b0d31b3-inventory\") pod \"b47b4d79-5f18-4d3d-8263-21fb9b0d31b3\" (UID: \"b47b4d79-5f18-4d3d-8263-21fb9b0d31b3\") " Feb 26 16:17:03 crc kubenswrapper[4907]: I0226 16:17:03.640539 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/b47b4d79-5f18-4d3d-8263-21fb9b0d31b3-ssh-key-openstack-edpm-ipam\") pod \"b47b4d79-5f18-4d3d-8263-21fb9b0d31b3\" (UID: \"b47b4d79-5f18-4d3d-8263-21fb9b0d31b3\") " Feb 26 16:17:03 crc kubenswrapper[4907]: I0226 16:17:03.640772 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-bq6pb\" (UniqueName: \"kubernetes.io/projected/b47b4d79-5f18-4d3d-8263-21fb9b0d31b3-kube-api-access-bq6pb\") pod \"b47b4d79-5f18-4d3d-8263-21fb9b0d31b3\" (UID: \"b47b4d79-5f18-4d3d-8263-21fb9b0d31b3\") " Feb 26 16:17:03 crc kubenswrapper[4907]: I0226 16:17:03.649999 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b47b4d79-5f18-4d3d-8263-21fb9b0d31b3-kube-api-access-bq6pb" (OuterVolumeSpecName: "kube-api-access-bq6pb") pod "b47b4d79-5f18-4d3d-8263-21fb9b0d31b3" (UID: "b47b4d79-5f18-4d3d-8263-21fb9b0d31b3"). InnerVolumeSpecName "kube-api-access-bq6pb". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 26 16:17:03 crc kubenswrapper[4907]: I0226 16:17:03.674021 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b47b4d79-5f18-4d3d-8263-21fb9b0d31b3-ssh-key-openstack-edpm-ipam" (OuterVolumeSpecName: "ssh-key-openstack-edpm-ipam") pod "b47b4d79-5f18-4d3d-8263-21fb9b0d31b3" (UID: "b47b4d79-5f18-4d3d-8263-21fb9b0d31b3"). InnerVolumeSpecName "ssh-key-openstack-edpm-ipam". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 26 16:17:03 crc kubenswrapper[4907]: I0226 16:17:03.677131 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b47b4d79-5f18-4d3d-8263-21fb9b0d31b3-inventory" (OuterVolumeSpecName: "inventory") pod "b47b4d79-5f18-4d3d-8263-21fb9b0d31b3" (UID: "b47b4d79-5f18-4d3d-8263-21fb9b0d31b3"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 26 16:17:03 crc kubenswrapper[4907]: I0226 16:17:03.743440 4907 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-bq6pb\" (UniqueName: \"kubernetes.io/projected/b47b4d79-5f18-4d3d-8263-21fb9b0d31b3-kube-api-access-bq6pb\") on node \"crc\" DevicePath \"\"" Feb 26 16:17:03 crc kubenswrapper[4907]: I0226 16:17:03.743506 4907 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/b47b4d79-5f18-4d3d-8263-21fb9b0d31b3-inventory\") on node \"crc\" DevicePath \"\"" Feb 26 16:17:03 crc kubenswrapper[4907]: I0226 16:17:03.743535 4907 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/b47b4d79-5f18-4d3d-8263-21fb9b0d31b3-ssh-key-openstack-edpm-ipam\") on node \"crc\" DevicePath \"\"" Feb 26 16:17:04 crc kubenswrapper[4907]: I0226 16:17:04.023526 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-tmgdw" 
event={"ID":"b47b4d79-5f18-4d3d-8263-21fb9b0d31b3","Type":"ContainerDied","Data":"ce1f50fd0bd28093df7c2bcfbaa61757d50eb02edd9aa7ecb0a9b5d6c24624ea"} Feb 26 16:17:04 crc kubenswrapper[4907]: I0226 16:17:04.023792 4907 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="ce1f50fd0bd28093df7c2bcfbaa61757d50eb02edd9aa7ecb0a9b5d6c24624ea" Feb 26 16:17:04 crc kubenswrapper[4907]: I0226 16:17:04.023570 4907 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-tmgdw" Feb 26 16:17:04 crc kubenswrapper[4907]: I0226 16:17:04.147726 4907 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/configure-os-edpm-deployment-openstack-edpm-ipam-sf9ts"] Feb 26 16:17:04 crc kubenswrapper[4907]: E0226 16:17:04.148119 4907 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b47b4d79-5f18-4d3d-8263-21fb9b0d31b3" containerName="install-os-edpm-deployment-openstack-edpm-ipam" Feb 26 16:17:04 crc kubenswrapper[4907]: I0226 16:17:04.148141 4907 state_mem.go:107] "Deleted CPUSet assignment" podUID="b47b4d79-5f18-4d3d-8263-21fb9b0d31b3" containerName="install-os-edpm-deployment-openstack-edpm-ipam" Feb 26 16:17:04 crc kubenswrapper[4907]: I0226 16:17:04.148402 4907 memory_manager.go:354] "RemoveStaleState removing state" podUID="b47b4d79-5f18-4d3d-8263-21fb9b0d31b3" containerName="install-os-edpm-deployment-openstack-edpm-ipam" Feb 26 16:17:04 crc kubenswrapper[4907]: I0226 16:17:04.149173 4907 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-sf9ts" Feb 26 16:17:04 crc kubenswrapper[4907]: I0226 16:17:04.150925 4907 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Feb 26 16:17:04 crc kubenswrapper[4907]: I0226 16:17:04.151195 4907 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Feb 26 16:17:04 crc kubenswrapper[4907]: I0226 16:17:04.151456 4907 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-57jxc" Feb 26 16:17:04 crc kubenswrapper[4907]: I0226 16:17:04.151693 4907 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Feb 26 16:17:04 crc kubenswrapper[4907]: I0226 16:17:04.163131 4907 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/configure-os-edpm-deployment-openstack-edpm-ipam-sf9ts"] Feb 26 16:17:04 crc kubenswrapper[4907]: I0226 16:17:04.254603 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-z8tn4\" (UniqueName: \"kubernetes.io/projected/e9a87b6e-5a0f-4201-b6d1-a1cd0d224361-kube-api-access-z8tn4\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-sf9ts\" (UID: \"e9a87b6e-5a0f-4201-b6d1-a1cd0d224361\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-sf9ts" Feb 26 16:17:04 crc kubenswrapper[4907]: I0226 16:17:04.254701 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/e9a87b6e-5a0f-4201-b6d1-a1cd0d224361-inventory\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-sf9ts\" (UID: \"e9a87b6e-5a0f-4201-b6d1-a1cd0d224361\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-sf9ts" Feb 26 16:17:04 crc kubenswrapper[4907]: I0226 16:17:04.254922 4907 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/e9a87b6e-5a0f-4201-b6d1-a1cd0d224361-ssh-key-openstack-edpm-ipam\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-sf9ts\" (UID: \"e9a87b6e-5a0f-4201-b6d1-a1cd0d224361\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-sf9ts" Feb 26 16:17:04 crc kubenswrapper[4907]: I0226 16:17:04.357064 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/e9a87b6e-5a0f-4201-b6d1-a1cd0d224361-inventory\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-sf9ts\" (UID: \"e9a87b6e-5a0f-4201-b6d1-a1cd0d224361\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-sf9ts" Feb 26 16:17:04 crc kubenswrapper[4907]: I0226 16:17:04.357410 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/e9a87b6e-5a0f-4201-b6d1-a1cd0d224361-ssh-key-openstack-edpm-ipam\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-sf9ts\" (UID: \"e9a87b6e-5a0f-4201-b6d1-a1cd0d224361\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-sf9ts" Feb 26 16:17:04 crc kubenswrapper[4907]: I0226 16:17:04.357550 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-z8tn4\" (UniqueName: \"kubernetes.io/projected/e9a87b6e-5a0f-4201-b6d1-a1cd0d224361-kube-api-access-z8tn4\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-sf9ts\" (UID: \"e9a87b6e-5a0f-4201-b6d1-a1cd0d224361\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-sf9ts" Feb 26 16:17:04 crc kubenswrapper[4907]: I0226 16:17:04.361730 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/e9a87b6e-5a0f-4201-b6d1-a1cd0d224361-inventory\") 
pod \"configure-os-edpm-deployment-openstack-edpm-ipam-sf9ts\" (UID: \"e9a87b6e-5a0f-4201-b6d1-a1cd0d224361\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-sf9ts" Feb 26 16:17:04 crc kubenswrapper[4907]: I0226 16:17:04.364106 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/e9a87b6e-5a0f-4201-b6d1-a1cd0d224361-ssh-key-openstack-edpm-ipam\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-sf9ts\" (UID: \"e9a87b6e-5a0f-4201-b6d1-a1cd0d224361\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-sf9ts" Feb 26 16:17:04 crc kubenswrapper[4907]: I0226 16:17:04.377622 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-z8tn4\" (UniqueName: \"kubernetes.io/projected/e9a87b6e-5a0f-4201-b6d1-a1cd0d224361-kube-api-access-z8tn4\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-sf9ts\" (UID: \"e9a87b6e-5a0f-4201-b6d1-a1cd0d224361\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-sf9ts" Feb 26 16:17:04 crc kubenswrapper[4907]: I0226 16:17:04.470119 4907 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-sf9ts" Feb 26 16:17:05 crc kubenswrapper[4907]: I0226 16:17:05.040170 4907 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/configure-os-edpm-deployment-openstack-edpm-ipam-sf9ts"] Feb 26 16:17:06 crc kubenswrapper[4907]: I0226 16:17:06.053512 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-sf9ts" event={"ID":"e9a87b6e-5a0f-4201-b6d1-a1cd0d224361","Type":"ContainerStarted","Data":"28d6969f3a321ee7d3db251dd642f9037ee85ea7e06fce9cab06871de3360f25"} Feb 26 16:17:06 crc kubenswrapper[4907]: I0226 16:17:06.053970 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-sf9ts" event={"ID":"e9a87b6e-5a0f-4201-b6d1-a1cd0d224361","Type":"ContainerStarted","Data":"96864ad5155aa8582cc6c321d173b4f3e84b1edd195d36fc5f75c316662318f0"} Feb 26 16:17:06 crc kubenswrapper[4907]: I0226 16:17:06.077354 4907 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-sf9ts" podStartSLOduration=1.506363728 podStartE2EDuration="2.077329285s" podCreationTimestamp="2026-02-26 16:17:04 +0000 UTC" firstStartedPulling="2026-02-26 16:17:05.04911816 +0000 UTC m=+2087.567680009" lastFinishedPulling="2026-02-26 16:17:05.620083717 +0000 UTC m=+2088.138645566" observedRunningTime="2026-02-26 16:17:06.066698844 +0000 UTC m=+2088.585260693" watchObservedRunningTime="2026-02-26 16:17:06.077329285 +0000 UTC m=+2088.595891134" Feb 26 16:17:10 crc kubenswrapper[4907]: I0226 16:17:10.096298 4907 scope.go:117] "RemoveContainer" containerID="ece086edb6b098d7879ce0ac6c6001c702f186355bc8ed7b7ba91efeddeeb86c" Feb 26 16:17:10 crc kubenswrapper[4907]: I0226 16:17:10.124009 4907 scope.go:117] "RemoveContainer" containerID="42272f3871ae911865b015890b9c14369f0fb3712a8201d406df6721ba5053ed" Feb 26 16:17:10 crc 
kubenswrapper[4907]: I0226 16:17:10.161154 4907 scope.go:117] "RemoveContainer" containerID="b2ca0f3286f6aeae72b4c27b418e941b8810b0b4f88bfdc3d64534da2f76ef0a" Feb 26 16:17:10 crc kubenswrapper[4907]: I0226 16:17:10.212975 4907 scope.go:117] "RemoveContainer" containerID="31c89ce7138a9b38c8adebdce017bebb3b83fb10bd86f4d1277c6e732265a1f8" Feb 26 16:17:10 crc kubenswrapper[4907]: I0226 16:17:10.251572 4907 scope.go:117] "RemoveContainer" containerID="be57f9925a937096f9d5f1073c8e249f92a0cdc53e6b154b95350aeea9a8f037" Feb 26 16:17:10 crc kubenswrapper[4907]: I0226 16:17:10.283418 4907 scope.go:117] "RemoveContainer" containerID="5ba3eca10926a5a21a0d77adac5eccd592a1bba624a8872656821c5c6390f8ab" Feb 26 16:17:10 crc kubenswrapper[4907]: I0226 16:17:10.323238 4907 scope.go:117] "RemoveContainer" containerID="58c078032ab12fb83c6311cbdcdc3b8ee96d3cfbe4c26e379f2a6f7f7387574a" Feb 26 16:17:18 crc kubenswrapper[4907]: I0226 16:17:18.529977 4907 patch_prober.go:28] interesting pod/machine-config-daemon-v5ng6 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 26 16:17:18 crc kubenswrapper[4907]: I0226 16:17:18.530558 4907 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-v5ng6" podUID="917eebf3-db36-47b8-af0a-b80d042fddab" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 26 16:17:18 crc kubenswrapper[4907]: I0226 16:17:18.530631 4907 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-v5ng6" Feb 26 16:17:18 crc kubenswrapper[4907]: I0226 16:17:18.531430 4907 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" 
containerStatusID={"Type":"cri-o","ID":"eeafebf90768294d93b5a754d4be3f7e7e83781774c84e4b268744314a564bb2"} pod="openshift-machine-config-operator/machine-config-daemon-v5ng6" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Feb 26 16:17:18 crc kubenswrapper[4907]: I0226 16:17:18.531485 4907 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-v5ng6" podUID="917eebf3-db36-47b8-af0a-b80d042fddab" containerName="machine-config-daemon" containerID="cri-o://eeafebf90768294d93b5a754d4be3f7e7e83781774c84e4b268744314a564bb2" gracePeriod=600 Feb 26 16:17:19 crc kubenswrapper[4907]: I0226 16:17:19.181606 4907 generic.go:334] "Generic (PLEG): container finished" podID="917eebf3-db36-47b8-af0a-b80d042fddab" containerID="eeafebf90768294d93b5a754d4be3f7e7e83781774c84e4b268744314a564bb2" exitCode=0 Feb 26 16:17:19 crc kubenswrapper[4907]: I0226 16:17:19.181636 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-v5ng6" event={"ID":"917eebf3-db36-47b8-af0a-b80d042fddab","Type":"ContainerDied","Data":"eeafebf90768294d93b5a754d4be3f7e7e83781774c84e4b268744314a564bb2"} Feb 26 16:17:19 crc kubenswrapper[4907]: I0226 16:17:19.181995 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-v5ng6" event={"ID":"917eebf3-db36-47b8-af0a-b80d042fddab","Type":"ContainerStarted","Data":"559a23ccedd7d10bc357288f1f6efb2cbce2fbfb3e7c59e80318d9e7c716e085"} Feb 26 16:17:19 crc kubenswrapper[4907]: I0226 16:17:19.182020 4907 scope.go:117] "RemoveContainer" containerID="b46bef3acd92cfa3cb8f5894a729a1bb1795fbc69b7b7c5835186a0b609a6e46" Feb 26 16:17:20 crc kubenswrapper[4907]: I0226 16:17:20.063058 4907 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell0-cell-mapping-6bjsz"] Feb 26 16:17:20 crc kubenswrapper[4907]: I0226 16:17:20.076969 
4907 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell0-cell-mapping-6bjsz"] Feb 26 16:17:20 crc kubenswrapper[4907]: I0226 16:17:20.137546 4907 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="495e976f-e5c3-4fe4-9a08-12e01970b48d" path="/var/lib/kubelet/pods/495e976f-e5c3-4fe4-9a08-12e01970b48d/volumes" Feb 26 16:17:25 crc kubenswrapper[4907]: I0226 16:17:25.026848 4907 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-conductor-db-sync-l2rsl"] Feb 26 16:17:25 crc kubenswrapper[4907]: I0226 16:17:25.034581 4907 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell1-conductor-db-sync-l2rsl"] Feb 26 16:17:26 crc kubenswrapper[4907]: I0226 16:17:26.143951 4907 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e85bd7a8-59b8-4eca-a1d6-2824c3e44dd2" path="/var/lib/kubelet/pods/e85bd7a8-59b8-4eca-a1d6-2824c3e44dd2/volumes" Feb 26 16:17:54 crc kubenswrapper[4907]: I0226 16:17:54.639065 4907 generic.go:334] "Generic (PLEG): container finished" podID="e9a87b6e-5a0f-4201-b6d1-a1cd0d224361" containerID="28d6969f3a321ee7d3db251dd642f9037ee85ea7e06fce9cab06871de3360f25" exitCode=0 Feb 26 16:17:54 crc kubenswrapper[4907]: I0226 16:17:54.639841 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-sf9ts" event={"ID":"e9a87b6e-5a0f-4201-b6d1-a1cd0d224361","Type":"ContainerDied","Data":"28d6969f3a321ee7d3db251dd642f9037ee85ea7e06fce9cab06871de3360f25"} Feb 26 16:17:56 crc kubenswrapper[4907]: I0226 16:17:56.053744 4907 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-sf9ts" Feb 26 16:17:56 crc kubenswrapper[4907]: I0226 16:17:56.228622 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/e9a87b6e-5a0f-4201-b6d1-a1cd0d224361-ssh-key-openstack-edpm-ipam\") pod \"e9a87b6e-5a0f-4201-b6d1-a1cd0d224361\" (UID: \"e9a87b6e-5a0f-4201-b6d1-a1cd0d224361\") " Feb 26 16:17:56 crc kubenswrapper[4907]: I0226 16:17:56.228731 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/e9a87b6e-5a0f-4201-b6d1-a1cd0d224361-inventory\") pod \"e9a87b6e-5a0f-4201-b6d1-a1cd0d224361\" (UID: \"e9a87b6e-5a0f-4201-b6d1-a1cd0d224361\") " Feb 26 16:17:56 crc kubenswrapper[4907]: I0226 16:17:56.228846 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-z8tn4\" (UniqueName: \"kubernetes.io/projected/e9a87b6e-5a0f-4201-b6d1-a1cd0d224361-kube-api-access-z8tn4\") pod \"e9a87b6e-5a0f-4201-b6d1-a1cd0d224361\" (UID: \"e9a87b6e-5a0f-4201-b6d1-a1cd0d224361\") " Feb 26 16:17:56 crc kubenswrapper[4907]: I0226 16:17:56.235360 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e9a87b6e-5a0f-4201-b6d1-a1cd0d224361-kube-api-access-z8tn4" (OuterVolumeSpecName: "kube-api-access-z8tn4") pod "e9a87b6e-5a0f-4201-b6d1-a1cd0d224361" (UID: "e9a87b6e-5a0f-4201-b6d1-a1cd0d224361"). InnerVolumeSpecName "kube-api-access-z8tn4". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 26 16:17:56 crc kubenswrapper[4907]: I0226 16:17:56.257091 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e9a87b6e-5a0f-4201-b6d1-a1cd0d224361-inventory" (OuterVolumeSpecName: "inventory") pod "e9a87b6e-5a0f-4201-b6d1-a1cd0d224361" (UID: "e9a87b6e-5a0f-4201-b6d1-a1cd0d224361"). 
InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 26 16:17:56 crc kubenswrapper[4907]: I0226 16:17:56.258957 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e9a87b6e-5a0f-4201-b6d1-a1cd0d224361-ssh-key-openstack-edpm-ipam" (OuterVolumeSpecName: "ssh-key-openstack-edpm-ipam") pod "e9a87b6e-5a0f-4201-b6d1-a1cd0d224361" (UID: "e9a87b6e-5a0f-4201-b6d1-a1cd0d224361"). InnerVolumeSpecName "ssh-key-openstack-edpm-ipam". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 26 16:17:56 crc kubenswrapper[4907]: I0226 16:17:56.330837 4907 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/e9a87b6e-5a0f-4201-b6d1-a1cd0d224361-inventory\") on node \"crc\" DevicePath \"\"" Feb 26 16:17:56 crc kubenswrapper[4907]: I0226 16:17:56.330875 4907 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-z8tn4\" (UniqueName: \"kubernetes.io/projected/e9a87b6e-5a0f-4201-b6d1-a1cd0d224361-kube-api-access-z8tn4\") on node \"crc\" DevicePath \"\"" Feb 26 16:17:56 crc kubenswrapper[4907]: I0226 16:17:56.330889 4907 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/e9a87b6e-5a0f-4201-b6d1-a1cd0d224361-ssh-key-openstack-edpm-ipam\") on node \"crc\" DevicePath \"\"" Feb 26 16:17:56 crc kubenswrapper[4907]: I0226 16:17:56.658234 4907 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-sf9ts" Feb 26 16:17:56 crc kubenswrapper[4907]: I0226 16:17:56.658216 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-sf9ts" event={"ID":"e9a87b6e-5a0f-4201-b6d1-a1cd0d224361","Type":"ContainerDied","Data":"96864ad5155aa8582cc6c321d173b4f3e84b1edd195d36fc5f75c316662318f0"} Feb 26 16:17:56 crc kubenswrapper[4907]: I0226 16:17:56.658430 4907 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="96864ad5155aa8582cc6c321d173b4f3e84b1edd195d36fc5f75c316662318f0" Feb 26 16:17:56 crc kubenswrapper[4907]: I0226 16:17:56.736546 4907 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ssh-known-hosts-edpm-deployment-skfk6"] Feb 26 16:17:56 crc kubenswrapper[4907]: E0226 16:17:56.736943 4907 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e9a87b6e-5a0f-4201-b6d1-a1cd0d224361" containerName="configure-os-edpm-deployment-openstack-edpm-ipam" Feb 26 16:17:56 crc kubenswrapper[4907]: I0226 16:17:56.736965 4907 state_mem.go:107] "Deleted CPUSet assignment" podUID="e9a87b6e-5a0f-4201-b6d1-a1cd0d224361" containerName="configure-os-edpm-deployment-openstack-edpm-ipam" Feb 26 16:17:56 crc kubenswrapper[4907]: I0226 16:17:56.737152 4907 memory_manager.go:354] "RemoveStaleState removing state" podUID="e9a87b6e-5a0f-4201-b6d1-a1cd0d224361" containerName="configure-os-edpm-deployment-openstack-edpm-ipam" Feb 26 16:17:56 crc kubenswrapper[4907]: I0226 16:17:56.737769 4907 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ssh-known-hosts-edpm-deployment-skfk6" Feb 26 16:17:56 crc kubenswrapper[4907]: I0226 16:17:56.739921 4907 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Feb 26 16:17:56 crc kubenswrapper[4907]: I0226 16:17:56.740077 4907 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Feb 26 16:17:56 crc kubenswrapper[4907]: I0226 16:17:56.743439 4907 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-57jxc" Feb 26 16:17:56 crc kubenswrapper[4907]: I0226 16:17:56.746521 4907 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Feb 26 16:17:56 crc kubenswrapper[4907]: I0226 16:17:56.757209 4907 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ssh-known-hosts-edpm-deployment-skfk6"] Feb 26 16:17:56 crc kubenswrapper[4907]: I0226 16:17:56.840016 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory-0\" (UniqueName: \"kubernetes.io/secret/4ba81fad-7677-4ea8-b338-09ef7f73f63b-inventory-0\") pod \"ssh-known-hosts-edpm-deployment-skfk6\" (UID: \"4ba81fad-7677-4ea8-b338-09ef7f73f63b\") " pod="openstack/ssh-known-hosts-edpm-deployment-skfk6" Feb 26 16:17:56 crc kubenswrapper[4907]: I0226 16:17:56.840069 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4g6fj\" (UniqueName: \"kubernetes.io/projected/4ba81fad-7677-4ea8-b338-09ef7f73f63b-kube-api-access-4g6fj\") pod \"ssh-known-hosts-edpm-deployment-skfk6\" (UID: \"4ba81fad-7677-4ea8-b338-09ef7f73f63b\") " pod="openstack/ssh-known-hosts-edpm-deployment-skfk6" Feb 26 16:17:56 crc kubenswrapper[4907]: I0226 16:17:56.840384 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/4ba81fad-7677-4ea8-b338-09ef7f73f63b-ssh-key-openstack-edpm-ipam\") pod \"ssh-known-hosts-edpm-deployment-skfk6\" (UID: \"4ba81fad-7677-4ea8-b338-09ef7f73f63b\") " pod="openstack/ssh-known-hosts-edpm-deployment-skfk6" Feb 26 16:17:56 crc kubenswrapper[4907]: I0226 16:17:56.942563 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/4ba81fad-7677-4ea8-b338-09ef7f73f63b-ssh-key-openstack-edpm-ipam\") pod \"ssh-known-hosts-edpm-deployment-skfk6\" (UID: \"4ba81fad-7677-4ea8-b338-09ef7f73f63b\") " pod="openstack/ssh-known-hosts-edpm-deployment-skfk6" Feb 26 16:17:56 crc kubenswrapper[4907]: I0226 16:17:56.942697 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory-0\" (UniqueName: \"kubernetes.io/secret/4ba81fad-7677-4ea8-b338-09ef7f73f63b-inventory-0\") pod \"ssh-known-hosts-edpm-deployment-skfk6\" (UID: \"4ba81fad-7677-4ea8-b338-09ef7f73f63b\") " pod="openstack/ssh-known-hosts-edpm-deployment-skfk6" Feb 26 16:17:56 crc kubenswrapper[4907]: I0226 16:17:56.942998 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4g6fj\" (UniqueName: \"kubernetes.io/projected/4ba81fad-7677-4ea8-b338-09ef7f73f63b-kube-api-access-4g6fj\") pod \"ssh-known-hosts-edpm-deployment-skfk6\" (UID: \"4ba81fad-7677-4ea8-b338-09ef7f73f63b\") " pod="openstack/ssh-known-hosts-edpm-deployment-skfk6" Feb 26 16:17:56 crc kubenswrapper[4907]: I0226 16:17:56.946356 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory-0\" (UniqueName: \"kubernetes.io/secret/4ba81fad-7677-4ea8-b338-09ef7f73f63b-inventory-0\") pod \"ssh-known-hosts-edpm-deployment-skfk6\" (UID: \"4ba81fad-7677-4ea8-b338-09ef7f73f63b\") " pod="openstack/ssh-known-hosts-edpm-deployment-skfk6" Feb 26 16:17:56 crc kubenswrapper[4907]: I0226 16:17:56.955605 4907 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/4ba81fad-7677-4ea8-b338-09ef7f73f63b-ssh-key-openstack-edpm-ipam\") pod \"ssh-known-hosts-edpm-deployment-skfk6\" (UID: \"4ba81fad-7677-4ea8-b338-09ef7f73f63b\") " pod="openstack/ssh-known-hosts-edpm-deployment-skfk6" Feb 26 16:17:56 crc kubenswrapper[4907]: I0226 16:17:56.958520 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4g6fj\" (UniqueName: \"kubernetes.io/projected/4ba81fad-7677-4ea8-b338-09ef7f73f63b-kube-api-access-4g6fj\") pod \"ssh-known-hosts-edpm-deployment-skfk6\" (UID: \"4ba81fad-7677-4ea8-b338-09ef7f73f63b\") " pod="openstack/ssh-known-hosts-edpm-deployment-skfk6" Feb 26 16:17:57 crc kubenswrapper[4907]: I0226 16:17:57.106296 4907 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ssh-known-hosts-edpm-deployment-skfk6" Feb 26 16:17:57 crc kubenswrapper[4907]: W0226 16:17:57.683655 4907 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod4ba81fad_7677_4ea8_b338_09ef7f73f63b.slice/crio-fdd2c0a23b4168849bf829c45a4d0531423023da2dc37af2592c15b690be0e01 WatchSource:0}: Error finding container fdd2c0a23b4168849bf829c45a4d0531423023da2dc37af2592c15b690be0e01: Status 404 returned error can't find the container with id fdd2c0a23b4168849bf829c45a4d0531423023da2dc37af2592c15b690be0e01 Feb 26 16:17:57 crc kubenswrapper[4907]: I0226 16:17:57.687866 4907 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ssh-known-hosts-edpm-deployment-skfk6"] Feb 26 16:17:58 crc kubenswrapper[4907]: I0226 16:17:58.676346 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ssh-known-hosts-edpm-deployment-skfk6" event={"ID":"4ba81fad-7677-4ea8-b338-09ef7f73f63b","Type":"ContainerStarted","Data":"13c2c84d1a18c231c7b58ec7068ebb9f56509304cfa9653183547c7854b06784"} 
Feb 26 16:17:58 crc kubenswrapper[4907]: I0226 16:17:58.676791 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ssh-known-hosts-edpm-deployment-skfk6" event={"ID":"4ba81fad-7677-4ea8-b338-09ef7f73f63b","Type":"ContainerStarted","Data":"fdd2c0a23b4168849bf829c45a4d0531423023da2dc37af2592c15b690be0e01"} Feb 26 16:17:58 crc kubenswrapper[4907]: I0226 16:17:58.697774 4907 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ssh-known-hosts-edpm-deployment-skfk6" podStartSLOduration=2.033089384 podStartE2EDuration="2.697747481s" podCreationTimestamp="2026-02-26 16:17:56 +0000 UTC" firstStartedPulling="2026-02-26 16:17:57.687042064 +0000 UTC m=+2140.205603913" lastFinishedPulling="2026-02-26 16:17:58.351700161 +0000 UTC m=+2140.870262010" observedRunningTime="2026-02-26 16:17:58.691649152 +0000 UTC m=+2141.210211011" watchObservedRunningTime="2026-02-26 16:17:58.697747481 +0000 UTC m=+2141.216309330" Feb 26 16:18:00 crc kubenswrapper[4907]: I0226 16:18:00.140218 4907 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29535378-pz6jl"] Feb 26 16:18:00 crc kubenswrapper[4907]: I0226 16:18:00.142558 4907 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29535378-pz6jl" Feb 26 16:18:00 crc kubenswrapper[4907]: I0226 16:18:00.148727 4907 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Feb 26 16:18:00 crc kubenswrapper[4907]: I0226 16:18:00.148768 4907 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-n2mrp" Feb 26 16:18:00 crc kubenswrapper[4907]: I0226 16:18:00.148949 4907 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Feb 26 16:18:00 crc kubenswrapper[4907]: I0226 16:18:00.151780 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5btj7\" (UniqueName: \"kubernetes.io/projected/8437f994-5cf4-40bf-b425-300e97b74aed-kube-api-access-5btj7\") pod \"auto-csr-approver-29535378-pz6jl\" (UID: \"8437f994-5cf4-40bf-b425-300e97b74aed\") " pod="openshift-infra/auto-csr-approver-29535378-pz6jl" Feb 26 16:18:00 crc kubenswrapper[4907]: I0226 16:18:00.152565 4907 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29535378-pz6jl"] Feb 26 16:18:00 crc kubenswrapper[4907]: I0226 16:18:00.253470 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5btj7\" (UniqueName: \"kubernetes.io/projected/8437f994-5cf4-40bf-b425-300e97b74aed-kube-api-access-5btj7\") pod \"auto-csr-approver-29535378-pz6jl\" (UID: \"8437f994-5cf4-40bf-b425-300e97b74aed\") " pod="openshift-infra/auto-csr-approver-29535378-pz6jl" Feb 26 16:18:00 crc kubenswrapper[4907]: I0226 16:18:00.273576 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5btj7\" (UniqueName: \"kubernetes.io/projected/8437f994-5cf4-40bf-b425-300e97b74aed-kube-api-access-5btj7\") pod \"auto-csr-approver-29535378-pz6jl\" (UID: \"8437f994-5cf4-40bf-b425-300e97b74aed\") " 
pod="openshift-infra/auto-csr-approver-29535378-pz6jl" Feb 26 16:18:00 crc kubenswrapper[4907]: I0226 16:18:00.459650 4907 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29535378-pz6jl" Feb 26 16:18:00 crc kubenswrapper[4907]: I0226 16:18:00.949868 4907 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29535378-pz6jl"] Feb 26 16:18:01 crc kubenswrapper[4907]: I0226 16:18:01.702369 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29535378-pz6jl" event={"ID":"8437f994-5cf4-40bf-b425-300e97b74aed","Type":"ContainerStarted","Data":"25b7a72818291973332c212dff0d7bd1b3a28ecc2f86126722c8d2d2206961f4"} Feb 26 16:18:02 crc kubenswrapper[4907]: I0226 16:18:02.711750 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29535378-pz6jl" event={"ID":"8437f994-5cf4-40bf-b425-300e97b74aed","Type":"ContainerStarted","Data":"b4c2aba71af8a10b65fdb2c26b42db8cb361dd18def93a088e174c6e581fe7bb"} Feb 26 16:18:02 crc kubenswrapper[4907]: I0226 16:18:02.728175 4907 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-infra/auto-csr-approver-29535378-pz6jl" podStartSLOduration=1.476582616 podStartE2EDuration="2.728148021s" podCreationTimestamp="2026-02-26 16:18:00 +0000 UTC" firstStartedPulling="2026-02-26 16:18:00.95622027 +0000 UTC m=+2143.474782119" lastFinishedPulling="2026-02-26 16:18:02.207785675 +0000 UTC m=+2144.726347524" observedRunningTime="2026-02-26 16:18:02.723619921 +0000 UTC m=+2145.242181760" watchObservedRunningTime="2026-02-26 16:18:02.728148021 +0000 UTC m=+2145.246709870" Feb 26 16:18:03 crc kubenswrapper[4907]: I0226 16:18:03.721140 4907 generic.go:334] "Generic (PLEG): container finished" podID="8437f994-5cf4-40bf-b425-300e97b74aed" containerID="b4c2aba71af8a10b65fdb2c26b42db8cb361dd18def93a088e174c6e581fe7bb" exitCode=0 Feb 26 16:18:03 crc 
kubenswrapper[4907]: I0226 16:18:03.721493 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29535378-pz6jl" event={"ID":"8437f994-5cf4-40bf-b425-300e97b74aed","Type":"ContainerDied","Data":"b4c2aba71af8a10b65fdb2c26b42db8cb361dd18def93a088e174c6e581fe7bb"} Feb 26 16:18:04 crc kubenswrapper[4907]: I0226 16:18:04.059975 4907 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-cell-mapping-bck8j"] Feb 26 16:18:04 crc kubenswrapper[4907]: I0226 16:18:04.076914 4907 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell1-cell-mapping-bck8j"] Feb 26 16:18:04 crc kubenswrapper[4907]: I0226 16:18:04.140313 4907 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f495d649-0f7a-4520-84ad-7703ad452593" path="/var/lib/kubelet/pods/f495d649-0f7a-4520-84ad-7703ad452593/volumes" Feb 26 16:18:05 crc kubenswrapper[4907]: I0226 16:18:05.110832 4907 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29535378-pz6jl" Feb 26 16:18:05 crc kubenswrapper[4907]: I0226 16:18:05.267277 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-5btj7\" (UniqueName: \"kubernetes.io/projected/8437f994-5cf4-40bf-b425-300e97b74aed-kube-api-access-5btj7\") pod \"8437f994-5cf4-40bf-b425-300e97b74aed\" (UID: \"8437f994-5cf4-40bf-b425-300e97b74aed\") " Feb 26 16:18:05 crc kubenswrapper[4907]: I0226 16:18:05.272824 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8437f994-5cf4-40bf-b425-300e97b74aed-kube-api-access-5btj7" (OuterVolumeSpecName: "kube-api-access-5btj7") pod "8437f994-5cf4-40bf-b425-300e97b74aed" (UID: "8437f994-5cf4-40bf-b425-300e97b74aed"). InnerVolumeSpecName "kube-api-access-5btj7". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 26 16:18:05 crc kubenswrapper[4907]: I0226 16:18:05.369893 4907 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-5btj7\" (UniqueName: \"kubernetes.io/projected/8437f994-5cf4-40bf-b425-300e97b74aed-kube-api-access-5btj7\") on node \"crc\" DevicePath \"\"" Feb 26 16:18:05 crc kubenswrapper[4907]: I0226 16:18:05.739318 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29535378-pz6jl" event={"ID":"8437f994-5cf4-40bf-b425-300e97b74aed","Type":"ContainerDied","Data":"25b7a72818291973332c212dff0d7bd1b3a28ecc2f86126722c8d2d2206961f4"} Feb 26 16:18:05 crc kubenswrapper[4907]: I0226 16:18:05.739934 4907 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="25b7a72818291973332c212dff0d7bd1b3a28ecc2f86126722c8d2d2206961f4" Feb 26 16:18:05 crc kubenswrapper[4907]: I0226 16:18:05.739330 4907 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29535378-pz6jl" Feb 26 16:18:05 crc kubenswrapper[4907]: I0226 16:18:05.740810 4907 generic.go:334] "Generic (PLEG): container finished" podID="4ba81fad-7677-4ea8-b338-09ef7f73f63b" containerID="13c2c84d1a18c231c7b58ec7068ebb9f56509304cfa9653183547c7854b06784" exitCode=0 Feb 26 16:18:05 crc kubenswrapper[4907]: I0226 16:18:05.740848 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ssh-known-hosts-edpm-deployment-skfk6" event={"ID":"4ba81fad-7677-4ea8-b338-09ef7f73f63b","Type":"ContainerDied","Data":"13c2c84d1a18c231c7b58ec7068ebb9f56509304cfa9653183547c7854b06784"} Feb 26 16:18:05 crc kubenswrapper[4907]: I0226 16:18:05.817487 4907 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29535372-742br"] Feb 26 16:18:05 crc kubenswrapper[4907]: I0226 16:18:05.825095 4907 kubelet.go:2431] "SyncLoop REMOVE" source="api" 
pods=["openshift-infra/auto-csr-approver-29535372-742br"] Feb 26 16:18:06 crc kubenswrapper[4907]: I0226 16:18:06.140195 4907 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6496c1fc-cf88-488c-bcf6-5bae57ca88bf" path="/var/lib/kubelet/pods/6496c1fc-cf88-488c-bcf6-5bae57ca88bf/volumes" Feb 26 16:18:07 crc kubenswrapper[4907]: I0226 16:18:07.165952 4907 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ssh-known-hosts-edpm-deployment-skfk6" Feb 26 16:18:07 crc kubenswrapper[4907]: I0226 16:18:07.307075 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/4ba81fad-7677-4ea8-b338-09ef7f73f63b-ssh-key-openstack-edpm-ipam\") pod \"4ba81fad-7677-4ea8-b338-09ef7f73f63b\" (UID: \"4ba81fad-7677-4ea8-b338-09ef7f73f63b\") " Feb 26 16:18:07 crc kubenswrapper[4907]: I0226 16:18:07.307135 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4g6fj\" (UniqueName: \"kubernetes.io/projected/4ba81fad-7677-4ea8-b338-09ef7f73f63b-kube-api-access-4g6fj\") pod \"4ba81fad-7677-4ea8-b338-09ef7f73f63b\" (UID: \"4ba81fad-7677-4ea8-b338-09ef7f73f63b\") " Feb 26 16:18:07 crc kubenswrapper[4907]: I0226 16:18:07.307253 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory-0\" (UniqueName: \"kubernetes.io/secret/4ba81fad-7677-4ea8-b338-09ef7f73f63b-inventory-0\") pod \"4ba81fad-7677-4ea8-b338-09ef7f73f63b\" (UID: \"4ba81fad-7677-4ea8-b338-09ef7f73f63b\") " Feb 26 16:18:07 crc kubenswrapper[4907]: I0226 16:18:07.313289 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4ba81fad-7677-4ea8-b338-09ef7f73f63b-kube-api-access-4g6fj" (OuterVolumeSpecName: "kube-api-access-4g6fj") pod "4ba81fad-7677-4ea8-b338-09ef7f73f63b" (UID: "4ba81fad-7677-4ea8-b338-09ef7f73f63b"). 
InnerVolumeSpecName "kube-api-access-4g6fj". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 26 16:18:07 crc kubenswrapper[4907]: I0226 16:18:07.333965 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4ba81fad-7677-4ea8-b338-09ef7f73f63b-inventory-0" (OuterVolumeSpecName: "inventory-0") pod "4ba81fad-7677-4ea8-b338-09ef7f73f63b" (UID: "4ba81fad-7677-4ea8-b338-09ef7f73f63b"). InnerVolumeSpecName "inventory-0". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 26 16:18:07 crc kubenswrapper[4907]: I0226 16:18:07.339365 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4ba81fad-7677-4ea8-b338-09ef7f73f63b-ssh-key-openstack-edpm-ipam" (OuterVolumeSpecName: "ssh-key-openstack-edpm-ipam") pod "4ba81fad-7677-4ea8-b338-09ef7f73f63b" (UID: "4ba81fad-7677-4ea8-b338-09ef7f73f63b"). InnerVolumeSpecName "ssh-key-openstack-edpm-ipam". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 26 16:18:07 crc kubenswrapper[4907]: I0226 16:18:07.409822 4907 reconciler_common.go:293] "Volume detached for volume \"inventory-0\" (UniqueName: \"kubernetes.io/secret/4ba81fad-7677-4ea8-b338-09ef7f73f63b-inventory-0\") on node \"crc\" DevicePath \"\"" Feb 26 16:18:07 crc kubenswrapper[4907]: I0226 16:18:07.409860 4907 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/4ba81fad-7677-4ea8-b338-09ef7f73f63b-ssh-key-openstack-edpm-ipam\") on node \"crc\" DevicePath \"\"" Feb 26 16:18:07 crc kubenswrapper[4907]: I0226 16:18:07.409870 4907 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4g6fj\" (UniqueName: \"kubernetes.io/projected/4ba81fad-7677-4ea8-b338-09ef7f73f63b-kube-api-access-4g6fj\") on node \"crc\" DevicePath \"\"" Feb 26 16:18:07 crc kubenswrapper[4907]: I0226 16:18:07.757982 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/ssh-known-hosts-edpm-deployment-skfk6" event={"ID":"4ba81fad-7677-4ea8-b338-09ef7f73f63b","Type":"ContainerDied","Data":"fdd2c0a23b4168849bf829c45a4d0531423023da2dc37af2592c15b690be0e01"} Feb 26 16:18:07 crc kubenswrapper[4907]: I0226 16:18:07.758026 4907 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="fdd2c0a23b4168849bf829c45a4d0531423023da2dc37af2592c15b690be0e01" Feb 26 16:18:07 crc kubenswrapper[4907]: I0226 16:18:07.758090 4907 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ssh-known-hosts-edpm-deployment-skfk6" Feb 26 16:18:07 crc kubenswrapper[4907]: I0226 16:18:07.844676 4907 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/run-os-edpm-deployment-openstack-edpm-ipam-8fgwz"] Feb 26 16:18:07 crc kubenswrapper[4907]: E0226 16:18:07.845256 4907 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4ba81fad-7677-4ea8-b338-09ef7f73f63b" containerName="ssh-known-hosts-edpm-deployment" Feb 26 16:18:07 crc kubenswrapper[4907]: I0226 16:18:07.845270 4907 state_mem.go:107] "Deleted CPUSet assignment" podUID="4ba81fad-7677-4ea8-b338-09ef7f73f63b" containerName="ssh-known-hosts-edpm-deployment" Feb 26 16:18:07 crc kubenswrapper[4907]: E0226 16:18:07.845289 4907 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8437f994-5cf4-40bf-b425-300e97b74aed" containerName="oc" Feb 26 16:18:07 crc kubenswrapper[4907]: I0226 16:18:07.845295 4907 state_mem.go:107] "Deleted CPUSet assignment" podUID="8437f994-5cf4-40bf-b425-300e97b74aed" containerName="oc" Feb 26 16:18:07 crc kubenswrapper[4907]: I0226 16:18:07.845464 4907 memory_manager.go:354] "RemoveStaleState removing state" podUID="4ba81fad-7677-4ea8-b338-09ef7f73f63b" containerName="ssh-known-hosts-edpm-deployment" Feb 26 16:18:07 crc kubenswrapper[4907]: I0226 16:18:07.845484 4907 memory_manager.go:354] "RemoveStaleState removing state" podUID="8437f994-5cf4-40bf-b425-300e97b74aed" 
containerName="oc" Feb 26 16:18:07 crc kubenswrapper[4907]: I0226 16:18:07.846055 4907 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-8fgwz" Feb 26 16:18:07 crc kubenswrapper[4907]: I0226 16:18:07.848212 4907 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Feb 26 16:18:07 crc kubenswrapper[4907]: I0226 16:18:07.848944 4907 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Feb 26 16:18:07 crc kubenswrapper[4907]: I0226 16:18:07.849116 4907 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Feb 26 16:18:07 crc kubenswrapper[4907]: I0226 16:18:07.849266 4907 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-57jxc" Feb 26 16:18:07 crc kubenswrapper[4907]: I0226 16:18:07.858175 4907 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/run-os-edpm-deployment-openstack-edpm-ipam-8fgwz"] Feb 26 16:18:08 crc kubenswrapper[4907]: I0226 16:18:08.020285 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/bc0bd13e-cad0-4a21-856b-aaf97d65cec2-inventory\") pod \"run-os-edpm-deployment-openstack-edpm-ipam-8fgwz\" (UID: \"bc0bd13e-cad0-4a21-856b-aaf97d65cec2\") " pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-8fgwz" Feb 26 16:18:08 crc kubenswrapper[4907]: I0226 16:18:08.020762 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/bc0bd13e-cad0-4a21-856b-aaf97d65cec2-ssh-key-openstack-edpm-ipam\") pod \"run-os-edpm-deployment-openstack-edpm-ipam-8fgwz\" (UID: \"bc0bd13e-cad0-4a21-856b-aaf97d65cec2\") " 
pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-8fgwz" Feb 26 16:18:08 crc kubenswrapper[4907]: I0226 16:18:08.021041 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gshr5\" (UniqueName: \"kubernetes.io/projected/bc0bd13e-cad0-4a21-856b-aaf97d65cec2-kube-api-access-gshr5\") pod \"run-os-edpm-deployment-openstack-edpm-ipam-8fgwz\" (UID: \"bc0bd13e-cad0-4a21-856b-aaf97d65cec2\") " pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-8fgwz" Feb 26 16:18:08 crc kubenswrapper[4907]: I0226 16:18:08.123002 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/bc0bd13e-cad0-4a21-856b-aaf97d65cec2-ssh-key-openstack-edpm-ipam\") pod \"run-os-edpm-deployment-openstack-edpm-ipam-8fgwz\" (UID: \"bc0bd13e-cad0-4a21-856b-aaf97d65cec2\") " pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-8fgwz" Feb 26 16:18:08 crc kubenswrapper[4907]: I0226 16:18:08.123061 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gshr5\" (UniqueName: \"kubernetes.io/projected/bc0bd13e-cad0-4a21-856b-aaf97d65cec2-kube-api-access-gshr5\") pod \"run-os-edpm-deployment-openstack-edpm-ipam-8fgwz\" (UID: \"bc0bd13e-cad0-4a21-856b-aaf97d65cec2\") " pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-8fgwz" Feb 26 16:18:08 crc kubenswrapper[4907]: I0226 16:18:08.123110 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/bc0bd13e-cad0-4a21-856b-aaf97d65cec2-inventory\") pod \"run-os-edpm-deployment-openstack-edpm-ipam-8fgwz\" (UID: \"bc0bd13e-cad0-4a21-856b-aaf97d65cec2\") " pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-8fgwz" Feb 26 16:18:08 crc kubenswrapper[4907]: I0226 16:18:08.135520 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" 
(UniqueName: \"kubernetes.io/secret/bc0bd13e-cad0-4a21-856b-aaf97d65cec2-inventory\") pod \"run-os-edpm-deployment-openstack-edpm-ipam-8fgwz\" (UID: \"bc0bd13e-cad0-4a21-856b-aaf97d65cec2\") " pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-8fgwz" Feb 26 16:18:08 crc kubenswrapper[4907]: I0226 16:18:08.138432 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/bc0bd13e-cad0-4a21-856b-aaf97d65cec2-ssh-key-openstack-edpm-ipam\") pod \"run-os-edpm-deployment-openstack-edpm-ipam-8fgwz\" (UID: \"bc0bd13e-cad0-4a21-856b-aaf97d65cec2\") " pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-8fgwz" Feb 26 16:18:08 crc kubenswrapper[4907]: I0226 16:18:08.141336 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gshr5\" (UniqueName: \"kubernetes.io/projected/bc0bd13e-cad0-4a21-856b-aaf97d65cec2-kube-api-access-gshr5\") pod \"run-os-edpm-deployment-openstack-edpm-ipam-8fgwz\" (UID: \"bc0bd13e-cad0-4a21-856b-aaf97d65cec2\") " pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-8fgwz" Feb 26 16:18:08 crc kubenswrapper[4907]: I0226 16:18:08.168554 4907 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-8fgwz" Feb 26 16:18:08 crc kubenswrapper[4907]: I0226 16:18:08.708025 4907 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/run-os-edpm-deployment-openstack-edpm-ipam-8fgwz"] Feb 26 16:18:08 crc kubenswrapper[4907]: I0226 16:18:08.765948 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-8fgwz" event={"ID":"bc0bd13e-cad0-4a21-856b-aaf97d65cec2","Type":"ContainerStarted","Data":"49608779ad8647e1b21a0eddb7c723830b8a4d60fc951cd56f4f49e1f75c89db"} Feb 26 16:18:10 crc kubenswrapper[4907]: I0226 16:18:10.441971 4907 scope.go:117] "RemoveContainer" containerID="c3b1f7c4476fdadc4251d9974969d30ed70f9f124e466d21df11ac454b81ccc0" Feb 26 16:18:10 crc kubenswrapper[4907]: I0226 16:18:10.490604 4907 scope.go:117] "RemoveContainer" containerID="14b832f80d0c811e1ff07311ab483873a14459b4798c7d0d6f09f8e79617e33d" Feb 26 16:18:10 crc kubenswrapper[4907]: I0226 16:18:10.553381 4907 scope.go:117] "RemoveContainer" containerID="4bc4e33d37c8a0f3832c0c5a604e6f722245320712bc4bac5d6a6004557b6be8" Feb 26 16:18:10 crc kubenswrapper[4907]: I0226 16:18:10.613247 4907 scope.go:117] "RemoveContainer" containerID="fe36a4a7da824268ac681e7da109253b6e256198fd7296219b2d6cc025d3f6ab" Feb 26 16:18:10 crc kubenswrapper[4907]: I0226 16:18:10.781335 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-8fgwz" event={"ID":"bc0bd13e-cad0-4a21-856b-aaf97d65cec2","Type":"ContainerStarted","Data":"e85dbaefdea65907904a6cd28579f8c8f04d13f9135f4fe8c3c2b78f5eae5332"} Feb 26 16:18:10 crc kubenswrapper[4907]: I0226 16:18:10.825941 4907 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-8fgwz" podStartSLOduration=2.61490861 podStartE2EDuration="3.825920411s" podCreationTimestamp="2026-02-26 16:18:07 +0000 UTC" 
firstStartedPulling="2026-02-26 16:18:08.708463122 +0000 UTC m=+2151.227024971" lastFinishedPulling="2026-02-26 16:18:09.919474923 +0000 UTC m=+2152.438036772" observedRunningTime="2026-02-26 16:18:10.800927289 +0000 UTC m=+2153.319489138" watchObservedRunningTime="2026-02-26 16:18:10.825920411 +0000 UTC m=+2153.344482260" Feb 26 16:18:21 crc kubenswrapper[4907]: I0226 16:18:21.918720 4907 generic.go:334] "Generic (PLEG): container finished" podID="bc0bd13e-cad0-4a21-856b-aaf97d65cec2" containerID="e85dbaefdea65907904a6cd28579f8c8f04d13f9135f4fe8c3c2b78f5eae5332" exitCode=0 Feb 26 16:18:21 crc kubenswrapper[4907]: I0226 16:18:21.918775 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-8fgwz" event={"ID":"bc0bd13e-cad0-4a21-856b-aaf97d65cec2","Type":"ContainerDied","Data":"e85dbaefdea65907904a6cd28579f8c8f04d13f9135f4fe8c3c2b78f5eae5332"} Feb 26 16:18:23 crc kubenswrapper[4907]: I0226 16:18:23.535987 4907 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-8fgwz" Feb 26 16:18:23 crc kubenswrapper[4907]: I0226 16:18:23.725836 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-gshr5\" (UniqueName: \"kubernetes.io/projected/bc0bd13e-cad0-4a21-856b-aaf97d65cec2-kube-api-access-gshr5\") pod \"bc0bd13e-cad0-4a21-856b-aaf97d65cec2\" (UID: \"bc0bd13e-cad0-4a21-856b-aaf97d65cec2\") " Feb 26 16:18:23 crc kubenswrapper[4907]: I0226 16:18:23.726092 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/bc0bd13e-cad0-4a21-856b-aaf97d65cec2-inventory\") pod \"bc0bd13e-cad0-4a21-856b-aaf97d65cec2\" (UID: \"bc0bd13e-cad0-4a21-856b-aaf97d65cec2\") " Feb 26 16:18:23 crc kubenswrapper[4907]: I0226 16:18:23.726140 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/bc0bd13e-cad0-4a21-856b-aaf97d65cec2-ssh-key-openstack-edpm-ipam\") pod \"bc0bd13e-cad0-4a21-856b-aaf97d65cec2\" (UID: \"bc0bd13e-cad0-4a21-856b-aaf97d65cec2\") " Feb 26 16:18:23 crc kubenswrapper[4907]: I0226 16:18:23.731458 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bc0bd13e-cad0-4a21-856b-aaf97d65cec2-kube-api-access-gshr5" (OuterVolumeSpecName: "kube-api-access-gshr5") pod "bc0bd13e-cad0-4a21-856b-aaf97d65cec2" (UID: "bc0bd13e-cad0-4a21-856b-aaf97d65cec2"). InnerVolumeSpecName "kube-api-access-gshr5". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 26 16:18:23 crc kubenswrapper[4907]: I0226 16:18:23.755360 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bc0bd13e-cad0-4a21-856b-aaf97d65cec2-inventory" (OuterVolumeSpecName: "inventory") pod "bc0bd13e-cad0-4a21-856b-aaf97d65cec2" (UID: "bc0bd13e-cad0-4a21-856b-aaf97d65cec2"). 
InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 26 16:18:23 crc kubenswrapper[4907]: I0226 16:18:23.762869 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bc0bd13e-cad0-4a21-856b-aaf97d65cec2-ssh-key-openstack-edpm-ipam" (OuterVolumeSpecName: "ssh-key-openstack-edpm-ipam") pod "bc0bd13e-cad0-4a21-856b-aaf97d65cec2" (UID: "bc0bd13e-cad0-4a21-856b-aaf97d65cec2"). InnerVolumeSpecName "ssh-key-openstack-edpm-ipam". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 26 16:18:23 crc kubenswrapper[4907]: I0226 16:18:23.828820 4907 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/bc0bd13e-cad0-4a21-856b-aaf97d65cec2-inventory\") on node \"crc\" DevicePath \"\"" Feb 26 16:18:23 crc kubenswrapper[4907]: I0226 16:18:23.828902 4907 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/bc0bd13e-cad0-4a21-856b-aaf97d65cec2-ssh-key-openstack-edpm-ipam\") on node \"crc\" DevicePath \"\"" Feb 26 16:18:23 crc kubenswrapper[4907]: I0226 16:18:23.828915 4907 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-gshr5\" (UniqueName: \"kubernetes.io/projected/bc0bd13e-cad0-4a21-856b-aaf97d65cec2-kube-api-access-gshr5\") on node \"crc\" DevicePath \"\"" Feb 26 16:18:23 crc kubenswrapper[4907]: I0226 16:18:23.938649 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-8fgwz" event={"ID":"bc0bd13e-cad0-4a21-856b-aaf97d65cec2","Type":"ContainerDied","Data":"49608779ad8647e1b21a0eddb7c723830b8a4d60fc951cd56f4f49e1f75c89db"} Feb 26 16:18:23 crc kubenswrapper[4907]: I0226 16:18:23.938889 4907 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="49608779ad8647e1b21a0eddb7c723830b8a4d60fc951cd56f4f49e1f75c89db" Feb 26 16:18:23 crc kubenswrapper[4907]: I0226 
16:18:23.939066 4907 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-8fgwz" Feb 26 16:18:24 crc kubenswrapper[4907]: I0226 16:18:24.040811 4907 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-8qvs2"] Feb 26 16:18:24 crc kubenswrapper[4907]: E0226 16:18:24.041281 4907 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="bc0bd13e-cad0-4a21-856b-aaf97d65cec2" containerName="run-os-edpm-deployment-openstack-edpm-ipam" Feb 26 16:18:24 crc kubenswrapper[4907]: I0226 16:18:24.041307 4907 state_mem.go:107] "Deleted CPUSet assignment" podUID="bc0bd13e-cad0-4a21-856b-aaf97d65cec2" containerName="run-os-edpm-deployment-openstack-edpm-ipam" Feb 26 16:18:24 crc kubenswrapper[4907]: I0226 16:18:24.041549 4907 memory_manager.go:354] "RemoveStaleState removing state" podUID="bc0bd13e-cad0-4a21-856b-aaf97d65cec2" containerName="run-os-edpm-deployment-openstack-edpm-ipam" Feb 26 16:18:24 crc kubenswrapper[4907]: I0226 16:18:24.042281 4907 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-8qvs2" Feb 26 16:18:24 crc kubenswrapper[4907]: I0226 16:18:24.044020 4907 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Feb 26 16:18:24 crc kubenswrapper[4907]: I0226 16:18:24.044022 4907 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Feb 26 16:18:24 crc kubenswrapper[4907]: I0226 16:18:24.044556 4907 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Feb 26 16:18:24 crc kubenswrapper[4907]: I0226 16:18:24.044788 4907 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-57jxc" Feb 26 16:18:24 crc kubenswrapper[4907]: I0226 16:18:24.059526 4907 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-8qvs2"] Feb 26 16:18:24 crc kubenswrapper[4907]: I0226 16:18:24.235356 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/348e7351-416b-4791-b202-46ce193e0c6e-inventory\") pod \"reboot-os-edpm-deployment-openstack-edpm-ipam-8qvs2\" (UID: \"348e7351-416b-4791-b202-46ce193e0c6e\") " pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-8qvs2" Feb 26 16:18:24 crc kubenswrapper[4907]: I0226 16:18:24.235786 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vfsvd\" (UniqueName: \"kubernetes.io/projected/348e7351-416b-4791-b202-46ce193e0c6e-kube-api-access-vfsvd\") pod \"reboot-os-edpm-deployment-openstack-edpm-ipam-8qvs2\" (UID: \"348e7351-416b-4791-b202-46ce193e0c6e\") " pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-8qvs2" Feb 26 16:18:24 crc kubenswrapper[4907]: I0226 16:18:24.236106 4907 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/348e7351-416b-4791-b202-46ce193e0c6e-ssh-key-openstack-edpm-ipam\") pod \"reboot-os-edpm-deployment-openstack-edpm-ipam-8qvs2\" (UID: \"348e7351-416b-4791-b202-46ce193e0c6e\") " pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-8qvs2" Feb 26 16:18:24 crc kubenswrapper[4907]: I0226 16:18:24.337993 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/348e7351-416b-4791-b202-46ce193e0c6e-ssh-key-openstack-edpm-ipam\") pod \"reboot-os-edpm-deployment-openstack-edpm-ipam-8qvs2\" (UID: \"348e7351-416b-4791-b202-46ce193e0c6e\") " pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-8qvs2" Feb 26 16:18:24 crc kubenswrapper[4907]: I0226 16:18:24.338121 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/348e7351-416b-4791-b202-46ce193e0c6e-inventory\") pod \"reboot-os-edpm-deployment-openstack-edpm-ipam-8qvs2\" (UID: \"348e7351-416b-4791-b202-46ce193e0c6e\") " pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-8qvs2" Feb 26 16:18:24 crc kubenswrapper[4907]: I0226 16:18:24.338170 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vfsvd\" (UniqueName: \"kubernetes.io/projected/348e7351-416b-4791-b202-46ce193e0c6e-kube-api-access-vfsvd\") pod \"reboot-os-edpm-deployment-openstack-edpm-ipam-8qvs2\" (UID: \"348e7351-416b-4791-b202-46ce193e0c6e\") " pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-8qvs2" Feb 26 16:18:24 crc kubenswrapper[4907]: I0226 16:18:24.348490 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/348e7351-416b-4791-b202-46ce193e0c6e-inventory\") pod 
\"reboot-os-edpm-deployment-openstack-edpm-ipam-8qvs2\" (UID: \"348e7351-416b-4791-b202-46ce193e0c6e\") " pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-8qvs2" Feb 26 16:18:24 crc kubenswrapper[4907]: I0226 16:18:24.348508 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/348e7351-416b-4791-b202-46ce193e0c6e-ssh-key-openstack-edpm-ipam\") pod \"reboot-os-edpm-deployment-openstack-edpm-ipam-8qvs2\" (UID: \"348e7351-416b-4791-b202-46ce193e0c6e\") " pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-8qvs2" Feb 26 16:18:24 crc kubenswrapper[4907]: I0226 16:18:24.358556 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vfsvd\" (UniqueName: \"kubernetes.io/projected/348e7351-416b-4791-b202-46ce193e0c6e-kube-api-access-vfsvd\") pod \"reboot-os-edpm-deployment-openstack-edpm-ipam-8qvs2\" (UID: \"348e7351-416b-4791-b202-46ce193e0c6e\") " pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-8qvs2" Feb 26 16:18:24 crc kubenswrapper[4907]: I0226 16:18:24.362574 4907 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-8qvs2" Feb 26 16:18:24 crc kubenswrapper[4907]: I0226 16:18:24.907413 4907 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-8qvs2"] Feb 26 16:18:24 crc kubenswrapper[4907]: I0226 16:18:24.920006 4907 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Feb 26 16:18:24 crc kubenswrapper[4907]: I0226 16:18:24.948827 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-8qvs2" event={"ID":"348e7351-416b-4791-b202-46ce193e0c6e","Type":"ContainerStarted","Data":"3816574dd8fe13f7b0c526a124b3d43d9fd20b8a9f1fc7d72e86bdbea8857195"} Feb 26 16:18:25 crc kubenswrapper[4907]: I0226 16:18:25.957699 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-8qvs2" event={"ID":"348e7351-416b-4791-b202-46ce193e0c6e","Type":"ContainerStarted","Data":"aa24c8b9ee8c8b7d4080cec44089e0f5fd7f001cc4da702f7f7380a92e27295e"} Feb 26 16:18:25 crc kubenswrapper[4907]: I0226 16:18:25.977043 4907 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-8qvs2" podStartSLOduration=1.487149299 podStartE2EDuration="1.977025108s" podCreationTimestamp="2026-02-26 16:18:24 +0000 UTC" firstStartedPulling="2026-02-26 16:18:24.919764189 +0000 UTC m=+2167.438326038" lastFinishedPulling="2026-02-26 16:18:25.409639988 +0000 UTC m=+2167.928201847" observedRunningTime="2026-02-26 16:18:25.975103481 +0000 UTC m=+2168.493665340" watchObservedRunningTime="2026-02-26 16:18:25.977025108 +0000 UTC m=+2168.495586977" Feb 26 16:18:35 crc kubenswrapper[4907]: I0226 16:18:35.032712 4907 generic.go:334] "Generic (PLEG): container finished" podID="348e7351-416b-4791-b202-46ce193e0c6e" 
containerID="aa24c8b9ee8c8b7d4080cec44089e0f5fd7f001cc4da702f7f7380a92e27295e" exitCode=0 Feb 26 16:18:35 crc kubenswrapper[4907]: I0226 16:18:35.032808 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-8qvs2" event={"ID":"348e7351-416b-4791-b202-46ce193e0c6e","Type":"ContainerDied","Data":"aa24c8b9ee8c8b7d4080cec44089e0f5fd7f001cc4da702f7f7380a92e27295e"} Feb 26 16:18:36 crc kubenswrapper[4907]: I0226 16:18:36.486307 4907 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-8qvs2" Feb 26 16:18:36 crc kubenswrapper[4907]: I0226 16:18:36.550450 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vfsvd\" (UniqueName: \"kubernetes.io/projected/348e7351-416b-4791-b202-46ce193e0c6e-kube-api-access-vfsvd\") pod \"348e7351-416b-4791-b202-46ce193e0c6e\" (UID: \"348e7351-416b-4791-b202-46ce193e0c6e\") " Feb 26 16:18:36 crc kubenswrapper[4907]: I0226 16:18:36.550575 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/348e7351-416b-4791-b202-46ce193e0c6e-inventory\") pod \"348e7351-416b-4791-b202-46ce193e0c6e\" (UID: \"348e7351-416b-4791-b202-46ce193e0c6e\") " Feb 26 16:18:36 crc kubenswrapper[4907]: I0226 16:18:36.550669 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/348e7351-416b-4791-b202-46ce193e0c6e-ssh-key-openstack-edpm-ipam\") pod \"348e7351-416b-4791-b202-46ce193e0c6e\" (UID: \"348e7351-416b-4791-b202-46ce193e0c6e\") " Feb 26 16:18:36 crc kubenswrapper[4907]: I0226 16:18:36.556914 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/348e7351-416b-4791-b202-46ce193e0c6e-kube-api-access-vfsvd" (OuterVolumeSpecName: "kube-api-access-vfsvd") pod 
"348e7351-416b-4791-b202-46ce193e0c6e" (UID: "348e7351-416b-4791-b202-46ce193e0c6e"). InnerVolumeSpecName "kube-api-access-vfsvd". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 26 16:18:36 crc kubenswrapper[4907]: I0226 16:18:36.586961 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/348e7351-416b-4791-b202-46ce193e0c6e-ssh-key-openstack-edpm-ipam" (OuterVolumeSpecName: "ssh-key-openstack-edpm-ipam") pod "348e7351-416b-4791-b202-46ce193e0c6e" (UID: "348e7351-416b-4791-b202-46ce193e0c6e"). InnerVolumeSpecName "ssh-key-openstack-edpm-ipam". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 26 16:18:36 crc kubenswrapper[4907]: I0226 16:18:36.590310 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/348e7351-416b-4791-b202-46ce193e0c6e-inventory" (OuterVolumeSpecName: "inventory") pod "348e7351-416b-4791-b202-46ce193e0c6e" (UID: "348e7351-416b-4791-b202-46ce193e0c6e"). InnerVolumeSpecName "inventory". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 26 16:18:36 crc kubenswrapper[4907]: I0226 16:18:36.667793 4907 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vfsvd\" (UniqueName: \"kubernetes.io/projected/348e7351-416b-4791-b202-46ce193e0c6e-kube-api-access-vfsvd\") on node \"crc\" DevicePath \"\"" Feb 26 16:18:36 crc kubenswrapper[4907]: I0226 16:18:36.668810 4907 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/348e7351-416b-4791-b202-46ce193e0c6e-inventory\") on node \"crc\" DevicePath \"\"" Feb 26 16:18:36 crc kubenswrapper[4907]: I0226 16:18:36.668878 4907 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/348e7351-416b-4791-b202-46ce193e0c6e-ssh-key-openstack-edpm-ipam\") on node \"crc\" DevicePath \"\"" Feb 26 16:18:37 crc kubenswrapper[4907]: I0226 16:18:37.051981 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-8qvs2" event={"ID":"348e7351-416b-4791-b202-46ce193e0c6e","Type":"ContainerDied","Data":"3816574dd8fe13f7b0c526a124b3d43d9fd20b8a9f1fc7d72e86bdbea8857195"} Feb 26 16:18:37 crc kubenswrapper[4907]: I0226 16:18:37.052320 4907 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="3816574dd8fe13f7b0c526a124b3d43d9fd20b8a9f1fc7d72e86bdbea8857195" Feb 26 16:18:37 crc kubenswrapper[4907]: I0226 16:18:37.052066 4907 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-8qvs2" Feb 26 16:18:37 crc kubenswrapper[4907]: I0226 16:18:37.583293 4907 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/install-certs-edpm-deployment-openstack-edpm-ipam-qrpg9"] Feb 26 16:18:37 crc kubenswrapper[4907]: E0226 16:18:37.583703 4907 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="348e7351-416b-4791-b202-46ce193e0c6e" containerName="reboot-os-edpm-deployment-openstack-edpm-ipam" Feb 26 16:18:37 crc kubenswrapper[4907]: I0226 16:18:37.583717 4907 state_mem.go:107] "Deleted CPUSet assignment" podUID="348e7351-416b-4791-b202-46ce193e0c6e" containerName="reboot-os-edpm-deployment-openstack-edpm-ipam" Feb 26 16:18:37 crc kubenswrapper[4907]: I0226 16:18:37.583936 4907 memory_manager.go:354] "RemoveStaleState removing state" podUID="348e7351-416b-4791-b202-46ce193e0c6e" containerName="reboot-os-edpm-deployment-openstack-edpm-ipam" Feb 26 16:18:37 crc kubenswrapper[4907]: I0226 16:18:37.584605 4907 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-qrpg9" Feb 26 16:18:37 crc kubenswrapper[4907]: I0226 16:18:37.588251 4907 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-57jxc" Feb 26 16:18:37 crc kubenswrapper[4907]: I0226 16:18:37.588249 4907 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Feb 26 16:18:37 crc kubenswrapper[4907]: I0226 16:18:37.588270 4907 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-telemetry-default-certs-0" Feb 26 16:18:37 crc kubenswrapper[4907]: I0226 16:18:37.589363 4907 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Feb 26 16:18:37 crc kubenswrapper[4907]: I0226 16:18:37.591148 4907 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-ovn-default-certs-0" Feb 26 16:18:37 crc kubenswrapper[4907]: I0226 16:18:37.591186 4907 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Feb 26 16:18:37 crc kubenswrapper[4907]: I0226 16:18:37.591354 4907 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-neutron-metadata-default-certs-0" Feb 26 16:18:37 crc kubenswrapper[4907]: I0226 16:18:37.591889 4907 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-libvirt-default-certs-0" Feb 26 16:18:37 crc kubenswrapper[4907]: I0226 16:18:37.618921 4907 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/install-certs-edpm-deployment-openstack-edpm-ipam-qrpg9"] Feb 26 16:18:37 crc kubenswrapper[4907]: I0226 16:18:37.684238 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/be614198-ac98-4ed9-926b-c1a2aa9789c5-nova-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-qrpg9\" (UID: \"be614198-ac98-4ed9-926b-c1a2aa9789c5\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-qrpg9" Feb 26 16:18:37 crc kubenswrapper[4907]: I0226 16:18:37.684297 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xpnff\" (UniqueName: \"kubernetes.io/projected/be614198-ac98-4ed9-926b-c1a2aa9789c5-kube-api-access-xpnff\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-qrpg9\" (UID: \"be614198-ac98-4ed9-926b-c1a2aa9789c5\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-qrpg9" Feb 26 16:18:37 crc kubenswrapper[4907]: I0226 16:18:37.684322 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"telemetry-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/be614198-ac98-4ed9-926b-c1a2aa9789c5-telemetry-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-qrpg9\" (UID: \"be614198-ac98-4ed9-926b-c1a2aa9789c5\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-qrpg9" Feb 26 16:18:37 crc kubenswrapper[4907]: I0226 16:18:37.684348 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/be614198-ac98-4ed9-926b-c1a2aa9789c5-neutron-metadata-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-qrpg9\" (UID: \"be614198-ac98-4ed9-926b-c1a2aa9789c5\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-qrpg9" Feb 26 16:18:37 crc kubenswrapper[4907]: I0226 16:18:37.684369 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"libvirt-combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/be614198-ac98-4ed9-926b-c1a2aa9789c5-libvirt-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-qrpg9\" (UID: \"be614198-ac98-4ed9-926b-c1a2aa9789c5\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-qrpg9" Feb 26 16:18:37 crc kubenswrapper[4907]: I0226 16:18:37.684482 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-edpm-ipam-ovn-default-certs-0\" (UniqueName: \"kubernetes.io/projected/be614198-ac98-4ed9-926b-c1a2aa9789c5-openstack-edpm-ipam-ovn-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-qrpg9\" (UID: \"be614198-ac98-4ed9-926b-c1a2aa9789c5\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-qrpg9" Feb 26 16:18:37 crc kubenswrapper[4907]: I0226 16:18:37.684574 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/be614198-ac98-4ed9-926b-c1a2aa9789c5-ssh-key-openstack-edpm-ipam\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-qrpg9\" (UID: \"be614198-ac98-4ed9-926b-c1a2aa9789c5\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-qrpg9" Feb 26 16:18:37 crc kubenswrapper[4907]: I0226 16:18:37.684645 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/be614198-ac98-4ed9-926b-c1a2aa9789c5-ovn-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-qrpg9\" (UID: \"be614198-ac98-4ed9-926b-c1a2aa9789c5\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-qrpg9" Feb 26 16:18:37 crc kubenswrapper[4907]: I0226 16:18:37.684692 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-edpm-ipam-neutron-metadata-default-certs-0\" (UniqueName: 
\"kubernetes.io/projected/be614198-ac98-4ed9-926b-c1a2aa9789c5-openstack-edpm-ipam-neutron-metadata-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-qrpg9\" (UID: \"be614198-ac98-4ed9-926b-c1a2aa9789c5\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-qrpg9" Feb 26 16:18:37 crc kubenswrapper[4907]: I0226 16:18:37.684719 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/be614198-ac98-4ed9-926b-c1a2aa9789c5-bootstrap-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-qrpg9\" (UID: \"be614198-ac98-4ed9-926b-c1a2aa9789c5\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-qrpg9" Feb 26 16:18:37 crc kubenswrapper[4907]: I0226 16:18:37.684756 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-edpm-ipam-telemetry-default-certs-0\" (UniqueName: \"kubernetes.io/projected/be614198-ac98-4ed9-926b-c1a2aa9789c5-openstack-edpm-ipam-telemetry-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-qrpg9\" (UID: \"be614198-ac98-4ed9-926b-c1a2aa9789c5\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-qrpg9" Feb 26 16:18:37 crc kubenswrapper[4907]: I0226 16:18:37.684791 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/be614198-ac98-4ed9-926b-c1a2aa9789c5-inventory\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-qrpg9\" (UID: \"be614198-ac98-4ed9-926b-c1a2aa9789c5\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-qrpg9" Feb 26 16:18:37 crc kubenswrapper[4907]: I0226 16:18:37.684837 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-edpm-ipam-libvirt-default-certs-0\" (UniqueName: 
\"kubernetes.io/projected/be614198-ac98-4ed9-926b-c1a2aa9789c5-openstack-edpm-ipam-libvirt-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-qrpg9\" (UID: \"be614198-ac98-4ed9-926b-c1a2aa9789c5\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-qrpg9" Feb 26 16:18:37 crc kubenswrapper[4907]: I0226 16:18:37.684862 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"repo-setup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/be614198-ac98-4ed9-926b-c1a2aa9789c5-repo-setup-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-qrpg9\" (UID: \"be614198-ac98-4ed9-926b-c1a2aa9789c5\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-qrpg9" Feb 26 16:18:37 crc kubenswrapper[4907]: I0226 16:18:37.786209 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-edpm-ipam-ovn-default-certs-0\" (UniqueName: \"kubernetes.io/projected/be614198-ac98-4ed9-926b-c1a2aa9789c5-openstack-edpm-ipam-ovn-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-qrpg9\" (UID: \"be614198-ac98-4ed9-926b-c1a2aa9789c5\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-qrpg9" Feb 26 16:18:37 crc kubenswrapper[4907]: I0226 16:18:37.786284 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/be614198-ac98-4ed9-926b-c1a2aa9789c5-ssh-key-openstack-edpm-ipam\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-qrpg9\" (UID: \"be614198-ac98-4ed9-926b-c1a2aa9789c5\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-qrpg9" Feb 26 16:18:37 crc kubenswrapper[4907]: I0226 16:18:37.786316 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/be614198-ac98-4ed9-926b-c1a2aa9789c5-ovn-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-qrpg9\" (UID: \"be614198-ac98-4ed9-926b-c1a2aa9789c5\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-qrpg9" Feb 26 16:18:37 crc kubenswrapper[4907]: I0226 16:18:37.786348 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-edpm-ipam-neutron-metadata-default-certs-0\" (UniqueName: \"kubernetes.io/projected/be614198-ac98-4ed9-926b-c1a2aa9789c5-openstack-edpm-ipam-neutron-metadata-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-qrpg9\" (UID: \"be614198-ac98-4ed9-926b-c1a2aa9789c5\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-qrpg9" Feb 26 16:18:37 crc kubenswrapper[4907]: I0226 16:18:37.786374 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/be614198-ac98-4ed9-926b-c1a2aa9789c5-bootstrap-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-qrpg9\" (UID: \"be614198-ac98-4ed9-926b-c1a2aa9789c5\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-qrpg9" Feb 26 16:18:37 crc kubenswrapper[4907]: I0226 16:18:37.786395 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-edpm-ipam-telemetry-default-certs-0\" (UniqueName: \"kubernetes.io/projected/be614198-ac98-4ed9-926b-c1a2aa9789c5-openstack-edpm-ipam-telemetry-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-qrpg9\" (UID: \"be614198-ac98-4ed9-926b-c1a2aa9789c5\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-qrpg9" Feb 26 16:18:37 crc kubenswrapper[4907]: I0226 16:18:37.786434 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: 
\"kubernetes.io/secret/be614198-ac98-4ed9-926b-c1a2aa9789c5-inventory\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-qrpg9\" (UID: \"be614198-ac98-4ed9-926b-c1a2aa9789c5\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-qrpg9" Feb 26 16:18:37 crc kubenswrapper[4907]: I0226 16:18:37.786462 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-edpm-ipam-libvirt-default-certs-0\" (UniqueName: \"kubernetes.io/projected/be614198-ac98-4ed9-926b-c1a2aa9789c5-openstack-edpm-ipam-libvirt-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-qrpg9\" (UID: \"be614198-ac98-4ed9-926b-c1a2aa9789c5\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-qrpg9" Feb 26 16:18:37 crc kubenswrapper[4907]: I0226 16:18:37.786485 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"repo-setup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/be614198-ac98-4ed9-926b-c1a2aa9789c5-repo-setup-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-qrpg9\" (UID: \"be614198-ac98-4ed9-926b-c1a2aa9789c5\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-qrpg9" Feb 26 16:18:37 crc kubenswrapper[4907]: I0226 16:18:37.786539 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/be614198-ac98-4ed9-926b-c1a2aa9789c5-nova-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-qrpg9\" (UID: \"be614198-ac98-4ed9-926b-c1a2aa9789c5\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-qrpg9" Feb 26 16:18:37 crc kubenswrapper[4907]: I0226 16:18:37.786574 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xpnff\" (UniqueName: \"kubernetes.io/projected/be614198-ac98-4ed9-926b-c1a2aa9789c5-kube-api-access-xpnff\") pod 
\"install-certs-edpm-deployment-openstack-edpm-ipam-qrpg9\" (UID: \"be614198-ac98-4ed9-926b-c1a2aa9789c5\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-qrpg9" Feb 26 16:18:37 crc kubenswrapper[4907]: I0226 16:18:37.786623 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"telemetry-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/be614198-ac98-4ed9-926b-c1a2aa9789c5-telemetry-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-qrpg9\" (UID: \"be614198-ac98-4ed9-926b-c1a2aa9789c5\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-qrpg9" Feb 26 16:18:37 crc kubenswrapper[4907]: I0226 16:18:37.786642 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/be614198-ac98-4ed9-926b-c1a2aa9789c5-neutron-metadata-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-qrpg9\" (UID: \"be614198-ac98-4ed9-926b-c1a2aa9789c5\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-qrpg9" Feb 26 16:18:37 crc kubenswrapper[4907]: I0226 16:18:37.786671 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/be614198-ac98-4ed9-926b-c1a2aa9789c5-libvirt-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-qrpg9\" (UID: \"be614198-ac98-4ed9-926b-c1a2aa9789c5\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-qrpg9" Feb 26 16:18:37 crc kubenswrapper[4907]: I0226 16:18:37.792894 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/be614198-ac98-4ed9-926b-c1a2aa9789c5-libvirt-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-qrpg9\" (UID: \"be614198-ac98-4ed9-926b-c1a2aa9789c5\") " 
pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-qrpg9" Feb 26 16:18:37 crc kubenswrapper[4907]: I0226 16:18:37.792972 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/be614198-ac98-4ed9-926b-c1a2aa9789c5-nova-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-qrpg9\" (UID: \"be614198-ac98-4ed9-926b-c1a2aa9789c5\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-qrpg9" Feb 26 16:18:37 crc kubenswrapper[4907]: I0226 16:18:37.793132 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/be614198-ac98-4ed9-926b-c1a2aa9789c5-inventory\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-qrpg9\" (UID: \"be614198-ac98-4ed9-926b-c1a2aa9789c5\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-qrpg9" Feb 26 16:18:37 crc kubenswrapper[4907]: I0226 16:18:37.793354 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-edpm-ipam-ovn-default-certs-0\" (UniqueName: \"kubernetes.io/projected/be614198-ac98-4ed9-926b-c1a2aa9789c5-openstack-edpm-ipam-ovn-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-qrpg9\" (UID: \"be614198-ac98-4ed9-926b-c1a2aa9789c5\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-qrpg9" Feb 26 16:18:37 crc kubenswrapper[4907]: I0226 16:18:37.793362 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/be614198-ac98-4ed9-926b-c1a2aa9789c5-neutron-metadata-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-qrpg9\" (UID: \"be614198-ac98-4ed9-926b-c1a2aa9789c5\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-qrpg9" Feb 26 16:18:37 crc kubenswrapper[4907]: I0226 16:18:37.794711 4907 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"openstack-edpm-ipam-neutron-metadata-default-certs-0\" (UniqueName: \"kubernetes.io/projected/be614198-ac98-4ed9-926b-c1a2aa9789c5-openstack-edpm-ipam-neutron-metadata-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-qrpg9\" (UID: \"be614198-ac98-4ed9-926b-c1a2aa9789c5\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-qrpg9" Feb 26 16:18:37 crc kubenswrapper[4907]: I0226 16:18:37.795806 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/be614198-ac98-4ed9-926b-c1a2aa9789c5-ssh-key-openstack-edpm-ipam\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-qrpg9\" (UID: \"be614198-ac98-4ed9-926b-c1a2aa9789c5\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-qrpg9" Feb 26 16:18:37 crc kubenswrapper[4907]: I0226 16:18:37.796128 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"repo-setup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/be614198-ac98-4ed9-926b-c1a2aa9789c5-repo-setup-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-qrpg9\" (UID: \"be614198-ac98-4ed9-926b-c1a2aa9789c5\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-qrpg9" Feb 26 16:18:37 crc kubenswrapper[4907]: I0226 16:18:37.796713 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/be614198-ac98-4ed9-926b-c1a2aa9789c5-ovn-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-qrpg9\" (UID: \"be614198-ac98-4ed9-926b-c1a2aa9789c5\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-qrpg9" Feb 26 16:18:37 crc kubenswrapper[4907]: I0226 16:18:37.796853 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-edpm-ipam-telemetry-default-certs-0\" (UniqueName: 
\"kubernetes.io/projected/be614198-ac98-4ed9-926b-c1a2aa9789c5-openstack-edpm-ipam-telemetry-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-qrpg9\" (UID: \"be614198-ac98-4ed9-926b-c1a2aa9789c5\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-qrpg9" Feb 26 16:18:37 crc kubenswrapper[4907]: I0226 16:18:37.798082 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"telemetry-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/be614198-ac98-4ed9-926b-c1a2aa9789c5-telemetry-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-qrpg9\" (UID: \"be614198-ac98-4ed9-926b-c1a2aa9789c5\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-qrpg9" Feb 26 16:18:37 crc kubenswrapper[4907]: I0226 16:18:37.798790 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-edpm-ipam-libvirt-default-certs-0\" (UniqueName: \"kubernetes.io/projected/be614198-ac98-4ed9-926b-c1a2aa9789c5-openstack-edpm-ipam-libvirt-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-qrpg9\" (UID: \"be614198-ac98-4ed9-926b-c1a2aa9789c5\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-qrpg9" Feb 26 16:18:37 crc kubenswrapper[4907]: I0226 16:18:37.804537 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/be614198-ac98-4ed9-926b-c1a2aa9789c5-bootstrap-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-qrpg9\" (UID: \"be614198-ac98-4ed9-926b-c1a2aa9789c5\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-qrpg9" Feb 26 16:18:37 crc kubenswrapper[4907]: I0226 16:18:37.806544 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xpnff\" (UniqueName: \"kubernetes.io/projected/be614198-ac98-4ed9-926b-c1a2aa9789c5-kube-api-access-xpnff\") pod 
\"install-certs-edpm-deployment-openstack-edpm-ipam-qrpg9\" (UID: \"be614198-ac98-4ed9-926b-c1a2aa9789c5\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-qrpg9" Feb 26 16:18:37 crc kubenswrapper[4907]: I0226 16:18:37.909136 4907 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-qrpg9" Feb 26 16:18:38 crc kubenswrapper[4907]: I0226 16:18:38.463387 4907 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/install-certs-edpm-deployment-openstack-edpm-ipam-qrpg9"] Feb 26 16:18:39 crc kubenswrapper[4907]: I0226 16:18:39.074530 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-qrpg9" event={"ID":"be614198-ac98-4ed9-926b-c1a2aa9789c5","Type":"ContainerStarted","Data":"690da2a20b7c430b951f7e46f6c78ffaf95ddfcf7162ca05c78782ccff78e570"} Feb 26 16:18:40 crc kubenswrapper[4907]: I0226 16:18:40.083317 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-qrpg9" event={"ID":"be614198-ac98-4ed9-926b-c1a2aa9789c5","Type":"ContainerStarted","Data":"a8defb98e9a2fb3126a45788dc4555046e3f32194e2a90c806326ae0f98c050c"} Feb 26 16:18:40 crc kubenswrapper[4907]: I0226 16:18:40.109473 4907 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-qrpg9" podStartSLOduration=2.626053484 podStartE2EDuration="3.109452863s" podCreationTimestamp="2026-02-26 16:18:37 +0000 UTC" firstStartedPulling="2026-02-26 16:18:38.460921268 +0000 UTC m=+2180.979483137" lastFinishedPulling="2026-02-26 16:18:38.944320667 +0000 UTC m=+2181.462882516" observedRunningTime="2026-02-26 16:18:40.101330033 +0000 UTC m=+2182.619891882" watchObservedRunningTime="2026-02-26 16:18:40.109452863 +0000 UTC m=+2182.628014712" Feb 26 16:18:58 crc kubenswrapper[4907]: I0226 16:18:58.852265 4907 
kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-98mbl"] Feb 26 16:18:58 crc kubenswrapper[4907]: I0226 16:18:58.855994 4907 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-98mbl" Feb 26 16:18:58 crc kubenswrapper[4907]: I0226 16:18:58.920103 4907 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-98mbl"] Feb 26 16:18:58 crc kubenswrapper[4907]: I0226 16:18:58.971740 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5cba1d8d-90b5-4317-8783-e015167f210a-catalog-content\") pod \"redhat-operators-98mbl\" (UID: \"5cba1d8d-90b5-4317-8783-e015167f210a\") " pod="openshift-marketplace/redhat-operators-98mbl" Feb 26 16:18:58 crc kubenswrapper[4907]: I0226 16:18:58.971916 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4dbwd\" (UniqueName: \"kubernetes.io/projected/5cba1d8d-90b5-4317-8783-e015167f210a-kube-api-access-4dbwd\") pod \"redhat-operators-98mbl\" (UID: \"5cba1d8d-90b5-4317-8783-e015167f210a\") " pod="openshift-marketplace/redhat-operators-98mbl" Feb 26 16:18:58 crc kubenswrapper[4907]: I0226 16:18:58.971945 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5cba1d8d-90b5-4317-8783-e015167f210a-utilities\") pod \"redhat-operators-98mbl\" (UID: \"5cba1d8d-90b5-4317-8783-e015167f210a\") " pod="openshift-marketplace/redhat-operators-98mbl" Feb 26 16:18:59 crc kubenswrapper[4907]: I0226 16:18:59.073856 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4dbwd\" (UniqueName: \"kubernetes.io/projected/5cba1d8d-90b5-4317-8783-e015167f210a-kube-api-access-4dbwd\") pod \"redhat-operators-98mbl\" (UID: 
\"5cba1d8d-90b5-4317-8783-e015167f210a\") " pod="openshift-marketplace/redhat-operators-98mbl" Feb 26 16:18:59 crc kubenswrapper[4907]: I0226 16:18:59.073924 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5cba1d8d-90b5-4317-8783-e015167f210a-utilities\") pod \"redhat-operators-98mbl\" (UID: \"5cba1d8d-90b5-4317-8783-e015167f210a\") " pod="openshift-marketplace/redhat-operators-98mbl" Feb 26 16:18:59 crc kubenswrapper[4907]: I0226 16:18:59.073963 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5cba1d8d-90b5-4317-8783-e015167f210a-catalog-content\") pod \"redhat-operators-98mbl\" (UID: \"5cba1d8d-90b5-4317-8783-e015167f210a\") " pod="openshift-marketplace/redhat-operators-98mbl" Feb 26 16:18:59 crc kubenswrapper[4907]: I0226 16:18:59.074582 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5cba1d8d-90b5-4317-8783-e015167f210a-catalog-content\") pod \"redhat-operators-98mbl\" (UID: \"5cba1d8d-90b5-4317-8783-e015167f210a\") " pod="openshift-marketplace/redhat-operators-98mbl" Feb 26 16:18:59 crc kubenswrapper[4907]: I0226 16:18:59.074635 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5cba1d8d-90b5-4317-8783-e015167f210a-utilities\") pod \"redhat-operators-98mbl\" (UID: \"5cba1d8d-90b5-4317-8783-e015167f210a\") " pod="openshift-marketplace/redhat-operators-98mbl" Feb 26 16:18:59 crc kubenswrapper[4907]: I0226 16:18:59.096140 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4dbwd\" (UniqueName: \"kubernetes.io/projected/5cba1d8d-90b5-4317-8783-e015167f210a-kube-api-access-4dbwd\") pod \"redhat-operators-98mbl\" (UID: \"5cba1d8d-90b5-4317-8783-e015167f210a\") " 
pod="openshift-marketplace/redhat-operators-98mbl" Feb 26 16:18:59 crc kubenswrapper[4907]: I0226 16:18:59.175357 4907 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-98mbl" Feb 26 16:18:59 crc kubenswrapper[4907]: I0226 16:18:59.688752 4907 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-98mbl"] Feb 26 16:19:00 crc kubenswrapper[4907]: I0226 16:19:00.258217 4907 generic.go:334] "Generic (PLEG): container finished" podID="5cba1d8d-90b5-4317-8783-e015167f210a" containerID="aeebc634579e6f573020496767229e60cbc08dd43cd92d5a57896152eb5a0ebf" exitCode=0 Feb 26 16:19:00 crc kubenswrapper[4907]: I0226 16:19:00.258259 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-98mbl" event={"ID":"5cba1d8d-90b5-4317-8783-e015167f210a","Type":"ContainerDied","Data":"aeebc634579e6f573020496767229e60cbc08dd43cd92d5a57896152eb5a0ebf"} Feb 26 16:19:00 crc kubenswrapper[4907]: I0226 16:19:00.260016 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-98mbl" event={"ID":"5cba1d8d-90b5-4317-8783-e015167f210a","Type":"ContainerStarted","Data":"a6420d415d81708da25eb47ab5a67e3591131c6ca7f922c6c752c4a8c8ddd086"} Feb 26 16:19:01 crc kubenswrapper[4907]: I0226 16:19:01.268523 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-98mbl" event={"ID":"5cba1d8d-90b5-4317-8783-e015167f210a","Type":"ContainerStarted","Data":"1bbf08239a9066dd5920669ee6d0a4a2dce4b10123d1d9d252e5f304c1368f35"} Feb 26 16:19:08 crc kubenswrapper[4907]: I0226 16:19:08.324873 4907 generic.go:334] "Generic (PLEG): container finished" podID="5cba1d8d-90b5-4317-8783-e015167f210a" containerID="1bbf08239a9066dd5920669ee6d0a4a2dce4b10123d1d9d252e5f304c1368f35" exitCode=0 Feb 26 16:19:08 crc kubenswrapper[4907]: I0226 16:19:08.324919 4907 kubelet.go:2453] "SyncLoop (PLEG): event for 
pod" pod="openshift-marketplace/redhat-operators-98mbl" event={"ID":"5cba1d8d-90b5-4317-8783-e015167f210a","Type":"ContainerDied","Data":"1bbf08239a9066dd5920669ee6d0a4a2dce4b10123d1d9d252e5f304c1368f35"} Feb 26 16:19:09 crc kubenswrapper[4907]: I0226 16:19:09.336979 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-98mbl" event={"ID":"5cba1d8d-90b5-4317-8783-e015167f210a","Type":"ContainerStarted","Data":"169a595c846cc79524611a17f6ed318f5bf5cf9c391371b3b5f09f544ffc905e"} Feb 26 16:19:09 crc kubenswrapper[4907]: I0226 16:19:09.361966 4907 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-98mbl" podStartSLOduration=2.673265722 podStartE2EDuration="11.36194913s" podCreationTimestamp="2026-02-26 16:18:58 +0000 UTC" firstStartedPulling="2026-02-26 16:19:00.259648987 +0000 UTC m=+2202.778210836" lastFinishedPulling="2026-02-26 16:19:08.948332375 +0000 UTC m=+2211.466894244" observedRunningTime="2026-02-26 16:19:09.358104305 +0000 UTC m=+2211.876666154" watchObservedRunningTime="2026-02-26 16:19:09.36194913 +0000 UTC m=+2211.880510999" Feb 26 16:19:14 crc kubenswrapper[4907]: I0226 16:19:14.376731 4907 generic.go:334] "Generic (PLEG): container finished" podID="be614198-ac98-4ed9-926b-c1a2aa9789c5" containerID="a8defb98e9a2fb3126a45788dc4555046e3f32194e2a90c806326ae0f98c050c" exitCode=0 Feb 26 16:19:14 crc kubenswrapper[4907]: I0226 16:19:14.376861 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-qrpg9" event={"ID":"be614198-ac98-4ed9-926b-c1a2aa9789c5","Type":"ContainerDied","Data":"a8defb98e9a2fb3126a45788dc4555046e3f32194e2a90c806326ae0f98c050c"} Feb 26 16:19:15 crc kubenswrapper[4907]: I0226 16:19:15.871238 4907 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-qrpg9" Feb 26 16:19:15 crc kubenswrapper[4907]: I0226 16:19:15.991694 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/be614198-ac98-4ed9-926b-c1a2aa9789c5-bootstrap-combined-ca-bundle\") pod \"be614198-ac98-4ed9-926b-c1a2aa9789c5\" (UID: \"be614198-ac98-4ed9-926b-c1a2aa9789c5\") " Feb 26 16:19:15 crc kubenswrapper[4907]: I0226 16:19:15.991746 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"openstack-edpm-ipam-telemetry-default-certs-0\" (UniqueName: \"kubernetes.io/projected/be614198-ac98-4ed9-926b-c1a2aa9789c5-openstack-edpm-ipam-telemetry-default-certs-0\") pod \"be614198-ac98-4ed9-926b-c1a2aa9789c5\" (UID: \"be614198-ac98-4ed9-926b-c1a2aa9789c5\") " Feb 26 16:19:15 crc kubenswrapper[4907]: I0226 16:19:15.991793 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/be614198-ac98-4ed9-926b-c1a2aa9789c5-inventory\") pod \"be614198-ac98-4ed9-926b-c1a2aa9789c5\" (UID: \"be614198-ac98-4ed9-926b-c1a2aa9789c5\") " Feb 26 16:19:15 crc kubenswrapper[4907]: I0226 16:19:15.991871 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"openstack-edpm-ipam-ovn-default-certs-0\" (UniqueName: \"kubernetes.io/projected/be614198-ac98-4ed9-926b-c1a2aa9789c5-openstack-edpm-ipam-ovn-default-certs-0\") pod \"be614198-ac98-4ed9-926b-c1a2aa9789c5\" (UID: \"be614198-ac98-4ed9-926b-c1a2aa9789c5\") " Feb 26 16:19:15 crc kubenswrapper[4907]: I0226 16:19:15.991899 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"openstack-edpm-ipam-neutron-metadata-default-certs-0\" (UniqueName: \"kubernetes.io/projected/be614198-ac98-4ed9-926b-c1a2aa9789c5-openstack-edpm-ipam-neutron-metadata-default-certs-0\") pod 
\"be614198-ac98-4ed9-926b-c1a2aa9789c5\" (UID: \"be614198-ac98-4ed9-926b-c1a2aa9789c5\") " Feb 26 16:19:15 crc kubenswrapper[4907]: I0226 16:19:15.991922 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/be614198-ac98-4ed9-926b-c1a2aa9789c5-neutron-metadata-combined-ca-bundle\") pod \"be614198-ac98-4ed9-926b-c1a2aa9789c5\" (UID: \"be614198-ac98-4ed9-926b-c1a2aa9789c5\") " Feb 26 16:19:15 crc kubenswrapper[4907]: I0226 16:19:15.991951 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/be614198-ac98-4ed9-926b-c1a2aa9789c5-ssh-key-openstack-edpm-ipam\") pod \"be614198-ac98-4ed9-926b-c1a2aa9789c5\" (UID: \"be614198-ac98-4ed9-926b-c1a2aa9789c5\") " Feb 26 16:19:15 crc kubenswrapper[4907]: I0226 16:19:15.992011 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"telemetry-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/be614198-ac98-4ed9-926b-c1a2aa9789c5-telemetry-combined-ca-bundle\") pod \"be614198-ac98-4ed9-926b-c1a2aa9789c5\" (UID: \"be614198-ac98-4ed9-926b-c1a2aa9789c5\") " Feb 26 16:19:15 crc kubenswrapper[4907]: I0226 16:19:15.992042 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"openstack-edpm-ipam-libvirt-default-certs-0\" (UniqueName: \"kubernetes.io/projected/be614198-ac98-4ed9-926b-c1a2aa9789c5-openstack-edpm-ipam-libvirt-default-certs-0\") pod \"be614198-ac98-4ed9-926b-c1a2aa9789c5\" (UID: \"be614198-ac98-4ed9-926b-c1a2aa9789c5\") " Feb 26 16:19:15 crc kubenswrapper[4907]: I0226 16:19:15.992062 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/be614198-ac98-4ed9-926b-c1a2aa9789c5-nova-combined-ca-bundle\") pod \"be614198-ac98-4ed9-926b-c1a2aa9789c5\" (UID: 
\"be614198-ac98-4ed9-926b-c1a2aa9789c5\") " Feb 26 16:19:15 crc kubenswrapper[4907]: I0226 16:19:15.992097 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xpnff\" (UniqueName: \"kubernetes.io/projected/be614198-ac98-4ed9-926b-c1a2aa9789c5-kube-api-access-xpnff\") pod \"be614198-ac98-4ed9-926b-c1a2aa9789c5\" (UID: \"be614198-ac98-4ed9-926b-c1a2aa9789c5\") " Feb 26 16:19:15 crc kubenswrapper[4907]: I0226 16:19:15.992115 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/be614198-ac98-4ed9-926b-c1a2aa9789c5-ovn-combined-ca-bundle\") pod \"be614198-ac98-4ed9-926b-c1a2aa9789c5\" (UID: \"be614198-ac98-4ed9-926b-c1a2aa9789c5\") " Feb 26 16:19:15 crc kubenswrapper[4907]: I0226 16:19:15.992129 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"repo-setup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/be614198-ac98-4ed9-926b-c1a2aa9789c5-repo-setup-combined-ca-bundle\") pod \"be614198-ac98-4ed9-926b-c1a2aa9789c5\" (UID: \"be614198-ac98-4ed9-926b-c1a2aa9789c5\") " Feb 26 16:19:15 crc kubenswrapper[4907]: I0226 16:19:15.992189 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/be614198-ac98-4ed9-926b-c1a2aa9789c5-libvirt-combined-ca-bundle\") pod \"be614198-ac98-4ed9-926b-c1a2aa9789c5\" (UID: \"be614198-ac98-4ed9-926b-c1a2aa9789c5\") " Feb 26 16:19:15 crc kubenswrapper[4907]: I0226 16:19:15.997155 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/be614198-ac98-4ed9-926b-c1a2aa9789c5-openstack-edpm-ipam-ovn-default-certs-0" (OuterVolumeSpecName: "openstack-edpm-ipam-ovn-default-certs-0") pod "be614198-ac98-4ed9-926b-c1a2aa9789c5" (UID: "be614198-ac98-4ed9-926b-c1a2aa9789c5"). InnerVolumeSpecName "openstack-edpm-ipam-ovn-default-certs-0". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 26 16:19:15 crc kubenswrapper[4907]: I0226 16:19:15.998991 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/be614198-ac98-4ed9-926b-c1a2aa9789c5-openstack-edpm-ipam-telemetry-default-certs-0" (OuterVolumeSpecName: "openstack-edpm-ipam-telemetry-default-certs-0") pod "be614198-ac98-4ed9-926b-c1a2aa9789c5" (UID: "be614198-ac98-4ed9-926b-c1a2aa9789c5"). InnerVolumeSpecName "openstack-edpm-ipam-telemetry-default-certs-0". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 26 16:19:16 crc kubenswrapper[4907]: I0226 16:19:16.001748 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/be614198-ac98-4ed9-926b-c1a2aa9789c5-bootstrap-combined-ca-bundle" (OuterVolumeSpecName: "bootstrap-combined-ca-bundle") pod "be614198-ac98-4ed9-926b-c1a2aa9789c5" (UID: "be614198-ac98-4ed9-926b-c1a2aa9789c5"). InnerVolumeSpecName "bootstrap-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 26 16:19:16 crc kubenswrapper[4907]: I0226 16:19:16.002508 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/be614198-ac98-4ed9-926b-c1a2aa9789c5-openstack-edpm-ipam-libvirt-default-certs-0" (OuterVolumeSpecName: "openstack-edpm-ipam-libvirt-default-certs-0") pod "be614198-ac98-4ed9-926b-c1a2aa9789c5" (UID: "be614198-ac98-4ed9-926b-c1a2aa9789c5"). InnerVolumeSpecName "openstack-edpm-ipam-libvirt-default-certs-0". 
PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 26 16:19:16 crc kubenswrapper[4907]: I0226 16:19:16.002816 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/be614198-ac98-4ed9-926b-c1a2aa9789c5-openstack-edpm-ipam-neutron-metadata-default-certs-0" (OuterVolumeSpecName: "openstack-edpm-ipam-neutron-metadata-default-certs-0") pod "be614198-ac98-4ed9-926b-c1a2aa9789c5" (UID: "be614198-ac98-4ed9-926b-c1a2aa9789c5"). InnerVolumeSpecName "openstack-edpm-ipam-neutron-metadata-default-certs-0". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 26 16:19:16 crc kubenswrapper[4907]: I0226 16:19:16.003132 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/be614198-ac98-4ed9-926b-c1a2aa9789c5-kube-api-access-xpnff" (OuterVolumeSpecName: "kube-api-access-xpnff") pod "be614198-ac98-4ed9-926b-c1a2aa9789c5" (UID: "be614198-ac98-4ed9-926b-c1a2aa9789c5"). InnerVolumeSpecName "kube-api-access-xpnff". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 26 16:19:16 crc kubenswrapper[4907]: I0226 16:19:16.005702 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/be614198-ac98-4ed9-926b-c1a2aa9789c5-libvirt-combined-ca-bundle" (OuterVolumeSpecName: "libvirt-combined-ca-bundle") pod "be614198-ac98-4ed9-926b-c1a2aa9789c5" (UID: "be614198-ac98-4ed9-926b-c1a2aa9789c5"). InnerVolumeSpecName "libvirt-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 26 16:19:16 crc kubenswrapper[4907]: I0226 16:19:16.006135 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/be614198-ac98-4ed9-926b-c1a2aa9789c5-telemetry-combined-ca-bundle" (OuterVolumeSpecName: "telemetry-combined-ca-bundle") pod "be614198-ac98-4ed9-926b-c1a2aa9789c5" (UID: "be614198-ac98-4ed9-926b-c1a2aa9789c5"). InnerVolumeSpecName "telemetry-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 26 16:19:16 crc kubenswrapper[4907]: I0226 16:19:16.006275 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/be614198-ac98-4ed9-926b-c1a2aa9789c5-nova-combined-ca-bundle" (OuterVolumeSpecName: "nova-combined-ca-bundle") pod "be614198-ac98-4ed9-926b-c1a2aa9789c5" (UID: "be614198-ac98-4ed9-926b-c1a2aa9789c5"). InnerVolumeSpecName "nova-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 26 16:19:16 crc kubenswrapper[4907]: I0226 16:19:16.007992 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/be614198-ac98-4ed9-926b-c1a2aa9789c5-neutron-metadata-combined-ca-bundle" (OuterVolumeSpecName: "neutron-metadata-combined-ca-bundle") pod "be614198-ac98-4ed9-926b-c1a2aa9789c5" (UID: "be614198-ac98-4ed9-926b-c1a2aa9789c5"). InnerVolumeSpecName "neutron-metadata-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 26 16:19:16 crc kubenswrapper[4907]: I0226 16:19:16.008750 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/be614198-ac98-4ed9-926b-c1a2aa9789c5-ovn-combined-ca-bundle" (OuterVolumeSpecName: "ovn-combined-ca-bundle") pod "be614198-ac98-4ed9-926b-c1a2aa9789c5" (UID: "be614198-ac98-4ed9-926b-c1a2aa9789c5"). InnerVolumeSpecName "ovn-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 26 16:19:16 crc kubenswrapper[4907]: I0226 16:19:16.014775 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/be614198-ac98-4ed9-926b-c1a2aa9789c5-repo-setup-combined-ca-bundle" (OuterVolumeSpecName: "repo-setup-combined-ca-bundle") pod "be614198-ac98-4ed9-926b-c1a2aa9789c5" (UID: "be614198-ac98-4ed9-926b-c1a2aa9789c5"). InnerVolumeSpecName "repo-setup-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 26 16:19:16 crc kubenswrapper[4907]: I0226 16:19:16.033624 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/be614198-ac98-4ed9-926b-c1a2aa9789c5-ssh-key-openstack-edpm-ipam" (OuterVolumeSpecName: "ssh-key-openstack-edpm-ipam") pod "be614198-ac98-4ed9-926b-c1a2aa9789c5" (UID: "be614198-ac98-4ed9-926b-c1a2aa9789c5"). InnerVolumeSpecName "ssh-key-openstack-edpm-ipam". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 26 16:19:16 crc kubenswrapper[4907]: I0226 16:19:16.054840 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/be614198-ac98-4ed9-926b-c1a2aa9789c5-inventory" (OuterVolumeSpecName: "inventory") pod "be614198-ac98-4ed9-926b-c1a2aa9789c5" (UID: "be614198-ac98-4ed9-926b-c1a2aa9789c5"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 26 16:19:16 crc kubenswrapper[4907]: I0226 16:19:16.094501 4907 reconciler_common.go:293] "Volume detached for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/be614198-ac98-4ed9-926b-c1a2aa9789c5-bootstrap-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Feb 26 16:19:16 crc kubenswrapper[4907]: I0226 16:19:16.094545 4907 reconciler_common.go:293] "Volume detached for volume \"openstack-edpm-ipam-telemetry-default-certs-0\" (UniqueName: \"kubernetes.io/projected/be614198-ac98-4ed9-926b-c1a2aa9789c5-openstack-edpm-ipam-telemetry-default-certs-0\") on node \"crc\" DevicePath \"\""
Feb 26 16:19:16 crc kubenswrapper[4907]: I0226 16:19:16.094561 4907 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/be614198-ac98-4ed9-926b-c1a2aa9789c5-inventory\") on node \"crc\" DevicePath \"\""
Feb 26 16:19:16 crc kubenswrapper[4907]: I0226 16:19:16.094575 4907 reconciler_common.go:293] "Volume detached for volume \"openstack-edpm-ipam-ovn-default-certs-0\" (UniqueName: \"kubernetes.io/projected/be614198-ac98-4ed9-926b-c1a2aa9789c5-openstack-edpm-ipam-ovn-default-certs-0\") on node \"crc\" DevicePath \"\""
Feb 26 16:19:16 crc kubenswrapper[4907]: I0226 16:19:16.094611 4907 reconciler_common.go:293] "Volume detached for volume \"openstack-edpm-ipam-neutron-metadata-default-certs-0\" (UniqueName: \"kubernetes.io/projected/be614198-ac98-4ed9-926b-c1a2aa9789c5-openstack-edpm-ipam-neutron-metadata-default-certs-0\") on node \"crc\" DevicePath \"\""
Feb 26 16:19:16 crc kubenswrapper[4907]: I0226 16:19:16.094626 4907 reconciler_common.go:293] "Volume detached for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/be614198-ac98-4ed9-926b-c1a2aa9789c5-neutron-metadata-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Feb 26 16:19:16 crc kubenswrapper[4907]: I0226 16:19:16.094668 4907 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/be614198-ac98-4ed9-926b-c1a2aa9789c5-ssh-key-openstack-edpm-ipam\") on node \"crc\" DevicePath \"\""
Feb 26 16:19:16 crc kubenswrapper[4907]: I0226 16:19:16.094684 4907 reconciler_common.go:293] "Volume detached for volume \"telemetry-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/be614198-ac98-4ed9-926b-c1a2aa9789c5-telemetry-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Feb 26 16:19:16 crc kubenswrapper[4907]: I0226 16:19:16.094706 4907 reconciler_common.go:293] "Volume detached for volume \"openstack-edpm-ipam-libvirt-default-certs-0\" (UniqueName: \"kubernetes.io/projected/be614198-ac98-4ed9-926b-c1a2aa9789c5-openstack-edpm-ipam-libvirt-default-certs-0\") on node \"crc\" DevicePath \"\""
Feb 26 16:19:16 crc kubenswrapper[4907]: I0226 16:19:16.094726 4907 reconciler_common.go:293] "Volume detached for volume \"nova-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/be614198-ac98-4ed9-926b-c1a2aa9789c5-nova-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Feb 26 16:19:16 crc kubenswrapper[4907]: I0226 16:19:16.094759 4907 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xpnff\" (UniqueName: \"kubernetes.io/projected/be614198-ac98-4ed9-926b-c1a2aa9789c5-kube-api-access-xpnff\") on node \"crc\" DevicePath \"\""
Feb 26 16:19:16 crc kubenswrapper[4907]: I0226 16:19:16.094771 4907 reconciler_common.go:293] "Volume detached for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/be614198-ac98-4ed9-926b-c1a2aa9789c5-ovn-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Feb 26 16:19:16 crc kubenswrapper[4907]: I0226 16:19:16.094786 4907 reconciler_common.go:293] "Volume detached for volume \"repo-setup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/be614198-ac98-4ed9-926b-c1a2aa9789c5-repo-setup-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Feb 26 16:19:16 crc kubenswrapper[4907]: I0226 16:19:16.094801 4907 reconciler_common.go:293] "Volume detached for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/be614198-ac98-4ed9-926b-c1a2aa9789c5-libvirt-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Feb 26 16:19:16 crc kubenswrapper[4907]: I0226 16:19:16.399059 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-qrpg9" event={"ID":"be614198-ac98-4ed9-926b-c1a2aa9789c5","Type":"ContainerDied","Data":"690da2a20b7c430b951f7e46f6c78ffaf95ddfcf7162ca05c78782ccff78e570"}
Feb 26 16:19:16 crc kubenswrapper[4907]: I0226 16:19:16.399431 4907 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="690da2a20b7c430b951f7e46f6c78ffaf95ddfcf7162ca05c78782ccff78e570"
Feb 26 16:19:16 crc kubenswrapper[4907]: I0226 16:19:16.399286 4907 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-qrpg9"
Feb 26 16:19:16 crc kubenswrapper[4907]: I0226 16:19:16.513884 4907 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovn-edpm-deployment-openstack-edpm-ipam-h29r8"]
Feb 26 16:19:16 crc kubenswrapper[4907]: E0226 16:19:16.514262 4907 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="be614198-ac98-4ed9-926b-c1a2aa9789c5" containerName="install-certs-edpm-deployment-openstack-edpm-ipam"
Feb 26 16:19:16 crc kubenswrapper[4907]: I0226 16:19:16.514285 4907 state_mem.go:107] "Deleted CPUSet assignment" podUID="be614198-ac98-4ed9-926b-c1a2aa9789c5" containerName="install-certs-edpm-deployment-openstack-edpm-ipam"
Feb 26 16:19:16 crc kubenswrapper[4907]: I0226 16:19:16.514535 4907 memory_manager.go:354] "RemoveStaleState removing state" podUID="be614198-ac98-4ed9-926b-c1a2aa9789c5" containerName="install-certs-edpm-deployment-openstack-edpm-ipam"
Feb 26 16:19:16 crc kubenswrapper[4907]: I0226 16:19:16.515390 4907 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-h29r8"
Feb 26 16:19:16 crc kubenswrapper[4907]: I0226 16:19:16.517792 4907 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovncontroller-config"
Feb 26 16:19:16 crc kubenswrapper[4907]: I0226 16:19:16.517856 4907 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret"
Feb 26 16:19:16 crc kubenswrapper[4907]: I0226 16:19:16.521204 4907 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env"
Feb 26 16:19:16 crc kubenswrapper[4907]: I0226 16:19:16.524039 4907 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam"
Feb 26 16:19:16 crc kubenswrapper[4907]: I0226 16:19:16.524056 4907 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-57jxc"
Feb 26 16:19:16 crc kubenswrapper[4907]: I0226 16:19:16.551552 4907 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-edpm-deployment-openstack-edpm-ipam-h29r8"]
Feb 26 16:19:16 crc kubenswrapper[4907]: I0226 16:19:16.602685 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/b796cd80-c3e7-428e-a090-1569637819e8-inventory\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-h29r8\" (UID: \"b796cd80-c3e7-428e-a090-1569637819e8\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-h29r8"
Feb 26 16:19:16 crc kubenswrapper[4907]: I0226 16:19:16.602801 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b796cd80-c3e7-428e-a090-1569637819e8-ovn-combined-ca-bundle\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-h29r8\" (UID: \"b796cd80-c3e7-428e-a090-1569637819e8\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-h29r8"
Feb 26 16:19:16 crc kubenswrapper[4907]: I0226 16:19:16.602844 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovncontroller-config-0\" (UniqueName: \"kubernetes.io/configmap/b796cd80-c3e7-428e-a090-1569637819e8-ovncontroller-config-0\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-h29r8\" (UID: \"b796cd80-c3e7-428e-a090-1569637819e8\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-h29r8"
Feb 26 16:19:16 crc kubenswrapper[4907]: I0226 16:19:16.602901 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/b796cd80-c3e7-428e-a090-1569637819e8-ssh-key-openstack-edpm-ipam\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-h29r8\" (UID: \"b796cd80-c3e7-428e-a090-1569637819e8\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-h29r8"
Feb 26 16:19:16 crc kubenswrapper[4907]: I0226 16:19:16.602974 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8jzvj\" (UniqueName: \"kubernetes.io/projected/b796cd80-c3e7-428e-a090-1569637819e8-kube-api-access-8jzvj\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-h29r8\" (UID: \"b796cd80-c3e7-428e-a090-1569637819e8\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-h29r8"
Feb 26 16:19:16 crc kubenswrapper[4907]: I0226 16:19:16.704153 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/b796cd80-c3e7-428e-a090-1569637819e8-inventory\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-h29r8\" (UID: \"b796cd80-c3e7-428e-a090-1569637819e8\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-h29r8"
Feb 26 16:19:16 crc kubenswrapper[4907]: I0226 16:19:16.704523 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b796cd80-c3e7-428e-a090-1569637819e8-ovn-combined-ca-bundle\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-h29r8\" (UID: \"b796cd80-c3e7-428e-a090-1569637819e8\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-h29r8"
Feb 26 16:19:16 crc kubenswrapper[4907]: I0226 16:19:16.704683 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovncontroller-config-0\" (UniqueName: \"kubernetes.io/configmap/b796cd80-c3e7-428e-a090-1569637819e8-ovncontroller-config-0\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-h29r8\" (UID: \"b796cd80-c3e7-428e-a090-1569637819e8\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-h29r8"
Feb 26 16:19:16 crc kubenswrapper[4907]: I0226 16:19:16.704821 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/b796cd80-c3e7-428e-a090-1569637819e8-ssh-key-openstack-edpm-ipam\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-h29r8\" (UID: \"b796cd80-c3e7-428e-a090-1569637819e8\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-h29r8"
Feb 26 16:19:16 crc kubenswrapper[4907]: I0226 16:19:16.704982 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8jzvj\" (UniqueName: \"kubernetes.io/projected/b796cd80-c3e7-428e-a090-1569637819e8-kube-api-access-8jzvj\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-h29r8\" (UID: \"b796cd80-c3e7-428e-a090-1569637819e8\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-h29r8"
Feb 26 16:19:16 crc kubenswrapper[4907]: I0226 16:19:16.705515 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovncontroller-config-0\" (UniqueName: \"kubernetes.io/configmap/b796cd80-c3e7-428e-a090-1569637819e8-ovncontroller-config-0\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-h29r8\" (UID: \"b796cd80-c3e7-428e-a090-1569637819e8\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-h29r8"
Feb 26 16:19:16 crc kubenswrapper[4907]: I0226 16:19:16.708767 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/b796cd80-c3e7-428e-a090-1569637819e8-inventory\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-h29r8\" (UID: \"b796cd80-c3e7-428e-a090-1569637819e8\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-h29r8"
Feb 26 16:19:16 crc kubenswrapper[4907]: I0226 16:19:16.709229 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b796cd80-c3e7-428e-a090-1569637819e8-ovn-combined-ca-bundle\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-h29r8\" (UID: \"b796cd80-c3e7-428e-a090-1569637819e8\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-h29r8"
Feb 26 16:19:16 crc kubenswrapper[4907]: I0226 16:19:16.714086 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/b796cd80-c3e7-428e-a090-1569637819e8-ssh-key-openstack-edpm-ipam\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-h29r8\" (UID: \"b796cd80-c3e7-428e-a090-1569637819e8\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-h29r8"
Feb 26 16:19:16 crc kubenswrapper[4907]: I0226 16:19:16.722066 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8jzvj\" (UniqueName: \"kubernetes.io/projected/b796cd80-c3e7-428e-a090-1569637819e8-kube-api-access-8jzvj\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-h29r8\" (UID: \"b796cd80-c3e7-428e-a090-1569637819e8\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-h29r8"
Feb 26 16:19:16 crc kubenswrapper[4907]: I0226 16:19:16.829395 4907 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-h29r8"
Feb 26 16:19:17 crc kubenswrapper[4907]: I0226 16:19:17.402904 4907 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-edpm-deployment-openstack-edpm-ipam-h29r8"]
Feb 26 16:19:18 crc kubenswrapper[4907]: I0226 16:19:18.419163 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-h29r8" event={"ID":"b796cd80-c3e7-428e-a090-1569637819e8","Type":"ContainerStarted","Data":"23c5f81c3ec6d958741e03dc12fee81a14233c7bf5e14820eea9481385761325"}
Feb 26 16:19:18 crc kubenswrapper[4907]: I0226 16:19:18.420207 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-h29r8" event={"ID":"b796cd80-c3e7-428e-a090-1569637819e8","Type":"ContainerStarted","Data":"a268e7338a81df1605f6cf5e0a0280ad9aa14eea76f478b328d3a7fdda8ddd5a"}
Feb 26 16:19:18 crc kubenswrapper[4907]: I0226 16:19:18.439863 4907 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-h29r8" podStartSLOduration=1.972143878 podStartE2EDuration="2.439833377s" podCreationTimestamp="2026-02-26 16:19:16 +0000 UTC" firstStartedPulling="2026-02-26 16:19:17.408853936 +0000 UTC m=+2219.927415786" lastFinishedPulling="2026-02-26 16:19:17.876543436 +0000 UTC m=+2220.395105285" observedRunningTime="2026-02-26 16:19:18.433448752 +0000 UTC m=+2220.952010611" watchObservedRunningTime="2026-02-26 16:19:18.439833377 +0000 UTC m=+2220.958395226"
Feb 26 16:19:18 crc kubenswrapper[4907]: I0226 16:19:18.530784 4907 patch_prober.go:28] interesting pod/machine-config-daemon-v5ng6 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Feb 26 16:19:18 crc kubenswrapper[4907]: I0226 16:19:18.530873 4907 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-v5ng6" podUID="917eebf3-db36-47b8-af0a-b80d042fddab" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Feb 26 16:19:19 crc kubenswrapper[4907]: I0226 16:19:19.176056 4907 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-98mbl"
Feb 26 16:19:19 crc kubenswrapper[4907]: I0226 16:19:19.176108 4907 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-98mbl"
Feb 26 16:19:20 crc kubenswrapper[4907]: I0226 16:19:20.231239 4907 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-98mbl" podUID="5cba1d8d-90b5-4317-8783-e015167f210a" containerName="registry-server" probeResult="failure" output=<
Feb 26 16:19:20 crc kubenswrapper[4907]: timeout: failed to connect service ":50051" within 1s
Feb 26 16:19:20 crc kubenswrapper[4907]: >
Feb 26 16:19:29 crc kubenswrapper[4907]: I0226 16:19:29.237337 4907 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-98mbl"
Feb 26 16:19:29 crc kubenswrapper[4907]: I0226 16:19:29.301930 4907 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-98mbl"
Feb 26 16:19:30 crc kubenswrapper[4907]: I0226 16:19:30.055726 4907 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-98mbl"]
Feb 26 16:19:30 crc kubenswrapper[4907]: I0226 16:19:30.521004 4907 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-98mbl" podUID="5cba1d8d-90b5-4317-8783-e015167f210a" containerName="registry-server" containerID="cri-o://169a595c846cc79524611a17f6ed318f5bf5cf9c391371b3b5f09f544ffc905e" gracePeriod=2
Feb 26 16:19:31 crc kubenswrapper[4907]: I0226 16:19:31.039372 4907 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-98mbl"
Feb 26 16:19:31 crc kubenswrapper[4907]: I0226 16:19:31.174984 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5cba1d8d-90b5-4317-8783-e015167f210a-catalog-content\") pod \"5cba1d8d-90b5-4317-8783-e015167f210a\" (UID: \"5cba1d8d-90b5-4317-8783-e015167f210a\") "
Feb 26 16:19:31 crc kubenswrapper[4907]: I0226 16:19:31.175126 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4dbwd\" (UniqueName: \"kubernetes.io/projected/5cba1d8d-90b5-4317-8783-e015167f210a-kube-api-access-4dbwd\") pod \"5cba1d8d-90b5-4317-8783-e015167f210a\" (UID: \"5cba1d8d-90b5-4317-8783-e015167f210a\") "
Feb 26 16:19:31 crc kubenswrapper[4907]: I0226 16:19:31.175157 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5cba1d8d-90b5-4317-8783-e015167f210a-utilities\") pod \"5cba1d8d-90b5-4317-8783-e015167f210a\" (UID: \"5cba1d8d-90b5-4317-8783-e015167f210a\") "
Feb 26 16:19:31 crc kubenswrapper[4907]: I0226 16:19:31.176153 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5cba1d8d-90b5-4317-8783-e015167f210a-utilities" (OuterVolumeSpecName: "utilities") pod "5cba1d8d-90b5-4317-8783-e015167f210a" (UID: "5cba1d8d-90b5-4317-8783-e015167f210a"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Feb 26 16:19:31 crc kubenswrapper[4907]: I0226 16:19:31.186744 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5cba1d8d-90b5-4317-8783-e015167f210a-kube-api-access-4dbwd" (OuterVolumeSpecName: "kube-api-access-4dbwd") pod "5cba1d8d-90b5-4317-8783-e015167f210a" (UID: "5cba1d8d-90b5-4317-8783-e015167f210a"). InnerVolumeSpecName "kube-api-access-4dbwd". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 26 16:19:31 crc kubenswrapper[4907]: I0226 16:19:31.277792 4907 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4dbwd\" (UniqueName: \"kubernetes.io/projected/5cba1d8d-90b5-4317-8783-e015167f210a-kube-api-access-4dbwd\") on node \"crc\" DevicePath \"\""
Feb 26 16:19:31 crc kubenswrapper[4907]: I0226 16:19:31.277841 4907 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5cba1d8d-90b5-4317-8783-e015167f210a-utilities\") on node \"crc\" DevicePath \"\""
Feb 26 16:19:31 crc kubenswrapper[4907]: I0226 16:19:31.323016 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5cba1d8d-90b5-4317-8783-e015167f210a-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "5cba1d8d-90b5-4317-8783-e015167f210a" (UID: "5cba1d8d-90b5-4317-8783-e015167f210a"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Feb 26 16:19:31 crc kubenswrapper[4907]: I0226 16:19:31.379332 4907 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5cba1d8d-90b5-4317-8783-e015167f210a-catalog-content\") on node \"crc\" DevicePath \"\""
Feb 26 16:19:31 crc kubenswrapper[4907]: I0226 16:19:31.531611 4907 generic.go:334] "Generic (PLEG): container finished" podID="5cba1d8d-90b5-4317-8783-e015167f210a" containerID="169a595c846cc79524611a17f6ed318f5bf5cf9c391371b3b5f09f544ffc905e" exitCode=0
Feb 26 16:19:31 crc kubenswrapper[4907]: I0226 16:19:31.531649 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-98mbl" event={"ID":"5cba1d8d-90b5-4317-8783-e015167f210a","Type":"ContainerDied","Data":"169a595c846cc79524611a17f6ed318f5bf5cf9c391371b3b5f09f544ffc905e"}
Feb 26 16:19:31 crc kubenswrapper[4907]: I0226 16:19:31.531674 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-98mbl" event={"ID":"5cba1d8d-90b5-4317-8783-e015167f210a","Type":"ContainerDied","Data":"a6420d415d81708da25eb47ab5a67e3591131c6ca7f922c6c752c4a8c8ddd086"}
Feb 26 16:19:31 crc kubenswrapper[4907]: I0226 16:19:31.531691 4907 scope.go:117] "RemoveContainer" containerID="169a595c846cc79524611a17f6ed318f5bf5cf9c391371b3b5f09f544ffc905e"
Feb 26 16:19:31 crc kubenswrapper[4907]: I0226 16:19:31.531805 4907 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-98mbl"
Feb 26 16:19:31 crc kubenswrapper[4907]: I0226 16:19:31.569954 4907 scope.go:117] "RemoveContainer" containerID="1bbf08239a9066dd5920669ee6d0a4a2dce4b10123d1d9d252e5f304c1368f35"
Feb 26 16:19:31 crc kubenswrapper[4907]: I0226 16:19:31.573376 4907 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-98mbl"]
Feb 26 16:19:31 crc kubenswrapper[4907]: I0226 16:19:31.599960 4907 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-98mbl"]
Feb 26 16:19:31 crc kubenswrapper[4907]: I0226 16:19:31.600527 4907 scope.go:117] "RemoveContainer" containerID="aeebc634579e6f573020496767229e60cbc08dd43cd92d5a57896152eb5a0ebf"
Feb 26 16:19:31 crc kubenswrapper[4907]: I0226 16:19:31.639729 4907 scope.go:117] "RemoveContainer" containerID="169a595c846cc79524611a17f6ed318f5bf5cf9c391371b3b5f09f544ffc905e"
Feb 26 16:19:31 crc kubenswrapper[4907]: E0226 16:19:31.640219 4907 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"169a595c846cc79524611a17f6ed318f5bf5cf9c391371b3b5f09f544ffc905e\": container with ID starting with 169a595c846cc79524611a17f6ed318f5bf5cf9c391371b3b5f09f544ffc905e not found: ID does not exist" containerID="169a595c846cc79524611a17f6ed318f5bf5cf9c391371b3b5f09f544ffc905e"
Feb 26 16:19:31 crc kubenswrapper[4907]: I0226 16:19:31.640268 4907 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"169a595c846cc79524611a17f6ed318f5bf5cf9c391371b3b5f09f544ffc905e"} err="failed to get container status \"169a595c846cc79524611a17f6ed318f5bf5cf9c391371b3b5f09f544ffc905e\": rpc error: code = NotFound desc = could not find container \"169a595c846cc79524611a17f6ed318f5bf5cf9c391371b3b5f09f544ffc905e\": container with ID starting with 169a595c846cc79524611a17f6ed318f5bf5cf9c391371b3b5f09f544ffc905e not found: ID does not exist"
Feb 26 16:19:31 crc kubenswrapper[4907]: I0226 16:19:31.640299 4907 scope.go:117] "RemoveContainer" containerID="1bbf08239a9066dd5920669ee6d0a4a2dce4b10123d1d9d252e5f304c1368f35"
Feb 26 16:19:31 crc kubenswrapper[4907]: E0226 16:19:31.640625 4907 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"1bbf08239a9066dd5920669ee6d0a4a2dce4b10123d1d9d252e5f304c1368f35\": container with ID starting with 1bbf08239a9066dd5920669ee6d0a4a2dce4b10123d1d9d252e5f304c1368f35 not found: ID does not exist" containerID="1bbf08239a9066dd5920669ee6d0a4a2dce4b10123d1d9d252e5f304c1368f35"
Feb 26 16:19:31 crc kubenswrapper[4907]: I0226 16:19:31.640655 4907 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1bbf08239a9066dd5920669ee6d0a4a2dce4b10123d1d9d252e5f304c1368f35"} err="failed to get container status \"1bbf08239a9066dd5920669ee6d0a4a2dce4b10123d1d9d252e5f304c1368f35\": rpc error: code = NotFound desc = could not find container \"1bbf08239a9066dd5920669ee6d0a4a2dce4b10123d1d9d252e5f304c1368f35\": container with ID starting with 1bbf08239a9066dd5920669ee6d0a4a2dce4b10123d1d9d252e5f304c1368f35 not found: ID does not exist"
Feb 26 16:19:31 crc kubenswrapper[4907]: I0226 16:19:31.640677 4907 scope.go:117] "RemoveContainer" containerID="aeebc634579e6f573020496767229e60cbc08dd43cd92d5a57896152eb5a0ebf"
Feb 26 16:19:31 crc kubenswrapper[4907]: E0226 16:19:31.640929 4907 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"aeebc634579e6f573020496767229e60cbc08dd43cd92d5a57896152eb5a0ebf\": container with ID starting with aeebc634579e6f573020496767229e60cbc08dd43cd92d5a57896152eb5a0ebf not found: ID does not exist" containerID="aeebc634579e6f573020496767229e60cbc08dd43cd92d5a57896152eb5a0ebf"
Feb 26 16:19:31 crc kubenswrapper[4907]: I0226 16:19:31.640959 4907 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"aeebc634579e6f573020496767229e60cbc08dd43cd92d5a57896152eb5a0ebf"} err="failed to get container status \"aeebc634579e6f573020496767229e60cbc08dd43cd92d5a57896152eb5a0ebf\": rpc error: code = NotFound desc = could not find container \"aeebc634579e6f573020496767229e60cbc08dd43cd92d5a57896152eb5a0ebf\": container with ID starting with aeebc634579e6f573020496767229e60cbc08dd43cd92d5a57896152eb5a0ebf not found: ID does not exist"
Feb 26 16:19:32 crc kubenswrapper[4907]: I0226 16:19:32.140245 4907 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5cba1d8d-90b5-4317-8783-e015167f210a" path="/var/lib/kubelet/pods/5cba1d8d-90b5-4317-8783-e015167f210a/volumes"
Feb 26 16:19:48 crc kubenswrapper[4907]: I0226 16:19:48.530173 4907 patch_prober.go:28] interesting pod/machine-config-daemon-v5ng6 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Feb 26 16:19:48 crc kubenswrapper[4907]: I0226 16:19:48.530739 4907 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-v5ng6" podUID="917eebf3-db36-47b8-af0a-b80d042fddab" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Feb 26 16:19:58 crc kubenswrapper[4907]: I0226 16:19:58.196543 4907 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-qb2mz"]
Feb 26 16:19:58 crc kubenswrapper[4907]: E0226 16:19:58.197508 4907 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5cba1d8d-90b5-4317-8783-e015167f210a" containerName="extract-utilities"
Feb 26 16:19:58 crc kubenswrapper[4907]: I0226 16:19:58.197528 4907 state_mem.go:107] "Deleted CPUSet assignment" podUID="5cba1d8d-90b5-4317-8783-e015167f210a" containerName="extract-utilities"
Feb 26 16:19:58 crc kubenswrapper[4907]: E0226 16:19:58.197554 4907 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5cba1d8d-90b5-4317-8783-e015167f210a" containerName="registry-server"
Feb 26 16:19:58 crc kubenswrapper[4907]: I0226 16:19:58.197566 4907 state_mem.go:107] "Deleted CPUSet assignment" podUID="5cba1d8d-90b5-4317-8783-e015167f210a" containerName="registry-server"
Feb 26 16:19:58 crc kubenswrapper[4907]: E0226 16:19:58.197649 4907 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5cba1d8d-90b5-4317-8783-e015167f210a" containerName="extract-content"
Feb 26 16:19:58 crc kubenswrapper[4907]: I0226 16:19:58.197663 4907 state_mem.go:107] "Deleted CPUSet assignment" podUID="5cba1d8d-90b5-4317-8783-e015167f210a" containerName="extract-content"
Feb 26 16:19:58 crc kubenswrapper[4907]: I0226 16:19:58.197909 4907 memory_manager.go:354] "RemoveStaleState removing state" podUID="5cba1d8d-90b5-4317-8783-e015167f210a" containerName="registry-server"
Feb 26 16:19:58 crc kubenswrapper[4907]: I0226 16:19:58.202537 4907 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-qb2mz"
Feb 26 16:19:58 crc kubenswrapper[4907]: I0226 16:19:58.210201 4907 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-qb2mz"]
Feb 26 16:19:58 crc kubenswrapper[4907]: I0226 16:19:58.293340 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-k68q2\" (UniqueName: \"kubernetes.io/projected/710c0f5a-fe3f-4d3b-8d2a-7afdbcfde5c0-kube-api-access-k68q2\") pod \"redhat-marketplace-qb2mz\" (UID: \"710c0f5a-fe3f-4d3b-8d2a-7afdbcfde5c0\") " pod="openshift-marketplace/redhat-marketplace-qb2mz"
Feb 26 16:19:58 crc kubenswrapper[4907]: I0226 16:19:58.293403 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/710c0f5a-fe3f-4d3b-8d2a-7afdbcfde5c0-utilities\") pod \"redhat-marketplace-qb2mz\" (UID: \"710c0f5a-fe3f-4d3b-8d2a-7afdbcfde5c0\") " pod="openshift-marketplace/redhat-marketplace-qb2mz"
Feb 26 16:19:58 crc kubenswrapper[4907]: I0226 16:19:58.293472 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/710c0f5a-fe3f-4d3b-8d2a-7afdbcfde5c0-catalog-content\") pod \"redhat-marketplace-qb2mz\" (UID: \"710c0f5a-fe3f-4d3b-8d2a-7afdbcfde5c0\") " pod="openshift-marketplace/redhat-marketplace-qb2mz"
Feb 26 16:19:58 crc kubenswrapper[4907]: I0226 16:19:58.394625 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-k68q2\" (UniqueName: \"kubernetes.io/projected/710c0f5a-fe3f-4d3b-8d2a-7afdbcfde5c0-kube-api-access-k68q2\") pod \"redhat-marketplace-qb2mz\" (UID: \"710c0f5a-fe3f-4d3b-8d2a-7afdbcfde5c0\") " pod="openshift-marketplace/redhat-marketplace-qb2mz"
Feb 26 16:19:58 crc kubenswrapper[4907]: I0226 16:19:58.395016 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/710c0f5a-fe3f-4d3b-8d2a-7afdbcfde5c0-utilities\") pod \"redhat-marketplace-qb2mz\" (UID: \"710c0f5a-fe3f-4d3b-8d2a-7afdbcfde5c0\") " pod="openshift-marketplace/redhat-marketplace-qb2mz"
Feb 26 16:19:58 crc kubenswrapper[4907]: I0226 16:19:58.395377 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/710c0f5a-fe3f-4d3b-8d2a-7afdbcfde5c0-utilities\") pod \"redhat-marketplace-qb2mz\" (UID: \"710c0f5a-fe3f-4d3b-8d2a-7afdbcfde5c0\") " pod="openshift-marketplace/redhat-marketplace-qb2mz"
Feb 26 16:19:58 crc kubenswrapper[4907]: I0226 16:19:58.395679 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/710c0f5a-fe3f-4d3b-8d2a-7afdbcfde5c0-catalog-content\") pod \"redhat-marketplace-qb2mz\" (UID: \"710c0f5a-fe3f-4d3b-8d2a-7afdbcfde5c0\") " pod="openshift-marketplace/redhat-marketplace-qb2mz"
Feb 26 16:19:58 crc kubenswrapper[4907]: I0226 16:19:58.395710 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/710c0f5a-fe3f-4d3b-8d2a-7afdbcfde5c0-catalog-content\") pod \"redhat-marketplace-qb2mz\" (UID: \"710c0f5a-fe3f-4d3b-8d2a-7afdbcfde5c0\") " pod="openshift-marketplace/redhat-marketplace-qb2mz"
Feb 26 16:19:58 crc kubenswrapper[4907]: I0226 16:19:58.417505 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-k68q2\" (UniqueName: \"kubernetes.io/projected/710c0f5a-fe3f-4d3b-8d2a-7afdbcfde5c0-kube-api-access-k68q2\") pod \"redhat-marketplace-qb2mz\" (UID: \"710c0f5a-fe3f-4d3b-8d2a-7afdbcfde5c0\") " pod="openshift-marketplace/redhat-marketplace-qb2mz"
Feb 26 16:19:58 crc kubenswrapper[4907]: I0226 16:19:58.532172 4907 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-qb2mz"
Feb 26 16:19:59 crc kubenswrapper[4907]: I0226 16:19:59.173475 4907 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-qb2mz"]
Feb 26 16:19:59 crc kubenswrapper[4907]: I0226 16:19:59.785191 4907 generic.go:334] "Generic (PLEG): container finished" podID="710c0f5a-fe3f-4d3b-8d2a-7afdbcfde5c0" containerID="59e026e378caef689bc26dffdb23b076537b212651010f51f66fe2eada1411e7" exitCode=0
Feb 26 16:19:59 crc kubenswrapper[4907]: I0226 16:19:59.785277 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-qb2mz" event={"ID":"710c0f5a-fe3f-4d3b-8d2a-7afdbcfde5c0","Type":"ContainerDied","Data":"59e026e378caef689bc26dffdb23b076537b212651010f51f66fe2eada1411e7"}
Feb 26 16:19:59 crc kubenswrapper[4907]: I0226 16:19:59.785326 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-qb2mz" event={"ID":"710c0f5a-fe3f-4d3b-8d2a-7afdbcfde5c0","Type":"ContainerStarted","Data":"f569a12c3ebf5bce627199d5d902ecf7ed4a636a772da0035f04ff943fa12257"}
Feb 26 16:20:00 crc kubenswrapper[4907]: I0226 16:20:00.152936 4907 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29535380-n7f96"]
Feb 26 16:20:00 crc kubenswrapper[4907]: I0226 16:20:00.154518 4907 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29535380-n7f96" Feb 26 16:20:00 crc kubenswrapper[4907]: I0226 16:20:00.157974 4907 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-n2mrp" Feb 26 16:20:00 crc kubenswrapper[4907]: I0226 16:20:00.158253 4907 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Feb 26 16:20:00 crc kubenswrapper[4907]: I0226 16:20:00.160645 4907 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29535380-n7f96"] Feb 26 16:20:00 crc kubenswrapper[4907]: I0226 16:20:00.172704 4907 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Feb 26 16:20:00 crc kubenswrapper[4907]: I0226 16:20:00.234063 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-g4272\" (UniqueName: \"kubernetes.io/projected/d1e8ff52-ea04-4294-a6bb-4ec86d328fd3-kube-api-access-g4272\") pod \"auto-csr-approver-29535380-n7f96\" (UID: \"d1e8ff52-ea04-4294-a6bb-4ec86d328fd3\") " pod="openshift-infra/auto-csr-approver-29535380-n7f96" Feb 26 16:20:00 crc kubenswrapper[4907]: I0226 16:20:00.335852 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-g4272\" (UniqueName: \"kubernetes.io/projected/d1e8ff52-ea04-4294-a6bb-4ec86d328fd3-kube-api-access-g4272\") pod \"auto-csr-approver-29535380-n7f96\" (UID: \"d1e8ff52-ea04-4294-a6bb-4ec86d328fd3\") " pod="openshift-infra/auto-csr-approver-29535380-n7f96" Feb 26 16:20:00 crc kubenswrapper[4907]: I0226 16:20:00.356717 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-g4272\" (UniqueName: \"kubernetes.io/projected/d1e8ff52-ea04-4294-a6bb-4ec86d328fd3-kube-api-access-g4272\") pod \"auto-csr-approver-29535380-n7f96\" (UID: \"d1e8ff52-ea04-4294-a6bb-4ec86d328fd3\") " 
pod="openshift-infra/auto-csr-approver-29535380-n7f96" Feb 26 16:20:00 crc kubenswrapper[4907]: I0226 16:20:00.478144 4907 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29535380-n7f96" Feb 26 16:20:00 crc kubenswrapper[4907]: I0226 16:20:00.957620 4907 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29535380-n7f96"] Feb 26 16:20:00 crc kubenswrapper[4907]: W0226 16:20:00.970526 4907 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podd1e8ff52_ea04_4294_a6bb_4ec86d328fd3.slice/crio-9cee2891e095ca6e6c8f7b980bb015c908766ed0ff937e2e93b21234f5e96f28 WatchSource:0}: Error finding container 9cee2891e095ca6e6c8f7b980bb015c908766ed0ff937e2e93b21234f5e96f28: Status 404 returned error can't find the container with id 9cee2891e095ca6e6c8f7b980bb015c908766ed0ff937e2e93b21234f5e96f28 Feb 26 16:20:01 crc kubenswrapper[4907]: I0226 16:20:01.802052 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-qb2mz" event={"ID":"710c0f5a-fe3f-4d3b-8d2a-7afdbcfde5c0","Type":"ContainerStarted","Data":"0b61719aa1dbf567f42859d41588e245a620cbfb705a134f56b7077fbe242602"} Feb 26 16:20:01 crc kubenswrapper[4907]: I0226 16:20:01.803801 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29535380-n7f96" event={"ID":"d1e8ff52-ea04-4294-a6bb-4ec86d328fd3","Type":"ContainerStarted","Data":"9cee2891e095ca6e6c8f7b980bb015c908766ed0ff937e2e93b21234f5e96f28"} Feb 26 16:20:02 crc kubenswrapper[4907]: I0226 16:20:02.814388 4907 generic.go:334] "Generic (PLEG): container finished" podID="710c0f5a-fe3f-4d3b-8d2a-7afdbcfde5c0" containerID="0b61719aa1dbf567f42859d41588e245a620cbfb705a134f56b7077fbe242602" exitCode=0 Feb 26 16:20:02 crc kubenswrapper[4907]: I0226 16:20:02.814459 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-marketplace/redhat-marketplace-qb2mz" event={"ID":"710c0f5a-fe3f-4d3b-8d2a-7afdbcfde5c0","Type":"ContainerDied","Data":"0b61719aa1dbf567f42859d41588e245a620cbfb705a134f56b7077fbe242602"} Feb 26 16:20:02 crc kubenswrapper[4907]: I0226 16:20:02.819151 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29535380-n7f96" event={"ID":"d1e8ff52-ea04-4294-a6bb-4ec86d328fd3","Type":"ContainerStarted","Data":"e0212c7e07f7d8cb33491c0404568acab4c95f21e7f235ee87f59f2567c6f7ba"} Feb 26 16:20:02 crc kubenswrapper[4907]: I0226 16:20:02.854906 4907 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-infra/auto-csr-approver-29535380-n7f96" podStartSLOduration=1.2916753939999999 podStartE2EDuration="2.854863655s" podCreationTimestamp="2026-02-26 16:20:00 +0000 UTC" firstStartedPulling="2026-02-26 16:20:00.972928024 +0000 UTC m=+2263.491489873" lastFinishedPulling="2026-02-26 16:20:02.536116285 +0000 UTC m=+2265.054678134" observedRunningTime="2026-02-26 16:20:02.853954093 +0000 UTC m=+2265.372515962" watchObservedRunningTime="2026-02-26 16:20:02.854863655 +0000 UTC m=+2265.373425514" Feb 26 16:20:03 crc kubenswrapper[4907]: I0226 16:20:03.833205 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-qb2mz" event={"ID":"710c0f5a-fe3f-4d3b-8d2a-7afdbcfde5c0","Type":"ContainerStarted","Data":"e5764e5f249e5fc6fa1d5d5c9e12d93dfa015a8be20722a606cdb4b74dcd81ac"} Feb 26 16:20:04 crc kubenswrapper[4907]: I0226 16:20:04.859746 4907 generic.go:334] "Generic (PLEG): container finished" podID="d1e8ff52-ea04-4294-a6bb-4ec86d328fd3" containerID="e0212c7e07f7d8cb33491c0404568acab4c95f21e7f235ee87f59f2567c6f7ba" exitCode=0 Feb 26 16:20:04 crc kubenswrapper[4907]: I0226 16:20:04.860250 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29535380-n7f96" 
event={"ID":"d1e8ff52-ea04-4294-a6bb-4ec86d328fd3","Type":"ContainerDied","Data":"e0212c7e07f7d8cb33491c0404568acab4c95f21e7f235ee87f59f2567c6f7ba"} Feb 26 16:20:04 crc kubenswrapper[4907]: I0226 16:20:04.876293 4907 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-qb2mz" podStartSLOduration=3.398498692 podStartE2EDuration="6.876269175s" podCreationTimestamp="2026-02-26 16:19:58 +0000 UTC" firstStartedPulling="2026-02-26 16:19:59.787296503 +0000 UTC m=+2262.305858382" lastFinishedPulling="2026-02-26 16:20:03.265067016 +0000 UTC m=+2265.783628865" observedRunningTime="2026-02-26 16:20:03.864639807 +0000 UTC m=+2266.383201686" watchObservedRunningTime="2026-02-26 16:20:04.876269175 +0000 UTC m=+2267.394831024" Feb 26 16:20:06 crc kubenswrapper[4907]: I0226 16:20:06.342225 4907 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29535380-n7f96" Feb 26 16:20:06 crc kubenswrapper[4907]: I0226 16:20:06.483037 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-g4272\" (UniqueName: \"kubernetes.io/projected/d1e8ff52-ea04-4294-a6bb-4ec86d328fd3-kube-api-access-g4272\") pod \"d1e8ff52-ea04-4294-a6bb-4ec86d328fd3\" (UID: \"d1e8ff52-ea04-4294-a6bb-4ec86d328fd3\") " Feb 26 16:20:06 crc kubenswrapper[4907]: I0226 16:20:06.490353 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d1e8ff52-ea04-4294-a6bb-4ec86d328fd3-kube-api-access-g4272" (OuterVolumeSpecName: "kube-api-access-g4272") pod "d1e8ff52-ea04-4294-a6bb-4ec86d328fd3" (UID: "d1e8ff52-ea04-4294-a6bb-4ec86d328fd3"). InnerVolumeSpecName "kube-api-access-g4272". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 26 16:20:06 crc kubenswrapper[4907]: I0226 16:20:06.585673 4907 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-g4272\" (UniqueName: \"kubernetes.io/projected/d1e8ff52-ea04-4294-a6bb-4ec86d328fd3-kube-api-access-g4272\") on node \"crc\" DevicePath \"\"" Feb 26 16:20:06 crc kubenswrapper[4907]: I0226 16:20:06.878908 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29535380-n7f96" event={"ID":"d1e8ff52-ea04-4294-a6bb-4ec86d328fd3","Type":"ContainerDied","Data":"9cee2891e095ca6e6c8f7b980bb015c908766ed0ff937e2e93b21234f5e96f28"} Feb 26 16:20:06 crc kubenswrapper[4907]: I0226 16:20:06.879222 4907 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="9cee2891e095ca6e6c8f7b980bb015c908766ed0ff937e2e93b21234f5e96f28" Feb 26 16:20:06 crc kubenswrapper[4907]: I0226 16:20:06.878966 4907 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29535380-n7f96" Feb 26 16:20:06 crc kubenswrapper[4907]: I0226 16:20:06.947175 4907 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29535374-xvjqq"] Feb 26 16:20:06 crc kubenswrapper[4907]: I0226 16:20:06.956896 4907 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29535374-xvjqq"] Feb 26 16:20:08 crc kubenswrapper[4907]: I0226 16:20:08.138424 4907 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="877489c3-3906-4d08-b2cf-e3245aeeec08" path="/var/lib/kubelet/pods/877489c3-3906-4d08-b2cf-e3245aeeec08/volumes" Feb 26 16:20:08 crc kubenswrapper[4907]: I0226 16:20:08.533975 4907 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-qb2mz" Feb 26 16:20:08 crc kubenswrapper[4907]: I0226 16:20:08.534034 4907 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" 
pod="openshift-marketplace/redhat-marketplace-qb2mz" Feb 26 16:20:08 crc kubenswrapper[4907]: I0226 16:20:08.586457 4907 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-qb2mz" Feb 26 16:20:08 crc kubenswrapper[4907]: I0226 16:20:08.950350 4907 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-qb2mz" Feb 26 16:20:08 crc kubenswrapper[4907]: I0226 16:20:08.993167 4907 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-qb2mz"] Feb 26 16:20:10 crc kubenswrapper[4907]: I0226 16:20:10.757548 4907 scope.go:117] "RemoveContainer" containerID="fafd9fbd6f5d5ee7d53958c894d99d720d9b3d52248b7e865eb306bbe5213097" Feb 26 16:20:10 crc kubenswrapper[4907]: I0226 16:20:10.912428 4907 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-qb2mz" podUID="710c0f5a-fe3f-4d3b-8d2a-7afdbcfde5c0" containerName="registry-server" containerID="cri-o://e5764e5f249e5fc6fa1d5d5c9e12d93dfa015a8be20722a606cdb4b74dcd81ac" gracePeriod=2 Feb 26 16:20:11 crc kubenswrapper[4907]: I0226 16:20:11.429059 4907 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-qb2mz" Feb 26 16:20:11 crc kubenswrapper[4907]: I0226 16:20:11.581104 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-k68q2\" (UniqueName: \"kubernetes.io/projected/710c0f5a-fe3f-4d3b-8d2a-7afdbcfde5c0-kube-api-access-k68q2\") pod \"710c0f5a-fe3f-4d3b-8d2a-7afdbcfde5c0\" (UID: \"710c0f5a-fe3f-4d3b-8d2a-7afdbcfde5c0\") " Feb 26 16:20:11 crc kubenswrapper[4907]: I0226 16:20:11.581648 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/710c0f5a-fe3f-4d3b-8d2a-7afdbcfde5c0-catalog-content\") pod \"710c0f5a-fe3f-4d3b-8d2a-7afdbcfde5c0\" (UID: \"710c0f5a-fe3f-4d3b-8d2a-7afdbcfde5c0\") " Feb 26 16:20:11 crc kubenswrapper[4907]: I0226 16:20:11.581693 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/710c0f5a-fe3f-4d3b-8d2a-7afdbcfde5c0-utilities\") pod \"710c0f5a-fe3f-4d3b-8d2a-7afdbcfde5c0\" (UID: \"710c0f5a-fe3f-4d3b-8d2a-7afdbcfde5c0\") " Feb 26 16:20:11 crc kubenswrapper[4907]: I0226 16:20:11.582479 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/710c0f5a-fe3f-4d3b-8d2a-7afdbcfde5c0-utilities" (OuterVolumeSpecName: "utilities") pod "710c0f5a-fe3f-4d3b-8d2a-7afdbcfde5c0" (UID: "710c0f5a-fe3f-4d3b-8d2a-7afdbcfde5c0"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 26 16:20:11 crc kubenswrapper[4907]: I0226 16:20:11.591854 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/710c0f5a-fe3f-4d3b-8d2a-7afdbcfde5c0-kube-api-access-k68q2" (OuterVolumeSpecName: "kube-api-access-k68q2") pod "710c0f5a-fe3f-4d3b-8d2a-7afdbcfde5c0" (UID: "710c0f5a-fe3f-4d3b-8d2a-7afdbcfde5c0"). InnerVolumeSpecName "kube-api-access-k68q2". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 26 16:20:11 crc kubenswrapper[4907]: I0226 16:20:11.610332 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/710c0f5a-fe3f-4d3b-8d2a-7afdbcfde5c0-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "710c0f5a-fe3f-4d3b-8d2a-7afdbcfde5c0" (UID: "710c0f5a-fe3f-4d3b-8d2a-7afdbcfde5c0"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 26 16:20:11 crc kubenswrapper[4907]: I0226 16:20:11.684086 4907 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/710c0f5a-fe3f-4d3b-8d2a-7afdbcfde5c0-catalog-content\") on node \"crc\" DevicePath \"\"" Feb 26 16:20:11 crc kubenswrapper[4907]: I0226 16:20:11.684134 4907 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/710c0f5a-fe3f-4d3b-8d2a-7afdbcfde5c0-utilities\") on node \"crc\" DevicePath \"\"" Feb 26 16:20:11 crc kubenswrapper[4907]: I0226 16:20:11.684144 4907 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-k68q2\" (UniqueName: \"kubernetes.io/projected/710c0f5a-fe3f-4d3b-8d2a-7afdbcfde5c0-kube-api-access-k68q2\") on node \"crc\" DevicePath \"\"" Feb 26 16:20:11 crc kubenswrapper[4907]: I0226 16:20:11.921186 4907 generic.go:334] "Generic (PLEG): container finished" podID="710c0f5a-fe3f-4d3b-8d2a-7afdbcfde5c0" containerID="e5764e5f249e5fc6fa1d5d5c9e12d93dfa015a8be20722a606cdb4b74dcd81ac" exitCode=0 Feb 26 16:20:11 crc kubenswrapper[4907]: I0226 16:20:11.921233 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-qb2mz" event={"ID":"710c0f5a-fe3f-4d3b-8d2a-7afdbcfde5c0","Type":"ContainerDied","Data":"e5764e5f249e5fc6fa1d5d5c9e12d93dfa015a8be20722a606cdb4b74dcd81ac"} Feb 26 16:20:11 crc kubenswrapper[4907]: I0226 16:20:11.921241 4907 util.go:48] "No ready sandbox for pod can be 
found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-qb2mz" Feb 26 16:20:11 crc kubenswrapper[4907]: I0226 16:20:11.921262 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-qb2mz" event={"ID":"710c0f5a-fe3f-4d3b-8d2a-7afdbcfde5c0","Type":"ContainerDied","Data":"f569a12c3ebf5bce627199d5d902ecf7ed4a636a772da0035f04ff943fa12257"} Feb 26 16:20:11 crc kubenswrapper[4907]: I0226 16:20:11.921279 4907 scope.go:117] "RemoveContainer" containerID="e5764e5f249e5fc6fa1d5d5c9e12d93dfa015a8be20722a606cdb4b74dcd81ac" Feb 26 16:20:11 crc kubenswrapper[4907]: I0226 16:20:11.951861 4907 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-qb2mz"] Feb 26 16:20:11 crc kubenswrapper[4907]: I0226 16:20:11.955235 4907 scope.go:117] "RemoveContainer" containerID="0b61719aa1dbf567f42859d41588e245a620cbfb705a134f56b7077fbe242602" Feb 26 16:20:11 crc kubenswrapper[4907]: I0226 16:20:11.964183 4907 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-qb2mz"] Feb 26 16:20:11 crc kubenswrapper[4907]: I0226 16:20:11.977973 4907 scope.go:117] "RemoveContainer" containerID="59e026e378caef689bc26dffdb23b076537b212651010f51f66fe2eada1411e7" Feb 26 16:20:12 crc kubenswrapper[4907]: I0226 16:20:12.025464 4907 scope.go:117] "RemoveContainer" containerID="e5764e5f249e5fc6fa1d5d5c9e12d93dfa015a8be20722a606cdb4b74dcd81ac" Feb 26 16:20:12 crc kubenswrapper[4907]: E0226 16:20:12.025878 4907 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e5764e5f249e5fc6fa1d5d5c9e12d93dfa015a8be20722a606cdb4b74dcd81ac\": container with ID starting with e5764e5f249e5fc6fa1d5d5c9e12d93dfa015a8be20722a606cdb4b74dcd81ac not found: ID does not exist" containerID="e5764e5f249e5fc6fa1d5d5c9e12d93dfa015a8be20722a606cdb4b74dcd81ac" Feb 26 16:20:12 crc kubenswrapper[4907]: I0226 16:20:12.025943 4907 
pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e5764e5f249e5fc6fa1d5d5c9e12d93dfa015a8be20722a606cdb4b74dcd81ac"} err="failed to get container status \"e5764e5f249e5fc6fa1d5d5c9e12d93dfa015a8be20722a606cdb4b74dcd81ac\": rpc error: code = NotFound desc = could not find container \"e5764e5f249e5fc6fa1d5d5c9e12d93dfa015a8be20722a606cdb4b74dcd81ac\": container with ID starting with e5764e5f249e5fc6fa1d5d5c9e12d93dfa015a8be20722a606cdb4b74dcd81ac not found: ID does not exist" Feb 26 16:20:12 crc kubenswrapper[4907]: I0226 16:20:12.025967 4907 scope.go:117] "RemoveContainer" containerID="0b61719aa1dbf567f42859d41588e245a620cbfb705a134f56b7077fbe242602" Feb 26 16:20:12 crc kubenswrapper[4907]: E0226 16:20:12.026203 4907 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"0b61719aa1dbf567f42859d41588e245a620cbfb705a134f56b7077fbe242602\": container with ID starting with 0b61719aa1dbf567f42859d41588e245a620cbfb705a134f56b7077fbe242602 not found: ID does not exist" containerID="0b61719aa1dbf567f42859d41588e245a620cbfb705a134f56b7077fbe242602" Feb 26 16:20:12 crc kubenswrapper[4907]: I0226 16:20:12.026224 4907 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0b61719aa1dbf567f42859d41588e245a620cbfb705a134f56b7077fbe242602"} err="failed to get container status \"0b61719aa1dbf567f42859d41588e245a620cbfb705a134f56b7077fbe242602\": rpc error: code = NotFound desc = could not find container \"0b61719aa1dbf567f42859d41588e245a620cbfb705a134f56b7077fbe242602\": container with ID starting with 0b61719aa1dbf567f42859d41588e245a620cbfb705a134f56b7077fbe242602 not found: ID does not exist" Feb 26 16:20:12 crc kubenswrapper[4907]: I0226 16:20:12.026237 4907 scope.go:117] "RemoveContainer" containerID="59e026e378caef689bc26dffdb23b076537b212651010f51f66fe2eada1411e7" Feb 26 16:20:12 crc kubenswrapper[4907]: E0226 
16:20:12.026459 4907 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"59e026e378caef689bc26dffdb23b076537b212651010f51f66fe2eada1411e7\": container with ID starting with 59e026e378caef689bc26dffdb23b076537b212651010f51f66fe2eada1411e7 not found: ID does not exist" containerID="59e026e378caef689bc26dffdb23b076537b212651010f51f66fe2eada1411e7" Feb 26 16:20:12 crc kubenswrapper[4907]: I0226 16:20:12.026500 4907 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"59e026e378caef689bc26dffdb23b076537b212651010f51f66fe2eada1411e7"} err="failed to get container status \"59e026e378caef689bc26dffdb23b076537b212651010f51f66fe2eada1411e7\": rpc error: code = NotFound desc = could not find container \"59e026e378caef689bc26dffdb23b076537b212651010f51f66fe2eada1411e7\": container with ID starting with 59e026e378caef689bc26dffdb23b076537b212651010f51f66fe2eada1411e7 not found: ID does not exist" Feb 26 16:20:12 crc kubenswrapper[4907]: I0226 16:20:12.136326 4907 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="710c0f5a-fe3f-4d3b-8d2a-7afdbcfde5c0" path="/var/lib/kubelet/pods/710c0f5a-fe3f-4d3b-8d2a-7afdbcfde5c0/volumes" Feb 26 16:20:18 crc kubenswrapper[4907]: I0226 16:20:18.530340 4907 patch_prober.go:28] interesting pod/machine-config-daemon-v5ng6 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 26 16:20:18 crc kubenswrapper[4907]: I0226 16:20:18.531773 4907 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-v5ng6" podUID="917eebf3-db36-47b8-af0a-b80d042fddab" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection 
refused" Feb 26 16:20:18 crc kubenswrapper[4907]: I0226 16:20:18.531856 4907 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-v5ng6" Feb 26 16:20:18 crc kubenswrapper[4907]: I0226 16:20:18.532708 4907 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"559a23ccedd7d10bc357288f1f6efb2cbce2fbfb3e7c59e80318d9e7c716e085"} pod="openshift-machine-config-operator/machine-config-daemon-v5ng6" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Feb 26 16:20:18 crc kubenswrapper[4907]: I0226 16:20:18.532768 4907 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-v5ng6" podUID="917eebf3-db36-47b8-af0a-b80d042fddab" containerName="machine-config-daemon" containerID="cri-o://559a23ccedd7d10bc357288f1f6efb2cbce2fbfb3e7c59e80318d9e7c716e085" gracePeriod=600 Feb 26 16:20:18 crc kubenswrapper[4907]: E0226 16:20:18.659964 4907 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-v5ng6_openshift-machine-config-operator(917eebf3-db36-47b8-af0a-b80d042fddab)\"" pod="openshift-machine-config-operator/machine-config-daemon-v5ng6" podUID="917eebf3-db36-47b8-af0a-b80d042fddab" Feb 26 16:20:18 crc kubenswrapper[4907]: I0226 16:20:18.994976 4907 generic.go:334] "Generic (PLEG): container finished" podID="917eebf3-db36-47b8-af0a-b80d042fddab" containerID="559a23ccedd7d10bc357288f1f6efb2cbce2fbfb3e7c59e80318d9e7c716e085" exitCode=0 Feb 26 16:20:18 crc kubenswrapper[4907]: I0226 16:20:18.995090 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-v5ng6" 
event={"ID":"917eebf3-db36-47b8-af0a-b80d042fddab","Type":"ContainerDied","Data":"559a23ccedd7d10bc357288f1f6efb2cbce2fbfb3e7c59e80318d9e7c716e085"} Feb 26 16:20:18 crc kubenswrapper[4907]: I0226 16:20:18.995559 4907 scope.go:117] "RemoveContainer" containerID="eeafebf90768294d93b5a754d4be3f7e7e83781774c84e4b268744314a564bb2" Feb 26 16:20:18 crc kubenswrapper[4907]: I0226 16:20:18.996427 4907 scope.go:117] "RemoveContainer" containerID="559a23ccedd7d10bc357288f1f6efb2cbce2fbfb3e7c59e80318d9e7c716e085" Feb 26 16:20:18 crc kubenswrapper[4907]: E0226 16:20:18.996778 4907 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-v5ng6_openshift-machine-config-operator(917eebf3-db36-47b8-af0a-b80d042fddab)\"" pod="openshift-machine-config-operator/machine-config-daemon-v5ng6" podUID="917eebf3-db36-47b8-af0a-b80d042fddab" Feb 26 16:20:23 crc kubenswrapper[4907]: I0226 16:20:23.037157 4907 generic.go:334] "Generic (PLEG): container finished" podID="b796cd80-c3e7-428e-a090-1569637819e8" containerID="23c5f81c3ec6d958741e03dc12fee81a14233c7bf5e14820eea9481385761325" exitCode=0 Feb 26 16:20:23 crc kubenswrapper[4907]: I0226 16:20:23.037257 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-h29r8" event={"ID":"b796cd80-c3e7-428e-a090-1569637819e8","Type":"ContainerDied","Data":"23c5f81c3ec6d958741e03dc12fee81a14233c7bf5e14820eea9481385761325"} Feb 26 16:20:24 crc kubenswrapper[4907]: I0226 16:20:24.592645 4907 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-h29r8" Feb 26 16:20:24 crc kubenswrapper[4907]: I0226 16:20:24.768859 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/b796cd80-c3e7-428e-a090-1569637819e8-ssh-key-openstack-edpm-ipam\") pod \"b796cd80-c3e7-428e-a090-1569637819e8\" (UID: \"b796cd80-c3e7-428e-a090-1569637819e8\") " Feb 26 16:20:24 crc kubenswrapper[4907]: I0226 16:20:24.768912 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/b796cd80-c3e7-428e-a090-1569637819e8-inventory\") pod \"b796cd80-c3e7-428e-a090-1569637819e8\" (UID: \"b796cd80-c3e7-428e-a090-1569637819e8\") " Feb 26 16:20:24 crc kubenswrapper[4907]: I0226 16:20:24.768981 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b796cd80-c3e7-428e-a090-1569637819e8-ovn-combined-ca-bundle\") pod \"b796cd80-c3e7-428e-a090-1569637819e8\" (UID: \"b796cd80-c3e7-428e-a090-1569637819e8\") " Feb 26 16:20:24 crc kubenswrapper[4907]: I0226 16:20:24.769097 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovncontroller-config-0\" (UniqueName: \"kubernetes.io/configmap/b796cd80-c3e7-428e-a090-1569637819e8-ovncontroller-config-0\") pod \"b796cd80-c3e7-428e-a090-1569637819e8\" (UID: \"b796cd80-c3e7-428e-a090-1569637819e8\") " Feb 26 16:20:24 crc kubenswrapper[4907]: I0226 16:20:24.769233 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8jzvj\" (UniqueName: \"kubernetes.io/projected/b796cd80-c3e7-428e-a090-1569637819e8-kube-api-access-8jzvj\") pod \"b796cd80-c3e7-428e-a090-1569637819e8\" (UID: \"b796cd80-c3e7-428e-a090-1569637819e8\") " Feb 26 16:20:24 crc kubenswrapper[4907]: I0226 16:20:24.778112 4907 
operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b796cd80-c3e7-428e-a090-1569637819e8-kube-api-access-8jzvj" (OuterVolumeSpecName: "kube-api-access-8jzvj") pod "b796cd80-c3e7-428e-a090-1569637819e8" (UID: "b796cd80-c3e7-428e-a090-1569637819e8"). InnerVolumeSpecName "kube-api-access-8jzvj". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 26 16:20:24 crc kubenswrapper[4907]: I0226 16:20:24.792040 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b796cd80-c3e7-428e-a090-1569637819e8-ovn-combined-ca-bundle" (OuterVolumeSpecName: "ovn-combined-ca-bundle") pod "b796cd80-c3e7-428e-a090-1569637819e8" (UID: "b796cd80-c3e7-428e-a090-1569637819e8"). InnerVolumeSpecName "ovn-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 26 16:20:24 crc kubenswrapper[4907]: I0226 16:20:24.832977 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b796cd80-c3e7-428e-a090-1569637819e8-inventory" (OuterVolumeSpecName: "inventory") pod "b796cd80-c3e7-428e-a090-1569637819e8" (UID: "b796cd80-c3e7-428e-a090-1569637819e8"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 26 16:20:24 crc kubenswrapper[4907]: I0226 16:20:24.850945 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b796cd80-c3e7-428e-a090-1569637819e8-ovncontroller-config-0" (OuterVolumeSpecName: "ovncontroller-config-0") pod "b796cd80-c3e7-428e-a090-1569637819e8" (UID: "b796cd80-c3e7-428e-a090-1569637819e8"). InnerVolumeSpecName "ovncontroller-config-0". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 26 16:20:24 crc kubenswrapper[4907]: I0226 16:20:24.854826 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b796cd80-c3e7-428e-a090-1569637819e8-ssh-key-openstack-edpm-ipam" (OuterVolumeSpecName: "ssh-key-openstack-edpm-ipam") pod "b796cd80-c3e7-428e-a090-1569637819e8" (UID: "b796cd80-c3e7-428e-a090-1569637819e8"). InnerVolumeSpecName "ssh-key-openstack-edpm-ipam". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 26 16:20:24 crc kubenswrapper[4907]: I0226 16:20:24.872845 4907 reconciler_common.go:293] "Volume detached for volume \"ovncontroller-config-0\" (UniqueName: \"kubernetes.io/configmap/b796cd80-c3e7-428e-a090-1569637819e8-ovncontroller-config-0\") on node \"crc\" DevicePath \"\"" Feb 26 16:20:24 crc kubenswrapper[4907]: I0226 16:20:24.872884 4907 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8jzvj\" (UniqueName: \"kubernetes.io/projected/b796cd80-c3e7-428e-a090-1569637819e8-kube-api-access-8jzvj\") on node \"crc\" DevicePath \"\"" Feb 26 16:20:24 crc kubenswrapper[4907]: I0226 16:20:24.872896 4907 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/b796cd80-c3e7-428e-a090-1569637819e8-ssh-key-openstack-edpm-ipam\") on node \"crc\" DevicePath \"\"" Feb 26 16:20:24 crc kubenswrapper[4907]: I0226 16:20:24.872909 4907 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/b796cd80-c3e7-428e-a090-1569637819e8-inventory\") on node \"crc\" DevicePath \"\"" Feb 26 16:20:24 crc kubenswrapper[4907]: I0226 16:20:24.872923 4907 reconciler_common.go:293] "Volume detached for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b796cd80-c3e7-428e-a090-1569637819e8-ovn-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 26 16:20:25 crc kubenswrapper[4907]: I0226 16:20:25.054281 4907 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-h29r8" event={"ID":"b796cd80-c3e7-428e-a090-1569637819e8","Type":"ContainerDied","Data":"a268e7338a81df1605f6cf5e0a0280ad9aa14eea76f478b328d3a7fdda8ddd5a"} Feb 26 16:20:25 crc kubenswrapper[4907]: I0226 16:20:25.054331 4907 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="a268e7338a81df1605f6cf5e0a0280ad9aa14eea76f478b328d3a7fdda8ddd5a" Feb 26 16:20:25 crc kubenswrapper[4907]: I0226 16:20:25.054395 4907 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-h29r8" Feb 26 16:20:25 crc kubenswrapper[4907]: I0226 16:20:25.165208 4907 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-vvxgw"] Feb 26 16:20:25 crc kubenswrapper[4907]: E0226 16:20:25.165769 4907 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="710c0f5a-fe3f-4d3b-8d2a-7afdbcfde5c0" containerName="extract-utilities" Feb 26 16:20:25 crc kubenswrapper[4907]: I0226 16:20:25.165793 4907 state_mem.go:107] "Deleted CPUSet assignment" podUID="710c0f5a-fe3f-4d3b-8d2a-7afdbcfde5c0" containerName="extract-utilities" Feb 26 16:20:25 crc kubenswrapper[4907]: E0226 16:20:25.165810 4907 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="710c0f5a-fe3f-4d3b-8d2a-7afdbcfde5c0" containerName="registry-server" Feb 26 16:20:25 crc kubenswrapper[4907]: I0226 16:20:25.165819 4907 state_mem.go:107] "Deleted CPUSet assignment" podUID="710c0f5a-fe3f-4d3b-8d2a-7afdbcfde5c0" containerName="registry-server" Feb 26 16:20:25 crc kubenswrapper[4907]: E0226 16:20:25.165833 4907 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d1e8ff52-ea04-4294-a6bb-4ec86d328fd3" containerName="oc" Feb 26 16:20:25 crc kubenswrapper[4907]: I0226 16:20:25.165841 4907 state_mem.go:107] "Deleted CPUSet assignment" 
podUID="d1e8ff52-ea04-4294-a6bb-4ec86d328fd3" containerName="oc" Feb 26 16:20:25 crc kubenswrapper[4907]: E0226 16:20:25.165879 4907 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b796cd80-c3e7-428e-a090-1569637819e8" containerName="ovn-edpm-deployment-openstack-edpm-ipam" Feb 26 16:20:25 crc kubenswrapper[4907]: I0226 16:20:25.165888 4907 state_mem.go:107] "Deleted CPUSet assignment" podUID="b796cd80-c3e7-428e-a090-1569637819e8" containerName="ovn-edpm-deployment-openstack-edpm-ipam" Feb 26 16:20:25 crc kubenswrapper[4907]: E0226 16:20:25.165915 4907 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="710c0f5a-fe3f-4d3b-8d2a-7afdbcfde5c0" containerName="extract-content" Feb 26 16:20:25 crc kubenswrapper[4907]: I0226 16:20:25.165925 4907 state_mem.go:107] "Deleted CPUSet assignment" podUID="710c0f5a-fe3f-4d3b-8d2a-7afdbcfde5c0" containerName="extract-content" Feb 26 16:20:25 crc kubenswrapper[4907]: I0226 16:20:25.166136 4907 memory_manager.go:354] "RemoveStaleState removing state" podUID="b796cd80-c3e7-428e-a090-1569637819e8" containerName="ovn-edpm-deployment-openstack-edpm-ipam" Feb 26 16:20:25 crc kubenswrapper[4907]: I0226 16:20:25.166169 4907 memory_manager.go:354] "RemoveStaleState removing state" podUID="d1e8ff52-ea04-4294-a6bb-4ec86d328fd3" containerName="oc" Feb 26 16:20:25 crc kubenswrapper[4907]: I0226 16:20:25.166185 4907 memory_manager.go:354] "RemoveStaleState removing state" podUID="710c0f5a-fe3f-4d3b-8d2a-7afdbcfde5c0" containerName="registry-server" Feb 26 16:20:25 crc kubenswrapper[4907]: I0226 16:20:25.174985 4907 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-vvxgw" Feb 26 16:20:25 crc kubenswrapper[4907]: I0226 16:20:25.177752 4907 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Feb 26 16:20:25 crc kubenswrapper[4907]: I0226 16:20:25.178691 4907 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Feb 26 16:20:25 crc kubenswrapper[4907]: I0226 16:20:25.178857 4907 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-vvxgw"] Feb 26 16:20:25 crc kubenswrapper[4907]: I0226 16:20:25.178866 4907 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Feb 26 16:20:25 crc kubenswrapper[4907]: I0226 16:20:25.178897 4907 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-ovn-metadata-agent-neutron-config" Feb 26 16:20:25 crc kubenswrapper[4907]: I0226 16:20:25.183272 4907 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-metadata-neutron-config" Feb 26 16:20:25 crc kubenswrapper[4907]: I0226 16:20:25.183572 4907 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-57jxc" Feb 26 16:20:25 crc kubenswrapper[4907]: I0226 16:20:25.280871 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-metadata-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/ae4ed9f9-3638-491a-8467-0035443468c1-nova-metadata-neutron-config-0\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-vvxgw\" (UID: \"ae4ed9f9-3638-491a-8467-0035443468c1\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-vvxgw" Feb 26 16:20:25 crc kubenswrapper[4907]: I0226 16:20:25.281231 4907 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/ae4ed9f9-3638-491a-8467-0035443468c1-ssh-key-openstack-edpm-ipam\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-vvxgw\" (UID: \"ae4ed9f9-3638-491a-8467-0035443468c1\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-vvxgw" Feb 26 16:20:25 crc kubenswrapper[4907]: I0226 16:20:25.281315 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"neutron-ovn-metadata-agent-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/ae4ed9f9-3638-491a-8467-0035443468c1-neutron-ovn-metadata-agent-neutron-config-0\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-vvxgw\" (UID: \"ae4ed9f9-3638-491a-8467-0035443468c1\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-vvxgw" Feb 26 16:20:25 crc kubenswrapper[4907]: I0226 16:20:25.281533 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7mb5c\" (UniqueName: \"kubernetes.io/projected/ae4ed9f9-3638-491a-8467-0035443468c1-kube-api-access-7mb5c\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-vvxgw\" (UID: \"ae4ed9f9-3638-491a-8467-0035443468c1\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-vvxgw" Feb 26 16:20:25 crc kubenswrapper[4907]: I0226 16:20:25.281605 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/ae4ed9f9-3638-491a-8467-0035443468c1-inventory\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-vvxgw\" (UID: \"ae4ed9f9-3638-491a-8467-0035443468c1\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-vvxgw" Feb 26 16:20:25 crc kubenswrapper[4907]: I0226 16:20:25.281651 4907 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ae4ed9f9-3638-491a-8467-0035443468c1-neutron-metadata-combined-ca-bundle\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-vvxgw\" (UID: \"ae4ed9f9-3638-491a-8467-0035443468c1\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-vvxgw" Feb 26 16:20:25 crc kubenswrapper[4907]: I0226 16:20:25.384059 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/ae4ed9f9-3638-491a-8467-0035443468c1-ssh-key-openstack-edpm-ipam\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-vvxgw\" (UID: \"ae4ed9f9-3638-491a-8467-0035443468c1\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-vvxgw" Feb 26 16:20:25 crc kubenswrapper[4907]: I0226 16:20:25.384111 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"neutron-ovn-metadata-agent-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/ae4ed9f9-3638-491a-8467-0035443468c1-neutron-ovn-metadata-agent-neutron-config-0\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-vvxgw\" (UID: \"ae4ed9f9-3638-491a-8467-0035443468c1\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-vvxgw" Feb 26 16:20:25 crc kubenswrapper[4907]: I0226 16:20:25.384178 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7mb5c\" (UniqueName: \"kubernetes.io/projected/ae4ed9f9-3638-491a-8467-0035443468c1-kube-api-access-7mb5c\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-vvxgw\" (UID: \"ae4ed9f9-3638-491a-8467-0035443468c1\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-vvxgw" Feb 26 16:20:25 crc kubenswrapper[4907]: I0226 16:20:25.384208 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"inventory\" (UniqueName: \"kubernetes.io/secret/ae4ed9f9-3638-491a-8467-0035443468c1-inventory\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-vvxgw\" (UID: \"ae4ed9f9-3638-491a-8467-0035443468c1\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-vvxgw" Feb 26 16:20:25 crc kubenswrapper[4907]: I0226 16:20:25.384237 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ae4ed9f9-3638-491a-8467-0035443468c1-neutron-metadata-combined-ca-bundle\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-vvxgw\" (UID: \"ae4ed9f9-3638-491a-8467-0035443468c1\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-vvxgw" Feb 26 16:20:25 crc kubenswrapper[4907]: I0226 16:20:25.384305 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-metadata-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/ae4ed9f9-3638-491a-8467-0035443468c1-nova-metadata-neutron-config-0\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-vvxgw\" (UID: \"ae4ed9f9-3638-491a-8467-0035443468c1\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-vvxgw" Feb 26 16:20:25 crc kubenswrapper[4907]: I0226 16:20:25.388708 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-metadata-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/ae4ed9f9-3638-491a-8467-0035443468c1-nova-metadata-neutron-config-0\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-vvxgw\" (UID: \"ae4ed9f9-3638-491a-8467-0035443468c1\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-vvxgw" Feb 26 16:20:25 crc kubenswrapper[4907]: I0226 16:20:25.388761 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/ae4ed9f9-3638-491a-8467-0035443468c1-inventory\") pod 
\"neutron-metadata-edpm-deployment-openstack-edpm-ipam-vvxgw\" (UID: \"ae4ed9f9-3638-491a-8467-0035443468c1\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-vvxgw" Feb 26 16:20:25 crc kubenswrapper[4907]: I0226 16:20:25.388708 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/ae4ed9f9-3638-491a-8467-0035443468c1-ssh-key-openstack-edpm-ipam\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-vvxgw\" (UID: \"ae4ed9f9-3638-491a-8467-0035443468c1\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-vvxgw" Feb 26 16:20:25 crc kubenswrapper[4907]: I0226 16:20:25.394515 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ae4ed9f9-3638-491a-8467-0035443468c1-neutron-metadata-combined-ca-bundle\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-vvxgw\" (UID: \"ae4ed9f9-3638-491a-8467-0035443468c1\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-vvxgw" Feb 26 16:20:25 crc kubenswrapper[4907]: I0226 16:20:25.395232 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"neutron-ovn-metadata-agent-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/ae4ed9f9-3638-491a-8467-0035443468c1-neutron-ovn-metadata-agent-neutron-config-0\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-vvxgw\" (UID: \"ae4ed9f9-3638-491a-8467-0035443468c1\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-vvxgw" Feb 26 16:20:25 crc kubenswrapper[4907]: I0226 16:20:25.401304 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7mb5c\" (UniqueName: \"kubernetes.io/projected/ae4ed9f9-3638-491a-8467-0035443468c1-kube-api-access-7mb5c\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-vvxgw\" (UID: 
\"ae4ed9f9-3638-491a-8467-0035443468c1\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-vvxgw" Feb 26 16:20:25 crc kubenswrapper[4907]: I0226 16:20:25.500842 4907 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-vvxgw" Feb 26 16:20:26 crc kubenswrapper[4907]: I0226 16:20:26.056866 4907 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-vvxgw"] Feb 26 16:20:27 crc kubenswrapper[4907]: I0226 16:20:27.072028 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-vvxgw" event={"ID":"ae4ed9f9-3638-491a-8467-0035443468c1","Type":"ContainerStarted","Data":"94ac26a2f9bbefeae95da9fc5e1e7a41ae692f0185170f43e5467b72683a165f"} Feb 26 16:20:27 crc kubenswrapper[4907]: I0226 16:20:27.072327 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-vvxgw" event={"ID":"ae4ed9f9-3638-491a-8467-0035443468c1","Type":"ContainerStarted","Data":"970920b59a7d2dae94f6989a3db7816ce61f0b6e1f1c1a1ffdd7efcad8aca92a"} Feb 26 16:20:27 crc kubenswrapper[4907]: I0226 16:20:27.095410 4907 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-vvxgw" podStartSLOduration=1.622003393 podStartE2EDuration="2.095394842s" podCreationTimestamp="2026-02-26 16:20:25 +0000 UTC" firstStartedPulling="2026-02-26 16:20:26.092412997 +0000 UTC m=+2288.610974846" lastFinishedPulling="2026-02-26 16:20:26.565804446 +0000 UTC m=+2289.084366295" observedRunningTime="2026-02-26 16:20:27.094603322 +0000 UTC m=+2289.613165171" watchObservedRunningTime="2026-02-26 16:20:27.095394842 +0000 UTC m=+2289.613956691" Feb 26 16:20:32 crc kubenswrapper[4907]: I0226 16:20:32.126811 4907 scope.go:117] "RemoveContainer" 
containerID="559a23ccedd7d10bc357288f1f6efb2cbce2fbfb3e7c59e80318d9e7c716e085" Feb 26 16:20:32 crc kubenswrapper[4907]: E0226 16:20:32.127761 4907 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-v5ng6_openshift-machine-config-operator(917eebf3-db36-47b8-af0a-b80d042fddab)\"" pod="openshift-machine-config-operator/machine-config-daemon-v5ng6" podUID="917eebf3-db36-47b8-af0a-b80d042fddab" Feb 26 16:20:45 crc kubenswrapper[4907]: I0226 16:20:45.126692 4907 scope.go:117] "RemoveContainer" containerID="559a23ccedd7d10bc357288f1f6efb2cbce2fbfb3e7c59e80318d9e7c716e085" Feb 26 16:20:45 crc kubenswrapper[4907]: E0226 16:20:45.127409 4907 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-v5ng6_openshift-machine-config-operator(917eebf3-db36-47b8-af0a-b80d042fddab)\"" pod="openshift-machine-config-operator/machine-config-daemon-v5ng6" podUID="917eebf3-db36-47b8-af0a-b80d042fddab" Feb 26 16:20:57 crc kubenswrapper[4907]: I0226 16:20:57.127018 4907 scope.go:117] "RemoveContainer" containerID="559a23ccedd7d10bc357288f1f6efb2cbce2fbfb3e7c59e80318d9e7c716e085" Feb 26 16:20:57 crc kubenswrapper[4907]: E0226 16:20:57.127863 4907 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-v5ng6_openshift-machine-config-operator(917eebf3-db36-47b8-af0a-b80d042fddab)\"" pod="openshift-machine-config-operator/machine-config-daemon-v5ng6" podUID="917eebf3-db36-47b8-af0a-b80d042fddab" Feb 26 16:20:57 crc kubenswrapper[4907]: I0226 16:20:57.384798 4907 
kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-8sk8z"] Feb 26 16:20:57 crc kubenswrapper[4907]: I0226 16:20:57.386675 4907 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-8sk8z" Feb 26 16:20:57 crc kubenswrapper[4907]: I0226 16:20:57.449397 4907 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-8sk8z"] Feb 26 16:20:57 crc kubenswrapper[4907]: I0226 16:20:57.514577 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qp2dv\" (UniqueName: \"kubernetes.io/projected/f63e1b60-6723-4664-882c-d3e758b0c237-kube-api-access-qp2dv\") pod \"certified-operators-8sk8z\" (UID: \"f63e1b60-6723-4664-882c-d3e758b0c237\") " pod="openshift-marketplace/certified-operators-8sk8z" Feb 26 16:20:57 crc kubenswrapper[4907]: I0226 16:20:57.514714 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f63e1b60-6723-4664-882c-d3e758b0c237-catalog-content\") pod \"certified-operators-8sk8z\" (UID: \"f63e1b60-6723-4664-882c-d3e758b0c237\") " pod="openshift-marketplace/certified-operators-8sk8z" Feb 26 16:20:57 crc kubenswrapper[4907]: I0226 16:20:57.514970 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f63e1b60-6723-4664-882c-d3e758b0c237-utilities\") pod \"certified-operators-8sk8z\" (UID: \"f63e1b60-6723-4664-882c-d3e758b0c237\") " pod="openshift-marketplace/certified-operators-8sk8z" Feb 26 16:20:57 crc kubenswrapper[4907]: I0226 16:20:57.618024 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f63e1b60-6723-4664-882c-d3e758b0c237-utilities\") pod \"certified-operators-8sk8z\" 
(UID: \"f63e1b60-6723-4664-882c-d3e758b0c237\") " pod="openshift-marketplace/certified-operators-8sk8z" Feb 26 16:20:57 crc kubenswrapper[4907]: I0226 16:20:57.618218 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qp2dv\" (UniqueName: \"kubernetes.io/projected/f63e1b60-6723-4664-882c-d3e758b0c237-kube-api-access-qp2dv\") pod \"certified-operators-8sk8z\" (UID: \"f63e1b60-6723-4664-882c-d3e758b0c237\") " pod="openshift-marketplace/certified-operators-8sk8z" Feb 26 16:20:57 crc kubenswrapper[4907]: I0226 16:20:57.618256 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f63e1b60-6723-4664-882c-d3e758b0c237-catalog-content\") pod \"certified-operators-8sk8z\" (UID: \"f63e1b60-6723-4664-882c-d3e758b0c237\") " pod="openshift-marketplace/certified-operators-8sk8z" Feb 26 16:20:57 crc kubenswrapper[4907]: I0226 16:20:57.618607 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f63e1b60-6723-4664-882c-d3e758b0c237-utilities\") pod \"certified-operators-8sk8z\" (UID: \"f63e1b60-6723-4664-882c-d3e758b0c237\") " pod="openshift-marketplace/certified-operators-8sk8z" Feb 26 16:20:57 crc kubenswrapper[4907]: I0226 16:20:57.618989 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f63e1b60-6723-4664-882c-d3e758b0c237-catalog-content\") pod \"certified-operators-8sk8z\" (UID: \"f63e1b60-6723-4664-882c-d3e758b0c237\") " pod="openshift-marketplace/certified-operators-8sk8z" Feb 26 16:20:57 crc kubenswrapper[4907]: I0226 16:20:57.639428 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qp2dv\" (UniqueName: \"kubernetes.io/projected/f63e1b60-6723-4664-882c-d3e758b0c237-kube-api-access-qp2dv\") pod \"certified-operators-8sk8z\" (UID: 
\"f63e1b60-6723-4664-882c-d3e758b0c237\") " pod="openshift-marketplace/certified-operators-8sk8z" Feb 26 16:20:57 crc kubenswrapper[4907]: I0226 16:20:57.712426 4907 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-8sk8z" Feb 26 16:20:58 crc kubenswrapper[4907]: I0226 16:20:58.307123 4907 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-8sk8z"] Feb 26 16:20:58 crc kubenswrapper[4907]: I0226 16:20:58.385101 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-8sk8z" event={"ID":"f63e1b60-6723-4664-882c-d3e758b0c237","Type":"ContainerStarted","Data":"46482927ec8e63dff7cc65a4154ca5e0bce92a049012c6985329500fdaa6daf5"} Feb 26 16:20:59 crc kubenswrapper[4907]: I0226 16:20:59.393196 4907 generic.go:334] "Generic (PLEG): container finished" podID="f63e1b60-6723-4664-882c-d3e758b0c237" containerID="6d08106520a3d557491c3c56d31ba3befdf82492bd05f14a690e4fd74bf81d66" exitCode=0 Feb 26 16:20:59 crc kubenswrapper[4907]: I0226 16:20:59.393556 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-8sk8z" event={"ID":"f63e1b60-6723-4664-882c-d3e758b0c237","Type":"ContainerDied","Data":"6d08106520a3d557491c3c56d31ba3befdf82492bd05f14a690e4fd74bf81d66"} Feb 26 16:21:00 crc kubenswrapper[4907]: I0226 16:21:00.403632 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-8sk8z" event={"ID":"f63e1b60-6723-4664-882c-d3e758b0c237","Type":"ContainerStarted","Data":"d9f5a5f195e7991417ec088b9a95be76fade88aa883fea66606a0c071538b5dd"} Feb 26 16:21:02 crc kubenswrapper[4907]: I0226 16:21:02.421981 4907 generic.go:334] "Generic (PLEG): container finished" podID="f63e1b60-6723-4664-882c-d3e758b0c237" containerID="d9f5a5f195e7991417ec088b9a95be76fade88aa883fea66606a0c071538b5dd" exitCode=0 Feb 26 16:21:02 crc kubenswrapper[4907]: I0226 
16:21:02.422059 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-8sk8z" event={"ID":"f63e1b60-6723-4664-882c-d3e758b0c237","Type":"ContainerDied","Data":"d9f5a5f195e7991417ec088b9a95be76fade88aa883fea66606a0c071538b5dd"} Feb 26 16:21:03 crc kubenswrapper[4907]: I0226 16:21:03.433714 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-8sk8z" event={"ID":"f63e1b60-6723-4664-882c-d3e758b0c237","Type":"ContainerStarted","Data":"8c8a784f30c8868c277075967159749481918dcc2e2481bdeff2d968339b9c12"} Feb 26 16:21:03 crc kubenswrapper[4907]: I0226 16:21:03.453081 4907 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-8sk8z" podStartSLOduration=2.983581452 podStartE2EDuration="6.453060071s" podCreationTimestamp="2026-02-26 16:20:57 +0000 UTC" firstStartedPulling="2026-02-26 16:20:59.394946149 +0000 UTC m=+2321.913507988" lastFinishedPulling="2026-02-26 16:21:02.864424738 +0000 UTC m=+2325.382986607" observedRunningTime="2026-02-26 16:21:03.45095948 +0000 UTC m=+2325.969521349" watchObservedRunningTime="2026-02-26 16:21:03.453060071 +0000 UTC m=+2325.971621920" Feb 26 16:21:07 crc kubenswrapper[4907]: I0226 16:21:07.713622 4907 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-8sk8z" Feb 26 16:21:07 crc kubenswrapper[4907]: I0226 16:21:07.714057 4907 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-8sk8z" Feb 26 16:21:07 crc kubenswrapper[4907]: I0226 16:21:07.789680 4907 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-8sk8z" Feb 26 16:21:08 crc kubenswrapper[4907]: I0226 16:21:08.545791 4907 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-8sk8z" Feb 26 
16:21:08 crc kubenswrapper[4907]: I0226 16:21:08.605537 4907 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-8sk8z"] Feb 26 16:21:09 crc kubenswrapper[4907]: I0226 16:21:09.126579 4907 scope.go:117] "RemoveContainer" containerID="559a23ccedd7d10bc357288f1f6efb2cbce2fbfb3e7c59e80318d9e7c716e085" Feb 26 16:21:09 crc kubenswrapper[4907]: E0226 16:21:09.126869 4907 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-v5ng6_openshift-machine-config-operator(917eebf3-db36-47b8-af0a-b80d042fddab)\"" pod="openshift-machine-config-operator/machine-config-daemon-v5ng6" podUID="917eebf3-db36-47b8-af0a-b80d042fddab" Feb 26 16:21:10 crc kubenswrapper[4907]: I0226 16:21:10.500097 4907 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-8sk8z" podUID="f63e1b60-6723-4664-882c-d3e758b0c237" containerName="registry-server" containerID="cri-o://8c8a784f30c8868c277075967159749481918dcc2e2481bdeff2d968339b9c12" gracePeriod=2 Feb 26 16:21:10 crc kubenswrapper[4907]: I0226 16:21:10.931583 4907 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-8sk8z" Feb 26 16:21:11 crc kubenswrapper[4907]: I0226 16:21:11.074022 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f63e1b60-6723-4664-882c-d3e758b0c237-catalog-content\") pod \"f63e1b60-6723-4664-882c-d3e758b0c237\" (UID: \"f63e1b60-6723-4664-882c-d3e758b0c237\") " Feb 26 16:21:11 crc kubenswrapper[4907]: I0226 16:21:11.074778 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f63e1b60-6723-4664-882c-d3e758b0c237-utilities\") pod \"f63e1b60-6723-4664-882c-d3e758b0c237\" (UID: \"f63e1b60-6723-4664-882c-d3e758b0c237\") " Feb 26 16:21:11 crc kubenswrapper[4907]: I0226 16:21:11.075502 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/f63e1b60-6723-4664-882c-d3e758b0c237-utilities" (OuterVolumeSpecName: "utilities") pod "f63e1b60-6723-4664-882c-d3e758b0c237" (UID: "f63e1b60-6723-4664-882c-d3e758b0c237"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 26 16:21:11 crc kubenswrapper[4907]: I0226 16:21:11.075882 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qp2dv\" (UniqueName: \"kubernetes.io/projected/f63e1b60-6723-4664-882c-d3e758b0c237-kube-api-access-qp2dv\") pod \"f63e1b60-6723-4664-882c-d3e758b0c237\" (UID: \"f63e1b60-6723-4664-882c-d3e758b0c237\") " Feb 26 16:21:11 crc kubenswrapper[4907]: I0226 16:21:11.076517 4907 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f63e1b60-6723-4664-882c-d3e758b0c237-utilities\") on node \"crc\" DevicePath \"\"" Feb 26 16:21:11 crc kubenswrapper[4907]: I0226 16:21:11.082903 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f63e1b60-6723-4664-882c-d3e758b0c237-kube-api-access-qp2dv" (OuterVolumeSpecName: "kube-api-access-qp2dv") pod "f63e1b60-6723-4664-882c-d3e758b0c237" (UID: "f63e1b60-6723-4664-882c-d3e758b0c237"). InnerVolumeSpecName "kube-api-access-qp2dv". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 26 16:21:11 crc kubenswrapper[4907]: I0226 16:21:11.140159 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/f63e1b60-6723-4664-882c-d3e758b0c237-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "f63e1b60-6723-4664-882c-d3e758b0c237" (UID: "f63e1b60-6723-4664-882c-d3e758b0c237"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 26 16:21:11 crc kubenswrapper[4907]: I0226 16:21:11.177933 4907 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f63e1b60-6723-4664-882c-d3e758b0c237-catalog-content\") on node \"crc\" DevicePath \"\"" Feb 26 16:21:11 crc kubenswrapper[4907]: I0226 16:21:11.177969 4907 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qp2dv\" (UniqueName: \"kubernetes.io/projected/f63e1b60-6723-4664-882c-d3e758b0c237-kube-api-access-qp2dv\") on node \"crc\" DevicePath \"\"" Feb 26 16:21:11 crc kubenswrapper[4907]: I0226 16:21:11.510464 4907 generic.go:334] "Generic (PLEG): container finished" podID="f63e1b60-6723-4664-882c-d3e758b0c237" containerID="8c8a784f30c8868c277075967159749481918dcc2e2481bdeff2d968339b9c12" exitCode=0 Feb 26 16:21:11 crc kubenswrapper[4907]: I0226 16:21:11.510653 4907 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-8sk8z" Feb 26 16:21:11 crc kubenswrapper[4907]: I0226 16:21:11.510674 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-8sk8z" event={"ID":"f63e1b60-6723-4664-882c-d3e758b0c237","Type":"ContainerDied","Data":"8c8a784f30c8868c277075967159749481918dcc2e2481bdeff2d968339b9c12"} Feb 26 16:21:11 crc kubenswrapper[4907]: I0226 16:21:11.511906 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-8sk8z" event={"ID":"f63e1b60-6723-4664-882c-d3e758b0c237","Type":"ContainerDied","Data":"46482927ec8e63dff7cc65a4154ca5e0bce92a049012c6985329500fdaa6daf5"} Feb 26 16:21:11 crc kubenswrapper[4907]: I0226 16:21:11.511948 4907 scope.go:117] "RemoveContainer" containerID="8c8a784f30c8868c277075967159749481918dcc2e2481bdeff2d968339b9c12" Feb 26 16:21:11 crc kubenswrapper[4907]: I0226 16:21:11.535449 4907 scope.go:117] "RemoveContainer" 
containerID="d9f5a5f195e7991417ec088b9a95be76fade88aa883fea66606a0c071538b5dd" Feb 26 16:21:11 crc kubenswrapper[4907]: I0226 16:21:11.557954 4907 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-8sk8z"] Feb 26 16:21:11 crc kubenswrapper[4907]: I0226 16:21:11.564032 4907 scope.go:117] "RemoveContainer" containerID="6d08106520a3d557491c3c56d31ba3befdf82492bd05f14a690e4fd74bf81d66" Feb 26 16:21:11 crc kubenswrapper[4907]: I0226 16:21:11.569913 4907 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-8sk8z"] Feb 26 16:21:11 crc kubenswrapper[4907]: I0226 16:21:11.601234 4907 scope.go:117] "RemoveContainer" containerID="8c8a784f30c8868c277075967159749481918dcc2e2481bdeff2d968339b9c12" Feb 26 16:21:11 crc kubenswrapper[4907]: E0226 16:21:11.601742 4907 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"8c8a784f30c8868c277075967159749481918dcc2e2481bdeff2d968339b9c12\": container with ID starting with 8c8a784f30c8868c277075967159749481918dcc2e2481bdeff2d968339b9c12 not found: ID does not exist" containerID="8c8a784f30c8868c277075967159749481918dcc2e2481bdeff2d968339b9c12" Feb 26 16:21:11 crc kubenswrapper[4907]: I0226 16:21:11.601794 4907 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8c8a784f30c8868c277075967159749481918dcc2e2481bdeff2d968339b9c12"} err="failed to get container status \"8c8a784f30c8868c277075967159749481918dcc2e2481bdeff2d968339b9c12\": rpc error: code = NotFound desc = could not find container \"8c8a784f30c8868c277075967159749481918dcc2e2481bdeff2d968339b9c12\": container with ID starting with 8c8a784f30c8868c277075967159749481918dcc2e2481bdeff2d968339b9c12 not found: ID does not exist" Feb 26 16:21:11 crc kubenswrapper[4907]: I0226 16:21:11.601821 4907 scope.go:117] "RemoveContainer" 
containerID="d9f5a5f195e7991417ec088b9a95be76fade88aa883fea66606a0c071538b5dd" Feb 26 16:21:11 crc kubenswrapper[4907]: E0226 16:21:11.602141 4907 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d9f5a5f195e7991417ec088b9a95be76fade88aa883fea66606a0c071538b5dd\": container with ID starting with d9f5a5f195e7991417ec088b9a95be76fade88aa883fea66606a0c071538b5dd not found: ID does not exist" containerID="d9f5a5f195e7991417ec088b9a95be76fade88aa883fea66606a0c071538b5dd" Feb 26 16:21:11 crc kubenswrapper[4907]: I0226 16:21:11.602176 4907 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d9f5a5f195e7991417ec088b9a95be76fade88aa883fea66606a0c071538b5dd"} err="failed to get container status \"d9f5a5f195e7991417ec088b9a95be76fade88aa883fea66606a0c071538b5dd\": rpc error: code = NotFound desc = could not find container \"d9f5a5f195e7991417ec088b9a95be76fade88aa883fea66606a0c071538b5dd\": container with ID starting with d9f5a5f195e7991417ec088b9a95be76fade88aa883fea66606a0c071538b5dd not found: ID does not exist" Feb 26 16:21:11 crc kubenswrapper[4907]: I0226 16:21:11.602199 4907 scope.go:117] "RemoveContainer" containerID="6d08106520a3d557491c3c56d31ba3befdf82492bd05f14a690e4fd74bf81d66" Feb 26 16:21:11 crc kubenswrapper[4907]: E0226 16:21:11.602509 4907 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"6d08106520a3d557491c3c56d31ba3befdf82492bd05f14a690e4fd74bf81d66\": container with ID starting with 6d08106520a3d557491c3c56d31ba3befdf82492bd05f14a690e4fd74bf81d66 not found: ID does not exist" containerID="6d08106520a3d557491c3c56d31ba3befdf82492bd05f14a690e4fd74bf81d66" Feb 26 16:21:11 crc kubenswrapper[4907]: I0226 16:21:11.602536 4907 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"6d08106520a3d557491c3c56d31ba3befdf82492bd05f14a690e4fd74bf81d66"} err="failed to get container status \"6d08106520a3d557491c3c56d31ba3befdf82492bd05f14a690e4fd74bf81d66\": rpc error: code = NotFound desc = could not find container \"6d08106520a3d557491c3c56d31ba3befdf82492bd05f14a690e4fd74bf81d66\": container with ID starting with 6d08106520a3d557491c3c56d31ba3befdf82492bd05f14a690e4fd74bf81d66 not found: ID does not exist" Feb 26 16:21:12 crc kubenswrapper[4907]: I0226 16:21:12.136948 4907 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f63e1b60-6723-4664-882c-d3e758b0c237" path="/var/lib/kubelet/pods/f63e1b60-6723-4664-882c-d3e758b0c237/volumes" Feb 26 16:21:14 crc kubenswrapper[4907]: I0226 16:21:14.547828 4907 generic.go:334] "Generic (PLEG): container finished" podID="ae4ed9f9-3638-491a-8467-0035443468c1" containerID="94ac26a2f9bbefeae95da9fc5e1e7a41ae692f0185170f43e5467b72683a165f" exitCode=0 Feb 26 16:21:14 crc kubenswrapper[4907]: I0226 16:21:14.547939 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-vvxgw" event={"ID":"ae4ed9f9-3638-491a-8467-0035443468c1","Type":"ContainerDied","Data":"94ac26a2f9bbefeae95da9fc5e1e7a41ae692f0185170f43e5467b72683a165f"} Feb 26 16:21:15 crc kubenswrapper[4907]: I0226 16:21:15.942514 4907 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-vvxgw" Feb 26 16:21:16 crc kubenswrapper[4907]: I0226 16:21:16.108950 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"neutron-ovn-metadata-agent-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/ae4ed9f9-3638-491a-8467-0035443468c1-neutron-ovn-metadata-agent-neutron-config-0\") pod \"ae4ed9f9-3638-491a-8467-0035443468c1\" (UID: \"ae4ed9f9-3638-491a-8467-0035443468c1\") " Feb 26 16:21:16 crc kubenswrapper[4907]: I0226 16:21:16.109022 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ae4ed9f9-3638-491a-8467-0035443468c1-neutron-metadata-combined-ca-bundle\") pod \"ae4ed9f9-3638-491a-8467-0035443468c1\" (UID: \"ae4ed9f9-3638-491a-8467-0035443468c1\") " Feb 26 16:21:16 crc kubenswrapper[4907]: I0226 16:21:16.109130 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-metadata-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/ae4ed9f9-3638-491a-8467-0035443468c1-nova-metadata-neutron-config-0\") pod \"ae4ed9f9-3638-491a-8467-0035443468c1\" (UID: \"ae4ed9f9-3638-491a-8467-0035443468c1\") " Feb 26 16:21:16 crc kubenswrapper[4907]: I0226 16:21:16.109169 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7mb5c\" (UniqueName: \"kubernetes.io/projected/ae4ed9f9-3638-491a-8467-0035443468c1-kube-api-access-7mb5c\") pod \"ae4ed9f9-3638-491a-8467-0035443468c1\" (UID: \"ae4ed9f9-3638-491a-8467-0035443468c1\") " Feb 26 16:21:16 crc kubenswrapper[4907]: I0226 16:21:16.109210 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/ae4ed9f9-3638-491a-8467-0035443468c1-inventory\") pod \"ae4ed9f9-3638-491a-8467-0035443468c1\" (UID: \"ae4ed9f9-3638-491a-8467-0035443468c1\") " Feb 
26 16:21:16 crc kubenswrapper[4907]: I0226 16:21:16.109295 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/ae4ed9f9-3638-491a-8467-0035443468c1-ssh-key-openstack-edpm-ipam\") pod \"ae4ed9f9-3638-491a-8467-0035443468c1\" (UID: \"ae4ed9f9-3638-491a-8467-0035443468c1\") " Feb 26 16:21:16 crc kubenswrapper[4907]: I0226 16:21:16.123841 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ae4ed9f9-3638-491a-8467-0035443468c1-neutron-metadata-combined-ca-bundle" (OuterVolumeSpecName: "neutron-metadata-combined-ca-bundle") pod "ae4ed9f9-3638-491a-8467-0035443468c1" (UID: "ae4ed9f9-3638-491a-8467-0035443468c1"). InnerVolumeSpecName "neutron-metadata-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 26 16:21:16 crc kubenswrapper[4907]: I0226 16:21:16.129863 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ae4ed9f9-3638-491a-8467-0035443468c1-kube-api-access-7mb5c" (OuterVolumeSpecName: "kube-api-access-7mb5c") pod "ae4ed9f9-3638-491a-8467-0035443468c1" (UID: "ae4ed9f9-3638-491a-8467-0035443468c1"). InnerVolumeSpecName "kube-api-access-7mb5c". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 26 16:21:16 crc kubenswrapper[4907]: I0226 16:21:16.142357 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ae4ed9f9-3638-491a-8467-0035443468c1-neutron-ovn-metadata-agent-neutron-config-0" (OuterVolumeSpecName: "neutron-ovn-metadata-agent-neutron-config-0") pod "ae4ed9f9-3638-491a-8467-0035443468c1" (UID: "ae4ed9f9-3638-491a-8467-0035443468c1"). InnerVolumeSpecName "neutron-ovn-metadata-agent-neutron-config-0". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 26 16:21:16 crc kubenswrapper[4907]: I0226 16:21:16.143880 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ae4ed9f9-3638-491a-8467-0035443468c1-ssh-key-openstack-edpm-ipam" (OuterVolumeSpecName: "ssh-key-openstack-edpm-ipam") pod "ae4ed9f9-3638-491a-8467-0035443468c1" (UID: "ae4ed9f9-3638-491a-8467-0035443468c1"). InnerVolumeSpecName "ssh-key-openstack-edpm-ipam". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 26 16:21:16 crc kubenswrapper[4907]: I0226 16:21:16.151471 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ae4ed9f9-3638-491a-8467-0035443468c1-nova-metadata-neutron-config-0" (OuterVolumeSpecName: "nova-metadata-neutron-config-0") pod "ae4ed9f9-3638-491a-8467-0035443468c1" (UID: "ae4ed9f9-3638-491a-8467-0035443468c1"). InnerVolumeSpecName "nova-metadata-neutron-config-0". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 26 16:21:16 crc kubenswrapper[4907]: I0226 16:21:16.154473 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ae4ed9f9-3638-491a-8467-0035443468c1-inventory" (OuterVolumeSpecName: "inventory") pod "ae4ed9f9-3638-491a-8467-0035443468c1" (UID: "ae4ed9f9-3638-491a-8467-0035443468c1"). InnerVolumeSpecName "inventory". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 26 16:21:16 crc kubenswrapper[4907]: I0226 16:21:16.211421 4907 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/ae4ed9f9-3638-491a-8467-0035443468c1-inventory\") on node \"crc\" DevicePath \"\"" Feb 26 16:21:16 crc kubenswrapper[4907]: I0226 16:21:16.211450 4907 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/ae4ed9f9-3638-491a-8467-0035443468c1-ssh-key-openstack-edpm-ipam\") on node \"crc\" DevicePath \"\"" Feb 26 16:21:16 crc kubenswrapper[4907]: I0226 16:21:16.211461 4907 reconciler_common.go:293] "Volume detached for volume \"neutron-ovn-metadata-agent-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/ae4ed9f9-3638-491a-8467-0035443468c1-neutron-ovn-metadata-agent-neutron-config-0\") on node \"crc\" DevicePath \"\"" Feb 26 16:21:16 crc kubenswrapper[4907]: I0226 16:21:16.211471 4907 reconciler_common.go:293] "Volume detached for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ae4ed9f9-3638-491a-8467-0035443468c1-neutron-metadata-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 26 16:21:16 crc kubenswrapper[4907]: I0226 16:21:16.211480 4907 reconciler_common.go:293] "Volume detached for volume \"nova-metadata-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/ae4ed9f9-3638-491a-8467-0035443468c1-nova-metadata-neutron-config-0\") on node \"crc\" DevicePath \"\"" Feb 26 16:21:16 crc kubenswrapper[4907]: I0226 16:21:16.211489 4907 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7mb5c\" (UniqueName: \"kubernetes.io/projected/ae4ed9f9-3638-491a-8467-0035443468c1-kube-api-access-7mb5c\") on node \"crc\" DevicePath \"\"" Feb 26 16:21:16 crc kubenswrapper[4907]: I0226 16:21:16.571767 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-vvxgw" event={"ID":"ae4ed9f9-3638-491a-8467-0035443468c1","Type":"ContainerDied","Data":"970920b59a7d2dae94f6989a3db7816ce61f0b6e1f1c1a1ffdd7efcad8aca92a"} Feb 26 16:21:16 crc kubenswrapper[4907]: I0226 16:21:16.572141 4907 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="970920b59a7d2dae94f6989a3db7816ce61f0b6e1f1c1a1ffdd7efcad8aca92a" Feb 26 16:21:16 crc kubenswrapper[4907]: I0226 16:21:16.571795 4907 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-vvxgw" Feb 26 16:21:16 crc kubenswrapper[4907]: I0226 16:21:16.887343 4907 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/libvirt-edpm-deployment-openstack-edpm-ipam-gc9ft"] Feb 26 16:21:16 crc kubenswrapper[4907]: E0226 16:21:16.887763 4907 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f63e1b60-6723-4664-882c-d3e758b0c237" containerName="extract-content" Feb 26 16:21:16 crc kubenswrapper[4907]: I0226 16:21:16.887782 4907 state_mem.go:107] "Deleted CPUSet assignment" podUID="f63e1b60-6723-4664-882c-d3e758b0c237" containerName="extract-content" Feb 26 16:21:16 crc kubenswrapper[4907]: E0226 16:21:16.887805 4907 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ae4ed9f9-3638-491a-8467-0035443468c1" containerName="neutron-metadata-edpm-deployment-openstack-edpm-ipam" Feb 26 16:21:16 crc kubenswrapper[4907]: I0226 16:21:16.887812 4907 state_mem.go:107] "Deleted CPUSet assignment" podUID="ae4ed9f9-3638-491a-8467-0035443468c1" containerName="neutron-metadata-edpm-deployment-openstack-edpm-ipam" Feb 26 16:21:16 crc kubenswrapper[4907]: E0226 16:21:16.887823 4907 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f63e1b60-6723-4664-882c-d3e758b0c237" containerName="extract-utilities" Feb 26 16:21:16 crc kubenswrapper[4907]: I0226 16:21:16.887831 4907 
state_mem.go:107] "Deleted CPUSet assignment" podUID="f63e1b60-6723-4664-882c-d3e758b0c237" containerName="extract-utilities" Feb 26 16:21:16 crc kubenswrapper[4907]: E0226 16:21:16.887850 4907 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f63e1b60-6723-4664-882c-d3e758b0c237" containerName="registry-server" Feb 26 16:21:16 crc kubenswrapper[4907]: I0226 16:21:16.887857 4907 state_mem.go:107] "Deleted CPUSet assignment" podUID="f63e1b60-6723-4664-882c-d3e758b0c237" containerName="registry-server" Feb 26 16:21:16 crc kubenswrapper[4907]: I0226 16:21:16.888024 4907 memory_manager.go:354] "RemoveStaleState removing state" podUID="f63e1b60-6723-4664-882c-d3e758b0c237" containerName="registry-server" Feb 26 16:21:16 crc kubenswrapper[4907]: I0226 16:21:16.888041 4907 memory_manager.go:354] "RemoveStaleState removing state" podUID="ae4ed9f9-3638-491a-8467-0035443468c1" containerName="neutron-metadata-edpm-deployment-openstack-edpm-ipam" Feb 26 16:21:16 crc kubenswrapper[4907]: I0226 16:21:16.888641 4907 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-gc9ft" Feb 26 16:21:16 crc kubenswrapper[4907]: I0226 16:21:16.890983 4907 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Feb 26 16:21:16 crc kubenswrapper[4907]: I0226 16:21:16.891452 4907 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Feb 26 16:21:16 crc kubenswrapper[4907]: I0226 16:21:16.891791 4907 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Feb 26 16:21:16 crc kubenswrapper[4907]: I0226 16:21:16.893236 4907 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-57jxc" Feb 26 16:21:16 crc kubenswrapper[4907]: I0226 16:21:16.896320 4907 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"libvirt-secret" Feb 26 16:21:16 crc kubenswrapper[4907]: I0226 16:21:16.930323 4907 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/libvirt-edpm-deployment-openstack-edpm-ipam-gc9ft"] Feb 26 16:21:17 crc kubenswrapper[4907]: I0226 16:21:17.026860 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gnk2p\" (UniqueName: \"kubernetes.io/projected/2ad5f1d0-06ec-4101-b484-d4e1bc3746a3-kube-api-access-gnk2p\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-gc9ft\" (UID: \"2ad5f1d0-06ec-4101-b484-d4e1bc3746a3\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-gc9ft" Feb 26 16:21:17 crc kubenswrapper[4907]: I0226 16:21:17.026901 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2ad5f1d0-06ec-4101-b484-d4e1bc3746a3-libvirt-combined-ca-bundle\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-gc9ft\" (UID: 
\"2ad5f1d0-06ec-4101-b484-d4e1bc3746a3\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-gc9ft" Feb 26 16:21:17 crc kubenswrapper[4907]: I0226 16:21:17.026925 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/2ad5f1d0-06ec-4101-b484-d4e1bc3746a3-ssh-key-openstack-edpm-ipam\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-gc9ft\" (UID: \"2ad5f1d0-06ec-4101-b484-d4e1bc3746a3\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-gc9ft" Feb 26 16:21:17 crc kubenswrapper[4907]: I0226 16:21:17.027147 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"libvirt-secret-0\" (UniqueName: \"kubernetes.io/secret/2ad5f1d0-06ec-4101-b484-d4e1bc3746a3-libvirt-secret-0\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-gc9ft\" (UID: \"2ad5f1d0-06ec-4101-b484-d4e1bc3746a3\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-gc9ft" Feb 26 16:21:17 crc kubenswrapper[4907]: I0226 16:21:17.027273 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/2ad5f1d0-06ec-4101-b484-d4e1bc3746a3-inventory\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-gc9ft\" (UID: \"2ad5f1d0-06ec-4101-b484-d4e1bc3746a3\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-gc9ft" Feb 26 16:21:17 crc kubenswrapper[4907]: I0226 16:21:17.129179 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"libvirt-secret-0\" (UniqueName: \"kubernetes.io/secret/2ad5f1d0-06ec-4101-b484-d4e1bc3746a3-libvirt-secret-0\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-gc9ft\" (UID: \"2ad5f1d0-06ec-4101-b484-d4e1bc3746a3\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-gc9ft" Feb 26 16:21:17 crc kubenswrapper[4907]: I0226 16:21:17.129650 4907 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/2ad5f1d0-06ec-4101-b484-d4e1bc3746a3-inventory\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-gc9ft\" (UID: \"2ad5f1d0-06ec-4101-b484-d4e1bc3746a3\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-gc9ft" Feb 26 16:21:17 crc kubenswrapper[4907]: I0226 16:21:17.129965 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gnk2p\" (UniqueName: \"kubernetes.io/projected/2ad5f1d0-06ec-4101-b484-d4e1bc3746a3-kube-api-access-gnk2p\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-gc9ft\" (UID: \"2ad5f1d0-06ec-4101-b484-d4e1bc3746a3\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-gc9ft" Feb 26 16:21:17 crc kubenswrapper[4907]: I0226 16:21:17.130118 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2ad5f1d0-06ec-4101-b484-d4e1bc3746a3-libvirt-combined-ca-bundle\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-gc9ft\" (UID: \"2ad5f1d0-06ec-4101-b484-d4e1bc3746a3\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-gc9ft" Feb 26 16:21:17 crc kubenswrapper[4907]: I0226 16:21:17.130266 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/2ad5f1d0-06ec-4101-b484-d4e1bc3746a3-ssh-key-openstack-edpm-ipam\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-gc9ft\" (UID: \"2ad5f1d0-06ec-4101-b484-d4e1bc3746a3\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-gc9ft" Feb 26 16:21:17 crc kubenswrapper[4907]: I0226 16:21:17.135569 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2ad5f1d0-06ec-4101-b484-d4e1bc3746a3-libvirt-combined-ca-bundle\") pod 
\"libvirt-edpm-deployment-openstack-edpm-ipam-gc9ft\" (UID: \"2ad5f1d0-06ec-4101-b484-d4e1bc3746a3\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-gc9ft" Feb 26 16:21:17 crc kubenswrapper[4907]: I0226 16:21:17.136155 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"libvirt-secret-0\" (UniqueName: \"kubernetes.io/secret/2ad5f1d0-06ec-4101-b484-d4e1bc3746a3-libvirt-secret-0\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-gc9ft\" (UID: \"2ad5f1d0-06ec-4101-b484-d4e1bc3746a3\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-gc9ft" Feb 26 16:21:17 crc kubenswrapper[4907]: I0226 16:21:17.138242 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/2ad5f1d0-06ec-4101-b484-d4e1bc3746a3-inventory\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-gc9ft\" (UID: \"2ad5f1d0-06ec-4101-b484-d4e1bc3746a3\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-gc9ft" Feb 26 16:21:17 crc kubenswrapper[4907]: I0226 16:21:17.144779 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/2ad5f1d0-06ec-4101-b484-d4e1bc3746a3-ssh-key-openstack-edpm-ipam\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-gc9ft\" (UID: \"2ad5f1d0-06ec-4101-b484-d4e1bc3746a3\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-gc9ft" Feb 26 16:21:17 crc kubenswrapper[4907]: I0226 16:21:17.159085 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gnk2p\" (UniqueName: \"kubernetes.io/projected/2ad5f1d0-06ec-4101-b484-d4e1bc3746a3-kube-api-access-gnk2p\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-gc9ft\" (UID: \"2ad5f1d0-06ec-4101-b484-d4e1bc3746a3\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-gc9ft" Feb 26 16:21:17 crc kubenswrapper[4907]: I0226 16:21:17.205161 4907 util.go:30] "No sandbox for pod can be 
found. Need to start a new one" pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-gc9ft" Feb 26 16:21:17 crc kubenswrapper[4907]: I0226 16:21:17.711746 4907 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/libvirt-edpm-deployment-openstack-edpm-ipam-gc9ft"] Feb 26 16:21:18 crc kubenswrapper[4907]: I0226 16:21:18.341279 4907 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Feb 26 16:21:18 crc kubenswrapper[4907]: I0226 16:21:18.597777 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-gc9ft" event={"ID":"2ad5f1d0-06ec-4101-b484-d4e1bc3746a3","Type":"ContainerStarted","Data":"a27063a65a90ecc0458dac7209cca26e3f9fca666c2fb3a91dbb942ac1cea348"} Feb 26 16:21:18 crc kubenswrapper[4907]: I0226 16:21:18.597863 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-gc9ft" event={"ID":"2ad5f1d0-06ec-4101-b484-d4e1bc3746a3","Type":"ContainerStarted","Data":"5647f29d3302994ad1a03a4dce31c30736a3f933e6bdfb33ea95c55c62524c10"} Feb 26 16:21:18 crc kubenswrapper[4907]: I0226 16:21:18.627959 4907 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-gc9ft" podStartSLOduration=2.003288544 podStartE2EDuration="2.627939179s" podCreationTimestamp="2026-02-26 16:21:16 +0000 UTC" firstStartedPulling="2026-02-26 16:21:17.714300073 +0000 UTC m=+2340.232861922" lastFinishedPulling="2026-02-26 16:21:18.338950708 +0000 UTC m=+2340.857512557" observedRunningTime="2026-02-26 16:21:18.626619396 +0000 UTC m=+2341.145181285" watchObservedRunningTime="2026-02-26 16:21:18.627939179 +0000 UTC m=+2341.146501018" Feb 26 16:21:23 crc kubenswrapper[4907]: I0226 16:21:23.127183 4907 scope.go:117] "RemoveContainer" containerID="559a23ccedd7d10bc357288f1f6efb2cbce2fbfb3e7c59e80318d9e7c716e085" Feb 26 16:21:23 crc kubenswrapper[4907]: 
E0226 16:21:23.127945 4907 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-v5ng6_openshift-machine-config-operator(917eebf3-db36-47b8-af0a-b80d042fddab)\"" pod="openshift-machine-config-operator/machine-config-daemon-v5ng6" podUID="917eebf3-db36-47b8-af0a-b80d042fddab" Feb 26 16:21:34 crc kubenswrapper[4907]: I0226 16:21:34.159909 4907 scope.go:117] "RemoveContainer" containerID="559a23ccedd7d10bc357288f1f6efb2cbce2fbfb3e7c59e80318d9e7c716e085" Feb 26 16:21:34 crc kubenswrapper[4907]: E0226 16:21:34.160643 4907 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-v5ng6_openshift-machine-config-operator(917eebf3-db36-47b8-af0a-b80d042fddab)\"" pod="openshift-machine-config-operator/machine-config-daemon-v5ng6" podUID="917eebf3-db36-47b8-af0a-b80d042fddab" Feb 26 16:21:47 crc kubenswrapper[4907]: I0226 16:21:47.127400 4907 scope.go:117] "RemoveContainer" containerID="559a23ccedd7d10bc357288f1f6efb2cbce2fbfb3e7c59e80318d9e7c716e085" Feb 26 16:21:47 crc kubenswrapper[4907]: E0226 16:21:47.128934 4907 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-v5ng6_openshift-machine-config-operator(917eebf3-db36-47b8-af0a-b80d042fddab)\"" pod="openshift-machine-config-operator/machine-config-daemon-v5ng6" podUID="917eebf3-db36-47b8-af0a-b80d042fddab" Feb 26 16:22:00 crc kubenswrapper[4907]: I0226 16:22:00.152605 4907 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29535382-2bh55"] Feb 26 16:22:00 crc 
kubenswrapper[4907]: I0226 16:22:00.154543 4907 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29535382-2bh55" Feb 26 16:22:00 crc kubenswrapper[4907]: I0226 16:22:00.160284 4907 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Feb 26 16:22:00 crc kubenswrapper[4907]: I0226 16:22:00.160846 4907 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-n2mrp" Feb 26 16:22:00 crc kubenswrapper[4907]: I0226 16:22:00.160852 4907 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Feb 26 16:22:00 crc kubenswrapper[4907]: I0226 16:22:00.169146 4907 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29535382-2bh55"] Feb 26 16:22:00 crc kubenswrapper[4907]: I0226 16:22:00.351532 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-g8khw\" (UniqueName: \"kubernetes.io/projected/838f8fc1-6bdf-4fce-ab55-0da69d8f9d7d-kube-api-access-g8khw\") pod \"auto-csr-approver-29535382-2bh55\" (UID: \"838f8fc1-6bdf-4fce-ab55-0da69d8f9d7d\") " pod="openshift-infra/auto-csr-approver-29535382-2bh55" Feb 26 16:22:00 crc kubenswrapper[4907]: I0226 16:22:00.452803 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-g8khw\" (UniqueName: \"kubernetes.io/projected/838f8fc1-6bdf-4fce-ab55-0da69d8f9d7d-kube-api-access-g8khw\") pod \"auto-csr-approver-29535382-2bh55\" (UID: \"838f8fc1-6bdf-4fce-ab55-0da69d8f9d7d\") " pod="openshift-infra/auto-csr-approver-29535382-2bh55" Feb 26 16:22:00 crc kubenswrapper[4907]: I0226 16:22:00.474239 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-g8khw\" (UniqueName: \"kubernetes.io/projected/838f8fc1-6bdf-4fce-ab55-0da69d8f9d7d-kube-api-access-g8khw\") pod 
\"auto-csr-approver-29535382-2bh55\" (UID: \"838f8fc1-6bdf-4fce-ab55-0da69d8f9d7d\") " pod="openshift-infra/auto-csr-approver-29535382-2bh55" Feb 26 16:22:00 crc kubenswrapper[4907]: I0226 16:22:00.490808 4907 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29535382-2bh55" Feb 26 16:22:00 crc kubenswrapper[4907]: I0226 16:22:00.955566 4907 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29535382-2bh55"] Feb 26 16:22:01 crc kubenswrapper[4907]: I0226 16:22:01.046011 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29535382-2bh55" event={"ID":"838f8fc1-6bdf-4fce-ab55-0da69d8f9d7d","Type":"ContainerStarted","Data":"8a5745a07c859dcfdfbc504f8689ddd44795bbe7cf06d865ae33b670d86420eb"} Feb 26 16:22:02 crc kubenswrapper[4907]: I0226 16:22:02.126508 4907 scope.go:117] "RemoveContainer" containerID="559a23ccedd7d10bc357288f1f6efb2cbce2fbfb3e7c59e80318d9e7c716e085" Feb 26 16:22:02 crc kubenswrapper[4907]: E0226 16:22:02.126960 4907 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-v5ng6_openshift-machine-config-operator(917eebf3-db36-47b8-af0a-b80d042fddab)\"" pod="openshift-machine-config-operator/machine-config-daemon-v5ng6" podUID="917eebf3-db36-47b8-af0a-b80d042fddab" Feb 26 16:22:03 crc kubenswrapper[4907]: I0226 16:22:03.067193 4907 generic.go:334] "Generic (PLEG): container finished" podID="838f8fc1-6bdf-4fce-ab55-0da69d8f9d7d" containerID="015d8fbbf88748c71dfab4af205f314f429c130435d235bff9838b633676221f" exitCode=0 Feb 26 16:22:03 crc kubenswrapper[4907]: I0226 16:22:03.067299 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29535382-2bh55" 
event={"ID":"838f8fc1-6bdf-4fce-ab55-0da69d8f9d7d","Type":"ContainerDied","Data":"015d8fbbf88748c71dfab4af205f314f429c130435d235bff9838b633676221f"} Feb 26 16:22:04 crc kubenswrapper[4907]: I0226 16:22:04.445078 4907 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29535382-2bh55" Feb 26 16:22:04 crc kubenswrapper[4907]: I0226 16:22:04.563880 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-g8khw\" (UniqueName: \"kubernetes.io/projected/838f8fc1-6bdf-4fce-ab55-0da69d8f9d7d-kube-api-access-g8khw\") pod \"838f8fc1-6bdf-4fce-ab55-0da69d8f9d7d\" (UID: \"838f8fc1-6bdf-4fce-ab55-0da69d8f9d7d\") " Feb 26 16:22:04 crc kubenswrapper[4907]: I0226 16:22:04.586835 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/838f8fc1-6bdf-4fce-ab55-0da69d8f9d7d-kube-api-access-g8khw" (OuterVolumeSpecName: "kube-api-access-g8khw") pod "838f8fc1-6bdf-4fce-ab55-0da69d8f9d7d" (UID: "838f8fc1-6bdf-4fce-ab55-0da69d8f9d7d"). InnerVolumeSpecName "kube-api-access-g8khw". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 26 16:22:04 crc kubenswrapper[4907]: I0226 16:22:04.666943 4907 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-g8khw\" (UniqueName: \"kubernetes.io/projected/838f8fc1-6bdf-4fce-ab55-0da69d8f9d7d-kube-api-access-g8khw\") on node \"crc\" DevicePath \"\"" Feb 26 16:22:05 crc kubenswrapper[4907]: I0226 16:22:05.085526 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29535382-2bh55" event={"ID":"838f8fc1-6bdf-4fce-ab55-0da69d8f9d7d","Type":"ContainerDied","Data":"8a5745a07c859dcfdfbc504f8689ddd44795bbe7cf06d865ae33b670d86420eb"} Feb 26 16:22:05 crc kubenswrapper[4907]: I0226 16:22:05.085573 4907 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="8a5745a07c859dcfdfbc504f8689ddd44795bbe7cf06d865ae33b670d86420eb" Feb 26 16:22:05 crc kubenswrapper[4907]: I0226 16:22:05.085577 4907 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29535382-2bh55" Feb 26 16:22:05 crc kubenswrapper[4907]: I0226 16:22:05.525246 4907 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29535376-m6v7w"] Feb 26 16:22:05 crc kubenswrapper[4907]: I0226 16:22:05.535445 4907 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29535376-m6v7w"] Feb 26 16:22:06 crc kubenswrapper[4907]: I0226 16:22:06.145155 4907 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="346c3ab7-df78-4a2e-ae6a-a9cbdcf8bd5a" path="/var/lib/kubelet/pods/346c3ab7-df78-4a2e-ae6a-a9cbdcf8bd5a/volumes" Feb 26 16:22:10 crc kubenswrapper[4907]: I0226 16:22:10.872403 4907 scope.go:117] "RemoveContainer" containerID="baddc805566013c4c6da03687ebbcaa1e817bd9c35691d103fa12cefebb69abd" Feb 26 16:22:14 crc kubenswrapper[4907]: I0226 16:22:14.129342 4907 scope.go:117] "RemoveContainer" 
containerID="559a23ccedd7d10bc357288f1f6efb2cbce2fbfb3e7c59e80318d9e7c716e085" Feb 26 16:22:14 crc kubenswrapper[4907]: E0226 16:22:14.130507 4907 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-v5ng6_openshift-machine-config-operator(917eebf3-db36-47b8-af0a-b80d042fddab)\"" pod="openshift-machine-config-operator/machine-config-daemon-v5ng6" podUID="917eebf3-db36-47b8-af0a-b80d042fddab" Feb 26 16:22:28 crc kubenswrapper[4907]: I0226 16:22:28.134885 4907 scope.go:117] "RemoveContainer" containerID="559a23ccedd7d10bc357288f1f6efb2cbce2fbfb3e7c59e80318d9e7c716e085" Feb 26 16:22:28 crc kubenswrapper[4907]: E0226 16:22:28.135646 4907 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-v5ng6_openshift-machine-config-operator(917eebf3-db36-47b8-af0a-b80d042fddab)\"" pod="openshift-machine-config-operator/machine-config-daemon-v5ng6" podUID="917eebf3-db36-47b8-af0a-b80d042fddab" Feb 26 16:22:43 crc kubenswrapper[4907]: I0226 16:22:43.127316 4907 scope.go:117] "RemoveContainer" containerID="559a23ccedd7d10bc357288f1f6efb2cbce2fbfb3e7c59e80318d9e7c716e085" Feb 26 16:22:43 crc kubenswrapper[4907]: E0226 16:22:43.128450 4907 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-v5ng6_openshift-machine-config-operator(917eebf3-db36-47b8-af0a-b80d042fddab)\"" pod="openshift-machine-config-operator/machine-config-daemon-v5ng6" podUID="917eebf3-db36-47b8-af0a-b80d042fddab" Feb 26 16:22:56 crc kubenswrapper[4907]: I0226 16:22:56.128044 4907 scope.go:117] 
"RemoveContainer" containerID="559a23ccedd7d10bc357288f1f6efb2cbce2fbfb3e7c59e80318d9e7c716e085" Feb 26 16:22:56 crc kubenswrapper[4907]: E0226 16:22:56.129162 4907 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-v5ng6_openshift-machine-config-operator(917eebf3-db36-47b8-af0a-b80d042fddab)\"" pod="openshift-machine-config-operator/machine-config-daemon-v5ng6" podUID="917eebf3-db36-47b8-af0a-b80d042fddab" Feb 26 16:23:07 crc kubenswrapper[4907]: I0226 16:23:07.126428 4907 scope.go:117] "RemoveContainer" containerID="559a23ccedd7d10bc357288f1f6efb2cbce2fbfb3e7c59e80318d9e7c716e085" Feb 26 16:23:07 crc kubenswrapper[4907]: E0226 16:23:07.127302 4907 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-v5ng6_openshift-machine-config-operator(917eebf3-db36-47b8-af0a-b80d042fddab)\"" pod="openshift-machine-config-operator/machine-config-daemon-v5ng6" podUID="917eebf3-db36-47b8-af0a-b80d042fddab" Feb 26 16:23:22 crc kubenswrapper[4907]: I0226 16:23:22.126875 4907 scope.go:117] "RemoveContainer" containerID="559a23ccedd7d10bc357288f1f6efb2cbce2fbfb3e7c59e80318d9e7c716e085" Feb 26 16:23:22 crc kubenswrapper[4907]: E0226 16:23:22.129229 4907 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-v5ng6_openshift-machine-config-operator(917eebf3-db36-47b8-af0a-b80d042fddab)\"" pod="openshift-machine-config-operator/machine-config-daemon-v5ng6" podUID="917eebf3-db36-47b8-af0a-b80d042fddab" Feb 26 16:23:33 crc kubenswrapper[4907]: I0226 16:23:33.127301 
4907 scope.go:117] "RemoveContainer" containerID="559a23ccedd7d10bc357288f1f6efb2cbce2fbfb3e7c59e80318d9e7c716e085" Feb 26 16:23:33 crc kubenswrapper[4907]: E0226 16:23:33.128104 4907 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-v5ng6_openshift-machine-config-operator(917eebf3-db36-47b8-af0a-b80d042fddab)\"" pod="openshift-machine-config-operator/machine-config-daemon-v5ng6" podUID="917eebf3-db36-47b8-af0a-b80d042fddab" Feb 26 16:23:48 crc kubenswrapper[4907]: I0226 16:23:48.135357 4907 scope.go:117] "RemoveContainer" containerID="559a23ccedd7d10bc357288f1f6efb2cbce2fbfb3e7c59e80318d9e7c716e085" Feb 26 16:23:48 crc kubenswrapper[4907]: E0226 16:23:48.138357 4907 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-v5ng6_openshift-machine-config-operator(917eebf3-db36-47b8-af0a-b80d042fddab)\"" pod="openshift-machine-config-operator/machine-config-daemon-v5ng6" podUID="917eebf3-db36-47b8-af0a-b80d042fddab" Feb 26 16:24:00 crc kubenswrapper[4907]: I0226 16:24:00.156810 4907 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29535384-lhdqq"] Feb 26 16:24:00 crc kubenswrapper[4907]: E0226 16:24:00.157680 4907 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="838f8fc1-6bdf-4fce-ab55-0da69d8f9d7d" containerName="oc" Feb 26 16:24:00 crc kubenswrapper[4907]: I0226 16:24:00.157693 4907 state_mem.go:107] "Deleted CPUSet assignment" podUID="838f8fc1-6bdf-4fce-ab55-0da69d8f9d7d" containerName="oc" Feb 26 16:24:00 crc kubenswrapper[4907]: I0226 16:24:00.157883 4907 memory_manager.go:354] "RemoveStaleState removing state" podUID="838f8fc1-6bdf-4fce-ab55-0da69d8f9d7d" 
containerName="oc" Feb 26 16:24:00 crc kubenswrapper[4907]: I0226 16:24:00.158501 4907 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29535384-lhdqq" Feb 26 16:24:00 crc kubenswrapper[4907]: I0226 16:24:00.163672 4907 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Feb 26 16:24:00 crc kubenswrapper[4907]: I0226 16:24:00.163986 4907 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Feb 26 16:24:00 crc kubenswrapper[4907]: I0226 16:24:00.167871 4907 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-n2mrp" Feb 26 16:24:00 crc kubenswrapper[4907]: I0226 16:24:00.178712 4907 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29535384-lhdqq"] Feb 26 16:24:00 crc kubenswrapper[4907]: I0226 16:24:00.186513 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-r5rcv\" (UniqueName: \"kubernetes.io/projected/5fb2d7c1-1737-4c4c-8c42-fcf0bf406f21-kube-api-access-r5rcv\") pod \"auto-csr-approver-29535384-lhdqq\" (UID: \"5fb2d7c1-1737-4c4c-8c42-fcf0bf406f21\") " pod="openshift-infra/auto-csr-approver-29535384-lhdqq" Feb 26 16:24:00 crc kubenswrapper[4907]: I0226 16:24:00.288403 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-r5rcv\" (UniqueName: \"kubernetes.io/projected/5fb2d7c1-1737-4c4c-8c42-fcf0bf406f21-kube-api-access-r5rcv\") pod \"auto-csr-approver-29535384-lhdqq\" (UID: \"5fb2d7c1-1737-4c4c-8c42-fcf0bf406f21\") " pod="openshift-infra/auto-csr-approver-29535384-lhdqq" Feb 26 16:24:00 crc kubenswrapper[4907]: I0226 16:24:00.310849 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-r5rcv\" (UniqueName: 
\"kubernetes.io/projected/5fb2d7c1-1737-4c4c-8c42-fcf0bf406f21-kube-api-access-r5rcv\") pod \"auto-csr-approver-29535384-lhdqq\" (UID: \"5fb2d7c1-1737-4c4c-8c42-fcf0bf406f21\") " pod="openshift-infra/auto-csr-approver-29535384-lhdqq" Feb 26 16:24:00 crc kubenswrapper[4907]: I0226 16:24:00.480835 4907 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29535384-lhdqq" Feb 26 16:24:00 crc kubenswrapper[4907]: I0226 16:24:00.950225 4907 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29535384-lhdqq"] Feb 26 16:24:00 crc kubenswrapper[4907]: W0226 16:24:00.957859 4907 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod5fb2d7c1_1737_4c4c_8c42_fcf0bf406f21.slice/crio-85e7712504df352ebdff5def75094a4463329fcf4f44b716278b245211efb1f4 WatchSource:0}: Error finding container 85e7712504df352ebdff5def75094a4463329fcf4f44b716278b245211efb1f4: Status 404 returned error can't find the container with id 85e7712504df352ebdff5def75094a4463329fcf4f44b716278b245211efb1f4 Feb 26 16:24:00 crc kubenswrapper[4907]: I0226 16:24:00.959804 4907 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Feb 26 16:24:01 crc kubenswrapper[4907]: I0226 16:24:01.224704 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29535384-lhdqq" event={"ID":"5fb2d7c1-1737-4c4c-8c42-fcf0bf406f21","Type":"ContainerStarted","Data":"85e7712504df352ebdff5def75094a4463329fcf4f44b716278b245211efb1f4"} Feb 26 16:24:02 crc kubenswrapper[4907]: I0226 16:24:02.127012 4907 scope.go:117] "RemoveContainer" containerID="559a23ccedd7d10bc357288f1f6efb2cbce2fbfb3e7c59e80318d9e7c716e085" Feb 26 16:24:02 crc kubenswrapper[4907]: E0226 16:24:02.127412 4907 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with 
CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-v5ng6_openshift-machine-config-operator(917eebf3-db36-47b8-af0a-b80d042fddab)\"" pod="openshift-machine-config-operator/machine-config-daemon-v5ng6" podUID="917eebf3-db36-47b8-af0a-b80d042fddab" Feb 26 16:24:03 crc kubenswrapper[4907]: I0226 16:24:03.245393 4907 generic.go:334] "Generic (PLEG): container finished" podID="5fb2d7c1-1737-4c4c-8c42-fcf0bf406f21" containerID="80a64ed61aa8a30637a58770379e6089a6245b33d99252bde06feca920721411" exitCode=0 Feb 26 16:24:03 crc kubenswrapper[4907]: I0226 16:24:03.245494 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29535384-lhdqq" event={"ID":"5fb2d7c1-1737-4c4c-8c42-fcf0bf406f21","Type":"ContainerDied","Data":"80a64ed61aa8a30637a58770379e6089a6245b33d99252bde06feca920721411"} Feb 26 16:24:04 crc kubenswrapper[4907]: I0226 16:24:04.588738 4907 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29535384-lhdqq" Feb 26 16:24:04 crc kubenswrapper[4907]: I0226 16:24:04.676373 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-r5rcv\" (UniqueName: \"kubernetes.io/projected/5fb2d7c1-1737-4c4c-8c42-fcf0bf406f21-kube-api-access-r5rcv\") pod \"5fb2d7c1-1737-4c4c-8c42-fcf0bf406f21\" (UID: \"5fb2d7c1-1737-4c4c-8c42-fcf0bf406f21\") " Feb 26 16:24:04 crc kubenswrapper[4907]: I0226 16:24:04.681735 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5fb2d7c1-1737-4c4c-8c42-fcf0bf406f21-kube-api-access-r5rcv" (OuterVolumeSpecName: "kube-api-access-r5rcv") pod "5fb2d7c1-1737-4c4c-8c42-fcf0bf406f21" (UID: "5fb2d7c1-1737-4c4c-8c42-fcf0bf406f21"). InnerVolumeSpecName "kube-api-access-r5rcv". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 26 16:24:04 crc kubenswrapper[4907]: I0226 16:24:04.779999 4907 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-r5rcv\" (UniqueName: \"kubernetes.io/projected/5fb2d7c1-1737-4c4c-8c42-fcf0bf406f21-kube-api-access-r5rcv\") on node \"crc\" DevicePath \"\"" Feb 26 16:24:05 crc kubenswrapper[4907]: I0226 16:24:05.264023 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29535384-lhdqq" event={"ID":"5fb2d7c1-1737-4c4c-8c42-fcf0bf406f21","Type":"ContainerDied","Data":"85e7712504df352ebdff5def75094a4463329fcf4f44b716278b245211efb1f4"} Feb 26 16:24:05 crc kubenswrapper[4907]: I0226 16:24:05.264324 4907 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="85e7712504df352ebdff5def75094a4463329fcf4f44b716278b245211efb1f4" Feb 26 16:24:05 crc kubenswrapper[4907]: I0226 16:24:05.264052 4907 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29535384-lhdqq" Feb 26 16:24:05 crc kubenswrapper[4907]: I0226 16:24:05.679314 4907 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29535378-pz6jl"] Feb 26 16:24:05 crc kubenswrapper[4907]: I0226 16:24:05.687687 4907 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29535378-pz6jl"] Feb 26 16:24:06 crc kubenswrapper[4907]: I0226 16:24:06.146112 4907 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8437f994-5cf4-40bf-b425-300e97b74aed" path="/var/lib/kubelet/pods/8437f994-5cf4-40bf-b425-300e97b74aed/volumes" Feb 26 16:24:10 crc kubenswrapper[4907]: I0226 16:24:10.981287 4907 scope.go:117] "RemoveContainer" containerID="b4c2aba71af8a10b65fdb2c26b42db8cb361dd18def93a088e174c6e581fe7bb" Feb 26 16:24:14 crc kubenswrapper[4907]: I0226 16:24:14.127699 4907 scope.go:117] "RemoveContainer" 
containerID="559a23ccedd7d10bc357288f1f6efb2cbce2fbfb3e7c59e80318d9e7c716e085" Feb 26 16:24:14 crc kubenswrapper[4907]: E0226 16:24:14.128447 4907 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-v5ng6_openshift-machine-config-operator(917eebf3-db36-47b8-af0a-b80d042fddab)\"" pod="openshift-machine-config-operator/machine-config-daemon-v5ng6" podUID="917eebf3-db36-47b8-af0a-b80d042fddab" Feb 26 16:24:22 crc kubenswrapper[4907]: I0226 16:24:22.480739 4907 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-v7fjg"] Feb 26 16:24:22 crc kubenswrapper[4907]: E0226 16:24:22.481681 4907 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5fb2d7c1-1737-4c4c-8c42-fcf0bf406f21" containerName="oc" Feb 26 16:24:22 crc kubenswrapper[4907]: I0226 16:24:22.481698 4907 state_mem.go:107] "Deleted CPUSet assignment" podUID="5fb2d7c1-1737-4c4c-8c42-fcf0bf406f21" containerName="oc" Feb 26 16:24:22 crc kubenswrapper[4907]: I0226 16:24:22.481887 4907 memory_manager.go:354] "RemoveStaleState removing state" podUID="5fb2d7c1-1737-4c4c-8c42-fcf0bf406f21" containerName="oc" Feb 26 16:24:22 crc kubenswrapper[4907]: I0226 16:24:22.483326 4907 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-v7fjg" Feb 26 16:24:22 crc kubenswrapper[4907]: I0226 16:24:22.509569 4907 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-v7fjg"] Feb 26 16:24:22 crc kubenswrapper[4907]: I0226 16:24:22.538819 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bh5k5\" (UniqueName: \"kubernetes.io/projected/92508cb8-80be-49e0-a44b-fed640ad2c3a-kube-api-access-bh5k5\") pod \"community-operators-v7fjg\" (UID: \"92508cb8-80be-49e0-a44b-fed640ad2c3a\") " pod="openshift-marketplace/community-operators-v7fjg" Feb 26 16:24:22 crc kubenswrapper[4907]: I0226 16:24:22.538892 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/92508cb8-80be-49e0-a44b-fed640ad2c3a-catalog-content\") pod \"community-operators-v7fjg\" (UID: \"92508cb8-80be-49e0-a44b-fed640ad2c3a\") " pod="openshift-marketplace/community-operators-v7fjg" Feb 26 16:24:22 crc kubenswrapper[4907]: I0226 16:24:22.538911 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/92508cb8-80be-49e0-a44b-fed640ad2c3a-utilities\") pod \"community-operators-v7fjg\" (UID: \"92508cb8-80be-49e0-a44b-fed640ad2c3a\") " pod="openshift-marketplace/community-operators-v7fjg" Feb 26 16:24:22 crc kubenswrapper[4907]: I0226 16:24:22.640301 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/92508cb8-80be-49e0-a44b-fed640ad2c3a-catalog-content\") pod \"community-operators-v7fjg\" (UID: \"92508cb8-80be-49e0-a44b-fed640ad2c3a\") " pod="openshift-marketplace/community-operators-v7fjg" Feb 26 16:24:22 crc kubenswrapper[4907]: I0226 16:24:22.640340 4907 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/92508cb8-80be-49e0-a44b-fed640ad2c3a-utilities\") pod \"community-operators-v7fjg\" (UID: \"92508cb8-80be-49e0-a44b-fed640ad2c3a\") " pod="openshift-marketplace/community-operators-v7fjg" Feb 26 16:24:22 crc kubenswrapper[4907]: I0226 16:24:22.640513 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bh5k5\" (UniqueName: \"kubernetes.io/projected/92508cb8-80be-49e0-a44b-fed640ad2c3a-kube-api-access-bh5k5\") pod \"community-operators-v7fjg\" (UID: \"92508cb8-80be-49e0-a44b-fed640ad2c3a\") " pod="openshift-marketplace/community-operators-v7fjg" Feb 26 16:24:22 crc kubenswrapper[4907]: I0226 16:24:22.640929 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/92508cb8-80be-49e0-a44b-fed640ad2c3a-utilities\") pod \"community-operators-v7fjg\" (UID: \"92508cb8-80be-49e0-a44b-fed640ad2c3a\") " pod="openshift-marketplace/community-operators-v7fjg" Feb 26 16:24:22 crc kubenswrapper[4907]: I0226 16:24:22.640925 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/92508cb8-80be-49e0-a44b-fed640ad2c3a-catalog-content\") pod \"community-operators-v7fjg\" (UID: \"92508cb8-80be-49e0-a44b-fed640ad2c3a\") " pod="openshift-marketplace/community-operators-v7fjg" Feb 26 16:24:22 crc kubenswrapper[4907]: I0226 16:24:22.661328 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bh5k5\" (UniqueName: \"kubernetes.io/projected/92508cb8-80be-49e0-a44b-fed640ad2c3a-kube-api-access-bh5k5\") pod \"community-operators-v7fjg\" (UID: \"92508cb8-80be-49e0-a44b-fed640ad2c3a\") " pod="openshift-marketplace/community-operators-v7fjg" Feb 26 16:24:22 crc kubenswrapper[4907]: I0226 16:24:22.804571 4907 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-v7fjg" Feb 26 16:24:23 crc kubenswrapper[4907]: I0226 16:24:23.452657 4907 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-v7fjg"] Feb 26 16:24:24 crc kubenswrapper[4907]: I0226 16:24:24.429733 4907 generic.go:334] "Generic (PLEG): container finished" podID="92508cb8-80be-49e0-a44b-fed640ad2c3a" containerID="b539f6f11c7f9378b250e0d6200a8d5501bf272067408469f170c690ad8f4d40" exitCode=0 Feb 26 16:24:24 crc kubenswrapper[4907]: I0226 16:24:24.429808 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-v7fjg" event={"ID":"92508cb8-80be-49e0-a44b-fed640ad2c3a","Type":"ContainerDied","Data":"b539f6f11c7f9378b250e0d6200a8d5501bf272067408469f170c690ad8f4d40"} Feb 26 16:24:24 crc kubenswrapper[4907]: I0226 16:24:24.430060 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-v7fjg" event={"ID":"92508cb8-80be-49e0-a44b-fed640ad2c3a","Type":"ContainerStarted","Data":"574baf9ec8c95ac62aeb507164423000fd00d83ff4a74b67ef6b9bc707fb8e97"} Feb 26 16:24:25 crc kubenswrapper[4907]: I0226 16:24:25.442765 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-v7fjg" event={"ID":"92508cb8-80be-49e0-a44b-fed640ad2c3a","Type":"ContainerStarted","Data":"b7b32a3f8a32a7e25c74548d3f6010cc9650e9a7ff81f7a8ed9ccf9aa9f35be4"} Feb 26 16:24:27 crc kubenswrapper[4907]: I0226 16:24:27.128015 4907 scope.go:117] "RemoveContainer" containerID="559a23ccedd7d10bc357288f1f6efb2cbce2fbfb3e7c59e80318d9e7c716e085" Feb 26 16:24:27 crc kubenswrapper[4907]: E0226 16:24:27.128496 4907 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon 
pod=machine-config-daemon-v5ng6_openshift-machine-config-operator(917eebf3-db36-47b8-af0a-b80d042fddab)\"" pod="openshift-machine-config-operator/machine-config-daemon-v5ng6" podUID="917eebf3-db36-47b8-af0a-b80d042fddab" Feb 26 16:24:27 crc kubenswrapper[4907]: I0226 16:24:27.460814 4907 generic.go:334] "Generic (PLEG): container finished" podID="92508cb8-80be-49e0-a44b-fed640ad2c3a" containerID="b7b32a3f8a32a7e25c74548d3f6010cc9650e9a7ff81f7a8ed9ccf9aa9f35be4" exitCode=0 Feb 26 16:24:27 crc kubenswrapper[4907]: I0226 16:24:27.460854 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-v7fjg" event={"ID":"92508cb8-80be-49e0-a44b-fed640ad2c3a","Type":"ContainerDied","Data":"b7b32a3f8a32a7e25c74548d3f6010cc9650e9a7ff81f7a8ed9ccf9aa9f35be4"} Feb 26 16:24:28 crc kubenswrapper[4907]: I0226 16:24:28.472692 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-v7fjg" event={"ID":"92508cb8-80be-49e0-a44b-fed640ad2c3a","Type":"ContainerStarted","Data":"ad60b192ce1224b60172cf9529c06d5aa9ff79236283c2b09c0a60c8d5338930"} Feb 26 16:24:28 crc kubenswrapper[4907]: I0226 16:24:28.497054 4907 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-v7fjg" podStartSLOduration=3.092552638 podStartE2EDuration="6.497038775s" podCreationTimestamp="2026-02-26 16:24:22 +0000 UTC" firstStartedPulling="2026-02-26 16:24:24.432992527 +0000 UTC m=+2526.951554376" lastFinishedPulling="2026-02-26 16:24:27.837478664 +0000 UTC m=+2530.356040513" observedRunningTime="2026-02-26 16:24:28.494696798 +0000 UTC m=+2531.013258637" watchObservedRunningTime="2026-02-26 16:24:28.497038775 +0000 UTC m=+2531.015600624" Feb 26 16:24:32 crc kubenswrapper[4907]: I0226 16:24:32.805476 4907 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-v7fjg" Feb 26 16:24:32 crc kubenswrapper[4907]: I0226 
16:24:32.806728 4907 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-v7fjg" Feb 26 16:24:32 crc kubenswrapper[4907]: I0226 16:24:32.878043 4907 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-v7fjg" Feb 26 16:24:33 crc kubenswrapper[4907]: I0226 16:24:33.558079 4907 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-v7fjg" Feb 26 16:24:33 crc kubenswrapper[4907]: I0226 16:24:33.607241 4907 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-v7fjg"] Feb 26 16:24:35 crc kubenswrapper[4907]: I0226 16:24:35.528738 4907 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-v7fjg" podUID="92508cb8-80be-49e0-a44b-fed640ad2c3a" containerName="registry-server" containerID="cri-o://ad60b192ce1224b60172cf9529c06d5aa9ff79236283c2b09c0a60c8d5338930" gracePeriod=2 Feb 26 16:24:36 crc kubenswrapper[4907]: I0226 16:24:36.089579 4907 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-v7fjg" Feb 26 16:24:36 crc kubenswrapper[4907]: I0226 16:24:36.219789 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/92508cb8-80be-49e0-a44b-fed640ad2c3a-catalog-content\") pod \"92508cb8-80be-49e0-a44b-fed640ad2c3a\" (UID: \"92508cb8-80be-49e0-a44b-fed640ad2c3a\") " Feb 26 16:24:36 crc kubenswrapper[4907]: I0226 16:24:36.220219 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/92508cb8-80be-49e0-a44b-fed640ad2c3a-utilities\") pod \"92508cb8-80be-49e0-a44b-fed640ad2c3a\" (UID: \"92508cb8-80be-49e0-a44b-fed640ad2c3a\") " Feb 26 16:24:36 crc kubenswrapper[4907]: I0226 16:24:36.220286 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-bh5k5\" (UniqueName: \"kubernetes.io/projected/92508cb8-80be-49e0-a44b-fed640ad2c3a-kube-api-access-bh5k5\") pod \"92508cb8-80be-49e0-a44b-fed640ad2c3a\" (UID: \"92508cb8-80be-49e0-a44b-fed640ad2c3a\") " Feb 26 16:24:36 crc kubenswrapper[4907]: I0226 16:24:36.221221 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/92508cb8-80be-49e0-a44b-fed640ad2c3a-utilities" (OuterVolumeSpecName: "utilities") pod "92508cb8-80be-49e0-a44b-fed640ad2c3a" (UID: "92508cb8-80be-49e0-a44b-fed640ad2c3a"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 26 16:24:36 crc kubenswrapper[4907]: I0226 16:24:36.230047 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/92508cb8-80be-49e0-a44b-fed640ad2c3a-kube-api-access-bh5k5" (OuterVolumeSpecName: "kube-api-access-bh5k5") pod "92508cb8-80be-49e0-a44b-fed640ad2c3a" (UID: "92508cb8-80be-49e0-a44b-fed640ad2c3a"). InnerVolumeSpecName "kube-api-access-bh5k5". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 26 16:24:36 crc kubenswrapper[4907]: I0226 16:24:36.286803 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/92508cb8-80be-49e0-a44b-fed640ad2c3a-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "92508cb8-80be-49e0-a44b-fed640ad2c3a" (UID: "92508cb8-80be-49e0-a44b-fed640ad2c3a"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 26 16:24:36 crc kubenswrapper[4907]: I0226 16:24:36.323767 4907 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/92508cb8-80be-49e0-a44b-fed640ad2c3a-catalog-content\") on node \"crc\" DevicePath \"\"" Feb 26 16:24:36 crc kubenswrapper[4907]: I0226 16:24:36.323800 4907 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/92508cb8-80be-49e0-a44b-fed640ad2c3a-utilities\") on node \"crc\" DevicePath \"\"" Feb 26 16:24:36 crc kubenswrapper[4907]: I0226 16:24:36.323811 4907 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-bh5k5\" (UniqueName: \"kubernetes.io/projected/92508cb8-80be-49e0-a44b-fed640ad2c3a-kube-api-access-bh5k5\") on node \"crc\" DevicePath \"\"" Feb 26 16:24:36 crc kubenswrapper[4907]: I0226 16:24:36.538259 4907 generic.go:334] "Generic (PLEG): container finished" podID="92508cb8-80be-49e0-a44b-fed640ad2c3a" containerID="ad60b192ce1224b60172cf9529c06d5aa9ff79236283c2b09c0a60c8d5338930" exitCode=0 Feb 26 16:24:36 crc kubenswrapper[4907]: I0226 16:24:36.538317 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-v7fjg" event={"ID":"92508cb8-80be-49e0-a44b-fed640ad2c3a","Type":"ContainerDied","Data":"ad60b192ce1224b60172cf9529c06d5aa9ff79236283c2b09c0a60c8d5338930"} Feb 26 16:24:36 crc kubenswrapper[4907]: I0226 16:24:36.538343 4907 kubelet.go:2453] "SyncLoop (PLEG): event 
for pod" pod="openshift-marketplace/community-operators-v7fjg" event={"ID":"92508cb8-80be-49e0-a44b-fed640ad2c3a","Type":"ContainerDied","Data":"574baf9ec8c95ac62aeb507164423000fd00d83ff4a74b67ef6b9bc707fb8e97"} Feb 26 16:24:36 crc kubenswrapper[4907]: I0226 16:24:36.538359 4907 scope.go:117] "RemoveContainer" containerID="ad60b192ce1224b60172cf9529c06d5aa9ff79236283c2b09c0a60c8d5338930" Feb 26 16:24:36 crc kubenswrapper[4907]: I0226 16:24:36.538492 4907 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-v7fjg" Feb 26 16:24:36 crc kubenswrapper[4907]: I0226 16:24:36.565748 4907 scope.go:117] "RemoveContainer" containerID="b7b32a3f8a32a7e25c74548d3f6010cc9650e9a7ff81f7a8ed9ccf9aa9f35be4" Feb 26 16:24:36 crc kubenswrapper[4907]: I0226 16:24:36.576436 4907 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-v7fjg"] Feb 26 16:24:36 crc kubenswrapper[4907]: I0226 16:24:36.598744 4907 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-v7fjg"] Feb 26 16:24:36 crc kubenswrapper[4907]: I0226 16:24:36.610673 4907 scope.go:117] "RemoveContainer" containerID="b539f6f11c7f9378b250e0d6200a8d5501bf272067408469f170c690ad8f4d40" Feb 26 16:24:36 crc kubenswrapper[4907]: I0226 16:24:36.639952 4907 scope.go:117] "RemoveContainer" containerID="ad60b192ce1224b60172cf9529c06d5aa9ff79236283c2b09c0a60c8d5338930" Feb 26 16:24:36 crc kubenswrapper[4907]: E0226 16:24:36.640479 4907 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ad60b192ce1224b60172cf9529c06d5aa9ff79236283c2b09c0a60c8d5338930\": container with ID starting with ad60b192ce1224b60172cf9529c06d5aa9ff79236283c2b09c0a60c8d5338930 not found: ID does not exist" containerID="ad60b192ce1224b60172cf9529c06d5aa9ff79236283c2b09c0a60c8d5338930" Feb 26 16:24:36 crc kubenswrapper[4907]: I0226 
16:24:36.640510 4907 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ad60b192ce1224b60172cf9529c06d5aa9ff79236283c2b09c0a60c8d5338930"} err="failed to get container status \"ad60b192ce1224b60172cf9529c06d5aa9ff79236283c2b09c0a60c8d5338930\": rpc error: code = NotFound desc = could not find container \"ad60b192ce1224b60172cf9529c06d5aa9ff79236283c2b09c0a60c8d5338930\": container with ID starting with ad60b192ce1224b60172cf9529c06d5aa9ff79236283c2b09c0a60c8d5338930 not found: ID does not exist" Feb 26 16:24:36 crc kubenswrapper[4907]: I0226 16:24:36.640532 4907 scope.go:117] "RemoveContainer" containerID="b7b32a3f8a32a7e25c74548d3f6010cc9650e9a7ff81f7a8ed9ccf9aa9f35be4" Feb 26 16:24:36 crc kubenswrapper[4907]: E0226 16:24:36.644012 4907 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b7b32a3f8a32a7e25c74548d3f6010cc9650e9a7ff81f7a8ed9ccf9aa9f35be4\": container with ID starting with b7b32a3f8a32a7e25c74548d3f6010cc9650e9a7ff81f7a8ed9ccf9aa9f35be4 not found: ID does not exist" containerID="b7b32a3f8a32a7e25c74548d3f6010cc9650e9a7ff81f7a8ed9ccf9aa9f35be4" Feb 26 16:24:36 crc kubenswrapper[4907]: I0226 16:24:36.644035 4907 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b7b32a3f8a32a7e25c74548d3f6010cc9650e9a7ff81f7a8ed9ccf9aa9f35be4"} err="failed to get container status \"b7b32a3f8a32a7e25c74548d3f6010cc9650e9a7ff81f7a8ed9ccf9aa9f35be4\": rpc error: code = NotFound desc = could not find container \"b7b32a3f8a32a7e25c74548d3f6010cc9650e9a7ff81f7a8ed9ccf9aa9f35be4\": container with ID starting with b7b32a3f8a32a7e25c74548d3f6010cc9650e9a7ff81f7a8ed9ccf9aa9f35be4 not found: ID does not exist" Feb 26 16:24:36 crc kubenswrapper[4907]: I0226 16:24:36.644049 4907 scope.go:117] "RemoveContainer" containerID="b539f6f11c7f9378b250e0d6200a8d5501bf272067408469f170c690ad8f4d40" Feb 26 16:24:36 crc 
kubenswrapper[4907]: E0226 16:24:36.644435 4907 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b539f6f11c7f9378b250e0d6200a8d5501bf272067408469f170c690ad8f4d40\": container with ID starting with b539f6f11c7f9378b250e0d6200a8d5501bf272067408469f170c690ad8f4d40 not found: ID does not exist" containerID="b539f6f11c7f9378b250e0d6200a8d5501bf272067408469f170c690ad8f4d40" Feb 26 16:24:36 crc kubenswrapper[4907]: I0226 16:24:36.644466 4907 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b539f6f11c7f9378b250e0d6200a8d5501bf272067408469f170c690ad8f4d40"} err="failed to get container status \"b539f6f11c7f9378b250e0d6200a8d5501bf272067408469f170c690ad8f4d40\": rpc error: code = NotFound desc = could not find container \"b539f6f11c7f9378b250e0d6200a8d5501bf272067408469f170c690ad8f4d40\": container with ID starting with b539f6f11c7f9378b250e0d6200a8d5501bf272067408469f170c690ad8f4d40 not found: ID does not exist" Feb 26 16:24:38 crc kubenswrapper[4907]: I0226 16:24:38.137076 4907 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="92508cb8-80be-49e0-a44b-fed640ad2c3a" path="/var/lib/kubelet/pods/92508cb8-80be-49e0-a44b-fed640ad2c3a/volumes" Feb 26 16:24:41 crc kubenswrapper[4907]: I0226 16:24:41.127259 4907 scope.go:117] "RemoveContainer" containerID="559a23ccedd7d10bc357288f1f6efb2cbce2fbfb3e7c59e80318d9e7c716e085" Feb 26 16:24:41 crc kubenswrapper[4907]: E0226 16:24:41.128698 4907 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-v5ng6_openshift-machine-config-operator(917eebf3-db36-47b8-af0a-b80d042fddab)\"" pod="openshift-machine-config-operator/machine-config-daemon-v5ng6" podUID="917eebf3-db36-47b8-af0a-b80d042fddab" Feb 26 16:24:56 crc 
kubenswrapper[4907]: I0226 16:24:56.126679 4907 scope.go:117] "RemoveContainer" containerID="559a23ccedd7d10bc357288f1f6efb2cbce2fbfb3e7c59e80318d9e7c716e085" Feb 26 16:24:56 crc kubenswrapper[4907]: E0226 16:24:56.127331 4907 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-v5ng6_openshift-machine-config-operator(917eebf3-db36-47b8-af0a-b80d042fddab)\"" pod="openshift-machine-config-operator/machine-config-daemon-v5ng6" podUID="917eebf3-db36-47b8-af0a-b80d042fddab" Feb 26 16:25:11 crc kubenswrapper[4907]: I0226 16:25:11.127142 4907 scope.go:117] "RemoveContainer" containerID="559a23ccedd7d10bc357288f1f6efb2cbce2fbfb3e7c59e80318d9e7c716e085" Feb 26 16:25:11 crc kubenswrapper[4907]: E0226 16:25:11.128096 4907 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-v5ng6_openshift-machine-config-operator(917eebf3-db36-47b8-af0a-b80d042fddab)\"" pod="openshift-machine-config-operator/machine-config-daemon-v5ng6" podUID="917eebf3-db36-47b8-af0a-b80d042fddab" Feb 26 16:25:23 crc kubenswrapper[4907]: I0226 16:25:23.957617 4907 generic.go:334] "Generic (PLEG): container finished" podID="2ad5f1d0-06ec-4101-b484-d4e1bc3746a3" containerID="a27063a65a90ecc0458dac7209cca26e3f9fca666c2fb3a91dbb942ac1cea348" exitCode=0 Feb 26 16:25:23 crc kubenswrapper[4907]: I0226 16:25:23.957706 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-gc9ft" event={"ID":"2ad5f1d0-06ec-4101-b484-d4e1bc3746a3","Type":"ContainerDied","Data":"a27063a65a90ecc0458dac7209cca26e3f9fca666c2fb3a91dbb942ac1cea348"} Feb 26 16:25:24 crc kubenswrapper[4907]: I0226 16:25:24.128400 4907 scope.go:117] 
"RemoveContainer" containerID="559a23ccedd7d10bc357288f1f6efb2cbce2fbfb3e7c59e80318d9e7c716e085" Feb 26 16:25:24 crc kubenswrapper[4907]: I0226 16:25:24.967925 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-v5ng6" event={"ID":"917eebf3-db36-47b8-af0a-b80d042fddab","Type":"ContainerStarted","Data":"75b5efd2017cdc33ecaf179acb64b81d3ecdff3a0779fa753362a3be77de0f3d"} Feb 26 16:25:25 crc kubenswrapper[4907]: I0226 16:25:25.408682 4907 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-gc9ft" Feb 26 16:25:25 crc kubenswrapper[4907]: I0226 16:25:25.558468 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2ad5f1d0-06ec-4101-b484-d4e1bc3746a3-libvirt-combined-ca-bundle\") pod \"2ad5f1d0-06ec-4101-b484-d4e1bc3746a3\" (UID: \"2ad5f1d0-06ec-4101-b484-d4e1bc3746a3\") " Feb 26 16:25:25 crc kubenswrapper[4907]: I0226 16:25:25.560979 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"libvirt-secret-0\" (UniqueName: \"kubernetes.io/secret/2ad5f1d0-06ec-4101-b484-d4e1bc3746a3-libvirt-secret-0\") pod \"2ad5f1d0-06ec-4101-b484-d4e1bc3746a3\" (UID: \"2ad5f1d0-06ec-4101-b484-d4e1bc3746a3\") " Feb 26 16:25:25 crc kubenswrapper[4907]: I0226 16:25:25.561182 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/2ad5f1d0-06ec-4101-b484-d4e1bc3746a3-ssh-key-openstack-edpm-ipam\") pod \"2ad5f1d0-06ec-4101-b484-d4e1bc3746a3\" (UID: \"2ad5f1d0-06ec-4101-b484-d4e1bc3746a3\") " Feb 26 16:25:25 crc kubenswrapper[4907]: I0226 16:25:25.561228 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-gnk2p\" (UniqueName: 
\"kubernetes.io/projected/2ad5f1d0-06ec-4101-b484-d4e1bc3746a3-kube-api-access-gnk2p\") pod \"2ad5f1d0-06ec-4101-b484-d4e1bc3746a3\" (UID: \"2ad5f1d0-06ec-4101-b484-d4e1bc3746a3\") " Feb 26 16:25:25 crc kubenswrapper[4907]: I0226 16:25:25.562939 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/2ad5f1d0-06ec-4101-b484-d4e1bc3746a3-inventory\") pod \"2ad5f1d0-06ec-4101-b484-d4e1bc3746a3\" (UID: \"2ad5f1d0-06ec-4101-b484-d4e1bc3746a3\") " Feb 26 16:25:25 crc kubenswrapper[4907]: I0226 16:25:25.566711 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2ad5f1d0-06ec-4101-b484-d4e1bc3746a3-libvirt-combined-ca-bundle" (OuterVolumeSpecName: "libvirt-combined-ca-bundle") pod "2ad5f1d0-06ec-4101-b484-d4e1bc3746a3" (UID: "2ad5f1d0-06ec-4101-b484-d4e1bc3746a3"). InnerVolumeSpecName "libvirt-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 26 16:25:25 crc kubenswrapper[4907]: I0226 16:25:25.568851 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2ad5f1d0-06ec-4101-b484-d4e1bc3746a3-kube-api-access-gnk2p" (OuterVolumeSpecName: "kube-api-access-gnk2p") pod "2ad5f1d0-06ec-4101-b484-d4e1bc3746a3" (UID: "2ad5f1d0-06ec-4101-b484-d4e1bc3746a3"). InnerVolumeSpecName "kube-api-access-gnk2p". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 26 16:25:25 crc kubenswrapper[4907]: I0226 16:25:25.616722 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2ad5f1d0-06ec-4101-b484-d4e1bc3746a3-ssh-key-openstack-edpm-ipam" (OuterVolumeSpecName: "ssh-key-openstack-edpm-ipam") pod "2ad5f1d0-06ec-4101-b484-d4e1bc3746a3" (UID: "2ad5f1d0-06ec-4101-b484-d4e1bc3746a3"). InnerVolumeSpecName "ssh-key-openstack-edpm-ipam". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 26 16:25:25 crc kubenswrapper[4907]: I0226 16:25:25.620069 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2ad5f1d0-06ec-4101-b484-d4e1bc3746a3-libvirt-secret-0" (OuterVolumeSpecName: "libvirt-secret-0") pod "2ad5f1d0-06ec-4101-b484-d4e1bc3746a3" (UID: "2ad5f1d0-06ec-4101-b484-d4e1bc3746a3"). InnerVolumeSpecName "libvirt-secret-0". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 26 16:25:25 crc kubenswrapper[4907]: I0226 16:25:25.627541 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2ad5f1d0-06ec-4101-b484-d4e1bc3746a3-inventory" (OuterVolumeSpecName: "inventory") pod "2ad5f1d0-06ec-4101-b484-d4e1bc3746a3" (UID: "2ad5f1d0-06ec-4101-b484-d4e1bc3746a3"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 26 16:25:25 crc kubenswrapper[4907]: I0226 16:25:25.665819 4907 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/2ad5f1d0-06ec-4101-b484-d4e1bc3746a3-inventory\") on node \"crc\" DevicePath \"\"" Feb 26 16:25:25 crc kubenswrapper[4907]: I0226 16:25:25.665852 4907 reconciler_common.go:293] "Volume detached for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2ad5f1d0-06ec-4101-b484-d4e1bc3746a3-libvirt-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 26 16:25:25 crc kubenswrapper[4907]: I0226 16:25:25.665865 4907 reconciler_common.go:293] "Volume detached for volume \"libvirt-secret-0\" (UniqueName: \"kubernetes.io/secret/2ad5f1d0-06ec-4101-b484-d4e1bc3746a3-libvirt-secret-0\") on node \"crc\" DevicePath \"\"" Feb 26 16:25:25 crc kubenswrapper[4907]: I0226 16:25:25.665876 4907 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/2ad5f1d0-06ec-4101-b484-d4e1bc3746a3-ssh-key-openstack-edpm-ipam\") on node 
\"crc\" DevicePath \"\"" Feb 26 16:25:25 crc kubenswrapper[4907]: I0226 16:25:25.665889 4907 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-gnk2p\" (UniqueName: \"kubernetes.io/projected/2ad5f1d0-06ec-4101-b484-d4e1bc3746a3-kube-api-access-gnk2p\") on node \"crc\" DevicePath \"\"" Feb 26 16:25:25 crc kubenswrapper[4907]: I0226 16:25:25.978678 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-gc9ft" event={"ID":"2ad5f1d0-06ec-4101-b484-d4e1bc3746a3","Type":"ContainerDied","Data":"5647f29d3302994ad1a03a4dce31c30736a3f933e6bdfb33ea95c55c62524c10"} Feb 26 16:25:25 crc kubenswrapper[4907]: I0226 16:25:25.978721 4907 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="5647f29d3302994ad1a03a4dce31c30736a3f933e6bdfb33ea95c55c62524c10" Feb 26 16:25:25 crc kubenswrapper[4907]: I0226 16:25:25.978780 4907 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-gc9ft" Feb 26 16:25:26 crc kubenswrapper[4907]: I0226 16:25:26.112440 4907 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-edpm-deployment-openstack-edpm-ipam-klh96"] Feb 26 16:25:26 crc kubenswrapper[4907]: E0226 16:25:26.112914 4907 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="92508cb8-80be-49e0-a44b-fed640ad2c3a" containerName="extract-utilities" Feb 26 16:25:26 crc kubenswrapper[4907]: I0226 16:25:26.112938 4907 state_mem.go:107] "Deleted CPUSet assignment" podUID="92508cb8-80be-49e0-a44b-fed640ad2c3a" containerName="extract-utilities" Feb 26 16:25:26 crc kubenswrapper[4907]: E0226 16:25:26.112963 4907 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2ad5f1d0-06ec-4101-b484-d4e1bc3746a3" containerName="libvirt-edpm-deployment-openstack-edpm-ipam" Feb 26 16:25:26 crc kubenswrapper[4907]: I0226 16:25:26.112971 4907 state_mem.go:107] "Deleted CPUSet assignment" 
podUID="2ad5f1d0-06ec-4101-b484-d4e1bc3746a3" containerName="libvirt-edpm-deployment-openstack-edpm-ipam" Feb 26 16:25:26 crc kubenswrapper[4907]: E0226 16:25:26.112984 4907 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="92508cb8-80be-49e0-a44b-fed640ad2c3a" containerName="registry-server" Feb 26 16:25:26 crc kubenswrapper[4907]: I0226 16:25:26.112991 4907 state_mem.go:107] "Deleted CPUSet assignment" podUID="92508cb8-80be-49e0-a44b-fed640ad2c3a" containerName="registry-server" Feb 26 16:25:26 crc kubenswrapper[4907]: E0226 16:25:26.113007 4907 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="92508cb8-80be-49e0-a44b-fed640ad2c3a" containerName="extract-content" Feb 26 16:25:26 crc kubenswrapper[4907]: I0226 16:25:26.113016 4907 state_mem.go:107] "Deleted CPUSet assignment" podUID="92508cb8-80be-49e0-a44b-fed640ad2c3a" containerName="extract-content" Feb 26 16:25:26 crc kubenswrapper[4907]: I0226 16:25:26.113217 4907 memory_manager.go:354] "RemoveStaleState removing state" podUID="2ad5f1d0-06ec-4101-b484-d4e1bc3746a3" containerName="libvirt-edpm-deployment-openstack-edpm-ipam" Feb 26 16:25:26 crc kubenswrapper[4907]: I0226 16:25:26.113250 4907 memory_manager.go:354] "RemoveStaleState removing state" podUID="92508cb8-80be-49e0-a44b-fed640ad2c3a" containerName="registry-server" Feb 26 16:25:26 crc kubenswrapper[4907]: I0226 16:25:26.114094 4907 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-klh96" Feb 26 16:25:26 crc kubenswrapper[4907]: I0226 16:25:26.118229 4907 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Feb 26 16:25:26 crc kubenswrapper[4907]: I0226 16:25:26.121978 4907 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"nova-extra-config" Feb 26 16:25:26 crc kubenswrapper[4907]: I0226 16:25:26.122182 4907 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Feb 26 16:25:26 crc kubenswrapper[4907]: I0226 16:25:26.122230 4907 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-compute-config" Feb 26 16:25:26 crc kubenswrapper[4907]: I0226 16:25:26.122416 4907 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-57jxc" Feb 26 16:25:26 crc kubenswrapper[4907]: I0226 16:25:26.122579 4907 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-migration-ssh-key" Feb 26 16:25:26 crc kubenswrapper[4907]: I0226 16:25:26.124497 4907 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Feb 26 16:25:26 crc kubenswrapper[4907]: I0226 16:25:26.144744 4907 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-edpm-deployment-openstack-edpm-ipam-klh96"] Feb 26 16:25:26 crc kubenswrapper[4907]: I0226 16:25:26.285629 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/16415278-d48c-47a3-92b4-0dfb2da9c8ca-nova-combined-ca-bundle\") pod \"nova-edpm-deployment-openstack-edpm-ipam-klh96\" (UID: \"16415278-d48c-47a3-92b4-0dfb2da9c8ca\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-klh96" Feb 26 16:25:26 crc kubenswrapper[4907]: I0226 
16:25:26.285680 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-migration-ssh-key-1\" (UniqueName: \"kubernetes.io/secret/16415278-d48c-47a3-92b4-0dfb2da9c8ca-nova-migration-ssh-key-1\") pod \"nova-edpm-deployment-openstack-edpm-ipam-klh96\" (UID: \"16415278-d48c-47a3-92b4-0dfb2da9c8ca\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-klh96" Feb 26 16:25:26 crc kubenswrapper[4907]: I0226 16:25:26.285701 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/16415278-d48c-47a3-92b4-0dfb2da9c8ca-ssh-key-openstack-edpm-ipam\") pod \"nova-edpm-deployment-openstack-edpm-ipam-klh96\" (UID: \"16415278-d48c-47a3-92b4-0dfb2da9c8ca\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-klh96" Feb 26 16:25:26 crc kubenswrapper[4907]: I0226 16:25:26.286470 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-migration-ssh-key-0\" (UniqueName: \"kubernetes.io/secret/16415278-d48c-47a3-92b4-0dfb2da9c8ca-nova-migration-ssh-key-0\") pod \"nova-edpm-deployment-openstack-edpm-ipam-klh96\" (UID: \"16415278-d48c-47a3-92b4-0dfb2da9c8ca\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-klh96" Feb 26 16:25:26 crc kubenswrapper[4907]: I0226 16:25:26.286541 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qsvzm\" (UniqueName: \"kubernetes.io/projected/16415278-d48c-47a3-92b4-0dfb2da9c8ca-kube-api-access-qsvzm\") pod \"nova-edpm-deployment-openstack-edpm-ipam-klh96\" (UID: \"16415278-d48c-47a3-92b4-0dfb2da9c8ca\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-klh96" Feb 26 16:25:26 crc kubenswrapper[4907]: I0226 16:25:26.286576 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-extra-config-0\" (UniqueName: 
\"kubernetes.io/configmap/16415278-d48c-47a3-92b4-0dfb2da9c8ca-nova-extra-config-0\") pod \"nova-edpm-deployment-openstack-edpm-ipam-klh96\" (UID: \"16415278-d48c-47a3-92b4-0dfb2da9c8ca\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-klh96" Feb 26 16:25:26 crc kubenswrapper[4907]: I0226 16:25:26.286617 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-cell1-compute-config-0\" (UniqueName: \"kubernetes.io/secret/16415278-d48c-47a3-92b4-0dfb2da9c8ca-nova-cell1-compute-config-0\") pod \"nova-edpm-deployment-openstack-edpm-ipam-klh96\" (UID: \"16415278-d48c-47a3-92b4-0dfb2da9c8ca\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-klh96" Feb 26 16:25:26 crc kubenswrapper[4907]: I0226 16:25:26.286954 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-cell1-compute-config-1\" (UniqueName: \"kubernetes.io/secret/16415278-d48c-47a3-92b4-0dfb2da9c8ca-nova-cell1-compute-config-1\") pod \"nova-edpm-deployment-openstack-edpm-ipam-klh96\" (UID: \"16415278-d48c-47a3-92b4-0dfb2da9c8ca\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-klh96" Feb 26 16:25:26 crc kubenswrapper[4907]: I0226 16:25:26.287030 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-cell1-compute-config-3\" (UniqueName: \"kubernetes.io/secret/16415278-d48c-47a3-92b4-0dfb2da9c8ca-nova-cell1-compute-config-3\") pod \"nova-edpm-deployment-openstack-edpm-ipam-klh96\" (UID: \"16415278-d48c-47a3-92b4-0dfb2da9c8ca\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-klh96" Feb 26 16:25:26 crc kubenswrapper[4907]: I0226 16:25:26.287071 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/16415278-d48c-47a3-92b4-0dfb2da9c8ca-inventory\") pod \"nova-edpm-deployment-openstack-edpm-ipam-klh96\" (UID: 
\"16415278-d48c-47a3-92b4-0dfb2da9c8ca\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-klh96" Feb 26 16:25:26 crc kubenswrapper[4907]: I0226 16:25:26.287098 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-cell1-compute-config-2\" (UniqueName: \"kubernetes.io/secret/16415278-d48c-47a3-92b4-0dfb2da9c8ca-nova-cell1-compute-config-2\") pod \"nova-edpm-deployment-openstack-edpm-ipam-klh96\" (UID: \"16415278-d48c-47a3-92b4-0dfb2da9c8ca\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-klh96" Feb 26 16:25:26 crc kubenswrapper[4907]: I0226 16:25:26.388508 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-migration-ssh-key-0\" (UniqueName: \"kubernetes.io/secret/16415278-d48c-47a3-92b4-0dfb2da9c8ca-nova-migration-ssh-key-0\") pod \"nova-edpm-deployment-openstack-edpm-ipam-klh96\" (UID: \"16415278-d48c-47a3-92b4-0dfb2da9c8ca\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-klh96" Feb 26 16:25:26 crc kubenswrapper[4907]: I0226 16:25:26.388575 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qsvzm\" (UniqueName: \"kubernetes.io/projected/16415278-d48c-47a3-92b4-0dfb2da9c8ca-kube-api-access-qsvzm\") pod \"nova-edpm-deployment-openstack-edpm-ipam-klh96\" (UID: \"16415278-d48c-47a3-92b4-0dfb2da9c8ca\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-klh96" Feb 26 16:25:26 crc kubenswrapper[4907]: I0226 16:25:26.388619 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-extra-config-0\" (UniqueName: \"kubernetes.io/configmap/16415278-d48c-47a3-92b4-0dfb2da9c8ca-nova-extra-config-0\") pod \"nova-edpm-deployment-openstack-edpm-ipam-klh96\" (UID: \"16415278-d48c-47a3-92b4-0dfb2da9c8ca\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-klh96" Feb 26 16:25:26 crc kubenswrapper[4907]: I0226 16:25:26.388642 4907 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"nova-cell1-compute-config-0\" (UniqueName: \"kubernetes.io/secret/16415278-d48c-47a3-92b4-0dfb2da9c8ca-nova-cell1-compute-config-0\") pod \"nova-edpm-deployment-openstack-edpm-ipam-klh96\" (UID: \"16415278-d48c-47a3-92b4-0dfb2da9c8ca\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-klh96" Feb 26 16:25:26 crc kubenswrapper[4907]: I0226 16:25:26.388709 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-cell1-compute-config-1\" (UniqueName: \"kubernetes.io/secret/16415278-d48c-47a3-92b4-0dfb2da9c8ca-nova-cell1-compute-config-1\") pod \"nova-edpm-deployment-openstack-edpm-ipam-klh96\" (UID: \"16415278-d48c-47a3-92b4-0dfb2da9c8ca\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-klh96" Feb 26 16:25:26 crc kubenswrapper[4907]: I0226 16:25:26.388743 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-cell1-compute-config-3\" (UniqueName: \"kubernetes.io/secret/16415278-d48c-47a3-92b4-0dfb2da9c8ca-nova-cell1-compute-config-3\") pod \"nova-edpm-deployment-openstack-edpm-ipam-klh96\" (UID: \"16415278-d48c-47a3-92b4-0dfb2da9c8ca\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-klh96" Feb 26 16:25:26 crc kubenswrapper[4907]: I0226 16:25:26.388775 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/16415278-d48c-47a3-92b4-0dfb2da9c8ca-inventory\") pod \"nova-edpm-deployment-openstack-edpm-ipam-klh96\" (UID: \"16415278-d48c-47a3-92b4-0dfb2da9c8ca\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-klh96" Feb 26 16:25:26 crc kubenswrapper[4907]: I0226 16:25:26.388797 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-cell1-compute-config-2\" (UniqueName: \"kubernetes.io/secret/16415278-d48c-47a3-92b4-0dfb2da9c8ca-nova-cell1-compute-config-2\") pod \"nova-edpm-deployment-openstack-edpm-ipam-klh96\" 
(UID: \"16415278-d48c-47a3-92b4-0dfb2da9c8ca\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-klh96" Feb 26 16:25:26 crc kubenswrapper[4907]: I0226 16:25:26.388813 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/16415278-d48c-47a3-92b4-0dfb2da9c8ca-nova-combined-ca-bundle\") pod \"nova-edpm-deployment-openstack-edpm-ipam-klh96\" (UID: \"16415278-d48c-47a3-92b4-0dfb2da9c8ca\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-klh96" Feb 26 16:25:26 crc kubenswrapper[4907]: I0226 16:25:26.388830 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-migration-ssh-key-1\" (UniqueName: \"kubernetes.io/secret/16415278-d48c-47a3-92b4-0dfb2da9c8ca-nova-migration-ssh-key-1\") pod \"nova-edpm-deployment-openstack-edpm-ipam-klh96\" (UID: \"16415278-d48c-47a3-92b4-0dfb2da9c8ca\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-klh96" Feb 26 16:25:26 crc kubenswrapper[4907]: I0226 16:25:26.388849 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/16415278-d48c-47a3-92b4-0dfb2da9c8ca-ssh-key-openstack-edpm-ipam\") pod \"nova-edpm-deployment-openstack-edpm-ipam-klh96\" (UID: \"16415278-d48c-47a3-92b4-0dfb2da9c8ca\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-klh96" Feb 26 16:25:26 crc kubenswrapper[4907]: I0226 16:25:26.389467 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-extra-config-0\" (UniqueName: \"kubernetes.io/configmap/16415278-d48c-47a3-92b4-0dfb2da9c8ca-nova-extra-config-0\") pod \"nova-edpm-deployment-openstack-edpm-ipam-klh96\" (UID: \"16415278-d48c-47a3-92b4-0dfb2da9c8ca\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-klh96" Feb 26 16:25:26 crc kubenswrapper[4907]: I0226 16:25:26.392977 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for 
volume \"nova-cell1-compute-config-1\" (UniqueName: \"kubernetes.io/secret/16415278-d48c-47a3-92b4-0dfb2da9c8ca-nova-cell1-compute-config-1\") pod \"nova-edpm-deployment-openstack-edpm-ipam-klh96\" (UID: \"16415278-d48c-47a3-92b4-0dfb2da9c8ca\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-klh96" Feb 26 16:25:26 crc kubenswrapper[4907]: I0226 16:25:26.393400 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/16415278-d48c-47a3-92b4-0dfb2da9c8ca-ssh-key-openstack-edpm-ipam\") pod \"nova-edpm-deployment-openstack-edpm-ipam-klh96\" (UID: \"16415278-d48c-47a3-92b4-0dfb2da9c8ca\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-klh96" Feb 26 16:25:26 crc kubenswrapper[4907]: I0226 16:25:26.394995 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-cell1-compute-config-0\" (UniqueName: \"kubernetes.io/secret/16415278-d48c-47a3-92b4-0dfb2da9c8ca-nova-cell1-compute-config-0\") pod \"nova-edpm-deployment-openstack-edpm-ipam-klh96\" (UID: \"16415278-d48c-47a3-92b4-0dfb2da9c8ca\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-klh96" Feb 26 16:25:26 crc kubenswrapper[4907]: I0226 16:25:26.416288 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-cell1-compute-config-2\" (UniqueName: \"kubernetes.io/secret/16415278-d48c-47a3-92b4-0dfb2da9c8ca-nova-cell1-compute-config-2\") pod \"nova-edpm-deployment-openstack-edpm-ipam-klh96\" (UID: \"16415278-d48c-47a3-92b4-0dfb2da9c8ca\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-klh96" Feb 26 16:25:26 crc kubenswrapper[4907]: I0226 16:25:26.419286 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/16415278-d48c-47a3-92b4-0dfb2da9c8ca-nova-combined-ca-bundle\") pod \"nova-edpm-deployment-openstack-edpm-ipam-klh96\" (UID: \"16415278-d48c-47a3-92b4-0dfb2da9c8ca\") " 
pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-klh96" Feb 26 16:25:26 crc kubenswrapper[4907]: I0226 16:25:26.429347 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/16415278-d48c-47a3-92b4-0dfb2da9c8ca-inventory\") pod \"nova-edpm-deployment-openstack-edpm-ipam-klh96\" (UID: \"16415278-d48c-47a3-92b4-0dfb2da9c8ca\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-klh96" Feb 26 16:25:26 crc kubenswrapper[4907]: I0226 16:25:26.434851 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-migration-ssh-key-1\" (UniqueName: \"kubernetes.io/secret/16415278-d48c-47a3-92b4-0dfb2da9c8ca-nova-migration-ssh-key-1\") pod \"nova-edpm-deployment-openstack-edpm-ipam-klh96\" (UID: \"16415278-d48c-47a3-92b4-0dfb2da9c8ca\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-klh96" Feb 26 16:25:26 crc kubenswrapper[4907]: I0226 16:25:26.434956 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-cell1-compute-config-3\" (UniqueName: \"kubernetes.io/secret/16415278-d48c-47a3-92b4-0dfb2da9c8ca-nova-cell1-compute-config-3\") pod \"nova-edpm-deployment-openstack-edpm-ipam-klh96\" (UID: \"16415278-d48c-47a3-92b4-0dfb2da9c8ca\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-klh96" Feb 26 16:25:26 crc kubenswrapper[4907]: I0226 16:25:26.437376 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-migration-ssh-key-0\" (UniqueName: \"kubernetes.io/secret/16415278-d48c-47a3-92b4-0dfb2da9c8ca-nova-migration-ssh-key-0\") pod \"nova-edpm-deployment-openstack-edpm-ipam-klh96\" (UID: \"16415278-d48c-47a3-92b4-0dfb2da9c8ca\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-klh96" Feb 26 16:25:26 crc kubenswrapper[4907]: I0226 16:25:26.441623 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qsvzm\" (UniqueName: 
\"kubernetes.io/projected/16415278-d48c-47a3-92b4-0dfb2da9c8ca-kube-api-access-qsvzm\") pod \"nova-edpm-deployment-openstack-edpm-ipam-klh96\" (UID: \"16415278-d48c-47a3-92b4-0dfb2da9c8ca\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-klh96" Feb 26 16:25:26 crc kubenswrapper[4907]: I0226 16:25:26.442652 4907 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-klh96" Feb 26 16:25:26 crc kubenswrapper[4907]: I0226 16:25:26.755501 4907 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-edpm-deployment-openstack-edpm-ipam-klh96"] Feb 26 16:25:26 crc kubenswrapper[4907]: W0226 16:25:26.764397 4907 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod16415278_d48c_47a3_92b4_0dfb2da9c8ca.slice/crio-63e4692fd5439e025636fa0ada42189c75a7ca61525e3773693fd092154a09a6 WatchSource:0}: Error finding container 63e4692fd5439e025636fa0ada42189c75a7ca61525e3773693fd092154a09a6: Status 404 returned error can't find the container with id 63e4692fd5439e025636fa0ada42189c75a7ca61525e3773693fd092154a09a6 Feb 26 16:25:26 crc kubenswrapper[4907]: I0226 16:25:26.988426 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-klh96" event={"ID":"16415278-d48c-47a3-92b4-0dfb2da9c8ca","Type":"ContainerStarted","Data":"63e4692fd5439e025636fa0ada42189c75a7ca61525e3773693fd092154a09a6"} Feb 26 16:25:28 crc kubenswrapper[4907]: I0226 16:25:28.002527 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-klh96" event={"ID":"16415278-d48c-47a3-92b4-0dfb2da9c8ca","Type":"ContainerStarted","Data":"e85633773c7b1531186ce71112d4261149d7453b8792f4f7c706e488b4d2b1bb"} Feb 26 16:25:28 crc kubenswrapper[4907]: I0226 16:25:28.026394 4907 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-klh96" podStartSLOduration=1.5183036859999999 podStartE2EDuration="2.026375055s" podCreationTimestamp="2026-02-26 16:25:26 +0000 UTC" firstStartedPulling="2026-02-26 16:25:26.766949176 +0000 UTC m=+2589.285511025" lastFinishedPulling="2026-02-26 16:25:27.275020525 +0000 UTC m=+2589.793582394" observedRunningTime="2026-02-26 16:25:28.022016299 +0000 UTC m=+2590.540578148" watchObservedRunningTime="2026-02-26 16:25:28.026375055 +0000 UTC m=+2590.544936914" Feb 26 16:26:00 crc kubenswrapper[4907]: I0226 16:26:00.189176 4907 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29535386-bjv9s"] Feb 26 16:26:00 crc kubenswrapper[4907]: I0226 16:26:00.190830 4907 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29535386-bjv9s" Feb 26 16:26:00 crc kubenswrapper[4907]: I0226 16:26:00.200605 4907 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Feb 26 16:26:00 crc kubenswrapper[4907]: I0226 16:26:00.200686 4907 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Feb 26 16:26:00 crc kubenswrapper[4907]: I0226 16:26:00.200980 4907 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-n2mrp" Feb 26 16:26:00 crc kubenswrapper[4907]: I0226 16:26:00.215020 4907 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29535386-bjv9s"] Feb 26 16:26:00 crc kubenswrapper[4907]: I0226 16:26:00.357158 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-w78v2\" (UniqueName: \"kubernetes.io/projected/34f41544-9fae-45ec-9e99-60598164470b-kube-api-access-w78v2\") pod \"auto-csr-approver-29535386-bjv9s\" (UID: \"34f41544-9fae-45ec-9e99-60598164470b\") " 
pod="openshift-infra/auto-csr-approver-29535386-bjv9s" Feb 26 16:26:00 crc kubenswrapper[4907]: I0226 16:26:00.459987 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-w78v2\" (UniqueName: \"kubernetes.io/projected/34f41544-9fae-45ec-9e99-60598164470b-kube-api-access-w78v2\") pod \"auto-csr-approver-29535386-bjv9s\" (UID: \"34f41544-9fae-45ec-9e99-60598164470b\") " pod="openshift-infra/auto-csr-approver-29535386-bjv9s" Feb 26 16:26:00 crc kubenswrapper[4907]: I0226 16:26:00.480578 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-w78v2\" (UniqueName: \"kubernetes.io/projected/34f41544-9fae-45ec-9e99-60598164470b-kube-api-access-w78v2\") pod \"auto-csr-approver-29535386-bjv9s\" (UID: \"34f41544-9fae-45ec-9e99-60598164470b\") " pod="openshift-infra/auto-csr-approver-29535386-bjv9s" Feb 26 16:26:00 crc kubenswrapper[4907]: I0226 16:26:00.527008 4907 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29535386-bjv9s" Feb 26 16:26:01 crc kubenswrapper[4907]: I0226 16:26:01.042750 4907 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29535386-bjv9s"] Feb 26 16:26:01 crc kubenswrapper[4907]: I0226 16:26:01.308433 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29535386-bjv9s" event={"ID":"34f41544-9fae-45ec-9e99-60598164470b","Type":"ContainerStarted","Data":"3f2857bb81066b6e66567d8c4cb1feb1ff26e4e14472018be97cac407dafed0c"} Feb 26 16:26:02 crc kubenswrapper[4907]: I0226 16:26:02.317413 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29535386-bjv9s" event={"ID":"34f41544-9fae-45ec-9e99-60598164470b","Type":"ContainerStarted","Data":"baecd666444d528873510e3392a5af31aee1d1f70e657bff9ab36085882db7b3"} Feb 26 16:26:02 crc kubenswrapper[4907]: I0226 16:26:02.344628 4907 
pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-infra/auto-csr-approver-29535386-bjv9s" podStartSLOduration=1.414259709 podStartE2EDuration="2.344607574s" podCreationTimestamp="2026-02-26 16:26:00 +0000 UTC" firstStartedPulling="2026-02-26 16:26:01.052869324 +0000 UTC m=+2623.571431173" lastFinishedPulling="2026-02-26 16:26:01.983217189 +0000 UTC m=+2624.501779038" observedRunningTime="2026-02-26 16:26:02.336197697 +0000 UTC m=+2624.854759546" watchObservedRunningTime="2026-02-26 16:26:02.344607574 +0000 UTC m=+2624.863169423" Feb 26 16:26:03 crc kubenswrapper[4907]: I0226 16:26:03.329985 4907 generic.go:334] "Generic (PLEG): container finished" podID="34f41544-9fae-45ec-9e99-60598164470b" containerID="baecd666444d528873510e3392a5af31aee1d1f70e657bff9ab36085882db7b3" exitCode=0 Feb 26 16:26:03 crc kubenswrapper[4907]: I0226 16:26:03.330075 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29535386-bjv9s" event={"ID":"34f41544-9fae-45ec-9e99-60598164470b","Type":"ContainerDied","Data":"baecd666444d528873510e3392a5af31aee1d1f70e657bff9ab36085882db7b3"} Feb 26 16:26:04 crc kubenswrapper[4907]: I0226 16:26:04.696739 4907 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29535386-bjv9s" Feb 26 16:26:04 crc kubenswrapper[4907]: I0226 16:26:04.777745 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-w78v2\" (UniqueName: \"kubernetes.io/projected/34f41544-9fae-45ec-9e99-60598164470b-kube-api-access-w78v2\") pod \"34f41544-9fae-45ec-9e99-60598164470b\" (UID: \"34f41544-9fae-45ec-9e99-60598164470b\") " Feb 26 16:26:04 crc kubenswrapper[4907]: I0226 16:26:04.798185 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/34f41544-9fae-45ec-9e99-60598164470b-kube-api-access-w78v2" (OuterVolumeSpecName: "kube-api-access-w78v2") pod "34f41544-9fae-45ec-9e99-60598164470b" (UID: "34f41544-9fae-45ec-9e99-60598164470b"). InnerVolumeSpecName "kube-api-access-w78v2". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 26 16:26:04 crc kubenswrapper[4907]: I0226 16:26:04.881266 4907 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-w78v2\" (UniqueName: \"kubernetes.io/projected/34f41544-9fae-45ec-9e99-60598164470b-kube-api-access-w78v2\") on node \"crc\" DevicePath \"\"" Feb 26 16:26:05 crc kubenswrapper[4907]: I0226 16:26:05.349920 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29535386-bjv9s" event={"ID":"34f41544-9fae-45ec-9e99-60598164470b","Type":"ContainerDied","Data":"3f2857bb81066b6e66567d8c4cb1feb1ff26e4e14472018be97cac407dafed0c"} Feb 26 16:26:05 crc kubenswrapper[4907]: I0226 16:26:05.350261 4907 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="3f2857bb81066b6e66567d8c4cb1feb1ff26e4e14472018be97cac407dafed0c" Feb 26 16:26:05 crc kubenswrapper[4907]: I0226 16:26:05.350026 4907 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29535386-bjv9s" Feb 26 16:26:05 crc kubenswrapper[4907]: I0226 16:26:05.413392 4907 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29535380-n7f96"] Feb 26 16:26:05 crc kubenswrapper[4907]: I0226 16:26:05.422605 4907 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29535380-n7f96"] Feb 26 16:26:06 crc kubenswrapper[4907]: I0226 16:26:06.140966 4907 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d1e8ff52-ea04-4294-a6bb-4ec86d328fd3" path="/var/lib/kubelet/pods/d1e8ff52-ea04-4294-a6bb-4ec86d328fd3/volumes" Feb 26 16:26:11 crc kubenswrapper[4907]: I0226 16:26:11.087277 4907 scope.go:117] "RemoveContainer" containerID="e0212c7e07f7d8cb33491c0404568acab4c95f21e7f235ee87f59f2567c6f7ba" Feb 26 16:27:48 crc kubenswrapper[4907]: I0226 16:27:48.529965 4907 patch_prober.go:28] interesting pod/machine-config-daemon-v5ng6 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 26 16:27:48 crc kubenswrapper[4907]: I0226 16:27:48.530505 4907 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-v5ng6" podUID="917eebf3-db36-47b8-af0a-b80d042fddab" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 26 16:28:00 crc kubenswrapper[4907]: I0226 16:28:00.148361 4907 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29535388-7pf82"] Feb 26 16:28:00 crc kubenswrapper[4907]: E0226 16:28:00.149258 4907 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="34f41544-9fae-45ec-9e99-60598164470b" containerName="oc" Feb 26 16:28:00 crc 
kubenswrapper[4907]: I0226 16:28:00.149271 4907 state_mem.go:107] "Deleted CPUSet assignment" podUID="34f41544-9fae-45ec-9e99-60598164470b" containerName="oc" Feb 26 16:28:00 crc kubenswrapper[4907]: I0226 16:28:00.149485 4907 memory_manager.go:354] "RemoveStaleState removing state" podUID="34f41544-9fae-45ec-9e99-60598164470b" containerName="oc" Feb 26 16:28:00 crc kubenswrapper[4907]: I0226 16:28:00.150054 4907 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29535388-7pf82" Feb 26 16:28:00 crc kubenswrapper[4907]: I0226 16:28:00.153973 4907 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Feb 26 16:28:00 crc kubenswrapper[4907]: I0226 16:28:00.154127 4907 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-n2mrp" Feb 26 16:28:00 crc kubenswrapper[4907]: I0226 16:28:00.154306 4907 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Feb 26 16:28:00 crc kubenswrapper[4907]: I0226 16:28:00.166271 4907 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29535388-7pf82"] Feb 26 16:28:00 crc kubenswrapper[4907]: I0226 16:28:00.315232 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-459bq\" (UniqueName: \"kubernetes.io/projected/9755fa96-5b0e-4088-88ec-70ec3bb6121d-kube-api-access-459bq\") pod \"auto-csr-approver-29535388-7pf82\" (UID: \"9755fa96-5b0e-4088-88ec-70ec3bb6121d\") " pod="openshift-infra/auto-csr-approver-29535388-7pf82" Feb 26 16:28:00 crc kubenswrapper[4907]: I0226 16:28:00.417136 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-459bq\" (UniqueName: \"kubernetes.io/projected/9755fa96-5b0e-4088-88ec-70ec3bb6121d-kube-api-access-459bq\") pod \"auto-csr-approver-29535388-7pf82\" 
(UID: \"9755fa96-5b0e-4088-88ec-70ec3bb6121d\") " pod="openshift-infra/auto-csr-approver-29535388-7pf82" Feb 26 16:28:00 crc kubenswrapper[4907]: I0226 16:28:00.439385 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-459bq\" (UniqueName: \"kubernetes.io/projected/9755fa96-5b0e-4088-88ec-70ec3bb6121d-kube-api-access-459bq\") pod \"auto-csr-approver-29535388-7pf82\" (UID: \"9755fa96-5b0e-4088-88ec-70ec3bb6121d\") " pod="openshift-infra/auto-csr-approver-29535388-7pf82" Feb 26 16:28:00 crc kubenswrapper[4907]: I0226 16:28:00.473323 4907 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29535388-7pf82" Feb 26 16:28:01 crc kubenswrapper[4907]: I0226 16:28:01.003635 4907 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29535388-7pf82"] Feb 26 16:28:01 crc kubenswrapper[4907]: I0226 16:28:01.383985 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29535388-7pf82" event={"ID":"9755fa96-5b0e-4088-88ec-70ec3bb6121d","Type":"ContainerStarted","Data":"6728030b075e52a7a8723d3495a5d892b2ea23e54bee76b0d8623f653c0d4db7"} Feb 26 16:28:02 crc kubenswrapper[4907]: I0226 16:28:02.411161 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29535388-7pf82" event={"ID":"9755fa96-5b0e-4088-88ec-70ec3bb6121d","Type":"ContainerStarted","Data":"a54196bad514725e5c6f92ebdd204cee3fe70e1b81698b178c2679a7877e398a"} Feb 26 16:28:02 crc kubenswrapper[4907]: I0226 16:28:02.432503 4907 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-infra/auto-csr-approver-29535388-7pf82" podStartSLOduration=1.5331429 podStartE2EDuration="2.432478001s" podCreationTimestamp="2026-02-26 16:28:00 +0000 UTC" firstStartedPulling="2026-02-26 16:28:01.017723257 +0000 UTC m=+2743.536285106" lastFinishedPulling="2026-02-26 16:28:01.917058358 +0000 UTC 
m=+2744.435620207" observedRunningTime="2026-02-26 16:28:02.424887785 +0000 UTC m=+2744.943449634" watchObservedRunningTime="2026-02-26 16:28:02.432478001 +0000 UTC m=+2744.951039850" Feb 26 16:28:03 crc kubenswrapper[4907]: I0226 16:28:03.424505 4907 generic.go:334] "Generic (PLEG): container finished" podID="16415278-d48c-47a3-92b4-0dfb2da9c8ca" containerID="e85633773c7b1531186ce71112d4261149d7453b8792f4f7c706e488b4d2b1bb" exitCode=0 Feb 26 16:28:03 crc kubenswrapper[4907]: I0226 16:28:03.424579 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-klh96" event={"ID":"16415278-d48c-47a3-92b4-0dfb2da9c8ca","Type":"ContainerDied","Data":"e85633773c7b1531186ce71112d4261149d7453b8792f4f7c706e488b4d2b1bb"} Feb 26 16:28:03 crc kubenswrapper[4907]: I0226 16:28:03.428183 4907 generic.go:334] "Generic (PLEG): container finished" podID="9755fa96-5b0e-4088-88ec-70ec3bb6121d" containerID="a54196bad514725e5c6f92ebdd204cee3fe70e1b81698b178c2679a7877e398a" exitCode=0 Feb 26 16:28:03 crc kubenswrapper[4907]: I0226 16:28:03.428226 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29535388-7pf82" event={"ID":"9755fa96-5b0e-4088-88ec-70ec3bb6121d","Type":"ContainerDied","Data":"a54196bad514725e5c6f92ebdd204cee3fe70e1b81698b178c2679a7877e398a"} Feb 26 16:28:04 crc kubenswrapper[4907]: I0226 16:28:04.903252 4907 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29535388-7pf82" Feb 26 16:28:04 crc kubenswrapper[4907]: I0226 16:28:04.911229 4907 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-klh96" Feb 26 16:28:05 crc kubenswrapper[4907]: I0226 16:28:05.021417 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-cell1-compute-config-1\" (UniqueName: \"kubernetes.io/secret/16415278-d48c-47a3-92b4-0dfb2da9c8ca-nova-cell1-compute-config-1\") pod \"16415278-d48c-47a3-92b4-0dfb2da9c8ca\" (UID: \"16415278-d48c-47a3-92b4-0dfb2da9c8ca\") " Feb 26 16:28:05 crc kubenswrapper[4907]: I0226 16:28:05.021739 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qsvzm\" (UniqueName: \"kubernetes.io/projected/16415278-d48c-47a3-92b4-0dfb2da9c8ca-kube-api-access-qsvzm\") pod \"16415278-d48c-47a3-92b4-0dfb2da9c8ca\" (UID: \"16415278-d48c-47a3-92b4-0dfb2da9c8ca\") " Feb 26 16:28:05 crc kubenswrapper[4907]: I0226 16:28:05.021890 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-migration-ssh-key-0\" (UniqueName: \"kubernetes.io/secret/16415278-d48c-47a3-92b4-0dfb2da9c8ca-nova-migration-ssh-key-0\") pod \"16415278-d48c-47a3-92b4-0dfb2da9c8ca\" (UID: \"16415278-d48c-47a3-92b4-0dfb2da9c8ca\") " Feb 26 16:28:05 crc kubenswrapper[4907]: I0226 16:28:05.021994 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/16415278-d48c-47a3-92b4-0dfb2da9c8ca-ssh-key-openstack-edpm-ipam\") pod \"16415278-d48c-47a3-92b4-0dfb2da9c8ca\" (UID: \"16415278-d48c-47a3-92b4-0dfb2da9c8ca\") " Feb 26 16:28:05 crc kubenswrapper[4907]: I0226 16:28:05.022117 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/16415278-d48c-47a3-92b4-0dfb2da9c8ca-inventory\") pod \"16415278-d48c-47a3-92b4-0dfb2da9c8ca\" (UID: \"16415278-d48c-47a3-92b4-0dfb2da9c8ca\") " Feb 26 16:28:05 crc kubenswrapper[4907]: I0226 16:28:05.022210 4907 
reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-cell1-compute-config-3\" (UniqueName: \"kubernetes.io/secret/16415278-d48c-47a3-92b4-0dfb2da9c8ca-nova-cell1-compute-config-3\") pod \"16415278-d48c-47a3-92b4-0dfb2da9c8ca\" (UID: \"16415278-d48c-47a3-92b4-0dfb2da9c8ca\") " Feb 26 16:28:05 crc kubenswrapper[4907]: I0226 16:28:05.022285 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-459bq\" (UniqueName: \"kubernetes.io/projected/9755fa96-5b0e-4088-88ec-70ec3bb6121d-kube-api-access-459bq\") pod \"9755fa96-5b0e-4088-88ec-70ec3bb6121d\" (UID: \"9755fa96-5b0e-4088-88ec-70ec3bb6121d\") " Feb 26 16:28:05 crc kubenswrapper[4907]: I0226 16:28:05.022363 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/16415278-d48c-47a3-92b4-0dfb2da9c8ca-nova-combined-ca-bundle\") pod \"16415278-d48c-47a3-92b4-0dfb2da9c8ca\" (UID: \"16415278-d48c-47a3-92b4-0dfb2da9c8ca\") " Feb 26 16:28:05 crc kubenswrapper[4907]: I0226 16:28:05.022520 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-extra-config-0\" (UniqueName: \"kubernetes.io/configmap/16415278-d48c-47a3-92b4-0dfb2da9c8ca-nova-extra-config-0\") pod \"16415278-d48c-47a3-92b4-0dfb2da9c8ca\" (UID: \"16415278-d48c-47a3-92b4-0dfb2da9c8ca\") " Feb 26 16:28:05 crc kubenswrapper[4907]: I0226 16:28:05.022671 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-cell1-compute-config-0\" (UniqueName: \"kubernetes.io/secret/16415278-d48c-47a3-92b4-0dfb2da9c8ca-nova-cell1-compute-config-0\") pod \"16415278-d48c-47a3-92b4-0dfb2da9c8ca\" (UID: \"16415278-d48c-47a3-92b4-0dfb2da9c8ca\") " Feb 26 16:28:05 crc kubenswrapper[4907]: I0226 16:28:05.022751 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-migration-ssh-key-1\" (UniqueName: 
\"kubernetes.io/secret/16415278-d48c-47a3-92b4-0dfb2da9c8ca-nova-migration-ssh-key-1\") pod \"16415278-d48c-47a3-92b4-0dfb2da9c8ca\" (UID: \"16415278-d48c-47a3-92b4-0dfb2da9c8ca\") " Feb 26 16:28:05 crc kubenswrapper[4907]: I0226 16:28:05.022829 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-cell1-compute-config-2\" (UniqueName: \"kubernetes.io/secret/16415278-d48c-47a3-92b4-0dfb2da9c8ca-nova-cell1-compute-config-2\") pod \"16415278-d48c-47a3-92b4-0dfb2da9c8ca\" (UID: \"16415278-d48c-47a3-92b4-0dfb2da9c8ca\") " Feb 26 16:28:05 crc kubenswrapper[4907]: I0226 16:28:05.040443 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/16415278-d48c-47a3-92b4-0dfb2da9c8ca-kube-api-access-qsvzm" (OuterVolumeSpecName: "kube-api-access-qsvzm") pod "16415278-d48c-47a3-92b4-0dfb2da9c8ca" (UID: "16415278-d48c-47a3-92b4-0dfb2da9c8ca"). InnerVolumeSpecName "kube-api-access-qsvzm". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 26 16:28:05 crc kubenswrapper[4907]: I0226 16:28:05.047826 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/16415278-d48c-47a3-92b4-0dfb2da9c8ca-nova-combined-ca-bundle" (OuterVolumeSpecName: "nova-combined-ca-bundle") pod "16415278-d48c-47a3-92b4-0dfb2da9c8ca" (UID: "16415278-d48c-47a3-92b4-0dfb2da9c8ca"). InnerVolumeSpecName "nova-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 26 16:28:05 crc kubenswrapper[4907]: I0226 16:28:05.056896 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9755fa96-5b0e-4088-88ec-70ec3bb6121d-kube-api-access-459bq" (OuterVolumeSpecName: "kube-api-access-459bq") pod "9755fa96-5b0e-4088-88ec-70ec3bb6121d" (UID: "9755fa96-5b0e-4088-88ec-70ec3bb6121d"). InnerVolumeSpecName "kube-api-access-459bq". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 26 16:28:05 crc kubenswrapper[4907]: I0226 16:28:05.060316 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/16415278-d48c-47a3-92b4-0dfb2da9c8ca-nova-extra-config-0" (OuterVolumeSpecName: "nova-extra-config-0") pod "16415278-d48c-47a3-92b4-0dfb2da9c8ca" (UID: "16415278-d48c-47a3-92b4-0dfb2da9c8ca"). InnerVolumeSpecName "nova-extra-config-0". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 26 16:28:05 crc kubenswrapper[4907]: I0226 16:28:05.063184 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/16415278-d48c-47a3-92b4-0dfb2da9c8ca-nova-cell1-compute-config-1" (OuterVolumeSpecName: "nova-cell1-compute-config-1") pod "16415278-d48c-47a3-92b4-0dfb2da9c8ca" (UID: "16415278-d48c-47a3-92b4-0dfb2da9c8ca"). InnerVolumeSpecName "nova-cell1-compute-config-1". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 26 16:28:05 crc kubenswrapper[4907]: I0226 16:28:05.064411 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/16415278-d48c-47a3-92b4-0dfb2da9c8ca-nova-cell1-compute-config-2" (OuterVolumeSpecName: "nova-cell1-compute-config-2") pod "16415278-d48c-47a3-92b4-0dfb2da9c8ca" (UID: "16415278-d48c-47a3-92b4-0dfb2da9c8ca"). InnerVolumeSpecName "nova-cell1-compute-config-2". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 26 16:28:05 crc kubenswrapper[4907]: I0226 16:28:05.066927 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/16415278-d48c-47a3-92b4-0dfb2da9c8ca-ssh-key-openstack-edpm-ipam" (OuterVolumeSpecName: "ssh-key-openstack-edpm-ipam") pod "16415278-d48c-47a3-92b4-0dfb2da9c8ca" (UID: "16415278-d48c-47a3-92b4-0dfb2da9c8ca"). InnerVolumeSpecName "ssh-key-openstack-edpm-ipam". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 26 16:28:05 crc kubenswrapper[4907]: I0226 16:28:05.079409 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/16415278-d48c-47a3-92b4-0dfb2da9c8ca-inventory" (OuterVolumeSpecName: "inventory") pod "16415278-d48c-47a3-92b4-0dfb2da9c8ca" (UID: "16415278-d48c-47a3-92b4-0dfb2da9c8ca"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 26 16:28:05 crc kubenswrapper[4907]: I0226 16:28:05.083771 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/16415278-d48c-47a3-92b4-0dfb2da9c8ca-nova-cell1-compute-config-3" (OuterVolumeSpecName: "nova-cell1-compute-config-3") pod "16415278-d48c-47a3-92b4-0dfb2da9c8ca" (UID: "16415278-d48c-47a3-92b4-0dfb2da9c8ca"). InnerVolumeSpecName "nova-cell1-compute-config-3". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 26 16:28:05 crc kubenswrapper[4907]: I0226 16:28:05.089839 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/16415278-d48c-47a3-92b4-0dfb2da9c8ca-nova-migration-ssh-key-1" (OuterVolumeSpecName: "nova-migration-ssh-key-1") pod "16415278-d48c-47a3-92b4-0dfb2da9c8ca" (UID: "16415278-d48c-47a3-92b4-0dfb2da9c8ca"). InnerVolumeSpecName "nova-migration-ssh-key-1". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 26 16:28:05 crc kubenswrapper[4907]: I0226 16:28:05.097305 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/16415278-d48c-47a3-92b4-0dfb2da9c8ca-nova-migration-ssh-key-0" (OuterVolumeSpecName: "nova-migration-ssh-key-0") pod "16415278-d48c-47a3-92b4-0dfb2da9c8ca" (UID: "16415278-d48c-47a3-92b4-0dfb2da9c8ca"). InnerVolumeSpecName "nova-migration-ssh-key-0". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 26 16:28:05 crc kubenswrapper[4907]: I0226 16:28:05.097863 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/16415278-d48c-47a3-92b4-0dfb2da9c8ca-nova-cell1-compute-config-0" (OuterVolumeSpecName: "nova-cell1-compute-config-0") pod "16415278-d48c-47a3-92b4-0dfb2da9c8ca" (UID: "16415278-d48c-47a3-92b4-0dfb2da9c8ca"). InnerVolumeSpecName "nova-cell1-compute-config-0". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 26 16:28:05 crc kubenswrapper[4907]: I0226 16:28:05.124695 4907 reconciler_common.go:293] "Volume detached for volume \"nova-migration-ssh-key-0\" (UniqueName: \"kubernetes.io/secret/16415278-d48c-47a3-92b4-0dfb2da9c8ca-nova-migration-ssh-key-0\") on node \"crc\" DevicePath \"\"" Feb 26 16:28:05 crc kubenswrapper[4907]: I0226 16:28:05.124728 4907 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/16415278-d48c-47a3-92b4-0dfb2da9c8ca-ssh-key-openstack-edpm-ipam\") on node \"crc\" DevicePath \"\"" Feb 26 16:28:05 crc kubenswrapper[4907]: I0226 16:28:05.124738 4907 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/16415278-d48c-47a3-92b4-0dfb2da9c8ca-inventory\") on node \"crc\" DevicePath \"\"" Feb 26 16:28:05 crc kubenswrapper[4907]: I0226 16:28:05.124747 4907 reconciler_common.go:293] "Volume detached for volume \"nova-cell1-compute-config-3\" (UniqueName: \"kubernetes.io/secret/16415278-d48c-47a3-92b4-0dfb2da9c8ca-nova-cell1-compute-config-3\") on node \"crc\" DevicePath \"\"" Feb 26 16:28:05 crc kubenswrapper[4907]: I0226 16:28:05.124756 4907 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-459bq\" (UniqueName: \"kubernetes.io/projected/9755fa96-5b0e-4088-88ec-70ec3bb6121d-kube-api-access-459bq\") on node \"crc\" DevicePath \"\"" Feb 26 16:28:05 crc kubenswrapper[4907]: I0226 16:28:05.124764 4907 
reconciler_common.go:293] "Volume detached for volume \"nova-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/16415278-d48c-47a3-92b4-0dfb2da9c8ca-nova-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 26 16:28:05 crc kubenswrapper[4907]: I0226 16:28:05.124774 4907 reconciler_common.go:293] "Volume detached for volume \"nova-extra-config-0\" (UniqueName: \"kubernetes.io/configmap/16415278-d48c-47a3-92b4-0dfb2da9c8ca-nova-extra-config-0\") on node \"crc\" DevicePath \"\"" Feb 26 16:28:05 crc kubenswrapper[4907]: I0226 16:28:05.124783 4907 reconciler_common.go:293] "Volume detached for volume \"nova-cell1-compute-config-0\" (UniqueName: \"kubernetes.io/secret/16415278-d48c-47a3-92b4-0dfb2da9c8ca-nova-cell1-compute-config-0\") on node \"crc\" DevicePath \"\"" Feb 26 16:28:05 crc kubenswrapper[4907]: I0226 16:28:05.124791 4907 reconciler_common.go:293] "Volume detached for volume \"nova-migration-ssh-key-1\" (UniqueName: \"kubernetes.io/secret/16415278-d48c-47a3-92b4-0dfb2da9c8ca-nova-migration-ssh-key-1\") on node \"crc\" DevicePath \"\"" Feb 26 16:28:05 crc kubenswrapper[4907]: I0226 16:28:05.124799 4907 reconciler_common.go:293] "Volume detached for volume \"nova-cell1-compute-config-2\" (UniqueName: \"kubernetes.io/secret/16415278-d48c-47a3-92b4-0dfb2da9c8ca-nova-cell1-compute-config-2\") on node \"crc\" DevicePath \"\"" Feb 26 16:28:05 crc kubenswrapper[4907]: I0226 16:28:05.124806 4907 reconciler_common.go:293] "Volume detached for volume \"nova-cell1-compute-config-1\" (UniqueName: \"kubernetes.io/secret/16415278-d48c-47a3-92b4-0dfb2da9c8ca-nova-cell1-compute-config-1\") on node \"crc\" DevicePath \"\"" Feb 26 16:28:05 crc kubenswrapper[4907]: I0226 16:28:05.124816 4907 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qsvzm\" (UniqueName: \"kubernetes.io/projected/16415278-d48c-47a3-92b4-0dfb2da9c8ca-kube-api-access-qsvzm\") on node \"crc\" DevicePath \"\"" Feb 26 16:28:05 crc kubenswrapper[4907]: I0226 16:28:05.449542 
4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-klh96" event={"ID":"16415278-d48c-47a3-92b4-0dfb2da9c8ca","Type":"ContainerDied","Data":"63e4692fd5439e025636fa0ada42189c75a7ca61525e3773693fd092154a09a6"} Feb 26 16:28:05 crc kubenswrapper[4907]: I0226 16:28:05.449838 4907 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="63e4692fd5439e025636fa0ada42189c75a7ca61525e3773693fd092154a09a6" Feb 26 16:28:05 crc kubenswrapper[4907]: I0226 16:28:05.449638 4907 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-klh96" Feb 26 16:28:05 crc kubenswrapper[4907]: I0226 16:28:05.451367 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29535388-7pf82" event={"ID":"9755fa96-5b0e-4088-88ec-70ec3bb6121d","Type":"ContainerDied","Data":"6728030b075e52a7a8723d3495a5d892b2ea23e54bee76b0d8623f653c0d4db7"} Feb 26 16:28:05 crc kubenswrapper[4907]: I0226 16:28:05.451489 4907 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="6728030b075e52a7a8723d3495a5d892b2ea23e54bee76b0d8623f653c0d4db7" Feb 26 16:28:05 crc kubenswrapper[4907]: I0226 16:28:05.451663 4907 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29535388-7pf82" Feb 26 16:28:05 crc kubenswrapper[4907]: I0226 16:28:05.541676 4907 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29535382-2bh55"] Feb 26 16:28:05 crc kubenswrapper[4907]: I0226 16:28:05.551885 4907 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29535382-2bh55"] Feb 26 16:28:05 crc kubenswrapper[4907]: I0226 16:28:05.607930 4907 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/telemetry-edpm-deployment-openstack-edpm-ipam-4wnq2"] Feb 26 16:28:05 crc kubenswrapper[4907]: E0226 16:28:05.608388 4907 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="16415278-d48c-47a3-92b4-0dfb2da9c8ca" containerName="nova-edpm-deployment-openstack-edpm-ipam" Feb 26 16:28:05 crc kubenswrapper[4907]: I0226 16:28:05.608411 4907 state_mem.go:107] "Deleted CPUSet assignment" podUID="16415278-d48c-47a3-92b4-0dfb2da9c8ca" containerName="nova-edpm-deployment-openstack-edpm-ipam" Feb 26 16:28:05 crc kubenswrapper[4907]: E0226 16:28:05.608426 4907 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9755fa96-5b0e-4088-88ec-70ec3bb6121d" containerName="oc" Feb 26 16:28:05 crc kubenswrapper[4907]: I0226 16:28:05.608433 4907 state_mem.go:107] "Deleted CPUSet assignment" podUID="9755fa96-5b0e-4088-88ec-70ec3bb6121d" containerName="oc" Feb 26 16:28:05 crc kubenswrapper[4907]: I0226 16:28:05.608698 4907 memory_manager.go:354] "RemoveStaleState removing state" podUID="9755fa96-5b0e-4088-88ec-70ec3bb6121d" containerName="oc" Feb 26 16:28:05 crc kubenswrapper[4907]: I0226 16:28:05.608721 4907 memory_manager.go:354] "RemoveStaleState removing state" podUID="16415278-d48c-47a3-92b4-0dfb2da9c8ca" containerName="nova-edpm-deployment-openstack-edpm-ipam" Feb 26 16:28:05 crc kubenswrapper[4907]: I0226 16:28:05.609475 4907 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-4wnq2" Feb 26 16:28:05 crc kubenswrapper[4907]: I0226 16:28:05.613998 4907 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-compute-config-data" Feb 26 16:28:05 crc kubenswrapper[4907]: I0226 16:28:05.614024 4907 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Feb 26 16:28:05 crc kubenswrapper[4907]: I0226 16:28:05.614538 4907 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-57jxc" Feb 26 16:28:05 crc kubenswrapper[4907]: I0226 16:28:05.614745 4907 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Feb 26 16:28:05 crc kubenswrapper[4907]: I0226 16:28:05.614946 4907 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Feb 26 16:28:05 crc kubenswrapper[4907]: I0226 16:28:05.622945 4907 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/telemetry-edpm-deployment-openstack-edpm-ipam-4wnq2"] Feb 26 16:28:05 crc kubenswrapper[4907]: I0226 16:28:05.735688 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rvnt7\" (UniqueName: \"kubernetes.io/projected/2483c310-db88-4757-857d-91e2815bbe67-kube-api-access-rvnt7\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-4wnq2\" (UID: \"2483c310-db88-4757-857d-91e2815bbe67\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-4wnq2" Feb 26 16:28:05 crc kubenswrapper[4907]: I0226 16:28:05.735772 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceilometer-compute-config-data-0\" (UniqueName: \"kubernetes.io/secret/2483c310-db88-4757-857d-91e2815bbe67-ceilometer-compute-config-data-0\") pod 
\"telemetry-edpm-deployment-openstack-edpm-ipam-4wnq2\" (UID: \"2483c310-db88-4757-857d-91e2815bbe67\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-4wnq2" Feb 26 16:28:05 crc kubenswrapper[4907]: I0226 16:28:05.735807 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceilometer-compute-config-data-2\" (UniqueName: \"kubernetes.io/secret/2483c310-db88-4757-857d-91e2815bbe67-ceilometer-compute-config-data-2\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-4wnq2\" (UID: \"2483c310-db88-4757-857d-91e2815bbe67\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-4wnq2" Feb 26 16:28:05 crc kubenswrapper[4907]: I0226 16:28:05.735836 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/2483c310-db88-4757-857d-91e2815bbe67-ssh-key-openstack-edpm-ipam\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-4wnq2\" (UID: \"2483c310-db88-4757-857d-91e2815bbe67\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-4wnq2" Feb 26 16:28:05 crc kubenswrapper[4907]: I0226 16:28:05.735916 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceilometer-compute-config-data-1\" (UniqueName: \"kubernetes.io/secret/2483c310-db88-4757-857d-91e2815bbe67-ceilometer-compute-config-data-1\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-4wnq2\" (UID: \"2483c310-db88-4757-857d-91e2815bbe67\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-4wnq2" Feb 26 16:28:05 crc kubenswrapper[4907]: I0226 16:28:05.735941 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"telemetry-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2483c310-db88-4757-857d-91e2815bbe67-telemetry-combined-ca-bundle\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-4wnq2\" (UID: 
\"2483c310-db88-4757-857d-91e2815bbe67\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-4wnq2" Feb 26 16:28:05 crc kubenswrapper[4907]: I0226 16:28:05.736253 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/2483c310-db88-4757-857d-91e2815bbe67-inventory\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-4wnq2\" (UID: \"2483c310-db88-4757-857d-91e2815bbe67\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-4wnq2" Feb 26 16:28:05 crc kubenswrapper[4907]: I0226 16:28:05.838015 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rvnt7\" (UniqueName: \"kubernetes.io/projected/2483c310-db88-4757-857d-91e2815bbe67-kube-api-access-rvnt7\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-4wnq2\" (UID: \"2483c310-db88-4757-857d-91e2815bbe67\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-4wnq2" Feb 26 16:28:05 crc kubenswrapper[4907]: I0226 16:28:05.838111 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceilometer-compute-config-data-0\" (UniqueName: \"kubernetes.io/secret/2483c310-db88-4757-857d-91e2815bbe67-ceilometer-compute-config-data-0\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-4wnq2\" (UID: \"2483c310-db88-4757-857d-91e2815bbe67\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-4wnq2" Feb 26 16:28:05 crc kubenswrapper[4907]: I0226 16:28:05.838177 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/2483c310-db88-4757-857d-91e2815bbe67-ssh-key-openstack-edpm-ipam\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-4wnq2\" (UID: \"2483c310-db88-4757-857d-91e2815bbe67\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-4wnq2" Feb 26 16:28:05 crc kubenswrapper[4907]: I0226 16:28:05.838200 4907 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceilometer-compute-config-data-2\" (UniqueName: \"kubernetes.io/secret/2483c310-db88-4757-857d-91e2815bbe67-ceilometer-compute-config-data-2\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-4wnq2\" (UID: \"2483c310-db88-4757-857d-91e2815bbe67\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-4wnq2" Feb 26 16:28:05 crc kubenswrapper[4907]: I0226 16:28:05.838273 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceilometer-compute-config-data-1\" (UniqueName: \"kubernetes.io/secret/2483c310-db88-4757-857d-91e2815bbe67-ceilometer-compute-config-data-1\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-4wnq2\" (UID: \"2483c310-db88-4757-857d-91e2815bbe67\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-4wnq2" Feb 26 16:28:05 crc kubenswrapper[4907]: I0226 16:28:05.838291 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"telemetry-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2483c310-db88-4757-857d-91e2815bbe67-telemetry-combined-ca-bundle\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-4wnq2\" (UID: \"2483c310-db88-4757-857d-91e2815bbe67\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-4wnq2" Feb 26 16:28:05 crc kubenswrapper[4907]: I0226 16:28:05.838364 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/2483c310-db88-4757-857d-91e2815bbe67-inventory\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-4wnq2\" (UID: \"2483c310-db88-4757-857d-91e2815bbe67\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-4wnq2" Feb 26 16:28:05 crc kubenswrapper[4907]: I0226 16:28:05.842813 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceilometer-compute-config-data-1\" (UniqueName: 
\"kubernetes.io/secret/2483c310-db88-4757-857d-91e2815bbe67-ceilometer-compute-config-data-1\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-4wnq2\" (UID: \"2483c310-db88-4757-857d-91e2815bbe67\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-4wnq2" Feb 26 16:28:05 crc kubenswrapper[4907]: I0226 16:28:05.842929 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/2483c310-db88-4757-857d-91e2815bbe67-ssh-key-openstack-edpm-ipam\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-4wnq2\" (UID: \"2483c310-db88-4757-857d-91e2815bbe67\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-4wnq2" Feb 26 16:28:05 crc kubenswrapper[4907]: I0226 16:28:05.843233 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/2483c310-db88-4757-857d-91e2815bbe67-inventory\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-4wnq2\" (UID: \"2483c310-db88-4757-857d-91e2815bbe67\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-4wnq2" Feb 26 16:28:05 crc kubenswrapper[4907]: I0226 16:28:05.843995 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceilometer-compute-config-data-2\" (UniqueName: \"kubernetes.io/secret/2483c310-db88-4757-857d-91e2815bbe67-ceilometer-compute-config-data-2\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-4wnq2\" (UID: \"2483c310-db88-4757-857d-91e2815bbe67\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-4wnq2" Feb 26 16:28:05 crc kubenswrapper[4907]: I0226 16:28:05.848210 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceilometer-compute-config-data-0\" (UniqueName: \"kubernetes.io/secret/2483c310-db88-4757-857d-91e2815bbe67-ceilometer-compute-config-data-0\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-4wnq2\" (UID: \"2483c310-db88-4757-857d-91e2815bbe67\") " 
pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-4wnq2" Feb 26 16:28:05 crc kubenswrapper[4907]: I0226 16:28:05.851332 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"telemetry-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2483c310-db88-4757-857d-91e2815bbe67-telemetry-combined-ca-bundle\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-4wnq2\" (UID: \"2483c310-db88-4757-857d-91e2815bbe67\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-4wnq2" Feb 26 16:28:05 crc kubenswrapper[4907]: I0226 16:28:05.857165 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rvnt7\" (UniqueName: \"kubernetes.io/projected/2483c310-db88-4757-857d-91e2815bbe67-kube-api-access-rvnt7\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-4wnq2\" (UID: \"2483c310-db88-4757-857d-91e2815bbe67\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-4wnq2" Feb 26 16:28:05 crc kubenswrapper[4907]: I0226 16:28:05.933580 4907 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-4wnq2" Feb 26 16:28:06 crc kubenswrapper[4907]: I0226 16:28:06.140378 4907 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="838f8fc1-6bdf-4fce-ab55-0da69d8f9d7d" path="/var/lib/kubelet/pods/838f8fc1-6bdf-4fce-ab55-0da69d8f9d7d/volumes" Feb 26 16:28:06 crc kubenswrapper[4907]: I0226 16:28:06.483065 4907 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/telemetry-edpm-deployment-openstack-edpm-ipam-4wnq2"] Feb 26 16:28:06 crc kubenswrapper[4907]: W0226 16:28:06.490141 4907 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod2483c310_db88_4757_857d_91e2815bbe67.slice/crio-0b00de7d2871974b72b0c4e4ea8b859344cc991f6f3136d4d7a2e5c8b358dd7d WatchSource:0}: Error finding container 0b00de7d2871974b72b0c4e4ea8b859344cc991f6f3136d4d7a2e5c8b358dd7d: Status 404 returned error can't find the container with id 0b00de7d2871974b72b0c4e4ea8b859344cc991f6f3136d4d7a2e5c8b358dd7d Feb 26 16:28:07 crc kubenswrapper[4907]: I0226 16:28:07.469797 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-4wnq2" event={"ID":"2483c310-db88-4757-857d-91e2815bbe67","Type":"ContainerStarted","Data":"01a3caffe83da211057576d6a9486c51b6734df195fcf6c515571dd452c2693a"} Feb 26 16:28:07 crc kubenswrapper[4907]: I0226 16:28:07.471585 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-4wnq2" event={"ID":"2483c310-db88-4757-857d-91e2815bbe67","Type":"ContainerStarted","Data":"0b00de7d2871974b72b0c4e4ea8b859344cc991f6f3136d4d7a2e5c8b358dd7d"} Feb 26 16:28:07 crc kubenswrapper[4907]: I0226 16:28:07.505036 4907 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-4wnq2" podStartSLOduration=2.011435284 
podStartE2EDuration="2.504984183s" podCreationTimestamp="2026-02-26 16:28:05 +0000 UTC" firstStartedPulling="2026-02-26 16:28:06.49310376 +0000 UTC m=+2749.011665609" lastFinishedPulling="2026-02-26 16:28:06.986652599 +0000 UTC m=+2749.505214508" observedRunningTime="2026-02-26 16:28:07.489336731 +0000 UTC m=+2750.007898610" watchObservedRunningTime="2026-02-26 16:28:07.504984183 +0000 UTC m=+2750.023546062" Feb 26 16:28:11 crc kubenswrapper[4907]: I0226 16:28:11.185671 4907 scope.go:117] "RemoveContainer" containerID="015d8fbbf88748c71dfab4af205f314f429c130435d235bff9838b633676221f" Feb 26 16:28:18 crc kubenswrapper[4907]: I0226 16:28:18.530176 4907 patch_prober.go:28] interesting pod/machine-config-daemon-v5ng6 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 26 16:28:18 crc kubenswrapper[4907]: I0226 16:28:18.530804 4907 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-v5ng6" podUID="917eebf3-db36-47b8-af0a-b80d042fddab" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 26 16:28:48 crc kubenswrapper[4907]: I0226 16:28:48.530240 4907 patch_prober.go:28] interesting pod/machine-config-daemon-v5ng6 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 26 16:28:48 crc kubenswrapper[4907]: I0226 16:28:48.530850 4907 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-v5ng6" podUID="917eebf3-db36-47b8-af0a-b80d042fddab" containerName="machine-config-daemon" probeResult="failure" output="Get 
\"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 26 16:28:48 crc kubenswrapper[4907]: I0226 16:28:48.530910 4907 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-v5ng6" Feb 26 16:28:48 crc kubenswrapper[4907]: I0226 16:28:48.531890 4907 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"75b5efd2017cdc33ecaf179acb64b81d3ecdff3a0779fa753362a3be77de0f3d"} pod="openshift-machine-config-operator/machine-config-daemon-v5ng6" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Feb 26 16:28:48 crc kubenswrapper[4907]: I0226 16:28:48.531964 4907 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-v5ng6" podUID="917eebf3-db36-47b8-af0a-b80d042fddab" containerName="machine-config-daemon" containerID="cri-o://75b5efd2017cdc33ecaf179acb64b81d3ecdff3a0779fa753362a3be77de0f3d" gracePeriod=600 Feb 26 16:28:48 crc kubenswrapper[4907]: I0226 16:28:48.848299 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-v5ng6" event={"ID":"917eebf3-db36-47b8-af0a-b80d042fddab","Type":"ContainerDied","Data":"75b5efd2017cdc33ecaf179acb64b81d3ecdff3a0779fa753362a3be77de0f3d"} Feb 26 16:28:48 crc kubenswrapper[4907]: I0226 16:28:48.849148 4907 scope.go:117] "RemoveContainer" containerID="559a23ccedd7d10bc357288f1f6efb2cbce2fbfb3e7c59e80318d9e7c716e085" Feb 26 16:28:48 crc kubenswrapper[4907]: I0226 16:28:48.848251 4907 generic.go:334] "Generic (PLEG): container finished" podID="917eebf3-db36-47b8-af0a-b80d042fddab" containerID="75b5efd2017cdc33ecaf179acb64b81d3ecdff3a0779fa753362a3be77de0f3d" exitCode=0 Feb 26 16:28:49 crc kubenswrapper[4907]: I0226 16:28:49.861185 4907 kubelet.go:2453] "SyncLoop (PLEG): 
event for pod" pod="openshift-machine-config-operator/machine-config-daemon-v5ng6" event={"ID":"917eebf3-db36-47b8-af0a-b80d042fddab","Type":"ContainerStarted","Data":"6906cab653cd658cba31211ccc435500afa0d86f92cee413c3d24942f2acd8bd"} Feb 26 16:30:00 crc kubenswrapper[4907]: I0226 16:30:00.209851 4907 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29535390-vhjq2"] Feb 26 16:30:00 crc kubenswrapper[4907]: I0226 16:30:00.212037 4907 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29535390-vhjq2" Feb 26 16:30:00 crc kubenswrapper[4907]: I0226 16:30:00.221020 4907 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Feb 26 16:30:00 crc kubenswrapper[4907]: I0226 16:30:00.221061 4907 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-n2mrp" Feb 26 16:30:00 crc kubenswrapper[4907]: I0226 16:30:00.221102 4907 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Feb 26 16:30:00 crc kubenswrapper[4907]: I0226 16:30:00.226573 4907 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29535390-btljm"] Feb 26 16:30:00 crc kubenswrapper[4907]: I0226 16:30:00.229629 4907 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29535390-btljm" Feb 26 16:30:00 crc kubenswrapper[4907]: I0226 16:30:00.238159 4907 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t" Feb 26 16:30:00 crc kubenswrapper[4907]: I0226 16:30:00.238227 4907 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config" Feb 26 16:30:00 crc kubenswrapper[4907]: I0226 16:30:00.239326 4907 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29535390-vhjq2"] Feb 26 16:30:00 crc kubenswrapper[4907]: I0226 16:30:00.252694 4907 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29535390-btljm"] Feb 26 16:30:00 crc kubenswrapper[4907]: I0226 16:30:00.266333 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/c1c526eb-200e-4782-86fa-62ed06c130f2-config-volume\") pod \"collect-profiles-29535390-btljm\" (UID: \"c1c526eb-200e-4782-86fa-62ed06c130f2\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29535390-btljm" Feb 26 16:30:00 crc kubenswrapper[4907]: I0226 16:30:00.266420 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/c1c526eb-200e-4782-86fa-62ed06c130f2-secret-volume\") pod \"collect-profiles-29535390-btljm\" (UID: \"c1c526eb-200e-4782-86fa-62ed06c130f2\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29535390-btljm" Feb 26 16:30:00 crc kubenswrapper[4907]: I0226 16:30:00.266450 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mlfcq\" (UniqueName: 
\"kubernetes.io/projected/2e01c029-98ef-4ec1-a3ae-4697f2276293-kube-api-access-mlfcq\") pod \"auto-csr-approver-29535390-vhjq2\" (UID: \"2e01c029-98ef-4ec1-a3ae-4697f2276293\") " pod="openshift-infra/auto-csr-approver-29535390-vhjq2" Feb 26 16:30:00 crc kubenswrapper[4907]: I0226 16:30:00.266562 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-znzjb\" (UniqueName: \"kubernetes.io/projected/c1c526eb-200e-4782-86fa-62ed06c130f2-kube-api-access-znzjb\") pod \"collect-profiles-29535390-btljm\" (UID: \"c1c526eb-200e-4782-86fa-62ed06c130f2\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29535390-btljm" Feb 26 16:30:00 crc kubenswrapper[4907]: I0226 16:30:00.368408 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/c1c526eb-200e-4782-86fa-62ed06c130f2-secret-volume\") pod \"collect-profiles-29535390-btljm\" (UID: \"c1c526eb-200e-4782-86fa-62ed06c130f2\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29535390-btljm" Feb 26 16:30:00 crc kubenswrapper[4907]: I0226 16:30:00.368471 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mlfcq\" (UniqueName: \"kubernetes.io/projected/2e01c029-98ef-4ec1-a3ae-4697f2276293-kube-api-access-mlfcq\") pod \"auto-csr-approver-29535390-vhjq2\" (UID: \"2e01c029-98ef-4ec1-a3ae-4697f2276293\") " pod="openshift-infra/auto-csr-approver-29535390-vhjq2" Feb 26 16:30:00 crc kubenswrapper[4907]: I0226 16:30:00.368670 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-znzjb\" (UniqueName: \"kubernetes.io/projected/c1c526eb-200e-4782-86fa-62ed06c130f2-kube-api-access-znzjb\") pod \"collect-profiles-29535390-btljm\" (UID: \"c1c526eb-200e-4782-86fa-62ed06c130f2\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29535390-btljm" Feb 26 16:30:00 crc kubenswrapper[4907]: 
I0226 16:30:00.368744 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/c1c526eb-200e-4782-86fa-62ed06c130f2-config-volume\") pod \"collect-profiles-29535390-btljm\" (UID: \"c1c526eb-200e-4782-86fa-62ed06c130f2\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29535390-btljm" Feb 26 16:30:00 crc kubenswrapper[4907]: I0226 16:30:00.369697 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/c1c526eb-200e-4782-86fa-62ed06c130f2-config-volume\") pod \"collect-profiles-29535390-btljm\" (UID: \"c1c526eb-200e-4782-86fa-62ed06c130f2\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29535390-btljm" Feb 26 16:30:00 crc kubenswrapper[4907]: I0226 16:30:00.377231 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/c1c526eb-200e-4782-86fa-62ed06c130f2-secret-volume\") pod \"collect-profiles-29535390-btljm\" (UID: \"c1c526eb-200e-4782-86fa-62ed06c130f2\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29535390-btljm" Feb 26 16:30:00 crc kubenswrapper[4907]: I0226 16:30:00.397273 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mlfcq\" (UniqueName: \"kubernetes.io/projected/2e01c029-98ef-4ec1-a3ae-4697f2276293-kube-api-access-mlfcq\") pod \"auto-csr-approver-29535390-vhjq2\" (UID: \"2e01c029-98ef-4ec1-a3ae-4697f2276293\") " pod="openshift-infra/auto-csr-approver-29535390-vhjq2" Feb 26 16:30:00 crc kubenswrapper[4907]: I0226 16:30:00.399770 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-znzjb\" (UniqueName: \"kubernetes.io/projected/c1c526eb-200e-4782-86fa-62ed06c130f2-kube-api-access-znzjb\") pod \"collect-profiles-29535390-btljm\" (UID: \"c1c526eb-200e-4782-86fa-62ed06c130f2\") " 
pod="openshift-operator-lifecycle-manager/collect-profiles-29535390-btljm" Feb 26 16:30:00 crc kubenswrapper[4907]: I0226 16:30:00.540390 4907 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29535390-vhjq2" Feb 26 16:30:00 crc kubenswrapper[4907]: I0226 16:30:00.567421 4907 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29535390-btljm" Feb 26 16:30:01 crc kubenswrapper[4907]: I0226 16:30:01.045032 4907 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29535390-vhjq2"] Feb 26 16:30:01 crc kubenswrapper[4907]: I0226 16:30:01.045978 4907 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Feb 26 16:30:01 crc kubenswrapper[4907]: I0226 16:30:01.166297 4907 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29535390-btljm"] Feb 26 16:30:01 crc kubenswrapper[4907]: I0226 16:30:01.490120 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29535390-btljm" event={"ID":"c1c526eb-200e-4782-86fa-62ed06c130f2","Type":"ContainerStarted","Data":"fc78faa38a74988e253a8611cb0399266d96eae2bf896c092080e5c035152780"} Feb 26 16:30:01 crc kubenswrapper[4907]: I0226 16:30:01.490164 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29535390-btljm" event={"ID":"c1c526eb-200e-4782-86fa-62ed06c130f2","Type":"ContainerStarted","Data":"dc3c82d81b9fe57f81e1144e3c9c2e529161b4c8e70432fb5839505f10e5ff17"} Feb 26 16:30:01 crc kubenswrapper[4907]: I0226 16:30:01.493953 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29535390-vhjq2" 
event={"ID":"2e01c029-98ef-4ec1-a3ae-4697f2276293","Type":"ContainerStarted","Data":"8befbb88a38a19c507753cebee7f7263a19b3f08c247283933baf332fe9ee5cd"} Feb 26 16:30:01 crc kubenswrapper[4907]: I0226 16:30:01.511860 4907 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/collect-profiles-29535390-btljm" podStartSLOduration=1.5118436339999999 podStartE2EDuration="1.511843634s" podCreationTimestamp="2026-02-26 16:30:00 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-26 16:30:01.508368059 +0000 UTC m=+2864.026929908" watchObservedRunningTime="2026-02-26 16:30:01.511843634 +0000 UTC m=+2864.030405483" Feb 26 16:30:02 crc kubenswrapper[4907]: I0226 16:30:02.505518 4907 generic.go:334] "Generic (PLEG): container finished" podID="c1c526eb-200e-4782-86fa-62ed06c130f2" containerID="fc78faa38a74988e253a8611cb0399266d96eae2bf896c092080e5c035152780" exitCode=0 Feb 26 16:30:02 crc kubenswrapper[4907]: I0226 16:30:02.505867 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29535390-btljm" event={"ID":"c1c526eb-200e-4782-86fa-62ed06c130f2","Type":"ContainerDied","Data":"fc78faa38a74988e253a8611cb0399266d96eae2bf896c092080e5c035152780"} Feb 26 16:30:03 crc kubenswrapper[4907]: I0226 16:30:03.524953 4907 generic.go:334] "Generic (PLEG): container finished" podID="2e01c029-98ef-4ec1-a3ae-4697f2276293" containerID="d70a830fb10dff28153179da02012a119d15c585609baa0a010ee82fb7ec527c" exitCode=0 Feb 26 16:30:03 crc kubenswrapper[4907]: I0226 16:30:03.525444 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29535390-vhjq2" event={"ID":"2e01c029-98ef-4ec1-a3ae-4697f2276293","Type":"ContainerDied","Data":"d70a830fb10dff28153179da02012a119d15c585609baa0a010ee82fb7ec527c"} Feb 26 16:30:03 crc kubenswrapper[4907]: I0226 16:30:03.895180 
4907 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29535390-btljm" Feb 26 16:30:04 crc kubenswrapper[4907]: I0226 16:30:04.033697 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-znzjb\" (UniqueName: \"kubernetes.io/projected/c1c526eb-200e-4782-86fa-62ed06c130f2-kube-api-access-znzjb\") pod \"c1c526eb-200e-4782-86fa-62ed06c130f2\" (UID: \"c1c526eb-200e-4782-86fa-62ed06c130f2\") " Feb 26 16:30:04 crc kubenswrapper[4907]: I0226 16:30:04.034038 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/c1c526eb-200e-4782-86fa-62ed06c130f2-secret-volume\") pod \"c1c526eb-200e-4782-86fa-62ed06c130f2\" (UID: \"c1c526eb-200e-4782-86fa-62ed06c130f2\") " Feb 26 16:30:04 crc kubenswrapper[4907]: I0226 16:30:04.034181 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/c1c526eb-200e-4782-86fa-62ed06c130f2-config-volume\") pod \"c1c526eb-200e-4782-86fa-62ed06c130f2\" (UID: \"c1c526eb-200e-4782-86fa-62ed06c130f2\") " Feb 26 16:30:04 crc kubenswrapper[4907]: I0226 16:30:04.034897 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c1c526eb-200e-4782-86fa-62ed06c130f2-config-volume" (OuterVolumeSpecName: "config-volume") pod "c1c526eb-200e-4782-86fa-62ed06c130f2" (UID: "c1c526eb-200e-4782-86fa-62ed06c130f2"). InnerVolumeSpecName "config-volume". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 26 16:30:04 crc kubenswrapper[4907]: I0226 16:30:04.041444 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c1c526eb-200e-4782-86fa-62ed06c130f2-kube-api-access-znzjb" (OuterVolumeSpecName: "kube-api-access-znzjb") pod "c1c526eb-200e-4782-86fa-62ed06c130f2" (UID: "c1c526eb-200e-4782-86fa-62ed06c130f2"). InnerVolumeSpecName "kube-api-access-znzjb". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 26 16:30:04 crc kubenswrapper[4907]: I0226 16:30:04.042812 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c1c526eb-200e-4782-86fa-62ed06c130f2-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "c1c526eb-200e-4782-86fa-62ed06c130f2" (UID: "c1c526eb-200e-4782-86fa-62ed06c130f2"). InnerVolumeSpecName "secret-volume". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 26 16:30:04 crc kubenswrapper[4907]: I0226 16:30:04.143892 4907 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-znzjb\" (UniqueName: \"kubernetes.io/projected/c1c526eb-200e-4782-86fa-62ed06c130f2-kube-api-access-znzjb\") on node \"crc\" DevicePath \"\"" Feb 26 16:30:04 crc kubenswrapper[4907]: I0226 16:30:04.143928 4907 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/c1c526eb-200e-4782-86fa-62ed06c130f2-secret-volume\") on node \"crc\" DevicePath \"\"" Feb 26 16:30:04 crc kubenswrapper[4907]: I0226 16:30:04.143940 4907 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/c1c526eb-200e-4782-86fa-62ed06c130f2-config-volume\") on node \"crc\" DevicePath \"\"" Feb 26 16:30:04 crc kubenswrapper[4907]: I0226 16:30:04.230889 4907 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29535345-b7r88"] Feb 26 16:30:04 crc kubenswrapper[4907]: 
I0226 16:30:04.244875 4907 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29535345-b7r88"] Feb 26 16:30:04 crc kubenswrapper[4907]: I0226 16:30:04.554697 4907 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29535390-btljm" Feb 26 16:30:04 crc kubenswrapper[4907]: I0226 16:30:04.555631 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29535390-btljm" event={"ID":"c1c526eb-200e-4782-86fa-62ed06c130f2","Type":"ContainerDied","Data":"dc3c82d81b9fe57f81e1144e3c9c2e529161b4c8e70432fb5839505f10e5ff17"} Feb 26 16:30:04 crc kubenswrapper[4907]: I0226 16:30:04.557535 4907 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="dc3c82d81b9fe57f81e1144e3c9c2e529161b4c8e70432fb5839505f10e5ff17" Feb 26 16:30:04 crc kubenswrapper[4907]: I0226 16:30:04.902649 4907 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29535390-vhjq2" Feb 26 16:30:05 crc kubenswrapper[4907]: I0226 16:30:05.059272 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mlfcq\" (UniqueName: \"kubernetes.io/projected/2e01c029-98ef-4ec1-a3ae-4697f2276293-kube-api-access-mlfcq\") pod \"2e01c029-98ef-4ec1-a3ae-4697f2276293\" (UID: \"2e01c029-98ef-4ec1-a3ae-4697f2276293\") " Feb 26 16:30:05 crc kubenswrapper[4907]: I0226 16:30:05.066788 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2e01c029-98ef-4ec1-a3ae-4697f2276293-kube-api-access-mlfcq" (OuterVolumeSpecName: "kube-api-access-mlfcq") pod "2e01c029-98ef-4ec1-a3ae-4697f2276293" (UID: "2e01c029-98ef-4ec1-a3ae-4697f2276293"). InnerVolumeSpecName "kube-api-access-mlfcq". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 26 16:30:05 crc kubenswrapper[4907]: I0226 16:30:05.161886 4907 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mlfcq\" (UniqueName: \"kubernetes.io/projected/2e01c029-98ef-4ec1-a3ae-4697f2276293-kube-api-access-mlfcq\") on node \"crc\" DevicePath \"\"" Feb 26 16:30:05 crc kubenswrapper[4907]: I0226 16:30:05.569564 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29535390-vhjq2" event={"ID":"2e01c029-98ef-4ec1-a3ae-4697f2276293","Type":"ContainerDied","Data":"8befbb88a38a19c507753cebee7f7263a19b3f08c247283933baf332fe9ee5cd"} Feb 26 16:30:05 crc kubenswrapper[4907]: I0226 16:30:05.569622 4907 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="8befbb88a38a19c507753cebee7f7263a19b3f08c247283933baf332fe9ee5cd" Feb 26 16:30:05 crc kubenswrapper[4907]: I0226 16:30:05.569724 4907 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29535390-vhjq2" Feb 26 16:30:05 crc kubenswrapper[4907]: I0226 16:30:05.973168 4907 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29535384-lhdqq"] Feb 26 16:30:05 crc kubenswrapper[4907]: I0226 16:30:05.984400 4907 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29535384-lhdqq"] Feb 26 16:30:06 crc kubenswrapper[4907]: I0226 16:30:06.142030 4907 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0dd74211-40c2-437c-9295-b69e709f81fe" path="/var/lib/kubelet/pods/0dd74211-40c2-437c-9295-b69e709f81fe/volumes" Feb 26 16:30:06 crc kubenswrapper[4907]: I0226 16:30:06.143426 4907 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5fb2d7c1-1737-4c4c-8c42-fcf0bf406f21" path="/var/lib/kubelet/pods/5fb2d7c1-1737-4c4c-8c42-fcf0bf406f21/volumes" Feb 26 16:30:11 crc kubenswrapper[4907]: I0226 16:30:11.296836 4907 
scope.go:117] "RemoveContainer" containerID="80a64ed61aa8a30637a58770379e6089a6245b33d99252bde06feca920721411" Feb 26 16:30:11 crc kubenswrapper[4907]: I0226 16:30:11.356014 4907 scope.go:117] "RemoveContainer" containerID="dbb0a17c19b0ecd0029d1ab15137ff5e45d1ec47832ed90912f8f2b1f23fb7d1" Feb 26 16:30:41 crc kubenswrapper[4907]: I0226 16:30:41.597179 4907 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-wp69c"] Feb 26 16:30:41 crc kubenswrapper[4907]: E0226 16:30:41.598288 4907 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c1c526eb-200e-4782-86fa-62ed06c130f2" containerName="collect-profiles" Feb 26 16:30:41 crc kubenswrapper[4907]: I0226 16:30:41.598307 4907 state_mem.go:107] "Deleted CPUSet assignment" podUID="c1c526eb-200e-4782-86fa-62ed06c130f2" containerName="collect-profiles" Feb 26 16:30:41 crc kubenswrapper[4907]: E0226 16:30:41.598323 4907 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2e01c029-98ef-4ec1-a3ae-4697f2276293" containerName="oc" Feb 26 16:30:41 crc kubenswrapper[4907]: I0226 16:30:41.598331 4907 state_mem.go:107] "Deleted CPUSet assignment" podUID="2e01c029-98ef-4ec1-a3ae-4697f2276293" containerName="oc" Feb 26 16:30:41 crc kubenswrapper[4907]: I0226 16:30:41.598609 4907 memory_manager.go:354] "RemoveStaleState removing state" podUID="c1c526eb-200e-4782-86fa-62ed06c130f2" containerName="collect-profiles" Feb 26 16:30:41 crc kubenswrapper[4907]: I0226 16:30:41.598636 4907 memory_manager.go:354] "RemoveStaleState removing state" podUID="2e01c029-98ef-4ec1-a3ae-4697f2276293" containerName="oc" Feb 26 16:30:41 crc kubenswrapper[4907]: I0226 16:30:41.600233 4907 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-wp69c" Feb 26 16:30:41 crc kubenswrapper[4907]: I0226 16:30:41.621053 4907 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-wp69c"] Feb 26 16:30:41 crc kubenswrapper[4907]: I0226 16:30:41.704169 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/6c04acb1-885d-4771-8664-63da44063a01-utilities\") pod \"redhat-operators-wp69c\" (UID: \"6c04acb1-885d-4771-8664-63da44063a01\") " pod="openshift-marketplace/redhat-operators-wp69c" Feb 26 16:30:41 crc kubenswrapper[4907]: I0226 16:30:41.704400 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/6c04acb1-885d-4771-8664-63da44063a01-catalog-content\") pod \"redhat-operators-wp69c\" (UID: \"6c04acb1-885d-4771-8664-63da44063a01\") " pod="openshift-marketplace/redhat-operators-wp69c" Feb 26 16:30:41 crc kubenswrapper[4907]: I0226 16:30:41.704758 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-84vsj\" (UniqueName: \"kubernetes.io/projected/6c04acb1-885d-4771-8664-63da44063a01-kube-api-access-84vsj\") pod \"redhat-operators-wp69c\" (UID: \"6c04acb1-885d-4771-8664-63da44063a01\") " pod="openshift-marketplace/redhat-operators-wp69c" Feb 26 16:30:41 crc kubenswrapper[4907]: I0226 16:30:41.806483 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/6c04acb1-885d-4771-8664-63da44063a01-utilities\") pod \"redhat-operators-wp69c\" (UID: \"6c04acb1-885d-4771-8664-63da44063a01\") " pod="openshift-marketplace/redhat-operators-wp69c" Feb 26 16:30:41 crc kubenswrapper[4907]: I0226 16:30:41.806544 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/6c04acb1-885d-4771-8664-63da44063a01-catalog-content\") pod \"redhat-operators-wp69c\" (UID: \"6c04acb1-885d-4771-8664-63da44063a01\") " pod="openshift-marketplace/redhat-operators-wp69c" Feb 26 16:30:41 crc kubenswrapper[4907]: I0226 16:30:41.806647 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-84vsj\" (UniqueName: \"kubernetes.io/projected/6c04acb1-885d-4771-8664-63da44063a01-kube-api-access-84vsj\") pod \"redhat-operators-wp69c\" (UID: \"6c04acb1-885d-4771-8664-63da44063a01\") " pod="openshift-marketplace/redhat-operators-wp69c" Feb 26 16:30:41 crc kubenswrapper[4907]: I0226 16:30:41.807112 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/6c04acb1-885d-4771-8664-63da44063a01-catalog-content\") pod \"redhat-operators-wp69c\" (UID: \"6c04acb1-885d-4771-8664-63da44063a01\") " pod="openshift-marketplace/redhat-operators-wp69c" Feb 26 16:30:41 crc kubenswrapper[4907]: I0226 16:30:41.807283 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/6c04acb1-885d-4771-8664-63da44063a01-utilities\") pod \"redhat-operators-wp69c\" (UID: \"6c04acb1-885d-4771-8664-63da44063a01\") " pod="openshift-marketplace/redhat-operators-wp69c" Feb 26 16:30:41 crc kubenswrapper[4907]: I0226 16:30:41.831229 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-84vsj\" (UniqueName: \"kubernetes.io/projected/6c04acb1-885d-4771-8664-63da44063a01-kube-api-access-84vsj\") pod \"redhat-operators-wp69c\" (UID: \"6c04acb1-885d-4771-8664-63da44063a01\") " pod="openshift-marketplace/redhat-operators-wp69c" Feb 26 16:30:41 crc kubenswrapper[4907]: I0226 16:30:41.924091 4907 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-wp69c" Feb 26 16:30:42 crc kubenswrapper[4907]: I0226 16:30:42.461931 4907 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-wp69c"] Feb 26 16:30:42 crc kubenswrapper[4907]: I0226 16:30:42.910858 4907 generic.go:334] "Generic (PLEG): container finished" podID="6c04acb1-885d-4771-8664-63da44063a01" containerID="3946e8c8127cfcdf0dfef233f0c7eb534e02fdb360ad89981860a3fca7e9fd7a" exitCode=0 Feb 26 16:30:42 crc kubenswrapper[4907]: I0226 16:30:42.911140 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-wp69c" event={"ID":"6c04acb1-885d-4771-8664-63da44063a01","Type":"ContainerDied","Data":"3946e8c8127cfcdf0dfef233f0c7eb534e02fdb360ad89981860a3fca7e9fd7a"} Feb 26 16:30:42 crc kubenswrapper[4907]: I0226 16:30:42.911165 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-wp69c" event={"ID":"6c04acb1-885d-4771-8664-63da44063a01","Type":"ContainerStarted","Data":"8957b8c5ef70e79de063883da563c147a782172b5d054dfb667bad43c8ddc26c"} Feb 26 16:30:43 crc kubenswrapper[4907]: I0226 16:30:43.924107 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-wp69c" event={"ID":"6c04acb1-885d-4771-8664-63da44063a01","Type":"ContainerStarted","Data":"ca9b0cddfa2ee5a78ab5c5e065bd00951fca02acbbc0ff2b612c63d89039d18c"} Feb 26 16:30:47 crc kubenswrapper[4907]: I0226 16:30:47.971552 4907 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-z82rd"] Feb 26 16:30:47 crc kubenswrapper[4907]: I0226 16:30:47.975486 4907 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-z82rd" Feb 26 16:30:48 crc kubenswrapper[4907]: I0226 16:30:48.003850 4907 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-z82rd"] Feb 26 16:30:48 crc kubenswrapper[4907]: I0226 16:30:48.146926 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/2902850b-3996-48a4-b518-206ea71c487f-catalog-content\") pod \"redhat-marketplace-z82rd\" (UID: \"2902850b-3996-48a4-b518-206ea71c487f\") " pod="openshift-marketplace/redhat-marketplace-z82rd" Feb 26 16:30:48 crc kubenswrapper[4907]: I0226 16:30:48.147029 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/2902850b-3996-48a4-b518-206ea71c487f-utilities\") pod \"redhat-marketplace-z82rd\" (UID: \"2902850b-3996-48a4-b518-206ea71c487f\") " pod="openshift-marketplace/redhat-marketplace-z82rd" Feb 26 16:30:48 crc kubenswrapper[4907]: I0226 16:30:48.147115 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5nvn2\" (UniqueName: \"kubernetes.io/projected/2902850b-3996-48a4-b518-206ea71c487f-kube-api-access-5nvn2\") pod \"redhat-marketplace-z82rd\" (UID: \"2902850b-3996-48a4-b518-206ea71c487f\") " pod="openshift-marketplace/redhat-marketplace-z82rd" Feb 26 16:30:48 crc kubenswrapper[4907]: I0226 16:30:48.249273 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/2902850b-3996-48a4-b518-206ea71c487f-utilities\") pod \"redhat-marketplace-z82rd\" (UID: \"2902850b-3996-48a4-b518-206ea71c487f\") " pod="openshift-marketplace/redhat-marketplace-z82rd" Feb 26 16:30:48 crc kubenswrapper[4907]: I0226 16:30:48.249442 4907 reconciler_common.go:218] "operationExecutor.MountVolume 
started for volume \"kube-api-access-5nvn2\" (UniqueName: \"kubernetes.io/projected/2902850b-3996-48a4-b518-206ea71c487f-kube-api-access-5nvn2\") pod \"redhat-marketplace-z82rd\" (UID: \"2902850b-3996-48a4-b518-206ea71c487f\") " pod="openshift-marketplace/redhat-marketplace-z82rd" Feb 26 16:30:48 crc kubenswrapper[4907]: I0226 16:30:48.249641 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/2902850b-3996-48a4-b518-206ea71c487f-catalog-content\") pod \"redhat-marketplace-z82rd\" (UID: \"2902850b-3996-48a4-b518-206ea71c487f\") " pod="openshift-marketplace/redhat-marketplace-z82rd" Feb 26 16:30:48 crc kubenswrapper[4907]: I0226 16:30:48.250676 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/2902850b-3996-48a4-b518-206ea71c487f-catalog-content\") pod \"redhat-marketplace-z82rd\" (UID: \"2902850b-3996-48a4-b518-206ea71c487f\") " pod="openshift-marketplace/redhat-marketplace-z82rd" Feb 26 16:30:48 crc kubenswrapper[4907]: I0226 16:30:48.250723 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/2902850b-3996-48a4-b518-206ea71c487f-utilities\") pod \"redhat-marketplace-z82rd\" (UID: \"2902850b-3996-48a4-b518-206ea71c487f\") " pod="openshift-marketplace/redhat-marketplace-z82rd" Feb 26 16:30:48 crc kubenswrapper[4907]: I0226 16:30:48.268799 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5nvn2\" (UniqueName: \"kubernetes.io/projected/2902850b-3996-48a4-b518-206ea71c487f-kube-api-access-5nvn2\") pod \"redhat-marketplace-z82rd\" (UID: \"2902850b-3996-48a4-b518-206ea71c487f\") " pod="openshift-marketplace/redhat-marketplace-z82rd" Feb 26 16:30:48 crc kubenswrapper[4907]: I0226 16:30:48.305267 4907 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-z82rd" Feb 26 16:30:48 crc kubenswrapper[4907]: I0226 16:30:48.531348 4907 patch_prober.go:28] interesting pod/machine-config-daemon-v5ng6 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 26 16:30:48 crc kubenswrapper[4907]: I0226 16:30:48.531625 4907 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-v5ng6" podUID="917eebf3-db36-47b8-af0a-b80d042fddab" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 26 16:30:48 crc kubenswrapper[4907]: I0226 16:30:48.857529 4907 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-z82rd"] Feb 26 16:30:48 crc kubenswrapper[4907]: I0226 16:30:48.994912 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-z82rd" event={"ID":"2902850b-3996-48a4-b518-206ea71c487f","Type":"ContainerStarted","Data":"445694260dac7d0cb8b52025bd7455d283bd9f91d419902562b265f0f3db88fc"} Feb 26 16:30:50 crc kubenswrapper[4907]: I0226 16:30:50.006442 4907 generic.go:334] "Generic (PLEG): container finished" podID="6c04acb1-885d-4771-8664-63da44063a01" containerID="ca9b0cddfa2ee5a78ab5c5e065bd00951fca02acbbc0ff2b612c63d89039d18c" exitCode=0 Feb 26 16:30:50 crc kubenswrapper[4907]: I0226 16:30:50.006485 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-wp69c" event={"ID":"6c04acb1-885d-4771-8664-63da44063a01","Type":"ContainerDied","Data":"ca9b0cddfa2ee5a78ab5c5e065bd00951fca02acbbc0ff2b612c63d89039d18c"} Feb 26 16:30:50 crc kubenswrapper[4907]: I0226 16:30:50.008536 4907 generic.go:334] "Generic (PLEG): 
container finished" podID="2902850b-3996-48a4-b518-206ea71c487f" containerID="6a0c9332be6bc62b4b34641a5f007ca4dc6a61aa39371a6a7ff661cd601790b2" exitCode=0 Feb 26 16:30:50 crc kubenswrapper[4907]: I0226 16:30:50.008572 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-z82rd" event={"ID":"2902850b-3996-48a4-b518-206ea71c487f","Type":"ContainerDied","Data":"6a0c9332be6bc62b4b34641a5f007ca4dc6a61aa39371a6a7ff661cd601790b2"} Feb 26 16:30:51 crc kubenswrapper[4907]: I0226 16:30:51.021689 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-z82rd" event={"ID":"2902850b-3996-48a4-b518-206ea71c487f","Type":"ContainerStarted","Data":"5ab6e4cc29da44b828f7a631c21cdf5d74abf071a24731b530ff1e94ceac5e00"} Feb 26 16:30:51 crc kubenswrapper[4907]: I0226 16:30:51.025224 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-wp69c" event={"ID":"6c04acb1-885d-4771-8664-63da44063a01","Type":"ContainerStarted","Data":"fe754d8fe7962d4aadae672e889e78d398267f6494e7d40a61fd69f060c9f0c1"} Feb 26 16:30:51 crc kubenswrapper[4907]: I0226 16:30:51.058692 4907 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-wp69c" podStartSLOduration=2.571228623 podStartE2EDuration="10.058662775s" podCreationTimestamp="2026-02-26 16:30:41 +0000 UTC" firstStartedPulling="2026-02-26 16:30:42.912939217 +0000 UTC m=+2905.431501066" lastFinishedPulling="2026-02-26 16:30:50.400373369 +0000 UTC m=+2912.918935218" observedRunningTime="2026-02-26 16:30:51.054221477 +0000 UTC m=+2913.572783326" watchObservedRunningTime="2026-02-26 16:30:51.058662775 +0000 UTC m=+2913.577224654" Feb 26 16:30:51 crc kubenswrapper[4907]: I0226 16:30:51.924285 4907 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-wp69c" Feb 26 16:30:51 crc kubenswrapper[4907]: I0226 
16:30:51.924331 4907 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-wp69c" Feb 26 16:30:52 crc kubenswrapper[4907]: I0226 16:30:52.972096 4907 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-wp69c" podUID="6c04acb1-885d-4771-8664-63da44063a01" containerName="registry-server" probeResult="failure" output=< Feb 26 16:30:52 crc kubenswrapper[4907]: timeout: failed to connect service ":50051" within 1s Feb 26 16:30:52 crc kubenswrapper[4907]: > Feb 26 16:30:53 crc kubenswrapper[4907]: I0226 16:30:53.052801 4907 generic.go:334] "Generic (PLEG): container finished" podID="2902850b-3996-48a4-b518-206ea71c487f" containerID="5ab6e4cc29da44b828f7a631c21cdf5d74abf071a24731b530ff1e94ceac5e00" exitCode=0 Feb 26 16:30:53 crc kubenswrapper[4907]: I0226 16:30:53.052848 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-z82rd" event={"ID":"2902850b-3996-48a4-b518-206ea71c487f","Type":"ContainerDied","Data":"5ab6e4cc29da44b828f7a631c21cdf5d74abf071a24731b530ff1e94ceac5e00"} Feb 26 16:30:54 crc kubenswrapper[4907]: I0226 16:30:54.064445 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-z82rd" event={"ID":"2902850b-3996-48a4-b518-206ea71c487f","Type":"ContainerStarted","Data":"22198daefd42c79990d222a6b90b81bfe9399a96288ebf165a0027724480fa72"} Feb 26 16:30:54 crc kubenswrapper[4907]: I0226 16:30:54.094820 4907 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-z82rd" podStartSLOduration=3.5836164679999998 podStartE2EDuration="7.094797084s" podCreationTimestamp="2026-02-26 16:30:47 +0000 UTC" firstStartedPulling="2026-02-26 16:30:50.010343901 +0000 UTC m=+2912.528905750" lastFinishedPulling="2026-02-26 16:30:53.521524517 +0000 UTC m=+2916.040086366" observedRunningTime="2026-02-26 16:30:54.084656596 +0000 UTC 
m=+2916.603218455" watchObservedRunningTime="2026-02-26 16:30:54.094797084 +0000 UTC m=+2916.613358943" Feb 26 16:30:58 crc kubenswrapper[4907]: I0226 16:30:58.310232 4907 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-z82rd" Feb 26 16:30:58 crc kubenswrapper[4907]: I0226 16:30:58.310641 4907 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-z82rd" Feb 26 16:30:58 crc kubenswrapper[4907]: I0226 16:30:58.363155 4907 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-z82rd" Feb 26 16:30:59 crc kubenswrapper[4907]: I0226 16:30:59.156440 4907 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-z82rd" Feb 26 16:30:59 crc kubenswrapper[4907]: I0226 16:30:59.209283 4907 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-z82rd"] Feb 26 16:31:01 crc kubenswrapper[4907]: I0226 16:31:01.138038 4907 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-z82rd" podUID="2902850b-3996-48a4-b518-206ea71c487f" containerName="registry-server" containerID="cri-o://22198daefd42c79990d222a6b90b81bfe9399a96288ebf165a0027724480fa72" gracePeriod=2 Feb 26 16:31:01 crc kubenswrapper[4907]: I0226 16:31:01.620260 4907 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-z82rd" Feb 26 16:31:01 crc kubenswrapper[4907]: I0226 16:31:01.722020 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/2902850b-3996-48a4-b518-206ea71c487f-catalog-content\") pod \"2902850b-3996-48a4-b518-206ea71c487f\" (UID: \"2902850b-3996-48a4-b518-206ea71c487f\") " Feb 26 16:31:01 crc kubenswrapper[4907]: I0226 16:31:01.726777 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-5nvn2\" (UniqueName: \"kubernetes.io/projected/2902850b-3996-48a4-b518-206ea71c487f-kube-api-access-5nvn2\") pod \"2902850b-3996-48a4-b518-206ea71c487f\" (UID: \"2902850b-3996-48a4-b518-206ea71c487f\") " Feb 26 16:31:01 crc kubenswrapper[4907]: I0226 16:31:01.727032 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/2902850b-3996-48a4-b518-206ea71c487f-utilities\") pod \"2902850b-3996-48a4-b518-206ea71c487f\" (UID: \"2902850b-3996-48a4-b518-206ea71c487f\") " Feb 26 16:31:01 crc kubenswrapper[4907]: I0226 16:31:01.727984 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/2902850b-3996-48a4-b518-206ea71c487f-utilities" (OuterVolumeSpecName: "utilities") pod "2902850b-3996-48a4-b518-206ea71c487f" (UID: "2902850b-3996-48a4-b518-206ea71c487f"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 26 16:31:01 crc kubenswrapper[4907]: I0226 16:31:01.728295 4907 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/2902850b-3996-48a4-b518-206ea71c487f-utilities\") on node \"crc\" DevicePath \"\"" Feb 26 16:31:01 crc kubenswrapper[4907]: I0226 16:31:01.733175 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2902850b-3996-48a4-b518-206ea71c487f-kube-api-access-5nvn2" (OuterVolumeSpecName: "kube-api-access-5nvn2") pod "2902850b-3996-48a4-b518-206ea71c487f" (UID: "2902850b-3996-48a4-b518-206ea71c487f"). InnerVolumeSpecName "kube-api-access-5nvn2". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 26 16:31:01 crc kubenswrapper[4907]: I0226 16:31:01.761730 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/2902850b-3996-48a4-b518-206ea71c487f-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "2902850b-3996-48a4-b518-206ea71c487f" (UID: "2902850b-3996-48a4-b518-206ea71c487f"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 26 16:31:01 crc kubenswrapper[4907]: I0226 16:31:01.830182 4907 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/2902850b-3996-48a4-b518-206ea71c487f-catalog-content\") on node \"crc\" DevicePath \"\"" Feb 26 16:31:01 crc kubenswrapper[4907]: I0226 16:31:01.830214 4907 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-5nvn2\" (UniqueName: \"kubernetes.io/projected/2902850b-3996-48a4-b518-206ea71c487f-kube-api-access-5nvn2\") on node \"crc\" DevicePath \"\"" Feb 26 16:31:02 crc kubenswrapper[4907]: I0226 16:31:02.166267 4907 generic.go:334] "Generic (PLEG): container finished" podID="2902850b-3996-48a4-b518-206ea71c487f" containerID="22198daefd42c79990d222a6b90b81bfe9399a96288ebf165a0027724480fa72" exitCode=0 Feb 26 16:31:02 crc kubenswrapper[4907]: I0226 16:31:02.166428 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-z82rd" event={"ID":"2902850b-3996-48a4-b518-206ea71c487f","Type":"ContainerDied","Data":"22198daefd42c79990d222a6b90b81bfe9399a96288ebf165a0027724480fa72"} Feb 26 16:31:02 crc kubenswrapper[4907]: I0226 16:31:02.166779 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-z82rd" event={"ID":"2902850b-3996-48a4-b518-206ea71c487f","Type":"ContainerDied","Data":"445694260dac7d0cb8b52025bd7455d283bd9f91d419902562b265f0f3db88fc"} Feb 26 16:31:02 crc kubenswrapper[4907]: I0226 16:31:02.166818 4907 scope.go:117] "RemoveContainer" containerID="22198daefd42c79990d222a6b90b81bfe9399a96288ebf165a0027724480fa72" Feb 26 16:31:02 crc kubenswrapper[4907]: I0226 16:31:02.166448 4907 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-z82rd" Feb 26 16:31:02 crc kubenswrapper[4907]: I0226 16:31:02.189679 4907 scope.go:117] "RemoveContainer" containerID="5ab6e4cc29da44b828f7a631c21cdf5d74abf071a24731b530ff1e94ceac5e00" Feb 26 16:31:02 crc kubenswrapper[4907]: I0226 16:31:02.215002 4907 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-z82rd"] Feb 26 16:31:02 crc kubenswrapper[4907]: I0226 16:31:02.224703 4907 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-z82rd"] Feb 26 16:31:02 crc kubenswrapper[4907]: I0226 16:31:02.237456 4907 scope.go:117] "RemoveContainer" containerID="6a0c9332be6bc62b4b34641a5f007ca4dc6a61aa39371a6a7ff661cd601790b2" Feb 26 16:31:02 crc kubenswrapper[4907]: I0226 16:31:02.283753 4907 scope.go:117] "RemoveContainer" containerID="22198daefd42c79990d222a6b90b81bfe9399a96288ebf165a0027724480fa72" Feb 26 16:31:02 crc kubenswrapper[4907]: E0226 16:31:02.284229 4907 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"22198daefd42c79990d222a6b90b81bfe9399a96288ebf165a0027724480fa72\": container with ID starting with 22198daefd42c79990d222a6b90b81bfe9399a96288ebf165a0027724480fa72 not found: ID does not exist" containerID="22198daefd42c79990d222a6b90b81bfe9399a96288ebf165a0027724480fa72" Feb 26 16:31:02 crc kubenswrapper[4907]: I0226 16:31:02.284267 4907 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"22198daefd42c79990d222a6b90b81bfe9399a96288ebf165a0027724480fa72"} err="failed to get container status \"22198daefd42c79990d222a6b90b81bfe9399a96288ebf165a0027724480fa72\": rpc error: code = NotFound desc = could not find container \"22198daefd42c79990d222a6b90b81bfe9399a96288ebf165a0027724480fa72\": container with ID starting with 22198daefd42c79990d222a6b90b81bfe9399a96288ebf165a0027724480fa72 not found: 
ID does not exist" Feb 26 16:31:02 crc kubenswrapper[4907]: I0226 16:31:02.284290 4907 scope.go:117] "RemoveContainer" containerID="5ab6e4cc29da44b828f7a631c21cdf5d74abf071a24731b530ff1e94ceac5e00" Feb 26 16:31:02 crc kubenswrapper[4907]: E0226 16:31:02.284534 4907 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"5ab6e4cc29da44b828f7a631c21cdf5d74abf071a24731b530ff1e94ceac5e00\": container with ID starting with 5ab6e4cc29da44b828f7a631c21cdf5d74abf071a24731b530ff1e94ceac5e00 not found: ID does not exist" containerID="5ab6e4cc29da44b828f7a631c21cdf5d74abf071a24731b530ff1e94ceac5e00" Feb 26 16:31:02 crc kubenswrapper[4907]: I0226 16:31:02.284560 4907 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5ab6e4cc29da44b828f7a631c21cdf5d74abf071a24731b530ff1e94ceac5e00"} err="failed to get container status \"5ab6e4cc29da44b828f7a631c21cdf5d74abf071a24731b530ff1e94ceac5e00\": rpc error: code = NotFound desc = could not find container \"5ab6e4cc29da44b828f7a631c21cdf5d74abf071a24731b530ff1e94ceac5e00\": container with ID starting with 5ab6e4cc29da44b828f7a631c21cdf5d74abf071a24731b530ff1e94ceac5e00 not found: ID does not exist" Feb 26 16:31:02 crc kubenswrapper[4907]: I0226 16:31:02.284603 4907 scope.go:117] "RemoveContainer" containerID="6a0c9332be6bc62b4b34641a5f007ca4dc6a61aa39371a6a7ff661cd601790b2" Feb 26 16:31:02 crc kubenswrapper[4907]: E0226 16:31:02.284881 4907 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"6a0c9332be6bc62b4b34641a5f007ca4dc6a61aa39371a6a7ff661cd601790b2\": container with ID starting with 6a0c9332be6bc62b4b34641a5f007ca4dc6a61aa39371a6a7ff661cd601790b2 not found: ID does not exist" containerID="6a0c9332be6bc62b4b34641a5f007ca4dc6a61aa39371a6a7ff661cd601790b2" Feb 26 16:31:02 crc kubenswrapper[4907]: I0226 16:31:02.284916 4907 pod_container_deletor.go:53] 
"DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6a0c9332be6bc62b4b34641a5f007ca4dc6a61aa39371a6a7ff661cd601790b2"} err="failed to get container status \"6a0c9332be6bc62b4b34641a5f007ca4dc6a61aa39371a6a7ff661cd601790b2\": rpc error: code = NotFound desc = could not find container \"6a0c9332be6bc62b4b34641a5f007ca4dc6a61aa39371a6a7ff661cd601790b2\": container with ID starting with 6a0c9332be6bc62b4b34641a5f007ca4dc6a61aa39371a6a7ff661cd601790b2 not found: ID does not exist" Feb 26 16:31:02 crc kubenswrapper[4907]: I0226 16:31:02.973338 4907 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-wp69c" podUID="6c04acb1-885d-4771-8664-63da44063a01" containerName="registry-server" probeResult="failure" output=< Feb 26 16:31:02 crc kubenswrapper[4907]: timeout: failed to connect service ":50051" within 1s Feb 26 16:31:02 crc kubenswrapper[4907]: > Feb 26 16:31:04 crc kubenswrapper[4907]: I0226 16:31:04.140845 4907 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="2902850b-3996-48a4-b518-206ea71c487f" path="/var/lib/kubelet/pods/2902850b-3996-48a4-b518-206ea71c487f/volumes" Feb 26 16:31:12 crc kubenswrapper[4907]: I0226 16:31:12.976098 4907 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-wp69c" podUID="6c04acb1-885d-4771-8664-63da44063a01" containerName="registry-server" probeResult="failure" output=< Feb 26 16:31:12 crc kubenswrapper[4907]: timeout: failed to connect service ":50051" within 1s Feb 26 16:31:12 crc kubenswrapper[4907]: > Feb 26 16:31:18 crc kubenswrapper[4907]: I0226 16:31:18.317113 4907 generic.go:334] "Generic (PLEG): container finished" podID="2483c310-db88-4757-857d-91e2815bbe67" containerID="01a3caffe83da211057576d6a9486c51b6734df195fcf6c515571dd452c2693a" exitCode=0 Feb 26 16:31:18 crc kubenswrapper[4907]: I0226 16:31:18.317197 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-4wnq2" event={"ID":"2483c310-db88-4757-857d-91e2815bbe67","Type":"ContainerDied","Data":"01a3caffe83da211057576d6a9486c51b6734df195fcf6c515571dd452c2693a"} Feb 26 16:31:18 crc kubenswrapper[4907]: I0226 16:31:18.530797 4907 patch_prober.go:28] interesting pod/machine-config-daemon-v5ng6 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 26 16:31:18 crc kubenswrapper[4907]: I0226 16:31:18.530879 4907 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-v5ng6" podUID="917eebf3-db36-47b8-af0a-b80d042fddab" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 26 16:31:19 crc kubenswrapper[4907]: I0226 16:31:19.732071 4907 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-4wnq2" Feb 26 16:31:19 crc kubenswrapper[4907]: I0226 16:31:19.768955 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/2483c310-db88-4757-857d-91e2815bbe67-inventory\") pod \"2483c310-db88-4757-857d-91e2815bbe67\" (UID: \"2483c310-db88-4757-857d-91e2815bbe67\") " Feb 26 16:31:19 crc kubenswrapper[4907]: I0226 16:31:19.769154 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceilometer-compute-config-data-0\" (UniqueName: \"kubernetes.io/secret/2483c310-db88-4757-857d-91e2815bbe67-ceilometer-compute-config-data-0\") pod \"2483c310-db88-4757-857d-91e2815bbe67\" (UID: \"2483c310-db88-4757-857d-91e2815bbe67\") " Feb 26 16:31:19 crc kubenswrapper[4907]: I0226 16:31:19.769190 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rvnt7\" (UniqueName: \"kubernetes.io/projected/2483c310-db88-4757-857d-91e2815bbe67-kube-api-access-rvnt7\") pod \"2483c310-db88-4757-857d-91e2815bbe67\" (UID: \"2483c310-db88-4757-857d-91e2815bbe67\") " Feb 26 16:31:19 crc kubenswrapper[4907]: I0226 16:31:19.769233 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"telemetry-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2483c310-db88-4757-857d-91e2815bbe67-telemetry-combined-ca-bundle\") pod \"2483c310-db88-4757-857d-91e2815bbe67\" (UID: \"2483c310-db88-4757-857d-91e2815bbe67\") " Feb 26 16:31:19 crc kubenswrapper[4907]: I0226 16:31:19.769260 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/2483c310-db88-4757-857d-91e2815bbe67-ssh-key-openstack-edpm-ipam\") pod \"2483c310-db88-4757-857d-91e2815bbe67\" (UID: \"2483c310-db88-4757-857d-91e2815bbe67\") " Feb 26 16:31:19 crc kubenswrapper[4907]: I0226 
16:31:19.769301 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceilometer-compute-config-data-2\" (UniqueName: \"kubernetes.io/secret/2483c310-db88-4757-857d-91e2815bbe67-ceilometer-compute-config-data-2\") pod \"2483c310-db88-4757-857d-91e2815bbe67\" (UID: \"2483c310-db88-4757-857d-91e2815bbe67\") " Feb 26 16:31:19 crc kubenswrapper[4907]: I0226 16:31:19.782865 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2483c310-db88-4757-857d-91e2815bbe67-kube-api-access-rvnt7" (OuterVolumeSpecName: "kube-api-access-rvnt7") pod "2483c310-db88-4757-857d-91e2815bbe67" (UID: "2483c310-db88-4757-857d-91e2815bbe67"). InnerVolumeSpecName "kube-api-access-rvnt7". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 26 16:31:19 crc kubenswrapper[4907]: I0226 16:31:19.786518 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2483c310-db88-4757-857d-91e2815bbe67-telemetry-combined-ca-bundle" (OuterVolumeSpecName: "telemetry-combined-ca-bundle") pod "2483c310-db88-4757-857d-91e2815bbe67" (UID: "2483c310-db88-4757-857d-91e2815bbe67"). InnerVolumeSpecName "telemetry-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 26 16:31:19 crc kubenswrapper[4907]: I0226 16:31:19.798460 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2483c310-db88-4757-857d-91e2815bbe67-inventory" (OuterVolumeSpecName: "inventory") pod "2483c310-db88-4757-857d-91e2815bbe67" (UID: "2483c310-db88-4757-857d-91e2815bbe67"). InnerVolumeSpecName "inventory". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 26 16:31:19 crc kubenswrapper[4907]: I0226 16:31:19.799468 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2483c310-db88-4757-857d-91e2815bbe67-ceilometer-compute-config-data-0" (OuterVolumeSpecName: "ceilometer-compute-config-data-0") pod "2483c310-db88-4757-857d-91e2815bbe67" (UID: "2483c310-db88-4757-857d-91e2815bbe67"). InnerVolumeSpecName "ceilometer-compute-config-data-0". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 26 16:31:19 crc kubenswrapper[4907]: I0226 16:31:19.814077 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2483c310-db88-4757-857d-91e2815bbe67-ceilometer-compute-config-data-2" (OuterVolumeSpecName: "ceilometer-compute-config-data-2") pod "2483c310-db88-4757-857d-91e2815bbe67" (UID: "2483c310-db88-4757-857d-91e2815bbe67"). InnerVolumeSpecName "ceilometer-compute-config-data-2". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 26 16:31:19 crc kubenswrapper[4907]: I0226 16:31:19.816959 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2483c310-db88-4757-857d-91e2815bbe67-ssh-key-openstack-edpm-ipam" (OuterVolumeSpecName: "ssh-key-openstack-edpm-ipam") pod "2483c310-db88-4757-857d-91e2815bbe67" (UID: "2483c310-db88-4757-857d-91e2815bbe67"). InnerVolumeSpecName "ssh-key-openstack-edpm-ipam". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 26 16:31:19 crc kubenswrapper[4907]: I0226 16:31:19.870391 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceilometer-compute-config-data-1\" (UniqueName: \"kubernetes.io/secret/2483c310-db88-4757-857d-91e2815bbe67-ceilometer-compute-config-data-1\") pod \"2483c310-db88-4757-857d-91e2815bbe67\" (UID: \"2483c310-db88-4757-857d-91e2815bbe67\") " Feb 26 16:31:19 crc kubenswrapper[4907]: I0226 16:31:19.870837 4907 reconciler_common.go:293] "Volume detached for volume \"ceilometer-compute-config-data-0\" (UniqueName: \"kubernetes.io/secret/2483c310-db88-4757-857d-91e2815bbe67-ceilometer-compute-config-data-0\") on node \"crc\" DevicePath \"\"" Feb 26 16:31:19 crc kubenswrapper[4907]: I0226 16:31:19.870859 4907 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rvnt7\" (UniqueName: \"kubernetes.io/projected/2483c310-db88-4757-857d-91e2815bbe67-kube-api-access-rvnt7\") on node \"crc\" DevicePath \"\"" Feb 26 16:31:19 crc kubenswrapper[4907]: I0226 16:31:19.870870 4907 reconciler_common.go:293] "Volume detached for volume \"telemetry-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2483c310-db88-4757-857d-91e2815bbe67-telemetry-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 26 16:31:19 crc kubenswrapper[4907]: I0226 16:31:19.870879 4907 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/2483c310-db88-4757-857d-91e2815bbe67-ssh-key-openstack-edpm-ipam\") on node \"crc\" DevicePath \"\"" Feb 26 16:31:19 crc kubenswrapper[4907]: I0226 16:31:19.870889 4907 reconciler_common.go:293] "Volume detached for volume \"ceilometer-compute-config-data-2\" (UniqueName: \"kubernetes.io/secret/2483c310-db88-4757-857d-91e2815bbe67-ceilometer-compute-config-data-2\") on node \"crc\" DevicePath \"\"" Feb 26 16:31:19 crc kubenswrapper[4907]: I0226 16:31:19.870899 4907 
reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/2483c310-db88-4757-857d-91e2815bbe67-inventory\") on node \"crc\" DevicePath \"\"" Feb 26 16:31:19 crc kubenswrapper[4907]: I0226 16:31:19.899961 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2483c310-db88-4757-857d-91e2815bbe67-ceilometer-compute-config-data-1" (OuterVolumeSpecName: "ceilometer-compute-config-data-1") pod "2483c310-db88-4757-857d-91e2815bbe67" (UID: "2483c310-db88-4757-857d-91e2815bbe67"). InnerVolumeSpecName "ceilometer-compute-config-data-1". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 26 16:31:19 crc kubenswrapper[4907]: I0226 16:31:19.972296 4907 reconciler_common.go:293] "Volume detached for volume \"ceilometer-compute-config-data-1\" (UniqueName: \"kubernetes.io/secret/2483c310-db88-4757-857d-91e2815bbe67-ceilometer-compute-config-data-1\") on node \"crc\" DevicePath \"\"" Feb 26 16:31:20 crc kubenswrapper[4907]: I0226 16:31:20.332229 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-4wnq2" event={"ID":"2483c310-db88-4757-857d-91e2815bbe67","Type":"ContainerDied","Data":"0b00de7d2871974b72b0c4e4ea8b859344cc991f6f3136d4d7a2e5c8b358dd7d"} Feb 26 16:31:20 crc kubenswrapper[4907]: I0226 16:31:20.332270 4907 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="0b00de7d2871974b72b0c4e4ea8b859344cc991f6f3136d4d7a2e5c8b358dd7d" Feb 26 16:31:20 crc kubenswrapper[4907]: I0226 16:31:20.332358 4907 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-4wnq2" Feb 26 16:31:21 crc kubenswrapper[4907]: I0226 16:31:21.978064 4907 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-wp69c" Feb 26 16:31:22 crc kubenswrapper[4907]: I0226 16:31:22.028221 4907 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-wp69c" Feb 26 16:31:22 crc kubenswrapper[4907]: I0226 16:31:22.219148 4907 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-wp69c"] Feb 26 16:31:23 crc kubenswrapper[4907]: I0226 16:31:23.357039 4907 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-wp69c" podUID="6c04acb1-885d-4771-8664-63da44063a01" containerName="registry-server" containerID="cri-o://fe754d8fe7962d4aadae672e889e78d398267f6494e7d40a61fd69f060c9f0c1" gracePeriod=2 Feb 26 16:31:23 crc kubenswrapper[4907]: I0226 16:31:23.799873 4907 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-wp69c" Feb 26 16:31:23 crc kubenswrapper[4907]: I0226 16:31:23.892250 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/6c04acb1-885d-4771-8664-63da44063a01-catalog-content\") pod \"6c04acb1-885d-4771-8664-63da44063a01\" (UID: \"6c04acb1-885d-4771-8664-63da44063a01\") " Feb 26 16:31:23 crc kubenswrapper[4907]: I0226 16:31:23.892445 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/6c04acb1-885d-4771-8664-63da44063a01-utilities\") pod \"6c04acb1-885d-4771-8664-63da44063a01\" (UID: \"6c04acb1-885d-4771-8664-63da44063a01\") " Feb 26 16:31:23 crc kubenswrapper[4907]: I0226 16:31:23.892547 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-84vsj\" (UniqueName: \"kubernetes.io/projected/6c04acb1-885d-4771-8664-63da44063a01-kube-api-access-84vsj\") pod \"6c04acb1-885d-4771-8664-63da44063a01\" (UID: \"6c04acb1-885d-4771-8664-63da44063a01\") " Feb 26 16:31:23 crc kubenswrapper[4907]: I0226 16:31:23.894636 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/6c04acb1-885d-4771-8664-63da44063a01-utilities" (OuterVolumeSpecName: "utilities") pod "6c04acb1-885d-4771-8664-63da44063a01" (UID: "6c04acb1-885d-4771-8664-63da44063a01"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 26 16:31:23 crc kubenswrapper[4907]: I0226 16:31:23.917295 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6c04acb1-885d-4771-8664-63da44063a01-kube-api-access-84vsj" (OuterVolumeSpecName: "kube-api-access-84vsj") pod "6c04acb1-885d-4771-8664-63da44063a01" (UID: "6c04acb1-885d-4771-8664-63da44063a01"). InnerVolumeSpecName "kube-api-access-84vsj". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 26 16:31:23 crc kubenswrapper[4907]: I0226 16:31:23.995145 4907 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/6c04acb1-885d-4771-8664-63da44063a01-utilities\") on node \"crc\" DevicePath \"\"" Feb 26 16:31:23 crc kubenswrapper[4907]: I0226 16:31:23.995363 4907 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-84vsj\" (UniqueName: \"kubernetes.io/projected/6c04acb1-885d-4771-8664-63da44063a01-kube-api-access-84vsj\") on node \"crc\" DevicePath \"\"" Feb 26 16:31:24 crc kubenswrapper[4907]: I0226 16:31:24.034028 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/6c04acb1-885d-4771-8664-63da44063a01-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "6c04acb1-885d-4771-8664-63da44063a01" (UID: "6c04acb1-885d-4771-8664-63da44063a01"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 26 16:31:24 crc kubenswrapper[4907]: I0226 16:31:24.096909 4907 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/6c04acb1-885d-4771-8664-63da44063a01-catalog-content\") on node \"crc\" DevicePath \"\"" Feb 26 16:31:24 crc kubenswrapper[4907]: I0226 16:31:24.369352 4907 generic.go:334] "Generic (PLEG): container finished" podID="6c04acb1-885d-4771-8664-63da44063a01" containerID="fe754d8fe7962d4aadae672e889e78d398267f6494e7d40a61fd69f060c9f0c1" exitCode=0 Feb 26 16:31:24 crc kubenswrapper[4907]: I0226 16:31:24.369398 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-wp69c" event={"ID":"6c04acb1-885d-4771-8664-63da44063a01","Type":"ContainerDied","Data":"fe754d8fe7962d4aadae672e889e78d398267f6494e7d40a61fd69f060c9f0c1"} Feb 26 16:31:24 crc kubenswrapper[4907]: I0226 16:31:24.369425 4907 kubelet.go:2453] "SyncLoop (PLEG): event for 
pod" pod="openshift-marketplace/redhat-operators-wp69c" event={"ID":"6c04acb1-885d-4771-8664-63da44063a01","Type":"ContainerDied","Data":"8957b8c5ef70e79de063883da563c147a782172b5d054dfb667bad43c8ddc26c"} Feb 26 16:31:24 crc kubenswrapper[4907]: I0226 16:31:24.369441 4907 scope.go:117] "RemoveContainer" containerID="fe754d8fe7962d4aadae672e889e78d398267f6494e7d40a61fd69f060c9f0c1" Feb 26 16:31:24 crc kubenswrapper[4907]: I0226 16:31:24.369450 4907 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-wp69c" Feb 26 16:31:24 crc kubenswrapper[4907]: I0226 16:31:24.402979 4907 scope.go:117] "RemoveContainer" containerID="ca9b0cddfa2ee5a78ab5c5e065bd00951fca02acbbc0ff2b612c63d89039d18c" Feb 26 16:31:24 crc kubenswrapper[4907]: I0226 16:31:24.426305 4907 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-wp69c"] Feb 26 16:31:24 crc kubenswrapper[4907]: I0226 16:31:24.437204 4907 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-wp69c"] Feb 26 16:31:24 crc kubenswrapper[4907]: I0226 16:31:24.443340 4907 scope.go:117] "RemoveContainer" containerID="3946e8c8127cfcdf0dfef233f0c7eb534e02fdb360ad89981860a3fca7e9fd7a" Feb 26 16:31:24 crc kubenswrapper[4907]: I0226 16:31:24.480622 4907 scope.go:117] "RemoveContainer" containerID="fe754d8fe7962d4aadae672e889e78d398267f6494e7d40a61fd69f060c9f0c1" Feb 26 16:31:24 crc kubenswrapper[4907]: E0226 16:31:24.481091 4907 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"fe754d8fe7962d4aadae672e889e78d398267f6494e7d40a61fd69f060c9f0c1\": container with ID starting with fe754d8fe7962d4aadae672e889e78d398267f6494e7d40a61fd69f060c9f0c1 not found: ID does not exist" containerID="fe754d8fe7962d4aadae672e889e78d398267f6494e7d40a61fd69f060c9f0c1" Feb 26 16:31:24 crc kubenswrapper[4907]: I0226 16:31:24.481122 4907 
pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"fe754d8fe7962d4aadae672e889e78d398267f6494e7d40a61fd69f060c9f0c1"} err="failed to get container status \"fe754d8fe7962d4aadae672e889e78d398267f6494e7d40a61fd69f060c9f0c1\": rpc error: code = NotFound desc = could not find container \"fe754d8fe7962d4aadae672e889e78d398267f6494e7d40a61fd69f060c9f0c1\": container with ID starting with fe754d8fe7962d4aadae672e889e78d398267f6494e7d40a61fd69f060c9f0c1 not found: ID does not exist" Feb 26 16:31:24 crc kubenswrapper[4907]: I0226 16:31:24.481143 4907 scope.go:117] "RemoveContainer" containerID="ca9b0cddfa2ee5a78ab5c5e065bd00951fca02acbbc0ff2b612c63d89039d18c" Feb 26 16:31:24 crc kubenswrapper[4907]: E0226 16:31:24.481640 4907 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ca9b0cddfa2ee5a78ab5c5e065bd00951fca02acbbc0ff2b612c63d89039d18c\": container with ID starting with ca9b0cddfa2ee5a78ab5c5e065bd00951fca02acbbc0ff2b612c63d89039d18c not found: ID does not exist" containerID="ca9b0cddfa2ee5a78ab5c5e065bd00951fca02acbbc0ff2b612c63d89039d18c" Feb 26 16:31:24 crc kubenswrapper[4907]: I0226 16:31:24.481771 4907 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ca9b0cddfa2ee5a78ab5c5e065bd00951fca02acbbc0ff2b612c63d89039d18c"} err="failed to get container status \"ca9b0cddfa2ee5a78ab5c5e065bd00951fca02acbbc0ff2b612c63d89039d18c\": rpc error: code = NotFound desc = could not find container \"ca9b0cddfa2ee5a78ab5c5e065bd00951fca02acbbc0ff2b612c63d89039d18c\": container with ID starting with ca9b0cddfa2ee5a78ab5c5e065bd00951fca02acbbc0ff2b612c63d89039d18c not found: ID does not exist" Feb 26 16:31:24 crc kubenswrapper[4907]: I0226 16:31:24.481827 4907 scope.go:117] "RemoveContainer" containerID="3946e8c8127cfcdf0dfef233f0c7eb534e02fdb360ad89981860a3fca7e9fd7a" Feb 26 16:31:24 crc kubenswrapper[4907]: E0226 
16:31:24.482258 4907 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"3946e8c8127cfcdf0dfef233f0c7eb534e02fdb360ad89981860a3fca7e9fd7a\": container with ID starting with 3946e8c8127cfcdf0dfef233f0c7eb534e02fdb360ad89981860a3fca7e9fd7a not found: ID does not exist" containerID="3946e8c8127cfcdf0dfef233f0c7eb534e02fdb360ad89981860a3fca7e9fd7a" Feb 26 16:31:24 crc kubenswrapper[4907]: I0226 16:31:24.482284 4907 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3946e8c8127cfcdf0dfef233f0c7eb534e02fdb360ad89981860a3fca7e9fd7a"} err="failed to get container status \"3946e8c8127cfcdf0dfef233f0c7eb534e02fdb360ad89981860a3fca7e9fd7a\": rpc error: code = NotFound desc = could not find container \"3946e8c8127cfcdf0dfef233f0c7eb534e02fdb360ad89981860a3fca7e9fd7a\": container with ID starting with 3946e8c8127cfcdf0dfef233f0c7eb534e02fdb360ad89981860a3fca7e9fd7a not found: ID does not exist" Feb 26 16:31:26 crc kubenswrapper[4907]: I0226 16:31:26.141848 4907 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6c04acb1-885d-4771-8664-63da44063a01" path="/var/lib/kubelet/pods/6c04acb1-885d-4771-8664-63da44063a01/volumes" Feb 26 16:31:48 crc kubenswrapper[4907]: I0226 16:31:48.530514 4907 patch_prober.go:28] interesting pod/machine-config-daemon-v5ng6 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 26 16:31:48 crc kubenswrapper[4907]: I0226 16:31:48.531079 4907 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-v5ng6" podUID="917eebf3-db36-47b8-af0a-b80d042fddab" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection 
refused" Feb 26 16:31:48 crc kubenswrapper[4907]: I0226 16:31:48.531156 4907 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-v5ng6" Feb 26 16:31:48 crc kubenswrapper[4907]: I0226 16:31:48.531792 4907 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"6906cab653cd658cba31211ccc435500afa0d86f92cee413c3d24942f2acd8bd"} pod="openshift-machine-config-operator/machine-config-daemon-v5ng6" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Feb 26 16:31:48 crc kubenswrapper[4907]: I0226 16:31:48.531836 4907 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-v5ng6" podUID="917eebf3-db36-47b8-af0a-b80d042fddab" containerName="machine-config-daemon" containerID="cri-o://6906cab653cd658cba31211ccc435500afa0d86f92cee413c3d24942f2acd8bd" gracePeriod=600 Feb 26 16:31:48 crc kubenswrapper[4907]: E0226 16:31:48.657393 4907 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-v5ng6_openshift-machine-config-operator(917eebf3-db36-47b8-af0a-b80d042fddab)\"" pod="openshift-machine-config-operator/machine-config-daemon-v5ng6" podUID="917eebf3-db36-47b8-af0a-b80d042fddab" Feb 26 16:31:49 crc kubenswrapper[4907]: I0226 16:31:49.614222 4907 generic.go:334] "Generic (PLEG): container finished" podID="917eebf3-db36-47b8-af0a-b80d042fddab" containerID="6906cab653cd658cba31211ccc435500afa0d86f92cee413c3d24942f2acd8bd" exitCode=0 Feb 26 16:31:49 crc kubenswrapper[4907]: I0226 16:31:49.614278 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-v5ng6" 
event={"ID":"917eebf3-db36-47b8-af0a-b80d042fddab","Type":"ContainerDied","Data":"6906cab653cd658cba31211ccc435500afa0d86f92cee413c3d24942f2acd8bd"} Feb 26 16:31:49 crc kubenswrapper[4907]: I0226 16:31:49.614729 4907 scope.go:117] "RemoveContainer" containerID="75b5efd2017cdc33ecaf179acb64b81d3ecdff3a0779fa753362a3be77de0f3d" Feb 26 16:31:49 crc kubenswrapper[4907]: I0226 16:31:49.615447 4907 scope.go:117] "RemoveContainer" containerID="6906cab653cd658cba31211ccc435500afa0d86f92cee413c3d24942f2acd8bd" Feb 26 16:31:49 crc kubenswrapper[4907]: E0226 16:31:49.615833 4907 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-v5ng6_openshift-machine-config-operator(917eebf3-db36-47b8-af0a-b80d042fddab)\"" pod="openshift-machine-config-operator/machine-config-daemon-v5ng6" podUID="917eebf3-db36-47b8-af0a-b80d042fddab" Feb 26 16:32:00 crc kubenswrapper[4907]: I0226 16:32:00.160300 4907 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29535392-5jtj5"] Feb 26 16:32:00 crc kubenswrapper[4907]: E0226 16:32:00.161651 4907 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6c04acb1-885d-4771-8664-63da44063a01" containerName="extract-utilities" Feb 26 16:32:00 crc kubenswrapper[4907]: I0226 16:32:00.161675 4907 state_mem.go:107] "Deleted CPUSet assignment" podUID="6c04acb1-885d-4771-8664-63da44063a01" containerName="extract-utilities" Feb 26 16:32:00 crc kubenswrapper[4907]: E0226 16:32:00.161699 4907 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2902850b-3996-48a4-b518-206ea71c487f" containerName="registry-server" Feb 26 16:32:00 crc kubenswrapper[4907]: I0226 16:32:00.161710 4907 state_mem.go:107] "Deleted CPUSet assignment" podUID="2902850b-3996-48a4-b518-206ea71c487f" containerName="registry-server" Feb 26 16:32:00 crc 
kubenswrapper[4907]: E0226 16:32:00.161721 4907 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6c04acb1-885d-4771-8664-63da44063a01" containerName="extract-content" Feb 26 16:32:00 crc kubenswrapper[4907]: I0226 16:32:00.161732 4907 state_mem.go:107] "Deleted CPUSet assignment" podUID="6c04acb1-885d-4771-8664-63da44063a01" containerName="extract-content" Feb 26 16:32:00 crc kubenswrapper[4907]: E0226 16:32:00.161762 4907 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2902850b-3996-48a4-b518-206ea71c487f" containerName="extract-utilities" Feb 26 16:32:00 crc kubenswrapper[4907]: I0226 16:32:00.161772 4907 state_mem.go:107] "Deleted CPUSet assignment" podUID="2902850b-3996-48a4-b518-206ea71c487f" containerName="extract-utilities" Feb 26 16:32:00 crc kubenswrapper[4907]: E0226 16:32:00.161789 4907 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2902850b-3996-48a4-b518-206ea71c487f" containerName="extract-content" Feb 26 16:32:00 crc kubenswrapper[4907]: I0226 16:32:00.161799 4907 state_mem.go:107] "Deleted CPUSet assignment" podUID="2902850b-3996-48a4-b518-206ea71c487f" containerName="extract-content" Feb 26 16:32:00 crc kubenswrapper[4907]: E0226 16:32:00.161816 4907 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2483c310-db88-4757-857d-91e2815bbe67" containerName="telemetry-edpm-deployment-openstack-edpm-ipam" Feb 26 16:32:00 crc kubenswrapper[4907]: I0226 16:32:00.161829 4907 state_mem.go:107] "Deleted CPUSet assignment" podUID="2483c310-db88-4757-857d-91e2815bbe67" containerName="telemetry-edpm-deployment-openstack-edpm-ipam" Feb 26 16:32:00 crc kubenswrapper[4907]: E0226 16:32:00.161850 4907 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6c04acb1-885d-4771-8664-63da44063a01" containerName="registry-server" Feb 26 16:32:00 crc kubenswrapper[4907]: I0226 16:32:00.161862 4907 state_mem.go:107] "Deleted CPUSet assignment" podUID="6c04acb1-885d-4771-8664-63da44063a01" 
containerName="registry-server" Feb 26 16:32:00 crc kubenswrapper[4907]: I0226 16:32:00.162149 4907 memory_manager.go:354] "RemoveStaleState removing state" podUID="2483c310-db88-4757-857d-91e2815bbe67" containerName="telemetry-edpm-deployment-openstack-edpm-ipam" Feb 26 16:32:00 crc kubenswrapper[4907]: I0226 16:32:00.162174 4907 memory_manager.go:354] "RemoveStaleState removing state" podUID="2902850b-3996-48a4-b518-206ea71c487f" containerName="registry-server" Feb 26 16:32:00 crc kubenswrapper[4907]: I0226 16:32:00.162204 4907 memory_manager.go:354] "RemoveStaleState removing state" podUID="6c04acb1-885d-4771-8664-63da44063a01" containerName="registry-server" Feb 26 16:32:00 crc kubenswrapper[4907]: I0226 16:32:00.163133 4907 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29535392-5jtj5" Feb 26 16:32:00 crc kubenswrapper[4907]: I0226 16:32:00.167037 4907 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Feb 26 16:32:00 crc kubenswrapper[4907]: I0226 16:32:00.170386 4907 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Feb 26 16:32:00 crc kubenswrapper[4907]: I0226 16:32:00.170507 4907 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-n2mrp" Feb 26 16:32:00 crc kubenswrapper[4907]: I0226 16:32:00.175223 4907 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29535392-5jtj5"] Feb 26 16:32:00 crc kubenswrapper[4907]: I0226 16:32:00.266047 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-v727s\" (UniqueName: \"kubernetes.io/projected/cff0efd2-1485-42a4-9fe1-b27fc59c3322-kube-api-access-v727s\") pod \"auto-csr-approver-29535392-5jtj5\" (UID: \"cff0efd2-1485-42a4-9fe1-b27fc59c3322\") " 
pod="openshift-infra/auto-csr-approver-29535392-5jtj5" Feb 26 16:32:00 crc kubenswrapper[4907]: I0226 16:32:00.368077 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-v727s\" (UniqueName: \"kubernetes.io/projected/cff0efd2-1485-42a4-9fe1-b27fc59c3322-kube-api-access-v727s\") pod \"auto-csr-approver-29535392-5jtj5\" (UID: \"cff0efd2-1485-42a4-9fe1-b27fc59c3322\") " pod="openshift-infra/auto-csr-approver-29535392-5jtj5" Feb 26 16:32:00 crc kubenswrapper[4907]: I0226 16:32:00.388736 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-v727s\" (UniqueName: \"kubernetes.io/projected/cff0efd2-1485-42a4-9fe1-b27fc59c3322-kube-api-access-v727s\") pod \"auto-csr-approver-29535392-5jtj5\" (UID: \"cff0efd2-1485-42a4-9fe1-b27fc59c3322\") " pod="openshift-infra/auto-csr-approver-29535392-5jtj5" Feb 26 16:32:00 crc kubenswrapper[4907]: I0226 16:32:00.483224 4907 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29535392-5jtj5" Feb 26 16:32:00 crc kubenswrapper[4907]: I0226 16:32:00.962748 4907 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29535392-5jtj5"] Feb 26 16:32:01 crc kubenswrapper[4907]: I0226 16:32:01.714440 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29535392-5jtj5" event={"ID":"cff0efd2-1485-42a4-9fe1-b27fc59c3322","Type":"ContainerStarted","Data":"f19b893ebe8565488cb9d50e82e5a315c1c77e9ca0d29652bd4b77a123d85727"} Feb 26 16:32:02 crc kubenswrapper[4907]: I0226 16:32:02.732910 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29535392-5jtj5" event={"ID":"cff0efd2-1485-42a4-9fe1-b27fc59c3322","Type":"ContainerStarted","Data":"72d1527220d87a01caf0e776ae54b1d090953c76745f6cccf52c73f47d1e0ba8"} Feb 26 16:32:02 crc kubenswrapper[4907]: I0226 16:32:02.748637 4907 
pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-infra/auto-csr-approver-29535392-5jtj5" podStartSLOduration=1.53650504 podStartE2EDuration="2.748619358s" podCreationTimestamp="2026-02-26 16:32:00 +0000 UTC" firstStartedPulling="2026-02-26 16:32:00.962991926 +0000 UTC m=+2983.481553795" lastFinishedPulling="2026-02-26 16:32:02.175106264 +0000 UTC m=+2984.693668113" observedRunningTime="2026-02-26 16:32:02.745403799 +0000 UTC m=+2985.263965648" watchObservedRunningTime="2026-02-26 16:32:02.748619358 +0000 UTC m=+2985.267181207" Feb 26 16:32:03 crc kubenswrapper[4907]: I0226 16:32:03.126810 4907 scope.go:117] "RemoveContainer" containerID="6906cab653cd658cba31211ccc435500afa0d86f92cee413c3d24942f2acd8bd" Feb 26 16:32:03 crc kubenswrapper[4907]: E0226 16:32:03.127103 4907 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-v5ng6_openshift-machine-config-operator(917eebf3-db36-47b8-af0a-b80d042fddab)\"" pod="openshift-machine-config-operator/machine-config-daemon-v5ng6" podUID="917eebf3-db36-47b8-af0a-b80d042fddab" Feb 26 16:32:03 crc kubenswrapper[4907]: I0226 16:32:03.745322 4907 generic.go:334] "Generic (PLEG): container finished" podID="cff0efd2-1485-42a4-9fe1-b27fc59c3322" containerID="72d1527220d87a01caf0e776ae54b1d090953c76745f6cccf52c73f47d1e0ba8" exitCode=0 Feb 26 16:32:03 crc kubenswrapper[4907]: I0226 16:32:03.745397 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29535392-5jtj5" event={"ID":"cff0efd2-1485-42a4-9fe1-b27fc59c3322","Type":"ContainerDied","Data":"72d1527220d87a01caf0e776ae54b1d090953c76745f6cccf52c73f47d1e0ba8"} Feb 26 16:32:05 crc kubenswrapper[4907]: I0226 16:32:05.085782 4907 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29535392-5jtj5" Feb 26 16:32:05 crc kubenswrapper[4907]: I0226 16:32:05.156330 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-v727s\" (UniqueName: \"kubernetes.io/projected/cff0efd2-1485-42a4-9fe1-b27fc59c3322-kube-api-access-v727s\") pod \"cff0efd2-1485-42a4-9fe1-b27fc59c3322\" (UID: \"cff0efd2-1485-42a4-9fe1-b27fc59c3322\") " Feb 26 16:32:05 crc kubenswrapper[4907]: I0226 16:32:05.177727 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/cff0efd2-1485-42a4-9fe1-b27fc59c3322-kube-api-access-v727s" (OuterVolumeSpecName: "kube-api-access-v727s") pod "cff0efd2-1485-42a4-9fe1-b27fc59c3322" (UID: "cff0efd2-1485-42a4-9fe1-b27fc59c3322"). InnerVolumeSpecName "kube-api-access-v727s". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 26 16:32:05 crc kubenswrapper[4907]: I0226 16:32:05.263871 4907 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-v727s\" (UniqueName: \"kubernetes.io/projected/cff0efd2-1485-42a4-9fe1-b27fc59c3322-kube-api-access-v727s\") on node \"crc\" DevicePath \"\"" Feb 26 16:32:05 crc kubenswrapper[4907]: I0226 16:32:05.770220 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29535392-5jtj5" event={"ID":"cff0efd2-1485-42a4-9fe1-b27fc59c3322","Type":"ContainerDied","Data":"f19b893ebe8565488cb9d50e82e5a315c1c77e9ca0d29652bd4b77a123d85727"} Feb 26 16:32:05 crc kubenswrapper[4907]: I0226 16:32:05.770270 4907 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29535392-5jtj5" Feb 26 16:32:05 crc kubenswrapper[4907]: I0226 16:32:05.770279 4907 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="f19b893ebe8565488cb9d50e82e5a315c1c77e9ca0d29652bd4b77a123d85727" Feb 26 16:32:05 crc kubenswrapper[4907]: I0226 16:32:05.820177 4907 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29535386-bjv9s"] Feb 26 16:32:05 crc kubenswrapper[4907]: I0226 16:32:05.828183 4907 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29535386-bjv9s"] Feb 26 16:32:06 crc kubenswrapper[4907]: I0226 16:32:06.143466 4907 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="34f41544-9fae-45ec-9e99-60598164470b" path="/var/lib/kubelet/pods/34f41544-9fae-45ec-9e99-60598164470b/volumes" Feb 26 16:32:11 crc kubenswrapper[4907]: I0226 16:32:11.472934 4907 scope.go:117] "RemoveContainer" containerID="baecd666444d528873510e3392a5af31aee1d1f70e657bff9ab36085882db7b3" Feb 26 16:32:17 crc kubenswrapper[4907]: I0226 16:32:17.126237 4907 scope.go:117] "RemoveContainer" containerID="6906cab653cd658cba31211ccc435500afa0d86f92cee413c3d24942f2acd8bd" Feb 26 16:32:17 crc kubenswrapper[4907]: E0226 16:32:17.127198 4907 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-v5ng6_openshift-machine-config-operator(917eebf3-db36-47b8-af0a-b80d042fddab)\"" pod="openshift-machine-config-operator/machine-config-daemon-v5ng6" podUID="917eebf3-db36-47b8-af0a-b80d042fddab" Feb 26 16:32:18 crc kubenswrapper[4907]: I0226 16:32:18.598070 4907 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/tempest-tests-tempest"] Feb 26 16:32:18 crc kubenswrapper[4907]: E0226 16:32:18.598573 4907 cpu_manager.go:410] 
"RemoveStaleState: removing container" podUID="cff0efd2-1485-42a4-9fe1-b27fc59c3322" containerName="oc" Feb 26 16:32:18 crc kubenswrapper[4907]: I0226 16:32:18.598603 4907 state_mem.go:107] "Deleted CPUSet assignment" podUID="cff0efd2-1485-42a4-9fe1-b27fc59c3322" containerName="oc" Feb 26 16:32:18 crc kubenswrapper[4907]: I0226 16:32:18.598793 4907 memory_manager.go:354] "RemoveStaleState removing state" podUID="cff0efd2-1485-42a4-9fe1-b27fc59c3322" containerName="oc" Feb 26 16:32:18 crc kubenswrapper[4907]: I0226 16:32:18.599442 4907 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/tempest-tests-tempest" Feb 26 16:32:18 crc kubenswrapper[4907]: I0226 16:32:18.604071 4907 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"test-operator-controller-priv-key" Feb 26 16:32:18 crc kubenswrapper[4907]: I0226 16:32:18.604439 4907 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"tempest-tests-tempest-env-vars-s0" Feb 26 16:32:18 crc kubenswrapper[4907]: I0226 16:32:18.605577 4907 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"tempest-tests-tempest-custom-data-s0" Feb 26 16:32:18 crc kubenswrapper[4907]: I0226 16:32:18.605893 4907 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"default-dockercfg-bhz5c" Feb 26 16:32:18 crc kubenswrapper[4907]: I0226 16:32:18.610722 4907 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/tempest-tests-tempest"] Feb 26 16:32:18 crc kubenswrapper[4907]: I0226 16:32:18.756465 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage09-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") pod \"tempest-tests-tempest\" (UID: \"b09f4f4d-8644-4923-ab26-849b249efd4e\") " pod="openstack/tempest-tests-tempest" Feb 26 16:32:18 crc kubenswrapper[4907]: I0226 16:32:18.756529 4907 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/b09f4f4d-8644-4923-ab26-849b249efd4e-openstack-config\") pod \"tempest-tests-tempest\" (UID: \"b09f4f4d-8644-4923-ab26-849b249efd4e\") " pod="openstack/tempest-tests-tempest" Feb 26 16:32:18 crc kubenswrapper[4907]: I0226 16:32:18.756580 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zhp9c\" (UniqueName: \"kubernetes.io/projected/b09f4f4d-8644-4923-ab26-849b249efd4e-kube-api-access-zhp9c\") pod \"tempest-tests-tempest\" (UID: \"b09f4f4d-8644-4923-ab26-849b249efd4e\") " pod="openstack/tempest-tests-tempest" Feb 26 16:32:18 crc kubenswrapper[4907]: I0226 16:32:18.756636 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/b09f4f4d-8644-4923-ab26-849b249efd4e-ssh-key\") pod \"tempest-tests-tempest\" (UID: \"b09f4f4d-8644-4923-ab26-849b249efd4e\") " pod="openstack/tempest-tests-tempest" Feb 26 16:32:18 crc kubenswrapper[4907]: I0226 16:32:18.756664 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/b09f4f4d-8644-4923-ab26-849b249efd4e-openstack-config-secret\") pod \"tempest-tests-tempest\" (UID: \"b09f4f4d-8644-4923-ab26-849b249efd4e\") " pod="openstack/tempest-tests-tempest" Feb 26 16:32:18 crc kubenswrapper[4907]: I0226 16:32:18.756906 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-certs\" (UniqueName: \"kubernetes.io/secret/b09f4f4d-8644-4923-ab26-849b249efd4e-ca-certs\") pod \"tempest-tests-tempest\" (UID: \"b09f4f4d-8644-4923-ab26-849b249efd4e\") " pod="openstack/tempest-tests-tempest" Feb 26 16:32:18 crc kubenswrapper[4907]: I0226 16:32:18.757043 4907 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"test-operator-ephemeral-workdir\" (UniqueName: \"kubernetes.io/empty-dir/b09f4f4d-8644-4923-ab26-849b249efd4e-test-operator-ephemeral-workdir\") pod \"tempest-tests-tempest\" (UID: \"b09f4f4d-8644-4923-ab26-849b249efd4e\") " pod="openstack/tempest-tests-tempest" Feb 26 16:32:18 crc kubenswrapper[4907]: I0226 16:32:18.757134 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/b09f4f4d-8644-4923-ab26-849b249efd4e-config-data\") pod \"tempest-tests-tempest\" (UID: \"b09f4f4d-8644-4923-ab26-849b249efd4e\") " pod="openstack/tempest-tests-tempest" Feb 26 16:32:18 crc kubenswrapper[4907]: I0226 16:32:18.757331 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"test-operator-ephemeral-temporary\" (UniqueName: \"kubernetes.io/empty-dir/b09f4f4d-8644-4923-ab26-849b249efd4e-test-operator-ephemeral-temporary\") pod \"tempest-tests-tempest\" (UID: \"b09f4f4d-8644-4923-ab26-849b249efd4e\") " pod="openstack/tempest-tests-tempest" Feb 26 16:32:18 crc kubenswrapper[4907]: I0226 16:32:18.859392 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zhp9c\" (UniqueName: \"kubernetes.io/projected/b09f4f4d-8644-4923-ab26-849b249efd4e-kube-api-access-zhp9c\") pod \"tempest-tests-tempest\" (UID: \"b09f4f4d-8644-4923-ab26-849b249efd4e\") " pod="openstack/tempest-tests-tempest" Feb 26 16:32:18 crc kubenswrapper[4907]: I0226 16:32:18.859452 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/b09f4f4d-8644-4923-ab26-849b249efd4e-ssh-key\") pod \"tempest-tests-tempest\" (UID: \"b09f4f4d-8644-4923-ab26-849b249efd4e\") " pod="openstack/tempest-tests-tempest" Feb 26 16:32:18 crc kubenswrapper[4907]: I0226 16:32:18.859474 4907 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/b09f4f4d-8644-4923-ab26-849b249efd4e-openstack-config-secret\") pod \"tempest-tests-tempest\" (UID: \"b09f4f4d-8644-4923-ab26-849b249efd4e\") " pod="openstack/tempest-tests-tempest" Feb 26 16:32:18 crc kubenswrapper[4907]: I0226 16:32:18.859535 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ca-certs\" (UniqueName: \"kubernetes.io/secret/b09f4f4d-8644-4923-ab26-849b249efd4e-ca-certs\") pod \"tempest-tests-tempest\" (UID: \"b09f4f4d-8644-4923-ab26-849b249efd4e\") " pod="openstack/tempest-tests-tempest" Feb 26 16:32:18 crc kubenswrapper[4907]: I0226 16:32:18.859555 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"test-operator-ephemeral-workdir\" (UniqueName: \"kubernetes.io/empty-dir/b09f4f4d-8644-4923-ab26-849b249efd4e-test-operator-ephemeral-workdir\") pod \"tempest-tests-tempest\" (UID: \"b09f4f4d-8644-4923-ab26-849b249efd4e\") " pod="openstack/tempest-tests-tempest" Feb 26 16:32:18 crc kubenswrapper[4907]: I0226 16:32:18.859580 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/b09f4f4d-8644-4923-ab26-849b249efd4e-config-data\") pod \"tempest-tests-tempest\" (UID: \"b09f4f4d-8644-4923-ab26-849b249efd4e\") " pod="openstack/tempest-tests-tempest" Feb 26 16:32:18 crc kubenswrapper[4907]: I0226 16:32:18.859641 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"test-operator-ephemeral-temporary\" (UniqueName: \"kubernetes.io/empty-dir/b09f4f4d-8644-4923-ab26-849b249efd4e-test-operator-ephemeral-temporary\") pod \"tempest-tests-tempest\" (UID: \"b09f4f4d-8644-4923-ab26-849b249efd4e\") " pod="openstack/tempest-tests-tempest" Feb 26 16:32:18 crc kubenswrapper[4907]: I0226 16:32:18.859661 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"local-storage09-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") pod \"tempest-tests-tempest\" (UID: \"b09f4f4d-8644-4923-ab26-849b249efd4e\") " pod="openstack/tempest-tests-tempest" Feb 26 16:32:18 crc kubenswrapper[4907]: I0226 16:32:18.859685 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/b09f4f4d-8644-4923-ab26-849b249efd4e-openstack-config\") pod \"tempest-tests-tempest\" (UID: \"b09f4f4d-8644-4923-ab26-849b249efd4e\") " pod="openstack/tempest-tests-tempest" Feb 26 16:32:18 crc kubenswrapper[4907]: I0226 16:32:18.860297 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"test-operator-ephemeral-workdir\" (UniqueName: \"kubernetes.io/empty-dir/b09f4f4d-8644-4923-ab26-849b249efd4e-test-operator-ephemeral-workdir\") pod \"tempest-tests-tempest\" (UID: \"b09f4f4d-8644-4923-ab26-849b249efd4e\") " pod="openstack/tempest-tests-tempest" Feb 26 16:32:18 crc kubenswrapper[4907]: I0226 16:32:18.860617 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/b09f4f4d-8644-4923-ab26-849b249efd4e-openstack-config\") pod \"tempest-tests-tempest\" (UID: \"b09f4f4d-8644-4923-ab26-849b249efd4e\") " pod="openstack/tempest-tests-tempest" Feb 26 16:32:18 crc kubenswrapper[4907]: I0226 16:32:18.860953 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"test-operator-ephemeral-temporary\" (UniqueName: \"kubernetes.io/empty-dir/b09f4f4d-8644-4923-ab26-849b249efd4e-test-operator-ephemeral-temporary\") pod \"tempest-tests-tempest\" (UID: \"b09f4f4d-8644-4923-ab26-849b249efd4e\") " pod="openstack/tempest-tests-tempest" Feb 26 16:32:18 crc kubenswrapper[4907]: I0226 16:32:18.861967 4907 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage09-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") pod 
\"tempest-tests-tempest\" (UID: \"b09f4f4d-8644-4923-ab26-849b249efd4e\") device mount path \"/mnt/openstack/pv09\"" pod="openstack/tempest-tests-tempest" Feb 26 16:32:18 crc kubenswrapper[4907]: I0226 16:32:18.862242 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/b09f4f4d-8644-4923-ab26-849b249efd4e-config-data\") pod \"tempest-tests-tempest\" (UID: \"b09f4f4d-8644-4923-ab26-849b249efd4e\") " pod="openstack/tempest-tests-tempest" Feb 26 16:32:18 crc kubenswrapper[4907]: I0226 16:32:18.866417 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/b09f4f4d-8644-4923-ab26-849b249efd4e-ssh-key\") pod \"tempest-tests-tempest\" (UID: \"b09f4f4d-8644-4923-ab26-849b249efd4e\") " pod="openstack/tempest-tests-tempest" Feb 26 16:32:18 crc kubenswrapper[4907]: I0226 16:32:18.869050 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ca-certs\" (UniqueName: \"kubernetes.io/secret/b09f4f4d-8644-4923-ab26-849b249efd4e-ca-certs\") pod \"tempest-tests-tempest\" (UID: \"b09f4f4d-8644-4923-ab26-849b249efd4e\") " pod="openstack/tempest-tests-tempest" Feb 26 16:32:18 crc kubenswrapper[4907]: I0226 16:32:18.869420 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/b09f4f4d-8644-4923-ab26-849b249efd4e-openstack-config-secret\") pod \"tempest-tests-tempest\" (UID: \"b09f4f4d-8644-4923-ab26-849b249efd4e\") " pod="openstack/tempest-tests-tempest" Feb 26 16:32:18 crc kubenswrapper[4907]: I0226 16:32:18.881023 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zhp9c\" (UniqueName: \"kubernetes.io/projected/b09f4f4d-8644-4923-ab26-849b249efd4e-kube-api-access-zhp9c\") pod \"tempest-tests-tempest\" (UID: \"b09f4f4d-8644-4923-ab26-849b249efd4e\") " pod="openstack/tempest-tests-tempest" Feb 26 16:32:18 
crc kubenswrapper[4907]: I0226 16:32:18.897621 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage09-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") pod \"tempest-tests-tempest\" (UID: \"b09f4f4d-8644-4923-ab26-849b249efd4e\") " pod="openstack/tempest-tests-tempest" Feb 26 16:32:18 crc kubenswrapper[4907]: I0226 16:32:18.933865 4907 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/tempest-tests-tempest" Feb 26 16:32:19 crc kubenswrapper[4907]: I0226 16:32:19.355444 4907 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/tempest-tests-tempest"] Feb 26 16:32:19 crc kubenswrapper[4907]: I0226 16:32:19.902677 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/tempest-tests-tempest" event={"ID":"b09f4f4d-8644-4923-ab26-849b249efd4e","Type":"ContainerStarted","Data":"b29969f448302f851d6e52e5dc17f917fb3d586f2bbd97cecbba721765f6853b"} Feb 26 16:32:30 crc kubenswrapper[4907]: I0226 16:32:30.127433 4907 scope.go:117] "RemoveContainer" containerID="6906cab653cd658cba31211ccc435500afa0d86f92cee413c3d24942f2acd8bd" Feb 26 16:32:30 crc kubenswrapper[4907]: E0226 16:32:30.128293 4907 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-v5ng6_openshift-machine-config-operator(917eebf3-db36-47b8-af0a-b80d042fddab)\"" pod="openshift-machine-config-operator/machine-config-daemon-v5ng6" podUID="917eebf3-db36-47b8-af0a-b80d042fddab" Feb 26 16:32:41 crc kubenswrapper[4907]: I0226 16:32:41.126791 4907 scope.go:117] "RemoveContainer" containerID="6906cab653cd658cba31211ccc435500afa0d86f92cee413c3d24942f2acd8bd" Feb 26 16:32:41 crc kubenswrapper[4907]: E0226 16:32:41.127549 4907 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for 
\"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-v5ng6_openshift-machine-config-operator(917eebf3-db36-47b8-af0a-b80d042fddab)\"" pod="openshift-machine-config-operator/machine-config-daemon-v5ng6" podUID="917eebf3-db36-47b8-af0a-b80d042fddab" Feb 26 16:32:54 crc kubenswrapper[4907]: I0226 16:32:54.128983 4907 scope.go:117] "RemoveContainer" containerID="6906cab653cd658cba31211ccc435500afa0d86f92cee413c3d24942f2acd8bd" Feb 26 16:32:54 crc kubenswrapper[4907]: E0226 16:32:54.129750 4907 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-v5ng6_openshift-machine-config-operator(917eebf3-db36-47b8-af0a-b80d042fddab)\"" pod="openshift-machine-config-operator/machine-config-daemon-v5ng6" podUID="917eebf3-db36-47b8-af0a-b80d042fddab" Feb 26 16:32:55 crc kubenswrapper[4907]: E0226 16:32:55.166089 4907 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/podified-antelope-centos9/openstack-tempest-all:current-podified" Feb 26 16:32:55 crc kubenswrapper[4907]: E0226 16:32:55.172142 4907 kuberuntime_manager.go:1274] "Unhandled Error" err="container 
&Container{Name:tempest-tests-tempest-tests-runner,Image:quay.io/podified-antelope-centos9/openstack-tempest-all:current-podified,Command:[],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:config-data,ReadOnly:false,MountPath:/etc/test_operator,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:test-operator-ephemeral-workdir,ReadOnly:false,MountPath:/var/lib/tempest,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:test-operator-ephemeral-temporary,ReadOnly:false,MountPath:/tmp,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:test-operator-logs,ReadOnly:false,MountPath:/var/lib/tempest/external_files,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:openstack-config,ReadOnly:true,MountPath:/etc/openstack/clouds.yaml,SubPath:clouds.yaml,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:openstack-config,ReadOnly:true,MountPath:/var/lib/tempest/.config/openstack/clouds.yaml,SubPath:clouds.yaml,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:openstack-config-secret,ReadOnly:false,MountPath:/etc/openstack/secure.yaml,SubPath:secure.yaml,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:ca-certs,ReadOnly:true,MountPath:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem,SubPath:tls-ca-bundle.pem,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:ssh-key,ReadOnly:false,MountPath:/var/lib/tempest/id_ecdsa,SubPath:ssh_key,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-zhp9c,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:n
il,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*42480,RunAsNonRoot:*false,ReadOnlyRootFilesystem:*false,AllowPrivilegeEscalation:*true,RunAsGroup:*42480,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{EnvFromSource{Prefix:,ConfigMapRef:&ConfigMapEnvSource{LocalObjectReference:LocalObjectReference{Name:tempest-tests-tempest-custom-data-s0,},Optional:nil,},SecretRef:nil,},EnvFromSource{Prefix:,ConfigMapRef:&ConfigMapEnvSource{LocalObjectReference:LocalObjectReference{Name:tempest-tests-tempest-env-vars-s0,},Optional:nil,},SecretRef:nil,},},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod tempest-tests-tempest_openstack(b09f4f4d-8644-4923-ab26-849b249efd4e): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Feb 26 16:32:55 crc kubenswrapper[4907]: E0226 16:32:55.173510 4907 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"tempest-tests-tempest-tests-runner\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack/tempest-tests-tempest" podUID="b09f4f4d-8644-4923-ab26-849b249efd4e" Feb 26 16:32:55 crc kubenswrapper[4907]: E0226 16:32:55.353858 4907 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"tempest-tests-tempest-tests-runner\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/podified-antelope-centos9/openstack-tempest-all:current-podified\\\"\"" pod="openstack/tempest-tests-tempest" podUID="b09f4f4d-8644-4923-ab26-849b249efd4e" Feb 26 16:33:07 crc 
kubenswrapper[4907]: I0226 16:33:07.127355 4907 scope.go:117] "RemoveContainer" containerID="6906cab653cd658cba31211ccc435500afa0d86f92cee413c3d24942f2acd8bd" Feb 26 16:33:07 crc kubenswrapper[4907]: E0226 16:33:07.127975 4907 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-v5ng6_openshift-machine-config-operator(917eebf3-db36-47b8-af0a-b80d042fddab)\"" pod="openshift-machine-config-operator/machine-config-daemon-v5ng6" podUID="917eebf3-db36-47b8-af0a-b80d042fddab" Feb 26 16:33:08 crc kubenswrapper[4907]: I0226 16:33:08.493770 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/tempest-tests-tempest" event={"ID":"b09f4f4d-8644-4923-ab26-849b249efd4e","Type":"ContainerStarted","Data":"c116b18fcd326f5904a54768ebbb024d441f6470af2b366848af836d8078faf7"} Feb 26 16:33:08 crc kubenswrapper[4907]: I0226 16:33:08.537055 4907 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/tempest-tests-tempest" podStartSLOduration=4.34969044 podStartE2EDuration="51.537033957s" podCreationTimestamp="2026-02-26 16:32:17 +0000 UTC" firstStartedPulling="2026-02-26 16:32:19.368030935 +0000 UTC m=+3001.886592784" lastFinishedPulling="2026-02-26 16:33:06.555374452 +0000 UTC m=+3049.073936301" observedRunningTime="2026-02-26 16:33:08.521828375 +0000 UTC m=+3051.040390234" watchObservedRunningTime="2026-02-26 16:33:08.537033957 +0000 UTC m=+3051.055595846" Feb 26 16:33:19 crc kubenswrapper[4907]: I0226 16:33:19.127043 4907 scope.go:117] "RemoveContainer" containerID="6906cab653cd658cba31211ccc435500afa0d86f92cee413c3d24942f2acd8bd" Feb 26 16:33:19 crc kubenswrapper[4907]: E0226 16:33:19.127842 4907 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed 
container=machine-config-daemon pod=machine-config-daemon-v5ng6_openshift-machine-config-operator(917eebf3-db36-47b8-af0a-b80d042fddab)\"" pod="openshift-machine-config-operator/machine-config-daemon-v5ng6" podUID="917eebf3-db36-47b8-af0a-b80d042fddab" Feb 26 16:33:31 crc kubenswrapper[4907]: I0226 16:33:31.126889 4907 scope.go:117] "RemoveContainer" containerID="6906cab653cd658cba31211ccc435500afa0d86f92cee413c3d24942f2acd8bd" Feb 26 16:33:31 crc kubenswrapper[4907]: E0226 16:33:31.127730 4907 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-v5ng6_openshift-machine-config-operator(917eebf3-db36-47b8-af0a-b80d042fddab)\"" pod="openshift-machine-config-operator/machine-config-daemon-v5ng6" podUID="917eebf3-db36-47b8-af0a-b80d042fddab" Feb 26 16:33:42 crc kubenswrapper[4907]: I0226 16:33:42.839290 4907 generic.go:334] "Generic (PLEG): container finished" podID="b09f4f4d-8644-4923-ab26-849b249efd4e" containerID="c116b18fcd326f5904a54768ebbb024d441f6470af2b366848af836d8078faf7" exitCode=123 Feb 26 16:33:42 crc kubenswrapper[4907]: I0226 16:33:42.839414 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/tempest-tests-tempest" event={"ID":"b09f4f4d-8644-4923-ab26-849b249efd4e","Type":"ContainerDied","Data":"c116b18fcd326f5904a54768ebbb024d441f6470af2b366848af836d8078faf7"} Feb 26 16:33:43 crc kubenswrapper[4907]: I0226 16:33:43.127121 4907 scope.go:117] "RemoveContainer" containerID="6906cab653cd658cba31211ccc435500afa0d86f92cee413c3d24942f2acd8bd" Feb 26 16:33:43 crc kubenswrapper[4907]: E0226 16:33:43.127433 4907 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon 
pod=machine-config-daemon-v5ng6_openshift-machine-config-operator(917eebf3-db36-47b8-af0a-b80d042fddab)\"" pod="openshift-machine-config-operator/machine-config-daemon-v5ng6" podUID="917eebf3-db36-47b8-af0a-b80d042fddab" Feb 26 16:33:44 crc kubenswrapper[4907]: I0226 16:33:44.284620 4907 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/tempest-tests-tempest" Feb 26 16:33:44 crc kubenswrapper[4907]: I0226 16:33:44.405938 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"test-operator-ephemeral-workdir\" (UniqueName: \"kubernetes.io/empty-dir/b09f4f4d-8644-4923-ab26-849b249efd4e-test-operator-ephemeral-workdir\") pod \"b09f4f4d-8644-4923-ab26-849b249efd4e\" (UID: \"b09f4f4d-8644-4923-ab26-849b249efd4e\") " Feb 26 16:33:44 crc kubenswrapper[4907]: I0226 16:33:44.406010 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ca-certs\" (UniqueName: \"kubernetes.io/secret/b09f4f4d-8644-4923-ab26-849b249efd4e-ca-certs\") pod \"b09f4f4d-8644-4923-ab26-849b249efd4e\" (UID: \"b09f4f4d-8644-4923-ab26-849b249efd4e\") " Feb 26 16:33:44 crc kubenswrapper[4907]: I0226 16:33:44.406146 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zhp9c\" (UniqueName: \"kubernetes.io/projected/b09f4f4d-8644-4923-ab26-849b249efd4e-kube-api-access-zhp9c\") pod \"b09f4f4d-8644-4923-ab26-849b249efd4e\" (UID: \"b09f4f4d-8644-4923-ab26-849b249efd4e\") " Feb 26 16:33:44 crc kubenswrapper[4907]: I0226 16:33:44.406161 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/b09f4f4d-8644-4923-ab26-849b249efd4e-openstack-config\") pod \"b09f4f4d-8644-4923-ab26-849b249efd4e\" (UID: \"b09f4f4d-8644-4923-ab26-849b249efd4e\") " Feb 26 16:33:44 crc kubenswrapper[4907]: I0226 16:33:44.406206 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for 
volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/b09f4f4d-8644-4923-ab26-849b249efd4e-openstack-config-secret\") pod \"b09f4f4d-8644-4923-ab26-849b249efd4e\" (UID: \"b09f4f4d-8644-4923-ab26-849b249efd4e\") " Feb 26 16:33:44 crc kubenswrapper[4907]: I0226 16:33:44.406264 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"test-operator-ephemeral-temporary\" (UniqueName: \"kubernetes.io/empty-dir/b09f4f4d-8644-4923-ab26-849b249efd4e-test-operator-ephemeral-temporary\") pod \"b09f4f4d-8644-4923-ab26-849b249efd4e\" (UID: \"b09f4f4d-8644-4923-ab26-849b249efd4e\") " Feb 26 16:33:44 crc kubenswrapper[4907]: I0226 16:33:44.406301 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/b09f4f4d-8644-4923-ab26-849b249efd4e-ssh-key\") pod \"b09f4f4d-8644-4923-ab26-849b249efd4e\" (UID: \"b09f4f4d-8644-4923-ab26-849b249efd4e\") " Feb 26 16:33:44 crc kubenswrapper[4907]: I0226 16:33:44.406315 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"test-operator-logs\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") pod \"b09f4f4d-8644-4923-ab26-849b249efd4e\" (UID: \"b09f4f4d-8644-4923-ab26-849b249efd4e\") " Feb 26 16:33:44 crc kubenswrapper[4907]: I0226 16:33:44.406354 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/b09f4f4d-8644-4923-ab26-849b249efd4e-config-data\") pod \"b09f4f4d-8644-4923-ab26-849b249efd4e\" (UID: \"b09f4f4d-8644-4923-ab26-849b249efd4e\") " Feb 26 16:33:44 crc kubenswrapper[4907]: I0226 16:33:44.407027 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b09f4f4d-8644-4923-ab26-849b249efd4e-test-operator-ephemeral-temporary" (OuterVolumeSpecName: "test-operator-ephemeral-temporary") pod "b09f4f4d-8644-4923-ab26-849b249efd4e" (UID: 
"b09f4f4d-8644-4923-ab26-849b249efd4e"). InnerVolumeSpecName "test-operator-ephemeral-temporary". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 26 16:33:44 crc kubenswrapper[4907]: I0226 16:33:44.407629 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b09f4f4d-8644-4923-ab26-849b249efd4e-config-data" (OuterVolumeSpecName: "config-data") pod "b09f4f4d-8644-4923-ab26-849b249efd4e" (UID: "b09f4f4d-8644-4923-ab26-849b249efd4e"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 26 16:33:44 crc kubenswrapper[4907]: I0226 16:33:44.408048 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b09f4f4d-8644-4923-ab26-849b249efd4e-test-operator-ephemeral-workdir" (OuterVolumeSpecName: "test-operator-ephemeral-workdir") pod "b09f4f4d-8644-4923-ab26-849b249efd4e" (UID: "b09f4f4d-8644-4923-ab26-849b249efd4e"). InnerVolumeSpecName "test-operator-ephemeral-workdir". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 26 16:33:44 crc kubenswrapper[4907]: I0226 16:33:44.410872 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage09-crc" (OuterVolumeSpecName: "test-operator-logs") pod "b09f4f4d-8644-4923-ab26-849b249efd4e" (UID: "b09f4f4d-8644-4923-ab26-849b249efd4e"). InnerVolumeSpecName "local-storage09-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue "" Feb 26 16:33:44 crc kubenswrapper[4907]: I0226 16:33:44.412044 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b09f4f4d-8644-4923-ab26-849b249efd4e-kube-api-access-zhp9c" (OuterVolumeSpecName: "kube-api-access-zhp9c") pod "b09f4f4d-8644-4923-ab26-849b249efd4e" (UID: "b09f4f4d-8644-4923-ab26-849b249efd4e"). InnerVolumeSpecName "kube-api-access-zhp9c". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 26 16:33:44 crc kubenswrapper[4907]: I0226 16:33:44.435074 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b09f4f4d-8644-4923-ab26-849b249efd4e-openstack-config-secret" (OuterVolumeSpecName: "openstack-config-secret") pod "b09f4f4d-8644-4923-ab26-849b249efd4e" (UID: "b09f4f4d-8644-4923-ab26-849b249efd4e"). InnerVolumeSpecName "openstack-config-secret". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 26 16:33:44 crc kubenswrapper[4907]: I0226 16:33:44.436720 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b09f4f4d-8644-4923-ab26-849b249efd4e-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "b09f4f4d-8644-4923-ab26-849b249efd4e" (UID: "b09f4f4d-8644-4923-ab26-849b249efd4e"). InnerVolumeSpecName "ssh-key". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 26 16:33:44 crc kubenswrapper[4907]: I0226 16:33:44.441820 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b09f4f4d-8644-4923-ab26-849b249efd4e-ca-certs" (OuterVolumeSpecName: "ca-certs") pod "b09f4f4d-8644-4923-ab26-849b249efd4e" (UID: "b09f4f4d-8644-4923-ab26-849b249efd4e"). InnerVolumeSpecName "ca-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 26 16:33:44 crc kubenswrapper[4907]: I0226 16:33:44.471280 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b09f4f4d-8644-4923-ab26-849b249efd4e-openstack-config" (OuterVolumeSpecName: "openstack-config") pod "b09f4f4d-8644-4923-ab26-849b249efd4e" (UID: "b09f4f4d-8644-4923-ab26-849b249efd4e"). InnerVolumeSpecName "openstack-config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 26 16:33:44 crc kubenswrapper[4907]: I0226 16:33:44.508727 4907 reconciler_common.go:293] "Volume detached for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/b09f4f4d-8644-4923-ab26-849b249efd4e-openstack-config-secret\") on node \"crc\" DevicePath \"\"" Feb 26 16:33:44 crc kubenswrapper[4907]: I0226 16:33:44.508759 4907 reconciler_common.go:293] "Volume detached for volume \"test-operator-ephemeral-temporary\" (UniqueName: \"kubernetes.io/empty-dir/b09f4f4d-8644-4923-ab26-849b249efd4e-test-operator-ephemeral-temporary\") on node \"crc\" DevicePath \"\"" Feb 26 16:33:44 crc kubenswrapper[4907]: I0226 16:33:44.508773 4907 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/b09f4f4d-8644-4923-ab26-849b249efd4e-ssh-key\") on node \"crc\" DevicePath \"\"" Feb 26 16:33:44 crc kubenswrapper[4907]: I0226 16:33:44.508801 4907 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage09-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") on node \"crc\" " Feb 26 16:33:44 crc kubenswrapper[4907]: I0226 16:33:44.508815 4907 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/b09f4f4d-8644-4923-ab26-849b249efd4e-config-data\") on node \"crc\" DevicePath \"\"" Feb 26 16:33:44 crc kubenswrapper[4907]: I0226 16:33:44.508829 4907 reconciler_common.go:293] "Volume detached for volume \"test-operator-ephemeral-workdir\" (UniqueName: \"kubernetes.io/empty-dir/b09f4f4d-8644-4923-ab26-849b249efd4e-test-operator-ephemeral-workdir\") on node \"crc\" DevicePath \"\"" Feb 26 16:33:44 crc kubenswrapper[4907]: I0226 16:33:44.508842 4907 reconciler_common.go:293] "Volume detached for volume \"ca-certs\" (UniqueName: \"kubernetes.io/secret/b09f4f4d-8644-4923-ab26-849b249efd4e-ca-certs\") on node \"crc\" DevicePath \"\"" Feb 26 16:33:44 crc 
kubenswrapper[4907]: I0226 16:33:44.508853 4907 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zhp9c\" (UniqueName: \"kubernetes.io/projected/b09f4f4d-8644-4923-ab26-849b249efd4e-kube-api-access-zhp9c\") on node \"crc\" DevicePath \"\"" Feb 26 16:33:44 crc kubenswrapper[4907]: I0226 16:33:44.508863 4907 reconciler_common.go:293] "Volume detached for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/b09f4f4d-8644-4923-ab26-849b249efd4e-openstack-config\") on node \"crc\" DevicePath \"\"" Feb 26 16:33:44 crc kubenswrapper[4907]: I0226 16:33:44.530183 4907 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage09-crc" (UniqueName: "kubernetes.io/local-volume/local-storage09-crc") on node "crc" Feb 26 16:33:44 crc kubenswrapper[4907]: I0226 16:33:44.611990 4907 reconciler_common.go:293] "Volume detached for volume \"local-storage09-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") on node \"crc\" DevicePath \"\"" Feb 26 16:33:44 crc kubenswrapper[4907]: I0226 16:33:44.859355 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/tempest-tests-tempest" event={"ID":"b09f4f4d-8644-4923-ab26-849b249efd4e","Type":"ContainerDied","Data":"b29969f448302f851d6e52e5dc17f917fb3d586f2bbd97cecbba721765f6853b"} Feb 26 16:33:44 crc kubenswrapper[4907]: I0226 16:33:44.859392 4907 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="b29969f448302f851d6e52e5dc17f917fb3d586f2bbd97cecbba721765f6853b" Feb 26 16:33:44 crc kubenswrapper[4907]: I0226 16:33:44.859452 4907 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/tempest-tests-tempest" Feb 26 16:33:48 crc kubenswrapper[4907]: I0226 16:33:48.137225 4907 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-2jrgb"] Feb 26 16:33:48 crc kubenswrapper[4907]: E0226 16:33:48.137909 4907 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b09f4f4d-8644-4923-ab26-849b249efd4e" containerName="tempest-tests-tempest-tests-runner" Feb 26 16:33:48 crc kubenswrapper[4907]: I0226 16:33:48.137924 4907 state_mem.go:107] "Deleted CPUSet assignment" podUID="b09f4f4d-8644-4923-ab26-849b249efd4e" containerName="tempest-tests-tempest-tests-runner" Feb 26 16:33:48 crc kubenswrapper[4907]: I0226 16:33:48.138498 4907 memory_manager.go:354] "RemoveStaleState removing state" podUID="b09f4f4d-8644-4923-ab26-849b249efd4e" containerName="tempest-tests-tempest-tests-runner" Feb 26 16:33:48 crc kubenswrapper[4907]: I0226 16:33:48.140826 4907 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-2jrgb"] Feb 26 16:33:48 crc kubenswrapper[4907]: I0226 16:33:48.140921 4907 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-2jrgb" Feb 26 16:33:48 crc kubenswrapper[4907]: I0226 16:33:48.184925 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vp6hq\" (UniqueName: \"kubernetes.io/projected/56f6c092-f3e0-4d9c-97e6-2a61de18bab5-kube-api-access-vp6hq\") pod \"certified-operators-2jrgb\" (UID: \"56f6c092-f3e0-4d9c-97e6-2a61de18bab5\") " pod="openshift-marketplace/certified-operators-2jrgb" Feb 26 16:33:48 crc kubenswrapper[4907]: I0226 16:33:48.185292 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/56f6c092-f3e0-4d9c-97e6-2a61de18bab5-catalog-content\") pod \"certified-operators-2jrgb\" (UID: \"56f6c092-f3e0-4d9c-97e6-2a61de18bab5\") " pod="openshift-marketplace/certified-operators-2jrgb" Feb 26 16:33:48 crc kubenswrapper[4907]: I0226 16:33:48.185331 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/56f6c092-f3e0-4d9c-97e6-2a61de18bab5-utilities\") pod \"certified-operators-2jrgb\" (UID: \"56f6c092-f3e0-4d9c-97e6-2a61de18bab5\") " pod="openshift-marketplace/certified-operators-2jrgb" Feb 26 16:33:48 crc kubenswrapper[4907]: I0226 16:33:48.286955 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vp6hq\" (UniqueName: \"kubernetes.io/projected/56f6c092-f3e0-4d9c-97e6-2a61de18bab5-kube-api-access-vp6hq\") pod \"certified-operators-2jrgb\" (UID: \"56f6c092-f3e0-4d9c-97e6-2a61de18bab5\") " pod="openshift-marketplace/certified-operators-2jrgb" Feb 26 16:33:48 crc kubenswrapper[4907]: I0226 16:33:48.287005 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/56f6c092-f3e0-4d9c-97e6-2a61de18bab5-catalog-content\") pod 
\"certified-operators-2jrgb\" (UID: \"56f6c092-f3e0-4d9c-97e6-2a61de18bab5\") " pod="openshift-marketplace/certified-operators-2jrgb" Feb 26 16:33:48 crc kubenswrapper[4907]: I0226 16:33:48.287035 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/56f6c092-f3e0-4d9c-97e6-2a61de18bab5-utilities\") pod \"certified-operators-2jrgb\" (UID: \"56f6c092-f3e0-4d9c-97e6-2a61de18bab5\") " pod="openshift-marketplace/certified-operators-2jrgb" Feb 26 16:33:48 crc kubenswrapper[4907]: I0226 16:33:48.287547 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/56f6c092-f3e0-4d9c-97e6-2a61de18bab5-catalog-content\") pod \"certified-operators-2jrgb\" (UID: \"56f6c092-f3e0-4d9c-97e6-2a61de18bab5\") " pod="openshift-marketplace/certified-operators-2jrgb" Feb 26 16:33:48 crc kubenswrapper[4907]: I0226 16:33:48.287582 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/56f6c092-f3e0-4d9c-97e6-2a61de18bab5-utilities\") pod \"certified-operators-2jrgb\" (UID: \"56f6c092-f3e0-4d9c-97e6-2a61de18bab5\") " pod="openshift-marketplace/certified-operators-2jrgb" Feb 26 16:33:48 crc kubenswrapper[4907]: I0226 16:33:48.314569 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vp6hq\" (UniqueName: \"kubernetes.io/projected/56f6c092-f3e0-4d9c-97e6-2a61de18bab5-kube-api-access-vp6hq\") pod \"certified-operators-2jrgb\" (UID: \"56f6c092-f3e0-4d9c-97e6-2a61de18bab5\") " pod="openshift-marketplace/certified-operators-2jrgb" Feb 26 16:33:48 crc kubenswrapper[4907]: I0226 16:33:48.470450 4907 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-2jrgb" Feb 26 16:33:49 crc kubenswrapper[4907]: I0226 16:33:49.004779 4907 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-2jrgb"] Feb 26 16:33:49 crc kubenswrapper[4907]: I0226 16:33:49.909130 4907 generic.go:334] "Generic (PLEG): container finished" podID="56f6c092-f3e0-4d9c-97e6-2a61de18bab5" containerID="6b9640ce05d6bb681f9d8f8cd66b78238f79e7c8a2ac92e5668d5e69bb7a861b" exitCode=0 Feb 26 16:33:49 crc kubenswrapper[4907]: I0226 16:33:49.909309 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-2jrgb" event={"ID":"56f6c092-f3e0-4d9c-97e6-2a61de18bab5","Type":"ContainerDied","Data":"6b9640ce05d6bb681f9d8f8cd66b78238f79e7c8a2ac92e5668d5e69bb7a861b"} Feb 26 16:33:49 crc kubenswrapper[4907]: I0226 16:33:49.909447 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-2jrgb" event={"ID":"56f6c092-f3e0-4d9c-97e6-2a61de18bab5","Type":"ContainerStarted","Data":"9c201f4b0a67fd36c94c02416572378871bb8786b2f9604920ddb404473721da"} Feb 26 16:33:50 crc kubenswrapper[4907]: I0226 16:33:50.919548 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-2jrgb" event={"ID":"56f6c092-f3e0-4d9c-97e6-2a61de18bab5","Type":"ContainerStarted","Data":"318280f0d4fcc7d1362b16964385015a1970d65a34c486895062e97bdc3baf2c"} Feb 26 16:33:52 crc kubenswrapper[4907]: I0226 16:33:52.936966 4907 generic.go:334] "Generic (PLEG): container finished" podID="56f6c092-f3e0-4d9c-97e6-2a61de18bab5" containerID="318280f0d4fcc7d1362b16964385015a1970d65a34c486895062e97bdc3baf2c" exitCode=0 Feb 26 16:33:52 crc kubenswrapper[4907]: I0226 16:33:52.937274 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-2jrgb" 
event={"ID":"56f6c092-f3e0-4d9c-97e6-2a61de18bab5","Type":"ContainerDied","Data":"318280f0d4fcc7d1362b16964385015a1970d65a34c486895062e97bdc3baf2c"} Feb 26 16:33:53 crc kubenswrapper[4907]: I0226 16:33:53.949796 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-2jrgb" event={"ID":"56f6c092-f3e0-4d9c-97e6-2a61de18bab5","Type":"ContainerStarted","Data":"60a2b38cd22122261c649beb41c46afc63220893fc0f57b166a950db1dcdf52d"} Feb 26 16:33:53 crc kubenswrapper[4907]: I0226 16:33:53.972965 4907 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-2jrgb" podStartSLOduration=2.290849147 podStartE2EDuration="5.972945731s" podCreationTimestamp="2026-02-26 16:33:48 +0000 UTC" firstStartedPulling="2026-02-26 16:33:49.913874469 +0000 UTC m=+3092.432436318" lastFinishedPulling="2026-02-26 16:33:53.595971053 +0000 UTC m=+3096.114532902" observedRunningTime="2026-02-26 16:33:53.966618526 +0000 UTC m=+3096.485180375" watchObservedRunningTime="2026-02-26 16:33:53.972945731 +0000 UTC m=+3096.491507570" Feb 26 16:33:56 crc kubenswrapper[4907]: I0226 16:33:56.065636 4907 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/test-operator-logs-pod-tempest-tempest-tests-tempest"] Feb 26 16:33:56 crc kubenswrapper[4907]: I0226 16:33:56.068339 4907 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest" Feb 26 16:33:56 crc kubenswrapper[4907]: I0226 16:33:56.071690 4907 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"default-dockercfg-bhz5c" Feb 26 16:33:56 crc kubenswrapper[4907]: I0226 16:33:56.079154 4907 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/test-operator-logs-pod-tempest-tempest-tests-tempest"] Feb 26 16:33:56 crc kubenswrapper[4907]: I0226 16:33:56.127098 4907 scope.go:117] "RemoveContainer" containerID="6906cab653cd658cba31211ccc435500afa0d86f92cee413c3d24942f2acd8bd" Feb 26 16:33:56 crc kubenswrapper[4907]: E0226 16:33:56.127378 4907 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-v5ng6_openshift-machine-config-operator(917eebf3-db36-47b8-af0a-b80d042fddab)\"" pod="openshift-machine-config-operator/machine-config-daemon-v5ng6" podUID="917eebf3-db36-47b8-af0a-b80d042fddab" Feb 26 16:33:56 crc kubenswrapper[4907]: I0226 16:33:56.140138 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nwwx5\" (UniqueName: \"kubernetes.io/projected/e10c0113-d917-4c4a-be56-0e234e61e744-kube-api-access-nwwx5\") pod \"test-operator-logs-pod-tempest-tempest-tests-tempest\" (UID: \"e10c0113-d917-4c4a-be56-0e234e61e744\") " pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest" Feb 26 16:33:56 crc kubenswrapper[4907]: I0226 16:33:56.140455 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage09-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") pod \"test-operator-logs-pod-tempest-tempest-tests-tempest\" (UID: \"e10c0113-d917-4c4a-be56-0e234e61e744\") " 
pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest" Feb 26 16:33:56 crc kubenswrapper[4907]: I0226 16:33:56.242867 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nwwx5\" (UniqueName: \"kubernetes.io/projected/e10c0113-d917-4c4a-be56-0e234e61e744-kube-api-access-nwwx5\") pod \"test-operator-logs-pod-tempest-tempest-tests-tempest\" (UID: \"e10c0113-d917-4c4a-be56-0e234e61e744\") " pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest" Feb 26 16:33:56 crc kubenswrapper[4907]: I0226 16:33:56.242980 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage09-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") pod \"test-operator-logs-pod-tempest-tempest-tests-tempest\" (UID: \"e10c0113-d917-4c4a-be56-0e234e61e744\") " pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest" Feb 26 16:33:56 crc kubenswrapper[4907]: I0226 16:33:56.244328 4907 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage09-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") pod \"test-operator-logs-pod-tempest-tempest-tests-tempest\" (UID: \"e10c0113-d917-4c4a-be56-0e234e61e744\") device mount path \"/mnt/openstack/pv09\"" pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest" Feb 26 16:33:56 crc kubenswrapper[4907]: I0226 16:33:56.271933 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nwwx5\" (UniqueName: \"kubernetes.io/projected/e10c0113-d917-4c4a-be56-0e234e61e744-kube-api-access-nwwx5\") pod \"test-operator-logs-pod-tempest-tempest-tests-tempest\" (UID: \"e10c0113-d917-4c4a-be56-0e234e61e744\") " pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest" Feb 26 16:33:56 crc kubenswrapper[4907]: I0226 16:33:56.277287 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage09-crc\" 
(UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") pod \"test-operator-logs-pod-tempest-tempest-tests-tempest\" (UID: \"e10c0113-d917-4c4a-be56-0e234e61e744\") " pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest" Feb 26 16:33:56 crc kubenswrapper[4907]: I0226 16:33:56.395685 4907 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest" Feb 26 16:33:56 crc kubenswrapper[4907]: I0226 16:33:56.826548 4907 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/test-operator-logs-pod-tempest-tempest-tests-tempest"] Feb 26 16:33:56 crc kubenswrapper[4907]: I0226 16:33:56.980178 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest" event={"ID":"e10c0113-d917-4c4a-be56-0e234e61e744","Type":"ContainerStarted","Data":"37ff7932a3a86aa00849926d02c47bd205d3d235bff1358eb12003c33de492a6"} Feb 26 16:33:58 crc kubenswrapper[4907]: I0226 16:33:58.003046 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest" event={"ID":"e10c0113-d917-4c4a-be56-0e234e61e744","Type":"ContainerStarted","Data":"87ddfd356b2e8d87d290b0e3f1d3b1e984a551794fca8596eda5726c36e94c6d"} Feb 26 16:33:58 crc kubenswrapper[4907]: I0226 16:33:58.035659 4907 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest" podStartSLOduration=1.183623999 podStartE2EDuration="2.03557921s" podCreationTimestamp="2026-02-26 16:33:56 +0000 UTC" firstStartedPulling="2026-02-26 16:33:56.832877642 +0000 UTC m=+3099.351439491" lastFinishedPulling="2026-02-26 16:33:57.684832853 +0000 UTC m=+3100.203394702" observedRunningTime="2026-02-26 16:33:58.021258559 +0000 UTC m=+3100.539820478" watchObservedRunningTime="2026-02-26 16:33:58.03557921 +0000 UTC m=+3100.554141099" Feb 26 16:33:58 crc 
kubenswrapper[4907]: I0226 16:33:58.471129 4907 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-2jrgb" Feb 26 16:33:58 crc kubenswrapper[4907]: I0226 16:33:58.471267 4907 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-2jrgb" Feb 26 16:33:58 crc kubenswrapper[4907]: I0226 16:33:58.564696 4907 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-2jrgb" Feb 26 16:33:59 crc kubenswrapper[4907]: I0226 16:33:59.066293 4907 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-2jrgb" Feb 26 16:34:00 crc kubenswrapper[4907]: I0226 16:34:00.152950 4907 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29535394-bp54j"] Feb 26 16:34:00 crc kubenswrapper[4907]: I0226 16:34:00.154399 4907 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29535394-bp54j" Feb 26 16:34:00 crc kubenswrapper[4907]: I0226 16:34:00.158074 4907 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Feb 26 16:34:00 crc kubenswrapper[4907]: I0226 16:34:00.158412 4907 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-n2mrp" Feb 26 16:34:00 crc kubenswrapper[4907]: I0226 16:34:00.161414 4907 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Feb 26 16:34:00 crc kubenswrapper[4907]: I0226 16:34:00.164212 4907 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29535394-bp54j"] Feb 26 16:34:00 crc kubenswrapper[4907]: I0226 16:34:00.223867 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vlcw4\" (UniqueName: \"kubernetes.io/projected/bfe9cc80-ce34-4697-863f-3da3548bfc20-kube-api-access-vlcw4\") pod \"auto-csr-approver-29535394-bp54j\" (UID: \"bfe9cc80-ce34-4697-863f-3da3548bfc20\") " pod="openshift-infra/auto-csr-approver-29535394-bp54j" Feb 26 16:34:00 crc kubenswrapper[4907]: I0226 16:34:00.327373 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vlcw4\" (UniqueName: \"kubernetes.io/projected/bfe9cc80-ce34-4697-863f-3da3548bfc20-kube-api-access-vlcw4\") pod \"auto-csr-approver-29535394-bp54j\" (UID: \"bfe9cc80-ce34-4697-863f-3da3548bfc20\") " pod="openshift-infra/auto-csr-approver-29535394-bp54j" Feb 26 16:34:00 crc kubenswrapper[4907]: I0226 16:34:00.351250 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vlcw4\" (UniqueName: \"kubernetes.io/projected/bfe9cc80-ce34-4697-863f-3da3548bfc20-kube-api-access-vlcw4\") pod \"auto-csr-approver-29535394-bp54j\" (UID: \"bfe9cc80-ce34-4697-863f-3da3548bfc20\") " 
pod="openshift-infra/auto-csr-approver-29535394-bp54j" Feb 26 16:34:00 crc kubenswrapper[4907]: I0226 16:34:00.472068 4907 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29535394-bp54j" Feb 26 16:34:00 crc kubenswrapper[4907]: I0226 16:34:00.931303 4907 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29535394-bp54j"] Feb 26 16:34:01 crc kubenswrapper[4907]: I0226 16:34:01.032815 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29535394-bp54j" event={"ID":"bfe9cc80-ce34-4697-863f-3da3548bfc20","Type":"ContainerStarted","Data":"ddeb12af411ec84c235e634fd661382d1f497011f35fd043c6233fcd6a38805b"} Feb 26 16:34:02 crc kubenswrapper[4907]: I0226 16:34:02.287237 4907 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-2jrgb"] Feb 26 16:34:02 crc kubenswrapper[4907]: I0226 16:34:02.288098 4907 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-2jrgb" podUID="56f6c092-f3e0-4d9c-97e6-2a61de18bab5" containerName="registry-server" containerID="cri-o://60a2b38cd22122261c649beb41c46afc63220893fc0f57b166a950db1dcdf52d" gracePeriod=2 Feb 26 16:34:02 crc kubenswrapper[4907]: I0226 16:34:02.801137 4907 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-2jrgb" Feb 26 16:34:02 crc kubenswrapper[4907]: I0226 16:34:02.878460 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vp6hq\" (UniqueName: \"kubernetes.io/projected/56f6c092-f3e0-4d9c-97e6-2a61de18bab5-kube-api-access-vp6hq\") pod \"56f6c092-f3e0-4d9c-97e6-2a61de18bab5\" (UID: \"56f6c092-f3e0-4d9c-97e6-2a61de18bab5\") " Feb 26 16:34:02 crc kubenswrapper[4907]: I0226 16:34:02.878793 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/56f6c092-f3e0-4d9c-97e6-2a61de18bab5-utilities\") pod \"56f6c092-f3e0-4d9c-97e6-2a61de18bab5\" (UID: \"56f6c092-f3e0-4d9c-97e6-2a61de18bab5\") " Feb 26 16:34:02 crc kubenswrapper[4907]: I0226 16:34:02.878850 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/56f6c092-f3e0-4d9c-97e6-2a61de18bab5-catalog-content\") pod \"56f6c092-f3e0-4d9c-97e6-2a61de18bab5\" (UID: \"56f6c092-f3e0-4d9c-97e6-2a61de18bab5\") " Feb 26 16:34:02 crc kubenswrapper[4907]: I0226 16:34:02.881153 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/56f6c092-f3e0-4d9c-97e6-2a61de18bab5-utilities" (OuterVolumeSpecName: "utilities") pod "56f6c092-f3e0-4d9c-97e6-2a61de18bab5" (UID: "56f6c092-f3e0-4d9c-97e6-2a61de18bab5"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 26 16:34:02 crc kubenswrapper[4907]: I0226 16:34:02.895766 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/56f6c092-f3e0-4d9c-97e6-2a61de18bab5-kube-api-access-vp6hq" (OuterVolumeSpecName: "kube-api-access-vp6hq") pod "56f6c092-f3e0-4d9c-97e6-2a61de18bab5" (UID: "56f6c092-f3e0-4d9c-97e6-2a61de18bab5"). InnerVolumeSpecName "kube-api-access-vp6hq". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 26 16:34:02 crc kubenswrapper[4907]: I0226 16:34:02.941829 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/56f6c092-f3e0-4d9c-97e6-2a61de18bab5-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "56f6c092-f3e0-4d9c-97e6-2a61de18bab5" (UID: "56f6c092-f3e0-4d9c-97e6-2a61de18bab5"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 26 16:34:02 crc kubenswrapper[4907]: I0226 16:34:02.980897 4907 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/56f6c092-f3e0-4d9c-97e6-2a61de18bab5-utilities\") on node \"crc\" DevicePath \"\"" Feb 26 16:34:02 crc kubenswrapper[4907]: I0226 16:34:02.980940 4907 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/56f6c092-f3e0-4d9c-97e6-2a61de18bab5-catalog-content\") on node \"crc\" DevicePath \"\"" Feb 26 16:34:02 crc kubenswrapper[4907]: I0226 16:34:02.980955 4907 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vp6hq\" (UniqueName: \"kubernetes.io/projected/56f6c092-f3e0-4d9c-97e6-2a61de18bab5-kube-api-access-vp6hq\") on node \"crc\" DevicePath \"\"" Feb 26 16:34:03 crc kubenswrapper[4907]: I0226 16:34:03.051263 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29535394-bp54j" event={"ID":"bfe9cc80-ce34-4697-863f-3da3548bfc20","Type":"ContainerStarted","Data":"4eae3e31eff6bee3afe11499d6c56af76772ddfd3e6af4282ed587f4d8e1a0e8"} Feb 26 16:34:03 crc kubenswrapper[4907]: I0226 16:34:03.054127 4907 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-2jrgb" Feb 26 16:34:03 crc kubenswrapper[4907]: I0226 16:34:03.054118 4907 generic.go:334] "Generic (PLEG): container finished" podID="56f6c092-f3e0-4d9c-97e6-2a61de18bab5" containerID="60a2b38cd22122261c649beb41c46afc63220893fc0f57b166a950db1dcdf52d" exitCode=0 Feb 26 16:34:03 crc kubenswrapper[4907]: I0226 16:34:03.054161 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-2jrgb" event={"ID":"56f6c092-f3e0-4d9c-97e6-2a61de18bab5","Type":"ContainerDied","Data":"60a2b38cd22122261c649beb41c46afc63220893fc0f57b166a950db1dcdf52d"} Feb 26 16:34:03 crc kubenswrapper[4907]: I0226 16:34:03.054393 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-2jrgb" event={"ID":"56f6c092-f3e0-4d9c-97e6-2a61de18bab5","Type":"ContainerDied","Data":"9c201f4b0a67fd36c94c02416572378871bb8786b2f9604920ddb404473721da"} Feb 26 16:34:03 crc kubenswrapper[4907]: I0226 16:34:03.054413 4907 scope.go:117] "RemoveContainer" containerID="60a2b38cd22122261c649beb41c46afc63220893fc0f57b166a950db1dcdf52d" Feb 26 16:34:03 crc kubenswrapper[4907]: I0226 16:34:03.069551 4907 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-infra/auto-csr-approver-29535394-bp54j" podStartSLOduration=2.210041122 podStartE2EDuration="3.069534179s" podCreationTimestamp="2026-02-26 16:34:00 +0000 UTC" firstStartedPulling="2026-02-26 16:34:00.935082997 +0000 UTC m=+3103.453644846" lastFinishedPulling="2026-02-26 16:34:01.794576054 +0000 UTC m=+3104.313137903" observedRunningTime="2026-02-26 16:34:03.066110285 +0000 UTC m=+3105.584672134" watchObservedRunningTime="2026-02-26 16:34:03.069534179 +0000 UTC m=+3105.588096028" Feb 26 16:34:03 crc kubenswrapper[4907]: I0226 16:34:03.091734 4907 scope.go:117] "RemoveContainer" containerID="318280f0d4fcc7d1362b16964385015a1970d65a34c486895062e97bdc3baf2c" Feb 26 16:34:03 crc 
kubenswrapper[4907]: I0226 16:34:03.111213 4907 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-2jrgb"] Feb 26 16:34:03 crc kubenswrapper[4907]: I0226 16:34:03.121877 4907 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-2jrgb"] Feb 26 16:34:03 crc kubenswrapper[4907]: I0226 16:34:03.137947 4907 scope.go:117] "RemoveContainer" containerID="6b9640ce05d6bb681f9d8f8cd66b78238f79e7c8a2ac92e5668d5e69bb7a861b" Feb 26 16:34:03 crc kubenswrapper[4907]: I0226 16:34:03.186727 4907 scope.go:117] "RemoveContainer" containerID="60a2b38cd22122261c649beb41c46afc63220893fc0f57b166a950db1dcdf52d" Feb 26 16:34:03 crc kubenswrapper[4907]: E0226 16:34:03.187236 4907 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"60a2b38cd22122261c649beb41c46afc63220893fc0f57b166a950db1dcdf52d\": container with ID starting with 60a2b38cd22122261c649beb41c46afc63220893fc0f57b166a950db1dcdf52d not found: ID does not exist" containerID="60a2b38cd22122261c649beb41c46afc63220893fc0f57b166a950db1dcdf52d" Feb 26 16:34:03 crc kubenswrapper[4907]: I0226 16:34:03.187266 4907 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"60a2b38cd22122261c649beb41c46afc63220893fc0f57b166a950db1dcdf52d"} err="failed to get container status \"60a2b38cd22122261c649beb41c46afc63220893fc0f57b166a950db1dcdf52d\": rpc error: code = NotFound desc = could not find container \"60a2b38cd22122261c649beb41c46afc63220893fc0f57b166a950db1dcdf52d\": container with ID starting with 60a2b38cd22122261c649beb41c46afc63220893fc0f57b166a950db1dcdf52d not found: ID does not exist" Feb 26 16:34:03 crc kubenswrapper[4907]: I0226 16:34:03.187285 4907 scope.go:117] "RemoveContainer" containerID="318280f0d4fcc7d1362b16964385015a1970d65a34c486895062e97bdc3baf2c" Feb 26 16:34:03 crc kubenswrapper[4907]: E0226 16:34:03.187548 4907 log.go:32] 
"ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"318280f0d4fcc7d1362b16964385015a1970d65a34c486895062e97bdc3baf2c\": container with ID starting with 318280f0d4fcc7d1362b16964385015a1970d65a34c486895062e97bdc3baf2c not found: ID does not exist" containerID="318280f0d4fcc7d1362b16964385015a1970d65a34c486895062e97bdc3baf2c" Feb 26 16:34:03 crc kubenswrapper[4907]: I0226 16:34:03.187618 4907 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"318280f0d4fcc7d1362b16964385015a1970d65a34c486895062e97bdc3baf2c"} err="failed to get container status \"318280f0d4fcc7d1362b16964385015a1970d65a34c486895062e97bdc3baf2c\": rpc error: code = NotFound desc = could not find container \"318280f0d4fcc7d1362b16964385015a1970d65a34c486895062e97bdc3baf2c\": container with ID starting with 318280f0d4fcc7d1362b16964385015a1970d65a34c486895062e97bdc3baf2c not found: ID does not exist" Feb 26 16:34:03 crc kubenswrapper[4907]: I0226 16:34:03.187635 4907 scope.go:117] "RemoveContainer" containerID="6b9640ce05d6bb681f9d8f8cd66b78238f79e7c8a2ac92e5668d5e69bb7a861b" Feb 26 16:34:03 crc kubenswrapper[4907]: E0226 16:34:03.187968 4907 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"6b9640ce05d6bb681f9d8f8cd66b78238f79e7c8a2ac92e5668d5e69bb7a861b\": container with ID starting with 6b9640ce05d6bb681f9d8f8cd66b78238f79e7c8a2ac92e5668d5e69bb7a861b not found: ID does not exist" containerID="6b9640ce05d6bb681f9d8f8cd66b78238f79e7c8a2ac92e5668d5e69bb7a861b" Feb 26 16:34:03 crc kubenswrapper[4907]: I0226 16:34:03.187990 4907 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6b9640ce05d6bb681f9d8f8cd66b78238f79e7c8a2ac92e5668d5e69bb7a861b"} err="failed to get container status \"6b9640ce05d6bb681f9d8f8cd66b78238f79e7c8a2ac92e5668d5e69bb7a861b\": rpc error: code = NotFound desc = could 
not find container \"6b9640ce05d6bb681f9d8f8cd66b78238f79e7c8a2ac92e5668d5e69bb7a861b\": container with ID starting with 6b9640ce05d6bb681f9d8f8cd66b78238f79e7c8a2ac92e5668d5e69bb7a861b not found: ID does not exist" Feb 26 16:34:04 crc kubenswrapper[4907]: I0226 16:34:04.072963 4907 generic.go:334] "Generic (PLEG): container finished" podID="bfe9cc80-ce34-4697-863f-3da3548bfc20" containerID="4eae3e31eff6bee3afe11499d6c56af76772ddfd3e6af4282ed587f4d8e1a0e8" exitCode=0 Feb 26 16:34:04 crc kubenswrapper[4907]: I0226 16:34:04.073368 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29535394-bp54j" event={"ID":"bfe9cc80-ce34-4697-863f-3da3548bfc20","Type":"ContainerDied","Data":"4eae3e31eff6bee3afe11499d6c56af76772ddfd3e6af4282ed587f4d8e1a0e8"} Feb 26 16:34:04 crc kubenswrapper[4907]: I0226 16:34:04.151636 4907 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="56f6c092-f3e0-4d9c-97e6-2a61de18bab5" path="/var/lib/kubelet/pods/56f6c092-f3e0-4d9c-97e6-2a61de18bab5/volumes" Feb 26 16:34:05 crc kubenswrapper[4907]: I0226 16:34:05.519954 4907 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29535394-bp54j" Feb 26 16:34:05 crc kubenswrapper[4907]: I0226 16:34:05.654100 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vlcw4\" (UniqueName: \"kubernetes.io/projected/bfe9cc80-ce34-4697-863f-3da3548bfc20-kube-api-access-vlcw4\") pod \"bfe9cc80-ce34-4697-863f-3da3548bfc20\" (UID: \"bfe9cc80-ce34-4697-863f-3da3548bfc20\") " Feb 26 16:34:05 crc kubenswrapper[4907]: I0226 16:34:05.659824 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bfe9cc80-ce34-4697-863f-3da3548bfc20-kube-api-access-vlcw4" (OuterVolumeSpecName: "kube-api-access-vlcw4") pod "bfe9cc80-ce34-4697-863f-3da3548bfc20" (UID: "bfe9cc80-ce34-4697-863f-3da3548bfc20"). 
InnerVolumeSpecName "kube-api-access-vlcw4". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 26 16:34:05 crc kubenswrapper[4907]: I0226 16:34:05.756987 4907 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vlcw4\" (UniqueName: \"kubernetes.io/projected/bfe9cc80-ce34-4697-863f-3da3548bfc20-kube-api-access-vlcw4\") on node \"crc\" DevicePath \"\"" Feb 26 16:34:06 crc kubenswrapper[4907]: I0226 16:34:06.090291 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29535394-bp54j" event={"ID":"bfe9cc80-ce34-4697-863f-3da3548bfc20","Type":"ContainerDied","Data":"ddeb12af411ec84c235e634fd661382d1f497011f35fd043c6233fcd6a38805b"} Feb 26 16:34:06 crc kubenswrapper[4907]: I0226 16:34:06.090530 4907 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="ddeb12af411ec84c235e634fd661382d1f497011f35fd043c6233fcd6a38805b" Feb 26 16:34:06 crc kubenswrapper[4907]: I0226 16:34:06.090347 4907 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29535394-bp54j" Feb 26 16:34:06 crc kubenswrapper[4907]: I0226 16:34:06.176637 4907 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29535388-7pf82"] Feb 26 16:34:06 crc kubenswrapper[4907]: I0226 16:34:06.186080 4907 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29535388-7pf82"] Feb 26 16:34:08 crc kubenswrapper[4907]: I0226 16:34:08.139754 4907 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9755fa96-5b0e-4088-88ec-70ec3bb6121d" path="/var/lib/kubelet/pods/9755fa96-5b0e-4088-88ec-70ec3bb6121d/volumes" Feb 26 16:34:10 crc kubenswrapper[4907]: I0226 16:34:10.129402 4907 scope.go:117] "RemoveContainer" containerID="6906cab653cd658cba31211ccc435500afa0d86f92cee413c3d24942f2acd8bd" Feb 26 16:34:10 crc kubenswrapper[4907]: E0226 16:34:10.130022 4907 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-v5ng6_openshift-machine-config-operator(917eebf3-db36-47b8-af0a-b80d042fddab)\"" pod="openshift-machine-config-operator/machine-config-daemon-v5ng6" podUID="917eebf3-db36-47b8-af0a-b80d042fddab" Feb 26 16:34:11 crc kubenswrapper[4907]: I0226 16:34:11.619029 4907 scope.go:117] "RemoveContainer" containerID="a54196bad514725e5c6f92ebdd204cee3fe70e1b81698b178c2679a7877e398a" Feb 26 16:34:24 crc kubenswrapper[4907]: I0226 16:34:24.126626 4907 scope.go:117] "RemoveContainer" containerID="6906cab653cd658cba31211ccc435500afa0d86f92cee413c3d24942f2acd8bd" Feb 26 16:34:24 crc kubenswrapper[4907]: E0226 16:34:24.127725 4907 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon 
pod=machine-config-daemon-v5ng6_openshift-machine-config-operator(917eebf3-db36-47b8-af0a-b80d042fddab)\"" pod="openshift-machine-config-operator/machine-config-daemon-v5ng6" podUID="917eebf3-db36-47b8-af0a-b80d042fddab" Feb 26 16:34:26 crc kubenswrapper[4907]: I0226 16:34:26.197073 4907 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-must-gather-2htz6/must-gather-jzgwz"] Feb 26 16:34:26 crc kubenswrapper[4907]: E0226 16:34:26.198275 4907 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="56f6c092-f3e0-4d9c-97e6-2a61de18bab5" containerName="extract-utilities" Feb 26 16:34:26 crc kubenswrapper[4907]: I0226 16:34:26.198292 4907 state_mem.go:107] "Deleted CPUSet assignment" podUID="56f6c092-f3e0-4d9c-97e6-2a61de18bab5" containerName="extract-utilities" Feb 26 16:34:26 crc kubenswrapper[4907]: E0226 16:34:26.198309 4907 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="56f6c092-f3e0-4d9c-97e6-2a61de18bab5" containerName="registry-server" Feb 26 16:34:26 crc kubenswrapper[4907]: I0226 16:34:26.198316 4907 state_mem.go:107] "Deleted CPUSet assignment" podUID="56f6c092-f3e0-4d9c-97e6-2a61de18bab5" containerName="registry-server" Feb 26 16:34:26 crc kubenswrapper[4907]: E0226 16:34:26.198330 4907 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="56f6c092-f3e0-4d9c-97e6-2a61de18bab5" containerName="extract-content" Feb 26 16:34:26 crc kubenswrapper[4907]: I0226 16:34:26.198337 4907 state_mem.go:107] "Deleted CPUSet assignment" podUID="56f6c092-f3e0-4d9c-97e6-2a61de18bab5" containerName="extract-content" Feb 26 16:34:26 crc kubenswrapper[4907]: E0226 16:34:26.198350 4907 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="bfe9cc80-ce34-4697-863f-3da3548bfc20" containerName="oc" Feb 26 16:34:26 crc kubenswrapper[4907]: I0226 16:34:26.198356 4907 state_mem.go:107] "Deleted CPUSet assignment" podUID="bfe9cc80-ce34-4697-863f-3da3548bfc20" containerName="oc" Feb 26 16:34:26 crc kubenswrapper[4907]: 
I0226 16:34:26.198613 4907 memory_manager.go:354] "RemoveStaleState removing state" podUID="bfe9cc80-ce34-4697-863f-3da3548bfc20" containerName="oc" Feb 26 16:34:26 crc kubenswrapper[4907]: I0226 16:34:26.198632 4907 memory_manager.go:354] "RemoveStaleState removing state" podUID="56f6c092-f3e0-4d9c-97e6-2a61de18bab5" containerName="registry-server" Feb 26 16:34:26 crc kubenswrapper[4907]: I0226 16:34:26.200013 4907 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-2htz6/must-gather-jzgwz" Feb 26 16:34:26 crc kubenswrapper[4907]: I0226 16:34:26.204335 4907 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-must-gather-2htz6"/"openshift-service-ca.crt" Feb 26 16:34:26 crc kubenswrapper[4907]: I0226 16:34:26.204368 4907 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-must-gather-2htz6"/"kube-root-ca.crt" Feb 26 16:34:26 crc kubenswrapper[4907]: I0226 16:34:26.240473 4907 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-2htz6/must-gather-jzgwz"] Feb 26 16:34:26 crc kubenswrapper[4907]: I0226 16:34:26.374060 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/14587e07-76d8-408e-af38-0069fdd00ccd-must-gather-output\") pod \"must-gather-jzgwz\" (UID: \"14587e07-76d8-408e-af38-0069fdd00ccd\") " pod="openshift-must-gather-2htz6/must-gather-jzgwz" Feb 26 16:34:26 crc kubenswrapper[4907]: I0226 16:34:26.374480 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-d54qq\" (UniqueName: \"kubernetes.io/projected/14587e07-76d8-408e-af38-0069fdd00ccd-kube-api-access-d54qq\") pod \"must-gather-jzgwz\" (UID: \"14587e07-76d8-408e-af38-0069fdd00ccd\") " pod="openshift-must-gather-2htz6/must-gather-jzgwz" Feb 26 16:34:26 crc kubenswrapper[4907]: I0226 16:34:26.476806 4907 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-d54qq\" (UniqueName: \"kubernetes.io/projected/14587e07-76d8-408e-af38-0069fdd00ccd-kube-api-access-d54qq\") pod \"must-gather-jzgwz\" (UID: \"14587e07-76d8-408e-af38-0069fdd00ccd\") " pod="openshift-must-gather-2htz6/must-gather-jzgwz" Feb 26 16:34:26 crc kubenswrapper[4907]: I0226 16:34:26.477028 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/14587e07-76d8-408e-af38-0069fdd00ccd-must-gather-output\") pod \"must-gather-jzgwz\" (UID: \"14587e07-76d8-408e-af38-0069fdd00ccd\") " pod="openshift-must-gather-2htz6/must-gather-jzgwz" Feb 26 16:34:26 crc kubenswrapper[4907]: I0226 16:34:26.477459 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/14587e07-76d8-408e-af38-0069fdd00ccd-must-gather-output\") pod \"must-gather-jzgwz\" (UID: \"14587e07-76d8-408e-af38-0069fdd00ccd\") " pod="openshift-must-gather-2htz6/must-gather-jzgwz" Feb 26 16:34:26 crc kubenswrapper[4907]: I0226 16:34:26.499440 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-d54qq\" (UniqueName: \"kubernetes.io/projected/14587e07-76d8-408e-af38-0069fdd00ccd-kube-api-access-d54qq\") pod \"must-gather-jzgwz\" (UID: \"14587e07-76d8-408e-af38-0069fdd00ccd\") " pod="openshift-must-gather-2htz6/must-gather-jzgwz" Feb 26 16:34:26 crc kubenswrapper[4907]: I0226 16:34:26.584515 4907 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-2htz6/must-gather-jzgwz" Feb 26 16:34:27 crc kubenswrapper[4907]: I0226 16:34:27.093297 4907 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-2htz6/must-gather-jzgwz"] Feb 26 16:34:27 crc kubenswrapper[4907]: I0226 16:34:27.365425 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-2htz6/must-gather-jzgwz" event={"ID":"14587e07-76d8-408e-af38-0069fdd00ccd","Type":"ContainerStarted","Data":"f94ed108183cdc7150cf3512f72d6257b9de186f3fde230e590bf4e620edc94a"} Feb 26 16:34:37 crc kubenswrapper[4907]: I0226 16:34:37.127572 4907 scope.go:117] "RemoveContainer" containerID="6906cab653cd658cba31211ccc435500afa0d86f92cee413c3d24942f2acd8bd" Feb 26 16:34:37 crc kubenswrapper[4907]: E0226 16:34:37.128731 4907 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-v5ng6_openshift-machine-config-operator(917eebf3-db36-47b8-af0a-b80d042fddab)\"" pod="openshift-machine-config-operator/machine-config-daemon-v5ng6" podUID="917eebf3-db36-47b8-af0a-b80d042fddab" Feb 26 16:34:37 crc kubenswrapper[4907]: I0226 16:34:37.483205 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-2htz6/must-gather-jzgwz" event={"ID":"14587e07-76d8-408e-af38-0069fdd00ccd","Type":"ContainerStarted","Data":"54bb61436e47992c339c0869e06fe77d77dd30920e0ea1ee737d6be480502f47"} Feb 26 16:34:37 crc kubenswrapper[4907]: I0226 16:34:37.483286 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-2htz6/must-gather-jzgwz" event={"ID":"14587e07-76d8-408e-af38-0069fdd00ccd","Type":"ContainerStarted","Data":"23026d644dbedb1b2cf8781844be69644ad112c6abd092271f8d9ccb92927b50"} Feb 26 16:34:37 crc kubenswrapper[4907]: I0226 16:34:37.500183 4907 pod_startup_latency_tracker.go:104] "Observed 
pod startup duration" pod="openshift-must-gather-2htz6/must-gather-jzgwz" podStartSLOduration=2.336127595 podStartE2EDuration="11.500158362s" podCreationTimestamp="2026-02-26 16:34:26 +0000 UTC" firstStartedPulling="2026-02-26 16:34:27.101155678 +0000 UTC m=+3129.619717527" lastFinishedPulling="2026-02-26 16:34:36.265186445 +0000 UTC m=+3138.783748294" observedRunningTime="2026-02-26 16:34:37.495724564 +0000 UTC m=+3140.014286413" watchObservedRunningTime="2026-02-26 16:34:37.500158362 +0000 UTC m=+3140.018720211" Feb 26 16:34:40 crc kubenswrapper[4907]: I0226 16:34:40.294114 4907 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-must-gather-2htz6/crc-debug-gg2rb"] Feb 26 16:34:40 crc kubenswrapper[4907]: I0226 16:34:40.295736 4907 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-2htz6/crc-debug-gg2rb" Feb 26 16:34:40 crc kubenswrapper[4907]: I0226 16:34:40.297728 4907 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-must-gather-2htz6"/"default-dockercfg-4th5r" Feb 26 16:34:40 crc kubenswrapper[4907]: I0226 16:34:40.348770 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2xssf\" (UniqueName: \"kubernetes.io/projected/7709dc5e-ebdc-4499-b8d2-1c10d7406e42-kube-api-access-2xssf\") pod \"crc-debug-gg2rb\" (UID: \"7709dc5e-ebdc-4499-b8d2-1c10d7406e42\") " pod="openshift-must-gather-2htz6/crc-debug-gg2rb" Feb 26 16:34:40 crc kubenswrapper[4907]: I0226 16:34:40.348911 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/7709dc5e-ebdc-4499-b8d2-1c10d7406e42-host\") pod \"crc-debug-gg2rb\" (UID: \"7709dc5e-ebdc-4499-b8d2-1c10d7406e42\") " pod="openshift-must-gather-2htz6/crc-debug-gg2rb" Feb 26 16:34:40 crc kubenswrapper[4907]: I0226 16:34:40.451039 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"host\" (UniqueName: \"kubernetes.io/host-path/7709dc5e-ebdc-4499-b8d2-1c10d7406e42-host\") pod \"crc-debug-gg2rb\" (UID: \"7709dc5e-ebdc-4499-b8d2-1c10d7406e42\") " pod="openshift-must-gather-2htz6/crc-debug-gg2rb" Feb 26 16:34:40 crc kubenswrapper[4907]: I0226 16:34:40.451194 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/7709dc5e-ebdc-4499-b8d2-1c10d7406e42-host\") pod \"crc-debug-gg2rb\" (UID: \"7709dc5e-ebdc-4499-b8d2-1c10d7406e42\") " pod="openshift-must-gather-2htz6/crc-debug-gg2rb" Feb 26 16:34:40 crc kubenswrapper[4907]: I0226 16:34:40.451204 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2xssf\" (UniqueName: \"kubernetes.io/projected/7709dc5e-ebdc-4499-b8d2-1c10d7406e42-kube-api-access-2xssf\") pod \"crc-debug-gg2rb\" (UID: \"7709dc5e-ebdc-4499-b8d2-1c10d7406e42\") " pod="openshift-must-gather-2htz6/crc-debug-gg2rb" Feb 26 16:34:40 crc kubenswrapper[4907]: I0226 16:34:40.480784 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2xssf\" (UniqueName: \"kubernetes.io/projected/7709dc5e-ebdc-4499-b8d2-1c10d7406e42-kube-api-access-2xssf\") pod \"crc-debug-gg2rb\" (UID: \"7709dc5e-ebdc-4499-b8d2-1c10d7406e42\") " pod="openshift-must-gather-2htz6/crc-debug-gg2rb" Feb 26 16:34:40 crc kubenswrapper[4907]: I0226 16:34:40.613085 4907 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-2htz6/crc-debug-gg2rb" Feb 26 16:34:41 crc kubenswrapper[4907]: I0226 16:34:41.517466 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-2htz6/crc-debug-gg2rb" event={"ID":"7709dc5e-ebdc-4499-b8d2-1c10d7406e42","Type":"ContainerStarted","Data":"474865c7b3050e821a9287ade9a98748cfb1ea7aa76c75d099ae9d1dad317d41"} Feb 26 16:34:50 crc kubenswrapper[4907]: I0226 16:34:50.129369 4907 scope.go:117] "RemoveContainer" containerID="6906cab653cd658cba31211ccc435500afa0d86f92cee413c3d24942f2acd8bd" Feb 26 16:34:50 crc kubenswrapper[4907]: E0226 16:34:50.131000 4907 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-v5ng6_openshift-machine-config-operator(917eebf3-db36-47b8-af0a-b80d042fddab)\"" pod="openshift-machine-config-operator/machine-config-daemon-v5ng6" podUID="917eebf3-db36-47b8-af0a-b80d042fddab" Feb 26 16:34:52 crc kubenswrapper[4907]: I0226 16:34:52.649950 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-2htz6/crc-debug-gg2rb" event={"ID":"7709dc5e-ebdc-4499-b8d2-1c10d7406e42","Type":"ContainerStarted","Data":"b779c1853e304ab137b2e007a5a8ec149347f17d993d712d3e49d4ba65136395"} Feb 26 16:34:52 crc kubenswrapper[4907]: I0226 16:34:52.673943 4907 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-must-gather-2htz6/crc-debug-gg2rb" podStartSLOduration=1.326766309 podStartE2EDuration="12.673920709s" podCreationTimestamp="2026-02-26 16:34:40 +0000 UTC" firstStartedPulling="2026-02-26 16:34:40.668021201 +0000 UTC m=+3143.186583050" lastFinishedPulling="2026-02-26 16:34:52.015175601 +0000 UTC m=+3154.533737450" observedRunningTime="2026-02-26 16:34:52.661951905 +0000 UTC m=+3155.180513754" watchObservedRunningTime="2026-02-26 16:34:52.673920709 +0000 UTC 
m=+3155.192482558" Feb 26 16:35:03 crc kubenswrapper[4907]: I0226 16:35:03.127259 4907 scope.go:117] "RemoveContainer" containerID="6906cab653cd658cba31211ccc435500afa0d86f92cee413c3d24942f2acd8bd" Feb 26 16:35:03 crc kubenswrapper[4907]: E0226 16:35:03.128050 4907 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-v5ng6_openshift-machine-config-operator(917eebf3-db36-47b8-af0a-b80d042fddab)\"" pod="openshift-machine-config-operator/machine-config-daemon-v5ng6" podUID="917eebf3-db36-47b8-af0a-b80d042fddab" Feb 26 16:35:09 crc kubenswrapper[4907]: I0226 16:35:09.800507 4907 generic.go:334] "Generic (PLEG): container finished" podID="7709dc5e-ebdc-4499-b8d2-1c10d7406e42" containerID="b779c1853e304ab137b2e007a5a8ec149347f17d993d712d3e49d4ba65136395" exitCode=0 Feb 26 16:35:09 crc kubenswrapper[4907]: I0226 16:35:09.800610 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-2htz6/crc-debug-gg2rb" event={"ID":"7709dc5e-ebdc-4499-b8d2-1c10d7406e42","Type":"ContainerDied","Data":"b779c1853e304ab137b2e007a5a8ec149347f17d993d712d3e49d4ba65136395"} Feb 26 16:35:10 crc kubenswrapper[4907]: I0226 16:35:10.915797 4907 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-2htz6/crc-debug-gg2rb" Feb 26 16:35:10 crc kubenswrapper[4907]: I0226 16:35:10.954753 4907 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-must-gather-2htz6/crc-debug-gg2rb"] Feb 26 16:35:10 crc kubenswrapper[4907]: I0226 16:35:10.964392 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2xssf\" (UniqueName: \"kubernetes.io/projected/7709dc5e-ebdc-4499-b8d2-1c10d7406e42-kube-api-access-2xssf\") pod \"7709dc5e-ebdc-4499-b8d2-1c10d7406e42\" (UID: \"7709dc5e-ebdc-4499-b8d2-1c10d7406e42\") " Feb 26 16:35:10 crc kubenswrapper[4907]: I0226 16:35:10.964513 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/7709dc5e-ebdc-4499-b8d2-1c10d7406e42-host\") pod \"7709dc5e-ebdc-4499-b8d2-1c10d7406e42\" (UID: \"7709dc5e-ebdc-4499-b8d2-1c10d7406e42\") " Feb 26 16:35:10 crc kubenswrapper[4907]: I0226 16:35:10.964936 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/7709dc5e-ebdc-4499-b8d2-1c10d7406e42-host" (OuterVolumeSpecName: "host") pod "7709dc5e-ebdc-4499-b8d2-1c10d7406e42" (UID: "7709dc5e-ebdc-4499-b8d2-1c10d7406e42"). InnerVolumeSpecName "host". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 26 16:35:10 crc kubenswrapper[4907]: I0226 16:35:10.965213 4907 reconciler_common.go:293] "Volume detached for volume \"host\" (UniqueName: \"kubernetes.io/host-path/7709dc5e-ebdc-4499-b8d2-1c10d7406e42-host\") on node \"crc\" DevicePath \"\"" Feb 26 16:35:10 crc kubenswrapper[4907]: I0226 16:35:10.969084 4907 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-must-gather-2htz6/crc-debug-gg2rb"] Feb 26 16:35:10 crc kubenswrapper[4907]: I0226 16:35:10.977892 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7709dc5e-ebdc-4499-b8d2-1c10d7406e42-kube-api-access-2xssf" (OuterVolumeSpecName: "kube-api-access-2xssf") pod "7709dc5e-ebdc-4499-b8d2-1c10d7406e42" (UID: "7709dc5e-ebdc-4499-b8d2-1c10d7406e42"). InnerVolumeSpecName "kube-api-access-2xssf". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 26 16:35:11 crc kubenswrapper[4907]: I0226 16:35:11.065880 4907 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2xssf\" (UniqueName: \"kubernetes.io/projected/7709dc5e-ebdc-4499-b8d2-1c10d7406e42-kube-api-access-2xssf\") on node \"crc\" DevicePath \"\"" Feb 26 16:35:11 crc kubenswrapper[4907]: I0226 16:35:11.822569 4907 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="474865c7b3050e821a9287ade9a98748cfb1ea7aa76c75d099ae9d1dad317d41" Feb 26 16:35:11 crc kubenswrapper[4907]: I0226 16:35:11.822765 4907 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-2htz6/crc-debug-gg2rb" Feb 26 16:35:12 crc kubenswrapper[4907]: I0226 16:35:12.138484 4907 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7709dc5e-ebdc-4499-b8d2-1c10d7406e42" path="/var/lib/kubelet/pods/7709dc5e-ebdc-4499-b8d2-1c10d7406e42/volumes" Feb 26 16:35:12 crc kubenswrapper[4907]: I0226 16:35:12.160994 4907 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-must-gather-2htz6/crc-debug-lz85x"] Feb 26 16:35:12 crc kubenswrapper[4907]: E0226 16:35:12.161508 4907 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7709dc5e-ebdc-4499-b8d2-1c10d7406e42" containerName="container-00" Feb 26 16:35:12 crc kubenswrapper[4907]: I0226 16:35:12.161534 4907 state_mem.go:107] "Deleted CPUSet assignment" podUID="7709dc5e-ebdc-4499-b8d2-1c10d7406e42" containerName="container-00" Feb 26 16:35:12 crc kubenswrapper[4907]: I0226 16:35:12.161783 4907 memory_manager.go:354] "RemoveStaleState removing state" podUID="7709dc5e-ebdc-4499-b8d2-1c10d7406e42" containerName="container-00" Feb 26 16:35:12 crc kubenswrapper[4907]: I0226 16:35:12.162551 4907 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-2htz6/crc-debug-lz85x" Feb 26 16:35:12 crc kubenswrapper[4907]: I0226 16:35:12.177507 4907 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-must-gather-2htz6"/"default-dockercfg-4th5r" Feb 26 16:35:12 crc kubenswrapper[4907]: I0226 16:35:12.188352 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tx5cq\" (UniqueName: \"kubernetes.io/projected/63529314-dbdf-4e7c-9a00-e2e968c8522a-kube-api-access-tx5cq\") pod \"crc-debug-lz85x\" (UID: \"63529314-dbdf-4e7c-9a00-e2e968c8522a\") " pod="openshift-must-gather-2htz6/crc-debug-lz85x" Feb 26 16:35:12 crc kubenswrapper[4907]: I0226 16:35:12.188442 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/63529314-dbdf-4e7c-9a00-e2e968c8522a-host\") pod \"crc-debug-lz85x\" (UID: \"63529314-dbdf-4e7c-9a00-e2e968c8522a\") " pod="openshift-must-gather-2htz6/crc-debug-lz85x" Feb 26 16:35:12 crc kubenswrapper[4907]: I0226 16:35:12.289729 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tx5cq\" (UniqueName: \"kubernetes.io/projected/63529314-dbdf-4e7c-9a00-e2e968c8522a-kube-api-access-tx5cq\") pod \"crc-debug-lz85x\" (UID: \"63529314-dbdf-4e7c-9a00-e2e968c8522a\") " pod="openshift-must-gather-2htz6/crc-debug-lz85x" Feb 26 16:35:12 crc kubenswrapper[4907]: I0226 16:35:12.289778 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/63529314-dbdf-4e7c-9a00-e2e968c8522a-host\") pod \"crc-debug-lz85x\" (UID: \"63529314-dbdf-4e7c-9a00-e2e968c8522a\") " pod="openshift-must-gather-2htz6/crc-debug-lz85x" Feb 26 16:35:12 crc kubenswrapper[4907]: I0226 16:35:12.289967 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: 
\"kubernetes.io/host-path/63529314-dbdf-4e7c-9a00-e2e968c8522a-host\") pod \"crc-debug-lz85x\" (UID: \"63529314-dbdf-4e7c-9a00-e2e968c8522a\") " pod="openshift-must-gather-2htz6/crc-debug-lz85x" Feb 26 16:35:12 crc kubenswrapper[4907]: I0226 16:35:12.309515 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tx5cq\" (UniqueName: \"kubernetes.io/projected/63529314-dbdf-4e7c-9a00-e2e968c8522a-kube-api-access-tx5cq\") pod \"crc-debug-lz85x\" (UID: \"63529314-dbdf-4e7c-9a00-e2e968c8522a\") " pod="openshift-must-gather-2htz6/crc-debug-lz85x" Feb 26 16:35:12 crc kubenswrapper[4907]: I0226 16:35:12.491283 4907 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-2htz6/crc-debug-lz85x" Feb 26 16:35:12 crc kubenswrapper[4907]: I0226 16:35:12.832481 4907 generic.go:334] "Generic (PLEG): container finished" podID="63529314-dbdf-4e7c-9a00-e2e968c8522a" containerID="fc96c31e9a60596ace2c05d57e08d0789967d0b7be8e8ca055b984161638cc9c" exitCode=1 Feb 26 16:35:12 crc kubenswrapper[4907]: I0226 16:35:12.832686 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-2htz6/crc-debug-lz85x" event={"ID":"63529314-dbdf-4e7c-9a00-e2e968c8522a","Type":"ContainerDied","Data":"fc96c31e9a60596ace2c05d57e08d0789967d0b7be8e8ca055b984161638cc9c"} Feb 26 16:35:12 crc kubenswrapper[4907]: I0226 16:35:12.832795 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-2htz6/crc-debug-lz85x" event={"ID":"63529314-dbdf-4e7c-9a00-e2e968c8522a","Type":"ContainerStarted","Data":"8c5daa8f66f462449768ce16e476485ff64a5d262dfa3287e2b8211a00a2c902"} Feb 26 16:35:12 crc kubenswrapper[4907]: I0226 16:35:12.869802 4907 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-must-gather-2htz6/crc-debug-lz85x"] Feb 26 16:35:12 crc kubenswrapper[4907]: I0226 16:35:12.879394 4907 kubelet.go:2431] "SyncLoop REMOVE" source="api" 
pods=["openshift-must-gather-2htz6/crc-debug-lz85x"] Feb 26 16:35:13 crc kubenswrapper[4907]: I0226 16:35:13.955760 4907 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-2htz6/crc-debug-lz85x" Feb 26 16:35:14 crc kubenswrapper[4907]: I0226 16:35:14.124472 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/63529314-dbdf-4e7c-9a00-e2e968c8522a-host\") pod \"63529314-dbdf-4e7c-9a00-e2e968c8522a\" (UID: \"63529314-dbdf-4e7c-9a00-e2e968c8522a\") " Feb 26 16:35:14 crc kubenswrapper[4907]: I0226 16:35:14.124576 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-tx5cq\" (UniqueName: \"kubernetes.io/projected/63529314-dbdf-4e7c-9a00-e2e968c8522a-kube-api-access-tx5cq\") pod \"63529314-dbdf-4e7c-9a00-e2e968c8522a\" (UID: \"63529314-dbdf-4e7c-9a00-e2e968c8522a\") " Feb 26 16:35:14 crc kubenswrapper[4907]: I0226 16:35:14.124659 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/63529314-dbdf-4e7c-9a00-e2e968c8522a-host" (OuterVolumeSpecName: "host") pod "63529314-dbdf-4e7c-9a00-e2e968c8522a" (UID: "63529314-dbdf-4e7c-9a00-e2e968c8522a"). InnerVolumeSpecName "host". PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 26 16:35:14 crc kubenswrapper[4907]: I0226 16:35:14.125162 4907 reconciler_common.go:293] "Volume detached for volume \"host\" (UniqueName: \"kubernetes.io/host-path/63529314-dbdf-4e7c-9a00-e2e968c8522a-host\") on node \"crc\" DevicePath \"\"" Feb 26 16:35:14 crc kubenswrapper[4907]: I0226 16:35:14.135050 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/63529314-dbdf-4e7c-9a00-e2e968c8522a-kube-api-access-tx5cq" (OuterVolumeSpecName: "kube-api-access-tx5cq") pod "63529314-dbdf-4e7c-9a00-e2e968c8522a" (UID: "63529314-dbdf-4e7c-9a00-e2e968c8522a"). 
InnerVolumeSpecName "kube-api-access-tx5cq". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 26 16:35:14 crc kubenswrapper[4907]: I0226 16:35:14.163600 4907 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="63529314-dbdf-4e7c-9a00-e2e968c8522a" path="/var/lib/kubelet/pods/63529314-dbdf-4e7c-9a00-e2e968c8522a/volumes" Feb 26 16:35:14 crc kubenswrapper[4907]: I0226 16:35:14.227288 4907 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-tx5cq\" (UniqueName: \"kubernetes.io/projected/63529314-dbdf-4e7c-9a00-e2e968c8522a-kube-api-access-tx5cq\") on node \"crc\" DevicePath \"\"" Feb 26 16:35:14 crc kubenswrapper[4907]: I0226 16:35:14.851827 4907 scope.go:117] "RemoveContainer" containerID="fc96c31e9a60596ace2c05d57e08d0789967d0b7be8e8ca055b984161638cc9c" Feb 26 16:35:14 crc kubenswrapper[4907]: I0226 16:35:14.851877 4907 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-2htz6/crc-debug-lz85x" Feb 26 16:35:17 crc kubenswrapper[4907]: I0226 16:35:17.126739 4907 scope.go:117] "RemoveContainer" containerID="6906cab653cd658cba31211ccc435500afa0d86f92cee413c3d24942f2acd8bd" Feb 26 16:35:17 crc kubenswrapper[4907]: E0226 16:35:17.127290 4907 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-v5ng6_openshift-machine-config-operator(917eebf3-db36-47b8-af0a-b80d042fddab)\"" pod="openshift-machine-config-operator/machine-config-daemon-v5ng6" podUID="917eebf3-db36-47b8-af0a-b80d042fddab" Feb 26 16:35:28 crc kubenswrapper[4907]: I0226 16:35:28.132033 4907 scope.go:117] "RemoveContainer" containerID="6906cab653cd658cba31211ccc435500afa0d86f92cee413c3d24942f2acd8bd" Feb 26 16:35:28 crc kubenswrapper[4907]: E0226 16:35:28.133897 4907 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to 
\"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-v5ng6_openshift-machine-config-operator(917eebf3-db36-47b8-af0a-b80d042fddab)\"" pod="openshift-machine-config-operator/machine-config-daemon-v5ng6" podUID="917eebf3-db36-47b8-af0a-b80d042fddab" Feb 26 16:35:40 crc kubenswrapper[4907]: I0226 16:35:40.128172 4907 scope.go:117] "RemoveContainer" containerID="6906cab653cd658cba31211ccc435500afa0d86f92cee413c3d24942f2acd8bd" Feb 26 16:35:40 crc kubenswrapper[4907]: E0226 16:35:40.128936 4907 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-v5ng6_openshift-machine-config-operator(917eebf3-db36-47b8-af0a-b80d042fddab)\"" pod="openshift-machine-config-operator/machine-config-daemon-v5ng6" podUID="917eebf3-db36-47b8-af0a-b80d042fddab" Feb 26 16:35:52 crc kubenswrapper[4907]: I0226 16:35:52.127235 4907 scope.go:117] "RemoveContainer" containerID="6906cab653cd658cba31211ccc435500afa0d86f92cee413c3d24942f2acd8bd" Feb 26 16:35:52 crc kubenswrapper[4907]: E0226 16:35:52.127826 4907 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-v5ng6_openshift-machine-config-operator(917eebf3-db36-47b8-af0a-b80d042fddab)\"" pod="openshift-machine-config-operator/machine-config-daemon-v5ng6" podUID="917eebf3-db36-47b8-af0a-b80d042fddab" Feb 26 16:36:00 crc kubenswrapper[4907]: I0226 16:36:00.146680 4907 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29535396-br7bg"] Feb 26 16:36:00 crc kubenswrapper[4907]: E0226 16:36:00.147735 4907 cpu_manager.go:410] "RemoveStaleState: removing 
container" podUID="63529314-dbdf-4e7c-9a00-e2e968c8522a" containerName="container-00" Feb 26 16:36:00 crc kubenswrapper[4907]: I0226 16:36:00.147754 4907 state_mem.go:107] "Deleted CPUSet assignment" podUID="63529314-dbdf-4e7c-9a00-e2e968c8522a" containerName="container-00" Feb 26 16:36:00 crc kubenswrapper[4907]: I0226 16:36:00.148005 4907 memory_manager.go:354] "RemoveStaleState removing state" podUID="63529314-dbdf-4e7c-9a00-e2e968c8522a" containerName="container-00" Feb 26 16:36:00 crc kubenswrapper[4907]: I0226 16:36:00.148836 4907 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29535396-br7bg" Feb 26 16:36:00 crc kubenswrapper[4907]: I0226 16:36:00.150541 4907 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Feb 26 16:36:00 crc kubenswrapper[4907]: I0226 16:36:00.152538 4907 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-n2mrp" Feb 26 16:36:00 crc kubenswrapper[4907]: I0226 16:36:00.153632 4907 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Feb 26 16:36:00 crc kubenswrapper[4907]: I0226 16:36:00.173619 4907 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29535396-br7bg"] Feb 26 16:36:00 crc kubenswrapper[4907]: I0226 16:36:00.253800 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-w4zjw\" (UniqueName: \"kubernetes.io/projected/6c80b0c5-c510-48cf-937a-3a9f11285427-kube-api-access-w4zjw\") pod \"auto-csr-approver-29535396-br7bg\" (UID: \"6c80b0c5-c510-48cf-937a-3a9f11285427\") " pod="openshift-infra/auto-csr-approver-29535396-br7bg" Feb 26 16:36:00 crc kubenswrapper[4907]: I0226 16:36:00.355661 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-w4zjw\" (UniqueName: 
\"kubernetes.io/projected/6c80b0c5-c510-48cf-937a-3a9f11285427-kube-api-access-w4zjw\") pod \"auto-csr-approver-29535396-br7bg\" (UID: \"6c80b0c5-c510-48cf-937a-3a9f11285427\") " pod="openshift-infra/auto-csr-approver-29535396-br7bg" Feb 26 16:36:00 crc kubenswrapper[4907]: I0226 16:36:00.375536 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-w4zjw\" (UniqueName: \"kubernetes.io/projected/6c80b0c5-c510-48cf-937a-3a9f11285427-kube-api-access-w4zjw\") pod \"auto-csr-approver-29535396-br7bg\" (UID: \"6c80b0c5-c510-48cf-937a-3a9f11285427\") " pod="openshift-infra/auto-csr-approver-29535396-br7bg" Feb 26 16:36:00 crc kubenswrapper[4907]: I0226 16:36:00.465773 4907 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29535396-br7bg" Feb 26 16:36:00 crc kubenswrapper[4907]: I0226 16:36:00.935644 4907 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29535396-br7bg"] Feb 26 16:36:00 crc kubenswrapper[4907]: I0226 16:36:00.952099 4907 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Feb 26 16:36:01 crc kubenswrapper[4907]: I0226 16:36:01.239624 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29535396-br7bg" event={"ID":"6c80b0c5-c510-48cf-937a-3a9f11285427","Type":"ContainerStarted","Data":"61c591f7bf0ada3d1004c4b104b64f0da3bf7d1d0fffe05f86ae61a0de0d539f"} Feb 26 16:36:02 crc kubenswrapper[4907]: E0226 16:36:02.964680 4907 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod6c80b0c5_c510_48cf_937a_3a9f11285427.slice/crio-conmon-d0e560d55afcd969bf8b16d45b3e8a2d898583108a265cf0817897c4fede33d3.scope\": RecentStats: unable to find data in memory cache]" Feb 26 16:36:03 crc kubenswrapper[4907]: I0226 16:36:03.260462 4907 
generic.go:334] "Generic (PLEG): container finished" podID="6c80b0c5-c510-48cf-937a-3a9f11285427" containerID="d0e560d55afcd969bf8b16d45b3e8a2d898583108a265cf0817897c4fede33d3" exitCode=0 Feb 26 16:36:03 crc kubenswrapper[4907]: I0226 16:36:03.260532 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29535396-br7bg" event={"ID":"6c80b0c5-c510-48cf-937a-3a9f11285427","Type":"ContainerDied","Data":"d0e560d55afcd969bf8b16d45b3e8a2d898583108a265cf0817897c4fede33d3"} Feb 26 16:36:04 crc kubenswrapper[4907]: I0226 16:36:04.658172 4907 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29535396-br7bg" Feb 26 16:36:04 crc kubenswrapper[4907]: I0226 16:36:04.734247 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-w4zjw\" (UniqueName: \"kubernetes.io/projected/6c80b0c5-c510-48cf-937a-3a9f11285427-kube-api-access-w4zjw\") pod \"6c80b0c5-c510-48cf-937a-3a9f11285427\" (UID: \"6c80b0c5-c510-48cf-937a-3a9f11285427\") " Feb 26 16:36:04 crc kubenswrapper[4907]: I0226 16:36:04.746616 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6c80b0c5-c510-48cf-937a-3a9f11285427-kube-api-access-w4zjw" (OuterVolumeSpecName: "kube-api-access-w4zjw") pod "6c80b0c5-c510-48cf-937a-3a9f11285427" (UID: "6c80b0c5-c510-48cf-937a-3a9f11285427"). InnerVolumeSpecName "kube-api-access-w4zjw". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 26 16:36:04 crc kubenswrapper[4907]: I0226 16:36:04.838038 4907 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-w4zjw\" (UniqueName: \"kubernetes.io/projected/6c80b0c5-c510-48cf-937a-3a9f11285427-kube-api-access-w4zjw\") on node \"crc\" DevicePath \"\"" Feb 26 16:36:05 crc kubenswrapper[4907]: I0226 16:36:05.128378 4907 scope.go:117] "RemoveContainer" containerID="6906cab653cd658cba31211ccc435500afa0d86f92cee413c3d24942f2acd8bd" Feb 26 16:36:05 crc kubenswrapper[4907]: E0226 16:36:05.128576 4907 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-v5ng6_openshift-machine-config-operator(917eebf3-db36-47b8-af0a-b80d042fddab)\"" pod="openshift-machine-config-operator/machine-config-daemon-v5ng6" podUID="917eebf3-db36-47b8-af0a-b80d042fddab" Feb 26 16:36:05 crc kubenswrapper[4907]: I0226 16:36:05.286817 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29535396-br7bg" event={"ID":"6c80b0c5-c510-48cf-937a-3a9f11285427","Type":"ContainerDied","Data":"61c591f7bf0ada3d1004c4b104b64f0da3bf7d1d0fffe05f86ae61a0de0d539f"} Feb 26 16:36:05 crc kubenswrapper[4907]: I0226 16:36:05.286863 4907 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="61c591f7bf0ada3d1004c4b104b64f0da3bf7d1d0fffe05f86ae61a0de0d539f" Feb 26 16:36:05 crc kubenswrapper[4907]: I0226 16:36:05.286921 4907 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29535396-br7bg" Feb 26 16:36:05 crc kubenswrapper[4907]: I0226 16:36:05.731245 4907 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29535390-vhjq2"] Feb 26 16:36:05 crc kubenswrapper[4907]: I0226 16:36:05.743119 4907 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29535390-vhjq2"] Feb 26 16:36:06 crc kubenswrapper[4907]: I0226 16:36:06.136235 4907 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="2e01c029-98ef-4ec1-a3ae-4697f2276293" path="/var/lib/kubelet/pods/2e01c029-98ef-4ec1-a3ae-4697f2276293/volumes" Feb 26 16:36:11 crc kubenswrapper[4907]: I0226 16:36:11.735981 4907 scope.go:117] "RemoveContainer" containerID="d70a830fb10dff28153179da02012a119d15c585609baa0a010ee82fb7ec527c" Feb 26 16:36:12 crc kubenswrapper[4907]: I0226 16:36:12.858278 4907 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-api-6f5746579b-4xjhs_f81805f8-b496-452b-b721-2861546c9367/barbican-api/0.log" Feb 26 16:36:13 crc kubenswrapper[4907]: I0226 16:36:13.056951 4907 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-api-6f5746579b-4xjhs_f81805f8-b496-452b-b721-2861546c9367/barbican-api-log/0.log" Feb 26 16:36:13 crc kubenswrapper[4907]: I0226 16:36:13.171273 4907 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-keystone-listener-7f8d9cb4c8-5jdnw_a0449539-dbf4-4306-9dd9-db95f762a48a/barbican-keystone-listener/0.log" Feb 26 16:36:13 crc kubenswrapper[4907]: I0226 16:36:13.291075 4907 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-keystone-listener-7f8d9cb4c8-5jdnw_a0449539-dbf4-4306-9dd9-db95f762a48a/barbican-keystone-listener-log/0.log" Feb 26 16:36:13 crc kubenswrapper[4907]: I0226 16:36:13.396910 4907 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack_barbican-worker-d9b8ff5ff-b7kpr_a0ed716e-493d-4590-81a0-203b8618cf61/barbican-worker/0.log" Feb 26 16:36:13 crc kubenswrapper[4907]: I0226 16:36:13.442415 4907 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-worker-d9b8ff5ff-b7kpr_a0ed716e-493d-4590-81a0-203b8618cf61/barbican-worker-log/0.log" Feb 26 16:36:13 crc kubenswrapper[4907]: I0226 16:36:13.631258 4907 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_bootstrap-edpm-deployment-openstack-edpm-ipam-s8jbj_235c91d9-1679-4ab9-b8a3-87d7fd5f68cf/bootstrap-edpm-deployment-openstack-edpm-ipam/0.log" Feb 26 16:36:13 crc kubenswrapper[4907]: I0226 16:36:13.725741 4907 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ceilometer-0_31a81b99-dba6-4f2e-95eb-09f66cdd28df/ceilometer-central-agent/0.log" Feb 26 16:36:13 crc kubenswrapper[4907]: I0226 16:36:13.860530 4907 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ceilometer-0_31a81b99-dba6-4f2e-95eb-09f66cdd28df/proxy-httpd/0.log" Feb 26 16:36:13 crc kubenswrapper[4907]: I0226 16:36:13.871340 4907 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ceilometer-0_31a81b99-dba6-4f2e-95eb-09f66cdd28df/ceilometer-notification-agent/0.log" Feb 26 16:36:13 crc kubenswrapper[4907]: I0226 16:36:13.952781 4907 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ceilometer-0_31a81b99-dba6-4f2e-95eb-09f66cdd28df/sg-core/0.log" Feb 26 16:36:14 crc kubenswrapper[4907]: I0226 16:36:14.262456 4907 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cinder-api-0_193a5b34-9a06-4c8d-b3bc-53bc62485387/cinder-api-log/0.log" Feb 26 16:36:14 crc kubenswrapper[4907]: I0226 16:36:14.386196 4907 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cinder-api-0_193a5b34-9a06-4c8d-b3bc-53bc62485387/cinder-api/0.log" Feb 26 16:36:14 crc kubenswrapper[4907]: I0226 16:36:14.497746 4907 log.go:25] "Finished parsing log 
file" path="/var/log/pods/openstack_cinder-scheduler-0_00c049ce-b973-4246-ae47-5fb2a6789fbb/cinder-scheduler/0.log" Feb 26 16:36:14 crc kubenswrapper[4907]: I0226 16:36:14.503655 4907 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cinder-scheduler-0_00c049ce-b973-4246-ae47-5fb2a6789fbb/probe/0.log" Feb 26 16:36:14 crc kubenswrapper[4907]: I0226 16:36:14.699646 4907 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_configure-network-edpm-deployment-openstack-edpm-ipam-kx4lh_f7ab7062-024c-462c-99a7-4c3f6f27e471/configure-network-edpm-deployment-openstack-edpm-ipam/0.log" Feb 26 16:36:14 crc kubenswrapper[4907]: I0226 16:36:14.837021 4907 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_configure-os-edpm-deployment-openstack-edpm-ipam-sf9ts_e9a87b6e-5a0f-4201-b6d1-a1cd0d224361/configure-os-edpm-deployment-openstack-edpm-ipam/0.log" Feb 26 16:36:14 crc kubenswrapper[4907]: I0226 16:36:14.996532 4907 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_dnsmasq-dns-67cb876dc9-thjms_c0ee4ec2-b0e1-4927-9258-df237432c628/init/0.log" Feb 26 16:36:15 crc kubenswrapper[4907]: I0226 16:36:15.171244 4907 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_dnsmasq-dns-67cb876dc9-thjms_c0ee4ec2-b0e1-4927-9258-df237432c628/init/0.log" Feb 26 16:36:15 crc kubenswrapper[4907]: I0226 16:36:15.276170 4907 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_download-cache-edpm-deployment-openstack-edpm-ipam-kdc4r_9c764e34-e690-4b9f-aae5-9ea7ccacd4fc/download-cache-edpm-deployment-openstack-edpm-ipam/0.log" Feb 26 16:36:15 crc kubenswrapper[4907]: I0226 16:36:15.312436 4907 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_dnsmasq-dns-67cb876dc9-thjms_c0ee4ec2-b0e1-4927-9258-df237432c628/dnsmasq-dns/0.log" Feb 26 16:36:15 crc kubenswrapper[4907]: I0226 16:36:15.529741 4907 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack_glance-default-external-api-0_86460377-004c-4908-be6f-328241e8b5fb/glance-log/0.log" Feb 26 16:36:15 crc kubenswrapper[4907]: I0226 16:36:15.535147 4907 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_glance-default-external-api-0_86460377-004c-4908-be6f-328241e8b5fb/glance-httpd/0.log" Feb 26 16:36:15 crc kubenswrapper[4907]: I0226 16:36:15.639440 4907 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_glance-default-internal-api-0_058a6068-cb3c-42f2-bbe5-7b4dbc71d194/glance-httpd/0.log" Feb 26 16:36:15 crc kubenswrapper[4907]: I0226 16:36:15.746368 4907 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_glance-default-internal-api-0_058a6068-cb3c-42f2-bbe5-7b4dbc71d194/glance-log/0.log" Feb 26 16:36:15 crc kubenswrapper[4907]: I0226 16:36:15.930659 4907 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_horizon-76d88967b8-wmzcw_b35f87c4-e535-4901-8814-0b321b201158/horizon/2.log" Feb 26 16:36:15 crc kubenswrapper[4907]: I0226 16:36:15.966994 4907 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_horizon-76d88967b8-wmzcw_b35f87c4-e535-4901-8814-0b321b201158/horizon/1.log" Feb 26 16:36:16 crc kubenswrapper[4907]: I0226 16:36:16.230572 4907 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_horizon-76d88967b8-wmzcw_b35f87c4-e535-4901-8814-0b321b201158/horizon-log/0.log" Feb 26 16:36:16 crc kubenswrapper[4907]: I0226 16:36:16.252053 4907 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_install-certs-edpm-deployment-openstack-edpm-ipam-qrpg9_be614198-ac98-4ed9-926b-c1a2aa9789c5/install-certs-edpm-deployment-openstack-edpm-ipam/0.log" Feb 26 16:36:16 crc kubenswrapper[4907]: I0226 16:36:16.482228 4907 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_install-os-edpm-deployment-openstack-edpm-ipam-tmgdw_b47b4d79-5f18-4d3d-8263-21fb9b0d31b3/install-os-edpm-deployment-openstack-edpm-ipam/0.log" 
Feb 26 16:36:16 crc kubenswrapper[4907]: I0226 16:36:16.582082 4907 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_keystone-86f7f47947-xzhlh_bb4b5b1f-5a7e-4bdd-a013-988c8057f16c/keystone-api/0.log" Feb 26 16:36:16 crc kubenswrapper[4907]: I0226 16:36:16.729660 4907 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_kube-state-metrics-0_f7394cd4-d14c-450e-8865-7c7509c5021b/kube-state-metrics/0.log" Feb 26 16:36:16 crc kubenswrapper[4907]: I0226 16:36:16.912445 4907 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_libvirt-edpm-deployment-openstack-edpm-ipam-gc9ft_2ad5f1d0-06ec-4101-b484-d4e1bc3746a3/libvirt-edpm-deployment-openstack-edpm-ipam/0.log" Feb 26 16:36:17 crc kubenswrapper[4907]: I0226 16:36:17.244926 4907 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_neutron-6db49c6bf7-w2792_3996ac72-7ea7-4e6f-a714-1a0597f15fde/neutron-api/0.log" Feb 26 16:36:17 crc kubenswrapper[4907]: I0226 16:36:17.298824 4907 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_neutron-6db49c6bf7-w2792_3996ac72-7ea7-4e6f-a714-1a0597f15fde/neutron-httpd/0.log" Feb 26 16:36:17 crc kubenswrapper[4907]: I0226 16:36:17.579787 4907 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_neutron-metadata-edpm-deployment-openstack-edpm-ipam-vvxgw_ae4ed9f9-3638-491a-8467-0035443468c1/neutron-metadata-edpm-deployment-openstack-edpm-ipam/0.log" Feb 26 16:36:17 crc kubenswrapper[4907]: I0226 16:36:17.961503 4907 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-api-0_674c61cb-49ef-4710-b83f-0374acf42f6a/nova-api-log/0.log" Feb 26 16:36:18 crc kubenswrapper[4907]: I0226 16:36:18.089309 4907 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-api-0_674c61cb-49ef-4710-b83f-0374acf42f6a/nova-api-api/0.log" Feb 26 16:36:18 crc kubenswrapper[4907]: I0226 16:36:18.094941 4907 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack_nova-cell0-conductor-0_ada06759-c75a-49d4-9bbc-ef11e888b457/nova-cell0-conductor-conductor/0.log" Feb 26 16:36:18 crc kubenswrapper[4907]: I0226 16:36:18.320008 4907 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-cell1-conductor-0_a123e787-8e80-495d-86f2-717a9c43353c/nova-cell1-conductor-conductor/0.log" Feb 26 16:36:18 crc kubenswrapper[4907]: I0226 16:36:18.514153 4907 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-cell1-novncproxy-0_f166b819-6d86-432a-a806-764338fb2687/nova-cell1-novncproxy-novncproxy/0.log" Feb 26 16:36:18 crc kubenswrapper[4907]: I0226 16:36:18.695660 4907 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-edpm-deployment-openstack-edpm-ipam-klh96_16415278-d48c-47a3-92b4-0dfb2da9c8ca/nova-edpm-deployment-openstack-edpm-ipam/0.log" Feb 26 16:36:18 crc kubenswrapper[4907]: I0226 16:36:18.932337 4907 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-metadata-0_30cde741-a6c4-485b-9ff4-ee2da1ffb88c/nova-metadata-log/0.log" Feb 26 16:36:19 crc kubenswrapper[4907]: I0226 16:36:19.118936 4907 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-scheduler-0_c994f627-1f02-468c-9651-19ac6a8728b4/nova-scheduler-scheduler/0.log" Feb 26 16:36:19 crc kubenswrapper[4907]: I0226 16:36:19.217975 4907 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-cell1-galera-0_7d7af39e-1222-4a40-a2f3-a644e2ef477d/mysql-bootstrap/0.log" Feb 26 16:36:19 crc kubenswrapper[4907]: I0226 16:36:19.465422 4907 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-cell1-galera-0_7d7af39e-1222-4a40-a2f3-a644e2ef477d/mysql-bootstrap/0.log" Feb 26 16:36:19 crc kubenswrapper[4907]: I0226 16:36:19.507447 4907 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-cell1-galera-0_7d7af39e-1222-4a40-a2f3-a644e2ef477d/galera/0.log" Feb 26 16:36:19 crc 
kubenswrapper[4907]: I0226 16:36:19.625750 4907 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-metadata-0_30cde741-a6c4-485b-9ff4-ee2da1ffb88c/nova-metadata-metadata/0.log" Feb 26 16:36:19 crc kubenswrapper[4907]: I0226 16:36:19.742524 4907 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-galera-0_3fdde055-1569-4b2a-bc9f-893b93ee63b1/mysql-bootstrap/0.log" Feb 26 16:36:20 crc kubenswrapper[4907]: I0226 16:36:20.029797 4907 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-galera-0_3fdde055-1569-4b2a-bc9f-893b93ee63b1/galera/0.log" Feb 26 16:36:20 crc kubenswrapper[4907]: I0226 16:36:20.045194 4907 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-galera-0_3fdde055-1569-4b2a-bc9f-893b93ee63b1/mysql-bootstrap/0.log" Feb 26 16:36:20 crc kubenswrapper[4907]: I0226 16:36:20.127695 4907 scope.go:117] "RemoveContainer" containerID="6906cab653cd658cba31211ccc435500afa0d86f92cee413c3d24942f2acd8bd" Feb 26 16:36:20 crc kubenswrapper[4907]: E0226 16:36:20.127908 4907 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-v5ng6_openshift-machine-config-operator(917eebf3-db36-47b8-af0a-b80d042fddab)\"" pod="openshift-machine-config-operator/machine-config-daemon-v5ng6" podUID="917eebf3-db36-47b8-af0a-b80d042fddab" Feb 26 16:36:20 crc kubenswrapper[4907]: I0226 16:36:20.136787 4907 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstackclient_173e1a27-c6cc-47cf-9d1a-8e9e19fe3afa/openstackclient/0.log" Feb 26 16:36:20 crc kubenswrapper[4907]: I0226 16:36:20.407376 4907 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-drng5_66d3c733-f440-4877-9e7b-af62f5dc7857/ovn-controller/0.log" Feb 26 16:36:20 crc kubenswrapper[4907]: I0226 
16:36:20.427329 4907 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-metrics-j84zw_928405a5-2e89-44dd-ab55-8d82ba1db8c3/openstack-network-exporter/0.log" Feb 26 16:36:20 crc kubenswrapper[4907]: I0226 16:36:20.682781 4907 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-ovs-9qr64_ce0f1161-6251-4318-b364-7db1779f93bd/ovsdb-server-init/0.log" Feb 26 16:36:20 crc kubenswrapper[4907]: I0226 16:36:20.988501 4907 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-ovs-9qr64_ce0f1161-6251-4318-b364-7db1779f93bd/ovsdb-server-init/0.log" Feb 26 16:36:21 crc kubenswrapper[4907]: I0226 16:36:21.012229 4907 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-ovs-9qr64_ce0f1161-6251-4318-b364-7db1779f93bd/ovs-vswitchd/0.log" Feb 26 16:36:21 crc kubenswrapper[4907]: I0226 16:36:21.015749 4907 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-ovs-9qr64_ce0f1161-6251-4318-b364-7db1779f93bd/ovsdb-server/0.log" Feb 26 16:36:21 crc kubenswrapper[4907]: I0226 16:36:21.294555 4907 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-edpm-deployment-openstack-edpm-ipam-h29r8_b796cd80-c3e7-428e-a090-1569637819e8/ovn-edpm-deployment-openstack-edpm-ipam/0.log" Feb 26 16:36:21 crc kubenswrapper[4907]: I0226 16:36:21.379279 4907 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-northd-0_0ac0b04a-5f93-4033-b52a-46a47b9f3364/ovn-northd/0.log" Feb 26 16:36:21 crc kubenswrapper[4907]: I0226 16:36:21.414811 4907 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-northd-0_0ac0b04a-5f93-4033-b52a-46a47b9f3364/openstack-network-exporter/0.log" Feb 26 16:36:21 crc kubenswrapper[4907]: I0226 16:36:21.604273 4907 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-nb-0_a7d66633-e694-4e7e-ba21-70dc18b93cfb/openstack-network-exporter/0.log" Feb 
26 16:36:21 crc kubenswrapper[4907]: I0226 16:36:21.782475 4907 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-nb-0_a7d66633-e694-4e7e-ba21-70dc18b93cfb/ovsdbserver-nb/0.log" Feb 26 16:36:21 crc kubenswrapper[4907]: I0226 16:36:21.853847 4907 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-sb-0_6c669352-7f94-4a3c-bf6c-a84f7bf2e5e2/openstack-network-exporter/0.log" Feb 26 16:36:21 crc kubenswrapper[4907]: I0226 16:36:21.978500 4907 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-sb-0_6c669352-7f94-4a3c-bf6c-a84f7bf2e5e2/ovsdbserver-sb/0.log" Feb 26 16:36:22 crc kubenswrapper[4907]: I0226 16:36:22.169717 4907 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_placement-7c4f4876c6-sk5mm_85da2141-e440-4d43-8f34-47c130cedfe3/placement-api/0.log" Feb 26 16:36:22 crc kubenswrapper[4907]: I0226 16:36:22.173201 4907 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_placement-7c4f4876c6-sk5mm_85da2141-e440-4d43-8f34-47c130cedfe3/placement-log/0.log" Feb 26 16:36:22 crc kubenswrapper[4907]: I0226 16:36:22.376183 4907 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-cell1-server-0_cbc69627-1691-43df-a77a-ca3e26e67aaa/setup-container/0.log" Feb 26 16:36:22 crc kubenswrapper[4907]: I0226 16:36:22.699358 4907 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-server-0_20078d55-ee5c-4818-9ff9-4089683c9729/setup-container/0.log" Feb 26 16:36:22 crc kubenswrapper[4907]: I0226 16:36:22.743396 4907 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-cell1-server-0_cbc69627-1691-43df-a77a-ca3e26e67aaa/setup-container/0.log" Feb 26 16:36:22 crc kubenswrapper[4907]: I0226 16:36:22.852189 4907 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-cell1-server-0_cbc69627-1691-43df-a77a-ca3e26e67aaa/rabbitmq/0.log" Feb 26 16:36:23 crc 
kubenswrapper[4907]: I0226 16:36:23.025904 4907 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-server-0_20078d55-ee5c-4818-9ff9-4089683c9729/rabbitmq/0.log" Feb 26 16:36:23 crc kubenswrapper[4907]: I0226 16:36:23.027902 4907 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-server-0_20078d55-ee5c-4818-9ff9-4089683c9729/setup-container/0.log" Feb 26 16:36:23 crc kubenswrapper[4907]: I0226 16:36:23.203902 4907 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_reboot-os-edpm-deployment-openstack-edpm-ipam-8qvs2_348e7351-416b-4791-b202-46ce193e0c6e/reboot-os-edpm-deployment-openstack-edpm-ipam/0.log" Feb 26 16:36:23 crc kubenswrapper[4907]: I0226 16:36:23.320780 4907 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_redhat-edpm-deployment-openstack-edpm-ipam-nzx6v_744e4551-7f1b-4a7e-a907-2e2fd05053e1/redhat-edpm-deployment-openstack-edpm-ipam/0.log" Feb 26 16:36:23 crc kubenswrapper[4907]: I0226 16:36:23.539627 4907 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_repo-setup-edpm-deployment-openstack-edpm-ipam-cjbr8_47906d66-a8ce-445d-a71c-63f5bcfb6902/repo-setup-edpm-deployment-openstack-edpm-ipam/0.log" Feb 26 16:36:23 crc kubenswrapper[4907]: I0226 16:36:23.574363 4907 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_run-os-edpm-deployment-openstack-edpm-ipam-8fgwz_bc0bd13e-cad0-4a21-856b-aaf97d65cec2/run-os-edpm-deployment-openstack-edpm-ipam/0.log" Feb 26 16:36:23 crc kubenswrapper[4907]: I0226 16:36:23.767034 4907 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ssh-known-hosts-edpm-deployment-skfk6_4ba81fad-7677-4ea8-b338-09ef7f73f63b/ssh-known-hosts-edpm-deployment/0.log" Feb 26 16:36:23 crc kubenswrapper[4907]: I0226 16:36:23.965829 4907 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-proxy-58d5d7785f-4fcrq_80cd7152-934f-40c6-925c-a3f1f9dfca95/proxy-server/0.log" Feb 26 16:36:24 
crc kubenswrapper[4907]: I0226 16:36:24.167524 4907 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-proxy-58d5d7785f-4fcrq_80cd7152-934f-40c6-925c-a3f1f9dfca95/proxy-httpd/0.log" Feb 26 16:36:24 crc kubenswrapper[4907]: I0226 16:36:24.196340 4907 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-ring-rebalance-zj4xn_c5f9c74c-c90c-40ba-9548-dc79f90592a4/swift-ring-rebalance/0.log" Feb 26 16:36:24 crc kubenswrapper[4907]: I0226 16:36:24.379036 4907 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_819c7fec-fd22-478a-bf6c-f4cb5aeccc59/account-auditor/0.log" Feb 26 16:36:24 crc kubenswrapper[4907]: I0226 16:36:24.383039 4907 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_819c7fec-fd22-478a-bf6c-f4cb5aeccc59/account-reaper/0.log" Feb 26 16:36:24 crc kubenswrapper[4907]: I0226 16:36:24.511342 4907 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_819c7fec-fd22-478a-bf6c-f4cb5aeccc59/account-replicator/0.log" Feb 26 16:36:24 crc kubenswrapper[4907]: I0226 16:36:24.581536 4907 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_819c7fec-fd22-478a-bf6c-f4cb5aeccc59/account-server/0.log" Feb 26 16:36:24 crc kubenswrapper[4907]: I0226 16:36:24.664538 4907 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_819c7fec-fd22-478a-bf6c-f4cb5aeccc59/container-auditor/0.log" Feb 26 16:36:24 crc kubenswrapper[4907]: I0226 16:36:24.678912 4907 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_819c7fec-fd22-478a-bf6c-f4cb5aeccc59/container-replicator/0.log" Feb 26 16:36:24 crc kubenswrapper[4907]: I0226 16:36:24.807985 4907 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_819c7fec-fd22-478a-bf6c-f4cb5aeccc59/container-updater/0.log" Feb 26 16:36:24 crc kubenswrapper[4907]: I0226 16:36:24.880210 
4907 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_819c7fec-fd22-478a-bf6c-f4cb5aeccc59/container-server/0.log" Feb 26 16:36:25 crc kubenswrapper[4907]: I0226 16:36:25.013167 4907 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_819c7fec-fd22-478a-bf6c-f4cb5aeccc59/object-expirer/0.log" Feb 26 16:36:25 crc kubenswrapper[4907]: I0226 16:36:25.032614 4907 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_819c7fec-fd22-478a-bf6c-f4cb5aeccc59/object-replicator/0.log" Feb 26 16:36:25 crc kubenswrapper[4907]: I0226 16:36:25.099366 4907 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_819c7fec-fd22-478a-bf6c-f4cb5aeccc59/object-auditor/0.log" Feb 26 16:36:25 crc kubenswrapper[4907]: I0226 16:36:25.208457 4907 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_819c7fec-fd22-478a-bf6c-f4cb5aeccc59/object-server/0.log" Feb 26 16:36:25 crc kubenswrapper[4907]: I0226 16:36:25.267891 4907 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_819c7fec-fd22-478a-bf6c-f4cb5aeccc59/object-updater/0.log" Feb 26 16:36:25 crc kubenswrapper[4907]: I0226 16:36:25.339716 4907 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_819c7fec-fd22-478a-bf6c-f4cb5aeccc59/rsync/0.log" Feb 26 16:36:25 crc kubenswrapper[4907]: I0226 16:36:25.369434 4907 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_819c7fec-fd22-478a-bf6c-f4cb5aeccc59/swift-recon-cron/0.log" Feb 26 16:36:25 crc kubenswrapper[4907]: I0226 16:36:25.705786 4907 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_telemetry-edpm-deployment-openstack-edpm-ipam-4wnq2_2483c310-db88-4757-857d-91e2815bbe67/telemetry-edpm-deployment-openstack-edpm-ipam/0.log" Feb 26 16:36:25 crc kubenswrapper[4907]: I0226 16:36:25.749194 4907 log.go:25] "Finished 
parsing log file" path="/var/log/pods/openstack_tempest-tests-tempest_b09f4f4d-8644-4923-ab26-849b249efd4e/tempest-tests-tempest-tests-runner/0.log" Feb 26 16:36:25 crc kubenswrapper[4907]: I0226 16:36:25.865546 4907 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_test-operator-logs-pod-tempest-tempest-tests-tempest_e10c0113-d917-4c4a-be56-0e234e61e744/test-operator-logs-container/0.log" Feb 26 16:36:26 crc kubenswrapper[4907]: I0226 16:36:26.085777 4907 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_validate-network-edpm-deployment-openstack-edpm-ipam-blwb6_93c0beb2-fc90-42f2-b4dd-f0f043cc0ede/validate-network-edpm-deployment-openstack-edpm-ipam/0.log" Feb 26 16:36:28 crc kubenswrapper[4907]: I0226 16:36:28.589689 4907 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_memcached-0_964032de-099d-4e22-95d5-d7acf78c5685/memcached/0.log" Feb 26 16:36:31 crc kubenswrapper[4907]: I0226 16:36:31.126781 4907 scope.go:117] "RemoveContainer" containerID="6906cab653cd658cba31211ccc435500afa0d86f92cee413c3d24942f2acd8bd" Feb 26 16:36:31 crc kubenswrapper[4907]: E0226 16:36:31.127200 4907 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-v5ng6_openshift-machine-config-operator(917eebf3-db36-47b8-af0a-b80d042fddab)\"" pod="openshift-machine-config-operator/machine-config-daemon-v5ng6" podUID="917eebf3-db36-47b8-af0a-b80d042fddab" Feb 26 16:36:42 crc kubenswrapper[4907]: I0226 16:36:42.127324 4907 scope.go:117] "RemoveContainer" containerID="6906cab653cd658cba31211ccc435500afa0d86f92cee413c3d24942f2acd8bd" Feb 26 16:36:42 crc kubenswrapper[4907]: E0226 16:36:42.128120 4907 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed 
container=machine-config-daemon pod=machine-config-daemon-v5ng6_openshift-machine-config-operator(917eebf3-db36-47b8-af0a-b80d042fddab)\"" pod="openshift-machine-config-operator/machine-config-daemon-v5ng6" podUID="917eebf3-db36-47b8-af0a-b80d042fddab" Feb 26 16:36:51 crc kubenswrapper[4907]: I0226 16:36:51.896879 4907 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_designate-operator-controller-manager-55cc45767f-nxx6j_41934925-b8e2-4927-a9a6-07defdda378c/manager/0.log" Feb 26 16:36:52 crc kubenswrapper[4907]: I0226 16:36:52.188659 4907 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_ffa8797d5133dab2efd18584e0d5d66ef5b78c6ec08e68212a6e3345c6rw6x4_8a2e47e7-4347-4860-8c91-5a2b12ae1066/util/0.log" Feb 26 16:36:52 crc kubenswrapper[4907]: I0226 16:36:52.427564 4907 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_ffa8797d5133dab2efd18584e0d5d66ef5b78c6ec08e68212a6e3345c6rw6x4_8a2e47e7-4347-4860-8c91-5a2b12ae1066/pull/0.log" Feb 26 16:36:52 crc kubenswrapper[4907]: I0226 16:36:52.472894 4907 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_ffa8797d5133dab2efd18584e0d5d66ef5b78c6ec08e68212a6e3345c6rw6x4_8a2e47e7-4347-4860-8c91-5a2b12ae1066/util/0.log" Feb 26 16:36:52 crc kubenswrapper[4907]: I0226 16:36:52.665967 4907 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_ffa8797d5133dab2efd18584e0d5d66ef5b78c6ec08e68212a6e3345c6rw6x4_8a2e47e7-4347-4860-8c91-5a2b12ae1066/pull/0.log" Feb 26 16:36:52 crc kubenswrapper[4907]: I0226 16:36:52.933668 4907 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_ffa8797d5133dab2efd18584e0d5d66ef5b78c6ec08e68212a6e3345c6rw6x4_8a2e47e7-4347-4860-8c91-5a2b12ae1066/pull/0.log" Feb 26 16:36:52 crc kubenswrapper[4907]: I0226 16:36:52.987986 4907 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack-operators_cinder-operator-controller-manager-768c8b45bb-k4hzr_a9988ddc-f970-4dac-bcd0-92266f0c7494/manager/0.log" Feb 26 16:36:53 crc kubenswrapper[4907]: I0226 16:36:53.004975 4907 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_ffa8797d5133dab2efd18584e0d5d66ef5b78c6ec08e68212a6e3345c6rw6x4_8a2e47e7-4347-4860-8c91-5a2b12ae1066/util/0.log" Feb 26 16:36:53 crc kubenswrapper[4907]: I0226 16:36:53.126599 4907 scope.go:117] "RemoveContainer" containerID="6906cab653cd658cba31211ccc435500afa0d86f92cee413c3d24942f2acd8bd" Feb 26 16:36:53 crc kubenswrapper[4907]: I0226 16:36:53.207235 4907 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_ffa8797d5133dab2efd18584e0d5d66ef5b78c6ec08e68212a6e3345c6rw6x4_8a2e47e7-4347-4860-8c91-5a2b12ae1066/extract/0.log" Feb 26 16:36:53 crc kubenswrapper[4907]: I0226 16:36:53.392955 4907 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_glance-operator-controller-manager-7f748f8b74-q55xl_e57bde5d-eca0-458a-af67-2f45ce85c54f/manager/0.log" Feb 26 16:36:53 crc kubenswrapper[4907]: I0226 16:36:53.569572 4907 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_heat-operator-controller-manager-9595d6797-m4jb4_44c123c9-ac46-4afe-b6d8-773f70ecc033/manager/0.log" Feb 26 16:36:53 crc kubenswrapper[4907]: I0226 16:36:53.741061 4907 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_horizon-operator-controller-manager-54fb488b88-6hchw_ac6b0a27-6eaf-4d88-af65-94c64180c950/manager/0.log" Feb 26 16:36:53 crc kubenswrapper[4907]: I0226 16:36:53.753637 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-v5ng6" event={"ID":"917eebf3-db36-47b8-af0a-b80d042fddab","Type":"ContainerStarted","Data":"d43bc521831c88457b494ef539cb7ec24221ab1999bc5d1f490d67f1fd00bc95"} Feb 26 16:36:54 crc kubenswrapper[4907]: I0226 16:36:54.226134 
4907 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_ironic-operator-controller-manager-6494cdbf8f-2r2t2_142a17bc-42dd-41ab-a97c-21350948ca5d/manager/0.log" Feb 26 16:36:54 crc kubenswrapper[4907]: I0226 16:36:54.649940 4907 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_keystone-operator-controller-manager-6c78d668d5-245bf_f7c1fe7a-3983-49ff-bcde-36338aadc657/manager/0.log" Feb 26 16:36:54 crc kubenswrapper[4907]: I0226 16:36:54.759692 4907 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_infra-operator-controller-manager-66d6b5f488-g7cb4_13df9f9f-0740-41d3-b193-0517c76d2830/manager/0.log" Feb 26 16:36:55 crc kubenswrapper[4907]: I0226 16:36:55.040401 4907 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_manila-operator-controller-manager-76fd76856-pk8zs_9fb09a9c-025a-4bc0-81a0-c127fee3f6f3/manager/0.log" Feb 26 16:36:55 crc kubenswrapper[4907]: I0226 16:36:55.188466 4907 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_mariadb-operator-controller-manager-6dc9b6ff89-vtc25_3c5efb12-7704-4d2a-9ea6-aa35436391ae/manager/0.log" Feb 26 16:36:55 crc kubenswrapper[4907]: I0226 16:36:55.493854 4907 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_neutron-operator-controller-manager-54967dbbdf-24rjt_25c72e04-6714-4c5b-a273-a21a1415c4ac/manager/0.log" Feb 26 16:36:55 crc kubenswrapper[4907]: I0226 16:36:55.738981 4907 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_nova-operator-controller-manager-5d56fd956f-6znnd_9a69dc6a-4034-4e7d-8b6f-576ccd828cf6/manager/0.log" Feb 26 16:36:55 crc kubenswrapper[4907]: I0226 16:36:55.937715 4907 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_octavia-operator-controller-manager-77b8b67585-x8222_5dac7dc1-cf0e-4962-956e-800b57e369e1/manager/0.log" Feb 26 16:36:56 crc kubenswrapper[4907]: I0226 
16:36:56.139383 4907 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_openstack-baremetal-operator-controller-manager-c5677dc5d-mmllt_1bcfd62b-212e-4efc-b0be-f0542e186f07/manager/0.log" Feb 26 16:36:56 crc kubenswrapper[4907]: I0226 16:36:56.502166 4907 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_openstack-operator-controller-init-66fc5dfc5b-4l68j_76bf7541-fa3f-471d-8a14-99300afab6c1/operator/0.log" Feb 26 16:36:56 crc kubenswrapper[4907]: I0226 16:36:56.771499 4907 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_openstack-operator-index-ldb8j_7a83fde4-3660-4aa5-8bdd-ad32bfcc704c/registry-server/0.log" Feb 26 16:36:57 crc kubenswrapper[4907]: I0226 16:36:57.077243 4907 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_ovn-operator-controller-manager-85c99d655-t27pd_51842918-6f0f-4599-b288-84c75e4390ad/manager/0.log" Feb 26 16:36:57 crc kubenswrapper[4907]: I0226 16:36:57.233354 4907 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_placement-operator-controller-manager-57bd55f9b7-mxbcg_1bba2156-1275-4aa3-8eba-3ce7c3c85d72/manager/0.log" Feb 26 16:36:57 crc kubenswrapper[4907]: I0226 16:36:57.347865 4907 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_rabbitmq-cluster-operator-manager-668c99d594-psxq9_876cfd39-7856-438c-923e-1eb89fae62b0/operator/0.log" Feb 26 16:36:57 crc kubenswrapper[4907]: I0226 16:36:57.490658 4907 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_swift-operator-controller-manager-79558bbfbf-g2mlp_311f46b9-23bf-49b6-a2a5-919c8e42c62a/manager/0.log" Feb 26 16:36:57 crc kubenswrapper[4907]: I0226 16:36:57.587002 4907 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_barbican-operator-controller-manager-c4b7d6946-58hjs_7fc27253-f8a7-4b6c-b83a-d32cdadb162d/manager/0.log" Feb 26 16:36:57 crc 
kubenswrapper[4907]: I0226 16:36:57.864116 4907 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_telemetry-operator-controller-manager-56dc67d744-w7qpb_2c9290e8-c587-48aa-8ea2-66b772c9341c/manager/0.log" Feb 26 16:36:57 crc kubenswrapper[4907]: I0226 16:36:57.969287 4907 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_test-operator-controller-manager-8467ccb4c8-r6ndg_872f261b-cbf5-47b6-99ce-ee5c0d9794a3/manager/0.log" Feb 26 16:36:58 crc kubenswrapper[4907]: I0226 16:36:58.060726 4907 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_openstack-operator-controller-manager-5c89c59655-dbrxn_e8f0195b-740f-4219-a422-9b99f2841ee5/manager/0.log" Feb 26 16:36:58 crc kubenswrapper[4907]: I0226 16:36:58.123216 4907 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_watcher-operator-controller-manager-76bcb69745-v2z8v_edeb6783-da9a-4f17-8ebe-e234aeeb35fd/manager/0.log" Feb 26 16:37:18 crc kubenswrapper[4907]: I0226 16:37:18.678376 4907 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-machine-api_control-plane-machine-set-operator-78cbb6b69f-w9nx4_af8aa9df-432b-40bd-847c-c3539b32cb59/control-plane-machine-set-operator/0.log" Feb 26 16:37:19 crc kubenswrapper[4907]: I0226 16:37:19.063057 4907 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-machine-api_machine-api-operator-5694c8668f-hdqkt_489d8c16-01bf-466b-a863-a3c8594d8b88/kube-rbac-proxy/0.log" Feb 26 16:37:19 crc kubenswrapper[4907]: I0226 16:37:19.124355 4907 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-machine-api_machine-api-operator-5694c8668f-hdqkt_489d8c16-01bf-466b-a863-a3c8594d8b88/machine-api-operator/0.log" Feb 26 16:37:32 crc kubenswrapper[4907]: I0226 16:37:32.596935 4907 log.go:25] "Finished parsing log file" 
path="/var/log/pods/cert-manager_cert-manager-858654f9db-v6vbf_1e1d1a02-d13e-4410-8762-ffa52da94db0/cert-manager-controller/0.log" Feb 26 16:37:32 crc kubenswrapper[4907]: I0226 16:37:32.814665 4907 log.go:25] "Finished parsing log file" path="/var/log/pods/cert-manager_cert-manager-cainjector-cf98fcc89-8lnvq_2a995506-4e43-40d2-8e85-720648605979/cert-manager-cainjector/0.log" Feb 26 16:37:32 crc kubenswrapper[4907]: I0226 16:37:32.917831 4907 log.go:25] "Finished parsing log file" path="/var/log/pods/cert-manager_cert-manager-webhook-687f57d79b-hdhr9_177f40d7-0ed3-43d9-b8db-148511ab9065/cert-manager-webhook/0.log" Feb 26 16:37:46 crc kubenswrapper[4907]: I0226 16:37:46.895969 4907 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-console-plugin-5dcbbd79cf-85mn5_c973ae22-7363-4e9d-abbe-a519875d412c/nmstate-console-plugin/0.log" Feb 26 16:37:47 crc kubenswrapper[4907]: I0226 16:37:47.025326 4907 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-handler-5zh9k_3b24ed4b-d8ad-40c5-8b97-1a23a9fd8097/nmstate-handler/0.log" Feb 26 16:37:47 crc kubenswrapper[4907]: I0226 16:37:47.121799 4907 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-metrics-69594cc75-fpz9z_ad43a6fa-206d-43e4-8364-7902ff853e8c/kube-rbac-proxy/0.log" Feb 26 16:37:47 crc kubenswrapper[4907]: I0226 16:37:47.177915 4907 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-metrics-69594cc75-fpz9z_ad43a6fa-206d-43e4-8364-7902ff853e8c/nmstate-metrics/0.log" Feb 26 16:37:47 crc kubenswrapper[4907]: I0226 16:37:47.327321 4907 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-operator-75c5dccd6c-hpw5x_383794c0-581b-4b48-bf74-876cfe097c2e/nmstate-operator/0.log" Feb 26 16:37:47 crc kubenswrapper[4907]: I0226 16:37:47.392239 4907 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-nmstate_nmstate-webhook-786f45cff4-4cghj_aae13e12-a0b1-40c1-bdd6-844b790cb79c/nmstate-webhook/0.log" Feb 26 16:38:00 crc kubenswrapper[4907]: I0226 16:38:00.197441 4907 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29535398-ddr4l"] Feb 26 16:38:00 crc kubenswrapper[4907]: E0226 16:38:00.198459 4907 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6c80b0c5-c510-48cf-937a-3a9f11285427" containerName="oc" Feb 26 16:38:00 crc kubenswrapper[4907]: I0226 16:38:00.198473 4907 state_mem.go:107] "Deleted CPUSet assignment" podUID="6c80b0c5-c510-48cf-937a-3a9f11285427" containerName="oc" Feb 26 16:38:00 crc kubenswrapper[4907]: I0226 16:38:00.198711 4907 memory_manager.go:354] "RemoveStaleState removing state" podUID="6c80b0c5-c510-48cf-937a-3a9f11285427" containerName="oc" Feb 26 16:38:00 crc kubenswrapper[4907]: I0226 16:38:00.199344 4907 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29535398-ddr4l" Feb 26 16:38:00 crc kubenswrapper[4907]: I0226 16:38:00.203221 4907 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Feb 26 16:38:00 crc kubenswrapper[4907]: I0226 16:38:00.203458 4907 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Feb 26 16:38:00 crc kubenswrapper[4907]: I0226 16:38:00.203747 4907 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-n2mrp" Feb 26 16:38:00 crc kubenswrapper[4907]: I0226 16:38:00.220277 4907 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29535398-ddr4l"] Feb 26 16:38:00 crc kubenswrapper[4907]: I0226 16:38:00.269787 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vqrbv\" (UniqueName: 
\"kubernetes.io/projected/ebfcda1d-29af-401c-89dc-e06545c4f276-kube-api-access-vqrbv\") pod \"auto-csr-approver-29535398-ddr4l\" (UID: \"ebfcda1d-29af-401c-89dc-e06545c4f276\") " pod="openshift-infra/auto-csr-approver-29535398-ddr4l" Feb 26 16:38:00 crc kubenswrapper[4907]: I0226 16:38:00.374843 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vqrbv\" (UniqueName: \"kubernetes.io/projected/ebfcda1d-29af-401c-89dc-e06545c4f276-kube-api-access-vqrbv\") pod \"auto-csr-approver-29535398-ddr4l\" (UID: \"ebfcda1d-29af-401c-89dc-e06545c4f276\") " pod="openshift-infra/auto-csr-approver-29535398-ddr4l" Feb 26 16:38:00 crc kubenswrapper[4907]: I0226 16:38:00.404653 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vqrbv\" (UniqueName: \"kubernetes.io/projected/ebfcda1d-29af-401c-89dc-e06545c4f276-kube-api-access-vqrbv\") pod \"auto-csr-approver-29535398-ddr4l\" (UID: \"ebfcda1d-29af-401c-89dc-e06545c4f276\") " pod="openshift-infra/auto-csr-approver-29535398-ddr4l" Feb 26 16:38:00 crc kubenswrapper[4907]: I0226 16:38:00.530664 4907 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29535398-ddr4l" Feb 26 16:38:01 crc kubenswrapper[4907]: I0226 16:38:01.059793 4907 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29535398-ddr4l"] Feb 26 16:38:01 crc kubenswrapper[4907]: I0226 16:38:01.425981 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29535398-ddr4l" event={"ID":"ebfcda1d-29af-401c-89dc-e06545c4f276","Type":"ContainerStarted","Data":"4888b22552192f1cd67a661439af23034ffb8ae4d180bbd64b844e6e48b09f02"} Feb 26 16:38:03 crc kubenswrapper[4907]: I0226 16:38:03.444771 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29535398-ddr4l" event={"ID":"ebfcda1d-29af-401c-89dc-e06545c4f276","Type":"ContainerStarted","Data":"e17a64c7afd1f8e9c2aa95314db83af3c5c6827a9651cef30f5e4b7ef032b28a"} Feb 26 16:38:03 crc kubenswrapper[4907]: I0226 16:38:03.475580 4907 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-infra/auto-csr-approver-29535398-ddr4l" podStartSLOduration=2.191090525 podStartE2EDuration="3.475558244s" podCreationTimestamp="2026-02-26 16:38:00 +0000 UTC" firstStartedPulling="2026-02-26 16:38:01.075908463 +0000 UTC m=+3343.594470312" lastFinishedPulling="2026-02-26 16:38:02.360376182 +0000 UTC m=+3344.878938031" observedRunningTime="2026-02-26 16:38:03.465007197 +0000 UTC m=+3345.983569056" watchObservedRunningTime="2026-02-26 16:38:03.475558244 +0000 UTC m=+3345.994120093" Feb 26 16:38:04 crc kubenswrapper[4907]: I0226 16:38:04.457624 4907 generic.go:334] "Generic (PLEG): container finished" podID="ebfcda1d-29af-401c-89dc-e06545c4f276" containerID="e17a64c7afd1f8e9c2aa95314db83af3c5c6827a9651cef30f5e4b7ef032b28a" exitCode=0 Feb 26 16:38:04 crc kubenswrapper[4907]: I0226 16:38:04.458372 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29535398-ddr4l" 
event={"ID":"ebfcda1d-29af-401c-89dc-e06545c4f276","Type":"ContainerDied","Data":"e17a64c7afd1f8e9c2aa95314db83af3c5c6827a9651cef30f5e4b7ef032b28a"} Feb 26 16:38:05 crc kubenswrapper[4907]: I0226 16:38:05.804971 4907 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29535398-ddr4l" Feb 26 16:38:05 crc kubenswrapper[4907]: I0226 16:38:05.881183 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vqrbv\" (UniqueName: \"kubernetes.io/projected/ebfcda1d-29af-401c-89dc-e06545c4f276-kube-api-access-vqrbv\") pod \"ebfcda1d-29af-401c-89dc-e06545c4f276\" (UID: \"ebfcda1d-29af-401c-89dc-e06545c4f276\") " Feb 26 16:38:05 crc kubenswrapper[4907]: I0226 16:38:05.894916 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ebfcda1d-29af-401c-89dc-e06545c4f276-kube-api-access-vqrbv" (OuterVolumeSpecName: "kube-api-access-vqrbv") pod "ebfcda1d-29af-401c-89dc-e06545c4f276" (UID: "ebfcda1d-29af-401c-89dc-e06545c4f276"). InnerVolumeSpecName "kube-api-access-vqrbv". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 26 16:38:05 crc kubenswrapper[4907]: I0226 16:38:05.986059 4907 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vqrbv\" (UniqueName: \"kubernetes.io/projected/ebfcda1d-29af-401c-89dc-e06545c4f276-kube-api-access-vqrbv\") on node \"crc\" DevicePath \"\"" Feb 26 16:38:06 crc kubenswrapper[4907]: E0226 16:38:06.265261 4907 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podebfcda1d_29af_401c_89dc_e06545c4f276.slice/crio-4888b22552192f1cd67a661439af23034ffb8ae4d180bbd64b844e6e48b09f02\": RecentStats: unable to find data in memory cache]" Feb 26 16:38:06 crc kubenswrapper[4907]: I0226 16:38:06.475379 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29535398-ddr4l" event={"ID":"ebfcda1d-29af-401c-89dc-e06545c4f276","Type":"ContainerDied","Data":"4888b22552192f1cd67a661439af23034ffb8ae4d180bbd64b844e6e48b09f02"} Feb 26 16:38:06 crc kubenswrapper[4907]: I0226 16:38:06.475679 4907 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="4888b22552192f1cd67a661439af23034ffb8ae4d180bbd64b844e6e48b09f02" Feb 26 16:38:06 crc kubenswrapper[4907]: I0226 16:38:06.475419 4907 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29535398-ddr4l" Feb 26 16:38:06 crc kubenswrapper[4907]: I0226 16:38:06.529770 4907 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29535392-5jtj5"] Feb 26 16:38:06 crc kubenswrapper[4907]: I0226 16:38:06.536656 4907 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29535392-5jtj5"] Feb 26 16:38:08 crc kubenswrapper[4907]: I0226 16:38:08.139962 4907 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="cff0efd2-1485-42a4-9fe1-b27fc59c3322" path="/var/lib/kubelet/pods/cff0efd2-1485-42a4-9fe1-b27fc59c3322/volumes" Feb 26 16:38:11 crc kubenswrapper[4907]: I0226 16:38:11.852995 4907 scope.go:117] "RemoveContainer" containerID="72d1527220d87a01caf0e776ae54b1d090953c76745f6cccf52c73f47d1e0ba8" Feb 26 16:38:17 crc kubenswrapper[4907]: I0226 16:38:17.895602 4907 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_controller-86ddb6bd46-qwvw9_05e6312b-9683-44bf-9368-cb234744fd33/kube-rbac-proxy/0.log" Feb 26 16:38:17 crc kubenswrapper[4907]: I0226 16:38:17.988952 4907 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_controller-86ddb6bd46-qwvw9_05e6312b-9683-44bf-9368-cb234744fd33/controller/0.log" Feb 26 16:38:18 crc kubenswrapper[4907]: I0226 16:38:18.150058 4907 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-2kml2_aedab463-da2b-4bf1-a67d-16439f225983/cp-frr-files/0.log" Feb 26 16:38:18 crc kubenswrapper[4907]: I0226 16:38:18.331025 4907 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-2kml2_aedab463-da2b-4bf1-a67d-16439f225983/cp-reloader/0.log" Feb 26 16:38:18 crc kubenswrapper[4907]: I0226 16:38:18.402081 4907 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-2kml2_aedab463-da2b-4bf1-a67d-16439f225983/cp-reloader/0.log" Feb 26 16:38:18 crc kubenswrapper[4907]: I0226 
16:38:18.415750 4907 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-2kml2_aedab463-da2b-4bf1-a67d-16439f225983/cp-metrics/0.log" Feb 26 16:38:18 crc kubenswrapper[4907]: I0226 16:38:18.418192 4907 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-2kml2_aedab463-da2b-4bf1-a67d-16439f225983/cp-frr-files/0.log" Feb 26 16:38:18 crc kubenswrapper[4907]: I0226 16:38:18.580198 4907 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-2kml2_aedab463-da2b-4bf1-a67d-16439f225983/cp-frr-files/0.log" Feb 26 16:38:18 crc kubenswrapper[4907]: I0226 16:38:18.621669 4907 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-2kml2_aedab463-da2b-4bf1-a67d-16439f225983/cp-metrics/0.log" Feb 26 16:38:18 crc kubenswrapper[4907]: I0226 16:38:18.649241 4907 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-2kml2_aedab463-da2b-4bf1-a67d-16439f225983/cp-reloader/0.log" Feb 26 16:38:18 crc kubenswrapper[4907]: I0226 16:38:18.666572 4907 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-2kml2_aedab463-da2b-4bf1-a67d-16439f225983/cp-metrics/0.log" Feb 26 16:38:18 crc kubenswrapper[4907]: I0226 16:38:18.812791 4907 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-2kml2_aedab463-da2b-4bf1-a67d-16439f225983/cp-reloader/0.log" Feb 26 16:38:18 crc kubenswrapper[4907]: I0226 16:38:18.814379 4907 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-2kml2_aedab463-da2b-4bf1-a67d-16439f225983/cp-metrics/0.log" Feb 26 16:38:18 crc kubenswrapper[4907]: I0226 16:38:18.845663 4907 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-2kml2_aedab463-da2b-4bf1-a67d-16439f225983/cp-frr-files/0.log" Feb 26 16:38:18 crc kubenswrapper[4907]: I0226 16:38:18.863108 4907 log.go:25] "Finished parsing log file" 
path="/var/log/pods/metallb-system_frr-k8s-2kml2_aedab463-da2b-4bf1-a67d-16439f225983/controller/0.log" Feb 26 16:38:19 crc kubenswrapper[4907]: I0226 16:38:19.028018 4907 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-2kml2_aedab463-da2b-4bf1-a67d-16439f225983/frr-metrics/0.log" Feb 26 16:38:19 crc kubenswrapper[4907]: I0226 16:38:19.089069 4907 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-2kml2_aedab463-da2b-4bf1-a67d-16439f225983/kube-rbac-proxy/0.log" Feb 26 16:38:19 crc kubenswrapper[4907]: I0226 16:38:19.169905 4907 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-2kml2_aedab463-da2b-4bf1-a67d-16439f225983/kube-rbac-proxy-frr/0.log" Feb 26 16:38:19 crc kubenswrapper[4907]: I0226 16:38:19.330250 4907 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-2kml2_aedab463-da2b-4bf1-a67d-16439f225983/reloader/0.log" Feb 26 16:38:19 crc kubenswrapper[4907]: I0226 16:38:19.427171 4907 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-webhook-server-7f989f654f-8kcg5_e3fa6e66-60dc-44b8-a6a6-47a7ec18424f/frr-k8s-webhook-server/0.log" Feb 26 16:38:19 crc kubenswrapper[4907]: I0226 16:38:19.712907 4907 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_metallb-operator-controller-manager-569bd5c9fd-jkl8q_93c2e5d2-ce7d-47db-a76b-85f1988e1864/manager/0.log" Feb 26 16:38:19 crc kubenswrapper[4907]: I0226 16:38:19.913164 4907 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_metallb-operator-webhook-server-5448c47665-smhgw_04570909-662d-4a9e-9f62-fbca4b92bfa7/webhook-server/0.log" Feb 26 16:38:20 crc kubenswrapper[4907]: I0226 16:38:20.061071 4907 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_speaker-7hcct_b4841c1c-c56d-4abe-b6a7-92211b5c4a19/kube-rbac-proxy/0.log" Feb 26 16:38:20 crc kubenswrapper[4907]: I0226 16:38:20.327330 4907 
log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-2kml2_aedab463-da2b-4bf1-a67d-16439f225983/frr/0.log" Feb 26 16:38:20 crc kubenswrapper[4907]: I0226 16:38:20.546082 4907 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_speaker-7hcct_b4841c1c-c56d-4abe-b6a7-92211b5c4a19/speaker/0.log" Feb 26 16:38:35 crc kubenswrapper[4907]: I0226 16:38:35.517698 4907 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_0e94e7566f739476ccec6d16e58de3f1c434cfa3060893f90f3e473a828zxwh_d9bc1ab0-f219-4ba0-adc8-07a7167bbaa0/util/0.log" Feb 26 16:38:35 crc kubenswrapper[4907]: I0226 16:38:35.517822 4907 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_0e94e7566f739476ccec6d16e58de3f1c434cfa3060893f90f3e473a828zxwh_d9bc1ab0-f219-4ba0-adc8-07a7167bbaa0/util/0.log" Feb 26 16:38:35 crc kubenswrapper[4907]: I0226 16:38:35.519512 4907 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_0e94e7566f739476ccec6d16e58de3f1c434cfa3060893f90f3e473a828zxwh_d9bc1ab0-f219-4ba0-adc8-07a7167bbaa0/pull/0.log" Feb 26 16:38:35 crc kubenswrapper[4907]: I0226 16:38:35.519764 4907 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_0e94e7566f739476ccec6d16e58de3f1c434cfa3060893f90f3e473a828zxwh_d9bc1ab0-f219-4ba0-adc8-07a7167bbaa0/pull/0.log" Feb 26 16:38:35 crc kubenswrapper[4907]: I0226 16:38:35.765670 4907 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_0e94e7566f739476ccec6d16e58de3f1c434cfa3060893f90f3e473a828zxwh_d9bc1ab0-f219-4ba0-adc8-07a7167bbaa0/util/0.log" Feb 26 16:38:35 crc kubenswrapper[4907]: I0226 16:38:35.791368 4907 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_0e94e7566f739476ccec6d16e58de3f1c434cfa3060893f90f3e473a828zxwh_d9bc1ab0-f219-4ba0-adc8-07a7167bbaa0/pull/0.log" Feb 26 16:38:35 crc kubenswrapper[4907]: I0226 16:38:35.862242 4907 log.go:25] 
"Finished parsing log file" path="/var/log/pods/openshift-marketplace_0e94e7566f739476ccec6d16e58de3f1c434cfa3060893f90f3e473a828zxwh_d9bc1ab0-f219-4ba0-adc8-07a7167bbaa0/extract/0.log" Feb 26 16:38:35 crc kubenswrapper[4907]: I0226 16:38:35.987785 4907 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-xttzz_df2526da-5738-4040-afe3-6019b50203ae/extract-utilities/0.log" Feb 26 16:38:36 crc kubenswrapper[4907]: I0226 16:38:36.185878 4907 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-xttzz_df2526da-5738-4040-afe3-6019b50203ae/extract-utilities/0.log" Feb 26 16:38:36 crc kubenswrapper[4907]: I0226 16:38:36.281146 4907 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-xttzz_df2526da-5738-4040-afe3-6019b50203ae/extract-content/0.log" Feb 26 16:38:36 crc kubenswrapper[4907]: I0226 16:38:36.295480 4907 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-xttzz_df2526da-5738-4040-afe3-6019b50203ae/extract-content/0.log" Feb 26 16:38:36 crc kubenswrapper[4907]: I0226 16:38:36.444283 4907 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-xttzz_df2526da-5738-4040-afe3-6019b50203ae/extract-content/0.log" Feb 26 16:38:36 crc kubenswrapper[4907]: I0226 16:38:36.450498 4907 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-xttzz_df2526da-5738-4040-afe3-6019b50203ae/extract-utilities/0.log" Feb 26 16:38:36 crc kubenswrapper[4907]: I0226 16:38:36.816805 4907 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-xttzz_df2526da-5738-4040-afe3-6019b50203ae/registry-server/0.log" Feb 26 16:38:36 crc kubenswrapper[4907]: I0226 16:38:36.838467 4907 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-marketplace_community-operators-xbpfd_12fc0143-8c96-4837-99ce-f5b7e447f10b/extract-utilities/0.log" Feb 26 16:38:37 crc kubenswrapper[4907]: I0226 16:38:37.098674 4907 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-xbpfd_12fc0143-8c96-4837-99ce-f5b7e447f10b/extract-content/0.log" Feb 26 16:38:37 crc kubenswrapper[4907]: I0226 16:38:37.109047 4907 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-xbpfd_12fc0143-8c96-4837-99ce-f5b7e447f10b/extract-utilities/0.log" Feb 26 16:38:37 crc kubenswrapper[4907]: I0226 16:38:37.128882 4907 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-xbpfd_12fc0143-8c96-4837-99ce-f5b7e447f10b/extract-content/0.log" Feb 26 16:38:37 crc kubenswrapper[4907]: I0226 16:38:37.592767 4907 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-xbpfd_12fc0143-8c96-4837-99ce-f5b7e447f10b/extract-utilities/0.log" Feb 26 16:38:37 crc kubenswrapper[4907]: I0226 16:38:37.656243 4907 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-xbpfd_12fc0143-8c96-4837-99ce-f5b7e447f10b/extract-content/0.log" Feb 26 16:38:37 crc kubenswrapper[4907]: I0226 16:38:37.773824 4907 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-xbpfd_12fc0143-8c96-4837-99ce-f5b7e447f10b/registry-server/0.log" Feb 26 16:38:37 crc kubenswrapper[4907]: I0226 16:38:37.831625 4907 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_d146760600e43041070ad4572d9c23f31a62e3aefc01a54998863bc5f4ss77t_dea9effb-0863-442e-85b0-ac5bade13bdb/util/0.log" Feb 26 16:38:38 crc kubenswrapper[4907]: I0226 16:38:38.090507 4907 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-marketplace_d146760600e43041070ad4572d9c23f31a62e3aefc01a54998863bc5f4ss77t_dea9effb-0863-442e-85b0-ac5bade13bdb/pull/0.log" Feb 26 16:38:38 crc kubenswrapper[4907]: I0226 16:38:38.090835 4907 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_d146760600e43041070ad4572d9c23f31a62e3aefc01a54998863bc5f4ss77t_dea9effb-0863-442e-85b0-ac5bade13bdb/pull/0.log" Feb 26 16:38:38 crc kubenswrapper[4907]: I0226 16:38:38.144021 4907 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_d146760600e43041070ad4572d9c23f31a62e3aefc01a54998863bc5f4ss77t_dea9effb-0863-442e-85b0-ac5bade13bdb/util/0.log" Feb 26 16:38:38 crc kubenswrapper[4907]: I0226 16:38:38.360840 4907 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_d146760600e43041070ad4572d9c23f31a62e3aefc01a54998863bc5f4ss77t_dea9effb-0863-442e-85b0-ac5bade13bdb/pull/0.log" Feb 26 16:38:38 crc kubenswrapper[4907]: I0226 16:38:38.405379 4907 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_d146760600e43041070ad4572d9c23f31a62e3aefc01a54998863bc5f4ss77t_dea9effb-0863-442e-85b0-ac5bade13bdb/extract/0.log" Feb 26 16:38:38 crc kubenswrapper[4907]: I0226 16:38:38.464884 4907 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_d146760600e43041070ad4572d9c23f31a62e3aefc01a54998863bc5f4ss77t_dea9effb-0863-442e-85b0-ac5bade13bdb/util/0.log" Feb 26 16:38:38 crc kubenswrapper[4907]: I0226 16:38:38.564397 4907 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_marketplace-operator-79b997595-svjkc_77a34fa8-40ba-4944-bd27-03a9a4f7761f/marketplace-operator/0.log" Feb 26 16:38:38 crc kubenswrapper[4907]: I0226 16:38:38.664973 4907 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-vcgkv_012fe452-e0b5-4248-a110-8bf778e9595d/extract-utilities/0.log" Feb 26 16:38:38 crc kubenswrapper[4907]: 
I0226 16:38:38.872848 4907 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-vcgkv_012fe452-e0b5-4248-a110-8bf778e9595d/extract-content/0.log" Feb 26 16:38:38 crc kubenswrapper[4907]: I0226 16:38:38.879435 4907 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-vcgkv_012fe452-e0b5-4248-a110-8bf778e9595d/extract-utilities/0.log" Feb 26 16:38:38 crc kubenswrapper[4907]: I0226 16:38:38.916385 4907 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-vcgkv_012fe452-e0b5-4248-a110-8bf778e9595d/extract-content/0.log" Feb 26 16:38:39 crc kubenswrapper[4907]: I0226 16:38:39.154910 4907 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-vcgkv_012fe452-e0b5-4248-a110-8bf778e9595d/extract-content/0.log" Feb 26 16:38:39 crc kubenswrapper[4907]: I0226 16:38:39.194243 4907 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-vcgkv_012fe452-e0b5-4248-a110-8bf778e9595d/extract-utilities/0.log" Feb 26 16:38:39 crc kubenswrapper[4907]: I0226 16:38:39.302695 4907 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-vcgkv_012fe452-e0b5-4248-a110-8bf778e9595d/registry-server/0.log" Feb 26 16:38:39 crc kubenswrapper[4907]: I0226 16:38:39.400892 4907 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-mr9bt_01b123c0-d91d-4bed-8fd2-7931cbca4acb/extract-utilities/0.log" Feb 26 16:38:39 crc kubenswrapper[4907]: I0226 16:38:39.623995 4907 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-mr9bt_01b123c0-d91d-4bed-8fd2-7931cbca4acb/extract-content/0.log" Feb 26 16:38:39 crc kubenswrapper[4907]: I0226 16:38:39.679807 4907 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-marketplace_redhat-operators-mr9bt_01b123c0-d91d-4bed-8fd2-7931cbca4acb/extract-utilities/0.log" Feb 26 16:38:39 crc kubenswrapper[4907]: I0226 16:38:39.692135 4907 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-mr9bt_01b123c0-d91d-4bed-8fd2-7931cbca4acb/extract-content/0.log" Feb 26 16:38:39 crc kubenswrapper[4907]: I0226 16:38:39.870615 4907 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-mr9bt_01b123c0-d91d-4bed-8fd2-7931cbca4acb/extract-content/0.log" Feb 26 16:38:39 crc kubenswrapper[4907]: I0226 16:38:39.899168 4907 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-mr9bt_01b123c0-d91d-4bed-8fd2-7931cbca4acb/extract-utilities/0.log" Feb 26 16:38:40 crc kubenswrapper[4907]: I0226 16:38:40.249314 4907 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-mr9bt_01b123c0-d91d-4bed-8fd2-7931cbca4acb/registry-server/0.log" Feb 26 16:38:41 crc kubenswrapper[4907]: I0226 16:38:41.468345 4907 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-4dw5g"] Feb 26 16:38:41 crc kubenswrapper[4907]: E0226 16:38:41.468820 4907 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ebfcda1d-29af-401c-89dc-e06545c4f276" containerName="oc" Feb 26 16:38:41 crc kubenswrapper[4907]: I0226 16:38:41.468838 4907 state_mem.go:107] "Deleted CPUSet assignment" podUID="ebfcda1d-29af-401c-89dc-e06545c4f276" containerName="oc" Feb 26 16:38:41 crc kubenswrapper[4907]: I0226 16:38:41.469058 4907 memory_manager.go:354] "RemoveStaleState removing state" podUID="ebfcda1d-29af-401c-89dc-e06545c4f276" containerName="oc" Feb 26 16:38:41 crc kubenswrapper[4907]: I0226 16:38:41.473490 4907 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-4dw5g" Feb 26 16:38:41 crc kubenswrapper[4907]: I0226 16:38:41.485530 4907 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-4dw5g"] Feb 26 16:38:41 crc kubenswrapper[4907]: I0226 16:38:41.594016 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-sfvh6\" (UniqueName: \"kubernetes.io/projected/6637eead-44bc-4b70-9208-c1b662c778d0-kube-api-access-sfvh6\") pod \"community-operators-4dw5g\" (UID: \"6637eead-44bc-4b70-9208-c1b662c778d0\") " pod="openshift-marketplace/community-operators-4dw5g" Feb 26 16:38:41 crc kubenswrapper[4907]: I0226 16:38:41.594351 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/6637eead-44bc-4b70-9208-c1b662c778d0-catalog-content\") pod \"community-operators-4dw5g\" (UID: \"6637eead-44bc-4b70-9208-c1b662c778d0\") " pod="openshift-marketplace/community-operators-4dw5g" Feb 26 16:38:41 crc kubenswrapper[4907]: I0226 16:38:41.594699 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/6637eead-44bc-4b70-9208-c1b662c778d0-utilities\") pod \"community-operators-4dw5g\" (UID: \"6637eead-44bc-4b70-9208-c1b662c778d0\") " pod="openshift-marketplace/community-operators-4dw5g" Feb 26 16:38:41 crc kubenswrapper[4907]: I0226 16:38:41.696515 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-sfvh6\" (UniqueName: \"kubernetes.io/projected/6637eead-44bc-4b70-9208-c1b662c778d0-kube-api-access-sfvh6\") pod \"community-operators-4dw5g\" (UID: \"6637eead-44bc-4b70-9208-c1b662c778d0\") " pod="openshift-marketplace/community-operators-4dw5g" Feb 26 16:38:41 crc kubenswrapper[4907]: I0226 16:38:41.696672 4907 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/6637eead-44bc-4b70-9208-c1b662c778d0-catalog-content\") pod \"community-operators-4dw5g\" (UID: \"6637eead-44bc-4b70-9208-c1b662c778d0\") " pod="openshift-marketplace/community-operators-4dw5g" Feb 26 16:38:41 crc kubenswrapper[4907]: I0226 16:38:41.696775 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/6637eead-44bc-4b70-9208-c1b662c778d0-utilities\") pod \"community-operators-4dw5g\" (UID: \"6637eead-44bc-4b70-9208-c1b662c778d0\") " pod="openshift-marketplace/community-operators-4dw5g" Feb 26 16:38:41 crc kubenswrapper[4907]: I0226 16:38:41.697275 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/6637eead-44bc-4b70-9208-c1b662c778d0-catalog-content\") pod \"community-operators-4dw5g\" (UID: \"6637eead-44bc-4b70-9208-c1b662c778d0\") " pod="openshift-marketplace/community-operators-4dw5g" Feb 26 16:38:41 crc kubenswrapper[4907]: I0226 16:38:41.697348 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/6637eead-44bc-4b70-9208-c1b662c778d0-utilities\") pod \"community-operators-4dw5g\" (UID: \"6637eead-44bc-4b70-9208-c1b662c778d0\") " pod="openshift-marketplace/community-operators-4dw5g" Feb 26 16:38:41 crc kubenswrapper[4907]: I0226 16:38:41.725933 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-sfvh6\" (UniqueName: \"kubernetes.io/projected/6637eead-44bc-4b70-9208-c1b662c778d0-kube-api-access-sfvh6\") pod \"community-operators-4dw5g\" (UID: \"6637eead-44bc-4b70-9208-c1b662c778d0\") " pod="openshift-marketplace/community-operators-4dw5g" Feb 26 16:38:41 crc kubenswrapper[4907]: I0226 16:38:41.838264 4907 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-4dw5g" Feb 26 16:38:42 crc kubenswrapper[4907]: I0226 16:38:42.235671 4907 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-4dw5g"] Feb 26 16:38:42 crc kubenswrapper[4907]: I0226 16:38:42.780302 4907 generic.go:334] "Generic (PLEG): container finished" podID="6637eead-44bc-4b70-9208-c1b662c778d0" containerID="7a2ffeb79089ad8a57dd061996b64ce472dfb32cfd3ae3561cfa07fbcf9052d6" exitCode=0 Feb 26 16:38:42 crc kubenswrapper[4907]: I0226 16:38:42.781410 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-4dw5g" event={"ID":"6637eead-44bc-4b70-9208-c1b662c778d0","Type":"ContainerDied","Data":"7a2ffeb79089ad8a57dd061996b64ce472dfb32cfd3ae3561cfa07fbcf9052d6"} Feb 26 16:38:42 crc kubenswrapper[4907]: I0226 16:38:42.781507 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-4dw5g" event={"ID":"6637eead-44bc-4b70-9208-c1b662c778d0","Type":"ContainerStarted","Data":"082ef587ca034f28fc8a5571da391af5c9bc384c0f5dd3bd049ccfc2bf3c3e75"} Feb 26 16:38:43 crc kubenswrapper[4907]: I0226 16:38:43.790847 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-4dw5g" event={"ID":"6637eead-44bc-4b70-9208-c1b662c778d0","Type":"ContainerStarted","Data":"00e19fb20cda7a87fa4dd118805a97fa44887c0fd2de746cbfa6677a1d8b47e9"} Feb 26 16:38:45 crc kubenswrapper[4907]: I0226 16:38:45.812799 4907 generic.go:334] "Generic (PLEG): container finished" podID="6637eead-44bc-4b70-9208-c1b662c778d0" containerID="00e19fb20cda7a87fa4dd118805a97fa44887c0fd2de746cbfa6677a1d8b47e9" exitCode=0 Feb 26 16:38:45 crc kubenswrapper[4907]: I0226 16:38:45.812907 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-4dw5g" 
event={"ID":"6637eead-44bc-4b70-9208-c1b662c778d0","Type":"ContainerDied","Data":"00e19fb20cda7a87fa4dd118805a97fa44887c0fd2de746cbfa6677a1d8b47e9"} Feb 26 16:38:46 crc kubenswrapper[4907]: I0226 16:38:46.821913 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-4dw5g" event={"ID":"6637eead-44bc-4b70-9208-c1b662c778d0","Type":"ContainerStarted","Data":"dfe59779dac361db583e1e89da2b3aea0e257fa479be28ab5050979e1ac8be6f"} Feb 26 16:38:46 crc kubenswrapper[4907]: I0226 16:38:46.849164 4907 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-4dw5g" podStartSLOduration=2.397298647 podStartE2EDuration="5.849145251s" podCreationTimestamp="2026-02-26 16:38:41 +0000 UTC" firstStartedPulling="2026-02-26 16:38:42.783078917 +0000 UTC m=+3385.301640786" lastFinishedPulling="2026-02-26 16:38:46.234925541 +0000 UTC m=+3388.753487390" observedRunningTime="2026-02-26 16:38:46.841683409 +0000 UTC m=+3389.360245268" watchObservedRunningTime="2026-02-26 16:38:46.849145251 +0000 UTC m=+3389.367707100" Feb 26 16:38:51 crc kubenswrapper[4907]: I0226 16:38:51.838821 4907 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-4dw5g" Feb 26 16:38:51 crc kubenswrapper[4907]: I0226 16:38:51.840576 4907 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-4dw5g" Feb 26 16:38:51 crc kubenswrapper[4907]: I0226 16:38:51.911630 4907 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-4dw5g" Feb 26 16:38:51 crc kubenswrapper[4907]: I0226 16:38:51.969664 4907 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-4dw5g" Feb 26 16:38:52 crc kubenswrapper[4907]: I0226 16:38:52.857386 4907 kubelet.go:2437] "SyncLoop DELETE" source="api" 
pods=["openshift-marketplace/community-operators-4dw5g"] Feb 26 16:38:53 crc kubenswrapper[4907]: I0226 16:38:53.879913 4907 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-4dw5g" podUID="6637eead-44bc-4b70-9208-c1b662c778d0" containerName="registry-server" containerID="cri-o://dfe59779dac361db583e1e89da2b3aea0e257fa479be28ab5050979e1ac8be6f" gracePeriod=2 Feb 26 16:38:54 crc kubenswrapper[4907]: I0226 16:38:54.354721 4907 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-4dw5g" Feb 26 16:38:54 crc kubenswrapper[4907]: I0226 16:38:54.359066 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/6637eead-44bc-4b70-9208-c1b662c778d0-catalog-content\") pod \"6637eead-44bc-4b70-9208-c1b662c778d0\" (UID: \"6637eead-44bc-4b70-9208-c1b662c778d0\") " Feb 26 16:38:54 crc kubenswrapper[4907]: I0226 16:38:54.359146 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/6637eead-44bc-4b70-9208-c1b662c778d0-utilities\") pod \"6637eead-44bc-4b70-9208-c1b662c778d0\" (UID: \"6637eead-44bc-4b70-9208-c1b662c778d0\") " Feb 26 16:38:54 crc kubenswrapper[4907]: I0226 16:38:54.359206 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-sfvh6\" (UniqueName: \"kubernetes.io/projected/6637eead-44bc-4b70-9208-c1b662c778d0-kube-api-access-sfvh6\") pod \"6637eead-44bc-4b70-9208-c1b662c778d0\" (UID: \"6637eead-44bc-4b70-9208-c1b662c778d0\") " Feb 26 16:38:54 crc kubenswrapper[4907]: I0226 16:38:54.360243 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/6637eead-44bc-4b70-9208-c1b662c778d0-utilities" (OuterVolumeSpecName: "utilities") pod "6637eead-44bc-4b70-9208-c1b662c778d0" (UID: 
"6637eead-44bc-4b70-9208-c1b662c778d0"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 26 16:38:54 crc kubenswrapper[4907]: I0226 16:38:54.369806 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6637eead-44bc-4b70-9208-c1b662c778d0-kube-api-access-sfvh6" (OuterVolumeSpecName: "kube-api-access-sfvh6") pod "6637eead-44bc-4b70-9208-c1b662c778d0" (UID: "6637eead-44bc-4b70-9208-c1b662c778d0"). InnerVolumeSpecName "kube-api-access-sfvh6". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 26 16:38:54 crc kubenswrapper[4907]: I0226 16:38:54.427206 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/6637eead-44bc-4b70-9208-c1b662c778d0-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "6637eead-44bc-4b70-9208-c1b662c778d0" (UID: "6637eead-44bc-4b70-9208-c1b662c778d0"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 26 16:38:54 crc kubenswrapper[4907]: I0226 16:38:54.461921 4907 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/6637eead-44bc-4b70-9208-c1b662c778d0-catalog-content\") on node \"crc\" DevicePath \"\"" Feb 26 16:38:54 crc kubenswrapper[4907]: I0226 16:38:54.461955 4907 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/6637eead-44bc-4b70-9208-c1b662c778d0-utilities\") on node \"crc\" DevicePath \"\"" Feb 26 16:38:54 crc kubenswrapper[4907]: I0226 16:38:54.461967 4907 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-sfvh6\" (UniqueName: \"kubernetes.io/projected/6637eead-44bc-4b70-9208-c1b662c778d0-kube-api-access-sfvh6\") on node \"crc\" DevicePath \"\"" Feb 26 16:38:54 crc kubenswrapper[4907]: I0226 16:38:54.889060 4907 generic.go:334] "Generic (PLEG): container finished" 
podID="6637eead-44bc-4b70-9208-c1b662c778d0" containerID="dfe59779dac361db583e1e89da2b3aea0e257fa479be28ab5050979e1ac8be6f" exitCode=0 Feb 26 16:38:54 crc kubenswrapper[4907]: I0226 16:38:54.889098 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-4dw5g" event={"ID":"6637eead-44bc-4b70-9208-c1b662c778d0","Type":"ContainerDied","Data":"dfe59779dac361db583e1e89da2b3aea0e257fa479be28ab5050979e1ac8be6f"} Feb 26 16:38:54 crc kubenswrapper[4907]: I0226 16:38:54.889122 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-4dw5g" event={"ID":"6637eead-44bc-4b70-9208-c1b662c778d0","Type":"ContainerDied","Data":"082ef587ca034f28fc8a5571da391af5c9bc384c0f5dd3bd049ccfc2bf3c3e75"} Feb 26 16:38:54 crc kubenswrapper[4907]: I0226 16:38:54.889139 4907 scope.go:117] "RemoveContainer" containerID="dfe59779dac361db583e1e89da2b3aea0e257fa479be28ab5050979e1ac8be6f" Feb 26 16:38:54 crc kubenswrapper[4907]: I0226 16:38:54.889255 4907 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-4dw5g" Feb 26 16:38:54 crc kubenswrapper[4907]: I0226 16:38:54.928248 4907 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-4dw5g"] Feb 26 16:38:54 crc kubenswrapper[4907]: I0226 16:38:54.930978 4907 scope.go:117] "RemoveContainer" containerID="00e19fb20cda7a87fa4dd118805a97fa44887c0fd2de746cbfa6677a1d8b47e9" Feb 26 16:38:54 crc kubenswrapper[4907]: I0226 16:38:54.943935 4907 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-4dw5g"] Feb 26 16:38:54 crc kubenswrapper[4907]: I0226 16:38:54.958015 4907 scope.go:117] "RemoveContainer" containerID="7a2ffeb79089ad8a57dd061996b64ce472dfb32cfd3ae3561cfa07fbcf9052d6" Feb 26 16:38:55 crc kubenswrapper[4907]: I0226 16:38:55.008394 4907 scope.go:117] "RemoveContainer" containerID="dfe59779dac361db583e1e89da2b3aea0e257fa479be28ab5050979e1ac8be6f" Feb 26 16:38:55 crc kubenswrapper[4907]: E0226 16:38:55.008978 4907 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"dfe59779dac361db583e1e89da2b3aea0e257fa479be28ab5050979e1ac8be6f\": container with ID starting with dfe59779dac361db583e1e89da2b3aea0e257fa479be28ab5050979e1ac8be6f not found: ID does not exist" containerID="dfe59779dac361db583e1e89da2b3aea0e257fa479be28ab5050979e1ac8be6f" Feb 26 16:38:55 crc kubenswrapper[4907]: I0226 16:38:55.009034 4907 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"dfe59779dac361db583e1e89da2b3aea0e257fa479be28ab5050979e1ac8be6f"} err="failed to get container status \"dfe59779dac361db583e1e89da2b3aea0e257fa479be28ab5050979e1ac8be6f\": rpc error: code = NotFound desc = could not find container \"dfe59779dac361db583e1e89da2b3aea0e257fa479be28ab5050979e1ac8be6f\": container with ID starting with dfe59779dac361db583e1e89da2b3aea0e257fa479be28ab5050979e1ac8be6f not 
found: ID does not exist" Feb 26 16:38:55 crc kubenswrapper[4907]: I0226 16:38:55.009061 4907 scope.go:117] "RemoveContainer" containerID="00e19fb20cda7a87fa4dd118805a97fa44887c0fd2de746cbfa6677a1d8b47e9" Feb 26 16:38:55 crc kubenswrapper[4907]: E0226 16:38:55.009457 4907 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"00e19fb20cda7a87fa4dd118805a97fa44887c0fd2de746cbfa6677a1d8b47e9\": container with ID starting with 00e19fb20cda7a87fa4dd118805a97fa44887c0fd2de746cbfa6677a1d8b47e9 not found: ID does not exist" containerID="00e19fb20cda7a87fa4dd118805a97fa44887c0fd2de746cbfa6677a1d8b47e9" Feb 26 16:38:55 crc kubenswrapper[4907]: I0226 16:38:55.009478 4907 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"00e19fb20cda7a87fa4dd118805a97fa44887c0fd2de746cbfa6677a1d8b47e9"} err="failed to get container status \"00e19fb20cda7a87fa4dd118805a97fa44887c0fd2de746cbfa6677a1d8b47e9\": rpc error: code = NotFound desc = could not find container \"00e19fb20cda7a87fa4dd118805a97fa44887c0fd2de746cbfa6677a1d8b47e9\": container with ID starting with 00e19fb20cda7a87fa4dd118805a97fa44887c0fd2de746cbfa6677a1d8b47e9 not found: ID does not exist" Feb 26 16:38:55 crc kubenswrapper[4907]: I0226 16:38:55.009489 4907 scope.go:117] "RemoveContainer" containerID="7a2ffeb79089ad8a57dd061996b64ce472dfb32cfd3ae3561cfa07fbcf9052d6" Feb 26 16:38:55 crc kubenswrapper[4907]: E0226 16:38:55.009803 4907 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"7a2ffeb79089ad8a57dd061996b64ce472dfb32cfd3ae3561cfa07fbcf9052d6\": container with ID starting with 7a2ffeb79089ad8a57dd061996b64ce472dfb32cfd3ae3561cfa07fbcf9052d6 not found: ID does not exist" containerID="7a2ffeb79089ad8a57dd061996b64ce472dfb32cfd3ae3561cfa07fbcf9052d6" Feb 26 16:38:55 crc kubenswrapper[4907]: I0226 16:38:55.009824 4907 
pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7a2ffeb79089ad8a57dd061996b64ce472dfb32cfd3ae3561cfa07fbcf9052d6"} err="failed to get container status \"7a2ffeb79089ad8a57dd061996b64ce472dfb32cfd3ae3561cfa07fbcf9052d6\": rpc error: code = NotFound desc = could not find container \"7a2ffeb79089ad8a57dd061996b64ce472dfb32cfd3ae3561cfa07fbcf9052d6\": container with ID starting with 7a2ffeb79089ad8a57dd061996b64ce472dfb32cfd3ae3561cfa07fbcf9052d6 not found: ID does not exist" Feb 26 16:38:56 crc kubenswrapper[4907]: I0226 16:38:56.139527 4907 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6637eead-44bc-4b70-9208-c1b662c778d0" path="/var/lib/kubelet/pods/6637eead-44bc-4b70-9208-c1b662c778d0/volumes" Feb 26 16:39:18 crc kubenswrapper[4907]: I0226 16:39:18.530238 4907 patch_prober.go:28] interesting pod/machine-config-daemon-v5ng6 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 26 16:39:18 crc kubenswrapper[4907]: I0226 16:39:18.531937 4907 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-v5ng6" podUID="917eebf3-db36-47b8-af0a-b80d042fddab" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 26 16:39:48 crc kubenswrapper[4907]: I0226 16:39:48.530915 4907 patch_prober.go:28] interesting pod/machine-config-daemon-v5ng6 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 26 16:39:48 crc kubenswrapper[4907]: I0226 16:39:48.531669 4907 prober.go:107] "Probe failed" probeType="Liveness" 
pod="openshift-machine-config-operator/machine-config-daemon-v5ng6" podUID="917eebf3-db36-47b8-af0a-b80d042fddab" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 26 16:40:00 crc kubenswrapper[4907]: I0226 16:40:00.154962 4907 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29535400-xkwf5"] Feb 26 16:40:00 crc kubenswrapper[4907]: E0226 16:40:00.157299 4907 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6637eead-44bc-4b70-9208-c1b662c778d0" containerName="registry-server" Feb 26 16:40:00 crc kubenswrapper[4907]: I0226 16:40:00.157412 4907 state_mem.go:107] "Deleted CPUSet assignment" podUID="6637eead-44bc-4b70-9208-c1b662c778d0" containerName="registry-server" Feb 26 16:40:00 crc kubenswrapper[4907]: E0226 16:40:00.157497 4907 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6637eead-44bc-4b70-9208-c1b662c778d0" containerName="extract-content" Feb 26 16:40:00 crc kubenswrapper[4907]: I0226 16:40:00.157572 4907 state_mem.go:107] "Deleted CPUSet assignment" podUID="6637eead-44bc-4b70-9208-c1b662c778d0" containerName="extract-content" Feb 26 16:40:00 crc kubenswrapper[4907]: E0226 16:40:00.157786 4907 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6637eead-44bc-4b70-9208-c1b662c778d0" containerName="extract-utilities" Feb 26 16:40:00 crc kubenswrapper[4907]: I0226 16:40:00.157872 4907 state_mem.go:107] "Deleted CPUSet assignment" podUID="6637eead-44bc-4b70-9208-c1b662c778d0" containerName="extract-utilities" Feb 26 16:40:00 crc kubenswrapper[4907]: I0226 16:40:00.158200 4907 memory_manager.go:354] "RemoveStaleState removing state" podUID="6637eead-44bc-4b70-9208-c1b662c778d0" containerName="registry-server" Feb 26 16:40:00 crc kubenswrapper[4907]: I0226 16:40:00.159107 4907 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29535400-xkwf5" Feb 26 16:40:00 crc kubenswrapper[4907]: I0226 16:40:00.162606 4907 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Feb 26 16:40:00 crc kubenswrapper[4907]: I0226 16:40:00.167071 4907 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29535400-xkwf5"] Feb 26 16:40:00 crc kubenswrapper[4907]: I0226 16:40:00.170965 4907 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-n2mrp" Feb 26 16:40:00 crc kubenswrapper[4907]: I0226 16:40:00.171371 4907 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Feb 26 16:40:00 crc kubenswrapper[4907]: I0226 16:40:00.251537 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gp55z\" (UniqueName: \"kubernetes.io/projected/e1e8d410-0df6-4bfb-a852-64cd2431be07-kube-api-access-gp55z\") pod \"auto-csr-approver-29535400-xkwf5\" (UID: \"e1e8d410-0df6-4bfb-a852-64cd2431be07\") " pod="openshift-infra/auto-csr-approver-29535400-xkwf5" Feb 26 16:40:00 crc kubenswrapper[4907]: I0226 16:40:00.353772 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gp55z\" (UniqueName: \"kubernetes.io/projected/e1e8d410-0df6-4bfb-a852-64cd2431be07-kube-api-access-gp55z\") pod \"auto-csr-approver-29535400-xkwf5\" (UID: \"e1e8d410-0df6-4bfb-a852-64cd2431be07\") " pod="openshift-infra/auto-csr-approver-29535400-xkwf5" Feb 26 16:40:00 crc kubenswrapper[4907]: I0226 16:40:00.373734 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gp55z\" (UniqueName: \"kubernetes.io/projected/e1e8d410-0df6-4bfb-a852-64cd2431be07-kube-api-access-gp55z\") pod \"auto-csr-approver-29535400-xkwf5\" (UID: \"e1e8d410-0df6-4bfb-a852-64cd2431be07\") " 
pod="openshift-infra/auto-csr-approver-29535400-xkwf5" Feb 26 16:40:00 crc kubenswrapper[4907]: I0226 16:40:00.488671 4907 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29535400-xkwf5" Feb 26 16:40:01 crc kubenswrapper[4907]: I0226 16:40:01.071305 4907 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29535400-xkwf5"] Feb 26 16:40:01 crc kubenswrapper[4907]: I0226 16:40:01.467701 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29535400-xkwf5" event={"ID":"e1e8d410-0df6-4bfb-a852-64cd2431be07","Type":"ContainerStarted","Data":"3358c6e188caa90a88f7323e06c180a936b848b46d2540a6bd0efa820cf8f4f9"} Feb 26 16:40:03 crc kubenswrapper[4907]: I0226 16:40:03.493488 4907 generic.go:334] "Generic (PLEG): container finished" podID="e1e8d410-0df6-4bfb-a852-64cd2431be07" containerID="a1a9d30bf2c7d2a902e4d241e1d8fa762c857fa008375bdb265b8ff2427be989" exitCode=0 Feb 26 16:40:03 crc kubenswrapper[4907]: I0226 16:40:03.493949 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29535400-xkwf5" event={"ID":"e1e8d410-0df6-4bfb-a852-64cd2431be07","Type":"ContainerDied","Data":"a1a9d30bf2c7d2a902e4d241e1d8fa762c857fa008375bdb265b8ff2427be989"} Feb 26 16:40:04 crc kubenswrapper[4907]: I0226 16:40:04.861173 4907 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29535400-xkwf5" Feb 26 16:40:04 crc kubenswrapper[4907]: I0226 16:40:04.962081 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-gp55z\" (UniqueName: \"kubernetes.io/projected/e1e8d410-0df6-4bfb-a852-64cd2431be07-kube-api-access-gp55z\") pod \"e1e8d410-0df6-4bfb-a852-64cd2431be07\" (UID: \"e1e8d410-0df6-4bfb-a852-64cd2431be07\") " Feb 26 16:40:04 crc kubenswrapper[4907]: I0226 16:40:04.971757 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e1e8d410-0df6-4bfb-a852-64cd2431be07-kube-api-access-gp55z" (OuterVolumeSpecName: "kube-api-access-gp55z") pod "e1e8d410-0df6-4bfb-a852-64cd2431be07" (UID: "e1e8d410-0df6-4bfb-a852-64cd2431be07"). InnerVolumeSpecName "kube-api-access-gp55z". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 26 16:40:05 crc kubenswrapper[4907]: I0226 16:40:05.064543 4907 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-gp55z\" (UniqueName: \"kubernetes.io/projected/e1e8d410-0df6-4bfb-a852-64cd2431be07-kube-api-access-gp55z\") on node \"crc\" DevicePath \"\"" Feb 26 16:40:05 crc kubenswrapper[4907]: I0226 16:40:05.519662 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29535400-xkwf5" event={"ID":"e1e8d410-0df6-4bfb-a852-64cd2431be07","Type":"ContainerDied","Data":"3358c6e188caa90a88f7323e06c180a936b848b46d2540a6bd0efa820cf8f4f9"} Feb 26 16:40:05 crc kubenswrapper[4907]: I0226 16:40:05.519717 4907 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="3358c6e188caa90a88f7323e06c180a936b848b46d2540a6bd0efa820cf8f4f9" Feb 26 16:40:05 crc kubenswrapper[4907]: I0226 16:40:05.519796 4907 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29535400-xkwf5" Feb 26 16:40:05 crc kubenswrapper[4907]: I0226 16:40:05.947432 4907 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29535394-bp54j"] Feb 26 16:40:05 crc kubenswrapper[4907]: I0226 16:40:05.958121 4907 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29535394-bp54j"] Feb 26 16:40:06 crc kubenswrapper[4907]: I0226 16:40:06.169190 4907 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="bfe9cc80-ce34-4697-863f-3da3548bfc20" path="/var/lib/kubelet/pods/bfe9cc80-ce34-4697-863f-3da3548bfc20/volumes" Feb 26 16:40:11 crc kubenswrapper[4907]: I0226 16:40:11.970825 4907 scope.go:117] "RemoveContainer" containerID="4eae3e31eff6bee3afe11499d6c56af76772ddfd3e6af4282ed587f4d8e1a0e8" Feb 26 16:40:18 crc kubenswrapper[4907]: I0226 16:40:18.530674 4907 patch_prober.go:28] interesting pod/machine-config-daemon-v5ng6 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 26 16:40:18 crc kubenswrapper[4907]: I0226 16:40:18.531317 4907 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-v5ng6" podUID="917eebf3-db36-47b8-af0a-b80d042fddab" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 26 16:40:18 crc kubenswrapper[4907]: I0226 16:40:18.531386 4907 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-v5ng6" Feb 26 16:40:18 crc kubenswrapper[4907]: I0226 16:40:18.532167 4907 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" 
containerStatusID={"Type":"cri-o","ID":"d43bc521831c88457b494ef539cb7ec24221ab1999bc5d1f490d67f1fd00bc95"} pod="openshift-machine-config-operator/machine-config-daemon-v5ng6" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Feb 26 16:40:18 crc kubenswrapper[4907]: I0226 16:40:18.532213 4907 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-v5ng6" podUID="917eebf3-db36-47b8-af0a-b80d042fddab" containerName="machine-config-daemon" containerID="cri-o://d43bc521831c88457b494ef539cb7ec24221ab1999bc5d1f490d67f1fd00bc95" gracePeriod=600 Feb 26 16:40:19 crc kubenswrapper[4907]: I0226 16:40:19.650537 4907 generic.go:334] "Generic (PLEG): container finished" podID="917eebf3-db36-47b8-af0a-b80d042fddab" containerID="d43bc521831c88457b494ef539cb7ec24221ab1999bc5d1f490d67f1fd00bc95" exitCode=0 Feb 26 16:40:19 crc kubenswrapper[4907]: I0226 16:40:19.650629 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-v5ng6" event={"ID":"917eebf3-db36-47b8-af0a-b80d042fddab","Type":"ContainerDied","Data":"d43bc521831c88457b494ef539cb7ec24221ab1999bc5d1f490d67f1fd00bc95"} Feb 26 16:40:19 crc kubenswrapper[4907]: I0226 16:40:19.652084 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-v5ng6" event={"ID":"917eebf3-db36-47b8-af0a-b80d042fddab","Type":"ContainerStarted","Data":"c201b3fb3895b0bfc9cdda941aa1f3c52b6fbb96803a4d421f98d6a3ca715e3a"} Feb 26 16:40:19 crc kubenswrapper[4907]: I0226 16:40:19.652129 4907 scope.go:117] "RemoveContainer" containerID="6906cab653cd658cba31211ccc435500afa0d86f92cee413c3d24942f2acd8bd" Feb 26 16:40:51 crc kubenswrapper[4907]: I0226 16:40:51.501368 4907 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-gp899"] Feb 26 16:40:51 crc kubenswrapper[4907]: E0226 16:40:51.502485 
4907 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e1e8d410-0df6-4bfb-a852-64cd2431be07" containerName="oc" Feb 26 16:40:51 crc kubenswrapper[4907]: I0226 16:40:51.502504 4907 state_mem.go:107] "Deleted CPUSet assignment" podUID="e1e8d410-0df6-4bfb-a852-64cd2431be07" containerName="oc" Feb 26 16:40:51 crc kubenswrapper[4907]: I0226 16:40:51.502743 4907 memory_manager.go:354] "RemoveStaleState removing state" podUID="e1e8d410-0df6-4bfb-a852-64cd2431be07" containerName="oc" Feb 26 16:40:51 crc kubenswrapper[4907]: I0226 16:40:51.504343 4907 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-gp899" Feb 26 16:40:51 crc kubenswrapper[4907]: I0226 16:40:51.536314 4907 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-gp899"] Feb 26 16:40:51 crc kubenswrapper[4907]: I0226 16:40:51.656287 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9qt27\" (UniqueName: \"kubernetes.io/projected/168c564e-1b9f-4eac-85dd-92665393067b-kube-api-access-9qt27\") pod \"redhat-operators-gp899\" (UID: \"168c564e-1b9f-4eac-85dd-92665393067b\") " pod="openshift-marketplace/redhat-operators-gp899" Feb 26 16:40:51 crc kubenswrapper[4907]: I0226 16:40:51.656418 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/168c564e-1b9f-4eac-85dd-92665393067b-utilities\") pod \"redhat-operators-gp899\" (UID: \"168c564e-1b9f-4eac-85dd-92665393067b\") " pod="openshift-marketplace/redhat-operators-gp899" Feb 26 16:40:51 crc kubenswrapper[4907]: I0226 16:40:51.656555 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/168c564e-1b9f-4eac-85dd-92665393067b-catalog-content\") pod \"redhat-operators-gp899\" (UID: 
\"168c564e-1b9f-4eac-85dd-92665393067b\") " pod="openshift-marketplace/redhat-operators-gp899" Feb 26 16:40:51 crc kubenswrapper[4907]: I0226 16:40:51.758330 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9qt27\" (UniqueName: \"kubernetes.io/projected/168c564e-1b9f-4eac-85dd-92665393067b-kube-api-access-9qt27\") pod \"redhat-operators-gp899\" (UID: \"168c564e-1b9f-4eac-85dd-92665393067b\") " pod="openshift-marketplace/redhat-operators-gp899" Feb 26 16:40:51 crc kubenswrapper[4907]: I0226 16:40:51.758393 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/168c564e-1b9f-4eac-85dd-92665393067b-utilities\") pod \"redhat-operators-gp899\" (UID: \"168c564e-1b9f-4eac-85dd-92665393067b\") " pod="openshift-marketplace/redhat-operators-gp899" Feb 26 16:40:51 crc kubenswrapper[4907]: I0226 16:40:51.758454 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/168c564e-1b9f-4eac-85dd-92665393067b-catalog-content\") pod \"redhat-operators-gp899\" (UID: \"168c564e-1b9f-4eac-85dd-92665393067b\") " pod="openshift-marketplace/redhat-operators-gp899" Feb 26 16:40:51 crc kubenswrapper[4907]: I0226 16:40:51.758920 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/168c564e-1b9f-4eac-85dd-92665393067b-catalog-content\") pod \"redhat-operators-gp899\" (UID: \"168c564e-1b9f-4eac-85dd-92665393067b\") " pod="openshift-marketplace/redhat-operators-gp899" Feb 26 16:40:51 crc kubenswrapper[4907]: I0226 16:40:51.758970 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/168c564e-1b9f-4eac-85dd-92665393067b-utilities\") pod \"redhat-operators-gp899\" (UID: \"168c564e-1b9f-4eac-85dd-92665393067b\") " 
pod="openshift-marketplace/redhat-operators-gp899" Feb 26 16:40:51 crc kubenswrapper[4907]: I0226 16:40:51.788073 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9qt27\" (UniqueName: \"kubernetes.io/projected/168c564e-1b9f-4eac-85dd-92665393067b-kube-api-access-9qt27\") pod \"redhat-operators-gp899\" (UID: \"168c564e-1b9f-4eac-85dd-92665393067b\") " pod="openshift-marketplace/redhat-operators-gp899" Feb 26 16:40:51 crc kubenswrapper[4907]: I0226 16:40:51.838835 4907 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-gp899" Feb 26 16:40:52 crc kubenswrapper[4907]: I0226 16:40:52.008919 4907 generic.go:334] "Generic (PLEG): container finished" podID="14587e07-76d8-408e-af38-0069fdd00ccd" containerID="23026d644dbedb1b2cf8781844be69644ad112c6abd092271f8d9ccb92927b50" exitCode=0 Feb 26 16:40:52 crc kubenswrapper[4907]: I0226 16:40:52.008984 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-2htz6/must-gather-jzgwz" event={"ID":"14587e07-76d8-408e-af38-0069fdd00ccd","Type":"ContainerDied","Data":"23026d644dbedb1b2cf8781844be69644ad112c6abd092271f8d9ccb92927b50"} Feb 26 16:40:52 crc kubenswrapper[4907]: I0226 16:40:52.010252 4907 scope.go:117] "RemoveContainer" containerID="23026d644dbedb1b2cf8781844be69644ad112c6abd092271f8d9ccb92927b50" Feb 26 16:40:52 crc kubenswrapper[4907]: I0226 16:40:52.303899 4907 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-gp899"] Feb 26 16:40:52 crc kubenswrapper[4907]: W0226 16:40:52.318032 4907 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod168c564e_1b9f_4eac_85dd_92665393067b.slice/crio-c50099f71210a6c6f7ad7e42fdc1ae6abffaeba7829b04b0d94e001b5c8ad404 WatchSource:0}: Error finding container c50099f71210a6c6f7ad7e42fdc1ae6abffaeba7829b04b0d94e001b5c8ad404: Status 404 returned 
error can't find the container with id c50099f71210a6c6f7ad7e42fdc1ae6abffaeba7829b04b0d94e001b5c8ad404 Feb 26 16:40:52 crc kubenswrapper[4907]: I0226 16:40:52.701852 4907 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-must-gather-2htz6_must-gather-jzgwz_14587e07-76d8-408e-af38-0069fdd00ccd/gather/0.log" Feb 26 16:40:53 crc kubenswrapper[4907]: I0226 16:40:53.023828 4907 generic.go:334] "Generic (PLEG): container finished" podID="168c564e-1b9f-4eac-85dd-92665393067b" containerID="35be87cbbd391983c7f43d6f3b879fe3791a0479471587598e185e6674fbd929" exitCode=0 Feb 26 16:40:53 crc kubenswrapper[4907]: I0226 16:40:53.024237 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-gp899" event={"ID":"168c564e-1b9f-4eac-85dd-92665393067b","Type":"ContainerDied","Data":"35be87cbbd391983c7f43d6f3b879fe3791a0479471587598e185e6674fbd929"} Feb 26 16:40:53 crc kubenswrapper[4907]: I0226 16:40:53.024421 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-gp899" event={"ID":"168c564e-1b9f-4eac-85dd-92665393067b","Type":"ContainerStarted","Data":"c50099f71210a6c6f7ad7e42fdc1ae6abffaeba7829b04b0d94e001b5c8ad404"} Feb 26 16:40:54 crc kubenswrapper[4907]: I0226 16:40:54.041810 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-gp899" event={"ID":"168c564e-1b9f-4eac-85dd-92665393067b","Type":"ContainerStarted","Data":"3a401400beffc545549d1406e24b588e35b0d4d3fb653afbe7b0aed2edfdadd3"} Feb 26 16:40:59 crc kubenswrapper[4907]: I0226 16:40:59.092825 4907 generic.go:334] "Generic (PLEG): container finished" podID="168c564e-1b9f-4eac-85dd-92665393067b" containerID="3a401400beffc545549d1406e24b588e35b0d4d3fb653afbe7b0aed2edfdadd3" exitCode=0 Feb 26 16:40:59 crc kubenswrapper[4907]: I0226 16:40:59.092918 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-gp899" 
event={"ID":"168c564e-1b9f-4eac-85dd-92665393067b","Type":"ContainerDied","Data":"3a401400beffc545549d1406e24b588e35b0d4d3fb653afbe7b0aed2edfdadd3"} Feb 26 16:41:00 crc kubenswrapper[4907]: I0226 16:41:00.105033 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-gp899" event={"ID":"168c564e-1b9f-4eac-85dd-92665393067b","Type":"ContainerStarted","Data":"cc9ab5292858f583dd169a755558984626b64db2869ccfa53d227a2114d3c043"} Feb 26 16:41:00 crc kubenswrapper[4907]: I0226 16:41:00.136166 4907 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-gp899" podStartSLOduration=2.378257492 podStartE2EDuration="9.136134646s" podCreationTimestamp="2026-02-26 16:40:51 +0000 UTC" firstStartedPulling="2026-02-26 16:40:53.026083305 +0000 UTC m=+3515.544645154" lastFinishedPulling="2026-02-26 16:40:59.783960459 +0000 UTC m=+3522.302522308" observedRunningTime="2026-02-26 16:41:00.120463432 +0000 UTC m=+3522.639025281" watchObservedRunningTime="2026-02-26 16:41:00.136134646 +0000 UTC m=+3522.654696505" Feb 26 16:41:00 crc kubenswrapper[4907]: I0226 16:41:00.436632 4907 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-must-gather-2htz6/must-gather-jzgwz"] Feb 26 16:41:00 crc kubenswrapper[4907]: I0226 16:41:00.437465 4907 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-must-gather-2htz6/must-gather-jzgwz" podUID="14587e07-76d8-408e-af38-0069fdd00ccd" containerName="copy" containerID="cri-o://54bb61436e47992c339c0869e06fe77d77dd30920e0ea1ee737d6be480502f47" gracePeriod=2 Feb 26 16:41:00 crc kubenswrapper[4907]: I0226 16:41:00.445715 4907 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-must-gather-2htz6/must-gather-jzgwz"] Feb 26 16:41:00 crc kubenswrapper[4907]: I0226 16:41:00.901032 4907 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-must-gather-2htz6_must-gather-jzgwz_14587e07-76d8-408e-af38-0069fdd00ccd/copy/0.log" Feb 26 16:41:00 crc kubenswrapper[4907]: I0226 16:41:00.901704 4907 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-2htz6/must-gather-jzgwz" Feb 26 16:41:01 crc kubenswrapper[4907]: I0226 16:41:01.036984 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/14587e07-76d8-408e-af38-0069fdd00ccd-must-gather-output\") pod \"14587e07-76d8-408e-af38-0069fdd00ccd\" (UID: \"14587e07-76d8-408e-af38-0069fdd00ccd\") " Feb 26 16:41:01 crc kubenswrapper[4907]: I0226 16:41:01.037255 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-d54qq\" (UniqueName: \"kubernetes.io/projected/14587e07-76d8-408e-af38-0069fdd00ccd-kube-api-access-d54qq\") pod \"14587e07-76d8-408e-af38-0069fdd00ccd\" (UID: \"14587e07-76d8-408e-af38-0069fdd00ccd\") " Feb 26 16:41:01 crc kubenswrapper[4907]: I0226 16:41:01.044020 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/14587e07-76d8-408e-af38-0069fdd00ccd-kube-api-access-d54qq" (OuterVolumeSpecName: "kube-api-access-d54qq") pod "14587e07-76d8-408e-af38-0069fdd00ccd" (UID: "14587e07-76d8-408e-af38-0069fdd00ccd"). InnerVolumeSpecName "kube-api-access-d54qq". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 26 16:41:01 crc kubenswrapper[4907]: I0226 16:41:01.139132 4907 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-d54qq\" (UniqueName: \"kubernetes.io/projected/14587e07-76d8-408e-af38-0069fdd00ccd-kube-api-access-d54qq\") on node \"crc\" DevicePath \"\"" Feb 26 16:41:01 crc kubenswrapper[4907]: I0226 16:41:01.150142 4907 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-must-gather-2htz6_must-gather-jzgwz_14587e07-76d8-408e-af38-0069fdd00ccd/copy/0.log" Feb 26 16:41:01 crc kubenswrapper[4907]: I0226 16:41:01.151394 4907 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-2htz6/must-gather-jzgwz" Feb 26 16:41:01 crc kubenswrapper[4907]: I0226 16:41:01.151477 4907 scope.go:117] "RemoveContainer" containerID="54bb61436e47992c339c0869e06fe77d77dd30920e0ea1ee737d6be480502f47" Feb 26 16:41:01 crc kubenswrapper[4907]: I0226 16:41:01.151549 4907 generic.go:334] "Generic (PLEG): container finished" podID="14587e07-76d8-408e-af38-0069fdd00ccd" containerID="54bb61436e47992c339c0869e06fe77d77dd30920e0ea1ee737d6be480502f47" exitCode=143 Feb 26 16:41:01 crc kubenswrapper[4907]: I0226 16:41:01.158988 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/14587e07-76d8-408e-af38-0069fdd00ccd-must-gather-output" (OuterVolumeSpecName: "must-gather-output") pod "14587e07-76d8-408e-af38-0069fdd00ccd" (UID: "14587e07-76d8-408e-af38-0069fdd00ccd"). InnerVolumeSpecName "must-gather-output". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 26 16:41:01 crc kubenswrapper[4907]: I0226 16:41:01.178697 4907 scope.go:117] "RemoveContainer" containerID="23026d644dbedb1b2cf8781844be69644ad112c6abd092271f8d9ccb92927b50" Feb 26 16:41:01 crc kubenswrapper[4907]: I0226 16:41:01.243793 4907 reconciler_common.go:293] "Volume detached for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/14587e07-76d8-408e-af38-0069fdd00ccd-must-gather-output\") on node \"crc\" DevicePath \"\"" Feb 26 16:41:01 crc kubenswrapper[4907]: I0226 16:41:01.257746 4907 scope.go:117] "RemoveContainer" containerID="54bb61436e47992c339c0869e06fe77d77dd30920e0ea1ee737d6be480502f47" Feb 26 16:41:01 crc kubenswrapper[4907]: E0226 16:41:01.259524 4907 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"54bb61436e47992c339c0869e06fe77d77dd30920e0ea1ee737d6be480502f47\": container with ID starting with 54bb61436e47992c339c0869e06fe77d77dd30920e0ea1ee737d6be480502f47 not found: ID does not exist" containerID="54bb61436e47992c339c0869e06fe77d77dd30920e0ea1ee737d6be480502f47" Feb 26 16:41:01 crc kubenswrapper[4907]: I0226 16:41:01.259567 4907 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"54bb61436e47992c339c0869e06fe77d77dd30920e0ea1ee737d6be480502f47"} err="failed to get container status \"54bb61436e47992c339c0869e06fe77d77dd30920e0ea1ee737d6be480502f47\": rpc error: code = NotFound desc = could not find container \"54bb61436e47992c339c0869e06fe77d77dd30920e0ea1ee737d6be480502f47\": container with ID starting with 54bb61436e47992c339c0869e06fe77d77dd30920e0ea1ee737d6be480502f47 not found: ID does not exist" Feb 26 16:41:01 crc kubenswrapper[4907]: I0226 16:41:01.259601 4907 scope.go:117] "RemoveContainer" containerID="23026d644dbedb1b2cf8781844be69644ad112c6abd092271f8d9ccb92927b50" Feb 26 16:41:01 crc kubenswrapper[4907]: E0226 16:41:01.259833 4907 
log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"23026d644dbedb1b2cf8781844be69644ad112c6abd092271f8d9ccb92927b50\": container with ID starting with 23026d644dbedb1b2cf8781844be69644ad112c6abd092271f8d9ccb92927b50 not found: ID does not exist" containerID="23026d644dbedb1b2cf8781844be69644ad112c6abd092271f8d9ccb92927b50" Feb 26 16:41:01 crc kubenswrapper[4907]: I0226 16:41:01.259863 4907 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"23026d644dbedb1b2cf8781844be69644ad112c6abd092271f8d9ccb92927b50"} err="failed to get container status \"23026d644dbedb1b2cf8781844be69644ad112c6abd092271f8d9ccb92927b50\": rpc error: code = NotFound desc = could not find container \"23026d644dbedb1b2cf8781844be69644ad112c6abd092271f8d9ccb92927b50\": container with ID starting with 23026d644dbedb1b2cf8781844be69644ad112c6abd092271f8d9ccb92927b50 not found: ID does not exist" Feb 26 16:41:01 crc kubenswrapper[4907]: I0226 16:41:01.840137 4907 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-gp899" Feb 26 16:41:01 crc kubenswrapper[4907]: I0226 16:41:01.840185 4907 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-gp899" Feb 26 16:41:02 crc kubenswrapper[4907]: I0226 16:41:02.135114 4907 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="14587e07-76d8-408e-af38-0069fdd00ccd" path="/var/lib/kubelet/pods/14587e07-76d8-408e-af38-0069fdd00ccd/volumes" Feb 26 16:41:02 crc kubenswrapper[4907]: I0226 16:41:02.889994 4907 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-gp899" podUID="168c564e-1b9f-4eac-85dd-92665393067b" containerName="registry-server" probeResult="failure" output=< Feb 26 16:41:02 crc kubenswrapper[4907]: timeout: failed to connect service ":50051" within 1s Feb 26 
16:41:02 crc kubenswrapper[4907]: > Feb 26 16:41:11 crc kubenswrapper[4907]: I0226 16:41:11.924373 4907 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-gp899" Feb 26 16:41:11 crc kubenswrapper[4907]: I0226 16:41:11.977279 4907 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-gp899" Feb 26 16:41:12 crc kubenswrapper[4907]: I0226 16:41:12.036364 4907 scope.go:117] "RemoveContainer" containerID="b779c1853e304ab137b2e007a5a8ec149347f17d993d712d3e49d4ba65136395" Feb 26 16:41:12 crc kubenswrapper[4907]: I0226 16:41:12.162612 4907 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-gp899"] Feb 26 16:41:13 crc kubenswrapper[4907]: I0226 16:41:13.271004 4907 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-gp899" podUID="168c564e-1b9f-4eac-85dd-92665393067b" containerName="registry-server" containerID="cri-o://cc9ab5292858f583dd169a755558984626b64db2869ccfa53d227a2114d3c043" gracePeriod=2 Feb 26 16:41:13 crc kubenswrapper[4907]: I0226 16:41:13.758344 4907 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-gp899" Feb 26 16:41:13 crc kubenswrapper[4907]: I0226 16:41:13.896969 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/168c564e-1b9f-4eac-85dd-92665393067b-utilities\") pod \"168c564e-1b9f-4eac-85dd-92665393067b\" (UID: \"168c564e-1b9f-4eac-85dd-92665393067b\") " Feb 26 16:41:13 crc kubenswrapper[4907]: I0226 16:41:13.897063 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9qt27\" (UniqueName: \"kubernetes.io/projected/168c564e-1b9f-4eac-85dd-92665393067b-kube-api-access-9qt27\") pod \"168c564e-1b9f-4eac-85dd-92665393067b\" (UID: \"168c564e-1b9f-4eac-85dd-92665393067b\") " Feb 26 16:41:13 crc kubenswrapper[4907]: I0226 16:41:13.897225 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/168c564e-1b9f-4eac-85dd-92665393067b-catalog-content\") pod \"168c564e-1b9f-4eac-85dd-92665393067b\" (UID: \"168c564e-1b9f-4eac-85dd-92665393067b\") " Feb 26 16:41:13 crc kubenswrapper[4907]: I0226 16:41:13.897813 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/168c564e-1b9f-4eac-85dd-92665393067b-utilities" (OuterVolumeSpecName: "utilities") pod "168c564e-1b9f-4eac-85dd-92665393067b" (UID: "168c564e-1b9f-4eac-85dd-92665393067b"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 26 16:41:13 crc kubenswrapper[4907]: I0226 16:41:13.904194 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/168c564e-1b9f-4eac-85dd-92665393067b-kube-api-access-9qt27" (OuterVolumeSpecName: "kube-api-access-9qt27") pod "168c564e-1b9f-4eac-85dd-92665393067b" (UID: "168c564e-1b9f-4eac-85dd-92665393067b"). InnerVolumeSpecName "kube-api-access-9qt27". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 26 16:41:14 crc kubenswrapper[4907]: I0226 16:41:13.999726 4907 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/168c564e-1b9f-4eac-85dd-92665393067b-utilities\") on node \"crc\" DevicePath \"\"" Feb 26 16:41:14 crc kubenswrapper[4907]: I0226 16:41:13.999764 4907 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9qt27\" (UniqueName: \"kubernetes.io/projected/168c564e-1b9f-4eac-85dd-92665393067b-kube-api-access-9qt27\") on node \"crc\" DevicePath \"\"" Feb 26 16:41:14 crc kubenswrapper[4907]: I0226 16:41:14.041933 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/168c564e-1b9f-4eac-85dd-92665393067b-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "168c564e-1b9f-4eac-85dd-92665393067b" (UID: "168c564e-1b9f-4eac-85dd-92665393067b"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 26 16:41:14 crc kubenswrapper[4907]: I0226 16:41:14.100888 4907 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/168c564e-1b9f-4eac-85dd-92665393067b-catalog-content\") on node \"crc\" DevicePath \"\"" Feb 26 16:41:14 crc kubenswrapper[4907]: I0226 16:41:14.285135 4907 generic.go:334] "Generic (PLEG): container finished" podID="168c564e-1b9f-4eac-85dd-92665393067b" containerID="cc9ab5292858f583dd169a755558984626b64db2869ccfa53d227a2114d3c043" exitCode=0 Feb 26 16:41:14 crc kubenswrapper[4907]: I0226 16:41:14.285327 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-gp899" event={"ID":"168c564e-1b9f-4eac-85dd-92665393067b","Type":"ContainerDied","Data":"cc9ab5292858f583dd169a755558984626b64db2869ccfa53d227a2114d3c043"} Feb 26 16:41:14 crc kubenswrapper[4907]: I0226 16:41:14.285412 4907 util.go:48] "No ready sandbox for pod can be 
found. Need to start a new one" pod="openshift-marketplace/redhat-operators-gp899" Feb 26 16:41:14 crc kubenswrapper[4907]: I0226 16:41:14.285497 4907 scope.go:117] "RemoveContainer" containerID="cc9ab5292858f583dd169a755558984626b64db2869ccfa53d227a2114d3c043" Feb 26 16:41:14 crc kubenswrapper[4907]: I0226 16:41:14.285480 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-gp899" event={"ID":"168c564e-1b9f-4eac-85dd-92665393067b","Type":"ContainerDied","Data":"c50099f71210a6c6f7ad7e42fdc1ae6abffaeba7829b04b0d94e001b5c8ad404"} Feb 26 16:41:14 crc kubenswrapper[4907]: I0226 16:41:14.309849 4907 scope.go:117] "RemoveContainer" containerID="3a401400beffc545549d1406e24b588e35b0d4d3fb653afbe7b0aed2edfdadd3" Feb 26 16:41:14 crc kubenswrapper[4907]: I0226 16:41:14.315887 4907 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-gp899"] Feb 26 16:41:14 crc kubenswrapper[4907]: I0226 16:41:14.324923 4907 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-gp899"] Feb 26 16:41:14 crc kubenswrapper[4907]: I0226 16:41:14.343310 4907 scope.go:117] "RemoveContainer" containerID="35be87cbbd391983c7f43d6f3b879fe3791a0479471587598e185e6674fbd929" Feb 26 16:41:14 crc kubenswrapper[4907]: I0226 16:41:14.389492 4907 scope.go:117] "RemoveContainer" containerID="cc9ab5292858f583dd169a755558984626b64db2869ccfa53d227a2114d3c043" Feb 26 16:41:14 crc kubenswrapper[4907]: E0226 16:41:14.390124 4907 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"cc9ab5292858f583dd169a755558984626b64db2869ccfa53d227a2114d3c043\": container with ID starting with cc9ab5292858f583dd169a755558984626b64db2869ccfa53d227a2114d3c043 not found: ID does not exist" containerID="cc9ab5292858f583dd169a755558984626b64db2869ccfa53d227a2114d3c043" Feb 26 16:41:14 crc kubenswrapper[4907]: I0226 16:41:14.390196 4907 
pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"cc9ab5292858f583dd169a755558984626b64db2869ccfa53d227a2114d3c043"} err="failed to get container status \"cc9ab5292858f583dd169a755558984626b64db2869ccfa53d227a2114d3c043\": rpc error: code = NotFound desc = could not find container \"cc9ab5292858f583dd169a755558984626b64db2869ccfa53d227a2114d3c043\": container with ID starting with cc9ab5292858f583dd169a755558984626b64db2869ccfa53d227a2114d3c043 not found: ID does not exist" Feb 26 16:41:14 crc kubenswrapper[4907]: I0226 16:41:14.390231 4907 scope.go:117] "RemoveContainer" containerID="3a401400beffc545549d1406e24b588e35b0d4d3fb653afbe7b0aed2edfdadd3" Feb 26 16:41:14 crc kubenswrapper[4907]: E0226 16:41:14.390480 4907 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"3a401400beffc545549d1406e24b588e35b0d4d3fb653afbe7b0aed2edfdadd3\": container with ID starting with 3a401400beffc545549d1406e24b588e35b0d4d3fb653afbe7b0aed2edfdadd3 not found: ID does not exist" containerID="3a401400beffc545549d1406e24b588e35b0d4d3fb653afbe7b0aed2edfdadd3" Feb 26 16:41:14 crc kubenswrapper[4907]: I0226 16:41:14.390501 4907 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3a401400beffc545549d1406e24b588e35b0d4d3fb653afbe7b0aed2edfdadd3"} err="failed to get container status \"3a401400beffc545549d1406e24b588e35b0d4d3fb653afbe7b0aed2edfdadd3\": rpc error: code = NotFound desc = could not find container \"3a401400beffc545549d1406e24b588e35b0d4d3fb653afbe7b0aed2edfdadd3\": container with ID starting with 3a401400beffc545549d1406e24b588e35b0d4d3fb653afbe7b0aed2edfdadd3 not found: ID does not exist" Feb 26 16:41:14 crc kubenswrapper[4907]: I0226 16:41:14.390514 4907 scope.go:117] "RemoveContainer" containerID="35be87cbbd391983c7f43d6f3b879fe3791a0479471587598e185e6674fbd929" Feb 26 16:41:14 crc kubenswrapper[4907]: E0226 
16:41:14.390818 4907 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"35be87cbbd391983c7f43d6f3b879fe3791a0479471587598e185e6674fbd929\": container with ID starting with 35be87cbbd391983c7f43d6f3b879fe3791a0479471587598e185e6674fbd929 not found: ID does not exist" containerID="35be87cbbd391983c7f43d6f3b879fe3791a0479471587598e185e6674fbd929" Feb 26 16:41:14 crc kubenswrapper[4907]: I0226 16:41:14.390867 4907 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"35be87cbbd391983c7f43d6f3b879fe3791a0479471587598e185e6674fbd929"} err="failed to get container status \"35be87cbbd391983c7f43d6f3b879fe3791a0479471587598e185e6674fbd929\": rpc error: code = NotFound desc = could not find container \"35be87cbbd391983c7f43d6f3b879fe3791a0479471587598e185e6674fbd929\": container with ID starting with 35be87cbbd391983c7f43d6f3b879fe3791a0479471587598e185e6674fbd929 not found: ID does not exist" Feb 26 16:41:16 crc kubenswrapper[4907]: I0226 16:41:16.143901 4907 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="168c564e-1b9f-4eac-85dd-92665393067b" path="/var/lib/kubelet/pods/168c564e-1b9f-4eac-85dd-92665393067b/volumes" Feb 26 16:41:57 crc kubenswrapper[4907]: I0226 16:41:57.539926 4907 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-6xkqt"] Feb 26 16:41:57 crc kubenswrapper[4907]: E0226 16:41:57.541229 4907 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="14587e07-76d8-408e-af38-0069fdd00ccd" containerName="gather" Feb 26 16:41:57 crc kubenswrapper[4907]: I0226 16:41:57.541259 4907 state_mem.go:107] "Deleted CPUSet assignment" podUID="14587e07-76d8-408e-af38-0069fdd00ccd" containerName="gather" Feb 26 16:41:57 crc kubenswrapper[4907]: E0226 16:41:57.541334 4907 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="14587e07-76d8-408e-af38-0069fdd00ccd" 
containerName="copy" Feb 26 16:41:57 crc kubenswrapper[4907]: I0226 16:41:57.541348 4907 state_mem.go:107] "Deleted CPUSet assignment" podUID="14587e07-76d8-408e-af38-0069fdd00ccd" containerName="copy" Feb 26 16:41:57 crc kubenswrapper[4907]: E0226 16:41:57.541377 4907 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="168c564e-1b9f-4eac-85dd-92665393067b" containerName="extract-utilities" Feb 26 16:41:57 crc kubenswrapper[4907]: I0226 16:41:57.541390 4907 state_mem.go:107] "Deleted CPUSet assignment" podUID="168c564e-1b9f-4eac-85dd-92665393067b" containerName="extract-utilities" Feb 26 16:41:57 crc kubenswrapper[4907]: E0226 16:41:57.541408 4907 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="168c564e-1b9f-4eac-85dd-92665393067b" containerName="registry-server" Feb 26 16:41:57 crc kubenswrapper[4907]: I0226 16:41:57.541419 4907 state_mem.go:107] "Deleted CPUSet assignment" podUID="168c564e-1b9f-4eac-85dd-92665393067b" containerName="registry-server" Feb 26 16:41:57 crc kubenswrapper[4907]: E0226 16:41:57.541455 4907 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="168c564e-1b9f-4eac-85dd-92665393067b" containerName="extract-content" Feb 26 16:41:57 crc kubenswrapper[4907]: I0226 16:41:57.541467 4907 state_mem.go:107] "Deleted CPUSet assignment" podUID="168c564e-1b9f-4eac-85dd-92665393067b" containerName="extract-content" Feb 26 16:41:57 crc kubenswrapper[4907]: I0226 16:41:57.541811 4907 memory_manager.go:354] "RemoveStaleState removing state" podUID="14587e07-76d8-408e-af38-0069fdd00ccd" containerName="copy" Feb 26 16:41:57 crc kubenswrapper[4907]: I0226 16:41:57.541828 4907 memory_manager.go:354] "RemoveStaleState removing state" podUID="168c564e-1b9f-4eac-85dd-92665393067b" containerName="registry-server" Feb 26 16:41:57 crc kubenswrapper[4907]: I0226 16:41:57.541844 4907 memory_manager.go:354] "RemoveStaleState removing state" podUID="14587e07-76d8-408e-af38-0069fdd00ccd" containerName="gather" Feb 26 
16:41:57 crc kubenswrapper[4907]: I0226 16:41:57.545122 4907 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-6xkqt"
Feb 26 16:41:57 crc kubenswrapper[4907]: I0226 16:41:57.549085 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bfghl\" (UniqueName: \"kubernetes.io/projected/9b796994-1279-4438-b8f7-406b58507038-kube-api-access-bfghl\") pod \"redhat-marketplace-6xkqt\" (UID: \"9b796994-1279-4438-b8f7-406b58507038\") " pod="openshift-marketplace/redhat-marketplace-6xkqt"
Feb 26 16:41:57 crc kubenswrapper[4907]: I0226 16:41:57.549380 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/9b796994-1279-4438-b8f7-406b58507038-utilities\") pod \"redhat-marketplace-6xkqt\" (UID: \"9b796994-1279-4438-b8f7-406b58507038\") " pod="openshift-marketplace/redhat-marketplace-6xkqt"
Feb 26 16:41:57 crc kubenswrapper[4907]: I0226 16:41:57.549490 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/9b796994-1279-4438-b8f7-406b58507038-catalog-content\") pod \"redhat-marketplace-6xkqt\" (UID: \"9b796994-1279-4438-b8f7-406b58507038\") " pod="openshift-marketplace/redhat-marketplace-6xkqt"
Feb 26 16:41:57 crc kubenswrapper[4907]: I0226 16:41:57.573208 4907 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-6xkqt"]
Feb 26 16:41:57 crc kubenswrapper[4907]: I0226 16:41:57.652638 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/9b796994-1279-4438-b8f7-406b58507038-utilities\") pod \"redhat-marketplace-6xkqt\" (UID: \"9b796994-1279-4438-b8f7-406b58507038\") " pod="openshift-marketplace/redhat-marketplace-6xkqt"
Feb 26 16:41:57 crc kubenswrapper[4907]: I0226 16:41:57.652685 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/9b796994-1279-4438-b8f7-406b58507038-catalog-content\") pod \"redhat-marketplace-6xkqt\" (UID: \"9b796994-1279-4438-b8f7-406b58507038\") " pod="openshift-marketplace/redhat-marketplace-6xkqt"
Feb 26 16:41:57 crc kubenswrapper[4907]: I0226 16:41:57.652753 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bfghl\" (UniqueName: \"kubernetes.io/projected/9b796994-1279-4438-b8f7-406b58507038-kube-api-access-bfghl\") pod \"redhat-marketplace-6xkqt\" (UID: \"9b796994-1279-4438-b8f7-406b58507038\") " pod="openshift-marketplace/redhat-marketplace-6xkqt"
Feb 26 16:41:57 crc kubenswrapper[4907]: I0226 16:41:57.653631 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/9b796994-1279-4438-b8f7-406b58507038-utilities\") pod \"redhat-marketplace-6xkqt\" (UID: \"9b796994-1279-4438-b8f7-406b58507038\") " pod="openshift-marketplace/redhat-marketplace-6xkqt"
Feb 26 16:41:57 crc kubenswrapper[4907]: I0226 16:41:57.653933 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/9b796994-1279-4438-b8f7-406b58507038-catalog-content\") pod \"redhat-marketplace-6xkqt\" (UID: \"9b796994-1279-4438-b8f7-406b58507038\") " pod="openshift-marketplace/redhat-marketplace-6xkqt"
Feb 26 16:41:57 crc kubenswrapper[4907]: I0226 16:41:57.679789 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bfghl\" (UniqueName: \"kubernetes.io/projected/9b796994-1279-4438-b8f7-406b58507038-kube-api-access-bfghl\") pod \"redhat-marketplace-6xkqt\" (UID: \"9b796994-1279-4438-b8f7-406b58507038\") " pod="openshift-marketplace/redhat-marketplace-6xkqt"
Feb 26 16:41:57 crc kubenswrapper[4907]: I0226 16:41:57.871110 4907 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-6xkqt"
Feb 26 16:41:58 crc kubenswrapper[4907]: I0226 16:41:58.160773 4907 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-6xkqt"]
Feb 26 16:41:58 crc kubenswrapper[4907]: I0226 16:41:58.759996 4907 generic.go:334] "Generic (PLEG): container finished" podID="9b796994-1279-4438-b8f7-406b58507038" containerID="e82336cf271c94acb3595b80df7abee16d0770dd93532b5e80e7c67c6d2626fa" exitCode=0
Feb 26 16:41:58 crc kubenswrapper[4907]: I0226 16:41:58.760061 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-6xkqt" event={"ID":"9b796994-1279-4438-b8f7-406b58507038","Type":"ContainerDied","Data":"e82336cf271c94acb3595b80df7abee16d0770dd93532b5e80e7c67c6d2626fa"}
Feb 26 16:41:58 crc kubenswrapper[4907]: I0226 16:41:58.760356 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-6xkqt" event={"ID":"9b796994-1279-4438-b8f7-406b58507038","Type":"ContainerStarted","Data":"215aedf0206f69669a68cb1e314d78ee1c91c7bc320ab7e62fd8a9d4d9ec00e7"}
Feb 26 16:41:58 crc kubenswrapper[4907]: I0226 16:41:58.762517 4907 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider
Feb 26 16:42:00 crc kubenswrapper[4907]: I0226 16:42:00.176700 4907 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29535402-898vl"]
Feb 26 16:42:00 crc kubenswrapper[4907]: I0226 16:42:00.178879 4907 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29535402-898vl"
Feb 26 16:42:00 crc kubenswrapper[4907]: I0226 16:42:00.185652 4907 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt"
Feb 26 16:42:00 crc kubenswrapper[4907]: I0226 16:42:00.187921 4907 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-n2mrp"
Feb 26 16:42:00 crc kubenswrapper[4907]: I0226 16:42:00.188343 4907 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt"
Feb 26 16:42:00 crc kubenswrapper[4907]: I0226 16:42:00.190342 4907 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29535402-898vl"]
Feb 26 16:42:00 crc kubenswrapper[4907]: I0226 16:42:00.223947 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ld6qn\" (UniqueName: \"kubernetes.io/projected/422322a7-b98c-474e-af80-f774837de503-kube-api-access-ld6qn\") pod \"auto-csr-approver-29535402-898vl\" (UID: \"422322a7-b98c-474e-af80-f774837de503\") " pod="openshift-infra/auto-csr-approver-29535402-898vl"
Feb 26 16:42:00 crc kubenswrapper[4907]: I0226 16:42:00.325711 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ld6qn\" (UniqueName: \"kubernetes.io/projected/422322a7-b98c-474e-af80-f774837de503-kube-api-access-ld6qn\") pod \"auto-csr-approver-29535402-898vl\" (UID: \"422322a7-b98c-474e-af80-f774837de503\") " pod="openshift-infra/auto-csr-approver-29535402-898vl"
Feb 26 16:42:00 crc kubenswrapper[4907]: I0226 16:42:00.349389 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ld6qn\" (UniqueName: \"kubernetes.io/projected/422322a7-b98c-474e-af80-f774837de503-kube-api-access-ld6qn\") pod \"auto-csr-approver-29535402-898vl\" (UID: \"422322a7-b98c-474e-af80-f774837de503\") " pod="openshift-infra/auto-csr-approver-29535402-898vl"
Feb 26 16:42:00 crc kubenswrapper[4907]: I0226 16:42:00.499489 4907 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29535402-898vl"
Feb 26 16:42:00 crc kubenswrapper[4907]: I0226 16:42:00.778917 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-6xkqt" event={"ID":"9b796994-1279-4438-b8f7-406b58507038","Type":"ContainerStarted","Data":"5d1e544eb00bfa7ccb6dfe7003d336c4b1021c6033683ce218fc8adbc916f7cf"}
Feb 26 16:42:01 crc kubenswrapper[4907]: I0226 16:42:01.032162 4907 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29535402-898vl"]
Feb 26 16:42:01 crc kubenswrapper[4907]: I0226 16:42:01.787645 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29535402-898vl" event={"ID":"422322a7-b98c-474e-af80-f774837de503","Type":"ContainerStarted","Data":"2fbae97e8ff1e109a25304f4b158427e9126512362522b32203294a257fdd0a3"}
Feb 26 16:42:01 crc kubenswrapper[4907]: I0226 16:42:01.789885 4907 generic.go:334] "Generic (PLEG): container finished" podID="9b796994-1279-4438-b8f7-406b58507038" containerID="5d1e544eb00bfa7ccb6dfe7003d336c4b1021c6033683ce218fc8adbc916f7cf" exitCode=0
Feb 26 16:42:01 crc kubenswrapper[4907]: I0226 16:42:01.789944 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-6xkqt" event={"ID":"9b796994-1279-4438-b8f7-406b58507038","Type":"ContainerDied","Data":"5d1e544eb00bfa7ccb6dfe7003d336c4b1021c6033683ce218fc8adbc916f7cf"}
Feb 26 16:42:02 crc kubenswrapper[4907]: I0226 16:42:02.802544 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-6xkqt" event={"ID":"9b796994-1279-4438-b8f7-406b58507038","Type":"ContainerStarted","Data":"cd022fc75b97036837daba4a679bc3033991e22e07d31b52c8dc9245e473b292"}
Feb 26 16:42:02 crc kubenswrapper[4907]: I0226 16:42:02.804365 4907 generic.go:334] "Generic (PLEG): container finished" podID="422322a7-b98c-474e-af80-f774837de503" containerID="9a3e8d1467b01f9bfe3bb524229f5a55e490cdb5ce3cd5f83c561c5a4a8908ac" exitCode=0
Feb 26 16:42:02 crc kubenswrapper[4907]: I0226 16:42:02.804400 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29535402-898vl" event={"ID":"422322a7-b98c-474e-af80-f774837de503","Type":"ContainerDied","Data":"9a3e8d1467b01f9bfe3bb524229f5a55e490cdb5ce3cd5f83c561c5a4a8908ac"}
Feb 26 16:42:02 crc kubenswrapper[4907]: I0226 16:42:02.850827 4907 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-6xkqt" podStartSLOduration=2.431013637 podStartE2EDuration="5.850810558s" podCreationTimestamp="2026-02-26 16:41:57 +0000 UTC" firstStartedPulling="2026-02-26 16:41:58.762132522 +0000 UTC m=+3581.280694411" lastFinishedPulling="2026-02-26 16:42:02.181929493 +0000 UTC m=+3584.700491332" observedRunningTime="2026-02-26 16:42:02.833112766 +0000 UTC m=+3585.351674615" watchObservedRunningTime="2026-02-26 16:42:02.850810558 +0000 UTC m=+3585.369372407"
Feb 26 16:42:04 crc kubenswrapper[4907]: I0226 16:42:04.285887 4907 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29535402-898vl"
Feb 26 16:42:04 crc kubenswrapper[4907]: I0226 16:42:04.408258 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ld6qn\" (UniqueName: \"kubernetes.io/projected/422322a7-b98c-474e-af80-f774837de503-kube-api-access-ld6qn\") pod \"422322a7-b98c-474e-af80-f774837de503\" (UID: \"422322a7-b98c-474e-af80-f774837de503\") "
Feb 26 16:42:04 crc kubenswrapper[4907]: I0226 16:42:04.414857 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/422322a7-b98c-474e-af80-f774837de503-kube-api-access-ld6qn" (OuterVolumeSpecName: "kube-api-access-ld6qn") pod "422322a7-b98c-474e-af80-f774837de503" (UID: "422322a7-b98c-474e-af80-f774837de503"). InnerVolumeSpecName "kube-api-access-ld6qn". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 26 16:42:04 crc kubenswrapper[4907]: I0226 16:42:04.510464 4907 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ld6qn\" (UniqueName: \"kubernetes.io/projected/422322a7-b98c-474e-af80-f774837de503-kube-api-access-ld6qn\") on node \"crc\" DevicePath \"\""
Feb 26 16:42:04 crc kubenswrapper[4907]: I0226 16:42:04.830708 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29535402-898vl" event={"ID":"422322a7-b98c-474e-af80-f774837de503","Type":"ContainerDied","Data":"2fbae97e8ff1e109a25304f4b158427e9126512362522b32203294a257fdd0a3"}
Feb 26 16:42:04 crc kubenswrapper[4907]: I0226 16:42:04.830746 4907 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="2fbae97e8ff1e109a25304f4b158427e9126512362522b32203294a257fdd0a3"
Feb 26 16:42:04 crc kubenswrapper[4907]: I0226 16:42:04.830798 4907 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29535402-898vl"
Feb 26 16:42:05 crc kubenswrapper[4907]: I0226 16:42:05.369513 4907 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29535396-br7bg"]
Feb 26 16:42:05 crc kubenswrapper[4907]: I0226 16:42:05.380743 4907 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29535396-br7bg"]
Feb 26 16:42:06 crc kubenswrapper[4907]: I0226 16:42:06.137905 4907 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6c80b0c5-c510-48cf-937a-3a9f11285427" path="/var/lib/kubelet/pods/6c80b0c5-c510-48cf-937a-3a9f11285427/volumes"
Feb 26 16:42:07 crc kubenswrapper[4907]: I0226 16:42:07.871230 4907 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-6xkqt"
Feb 26 16:42:07 crc kubenswrapper[4907]: I0226 16:42:07.871647 4907 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-6xkqt"
Feb 26 16:42:07 crc kubenswrapper[4907]: I0226 16:42:07.941027 4907 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-6xkqt"
Feb 26 16:42:08 crc kubenswrapper[4907]: I0226 16:42:08.955337 4907 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-6xkqt"
Feb 26 16:42:09 crc kubenswrapper[4907]: I0226 16:42:09.014952 4907 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-6xkqt"]
Feb 26 16:42:10 crc kubenswrapper[4907]: I0226 16:42:10.891575 4907 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-6xkqt" podUID="9b796994-1279-4438-b8f7-406b58507038" containerName="registry-server" containerID="cri-o://cd022fc75b97036837daba4a679bc3033991e22e07d31b52c8dc9245e473b292" gracePeriod=2
Feb 26 16:42:11 crc kubenswrapper[4907]: I0226 16:42:11.370203 4907 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-6xkqt"
Feb 26 16:42:11 crc kubenswrapper[4907]: I0226 16:42:11.460903 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/9b796994-1279-4438-b8f7-406b58507038-catalog-content\") pod \"9b796994-1279-4438-b8f7-406b58507038\" (UID: \"9b796994-1279-4438-b8f7-406b58507038\") "
Feb 26 16:42:11 crc kubenswrapper[4907]: I0226 16:42:11.461067 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/9b796994-1279-4438-b8f7-406b58507038-utilities\") pod \"9b796994-1279-4438-b8f7-406b58507038\" (UID: \"9b796994-1279-4438-b8f7-406b58507038\") "
Feb 26 16:42:11 crc kubenswrapper[4907]: I0226 16:42:11.461896 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/9b796994-1279-4438-b8f7-406b58507038-utilities" (OuterVolumeSpecName: "utilities") pod "9b796994-1279-4438-b8f7-406b58507038" (UID: "9b796994-1279-4438-b8f7-406b58507038"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Feb 26 16:42:11 crc kubenswrapper[4907]: I0226 16:42:11.463990 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-bfghl\" (UniqueName: \"kubernetes.io/projected/9b796994-1279-4438-b8f7-406b58507038-kube-api-access-bfghl\") pod \"9b796994-1279-4438-b8f7-406b58507038\" (UID: \"9b796994-1279-4438-b8f7-406b58507038\") "
Feb 26 16:42:11 crc kubenswrapper[4907]: I0226 16:42:11.465751 4907 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/9b796994-1279-4438-b8f7-406b58507038-utilities\") on node \"crc\" DevicePath \"\""
Feb 26 16:42:11 crc kubenswrapper[4907]: I0226 16:42:11.471959 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9b796994-1279-4438-b8f7-406b58507038-kube-api-access-bfghl" (OuterVolumeSpecName: "kube-api-access-bfghl") pod "9b796994-1279-4438-b8f7-406b58507038" (UID: "9b796994-1279-4438-b8f7-406b58507038"). InnerVolumeSpecName "kube-api-access-bfghl". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 26 16:42:11 crc kubenswrapper[4907]: I0226 16:42:11.491023 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/9b796994-1279-4438-b8f7-406b58507038-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "9b796994-1279-4438-b8f7-406b58507038" (UID: "9b796994-1279-4438-b8f7-406b58507038"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Feb 26 16:42:11 crc kubenswrapper[4907]: I0226 16:42:11.568026 4907 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-bfghl\" (UniqueName: \"kubernetes.io/projected/9b796994-1279-4438-b8f7-406b58507038-kube-api-access-bfghl\") on node \"crc\" DevicePath \"\""
Feb 26 16:42:11 crc kubenswrapper[4907]: I0226 16:42:11.568090 4907 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/9b796994-1279-4438-b8f7-406b58507038-catalog-content\") on node \"crc\" DevicePath \"\""
Feb 26 16:42:11 crc kubenswrapper[4907]: I0226 16:42:11.903656 4907 generic.go:334] "Generic (PLEG): container finished" podID="9b796994-1279-4438-b8f7-406b58507038" containerID="cd022fc75b97036837daba4a679bc3033991e22e07d31b52c8dc9245e473b292" exitCode=0
Feb 26 16:42:11 crc kubenswrapper[4907]: I0226 16:42:11.903738 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-6xkqt" event={"ID":"9b796994-1279-4438-b8f7-406b58507038","Type":"ContainerDied","Data":"cd022fc75b97036837daba4a679bc3033991e22e07d31b52c8dc9245e473b292"}
Feb 26 16:42:11 crc kubenswrapper[4907]: I0226 16:42:11.903796 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-6xkqt" event={"ID":"9b796994-1279-4438-b8f7-406b58507038","Type":"ContainerDied","Data":"215aedf0206f69669a68cb1e314d78ee1c91c7bc320ab7e62fd8a9d4d9ec00e7"}
Feb 26 16:42:11 crc kubenswrapper[4907]: I0226 16:42:11.903820 4907 scope.go:117] "RemoveContainer" containerID="cd022fc75b97036837daba4a679bc3033991e22e07d31b52c8dc9245e473b292"
Feb 26 16:42:11 crc kubenswrapper[4907]: I0226 16:42:11.903891 4907 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-6xkqt"
Feb 26 16:42:11 crc kubenswrapper[4907]: I0226 16:42:11.941413 4907 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-6xkqt"]
Feb 26 16:42:11 crc kubenswrapper[4907]: I0226 16:42:11.942900 4907 scope.go:117] "RemoveContainer" containerID="5d1e544eb00bfa7ccb6dfe7003d336c4b1021c6033683ce218fc8adbc916f7cf"
Feb 26 16:42:11 crc kubenswrapper[4907]: I0226 16:42:11.962528 4907 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-6xkqt"]
Feb 26 16:42:11 crc kubenswrapper[4907]: I0226 16:42:11.984770 4907 scope.go:117] "RemoveContainer" containerID="e82336cf271c94acb3595b80df7abee16d0770dd93532b5e80e7c67c6d2626fa"
Feb 26 16:42:12 crc kubenswrapper[4907]: I0226 16:42:12.019825 4907 scope.go:117] "RemoveContainer" containerID="cd022fc75b97036837daba4a679bc3033991e22e07d31b52c8dc9245e473b292"
Feb 26 16:42:12 crc kubenswrapper[4907]: E0226 16:42:12.020319 4907 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"cd022fc75b97036837daba4a679bc3033991e22e07d31b52c8dc9245e473b292\": container with ID starting with cd022fc75b97036837daba4a679bc3033991e22e07d31b52c8dc9245e473b292 not found: ID does not exist" containerID="cd022fc75b97036837daba4a679bc3033991e22e07d31b52c8dc9245e473b292"
Feb 26 16:42:12 crc kubenswrapper[4907]: I0226 16:42:12.020344 4907 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"cd022fc75b97036837daba4a679bc3033991e22e07d31b52c8dc9245e473b292"} err="failed to get container status \"cd022fc75b97036837daba4a679bc3033991e22e07d31b52c8dc9245e473b292\": rpc error: code = NotFound desc = could not find container \"cd022fc75b97036837daba4a679bc3033991e22e07d31b52c8dc9245e473b292\": container with ID starting with cd022fc75b97036837daba4a679bc3033991e22e07d31b52c8dc9245e473b292 not found: ID does not exist"
Feb 26 16:42:12 crc kubenswrapper[4907]: I0226 16:42:12.020363 4907 scope.go:117] "RemoveContainer" containerID="5d1e544eb00bfa7ccb6dfe7003d336c4b1021c6033683ce218fc8adbc916f7cf"
Feb 26 16:42:12 crc kubenswrapper[4907]: E0226 16:42:12.020798 4907 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"5d1e544eb00bfa7ccb6dfe7003d336c4b1021c6033683ce218fc8adbc916f7cf\": container with ID starting with 5d1e544eb00bfa7ccb6dfe7003d336c4b1021c6033683ce218fc8adbc916f7cf not found: ID does not exist" containerID="5d1e544eb00bfa7ccb6dfe7003d336c4b1021c6033683ce218fc8adbc916f7cf"
Feb 26 16:42:12 crc kubenswrapper[4907]: I0226 16:42:12.020839 4907 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5d1e544eb00bfa7ccb6dfe7003d336c4b1021c6033683ce218fc8adbc916f7cf"} err="failed to get container status \"5d1e544eb00bfa7ccb6dfe7003d336c4b1021c6033683ce218fc8adbc916f7cf\": rpc error: code = NotFound desc = could not find container \"5d1e544eb00bfa7ccb6dfe7003d336c4b1021c6033683ce218fc8adbc916f7cf\": container with ID starting with 5d1e544eb00bfa7ccb6dfe7003d336c4b1021c6033683ce218fc8adbc916f7cf not found: ID does not exist"
Feb 26 16:42:12 crc kubenswrapper[4907]: I0226 16:42:12.020866 4907 scope.go:117] "RemoveContainer" containerID="e82336cf271c94acb3595b80df7abee16d0770dd93532b5e80e7c67c6d2626fa"
Feb 26 16:42:12 crc kubenswrapper[4907]: E0226 16:42:12.021196 4907 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e82336cf271c94acb3595b80df7abee16d0770dd93532b5e80e7c67c6d2626fa\": container with ID starting with e82336cf271c94acb3595b80df7abee16d0770dd93532b5e80e7c67c6d2626fa not found: ID does not exist" containerID="e82336cf271c94acb3595b80df7abee16d0770dd93532b5e80e7c67c6d2626fa"
Feb 26 16:42:12 crc kubenswrapper[4907]: I0226 16:42:12.021254 4907 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e82336cf271c94acb3595b80df7abee16d0770dd93532b5e80e7c67c6d2626fa"} err="failed to get container status \"e82336cf271c94acb3595b80df7abee16d0770dd93532b5e80e7c67c6d2626fa\": rpc error: code = NotFound desc = could not find container \"e82336cf271c94acb3595b80df7abee16d0770dd93532b5e80e7c67c6d2626fa\": container with ID starting with e82336cf271c94acb3595b80df7abee16d0770dd93532b5e80e7c67c6d2626fa not found: ID does not exist"
Feb 26 16:42:12 crc kubenswrapper[4907]: I0226 16:42:12.127857 4907 scope.go:117] "RemoveContainer" containerID="d0e560d55afcd969bf8b16d45b3e8a2d898583108a265cf0817897c4fede33d3"
Feb 26 16:42:12 crc kubenswrapper[4907]: I0226 16:42:12.142772 4907 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9b796994-1279-4438-b8f7-406b58507038" path="/var/lib/kubelet/pods/9b796994-1279-4438-b8f7-406b58507038/volumes"
Feb 26 16:42:18 crc kubenswrapper[4907]: I0226 16:42:18.530553 4907 patch_prober.go:28] interesting pod/machine-config-daemon-v5ng6 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Feb 26 16:42:18 crc kubenswrapper[4907]: I0226 16:42:18.531153 4907 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-v5ng6" podUID="917eebf3-db36-47b8-af0a-b80d042fddab" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Feb 26 16:42:48 crc kubenswrapper[4907]: I0226 16:42:48.530501 4907 patch_prober.go:28] interesting pod/machine-config-daemon-v5ng6 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Feb 26 16:42:48 crc kubenswrapper[4907]: I0226 16:42:48.531054 4907 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-v5ng6" podUID="917eebf3-db36-47b8-af0a-b80d042fddab" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Feb 26 16:43:18 crc kubenswrapper[4907]: I0226 16:43:18.530003 4907 patch_prober.go:28] interesting pod/machine-config-daemon-v5ng6 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Feb 26 16:43:18 crc kubenswrapper[4907]: I0226 16:43:18.530536 4907 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-v5ng6" podUID="917eebf3-db36-47b8-af0a-b80d042fddab" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Feb 26 16:43:18 crc kubenswrapper[4907]: I0226 16:43:18.530628 4907 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-v5ng6"
Feb 26 16:43:18 crc kubenswrapper[4907]: I0226 16:43:18.531398 4907 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"c201b3fb3895b0bfc9cdda941aa1f3c52b6fbb96803a4d421f98d6a3ca715e3a"} pod="openshift-machine-config-operator/machine-config-daemon-v5ng6" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted"
Feb 26 16:43:18 crc kubenswrapper[4907]: I0226 16:43:18.531498 4907 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-v5ng6" podUID="917eebf3-db36-47b8-af0a-b80d042fddab" containerName="machine-config-daemon" containerID="cri-o://c201b3fb3895b0bfc9cdda941aa1f3c52b6fbb96803a4d421f98d6a3ca715e3a" gracePeriod=600
Feb 26 16:43:18 crc kubenswrapper[4907]: E0226 16:43:18.673239 4907 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-v5ng6_openshift-machine-config-operator(917eebf3-db36-47b8-af0a-b80d042fddab)\"" pod="openshift-machine-config-operator/machine-config-daemon-v5ng6" podUID="917eebf3-db36-47b8-af0a-b80d042fddab"
Feb 26 16:43:19 crc kubenswrapper[4907]: I0226 16:43:19.657272 4907 generic.go:334] "Generic (PLEG): container finished" podID="917eebf3-db36-47b8-af0a-b80d042fddab" containerID="c201b3fb3895b0bfc9cdda941aa1f3c52b6fbb96803a4d421f98d6a3ca715e3a" exitCode=0
Feb 26 16:43:19 crc kubenswrapper[4907]: I0226 16:43:19.657338 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-v5ng6" event={"ID":"917eebf3-db36-47b8-af0a-b80d042fddab","Type":"ContainerDied","Data":"c201b3fb3895b0bfc9cdda941aa1f3c52b6fbb96803a4d421f98d6a3ca715e3a"}
Feb 26 16:43:19 crc kubenswrapper[4907]: I0226 16:43:19.657819 4907 scope.go:117] "RemoveContainer" containerID="d43bc521831c88457b494ef539cb7ec24221ab1999bc5d1f490d67f1fd00bc95"
Feb 26 16:43:19 crc kubenswrapper[4907]: I0226 16:43:19.659054 4907 scope.go:117] "RemoveContainer" containerID="c201b3fb3895b0bfc9cdda941aa1f3c52b6fbb96803a4d421f98d6a3ca715e3a"
Feb 26 16:43:19 crc kubenswrapper[4907]: E0226 16:43:19.659531 4907 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-v5ng6_openshift-machine-config-operator(917eebf3-db36-47b8-af0a-b80d042fddab)\"" pod="openshift-machine-config-operator/machine-config-daemon-v5ng6" podUID="917eebf3-db36-47b8-af0a-b80d042fddab"
Feb 26 16:43:32 crc kubenswrapper[4907]: I0226 16:43:32.127453 4907 scope.go:117] "RemoveContainer" containerID="c201b3fb3895b0bfc9cdda941aa1f3c52b6fbb96803a4d421f98d6a3ca715e3a"
Feb 26 16:43:32 crc kubenswrapper[4907]: E0226 16:43:32.128431 4907 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-v5ng6_openshift-machine-config-operator(917eebf3-db36-47b8-af0a-b80d042fddab)\"" pod="openshift-machine-config-operator/machine-config-daemon-v5ng6" podUID="917eebf3-db36-47b8-af0a-b80d042fddab"
Feb 26 16:43:43 crc kubenswrapper[4907]: I0226 16:43:43.127167 4907 scope.go:117] "RemoveContainer" containerID="c201b3fb3895b0bfc9cdda941aa1f3c52b6fbb96803a4d421f98d6a3ca715e3a"
Feb 26 16:43:43 crc kubenswrapper[4907]: E0226 16:43:43.130118 4907 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-v5ng6_openshift-machine-config-operator(917eebf3-db36-47b8-af0a-b80d042fddab)\"" pod="openshift-machine-config-operator/machine-config-daemon-v5ng6" podUID="917eebf3-db36-47b8-af0a-b80d042fddab"
Feb 26 16:43:57 crc kubenswrapper[4907]: I0226 16:43:57.126688 4907 scope.go:117] "RemoveContainer" containerID="c201b3fb3895b0bfc9cdda941aa1f3c52b6fbb96803a4d421f98d6a3ca715e3a"
Feb 26 16:43:57 crc kubenswrapper[4907]: E0226 16:43:57.127449 4907 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-v5ng6_openshift-machine-config-operator(917eebf3-db36-47b8-af0a-b80d042fddab)\"" pod="openshift-machine-config-operator/machine-config-daemon-v5ng6" podUID="917eebf3-db36-47b8-af0a-b80d042fddab"
Feb 26 16:44:00 crc kubenswrapper[4907]: I0226 16:44:00.153372 4907 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29535404-mkc88"]
Feb 26 16:44:00 crc kubenswrapper[4907]: E0226 16:44:00.154321 4907 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9b796994-1279-4438-b8f7-406b58507038" containerName="extract-utilities"
Feb 26 16:44:00 crc kubenswrapper[4907]: I0226 16:44:00.154338 4907 state_mem.go:107] "Deleted CPUSet assignment" podUID="9b796994-1279-4438-b8f7-406b58507038" containerName="extract-utilities"
Feb 26 16:44:00 crc kubenswrapper[4907]: E0226 16:44:00.154388 4907 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9b796994-1279-4438-b8f7-406b58507038" containerName="extract-content"
Feb 26 16:44:00 crc kubenswrapper[4907]: I0226 16:44:00.154402 4907 state_mem.go:107] "Deleted CPUSet assignment" podUID="9b796994-1279-4438-b8f7-406b58507038" containerName="extract-content"
Feb 26 16:44:00 crc kubenswrapper[4907]: E0226 16:44:00.154421 4907 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="422322a7-b98c-474e-af80-f774837de503" containerName="oc"
Feb 26 16:44:00 crc kubenswrapper[4907]: I0226 16:44:00.154428 4907 state_mem.go:107] "Deleted CPUSet assignment" podUID="422322a7-b98c-474e-af80-f774837de503" containerName="oc"
Feb 26 16:44:00 crc kubenswrapper[4907]: E0226 16:44:00.154458 4907 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9b796994-1279-4438-b8f7-406b58507038" containerName="registry-server"
Feb 26 16:44:00 crc kubenswrapper[4907]: I0226 16:44:00.154465 4907 state_mem.go:107] "Deleted CPUSet assignment" podUID="9b796994-1279-4438-b8f7-406b58507038" containerName="registry-server"
Feb 26 16:44:00 crc kubenswrapper[4907]: I0226 16:44:00.154928 4907 memory_manager.go:354] "RemoveStaleState removing state" podUID="9b796994-1279-4438-b8f7-406b58507038" containerName="registry-server"
Feb 26 16:44:00 crc kubenswrapper[4907]: I0226 16:44:00.154984 4907 memory_manager.go:354] "RemoveStaleState removing state" podUID="422322a7-b98c-474e-af80-f774837de503" containerName="oc"
Feb 26 16:44:00 crc kubenswrapper[4907]: I0226 16:44:00.155927 4907 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29535404-mkc88"
Feb 26 16:44:00 crc kubenswrapper[4907]: I0226 16:44:00.160212 4907 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt"
Feb 26 16:44:00 crc kubenswrapper[4907]: I0226 16:44:00.160568 4907 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt"
Feb 26 16:44:00 crc kubenswrapper[4907]: I0226 16:44:00.160615 4907 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-n2mrp"
Feb 26 16:44:00 crc kubenswrapper[4907]: I0226 16:44:00.184147 4907 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29535404-mkc88"]
Feb 26 16:44:00 crc kubenswrapper[4907]: I0226 16:44:00.219911 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9xsg6\" (UniqueName: \"kubernetes.io/projected/c37292c5-c6e1-4614-b7cc-597ab8772d8f-kube-api-access-9xsg6\") pod \"auto-csr-approver-29535404-mkc88\" (UID: \"c37292c5-c6e1-4614-b7cc-597ab8772d8f\") " pod="openshift-infra/auto-csr-approver-29535404-mkc88"
Feb 26 16:44:00 crc kubenswrapper[4907]: I0226 16:44:00.321321 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9xsg6\" (UniqueName: \"kubernetes.io/projected/c37292c5-c6e1-4614-b7cc-597ab8772d8f-kube-api-access-9xsg6\") pod \"auto-csr-approver-29535404-mkc88\" (UID: \"c37292c5-c6e1-4614-b7cc-597ab8772d8f\") " pod="openshift-infra/auto-csr-approver-29535404-mkc88"
Feb 26 16:44:00 crc kubenswrapper[4907]: I0226 16:44:00.351241 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9xsg6\" (UniqueName: \"kubernetes.io/projected/c37292c5-c6e1-4614-b7cc-597ab8772d8f-kube-api-access-9xsg6\") pod \"auto-csr-approver-29535404-mkc88\" (UID: \"c37292c5-c6e1-4614-b7cc-597ab8772d8f\") " pod="openshift-infra/auto-csr-approver-29535404-mkc88"
Feb 26 16:44:00 crc kubenswrapper[4907]: I0226 16:44:00.482298 4907 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29535404-mkc88"
Feb 26 16:44:00 crc kubenswrapper[4907]: I0226 16:44:00.991058 4907 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29535404-mkc88"]
Feb 26 16:44:01 crc kubenswrapper[4907]: I0226 16:44:01.079501 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29535404-mkc88" event={"ID":"c37292c5-c6e1-4614-b7cc-597ab8772d8f","Type":"ContainerStarted","Data":"1517356ddd336c00ee6c34879716009ccbd01caaf8870f8850d8cc1c2f5732fe"}
Feb 26 16:44:03 crc kubenswrapper[4907]: I0226 16:44:03.098814 4907 generic.go:334] "Generic (PLEG): container finished" podID="c37292c5-c6e1-4614-b7cc-597ab8772d8f" containerID="fe367b2eef64d76c74e0bfb540451480e7e256863cbdae754bc5c454909f97a4" exitCode=0
Feb 26 16:44:03 crc kubenswrapper[4907]: I0226 16:44:03.098902 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29535404-mkc88" event={"ID":"c37292c5-c6e1-4614-b7cc-597ab8772d8f","Type":"ContainerDied","Data":"fe367b2eef64d76c74e0bfb540451480e7e256863cbdae754bc5c454909f97a4"}
Feb 26 16:44:04 crc kubenswrapper[4907]: I0226 16:44:04.464059 4907 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29535404-mkc88"
Feb 26 16:44:04 crc kubenswrapper[4907]: I0226 16:44:04.496550 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9xsg6\" (UniqueName: \"kubernetes.io/projected/c37292c5-c6e1-4614-b7cc-597ab8772d8f-kube-api-access-9xsg6\") pod \"c37292c5-c6e1-4614-b7cc-597ab8772d8f\" (UID: \"c37292c5-c6e1-4614-b7cc-597ab8772d8f\") "
Feb 26 16:44:04 crc kubenswrapper[4907]: I0226 16:44:04.502860 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c37292c5-c6e1-4614-b7cc-597ab8772d8f-kube-api-access-9xsg6" (OuterVolumeSpecName: "kube-api-access-9xsg6") pod "c37292c5-c6e1-4614-b7cc-597ab8772d8f" (UID: "c37292c5-c6e1-4614-b7cc-597ab8772d8f"). InnerVolumeSpecName "kube-api-access-9xsg6". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 26 16:44:04 crc kubenswrapper[4907]: I0226 16:44:04.600503 4907 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9xsg6\" (UniqueName: \"kubernetes.io/projected/c37292c5-c6e1-4614-b7cc-597ab8772d8f-kube-api-access-9xsg6\") on node \"crc\" DevicePath \"\""
Feb 26 16:44:05 crc kubenswrapper[4907]: I0226 16:44:05.120006 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29535404-mkc88" event={"ID":"c37292c5-c6e1-4614-b7cc-597ab8772d8f","Type":"ContainerDied","Data":"1517356ddd336c00ee6c34879716009ccbd01caaf8870f8850d8cc1c2f5732fe"}
Feb 26 16:44:05 crc kubenswrapper[4907]: I0226 16:44:05.120352 4907 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="1517356ddd336c00ee6c34879716009ccbd01caaf8870f8850d8cc1c2f5732fe"
Feb 26 16:44:05 crc kubenswrapper[4907]: I0226 16:44:05.120053 4907 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29535404-mkc88"
Feb 26 16:44:05 crc kubenswrapper[4907]: I0226 16:44:05.544000 4907 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29535398-ddr4l"]
Feb 26 16:44:05 crc kubenswrapper[4907]: I0226 16:44:05.554894 4907 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29535398-ddr4l"]
Feb 26 16:44:06 crc kubenswrapper[4907]: I0226 16:44:06.147883 4907 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ebfcda1d-29af-401c-89dc-e06545c4f276" path="/var/lib/kubelet/pods/ebfcda1d-29af-401c-89dc-e06545c4f276/volumes"
Feb 26 16:44:06 crc kubenswrapper[4907]: I0226 16:44:06.237918 4907 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-kgfwc"]
Feb 26 16:44:06 crc kubenswrapper[4907]: E0226 16:44:06.238418 4907 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c37292c5-c6e1-4614-b7cc-597ab8772d8f" containerName="oc"
Feb 26 16:44:06 crc kubenswrapper[4907]: I0226 16:44:06.238438 4907 state_mem.go:107] "Deleted CPUSet assignment" podUID="c37292c5-c6e1-4614-b7cc-597ab8772d8f" containerName="oc"
Feb 26 16:44:06 crc kubenswrapper[4907]: I0226 16:44:06.238755 4907 memory_manager.go:354] "RemoveStaleState removing state" podUID="c37292c5-c6e1-4614-b7cc-597ab8772d8f" containerName="oc"
Feb 26 16:44:06 crc kubenswrapper[4907]: I0226 16:44:06.240501 4907 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-kgfwc" Feb 26 16:44:06 crc kubenswrapper[4907]: I0226 16:44:06.262579 4907 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-kgfwc"] Feb 26 16:44:06 crc kubenswrapper[4907]: I0226 16:44:06.333107 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d7929b61-0e4d-4630-b9ad-ba72efe731ce-catalog-content\") pod \"certified-operators-kgfwc\" (UID: \"d7929b61-0e4d-4630-b9ad-ba72efe731ce\") " pod="openshift-marketplace/certified-operators-kgfwc" Feb 26 16:44:06 crc kubenswrapper[4907]: I0226 16:44:06.333404 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d7929b61-0e4d-4630-b9ad-ba72efe731ce-utilities\") pod \"certified-operators-kgfwc\" (UID: \"d7929b61-0e4d-4630-b9ad-ba72efe731ce\") " pod="openshift-marketplace/certified-operators-kgfwc" Feb 26 16:44:06 crc kubenswrapper[4907]: I0226 16:44:06.333529 4907 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xqpb6\" (UniqueName: \"kubernetes.io/projected/d7929b61-0e4d-4630-b9ad-ba72efe731ce-kube-api-access-xqpb6\") pod \"certified-operators-kgfwc\" (UID: \"d7929b61-0e4d-4630-b9ad-ba72efe731ce\") " pod="openshift-marketplace/certified-operators-kgfwc" Feb 26 16:44:06 crc kubenswrapper[4907]: I0226 16:44:06.435272 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d7929b61-0e4d-4630-b9ad-ba72efe731ce-catalog-content\") pod \"certified-operators-kgfwc\" (UID: \"d7929b61-0e4d-4630-b9ad-ba72efe731ce\") " pod="openshift-marketplace/certified-operators-kgfwc" Feb 26 16:44:06 crc kubenswrapper[4907]: I0226 16:44:06.435377 4907 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d7929b61-0e4d-4630-b9ad-ba72efe731ce-utilities\") pod \"certified-operators-kgfwc\" (UID: \"d7929b61-0e4d-4630-b9ad-ba72efe731ce\") " pod="openshift-marketplace/certified-operators-kgfwc" Feb 26 16:44:06 crc kubenswrapper[4907]: I0226 16:44:06.435410 4907 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xqpb6\" (UniqueName: \"kubernetes.io/projected/d7929b61-0e4d-4630-b9ad-ba72efe731ce-kube-api-access-xqpb6\") pod \"certified-operators-kgfwc\" (UID: \"d7929b61-0e4d-4630-b9ad-ba72efe731ce\") " pod="openshift-marketplace/certified-operators-kgfwc" Feb 26 16:44:06 crc kubenswrapper[4907]: I0226 16:44:06.436127 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d7929b61-0e4d-4630-b9ad-ba72efe731ce-catalog-content\") pod \"certified-operators-kgfwc\" (UID: \"d7929b61-0e4d-4630-b9ad-ba72efe731ce\") " pod="openshift-marketplace/certified-operators-kgfwc" Feb 26 16:44:06 crc kubenswrapper[4907]: I0226 16:44:06.436127 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d7929b61-0e4d-4630-b9ad-ba72efe731ce-utilities\") pod \"certified-operators-kgfwc\" (UID: \"d7929b61-0e4d-4630-b9ad-ba72efe731ce\") " pod="openshift-marketplace/certified-operators-kgfwc" Feb 26 16:44:06 crc kubenswrapper[4907]: I0226 16:44:06.454311 4907 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xqpb6\" (UniqueName: \"kubernetes.io/projected/d7929b61-0e4d-4630-b9ad-ba72efe731ce-kube-api-access-xqpb6\") pod \"certified-operators-kgfwc\" (UID: \"d7929b61-0e4d-4630-b9ad-ba72efe731ce\") " pod="openshift-marketplace/certified-operators-kgfwc" Feb 26 16:44:06 crc kubenswrapper[4907]: I0226 16:44:06.559362 4907 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-kgfwc" Feb 26 16:44:07 crc kubenswrapper[4907]: I0226 16:44:07.209442 4907 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-kgfwc"] Feb 26 16:44:08 crc kubenswrapper[4907]: I0226 16:44:08.166935 4907 generic.go:334] "Generic (PLEG): container finished" podID="d7929b61-0e4d-4630-b9ad-ba72efe731ce" containerID="07a7b4938fe174dd0b5d3b1f57cb8dae8fc1bbff32daed9b2fbb7fc27a867d0d" exitCode=0 Feb 26 16:44:08 crc kubenswrapper[4907]: I0226 16:44:08.167006 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-kgfwc" event={"ID":"d7929b61-0e4d-4630-b9ad-ba72efe731ce","Type":"ContainerDied","Data":"07a7b4938fe174dd0b5d3b1f57cb8dae8fc1bbff32daed9b2fbb7fc27a867d0d"} Feb 26 16:44:08 crc kubenswrapper[4907]: I0226 16:44:08.167249 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-kgfwc" event={"ID":"d7929b61-0e4d-4630-b9ad-ba72efe731ce","Type":"ContainerStarted","Data":"c9d36be0aca572be77396934185f3621d855f8013fa3f862314e4a676d45c248"} Feb 26 16:44:09 crc kubenswrapper[4907]: I0226 16:44:09.180178 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-kgfwc" event={"ID":"d7929b61-0e4d-4630-b9ad-ba72efe731ce","Type":"ContainerStarted","Data":"fb7c962c9ba7981f9036d4532fde2ec0323a7c00eb0a21980af1a5ad496a67b5"} Feb 26 16:44:10 crc kubenswrapper[4907]: I0226 16:44:10.126381 4907 scope.go:117] "RemoveContainer" containerID="c201b3fb3895b0bfc9cdda941aa1f3c52b6fbb96803a4d421f98d6a3ca715e3a" Feb 26 16:44:10 crc kubenswrapper[4907]: E0226 16:44:10.127118 4907 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon 
pod=machine-config-daemon-v5ng6_openshift-machine-config-operator(917eebf3-db36-47b8-af0a-b80d042fddab)\"" pod="openshift-machine-config-operator/machine-config-daemon-v5ng6" podUID="917eebf3-db36-47b8-af0a-b80d042fddab" Feb 26 16:44:11 crc kubenswrapper[4907]: I0226 16:44:11.201011 4907 generic.go:334] "Generic (PLEG): container finished" podID="d7929b61-0e4d-4630-b9ad-ba72efe731ce" containerID="fb7c962c9ba7981f9036d4532fde2ec0323a7c00eb0a21980af1a5ad496a67b5" exitCode=0 Feb 26 16:44:11 crc kubenswrapper[4907]: I0226 16:44:11.201056 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-kgfwc" event={"ID":"d7929b61-0e4d-4630-b9ad-ba72efe731ce","Type":"ContainerDied","Data":"fb7c962c9ba7981f9036d4532fde2ec0323a7c00eb0a21980af1a5ad496a67b5"} Feb 26 16:44:12 crc kubenswrapper[4907]: I0226 16:44:12.220382 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-kgfwc" event={"ID":"d7929b61-0e4d-4630-b9ad-ba72efe731ce","Type":"ContainerStarted","Data":"b038a27bea37384411db4dac6d5777f16fa62a260ad018b22938c0778ac24da5"} Feb 26 16:44:12 crc kubenswrapper[4907]: I0226 16:44:12.244913 4907 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-kgfwc" podStartSLOduration=2.7860328450000003 podStartE2EDuration="6.24489572s" podCreationTimestamp="2026-02-26 16:44:06 +0000 UTC" firstStartedPulling="2026-02-26 16:44:08.170110703 +0000 UTC m=+3710.688672562" lastFinishedPulling="2026-02-26 16:44:11.628973578 +0000 UTC m=+3714.147535437" observedRunningTime="2026-02-26 16:44:12.236062004 +0000 UTC m=+3714.754623853" watchObservedRunningTime="2026-02-26 16:44:12.24489572 +0000 UTC m=+3714.763457569" Feb 26 16:44:12 crc kubenswrapper[4907]: I0226 16:44:12.299274 4907 scope.go:117] "RemoveContainer" containerID="e17a64c7afd1f8e9c2aa95314db83af3c5c6827a9651cef30f5e4b7ef032b28a" Feb 26 16:44:16 crc kubenswrapper[4907]: I0226 
16:44:16.559516 4907 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-kgfwc" Feb 26 16:44:16 crc kubenswrapper[4907]: I0226 16:44:16.560286 4907 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-kgfwc" Feb 26 16:44:16 crc kubenswrapper[4907]: I0226 16:44:16.628512 4907 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-kgfwc" Feb 26 16:44:17 crc kubenswrapper[4907]: I0226 16:44:17.338332 4907 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-kgfwc" Feb 26 16:44:17 crc kubenswrapper[4907]: I0226 16:44:17.404308 4907 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-kgfwc"] Feb 26 16:44:19 crc kubenswrapper[4907]: I0226 16:44:19.283901 4907 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-kgfwc" podUID="d7929b61-0e4d-4630-b9ad-ba72efe731ce" containerName="registry-server" containerID="cri-o://b038a27bea37384411db4dac6d5777f16fa62a260ad018b22938c0778ac24da5" gracePeriod=2 Feb 26 16:44:19 crc kubenswrapper[4907]: I0226 16:44:19.826898 4907 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-kgfwc" Feb 26 16:44:19 crc kubenswrapper[4907]: I0226 16:44:19.994329 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xqpb6\" (UniqueName: \"kubernetes.io/projected/d7929b61-0e4d-4630-b9ad-ba72efe731ce-kube-api-access-xqpb6\") pod \"d7929b61-0e4d-4630-b9ad-ba72efe731ce\" (UID: \"d7929b61-0e4d-4630-b9ad-ba72efe731ce\") " Feb 26 16:44:19 crc kubenswrapper[4907]: I0226 16:44:19.995080 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d7929b61-0e4d-4630-b9ad-ba72efe731ce-utilities\") pod \"d7929b61-0e4d-4630-b9ad-ba72efe731ce\" (UID: \"d7929b61-0e4d-4630-b9ad-ba72efe731ce\") " Feb 26 16:44:19 crc kubenswrapper[4907]: I0226 16:44:19.995280 4907 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d7929b61-0e4d-4630-b9ad-ba72efe731ce-catalog-content\") pod \"d7929b61-0e4d-4630-b9ad-ba72efe731ce\" (UID: \"d7929b61-0e4d-4630-b9ad-ba72efe731ce\") " Feb 26 16:44:19 crc kubenswrapper[4907]: I0226 16:44:19.996389 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/d7929b61-0e4d-4630-b9ad-ba72efe731ce-utilities" (OuterVolumeSpecName: "utilities") pod "d7929b61-0e4d-4630-b9ad-ba72efe731ce" (UID: "d7929b61-0e4d-4630-b9ad-ba72efe731ce"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 26 16:44:20 crc kubenswrapper[4907]: I0226 16:44:20.005779 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d7929b61-0e4d-4630-b9ad-ba72efe731ce-kube-api-access-xqpb6" (OuterVolumeSpecName: "kube-api-access-xqpb6") pod "d7929b61-0e4d-4630-b9ad-ba72efe731ce" (UID: "d7929b61-0e4d-4630-b9ad-ba72efe731ce"). InnerVolumeSpecName "kube-api-access-xqpb6". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 26 16:44:20 crc kubenswrapper[4907]: I0226 16:44:20.097189 4907 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xqpb6\" (UniqueName: \"kubernetes.io/projected/d7929b61-0e4d-4630-b9ad-ba72efe731ce-kube-api-access-xqpb6\") on node \"crc\" DevicePath \"\"" Feb 26 16:44:20 crc kubenswrapper[4907]: I0226 16:44:20.097222 4907 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d7929b61-0e4d-4630-b9ad-ba72efe731ce-utilities\") on node \"crc\" DevicePath \"\"" Feb 26 16:44:20 crc kubenswrapper[4907]: I0226 16:44:20.296799 4907 generic.go:334] "Generic (PLEG): container finished" podID="d7929b61-0e4d-4630-b9ad-ba72efe731ce" containerID="b038a27bea37384411db4dac6d5777f16fa62a260ad018b22938c0778ac24da5" exitCode=0 Feb 26 16:44:20 crc kubenswrapper[4907]: I0226 16:44:20.296838 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-kgfwc" event={"ID":"d7929b61-0e4d-4630-b9ad-ba72efe731ce","Type":"ContainerDied","Data":"b038a27bea37384411db4dac6d5777f16fa62a260ad018b22938c0778ac24da5"} Feb 26 16:44:20 crc kubenswrapper[4907]: I0226 16:44:20.296883 4907 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-kgfwc" event={"ID":"d7929b61-0e4d-4630-b9ad-ba72efe731ce","Type":"ContainerDied","Data":"c9d36be0aca572be77396934185f3621d855f8013fa3f862314e4a676d45c248"} Feb 26 16:44:20 crc kubenswrapper[4907]: I0226 16:44:20.296900 4907 scope.go:117] "RemoveContainer" containerID="b038a27bea37384411db4dac6d5777f16fa62a260ad018b22938c0778ac24da5" Feb 26 16:44:20 crc kubenswrapper[4907]: I0226 16:44:20.296943 4907 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-kgfwc" Feb 26 16:44:20 crc kubenswrapper[4907]: I0226 16:44:20.313091 4907 scope.go:117] "RemoveContainer" containerID="fb7c962c9ba7981f9036d4532fde2ec0323a7c00eb0a21980af1a5ad496a67b5" Feb 26 16:44:20 crc kubenswrapper[4907]: I0226 16:44:20.334385 4907 scope.go:117] "RemoveContainer" containerID="07a7b4938fe174dd0b5d3b1f57cb8dae8fc1bbff32daed9b2fbb7fc27a867d0d" Feb 26 16:44:20 crc kubenswrapper[4907]: I0226 16:44:20.391499 4907 scope.go:117] "RemoveContainer" containerID="b038a27bea37384411db4dac6d5777f16fa62a260ad018b22938c0778ac24da5" Feb 26 16:44:20 crc kubenswrapper[4907]: E0226 16:44:20.392241 4907 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b038a27bea37384411db4dac6d5777f16fa62a260ad018b22938c0778ac24da5\": container with ID starting with b038a27bea37384411db4dac6d5777f16fa62a260ad018b22938c0778ac24da5 not found: ID does not exist" containerID="b038a27bea37384411db4dac6d5777f16fa62a260ad018b22938c0778ac24da5" Feb 26 16:44:20 crc kubenswrapper[4907]: I0226 16:44:20.392318 4907 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b038a27bea37384411db4dac6d5777f16fa62a260ad018b22938c0778ac24da5"} err="failed to get container status \"b038a27bea37384411db4dac6d5777f16fa62a260ad018b22938c0778ac24da5\": rpc error: code = NotFound desc = could not find container \"b038a27bea37384411db4dac6d5777f16fa62a260ad018b22938c0778ac24da5\": container with ID starting with b038a27bea37384411db4dac6d5777f16fa62a260ad018b22938c0778ac24da5 not found: ID does not exist" Feb 26 16:44:20 crc kubenswrapper[4907]: I0226 16:44:20.392378 4907 scope.go:117] "RemoveContainer" containerID="fb7c962c9ba7981f9036d4532fde2ec0323a7c00eb0a21980af1a5ad496a67b5" Feb 26 16:44:20 crc kubenswrapper[4907]: E0226 16:44:20.392840 4907 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code 
= NotFound desc = could not find container \"fb7c962c9ba7981f9036d4532fde2ec0323a7c00eb0a21980af1a5ad496a67b5\": container with ID starting with fb7c962c9ba7981f9036d4532fde2ec0323a7c00eb0a21980af1a5ad496a67b5 not found: ID does not exist" containerID="fb7c962c9ba7981f9036d4532fde2ec0323a7c00eb0a21980af1a5ad496a67b5" Feb 26 16:44:20 crc kubenswrapper[4907]: I0226 16:44:20.392964 4907 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"fb7c962c9ba7981f9036d4532fde2ec0323a7c00eb0a21980af1a5ad496a67b5"} err="failed to get container status \"fb7c962c9ba7981f9036d4532fde2ec0323a7c00eb0a21980af1a5ad496a67b5\": rpc error: code = NotFound desc = could not find container \"fb7c962c9ba7981f9036d4532fde2ec0323a7c00eb0a21980af1a5ad496a67b5\": container with ID starting with fb7c962c9ba7981f9036d4532fde2ec0323a7c00eb0a21980af1a5ad496a67b5 not found: ID does not exist" Feb 26 16:44:20 crc kubenswrapper[4907]: I0226 16:44:20.393052 4907 scope.go:117] "RemoveContainer" containerID="07a7b4938fe174dd0b5d3b1f57cb8dae8fc1bbff32daed9b2fbb7fc27a867d0d" Feb 26 16:44:20 crc kubenswrapper[4907]: E0226 16:44:20.393493 4907 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"07a7b4938fe174dd0b5d3b1f57cb8dae8fc1bbff32daed9b2fbb7fc27a867d0d\": container with ID starting with 07a7b4938fe174dd0b5d3b1f57cb8dae8fc1bbff32daed9b2fbb7fc27a867d0d not found: ID does not exist" containerID="07a7b4938fe174dd0b5d3b1f57cb8dae8fc1bbff32daed9b2fbb7fc27a867d0d" Feb 26 16:44:20 crc kubenswrapper[4907]: I0226 16:44:20.393557 4907 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"07a7b4938fe174dd0b5d3b1f57cb8dae8fc1bbff32daed9b2fbb7fc27a867d0d"} err="failed to get container status \"07a7b4938fe174dd0b5d3b1f57cb8dae8fc1bbff32daed9b2fbb7fc27a867d0d\": rpc error: code = NotFound desc = could not find container 
\"07a7b4938fe174dd0b5d3b1f57cb8dae8fc1bbff32daed9b2fbb7fc27a867d0d\": container with ID starting with 07a7b4938fe174dd0b5d3b1f57cb8dae8fc1bbff32daed9b2fbb7fc27a867d0d not found: ID does not exist" Feb 26 16:44:20 crc kubenswrapper[4907]: I0226 16:44:20.680381 4907 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/d7929b61-0e4d-4630-b9ad-ba72efe731ce-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "d7929b61-0e4d-4630-b9ad-ba72efe731ce" (UID: "d7929b61-0e4d-4630-b9ad-ba72efe731ce"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 26 16:44:20 crc kubenswrapper[4907]: I0226 16:44:20.708735 4907 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d7929b61-0e4d-4630-b9ad-ba72efe731ce-catalog-content\") on node \"crc\" DevicePath \"\"" Feb 26 16:44:20 crc kubenswrapper[4907]: I0226 16:44:20.961282 4907 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-kgfwc"] Feb 26 16:44:20 crc kubenswrapper[4907]: I0226 16:44:20.970982 4907 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-kgfwc"] Feb 26 16:44:22 crc kubenswrapper[4907]: I0226 16:44:22.137565 4907 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d7929b61-0e4d-4630-b9ad-ba72efe731ce" path="/var/lib/kubelet/pods/d7929b61-0e4d-4630-b9ad-ba72efe731ce/volumes" Feb 26 16:44:24 crc kubenswrapper[4907]: I0226 16:44:24.127210 4907 scope.go:117] "RemoveContainer" containerID="c201b3fb3895b0bfc9cdda941aa1f3c52b6fbb96803a4d421f98d6a3ca715e3a" Feb 26 16:44:24 crc kubenswrapper[4907]: E0226 16:44:24.127823 4907 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon 
pod=machine-config-daemon-v5ng6_openshift-machine-config-operator(917eebf3-db36-47b8-af0a-b80d042fddab)\"" pod="openshift-machine-config-operator/machine-config-daemon-v5ng6" podUID="917eebf3-db36-47b8-af0a-b80d042fddab"